Changeset 242


Timestamp: 05/21/10 21:46:54
Author: mmckerns
Message: edits to fix epydoc formatting and content
Location: mystic/mystic
Files: 11 edited

  • mystic/mystic/__init__.py

    r240 r242  
    9898    - setuptools, version >= 0.6 
    9999    - matplotlib, version >= 0.91 
     100    - pathos, version >= 0.1a1 
    100101 
    101102 
     
    138139 
    139140Important classes and functions are found here:: 
    140     - mystic.mystic.abstract_solver [the solver API definition] 
    141     - mystic.mystic.abstract_map_solver [the parallel solver API] 
     141    - mystic.mystic.abstract_solver        [the solver API definition] 
     142    - mystic.mystic.abstract_map_solver    [the parallel solver API] 
    142143    - mystic.mystic.abstract_nested_solver [the nested solver API] 
    143     - mystic.mystic.termination     [solver termination conditions] 
    144     - mystic.mystic.strategy        [solver population mutation strategies] 
    145     - mystic.models.abstract_model  [the model API definition] 
    146     - mystic.models.forward_model   [cost function generator] 
    147     - mystic.mystic.tools           [monitors, function wrappers, and other tools] 
    148     - mystic.mystic.math            [some useful mathematical functions and tools] 
     144    - mystic.mystic.termination            [solver termination conditions] 
     145    - mystic.mystic.strategy               [solver population mutation strategies] 
     146    - mystic.models.abstract_model         [the model API definition] 
     147    - mystic.models.forward_model          [cost function generator] 
     148    - mystic.mystic.tools                  [monitors, function wrappers, and other tools] 
     149    - mystic.mystic.math                   [some useful mathematical functions and tools] 
    149150 
    150151Solvers are found here:: 
  • mystic/mystic/abstract_map_solver.py

    r240 r242  
    1616The default map API settings are provided within mystic, while 
    1717distributed and high-performance computing mappers and launchers 
    18 can be obtained within the "pathos" package, found here: 
    19     - http://dev.danse.us/trac/pathos   (see subpackage = pyina) 
     18can be obtained within the "pathos" package, found here:: 
     19    - http://dev.danse.us/trac/pathos 
    2020 
    2121 
     
    2727    >>> # the function to be minimized and the initial values 
    2828    >>> from mystic.models import rosen 
    29     >>> x0 = [0.8, 1.2, 0.7] 
     29    >>> lb = [0.0, 0.0, 0.0] 
     30    >>> ub = [2.0, 2.0, 2.0] 
    3031    >>>  
    3132    >>> # get monitors and termination condition objects 
     
    3940    >>> from pyina.ez_map import ez_map2 
    4041    >>> NNODES = 4 
     42    >>> npts = 20 
    4143    >>> 
    4244    >>> # instantiate and configure the solver 
    43     >>> from mystic.scipy_optimize import NelderMeadSimplexMapSolver 
    44     >>> solver = NelderMeadSimplexMapSolver(len(x0)) 
    45     >>> solver.SetInitialPoints(x0)            #FIXME: use batchgrid w/ bounds 
     45    >>> from mystic.nested import ScattershotSolver 
     46    >>> solver = ScattershotSolver(len(lb), npts) 
    4647    >>> solver.SetMapper(ez_map2, equalportion_mapper) 
    4748    >>> solver.SetLauncher(mpirun_launcher, NNODES) 
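
    For readability, a consolidated sketch of the updated usage example, assembled from the changed lines above; the pyina import paths, and the monitor/termination/Solve steps implied by the surrounding context, are assumptions rather than part of this changeset:

        >>> # the function to be minimized and the parameter bounds
        >>> from mystic.models import rosen
        >>> lb = [0.0, 0.0, 0.0]
        >>> ub = [2.0, 2.0, 2.0]
        >>>
        >>> # parallel launcher and mapper (pyina import paths assumed)
        >>> from pyina.launchers import mpirun_launcher
        >>> from pyina.mappers import equalportion_mapper
        >>> from pyina.ez_map import ez_map2
        >>> NNODES = 4
        >>> npts = 20
        >>>
        >>> # instantiate and configure the nested solver
        >>> from mystic.nested import ScattershotSolver
        >>> solver = ScattershotSolver(len(lb), npts)
        >>> solver.SetMapper(ez_map2, equalportion_mapper)
        >>> solver.SetLauncher(mpirun_launcher, NNODES)
        >>> # ... then set monitors/termination and call solver.Solve, as in the full docstring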
     
    8788 
    8889Additional inputs: 
    89     npop     -- size of the trial solution population.  [default = 1] 
     90    npop     -- size of the trial solution population.     [default = 1] 
    9091 
    9192Important class members: 
     
    9394    generations    - an iteration counter. 
    9495    bestEnergy     - current best energy. 
    95     bestSolution   - current best parameter set. [size = dim] 
    96     popEnergy      - set of all trial energy solutions. [size = npop] 
    97     population     - set of all trial parameter solutions. 
    98         [size = dim*npop] 
    99     energy_history - history of bestEnergy status. 
    100         [equivalent to StepMonitor] 
    101     signal_handler - catches the interrupt signal. 
    102         [***disabled***] 
     96    bestSolution   - current best parameter set.           [size = dim] 
     97    popEnergy      - set of all trial energy solutions.    [size = npop] 
     98    population     - set of all trial parameter solutions. [size = dim*npop] 
     99    energy_history - history of bestEnergy status.         [equivalent to StepMonitor] 
     100    signal_handler - catches the interrupt signal.         [***disabled***] 
    103101        """ 
    104102        super(AbstractMapSolver, self).__init__(dim, **kwds) 
  • mystic/mystic/abstract_nested_solver.py

    r240 r242  
    1616The default map API settings are provided within mystic, while 
    1717distributed and high-performance computing mappers and launchers 
    18 can be obtained within the "pathos" package, found here: 
    19     - http://dev.danse.us/trac/pathos   (see subpackage = pyina) 
     18can be obtained within the "pathos" package, found here:: 
     19    - http://dev.danse.us/trac/pathos 
    2020 
    2121 
     
    9292 
    9393Additional inputs: 
    94     npop     -- size of the trial solution population.  [default = 1] 
     94    npop     -- size of the trial solution population.      [default = 1] 
    9595    nbins    -- tuple of number of bins in each dimension.  [default = [1]*dim] 
    96     npts     -- number of solver instances.  [default = 1] 
     96    npts     -- number of solver instances.                 [default = 1] 
    9797 
    9898Important class members: 
     
    100100    generations    - an iteration counter. 
    101101    bestEnergy     - current best energy. 
    102     bestSolution   - current best parameter set. [size = dim] 
    103     popEnergy      - set of all trial energy solutions. [size = npop] 
    104     population     - set of all trial parameter solutions. 
    105         [size = dim*npop] 
    106     energy_history - history of bestEnergy status. 
    107         [equivalent to StepMonitor] 
    108     signal_handler - catches the interrupt signal. 
    109         [***disabled***] 
     102    bestSolution   - current best parameter set.            [size = dim] 
     103    popEnergy      - set of all trial energy solutions.     [size = npop] 
     104    population     - set of all trial parameter solutions.  [size = dim*npop] 
     105    energy_history - history of bestEnergy status.          [equivalent to StepMonitor] 
     106    signal_handler - catches the interrupt signal.          [***disabled***] 
    110107        """ 
    111108        super(AbstractNestedSolver, self).__init__(dim, **kwds) 
  • mystic/mystic/abstract_solver.py

    r234 r242  
    9191 
    9292Additional inputs: 
    93     npop     -- size of the trial solution population.  [default = 1] 
     93    npop     -- size of the trial solution population.      [default = 1] 
    9494 
    9595Important class members: 
     
    9797    generations    - an iteration counter. 
    9898    bestEnergy     - current best energy. 
    99     bestSolution   - current best parameter set. [size = dim] 
    100     popEnergy      - set of all trial energy solutions. [size = npop] 
    101     population     - set of all trial parameter solutions. 
    102         [size = dim*npop] 
    103     energy_history - history of bestEnergy status. 
    104         [equivalent to StepMonitor] 
     99    bestSolution   - current best parameter set.            [size = dim] 
     100    popEnergy      - set of all trial energy solutions.     [size = npop] 
     101    population     - set of all trial parameter solutions.  [size = dim*npop] 
     102    energy_history - history of bestEnergy status.          [equivalent to StepMonitor] 
    105103    signal_handler - catches the interrupt signal. 
    106104        """ 
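
    To make the member list above concrete, here is a minimal sketch of inspecting these attributes after a solve; the particular solver, termination condition, and bounds are illustrative assumptions, not part of this changeset:

        >>> from mystic.differential_evolution import DifferentialEvolutionSolver
        >>> from mystic.termination import VTR
        >>> from mystic.models import rosen
        >>> solver = DifferentialEvolutionSolver(3, 20)        # dim = 3, npop = 20
        >>> solver.SetRandomInitialPoints([-5.0]*3, [5.0]*3)
        >>> solver.Solve(rosen, VTR(1e-5))
        >>> solver.generations, solver.bestEnergy              # iteration count, best cost
        >>> solver.bestSolution                                # best parameter set  [size = dim]
        >>> len(solver.population), len(solver.popEnergy)      # npop trial sets and their energies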
  • mystic/mystic/differential_evolution.py

    r233 r242  
    7070Solver for your own objective function can be found on R. Storn's 
    7171web page (http://www.icsi.berkeley.edu/~storn/code.html), and is 
    72 reproduced here: 
    73  
    74   First try the following classical settings for the solver configuration: 
    75   Choose a crossover strategy (e.g. Rand1Bin), set the number of parents 
    76   NP to 10 times the number of parameters, select ScalingFactor=0.8, and 
    77   CrossProbability=0.9. 
    78  
    79   It has been found recently that selecting ScalingFactor from the interval 
    80   [0.5, 1.0] randomly for each generation or for each difference vector, 
    81   a technique called dither, improves convergence behaviour significantly, 
    82   especially for noisy objective functions. 
    83  
    84   It has also been found that setting CrossProbability to a low value, 
    85   e.g. CrossProbability=0.2 helps optimizing separable functions since 
    86   it fosters the search along the coordinate axes. On the contrary, 
    87   this choice is not effective if parameter dependence is encountered, 
    88   something which is frequently occuring in real-world optimization 
    89   problems rather than artificial test functions. So for parameter 
    90   dependence the choice of CrossProbability=0.9 is more appropriate. 
    91  
    92   Another interesting empirical finding is that rasing NP above, say, 40 
    93   does not substantially improve the convergence, independent of the 
    94   number of parameters. It is worthwhile to experiment with these suggestions. 
    95  
    96   Make sure that you initialize your parameter vectors by exploiting 
    97   their full numerical range, i.e. if a parameter is allowed to exhibit 
    98   values in the range [-100, 100] it's a good idea to pick the initial 
    99   values from this range instead of unnecessarily restricting diversity. 
    100  
    101   Keep in mind that different problems often require different settings 
    102   for NP, ScalingFactor and CrossProbability (see Ref 1, 2). If you 
    103   experience misconvergence, you typically can increase the value for NP, 
    104   but often you only have to adjust ScalingFactor to be a little lower or 
    105   higher than 0.8. If you increase NP and simultaneously lower ScalingFactor 
    106   a little, convergence is more likely to occur but generally takes longer, 
    107   i.e. DE is getting more robust (a convergence speed/robustness tradeoff). 
    108  
    109   If you still get misconvergence you might want to instead try a different 
    110   crossover strategy. The most commonly used are Rand1Bin, Rand1Exp, 
    111   Best1Bin, and Best1Exp. The crossover strategy is not so important a 
    112   choice, although K. Price claims that binomial (Bin) is never worse than 
    113   exponential (Exp). 
    114  
    115   In case of continued misconvergence, check your choice of objective function. 
    116   There might be a better one to describe your problem. Any knowledge that 
    117   you have about the problem should be worked into the objective function. 
    118   A good objective function can make all the difference. 
     72reproduced here:: 
     73 
     74    First try the following classical settings for the solver configuration: 
     75    Choose a crossover strategy (e.g. Rand1Bin), set the number of parents 
     76    NP to 10 times the number of parameters, select ScalingFactor=0.8, and 
     77    CrossProbability=0.9. 
     78 
     79    It has been found recently that selecting ScalingFactor from the interval 
     80    [0.5, 1.0] randomly for each generation or for each difference vector, 
     81    a technique called dither, improves convergence behaviour significantly, 
     82    especially for noisy objective functions. 
     83 
     84    It has also been found that setting CrossProbability to a low value, 
     85    e.g. CrossProbability=0.2 helps optimizing separable functions since 
     86    it fosters the search along the coordinate axes. On the contrary, 
     87    this choice is not effective if parameter dependence is encountered, 
     88    something which is frequently occurring in real-world optimization 
     89    problems rather than artificial test functions. So for parameter 
     90    dependence the choice of CrossProbability=0.9 is more appropriate. 
     91 
     92    Another interesting empirical finding is that raising NP above, say, 40 
     93    does not substantially improve the convergence, independent of the 
     94    number of parameters. It is worthwhile to experiment with these suggestions. 
     95   
     96    Make sure that you initialize your parameter vectors by exploiting 
     97    their full numerical range, i.e. if a parameter is allowed to exhibit 
     98    values in the range [-100, 100] it's a good idea to pick the initial 
     99    values from this range instead of unnecessarily restricting diversity. 
     100 
     101    Keep in mind that different problems often require different settings 
     102    for NP, ScalingFactor and CrossProbability (see Ref 1, 2). If you 
     103    experience misconvergence, you typically can increase the value for NP, 
     104    but often you only have to adjust ScalingFactor to be a little lower or 
     105    higher than 0.8. If you increase NP and simultaneously lower ScalingFactor 
     106    a little, convergence is more likely to occur but generally takes longer, 
     107    i.e. DE is getting more robust (a convergence speed/robustness tradeoff). 
     108 
     109    If you still get misconvergence you might want to instead try a different 
     110    crossover strategy. The most commonly used are Rand1Bin, Rand1Exp, 
     111    Best1Bin, and Best1Exp. The crossover strategy is not so important a 
     112    choice, although K. Price claims that binomial (Bin) is never worse than 
     113    exponential (Exp). 
     114 
     115    In case of continued misconvergence, check the choice of objective function. 
     116    There might be a better one to describe your problem. Any knowledge that 
     117    you have about the problem should be worked into the objective function. 
     118    A good objective function can make all the difference. 
    119119 
    120120See `mystic.examples.test_rosenbrock` for an example of using 
     
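
    As a concrete illustration of the classical settings quoted above, a minimal sketch using mystic's differential evolution solver; the strategy, termination condition, and initial range are assumptions, while CrossProbability and ScalingFactor follow the keyword convention documented elsewhere in this module:

        >>> from mystic.differential_evolution import DifferentialEvolutionSolver
        >>> from mystic.termination import ChangeOverGeneration
        >>> from mystic.strategy import Rand1Bin
        >>> from mystic.models import rosen
        >>> ndim = 3
        >>> NP = 10 * ndim                    # classical setting: NP = 10x the number of parameters
        >>> solver = DifferentialEvolutionSolver(ndim, NP)
        >>> solver.SetRandomInitialPoints([-100.0]*ndim, [100.0]*ndim)  # exploit the full range
        >>> solver.Solve(rosen, ChangeOverGeneration(), strategy=Rand1Bin,
        ...              CrossProbability=0.9, ScalingFactor=0.8)
        >>> solver.bestSolution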
    584584 
    585585    args -- extra arguments for func. 
    586     bounds -- list - n pairs of bounds (min,max), one pair for each 
    587         parameter. 
    588     ftol -- number - acceptable relative error in func(xopt) for 
    589         convergence. 
    590     gtol -- number - maximum number of iterations to run without 
    591         improvement. 
     586    bounds -- list - n pairs of bounds (min,max), one pair for each parameter. 
     587    ftol -- number - acceptable relative error in func(xopt) for convergence. 
     588    gtol -- number - maximum number of iterations to run without improvement. 
    592589    maxiter -- number - the maximum number of iterations to perform. 
    593590    maxfun -- number - the maximum number of function evaluations. 
    594591    cross -- number - the probability of cross-parameter mutations 
    595     scale -- number - multiplier for impact of mutations on trial 
    596         solution. 
    597     full_output -- number - non-zero if fval and warnflag outputs are 
    598         desired. 
     592    scale -- number - multiplier for impact of mutations on trial solution. 
     593    full_output -- number - non-zero if fval and warnflag outputs are desired. 
    599594    disp -- number - non-zero to print convergence messages. 
    600     retall -- number - non-zero to return list of solutions at each 
    601         iteration. 
     595    retall -- number - non-zero to return list of solutions at each iteration. 
    602596    callback -- an optional user-supplied function to call after each 
    603597        iteration.  It is called as callback(xk), where xk is the 
  • mystic/mystic/math/approx.py

    r237 r242  
    11#!/usr/bin/env python 
    22 
     3""" 
     4tools for measuring equality 
     5""" 
    36def _float_approx_equal(x, y, tol=1e-18, rel=1e-7): 
    47    if tol is rel is None: 
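
    For orientation, a small hedged example of the equality test; it assumes the module's public entry point is approx_equal, with the float-only helper shown above behind it, which may differ from the actual file:

        >>> from mystic.math.approx import approx_equal
        >>> approx_equal(1.0, 1.0 + 1e-9)       # within the default relative tolerance (rel=1e-7)
        True
        >>> approx_equal(1.0, 1.001)            # well outside the default tolerance
        False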
  • mystic/mystic/math/grid.py

    r240 r242  
    11#!/usr/bin/env python 
    22 
     3""" 
     4tools for generating points on a grid 
     5""" 
    36from numpy import asarray 
    47 
  • mystic/mystic/nested.py

    r236 r242  
    33 
    44""" 
    5 ... 
     5Solvers 
     6======= 
     7 
     8This module contains a collection of optimization solvers that use map-reduce 
     9to distribute several optimizer instances over parameter space. Each 
     10solver accepts an imported solver object as the "nested" solver, which 
     11becomes the target of the map function. 
     12 
     13The set of solvers built on mystic's AbstractNestedSolver are:: 
     14   BatchGridSolver -- start from center of N grid points 
     15   ScattershotSolver -- start from N random points in parameter space 
     16 
     17 
     18Usage 
     19===== 
     20 
     21See `mystic.examples.scattershot_example06` for an example of using 
     22ScattershotSolver. See `mystic.examples.batchgrid_example06` 
     23for an example of using BatchGridSolver. 
     24 
     25All solvers included in this module provide the standard signal handling. 
     26For more information, see `mystic.mystic.abstract_solver`. 
    627""" 
    728__all__ = ['BatchGridSolver','ScattershotSolver'] 
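
    As a sketch of the batch-grid variant described above, assuming the default (serial) map and that bounds are supplied via SetStrictRanges; the termination condition and bin counts are illustrative assumptions:

        >>> from mystic.nested import BatchGridSolver
        >>> from mystic.termination import CandidateRelativeTolerance
        >>> from mystic.models import rosen
        >>> nbins = [4, 4, 4]                          # bins per dimension; 4*4*4 solver instances
        >>> solver = BatchGridSolver(len(nbins), nbins)
        >>> solver.SetStrictRanges([0.0]*3, [2.0]*3)   # assumed way to supply the grid bounds
        >>> solver.Solve(rosen, CandidateRelativeTolerance())
        >>> solver.bestSolution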
     
    1435class BatchGridSolver(AbstractNestedSolver): 
    1536    """ 
    16 ... 
     37parallel mapped optimization starting from the center of N grid points 
    1738    """ 
    1839    def __init__(self, dim, nbins): 
     
    3556              EvaluationMonitor=Null, StepMonitor=Null, ExtraArgs=(), **kwds): 
    3657        """Minimize a function using batch grid optimization. 
    37         ... 
     58 
     59Description: 
     60 
     61    Uses parallel mapping of solvers on a regular grid to find the 
     62    minimum of a function of one or more variables. 
     63 
     64Inputs: 
     65 
     66    cost -- the Python function or method to be minimized. 
     67    termination -- callable object providing termination conditions. 
     68 
     69Additional Inputs: 
     70 
     71    sigint_callback -- callback function for signal handler. 
     72    EvaluationMonitor -- a callable object that will be passed x, fval 
     73        whenever the cost function is evaluated. 
     74    StepMonitor -- a callable object that will be passed x, fval 
     75        after the end of a solver iteration. 
     76    ExtraArgs -- extra arguments for cost. 
     77 
     78Further Inputs: 
     79 
     80    callback -- an optional user-supplied function to call after each 
     81        iteration.  It is called as callback(xk), where xk is the 
     82        current parameter vector.                           [default = None] 
     83    disp -- non-zero to print convergence messages.         [default = 0] 
    3884        """ 
    3985        #allow for inputs that don't conform to AbstractSolver interface 
     
    174220class ScattershotSolver(AbstractNestedSolver): 
    175221    """ 
    176 ... 
     222parallel mapped optimization starting from N random points 
    177223    """ 
    178224    def __init__(self, dim, npts): 
     
    189235              EvaluationMonitor=Null, StepMonitor=Null, ExtraArgs=(), **kwds): 
    190236        """Minimize a function using scattershot optimization. 
    191         ... 
     237 
     238Description: 
     239 
     240    Uses parallel mapping of solvers on randomly selected points 
     241    to find the minimum of a function of one or more variables. 
     242 
     243Inputs: 
     244 
     245    cost -- the Python function or method to be minimized. 
     246    termination -- callable object providing termination conditions. 
     247 
     248Additional Inputs: 
     249 
     250    sigint_callback -- callback function for signal handler. 
     251    EvaluationMonitor -- a callable object that will be passed x, fval 
     252        whenever the cost function is evaluated. 
     253    StepMonitor -- a callable object that will be passed x, fval 
     254        after the end of a solver iteration. 
     255    ExtraArgs -- extra arguments for cost. 
     256 
     257Further Inputs: 
     258 
     259    callback -- an optional user-supplied function to call after each 
     260        iteration.  It is called as callback(xk), where xk is the 
     261        current parameter vector.                           [default = None] 
     262    disp -- non-zero to print convergence messages.         [default = 0] 
    192263        """ 
    193264        #allow for inputs that don't conform to AbstractSolver interface 
  • mystic/mystic/python_map.py

    r226 r242  
    44Defaults for mapper and launcher. These should be 
    55available as a minimal (dependency-free) pure-python 
    6 install from pyina. 
    7  
    8 serial_launcher: syntax for standard python execution 
    9 python_map: wrapper around the standard python map 
    10 carddealer_mapper: the carddealer map strategy 
     6install from pathos:: 
     7    - serial_launcher:   syntax for standard python execution 
     8    - python_map:        wrapper around the standard python map 
     9    - carddealer_mapper: the carddealer map strategy 
    1110""" 
    1211 
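
    A tiny hedged example of the default map, assuming python_map simply forwards to the builtin map (the exact signature may differ):

        >>> from mystic.python_map import python_map
        >>> from mystic.models import rosen
        >>> results = python_map(rosen, [[1.0, 1.0, 1.0], [0.5, 0.5, 0.5], [2.0, 2.0, 2.0]])
        >>> # results holds one cost value per candidate, just like the builtin map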
     
    2928 
    3029NOTES: 
    31  - run non-python commands with: {'python':'', ...}  
     30   run non-python commands with: {'python':'', ...}  
    3231    """ 
    3332    mydict = defaults.copy() 
     
    4443 
    4544Further Input: [***disabled***] 
    46   - nnodes -- the number of parallel nodes 
    47   - launcher -- the launcher object 
    48   - mapper -- the mapper object 
    49   - timelimit -- string representation of maximum run time (e.g. '00:02') 
    50   - queue -- string name of selected queue (e.g. 'normal') 
     45    nnodes -- the number of parallel nodes 
     46    launcher -- the launcher object 
     47    mapper -- the mapper object 
     48    timelimit -- string representation of maximum run time (e.g. '00:02') 
     49    queue -- string name of selected queue (e.g. 'normal') 
    5150""" 
    5251   #print "ignoring: %s" % kwds  #XXX: should allow use of **kwds 
  • mystic/mystic/scipy_optimize.py

    r219 r242  
    153153    callback -- an optional user-supplied function to call after each 
    154154        iteration.  It is called as callback(xk), where xk is the 
    155         current parameter vector. [default = None] 
    156     disp -- non-zero to print convergence messages. [default = 0] 
    157     radius -- percentage change for initial simplex values. 
    158         [default = 0.05] 
     155        current parameter vector.                           [default = None] 
     156    disp -- non-zero to print convergence messages.         [default = 0] 
     157    radius -- percentage change for initial simplex values. [default = 0.05] 
    159158 
    160159""" 
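
    To show where these keywords land, a minimal hedged sketch of a simplex solve; the solver class and termination condition are taken from elsewhere in mystic, and passing disp/radius to Solve as shown is an assumption based on this docstring:

        >>> from mystic.scipy_optimize import NelderMeadSimplexSolver
        >>> from mystic.termination import CandidateRelativeTolerance
        >>> from mystic.models import rosen
        >>> x0 = [0.8, 1.2, 0.7]
        >>> solver = NelderMeadSimplexSolver(len(x0))
        >>> solver.SetInitialPoints(x0)
        >>> # radius enlarges the initial simplex; disp=0 suppresses convergence messages
        >>> solver.Solve(rosen, CandidateRelativeTolerance(), disp=0, radius=0.1)
        >>> solver.bestSolution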
  • mystic/mystic/tools.py

    r225 r242  
    214214generate a custom Sow 
    215215 
    216 takes *args & **kwds, where args will be required inputs for the Sow 
     216takes *args & **kwds, where args will be required inputs for the Sow:: 
    217217    - args: property name strings (i.e. 'x') 
    218218    - kwds: must be in the form: property="doc" (i.e. x='Params') 
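
    A small hedged example of the generator described above; it assumes CustomSow returns a ready-to-use monitor whose call and attribute behaviour follow mystic's standard Sow:

        >>> from mystic.tools import CustomSow
        >>> mon = CustomSow('x', 'y', x='Params', y='Costs')   # record 'x' and 'y', with doc strings
        >>> mon([1.0, 2.0, 3.0], 42.0)                         # log one (parameters, cost) pair
        >>> mon.x, mon.y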