Changeset 243
- Timestamp: 05/21/10 23:50:18 (6 years ago)
- Files: 15 edited, 2 copied, 1 moved
Legend:
- unmodified lines are shown with no prefix
- added lines are prefixed with "+"
- removed lines are prefixed with "-"
Makefile
(r241 → r243)

  doc:
+     cp -rf mystic/_math mystic/math
      epydoc --config mystic.epydoc
+     rm -rf mystic/math
      mkdir html
      mv -f shared,mpi html/mystic-${VERSION}
mystic.epydoc
(r241 → r243)

  # dotted names, module filenames, or package directory names.
  # Alases for this option include "objects" and "values".
- modules: mystic/mystic, mystic/models, mystic/mystic/math
+ modules: mystic/mystic, mystic/models, mystic/math
  #modules: mystic/mystic
mystic/Make.mm
(r241 → r243)

      mystic \
      models \
-     mystic/math \
+     _math \

  OTHER_DIRS = \
mystic/_math/grid.py
(r242 → r243)

      Inputs:
-         -lower bounds -- a list of the lower bounds
-         -upper bounds -- a list of the upper bounds
-         -npts -- number of sample points [default = 10000]
+         lower bounds -- a list of the lower bounds
+         upper bounds -- a list of the upper bounds
+         npts -- number of sample points [default = 10000]
      """
      from numpy.random import random
  ...
      Inputs:
-         -lower bounds -- a list of the lower bounds
-         -upper bounds -- a list of the upper bounds
-         -npts -- number of sample points
+         lower bounds -- a list of the lower bounds
+         upper bounds -- a list of the upper bounds
+         npts -- number of sample points
      """
      q = random_samples(lb,ub,npts)
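For context, the docstring edited above belongs to the random_samples helper. A minimal usage sketch follows; it assumes the random_samples(lb, ub, npts) signature visible in this hunk and the 'mystic.math' package name that setup.py (below) maps onto the _math directory, so the import path and the exact layout of the returned array are assumptions rather than facts confirmed by this changeset.

    # Sketch: draw random sample points within bounds, per the docstring above.
    # Assumes random_samples(lb, ub, npts) from this changeset's grid.py and the
    # 'mystic.math' package name mapped onto the _math directory by setup.py.
    from mystic.math.grid import random_samples

    lb = [0.0, 0.0, 0.0]                  # lower bounds, one per parameter
    ub = [2.0, 2.0, 2.0]                  # upper bounds, one per parameter
    pts = random_samples(lb, ub, npts=100)

    # pts holds 100 candidate points drawn uniformly from the bounded box;
    # other helpers in grid.py consume it as q = random_samples(lb,ub,npts).
    print(pts)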
mystic/setup.py
(r241 → r243)

      packages = ['mystic','mystic.models','mystic.math'],
      package_dir = {'mystic':'mystic','mystic.models':'models',
-                    'mystic.math':'mystic/math'},
+                    'mystic.math':'_math'},
      """
releases/mystic-0.2a1/_math/approx.py
(r237 → r243)

  #!/usr/bin/env python

+ """
+ tools for measuring equality
+ """
  def _float_approx_equal(x, y, tol=1e-18, rel=1e-7):
      if tol is rel is None:
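The module above provides approximate floating-point comparison. Only the private _float_approx_equal(x, y, tol=1e-18, rel=1e-7) signature is visible in this hunk, so in the sketch below the public approx_equal name and its import path are assumptions:

    # Sketch: approximate equality checks, per the "tools for measuring equality"
    # docstring above. The approx_equal name and import path are assumed, based
    # on the private _float_approx_equal(x, y, tol, rel) helper shown in the diff.
    from mystic.math.approx import approx_equal

    print(approx_equal(1.0, 1.0 + 1e-9))      # True: within the relative tolerance
    print(approx_equal(1.0, 1.1))             # False: outside both tolerances
    print(approx_equal(1.0, 1.1, tol=0.2))    # True: absolute tolerance loosened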
releases/mystic-0.2a1/_math/grid.py
(r240 → r243)

  #!/usr/bin/env python

+ """
+ tools for generating points on a grid
+ """
  from numpy import asarray
  ...
      Inputs:
-         -lower bounds -- a list of the lower bounds
-         -upper bounds -- a list of the upper bounds
-         -npts -- number of sample points [default = 10000]
+         lower bounds -- a list of the lower bounds
+         upper bounds -- a list of the upper bounds
+         npts -- number of sample points [default = 10000]
      """
      from numpy.random import random
  ...
      Inputs:
-         -lower bounds -- a list of the lower bounds
-         -upper bounds -- a list of the upper bounds
-         -npts -- number of sample points
+         lower bounds -- a list of the lower bounds
+         upper bounds -- a list of the upper bounds
+         npts -- number of sample points
      """
      q = random_samples(lb,ub,npts)
releases/mystic-0.2a1/mystic/__init__.py
(r240 → r243)

      - setuptools, version >= 0.6
      - matplotlib, version >= 0.91
+     - pathos, version >= 0.1a1

  ...
      Important classes and functions are found here::
      (the entries from termination through math were re-aligned; no wording changes)
          - mystic.mystic.abstract_solver        [the solver API definition]
          - mystic.mystic.abstract_map_solver    [the parallel solver API]
          - mystic.mystic.abstract_nested_solver [the nested solver API]
          - mystic.mystic.termination           [solver termination conditions]
          - mystic.mystic.strategy              [solver population mutation strategies]
          - mystic.models.abstract_model        [the model API definition]
          - mystic.models.forward_model         [cost function generator]
          - mystic.mystic.tools                 [monitors, function wrappers, and other tools]
          - mystic.mystic.math                  [some useful mathematical functions and tools]

      Solvers are found here::
releases/mystic-0.2a1/mystic/abstract_map_solver.py
(r240 → r243)

      The default map API settings are provided within mystic, while
      distributed and high-performance computing mappers and launchers
-     can be obtained within the "pathos" package, found here:
-         - http://dev.danse.us/trac/pathos (see subpackage = pyina)
+     can be obtained within the "pathos" package, found here::
+         - http://dev.danse.us/trac/pathos
  ...
      >>> # the function to be minimized and the initial values
      >>> from mystic.models import rosen
-     >>> x0 = [0.8, 1.2, 0.7]
+     >>> lb = [0.0, 0.0, 0.0]
+     >>> ub = [2.0, 2.0, 2.0]
      >>>
      >>> # get monitors and termination condition objects
  ...
      >>> from pyina.ez_map import ez_map2
      >>> NNODES = 4
+     >>> npts = 20
      >>>
      >>> # instantiate and configure the solver
-     >>> from mystic.scipy_optimize import NelderMeadSimplexMapSolver
-     >>> solver = NelderMeadSimplexMapSolver(len(x0))
-     >>> solver.SetInitialPoints(x0) #FIXME: use batchgrid w/ bounds
+     >>> from mystic.nested import ScattershotSolver
+     >>> solver = ScattershotSolver(len(lb), npts)
      >>> solver.SetMapper(ez_map2, equalportion_mapper)
      >>> solver.SetLauncher(mpirun_launcher, NNODES)
  ...
      Additional inputs:
          npop -- size of the trial solution population. [default = 1]

      Important class members:
  ...
          generations - an iteration counter.
          bestEnergy - current best energy.
          bestSolution - current best parameter set. [size = dim]
          popEnergy - set of all trial energy solutions. [size = npop]
-         population - set of all trial parameter solutions.
-             [size = dim*npop]
-         energy_history - history of bestEnergy status.
-             [equivalent to StepMonitor]
-         signal_handler - catches the interrupt signal.
-             [***disabled***]
+         population - set of all trial parameter solutions. [size = dim*npop]
+         energy_history - history of bestEnergy status. [equivalent to StepMonitor]
+         signal_handler - catches the interrupt signal. [***disabled***]
          """
          super(AbstractMapSolver, self).__init__(dim, **kwds)
releases/mystic-0.2a1/mystic/abstract_nested_solver.py
(r240 → r243)

      The default map API settings are provided within mystic, while
      distributed and high-performance computing mappers and launchers
-     can be obtained within the "pathos" package, found here:
-         - http://dev.danse.us/trac/pathos (see subpackage = pyina)
+     can be obtained within the "pathos" package, found here::
+         - http://dev.danse.us/trac/pathos
  ...
      Additional inputs:
          npop -- size of the trial solution population. [default = 1]
          nbins -- tuple of number of bins in each dimension. [default = [1]*dim]
          npts -- number of solver instances. [default = 1]

      Important class members:
  ...
          generations - an iteration counter.
          bestEnergy - current best energy.
          bestSolution - current best parameter set. [size = dim]
          popEnergy - set of all trial energy solutions. [size = npop]
-         population - set of all trial parameter solutions.
-             [size = dim*npop]
-         energy_history - history of bestEnergy status.
-             [equivalent to StepMonitor]
-         signal_handler - catches the interrupt signal.
-             [***disabled***]
+         population - set of all trial parameter solutions. [size = dim*npop]
+         energy_history - history of bestEnergy status. [equivalent to StepMonitor]
+         signal_handler - catches the interrupt signal. [***disabled***]
          """
          super(AbstractNestedSolver, self).__init__(dim, **kwds)
releases/mystic-0.2a1/mystic/abstract_solver.py
(r234 → r243)

      Additional inputs:
          npop -- size of the trial solution population. [default = 1]

      Important class members:
  ...
          generations - an iteration counter.
          bestEnergy - current best energy.
          bestSolution - current best parameter set. [size = dim]
          popEnergy - set of all trial energy solutions. [size = npop]
-         population - set of all trial parameter solutions.
-             [size = dim*npop]
-         energy_history - history of bestEnergy status.
-             [equivalent to StepMonitor]
+         population - set of all trial parameter solutions. [size = dim*npop]
+         energy_history - history of bestEnergy status. [equivalent to StepMonitor]
          signal_handler - catches the interrupt signal.
          """
releases/mystic-0.2a1/mystic/differential_evolution.py
(r233 → r243)

      Solver for your own objective function can be found on R. Storn's
      web page (http://www.icsi.berkeley.edu/~storn/code.html), and is
-     reproduced here:
+     reproduced here::
  (the quoted guidelines below were re-indented as a literal block; apart from
   that, only the one line marked with -/+ changed in wording)

      First try the following classical settings for the solver configuration:
      Choose a crossover strategy (e.g. Rand1Bin), set the number of parents
      NP to 10 times the number of parameters, select ScalingFactor=0.8, and
      CrossProbability=0.9.

      It has been found recently that selecting ScalingFactor from the interval
      [0.5, 1.0] randomly for each generation or for each difference vector,
      a technique called dither, improves convergence behaviour significantly,
      especially for noisy objective functions.

      It has also been found that setting CrossProbability to a low value,
      e.g. CrossProbability=0.2 helps optimizing separable functions since
      it fosters the search along the coordinate axes. On the contrary,
      this choice is not effective if parameter dependence is encountered,
      something which is frequently occuring in real-world optimization
      problems rather than artificial test functions. So for parameter
      dependence the choice of CrossProbability=0.9 is more appropriate.

      Another interesting empirical finding is that rasing NP above, say, 40
      does not substantially improve the convergence, independent of the
      number of parameters. It is worthwhile to experiment with these suggestions.

      Make sure that you initialize your parameter vectors by exploiting
      their full numerical range, i.e. if a parameter is allowed to exhibit
      values in the range [-100, 100] it's a good idea to pick the initial
      values from this range instead of unnecessarily restricting diversity.

      Keep in mind that different problems often require different settings
      for NP, ScalingFactor and CrossProbability (see Ref 1, 2). If you
      experience misconvergence, you typically can increase the value for NP,
      but often you only have to adjust ScalingFactor to be a little lower or
      higher than 0.8. If you increase NP and simultaneously lower ScalingFactor
      a little, convergence is more likely to occur but generally takes longer,
      i.e. DE is getting more robust (a convergence speed/robustness tradeoff).

      If you still get misconvergence you might want to instead try a different
      crossover strategy. The most commonly used are Rand1Bin, Rand1Exp,
      Best1Bin, and Best1Exp. The crossover strategy is not so important a
      choice, although K. Price claims that binomial (Bin) is never worse than
      exponential (Exp).

-     In case of continued misconvergence, check yourchoice of objective function.
+     In case of continued misconvergence, check the choice of objective function.
      There might be a better one to describe your problem. Any knowledge that
      you have about the problem should be worked into the objective function.
      A good objective function can make all the difference.

      See `mystic.examples.test_rosenbrock` for an example of using
  ...
      args -- extra arguments for func.
-     bounds -- list - n pairs of bounds (min,max), one pair for each
-         parameter.
-     ftol -- number - acceptable relative error in func(xopt) for
-         convergence.
-     gtol -- number - maximum number of iterations to run without
-         improvement.
+     bounds -- list - n pairs of bounds (min,max), one pair for each parameter.
+     ftol -- number - acceptable relative error in func(xopt) for convergence.
+     gtol -- number - maximum number of iterations to run without improvement.
      maxiter -- number - the maximum number of iterations to perform.
      maxfun -- number - the maximum number of function evaluations.
      cross -- number - the probability of cross-parameter mutations
-     scale -- number - multiplier for impact of mutations on trial
-         solution.
-     full_output -- number - non-zero if fval and warnflag outputs are
-         desired.
+     scale -- number - multiplier for impact of mutations on trial solution.
+     full_output -- number - non-zero if fval and warnflag outputs are desired.
      disp -- number - non-zero to print convergence messages.
-     retall -- number - non-zero to return list of solutions at each
-         iteration.
+     retall -- number - non-zero to return list of solutions at each iteration.
      callback -- an optional user-supplied function to call after each
          iteration. It is called as callback(xk), where xk is the
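The classical settings quoted in the docstring above translate roughly into the configuration below. This is a sketch only: it assumes the 0.2a1-era API (DifferentialEvolutionSolver, the Rand1Bin strategy, a ChangeOverGeneration termination, and the CrossProbability/ScalingFactor keywords to Solve), so the exact names may differ in other releases.

    # Sketch: Storn's "classical settings" (NP = 10*dim, F = 0.8, CR = 0.9, Rand1Bin)
    # applied to mystic's DE solver. Solver, strategy, and termination names are
    # assumed from the 0.2a1-era API referenced by this changeset.
    from mystic.differential_evolution import DifferentialEvolutionSolver
    from mystic.termination import ChangeOverGeneration
    from mystic.strategy import Rand1Bin
    from mystic.models import rosen

    ndim = 3                        # number of parameters
    npop = 10 * ndim                # NP: ten times the number of parameters

    solver = DifferentialEvolutionSolver(ndim, npop)
    # initialize over the full allowed range, as the guidelines above recommend
    solver.SetRandomInitialPoints(min=[-100.0]*ndim, max=[100.0]*ndim)
    solver.Solve(rosen, ChangeOverGeneration(generations=100),
                 strategy=Rand1Bin,        # classical crossover strategy
                 CrossProbability=0.9,     # drop toward 0.2 for separable problems
                 ScalingFactor=0.8)        # dither in [0.5, 1.0] can also help

    print(solver.Solution())        # best parameter set found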
releases/mystic-0.2a1/mystic/nested.py
(r236 → r243)

      """
-     ...
+     Solvers
+     =======
+
+     This module contains a collection of optimization that use map-reduce
+     to distribute several optimizer instances over parameter space. Each
+     solver accepts a imported solver object as the "nested" solver, which
+     becomes the target of the map function.
+
+     The set of solvers built on mystic's AbstractNestdSolver are::
+         BatchGridSolver -- start from center of N grid points
+         ScattershotSolver -- start from N random points in parameter space
+
+
+     Usage
+     =====
+
+     See `mystic.examples.scattershot_example06` for an example of using
+     ScattershotSolver. See `mystic.examples.batchgrid_example06`
+     or an example of using BatchGridSolver.
+
+     All solvers included in this module provide the standard signal handling.
+     For more information, see `mystic.mystic.abstract_solver`.
      """
      __all__ = ['BatchGridSolver','ScattershotSolver']
  ...
  class BatchGridSolver(AbstractNestedSolver):
      """
-     ...
+     parallel mapped optimization starting from the center of N grid points
      """
      def __init__(self, dim, nbins):
  ...
          EvaluationMonitor=Null, StepMonitor=Null, ExtraArgs=(), **kwds):
          """Minimize a function using batch grid optimization.
-     ...
+
+     Description:
+
+         Uses parallel mapping of solvers on a regular grid to find the
+         minimum of a function of one or more variables.
+
+     Inputs:
+
+         cost -- the Python function or method to be minimized.
+         termination -- callable object providing termination conditions.
+
+     Additional Inputs:
+
+         sigint_callback -- callback function for signal handler.
+         EvaluationMonitor -- a callable object that will be passed x, fval
+             whenever the cost function is evaluated.
+         StepMonitor -- a callable object that will be passed x, fval
+             after the end of a solver iteration.
+         ExtraArgs -- extra arguments for cost.
+
+     Further Inputs:
+
+         callback -- an optional user-supplied function to call after each
+             iteration. It is called as callback(xk), where xk is the
+             current parameter vector. [default = None]
+         disp -- non-zero to print convergence messages. [default = 0]
          """
          #allow for inputs that don't conform to AbstractSolver interface
  ...
  class ScattershotSolver(AbstractNestedSolver):
      """
-     ...
+     parallel mapped optimization starting from the N random points
      """
      def __init__(self, dim, npts):
  ...
          EvaluationMonitor=Null, StepMonitor=Null, ExtraArgs=(), **kwds):
          """Minimize a function using scattershot optimization.
-     ...
+
+     Description:
+
+         Uses parallel mapping of solvers on randomly selected points
+         to find the minimum of a function of one or more variables.
+
+     Inputs:
+
+         cost -- the Python function or method to be minimized.
+         termination -- callable object providing termination conditions.
+
+     Additional Inputs:
+
+         sigint_callback -- callback function for signal handler.
+         EvaluationMonitor -- a callable object that will be passed x, fval
+             whenever the cost function is evaluated.
+         StepMonitor -- a callable object that will be passed x, fval
+             after the end of a solver iteration.
+         ExtraArgs -- extra arguments for cost.
+
+     Further Inputs:
+
+         callback -- an optional user-supplied function to call after each
+             iteration. It is called as callback(xk), where xk is the
+             current parameter vector. [default = None]
+         disp -- non-zero to print convergence messages. [default = 0]
          """
          #allow for inputs that don't conform to AbstractSolver interface
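Read together, the new docstrings above suggest roughly the following usage. This is a sketch only: the ScattershotSolver(dim, npts) constructor and the Solve(cost, termination) call come from this hunk, while the bounds-setting call (SetRandomInitialPoints), the termination condition, and reliance on the default serial map are assumptions not pinned down here.

    # Sketch: scattershot optimization of rosen, per the docstrings above.
    # ScattershotSolver(dim, npts) and Solve(cost, termination) appear in this
    # hunk; the bounds call and termination condition below are assumptions.
    from mystic.nested import ScattershotSolver
    from mystic.termination import CandidateRelativeTolerance as CRT
    from mystic.models import rosen

    lb, ub = [0.0, 0.0, 0.0], [2.0, 2.0, 2.0]   # parameter-space bounds
    npts = 20                                   # number of solver instances

    solver = ScattershotSolver(len(lb), npts)
    solver.SetRandomInitialPoints(min=lb, max=ub)   # scatter the starting points
    solver.Solve(rosen, CRT(), disp=1)              # default (serial) map is used

    print(solver.bestSolution)                      # best of the npts instances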
releases/mystic-0.2a1/mystic/python_map.py
(r226 → r243)

      Defaults for mapper and launcher. These should be
      available as a minimal (dependency-free) pure-python
-     install from pyina.
-
-     serial_launcher: syntax for standard python execution
-     python_map: wrapper around the standard python map
-     carddealer_mapper: the carddealer map strategy
+     install from pathos::
+         - serial_launcher: syntax for standard python execution
+         - python_map: wrapper around the standard python map
+         - carddealer_mapper: the carddealer map strategy
      """
  ...
      NOTES:
-         -run non-python commands with: {'python':'', ...}
+         run non-python commands with: {'python':'', ...}
      """
      mydict = defaults.copy()
  ...
      Further Input: [***disabled***]
-         -nnodes -- the number of parallel nodes
-         -launcher -- the launcher object
-         -mapper -- the mapper object
-         -timelimit -- string representation of maximum run time (e.g. '00:02')
-         -queue -- string name of selected queue (e.g. 'normal')
+         nnodes -- the number of parallel nodes
+         launcher -- the launcher object
+         mapper -- the mapper object
+         timelimit -- string representation of maximum run time (e.g. '00:02')
+         queue -- string name of selected queue (e.g. 'normal')
      """
      #print "ignoring: %s" % kwds #XXX: should allow use of **kwds
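python_map is described above as the dependency-free stand-in for the parallel mappers. The sketch below assumes it mirrors the builtin map signature, which the docstring implies but this hunk does not spell out.

    # Sketch: the default, dependency-free mapper described above.
    # Assumes python_map(func, *iterables) behaves like the builtin map.
    from mystic.python_map import python_map

    def cost(x):
        # a trivial cost function: sum of squares
        return sum(xi*xi for xi in x)

    candidates = [[0.0, 0.0], [1.0, 2.0], [3.0, 4.0]]
    results = python_map(cost, candidates)   # same call shape as map(cost, candidates)
    print(results)                           # expected: [0.0, 5.0, 25.0]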
releases/mystic-0.2a1/mystic/scipy_optimize.py
(r219 → r243)

      callback -- an optional user-supplied function to call after each
          iteration. It is called as callback(xk), where xk is the
          current parameter vector. [default = None]
      disp -- non-zero to print convergence messages. [default = 0]
-     radius -- percentage change for initial simplex values.
-         [default = 0.05]
+     radius -- percentage change for initial simplex values. [default = 0.05]
releases/mystic-0.2a1/mystic/tools.py
(r225 → r243)

      generate a custom Sow

-     takes *args & **kwds, where args will be required inputs for the Sow
+     takes *args & **kwds, where args will be required inputs for the Sow::
          - args: property name strings (i.e. 'x')
          - kwds: must be in the form: property="doc" (i.e. x='Params')
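The CustomSow docstring above implies a monitor whose recorded properties are named by the caller. The sketch below follows the documented args/kwds form; the recording calls and attribute access shown are assumptions based on how mystic's standard Sow monitors are typically used, not behaviour confirmed by this hunk.

    # Sketch: a custom Sow (monitor), following the docstring above.
    # CustomSow('x', 'y', x='Params', y='Costs') mirrors the documented form;
    # the recording calls and .x/.y reads below are assumptions.
    from mystic.tools import CustomSow

    my_monitor = CustomSow('x', 'y', x='Params', y='Costs')
    my_monitor(1.0, 2.0)       # record one (x, y) pair
    my_monitor(3.0, 4.0)       # record another

    print(my_monitor.x)        # accumulated 'Params' values
    print(my_monitor.y)        # accumulated 'Costs' values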