Optimize

Multistart optimization with support for various optimizers.

class pypesto.optimize.DlibOptimizer(method: str, options: Dict = None)

Bases: pypesto.optimize.optimizer.Optimizer

Use the Dlib toolbox for optimization.

__init__(method: str, options: Dict = None)

Default constructor.

static get_default_options()

Create default options specific for the optimizer.

is_least_squares()
minimize(problem, x0, id, history_options=None)
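
As a rough usage sketch (hedged: dlib and its Python bindings must be installed, and the method string and 'maxiter' option below are illustrative placeholders rather than documented values):

    import pypesto.optimize as optimize

    # Hypothetical construction; 'default' and 'maxiter' are placeholders,
    # not documented values -- see get_default_options() for the real defaults.
    optimizer = optimize.DlibOptimizer(
        method='default',
        options={'maxiter': 100},
    )
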
class pypesto.optimize.OptimizeOptions(startpoint_resample: bool = False, allow_failed_starts: bool = True)

Bases: dict

Options for the multistart optimization.

Parameters:
  • startpoint_resample – Flag indicating whether initial points should be resampled if the function evaluation fails at the initial point.
  • allow_failed_starts (bool, optional) – Flag indicating whether exceptions thrown during the minimization process are tolerated.

__init__(startpoint_resample: bool = False, allow_failed_starts: bool = True)

Initialize self. See help(type(self)) for accurate signature.

static assert_instance(maybe_options: Union[OptimizeOptions, Dict]) → pypesto.optimize.options.OptimizeOptions

Returns a valid options object.

Parameters: maybe_options (OptimizeOptions or dict) – The options object or dict to be converted into a valid OptimizeOptions instance.
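
A short sketch of constructing the options, either directly or from a plain dict via assert_instance (assuming, as the signature suggests, that dict keys matching the constructor arguments are accepted):

    import pypesto.optimize as optimize

    # Construct explicitly ...
    options = optimize.OptimizeOptions(
        startpoint_resample=True,
        allow_failed_starts=False,
    )

    # ... or convert a plain dict into a valid OptimizeOptions object.
    options = optimize.OptimizeOptions.assert_instance(
        {'startpoint_resample': True, 'allow_failed_starts': False}
    )

    # OptimizeOptions subclasses dict, and entries are also exposed as
    # attributes via __getattr__.
    print(options.allow_failed_starts)
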
class pypesto.optimize.Optimizer

Bases: abc.ABC

This is the optimizer base class, not functional on its own.

An optimizer takes a problem, and possibly a start point, and then performs an optimization. It returns an OptimizerResult.

__init__()

Default constructor.

static get_default_options()

Create default options specific for the optimizer.

is_least_squares()
minimize(problem, x0, id, history_options=None)
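
The abstract methods a subclass has to implement are is_least_squares() and minimize(). The following toy random-search optimizer is a minimal sketch only: it assumes the objective can be called as problem.objective(x) to obtain a function value, and it does not reproduce the bookkeeping (timing, history) that the built-in optimizers add around minimize():

    import numpy as np
    import pypesto.optimize as optimize

    class RandomSearchOptimizer(optimize.Optimizer):
        """Toy optimizer: sample random points within the bounds, keep the best."""

        def __init__(self, n_samples: int = 100):
            super().__init__()
            self.n_samples = n_samples

        def is_least_squares(self):
            # Works on the scalar objective value, not on residuals.
            return False

        def minimize(self, problem, x0, id, history_options=None):
            lb, ub = np.asarray(problem.lb), np.asarray(problem.ub)
            best_x = np.asarray(x0)
            best_fval = problem.objective(best_x)
            for _ in range(self.n_samples):
                x = lb + (ub - lb) * np.random.rand(len(lb))
                fval = problem.objective(x)
                if fval < best_fval:
                    best_x, best_fval = x, fval
            return optimize.OptimizerResult(
                id=id, x=best_x, fval=best_fval,
                x0=np.asarray(x0), n_fval=self.n_samples + 1,
            )
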
class pypesto.optimize.OptimizerResult(id: str = None, x: numpy.ndarray = None, fval: float = None, grad: numpy.ndarray = None, hess: numpy.ndarray = None, res: numpy.ndarray = None, sres: numpy.ndarray = None, n_fval: int = None, n_grad: int = None, n_hess: int = None, n_res: int = None, n_sres: int = None, x0: numpy.ndarray = None, fval0: float = None, history: pypesto.objective.history.History = None, exitflag: int = None, time: float = None, message: str = None)

Bases: dict

The result of an optimizer run. Used as a standardized return value to map from the individual result objects returned by the employed optimizers to the format understood by pypesto.

Can be used like a dict.

id

Id of the optimizer run. Usually the start index.

x

The best found parameters.

fval

The best found function value, fun(x).

grad

The gradient at x.

hess

The Hessian at x.

res

The residuals at x.

sres

The residual sensitivities at x.

n_fval

Number of function evaluations.

n_grad

Number of gradient evaluations.

n_hess

Number of Hessian evaluations.

n_res

Number of residual evaluations.

n_sres

Number of residual sensitivity evaluations.

x0

The starting parameters.

fval0

The starting function value, fun(x0).

history

Objective history.

exitflag

The exitflag of the optimizer.

time

Execution time.

message

Textual comment on the optimization result.

Type: str

Notes

Any field not supported by the optimizer is filled with None.
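
Since OptimizerResult subclasses dict, its entries can be read both as dictionary keys and as attributes. A small illustration with made-up values:

    import numpy as np
    import pypesto.optimize as optimize

    result = optimize.OptimizerResult(
        id='0',
        x=np.array([1.0, 1.0]),
        fval=0.0,
        n_fval=250,
        exitflag=0,
        message='converged (illustrative values)',
    )

    # Dict-style and attribute-style access are equivalent.
    assert result['fval'] == result.fval

    # Fields the optimizer did not provide remain None.
    print(result.hess)  # None
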

__init__(id: str = None, x: numpy.ndarray = None, fval: float = None, grad: numpy.ndarray = None, hess: numpy.ndarray = None, res: numpy.ndarray = None, sres: numpy.ndarray = None, n_fval: int = None, n_grad: int = None, n_hess: int = None, n_res: int = None, n_sres: int = None, x0: numpy.ndarray = None, fval0: float = None, history: pypesto.objective.history.History = None, exitflag: int = None, time: float = None, message: str = None)

Initialize self. See help(type(self)) for accurate signature.

class pypesto.optimize.PyswarmOptimizer(options: Dict = None)

Bases: pypesto.optimize.optimizer.Optimizer

Global optimization using pyswarm.

__init__(options: Dict = None)

Default constructor.

static get_default_options()

Create default options specific for the optimizer.

is_least_squares()
minimize(problem, x0, id, history_options=None)
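
A construction sketch (hedged: the pyswarm package must be installed, and the option keys 'swarmsize' and 'maxiter' are assumed to be forwarded to pyswarm's particle swarm routine):

    import pypesto.optimize as optimize

    # Option values are illustrative; see the class defaults for what is
    # actually used if no options are given.
    optimizer = optimize.PyswarmOptimizer(options={'swarmsize': 50, 'maxiter': 200})
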
class pypesto.optimize.ScipyOptimizer(method: str = 'L-BFGS-B', tol: float = 1e-09, options: Dict = None)

Bases: pypesto.optimize.optimizer.Optimizer

Use the SciPy optimizers.

__init__(method: str = 'L-BFGS-B', tol: float = 1e-09, options: Dict = None)

Default constructor.

static get_default_options()

Create default options specific for the optimizer.

is_least_squares()
minimize(problem, x0, id, history_options=None)
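
A construction sketch; the method string is passed to SciPy, and the options dict is assumed to be forwarded to the selected SciPy solver (the 'ls_trf' least-squares method name is an assumption that may differ between pypesto versions):

    import pypesto.optimize as optimize

    # Gradient-based local optimization via scipy.optimize.minimize.
    optimizer = optimize.ScipyOptimizer(
        method='L-BFGS-B',
        tol=1e-9,
        options={'maxiter': 1000},
    )

    # Least-squares optimization is selected via the method string
    # (assumed name; dispatches to scipy.optimize.least_squares).
    ls_optimizer = optimize.ScipyOptimizer(method='ls_trf')
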
pypesto.optimize.minimize(problem: pypesto.problem.Problem, optimizer: pypesto.optimize.optimizer.Optimizer = None, n_starts: int = 100, ids: Iterable[str] = None, startpoint_method: Union[Callable, bool] = None, result: pypesto.result.Result = None, engine: pypesto.engine.base.Engine = None, options: pypesto.optimize.options.OptimizeOptions = None, history_options: pypesto.objective.history.HistoryOptions = None) → pypesto.result.Result

This is the main function for performing multistart optimization.

Parameters:
  • problem – The problem to be solved.
  • optimizer – The optimizer to be used n_starts times.
  • n_starts – Number of starts of the optimizer.
  • ids – Ids assigned to the startpoints.
  • startpoint_method – Method for choosing start points. False means the optimizer does not require start points, e.g. for the ‘pso’ method in ‘GlobalOptimizer’.
  • result – A result object to append the optimization results to. For example, one might append more runs to a previous optimization. If None, a new object is created.
  • engine – Parallelization engine. Defaults to sequential execution on a SingleCoreEngine.
  • options – Various options applied to the multistart optimization.
  • history_options – Optimizer history options.
Returns:

Result object containing the results of all multistarts in result.optimize_result.

Return type:

pypesto.result.Result
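
A minimal end-to-end sketch of a multistart optimization on the Rosenbrock function (the accessor used to read off the best start, result.optimize_result.list, is an assumption and may differ between pypesto versions):

    import numpy as np
    import scipy.optimize as so

    import pypesto
    import pypesto.optimize as optimize

    # 2-D Rosenbrock objective with analytical gradient.
    objective = pypesto.Objective(fun=so.rosen, grad=so.rosen_der)

    # Box-constrained problem; the bounds are chosen for illustration.
    dim = 2
    problem = pypesto.Problem(
        objective=objective,
        lb=-5 * np.ones(dim),
        ub=5 * np.ones(dim),
    )

    result = optimize.minimize(
        problem=problem,
        optimizer=optimize.ScipyOptimizer(method='L-BFGS-B'),
        n_starts=10,
        options=optimize.OptimizeOptions(allow_failed_starts=True),
    )

    # The individual OptimizerResult objects are collected, sorted by fval,
    # in result.optimize_result.
    best = result.optimize_result.list[0]
    print(best.fval, best.x)
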