Profile
class pypesto.profile.ProfileOptions(default_step_size: float = 0.01, min_step_size: float = 0.001, max_step_size: float = 1.0, step_size_factor: float = 1.25, delta_ratio_max: float = 0.1, ratio_min: float = 0.145, reg_points: int = 10, reg_order: int = 4, magic_factor_obj_value: float = 0.5)

Bases: dict

Options for optimization-based profiling.
Parameters:
- default_step_size – default step size of the profiling routine along the profile path (adaptive step-length algorithms use this only as a first guess and then refine the update)
- min_step_size – lower bound for the step size in adaptive methods
- max_step_size – upper bound for the step size in adaptive methods
- step_size_factor – adaptive methods recompute the likelihood at the predicted point and try to find a good step length by a line-search-like algorithm; this factor controls step handling in that line search
- delta_ratio_max – maximum allowed drop of the posterior ratio between two profile steps
- ratio_min – lower bound for the likelihood ratio of the profile, based on the inverse chi2 distribution; the default corresponds to 95% confidence
- reg_points – number of profile points used for regression in the regression-based adaptive profile point proposal
- reg_order – maximum degree of the regression polynomial used in the regression-based adaptive profile point proposal
- magic_factor_obj_value – a factor, inherited from older profiling code, that slows down profiling at small ratios (must be >= 0 and < 1)
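As a rough check of how the ratio_min default relates to a confidence level: a likelihood-ratio threshold for confidence level α can be obtained as exp(-q/2), where q is the α-quantile of the chi2 distribution with one degree of freedom. A minimal sketch, with the 95% quantile (≈ 3.841) hardcoded to avoid a scipy dependency:

```python
import math

# 95% quantile of the chi2 distribution with 1 degree of freedom
CHI2_95_DF1 = 3.841

# likelihood-ratio threshold corresponding to a 95% confidence level
ratio_min = math.exp(-CHI2_95_DF1 / 2)
# close to the documented default of 0.145
```
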
- __getattr__(key)
- __init__(default_step_size: float = 0.01, min_step_size: float = 0.001, max_step_size: float = 1.0, step_size_factor: float = 1.25, delta_ratio_max: float = 0.1, ratio_min: float = 0.145, reg_points: int = 10, reg_order: int = 4, magic_factor_obj_value: float = 0.5)
- clear() → None. Remove all items from D.
- copy() → a shallow copy of D
- static create_instance(maybe_options: Union[ProfileOptions, Dict]) → pypesto.profile.profile.ProfileOptions – Returns a valid options object. Parameters: maybe_options (ProfileOptions or dict)
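A factory like create_instance typically passes an existing options object through unchanged and converts a plain dict into one. A hypothetical sketch of that pattern (Options is an illustrative stand-in, not pypesto's actual class):

```python
class Options(dict):
    """Stand-in for a dict-based options class such as ProfileOptions."""


def create_instance(maybe_options):
    # Already a valid options object: return it unchanged.
    if isinstance(maybe_options, Options):
        return maybe_options
    # Plain dict: convert it into an options object.
    return Options(maybe_options)


opts = create_instance({"default_step_size": 0.05})
same = create_instance(opts)  # pass-through, no copy
```
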
- fromkeys() – Returns a new dict with keys from iterable and values equal to value.
- get(k[, d]) → D[k] if k in D, else d. d defaults to None.
- items() → a set-like object providing a view on D's items
- keys() → a set-like object providing a view on D's keys
- pop(k[, d]) → v – remove the specified key and return the corresponding value. If the key is not found, d is returned if given, otherwise KeyError is raised.
- popitem() → (k, v) – remove and return some (key, value) pair as a 2-tuple; raise KeyError if D is empty.
- setdefault(k[, d]) → D.get(k, d), also set D[k] = d if k not in D
- update([E, ]**F) → None. Update D from dict/iterable E and F. If E has a .keys() method: for k in E: D[k] = E[k]; otherwise: for k, v in E: D[k] = v. In either case, this is followed by: for k in F: D[k] = F[k].
- values() → an object providing a view on D's values
class pypesto.profile.ProfilerResult(x_path, fval_path, ratio_path, gradnorm_path=None, exitflag_path=None, time_path=None, time_total=0.0, n_fval=0, n_grad=0, n_hess=0, message=None)

Bases: dict

The result of a profiler run. The standardized return value from pypesto.profile, which can be initialized either from an OptimizerResult or from an existing ProfilerResult (in order to extend the computation).

Can be used like a dict.
Attributes:
- x_path (ndarray) – The path of the best found parameters along the profile (dimension: n_par x n_profile_points).
- fval_path (ndarray) – The function values, fun(x), along the profile.
- ratio_path (ndarray) – The ratio of the posterior function along the profile.
- gradnorm_path (ndarray) – The gradient norm along the profile.
- exitflag_path (ndarray) – The exit flags of the optimizer along the profile.
- time_path (ndarray) – The computation time of the optimizer runs along the profile.
- time_total (ndarray) – The total computation time for the profile.
- n_fval (int) – Number of function evaluations.
- n_grad (int) – Number of gradient evaluations.
- n_hess (int) – Number of Hessian evaluations.
- message (str) – Textual comment on the profile result.

Notes

Any field not supported by the profiler or the profiling optimizer is filled with None. Some fields are filled by pypesto itself.
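ProfilerResult (like ProfileOptions) is a dict subclass whose attributes are backed by the dict entries, so result.n_fval and result["n_fval"] refer to the same value. A minimal sketch of that pattern, with an illustrative class name (not pypesto's implementation):

```python
class AttrDict(dict):
    """Dict whose items are also reachable as attributes."""

    # Attribute writes/deletes go straight to the dict storage.
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__

    def __getattr__(self, key):
        # Called only when normal attribute lookup fails.
        try:
            return self[key]
        except KeyError:
            raise AttributeError(key)


res = AttrDict(n_fval=0, message=None)
res.n_fval = 5  # attribute write lands in the dict
```
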
- __getattr__(key)
- __init__(x_path, fval_path, ratio_path, gradnorm_path=None, exitflag_path=None, time_path=None, time_total=0.0, n_fval=0, n_grad=0, n_hess=0, message=None)
- append_profile_point(x, fval, ratio, gradnorm=nan, exitflag=nan, time=nan, n_fval=0, n_grad=0, n_hess=0) – Appends a new OptimizerResult to an existing ProfilerResult.
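The effect of appending a profile point can be pictured as adding one column to x_path (which is n_par x n_profile_points) and one entry to each per-point array. An illustrative NumPy sketch (not pypesto's actual code; the values are made up):

```python
import numpy as np

# Existing profile: 2 parameters, 1 profile point so far
x_path = np.array([[1.0], [2.0]])  # shape (n_par, n_profile_points)
fval_path = np.array([10.0])

# Hypothetical new point from one optimizer run along the profile
x_new = np.array([1.1, 2.1])
fval_new = 10.3

# Append one column to x_path and one entry to fval_path
x_path = np.hstack((x_path, x_new[:, np.newaxis]))
fval_path = np.append(fval_path, fval_new)
```
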
- clear() → None. Remove all items from D.
- copy() → a shallow copy of D
- flip_profile() – Flips the profiling direction (left/right). The direction needs to be flipped once if the profile is new, and twice if appending to an existing profile.
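Flipping amounts to reversing the point order of every path array, e.g. left-to-right for the 2-D x_path. A sketch of the idea (illustrative values, not pypesto's code):

```python
import numpy as np

x_path = np.array([[1.0, 1.1, 1.2],
                   [2.0, 2.1, 2.2]])  # n_par x n_profile_points
fval_path = np.array([10.0, 10.3, 10.9])

# Reverse the profiling direction: the last point becomes the first
x_path = np.fliplr(x_path)
fval_path = fval_path[::-1]
```
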
- fromkeys() – Returns a new dict with keys from iterable and values equal to value.
- get(k[, d]) → D[k] if k in D, else d. d defaults to None.
- items() → a set-like object providing a view on D's items
- keys() → a set-like object providing a view on D's keys
- pop(k[, d]) → v – remove the specified key and return the corresponding value. If the key is not found, d is returned if given, otherwise KeyError is raised.
- popitem() → (k, v) – remove and return some (key, value) pair as a 2-tuple; raise KeyError if D is empty.
- setdefault(k[, d]) → D.get(k, d), also set D[k] = d if k not in D
- update([E, ]**F) → None. Update D from dict/iterable E and F. If E has a .keys() method: for k in E: D[k] = E[k]; otherwise: for k, v in E: D[k] = v. In either case, this is followed by: for k in F: D[k] = F[k].
- values() → an object providing a view on D's values
pypesto.profile.parameter_profile(problem: pypesto.problem.Problem, result: pypesto.result.Result, optimizer: pypesto.optimize.optimizer.Optimizer, profile_index: numpy.ndarray = None, profile_list: int = None, result_index: int = 0, next_guess_method: Callable = None, profile_options: pypesto.profile.profile.ProfileOptions = None) → pypesto.result.Result

The main function to call to perform parameter profiling.

Parameters:
- problem – The problem to be solved.
- result – A result object to initialize profiling and to append the profiling results to. For example, one might append more profiling runs to a previous profile in order to merge them. An existing optimization result is mandatory.
- optimizer – The optimizer to be used along each profile.
- profile_index – Array of parameter indices specifying whether a profile should be computed (1) or not (0). By default, all profiles are computed.
- profile_list – Integer specifying whether a call to the profiler should create a new list of profiles (default) or append to a specific existing profile list.
- result_index – Index of the optimization result from which profiling should start (default: the global optimum, i.e. index = 0).
- next_guess_method – Function handle to a method that creates the next starting point for optimization during profiling.
- profile_options – Various options applied to the profile optimization.

Returns: The profile results are filled into result.profile_result.
Return type: result
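The profile_index argument is a 0/1 array over the parameters. A sketch of selecting a subset of profiles, with the actual call commented out because problem, result, and optimizer are assumed to come from a prior pypesto optimization run:

```python
import numpy as np

# Profile only parameters 0 and 2 of a 4-parameter problem
profile_index = np.array([1, 0, 1, 0])

# Hypothetical call (assumes problem/result/optimizer from a prior run):
# result = pypesto.profile.parameter_profile(
#     problem=problem, result=result, optimizer=optimizer,
#     profile_index=profile_index)
```
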