Misc
OptimizerFunctions (tuple)

A named tuple with the fields `initialize`, `ask`, and `tell`, holding the three functions that make up a functional optimizer.
get_functional_optimizer(optimizer)
Get a tuple of optimizer-related functions from the given optimizer name.
For example, if the given string is "adam", the returned tuple will be
`(adam, adam_ask, adam_tell)`, where `adam` is the function that initializes
the Adam optimizer, `adam_ask` is the function that gets the current search
point as a tensor, and `adam_tell` is the function that expects the gradient
and returns the updated state of the Adam search after applying the given
gradient. In addition to "adam", the strings "clipup" and "sgd" are also
supported.

If the given optimizer is a 3-element tuple, the three elements within the
tuple are assumed to be the initialization, ask, and tell functions of a
custom optimizer, and those functions are returned in the same order.
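As a sketch of the custom-tuple form, the toy gradient-ascent optimizer below keeps the current search point itself as its state; the state layout and the fixed step size are purely illustrative and not part of the library:

```python
import torch

from evotorch.algorithms.functional.misc import get_functional_optimizer


# A toy gradient-ascent optimizer expressed as (initialize, ask, tell).
# Here, the optimizer state is simply the current search point itself.
def toy_init(center_init: torch.Tensor) -> torch.Tensor:
    # The initial state is a copy of the starting point
    return center_init.clone()


def toy_ask(state: torch.Tensor) -> torch.Tensor:
    # The current search point is the state itself
    return state


def toy_tell(state: torch.Tensor, gradient: torch.Tensor) -> torch.Tensor:
    # Follow the gradient with a fixed step size (for illustration only)
    return state + 0.1 * gradient


initialize, ask, tell = get_functional_optimizer((toy_init, toy_ask, toy_tell))
```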
Parameters:

Name | Type | Description | Default
---|---|---|---
`optimizer` | `Union[str, tuple]` | The optimizer name as a string, or a 3-element tuple representing the functions related to the optimizer. | required
Returns:

Type | Description
---|---
`tuple` | A 3-element tuple in the form `(optimizer, optimizer_ask, optimizer_tell)`, where each element is a function, the first one being responsible for initializing the optimizer and returning its first state.
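For the string form, a hedged usage sketch follows. The keyword names `center_init` and `follow_grad` are assumptions about the signatures of `adam` and `adam_tell` rather than something guaranteed by this helper; consult the documentation of `adam`, `adam_ask`, and `adam_tell` for the exact parameters:

```python
import torch

from evotorch.algorithms.functional.misc import get_functional_optimizer

initialize, ask, tell = get_functional_optimizer("adam")

# ASSUMPTION: the keyword names `center_init` and `follow_grad` below are
# illustrative; the true signatures belong to adam / adam_ask / adam_tell.
state = initialize(center_init=torch.zeros(5))

for _ in range(100):
    x = ask(state)                         # current search point as a tensor
    grad = -2.0 * x                        # gradient of the toy objective -(x ** 2).sum()
    state = tell(state, follow_grad=grad)  # apply the gradient, get the updated state
```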
Source code in evotorch/algorithms/functional/misc.py
```python
def get_functional_optimizer(optimizer: Union[str, tuple]) -> tuple:
    """
    Get a tuple of optimizer-related functions from the given optimizer name.

    For example, if the given string is "adam", the returned tuple will be
    `(adam, adam_ask, adam_tell)`, where
    [adam][evotorch.algorithms.functional.funcadam.adam]
    is the function that will initialize the Adam optimizer,
    [adam_ask][evotorch.algorithms.functional.funcadam.adam_ask]
    is the function that will get the current search point as a tensor, and
    [adam_tell][evotorch.algorithms.functional.funcadam.adam_tell]
    is the function that will expect the gradient and will return the updated
    state of the Adam search after applying the given gradient.

    In addition to "adam", the strings "clipup" and "sgd" are also supported.

    If the given optimizer is a 3-element tuple, then the three elements
    within the tuple are assumed to be the initialization, ask, and tell
    functions of a custom optimizer, and those functions are returned
    in the same order.

    Args:
        optimizer: The optimizer name as a string, or a 3-element tuple
            representing the functions related to the optimizer.
    Returns:
        A 3-element tuple in the form
        `(optimizer, optimizer_ask, optimizer_tell)`, where each element
        is a function, the first one being responsible for initializing
        the optimizer and returning its first state.
    """
    from .funcadam import adam, adam_ask, adam_tell
    from .funcclipup import clipup, clipup_ask, clipup_tell
    from .funcsgd import sgd, sgd_ask, sgd_tell

    if optimizer == "adam":
        return OptimizerFunctions(initialize=adam, ask=adam_ask, tell=adam_tell)
    elif optimizer == "clipup":
        return OptimizerFunctions(initialize=clipup, ask=clipup_ask, tell=clipup_tell)
    elif optimizer in ("sgd", "sga", "momentum"):
        return OptimizerFunctions(initialize=sgd, ask=sgd_ask, tell=sgd_tell)
    elif isinstance(optimizer, str):
        raise ValueError(f"Unrecognized functional optimizer name: {optimizer}")
    elif isinstance(optimizer, Iterable):
        a, b, c = optimizer
        return OptimizerFunctions(initialize=a, ask=b, tell=c)
    else:
        raise TypeError(
            f"`get_functional_optimizer(...)` received an unrecognized argument: {repr(optimizer)}"
            f" (of type {type(optimizer)})"
        )
```