ZOOpt is a Python package for zeroth-order optimization.
Zeroth-order optimization (also known as derivative-free or black-box optimization) does not rely on the gradient of the objective function; instead, it learns from samples of the search space. It is suitable for optimizing functions that are non-differentiable, have many local minima, or are even unknown in form and can only be evaluated by testing.
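To make the sampling idea concrete, below is a minimal sketch (illustrative code, not part of ZOOpt) of the simplest zeroth-order method, pure random search. It queries the objective only at sampled points and never touches a gradient; ZOOpt's algorithms are far more sample-efficient, but they interact with the objective in the same query-only way.

import random

def random_search(f, dim, lower, upper, budget):
    # Sample `budget` points uniformly from the box [lower, upper]^dim
    # and keep the best objective value seen so far.
    best_x, best_value = None, float('inf')
    for _ in range(budget):
        x = [random.uniform(lower, upper) for _ in range(dim)]
        value = f(x)  # the only access to f is evaluating it at a point
        if value < best_value:
            best_x, best_value = x, value
    return best_x, best_value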
ZOOpt implements some state-of-the-art zeroth-order optimization methods and their parallel versions. Users only need to add several keywords to use parallel optimization on a single machine. For large-scale distributed optimization across multiple machines, please refer to Distributed ZOOpt.
Citation: Yu-Ren Liu, Yi-Qi Hu, Hong Qian, Yang Yu, Chao Qian. ZOOpt: Toolbox for Derivative-Free Optimization. SCIENCE CHINA Information Sciences, 2022. CoRR abs/1801.00329. (The features described in this article are from version 0.2.)
ZOOpt is distributed on PyPI and can be installed with pip:
$ pip install zoopt
Alternatively, to install ZOOpt from source, download this project and run the following commands in sequence in your terminal/command line:
$ python setup.py build
$ python setup.py install
We define the Ackley function for minimization (note that this function works for an arbitrary number of dimensions, which is determined by the solution):
import numpy as np

def ackley(solution):
    # shifted Ackley function: the global minimum value 0 is at x_i = bias
    x = solution.get_x()
    bias = 0.2
    value = -20 * np.exp(-0.2 * np.sqrt(sum([(i - bias) * (i - bias) for i in x]) / len(x))) - \
        np.exp(sum([np.cos(2.0 * np.pi * (i - bias)) for i in x]) / len(x)) + 20.0 + np.e
    return value
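Written out, the code above implements the shifted Ackley function with bias 0.2:

f(x) = -20 \exp\left(-0.2 \sqrt{\tfrac{1}{n} \sum_{i=1}^{n} (x_i - 0.2)^2}\right) - \exp\left(\tfrac{1}{n} \sum_{i=1}^{n} \cos\bigl(2\pi (x_i - 0.2)\bigr)\right) + 20 + e,

so the global minimum value 0 is attained at x_i = 0.2 for every i.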
The Ackley function is a classical benchmark function with many local minima; in two dimensions its landscape is highly rugged (see the illustration on Wikipedia).
Then, use ZOOpt to optimize a 100-dimensional Ackley function:
from zoopt import Dimension, ValueType, Dimension2, Objective, Parameter, Opt, ExpOpt

dim_size = 100  # dimension size
# each dimension is continuous (True) within the region [-1, 1]
dim = Dimension(dim_size, [[-1, 1]] * dim_size, [True] * dim_size)
# dim = Dimension2([(ValueType.CONTINUOUS, [-1, 1], 1e-6)] * dim_size)  # equivalent definition
obj = Objective(ackley, dim)
# perform optimization; the budget is the number of function evaluations
solution = Opt.min(obj, Parameter(budget=100 * dim_size))
# print the solution
print(solution.get_x(), solution.get_value())
# parallel optimization for time-consuming tasks
solution = Opt.min(obj, Parameter(budget=100 * dim_size, parallel=True, server_num=3))
The optimization finishes within a few seconds. Then, we can visualize the optimization progress:
import matplotlib.pyplot as plt
plt.plot(obj.get_history_bestsofar())
plt.savefig('figure.png')
which plots the best-so-far objective value against the number of function evaluations.
We can also use ExpOpt to repeat the optimization for performance analysis. It calculates the mean and standard deviation of multiple optimization runs and automatically visualizes the optimization progress.
solution_list = ExpOpt.min(obj, Parameter(budget=100*dim_size), repeat=3, plot=True, plot_file="progress.png")
for solution in solution_list:
    print(solution.get_x(), solution.get_value())
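The commented-out Dimension2 line in the first example above defines the same continuous space in a per-dimension tuple format; Dimension2 can also mix value types within one search space. The following is a hedged sketch assuming the tuple formats (ValueType.CONTINUOUS, [lb, ub], float_precision) and (ValueType.DISCRETE, [lb, ub], has_partial_order) from the ZOOpt documentation; verify them against your installed version.

from zoopt import Dimension2, ValueType, Objective, Parameter, Opt

# Assumed mixed search space: 99 continuous dimensions in [-1, 1] plus
# one integer dimension in [-1, 1]. The DISCRETE tuple format is an
# assumption taken from the ZOOpt documentation.
mixed_dim = Dimension2(
    [(ValueType.CONTINUOUS, [-1, 1], 1e-6)] * 99
    + [(ValueType.DISCRETE, [-1, 1], False)]
)
obj_mixed = Objective(ackley, mixed_dim)
solution = Opt.min(obj_mixed, Parameter(budget=10000))
print(solution.get_x(), solution.get_value())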
More examples are available in the Examples section.