ZOOpt


ZOOpt is a Python package for zeroth-order optimization.

Zeroth-order optimization (a.k.a. derivative-free optimization/black-box optimization) does not rely on the gradient of the objective function, but instead, learns from samples of the search space. It is suitable for optimizing functions that are nondifferentiable, with many local minima, or even unknown but only testable.

ZOOpt implements some state-of-the-art zeroth-order optimization methods and their parallel versions. Users only need to add several keywords to use parallel optimization on a single machine. For large-scale distributed optimization across multiple machines, please refer to Distributed ZOOpt.

Citation: Yu-Ren Liu, Yi-Qi Hu, Hong Qian, Yang Yu, Chao Qian. ZOOpt: Toolbox for Derivative-Free Optimization. SCIENCE CHINA Information Sciences, 2022. CoRR abs/1801.00329. (Features in this article are from version 0.2.)

Installation

ZOOpt is distributed on PyPI and can be installed with pip:

$ pip install zoopt

Alternatively, to install ZOOpt from source, download this project and run the following commands in order in your terminal/command line:

$ python setup.py build
$ python setup.py install

A simple example

We define the Ackley function for minimization (note that this function works for an arbitrary dimension, which is determined by the solution):

import numpy as np

def ackley(solution):
    # Ackley function shifted by bias = 0.2: the global minimum value 0
    # is attained at x_i = 0.2 in every coordinate.
    x = solution.get_x()
    bias = 0.2
    value = -20 * np.exp(-0.2 * np.sqrt(sum([(i - bias) * (i - bias) for i in x]) / len(x))) - \
            np.exp(sum([np.cos(2.0 * np.pi * (i - bias)) for i in x]) / len(x)) + 20.0 + np.e
    return value

The Ackley function is a classical function with many local minima. In two dimensions, it looks like the following (image from Wikipedia):

Ackley function

Then, use ZOOpt to minimize the 100-dimensional Ackley function:

from zoopt import Dimension, ValueType, Dimension2, Objective, Parameter, Opt, ExpOpt

dim_size = 100  # dimension size
dim = Dimension(dim_size, [[-1, 1]]*dim_size, [True]*dim_size)  # a [-1, 1] region per dimension; True marks it continuous
# dim = Dimension2([(ValueType.CONTINUOUS, [-1, 1], 1e-6)]*dim_size)  # equivalent form with an explicit precision
obj = Objective(ackley, dim)
# perform optimization
solution = Opt.min(obj, Parameter(budget=100*dim_size))
# print the solution
print(solution.get_x(), solution.get_value())
# parallel optimization for time-consuming tasks
solution = Opt.min(obj, Parameter(budget=100*dim_size, parallel=True, server_num=3))

After a few seconds, the optimization is done. Then, we can visualize the optimization progress:

import matplotlib.pyplot as plt
plt.plot(obj.get_history_bestsofar())
plt.savefig('figure.png')

which looks like

Experiment results: https://github.com/eyounx/ZOOpt/blob/dev/img/quick_start.png?raw=true

We can also use ExpOpt to repeat the optimization for performance analysis, which will calculate the mean and standard deviation of multiple optimization results while automatically visualizing the optimization progress.

solution_list = ExpOpt.min(obj, Parameter(budget=100*dim_size), repeat=3, plot=True, plot_file="progress.png")
for solution in solution_list:
    print(solution.get_x(), solution.get_value())

More examples are available in the EXAMPLES section.

Releases

release 0.4

  • Add the Dimension2 class, which provides another format for constructing dimensions. Unlike the Dimension class, Dimension2 allows users to specify the optimization precision (see the sketch after this list).
  • Add the SRacosTune class, which suggests trials and processes results for Tune (a platform based on Ray for distributed model selection and training).
  • Deprecate Python 2 support
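To illustrate the difference, here is a minimal sketch constructing the same continuous search space of the quick-start example in both formats; the 1e-2 precision is an arbitrary value chosen for demonstration:

from zoopt import Dimension, Dimension2, ValueType

dim_size = 100
# Dimension: a size, a region per dimension, and a flag marking each dimension continuous
dim = Dimension(dim_size, [[-1, 1]]*dim_size, [True]*dim_size)
# Dimension2: one (value type, region, precision) tuple per dimension,
# where the third element sets the optimization precision
dim2 = Dimension2([(ValueType.CONTINUOUS, [-1, 1], 1e-2)]*dim_size)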

release 0.3

  • Add a parallel implementation of SRACOS, which accelerates the optimization by asynchronous parallelization.
  • Users can now set a customized stopping criterion for the optimization (a minimal sketch follows this list).
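The sketch below continues the quick-start example and assumes the stopping_criterion keyword of Parameter and the check(optcontent) callback described in ZOOpt's documentation; the 50-check patience is an arbitrary choice for illustration:

class StoppingCriterion:
    def __init__(self):
        self.best = None
        self.unchanged = 0

    def check(self, optcontent):
        # optcontent holds the optimizer state; returning True stops the run
        value = optcontent.get_best_solution().get_value()
        if self.best is None or value < self.best:
            self.best = value
            self.unchanged = 0
        else:
            self.unchanged += 1
        return self.unchanged >= 50  # stop after 50 checks without improvement

solution = Opt.min(obj, Parameter(budget=100*dim_size,
                                  stopping_criterion=StoppingCriterion()))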

release 0.2

  • Add the noise-handling strategies Re-sampling and Value Suppression (AAAI'18), and the subset selection method with noise handling, PONSS (NIPS'17); a re-sampling sketch follows this list.
  • Add the high-dimensionality handling method Sequential Random Embedding (IJCAI'16).
  • Rewrite the Pareto optimization method and fix bugs.
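As a minimal sketch of switching on the Re-sampling strategy for the quick-start objective, the keyword names below follow ZOOpt's documented Parameter interface, and resample_times=10 is a placeholder value:

# Re-sampling: evaluate each solution resample_times times and
# average the values, which reduces the effect of noise
parameter = Parameter(budget=100*dim_size, noise_handling=True,
                      resampling=True, resample_times=10)
solution = Opt.min(obj, parameter)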

release 0.1

  • Include the general optimization method RACOS (AAAI'16) and Sequential RACOS (AAAI'17), and the subset selection method POSS (NIPS'15).
  • The algorithm selection is automatic. See examples in the example folder.
  • Default parameters work well on many problems, while parameters are fully controllable.
  • Running speed optimized for Python.