Welcome to advisor’s documentation!¶
Advisor¶
Introduction¶
Advisor is a hyperparameter tuning system for black-box optimization.
It is an open-source implementation of Google Vizier, with the following features:
- Easy to use via API, SDK, web UI, and CLI
- Supports the Study and Trial abstractions
- Includes search and early-stop algorithms
- Recommends parameters with a trained model
- Same programming interfaces as Google Vizier
- Command-line tool, just like Microsoft NNI
Supported Algorithms¶
- [x] Grid Search
- [x] Random Search
- [x] Bayesian Optimization
- [x] TPE (Hyperopt)
- [x] Random Search (Hyperopt)
- [x] Simulated Annealing (Hyperopt)
- [x] Quasi-Random (Chocolate)
- [x] Grid Search (Chocolate)
- [x] Random Search (Chocolate)
- [x] Bayes (Chocolate)
- [x] CMAES (Chocolate)
- [x] MOCMAES (Chocolate)
- [ ] SMAC Algorithm
- [x] Early Stop First Trial Algorithm
- [x] Early Stop Descending Algorithm
- [ ] Performance Curve Stop Algorithm
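Each available algorithm is selected by name, either in a configuration file or when creating a study through the SDK. As a hedged illustration (the exact algorithm-name strings accepted by the server should be checked against its implementation), switching a minimal configuration file from Bayesian optimization to TPE is a one-field change:

```json
{
  "name": "demo",
  "algorithm": "TPE"
}
```

Only the algorithm-selection field is shown here; a complete configuration also needs the trial count, command, and search space, as in the Configuration File section below.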
Installation¶
Pip¶
pip install advisor
From Source¶
git clone git@github.com:tobegit3hub/advisor.git
cd ./advisor/advisor_client/
python ./setup.py install
Docker¶
docker run -d -p 8000:8000 tobegit3hub/advisor
Docker Compose¶
wget https://raw.githubusercontent.com/tobegit3hub/advisor/master/docker-compose.yml
docker-compose up -d
Kubernetes¶
wget https://raw.githubusercontent.com/tobegit3hub/advisor/master/kubernetes_advisor.yaml
kubectl create -f ./kubernetes_advisor.yaml
Quick Start¶
Install with pip.
pip install advisor
Start the server.
advisor_admin server start
Go to http://127.0.0.1:8000 in the browser.
Submit tuning jobs.
git clone --depth 1 https://github.com/tobegit3hub/advisor.git && cd ./advisor/
advisor run -f ./advisor_client/examples/python_function/config.json
Get result of jobs.
advisor study describe -s demo
Server¶
Command-line¶
advisor_admin server start
Docker¶
docker run -d -p 8000:8000 tobegit3hub/advisor
Docker Compose¶
wget https://raw.githubusercontent.com/tobegit3hub/advisor/master/docker-compose.yml
docker-compose up -d
Kubernetes¶
wget https://raw.githubusercontent.com/tobegit3hub/advisor/master/kubernetes_advisor.yaml
kubectl create -f ./kubernetes_advisor.yaml
From Source¶
git clone --depth 1 https://github.com/tobegit3hub/advisor.git && cd ./advisor/
pip install -r ./requirements.txt
./manage.py migrate
./manage.py runserver 0.0.0.0:8000
Command Line Interface¶
Start Server¶
advisor_admin server start
Stop Server¶
advisor_admin server stop
Submit Job¶
advisor run -f ./advisor_client/examples/python_function/config.json
List Study¶
advisor study list
Describe Study¶
advisor study describe -s demo
List Trials¶
advisor trials list
SDK¶
Create Client¶
from advisor_client.client import AdvisorClient

client = AdvisorClient()
Create Study¶
study_configuration = {
    "goal": "MINIMIZE",
    "randomInitTrials": 1,
    "maxTrials": 5,
    "maxParallelTrials": 1,
    "params": [
        {
            "parameterName": "gamma",
            "type": "DOUBLE",
            "minValue": 0.001,
            "maxValue": 0.01,
            "feasiblePoints": "",
            "scalingType": "LINEAR"
        },
        {
            "parameterName": "C",
            "type": "DOUBLE",
            "minValue": 0.5,
            "maxValue": 1.0,
            "feasiblePoints": "",
            "scalingType": "LINEAR"
        },
        {
            "parameterName": "kernel",
            "type": "CATEGORICAL",
            "minValue": 0,
            "maxValue": 0,
            "feasiblePoints": "linear, poly, rbf, sigmoid, precomputed",
            "scalingType": "LINEAR"
        },
        {
            "parameterName": "coef0",
            "type": "DOUBLE",
            "minValue": 0.0,
            "maxValue": 0.5,
            "feasiblePoints": "",
            "scalingType": "LINEAR"
        },
    ]
}
study = client.create_study("Study", study_configuration, "BayesianOptimization")
Get Study¶
study = client.get_study_by_id(6)
Get Trials¶
trials = client.get_suggestions(study.id, 3)
Generate Parameters¶
import json

parameter_value_dicts = []
for trial in trials:
    parameter_value_dict = json.loads(trial.parameter_values)
    print("The suggested parameters: {}".format(parameter_value_dict))
    parameter_value_dicts.append(parameter_value_dict)
Run Training¶
metrics = []
for i in range(len(trials)):
    metric = train_function(**parameter_value_dicts[i])
    metrics.append(metric)
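`train_function` is not part of the SDK; it stands for whatever objective you want to minimize, mapping one suggested parameter set to a single metric. A self-contained toy stand-in, purely for illustration (the parameter names come from the study configuration above, but the scoring logic is invented):

```python
def train_function(gamma, C, kernel, coef0):
    # Pretend the optimum is gamma=0.005, C=0.75, kernel="rbf", coef0=0.0,
    # and score each suggestion by its squared distance from that optimum.
    kernel_penalty = 0.0 if kernel == "rbf" else 1.0
    return (gamma - 0.005) ** 2 + (C - 0.75) ** 2 + kernel_penalty + coef0 ** 2
```

In a real study this function would train and evaluate a model (for example, an SVM taking these four hyperparameters) and return its validation metric.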
Complete Trial¶
for i in range(len(trials)):
    trial = trials[i]
    client.complete_trial_with_one_metric(trial, metrics[i])
is_done = client.is_study_done(study.id)
best_trial = client.get_best_trial(study.id)
print("The study: {}, best trial: {}".format(study, best_trial))
Configuration File¶
YAML Example¶
name: "demo"
algorithm: "BayesianOptimization"
trialNumber: 10
path: "./advisor_client/examples/python_function/"
command: "./min_function.py"
search_space:
  goal: "MINIMIZE"
  randomInitTrials: 3
  params:
    - parameterName: "x"
      type: "DOUBLE"
      minValue: -10.0
      maxValue: 10.0
JSON Example¶
{
  "name": "demo",
  "algorithm": "BayesianOptimization",
  "trialNumber": 10,
  "concurrency": 1,
  "path": "./advisor_client/examples/python_function/",
  "command": "./min_function.py",
  "search_space": {
    "goal": "MINIMIZE",
    "randomInitTrials": 3,
    "params": [
      {
        "parameterName": "x",
        "type": "DOUBLE",
        "minValue": -10.0,
        "maxValue": 10.0,
        "scalingType": "LINEAR"
      }
    ]
  }
}
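Both configurations point at a `./min_function.py` script as the command to run for each trial. A hypothetical sketch of such a script (not the bundled example itself): Advisor searches `x` in [-10.0, 10.0] for the value that minimizes f(x) = x * x. The flag name and the metric-reporting contract (printing the metric to stdout) are assumptions here; check the bundled example in `advisor_client/examples/python_function/` for the real interface.

```python
import argparse


def min_function(x):
    # The objective to minimize; its optimum is f(0.0) = 0.0.
    return x * x


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("-x", type=float, default=0.0)
    # parse_known_args tolerates any extra flags the caller might add.
    args, _ = parser.parse_known_args()
    print(min_function(args.x))
```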