metaheuristic_optimizer
This function manages the metaheuristic optimization process. It is a general entry point that calls the specific algorithm functions.
df_all_reps, df_resume_all_reps, reports, status = metaheuristic_optimizer(algorithm_setup, general_setup)
Input variables
Name | Description | Type |
---|---|---|
algorithm_setup | Metaheuristic optimization setup. See the algorithms documentation for more details. Use the same setup dictionary as the desired optimization method here | Dictionary |
general_setup | Optimization process setup | Dictionary |
general_setup keys | | |
'number of repetitions' | Number of repetitions | Integer |
'type code' | Type of population. Options: 'real code' or 'combinatorial code' | String |
'initial pop. seed' | Random seeds, one per repetition. Use None in the list for a random seed | List |
'algorithm' | Optimization algorithm. See the available metaheuristic algorithms below | String |
Output variables
Name | Description | Type |
---|---|---|
df_all_reps | All data on the population, for each repetition | Dataframe |
df_resume_all_reps | Best data on the population, for each repetition | Dataframe |
reports | Report about the repetition process | String |
status | Best repetition \(id\) | Integer |
Metaheuristic algorithms
Function | Name |
---|---|
'hill_climbing_01' | Hill Climbing algorithm 01 |
'simulated_annealing_01' | Simulated Annealing algorithm 01 |
'genetic_algorithm_01' | Genetic Algorithm 01 |
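To switch methods, change the 'algorithm' key in general_setup and adapt the 'algorithm parameters' entry of algorithm_setup to the chosen method (see that algorithm's documentation page). A minimal sketch, using simulated annealing only as an illustration:
# Illustrative only: select another metaheuristic by its string name.
# The 'algorithm parameters' key of algorithm_setup must follow the
# documentation of the chosen method.
general_setup['algorithm'] = 'simulated_annealing_01'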
Example 1
Use the hill climbing optimization method to optimize the 2D sphere function. Use a total of 100 iterations to perform the optimization. Consider the limits \(\mathbf{x}_L = [-5.0, -5.0]\) and \(\mathbf{x}_U = [5.0, 5.0]\) for the problem design variables. Use \(cov = 20\%\), a Gaussian random generator, and a random initial guess. Run this complete process 30 times. Consider a population of 2 agents. Use the hill_climbing_01 algorithm.
"""Object Function: of_file.py"""
def my_function(x, none_variable):
x_0 = x[0]
x_1 = x[1]
of = x_0 ** 2 + x_1 ** 2
return of
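Before running the optimizer, it can help to sanity-check the objective function on its own. For the sphere function above, a quick call with illustrative values should return the sum of squares:
# Quick sanity check of the objective function (illustrative values)
print(my_function([1.0, 2.0], None))  # expected output: 5.0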
"""Run optimization: your_problem.py or your_problem.ipynb"""
# import libray
# pip install metapy-toolbox or pip install --upgrade metapy-toolbox
from metapy_toolbox import metaheuristic_optimizer
from of_file import my_function # External .py file with your objective function
# Algorithm settings
algorithm_setup = {
'number of iterations': 100,
'number of population': 2,
'number of dimensions': 2,
'x pop lower limit': [-5, -5],
'x pop upper limit': [5, 5],
'none variable': None,
'objective function': my_obj_function,
'algorithm parameters': {
'mutation': {
'cov (%)': 20,
'pdf': 'gaussian'
}
},
}
# METApy settings
general_setup = {
'number of repetitions': 30,
'type code': 'real code',
'initial pop. seed': [None] * 30,
'algorithm': 'hill_climbing_01',
}
# Run algorithm
df_all_reps, df_resume_all_reps, reports, status = metaheuristic_optimizer(algorithm_setup, general_setup)
Optimization results:
- Best repetition id: 28
- Best of: 5.3042292250e-16
- Design variables: [2.100789251887419e-08, 9.438822723774955e-09]
- Process time (s): 3.779469
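As a consistency check, the reported best objective value can be recomputed directly from the reported design variables:
# Recompute the best objective value from the reported design variables
x_best = [2.100789251887419e-08, 9.438822723774955e-09]
of_best = x_best[0] ** 2 + x_best[1] ** 2
print(of_best)  # approximately 5.30e-16, matching the reported best of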
Analysis
See the details of repetition \(id = 0\). df_resume_all_reps contains the history of the best agent in each repetition; index it by the repetition \(id\):
print(df_resume_all_reps[0])
To see the full population history in repetition \(id = 0\), use:
print(df_all_reps[0])
To see the \(id\) of the best repetition:
print(status)
To see the best repetition data:
print(df_resume_all_reps[status])
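Assuming the Dataframe outputs are standard pandas DataFrames, the usual pandas export methods apply. For example, to archive the best repetition (the file name is illustrative):
# Save the best repetition history to a CSV file (assumes pandas DataFrame)
df_resume_all_reps[status].to_csv('best_repetition.csv', index=False)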
To save the complete report about the best repetition to a text file:
# Report details
arq = "report_example.txt"

# Writing report
with open(arq, "w") as file:
    file.write(reports[status])
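To quickly inspect the saved report, read it back and print the first characters (the full report may be long):
# Reading report (prints only the first 500 characters)
with open(arq, "r") as file:
    print(file.read()[:500])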