
EGO: Entropy-Guided Optimization

EGO is a metaheuristic in which the entropy of the population's fitness distribution dynamically balances exploration against exploitation.
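To make the control loop concrete, here is a minimal, self-contained Python sketch under stated assumptions: a Boltzmann weighting for the fitness probabilities \( p_i \) (one common choice; the exact weighting is not specified here), a linear entropy-to-mutation-scale schedule (\( 0.01 + 0.5H \)), and a top-20% elite pool. The names `ego_step`, `fitness_distribution`, and `sphere` are illustrative, not details from the paper; \( H \) is the normalized entropy defined formally in the next section.

```python
import numpy as np
from scipy.stats import entropy  # Shannon entropy of a discrete distribution

def sphere(x):
    """Toy objective to minimize (stands in for a real benchmark)."""
    return np.sum(x ** 2, axis=-1)

def fitness_distribution(fitness, temperature=1.0):
    """Boltzmann weights over to-be-minimized fitness values
    (an assumed choice for p_i, not necessarily the paper's)."""
    w = np.exp(-(fitness - fitness.min()) / temperature)
    return w / w.sum()

def ego_step(pop, fitness, rng):
    """One illustrative EGO generation: entropy of the fitness
    distribution sets the mutation scale (exploration), while an
    elite pool drives selection (exploitation)."""
    n = len(pop)
    p = fitness_distribution(fitness)
    h = entropy(p) / np.log(n)                   # normalized to [0, 1]
    sigma = 0.01 + 0.5 * h                       # high entropy -> bigger steps
    k = max(1, n // 5)                           # low entropy -> lean on elites
    elite = pop[np.argsort(fitness)[:k]]         # best k candidates
    parents = elite[rng.integers(0, k, size=n)]  # resample parents from elites
    return parents + rng.normal(0.0, sigma, size=pop.shape)

rng = np.random.default_rng(0)
pop = rng.normal(0.0, 2.0, size=(40, 5))         # 40 candidates in 5-D
for _ in range(200):
    pop = ego_step(pop, sphere(pop), rng)
print("best fitness:", sphere(pop).min())
```

With this weighting, a population with no clear leaders yields a near-uniform \( p \) and high entropy, enlarging the mutation steps; a few dominant candidates concentrate the mass, entropy drops, and the update leans on the elite pool instead.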

Entropy Formula

The entropy of the population fitness distribution is calculated as:

$$ H = - \sum_{i=1}^n p_i \log(p_i) $$

where \( p_i \) is the normalized probability assigned to fitness rank \( i \), with \( p_i \ge 0 \) and \( \sum_{i=1}^{n} p_i = 1 \). High entropy (a flat distribution with no clear leaders) tells EGO to explore more; low entropy (mass concentrated on a few top candidates) tells it to exploit the best candidates.
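As a quick sanity check of the two extremes this rule reacts to: for a uniform distribution over \( n = 4 \) candidates, \( p_i = \tfrac{1}{4} \) and

$$ H = -\sum_{i=1}^{4} \tfrac{1}{4} \log \tfrac{1}{4} = \log 4 \approx 1.386, $$

the maximum possible value (\( \log n \)), so EGO explores; if instead all mass sits on a single candidate, \( p = (1, 0, 0, 0) \), then \( H = 0 \) (with the convention \( 0 \log 0 = 0 \)), the minimum, so EGO exploits that candidate.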

Results

EGO was benchmarked against a Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Differential Evolution (DE):

$$ f_{\mathrm{best}}(\mathrm{EGO}) < f_{\mathrm{best}}(X) \quad \text{for each } X \in \{\mathrm{GA}, \mathrm{PSO}, \mathrm{DE}\} $$

in 90% of runs.

Impact

EGO shows that an information-theoretic signal (entropy) can improve the stability of population-based optimization. Future extensions include neural architecture search and ML hyperparameter tuning.