% EOA HW02 - Solving constrained problems using evolutionary algorithms
% František Boháček
% 2025

# Intro

This is a report for hw02 of the EOA course at CTU FEE. The goal is to compare constrained evolutionary algorithms using two approaches:

1. stochastic ranking,
2. multi-objective evaluation with the original fitness as the first objective and a weighted sum of constraint violations as the second objective.

The report covers the implemented algorithms and the results on problems taken from "Problem Definitions and Evaluation Criteria for the CEC 2006".

# Used algorithms

## Stochastic ranking

Stochastic ranking is one method for solving constrained problems. It first orders the individuals using a stochastic sorting procedure. The ranked individuals are then put into a tournament selection using their new ranks instead of their original evaluations.

The sorting uses a structure similar to bubble sort. It iterates through the population and swaps two adjacent elements when one is better than the other (lower fitness), but it does so only with probability `p`. With probability `1 - p` it instead compares the constraint violations of the two elements.

Generational replacement has been used, because it easily ensures that the stochastic ranking order is still respected during replacement. Other replacement strategies would also be possible, but then the stochastic ranking would have to run a second time after the offspring are created.

Stochastic ranking is implemented in `eoa_lib/src/constraints.rs`.

## Unconstrained multi-objective evolution (NSGA-II)

As a second method, NSGA-II has been used to solve the constrained problems. NSGA-II is primarily meant for multi-objective problems, but a constrained problem can be mapped onto it by making the original objective the first objective and a weighted sum of constraint violations the second objective.

NSGA-II works with non-dominated fronts. No element of a front is dominated by any other element of the same front.
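The stochastic-ranking sweep from the previous section might look like the following sketch (hypothetical function and parameter names, not the actual `eoa_lib` API; the real implementation lives in `eoa_lib/src/constraints.rs`):

```rust
// One stochastic-ranking bubble-sort-style pass over the population.
// `fitness` and `violation` hold the objective value and the summed
// constraint violation of each individual (both minimized), and
// `rand01` supplies uniform samples from [0, 1).
fn stochastic_rank(
    fitness: &[f64],
    violation: &[f64],
    p: f64,
    sweeps: usize,
    mut rand01: impl FnMut() -> f64,
) -> Vec<usize> {
    let mut order: Vec<usize> = (0..fitness.len()).collect();
    for _ in 0..sweeps {
        let mut swapped = false;
        for k in 0..order.len().saturating_sub(1) {
            let (i, j) = (order[k], order[k + 1]);
            // With probability `p` compare by fitness, otherwise by
            // constraint violation.
            let swap = if rand01() < p {
                fitness[i] > fitness[j]
            } else {
                violation[i] > violation[j]
            };
            if swap {
                order.swap(k, k + 1);
                swapped = true;
            }
        }
        if !swapped {
            break; // bubble-sort-style early exit: order is stable
        }
    }
    order
}
```

The returned permutation is then used as the ranking that the tournament selection operates on.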
A dominates B if and only if A is not worse than B in any objective and A is better than B in at least one objective. To obtain the first front, the non-dominated solutions are extracted from the whole population. They are then removed and the process is repeated to obtain the second front, and so on.

NSGA-II first sorts the elements into non-dominated fronts. Then, within each front, the crowding distance is calculated. The crowding distance discourages solutions that are close to each other. For both selection and replacement, individuals are compared first by front number and, within the same front, by crowding distance, with a higher crowding distance preferred.

Tournament selection is used, and replacement keeps only the best individuals from the original population and the newly created offspring combined. The specific parameters are 5 candidates per tournament and a probability of 0.99 of choosing the best candidate out of the 5.

Multi-objective evolution is implemented in `eoa_lib/src/multi_objective_evolution.rs`. Note that the author initially used the original evolution function as a base, but later realized this is wrong: in NSGA-II, the whole population has to be re-evaluated after the offspring are created. The original function evaluated only the offspring to save on the number of function evaluations, but that meant two different front sortings were being compared during replacement, one for the current population and one for the offspring. Consider `population.evaluate(); offsprings.evaluate();` vs. `population.append(offsprings).evaluate()`, where in the case of NSGA-II `evaluate()` means obtaining the non-dominated fronts and crowding distances.

# Used operators

Since the chosen problems are real-valued, a real-vector representation has been used. Mutation perturbs the solution with a random sample from a normal distribution with zero mean.
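A minimal sketch of such a Gaussian mutation (hypothetical names; the actual implementation uses the `rand`/`rand_distr` crates listed in the Code structure section, while this sketch hand-rolls the normal sample with the Box-Muller transform to stay dependency-free):

```rust
// Gaussian mutation of a real-vector solution: each component is
// shifted by a zero-mean normal sample with standard deviation
// `sigma`. `rand01` supplies uniform samples from (0, 1].
fn gaussian_mutate(
    solution: &mut [f64],
    sigma: f64,
    mut rand01: impl FnMut() -> f64,
) {
    for x in solution.iter_mut() {
        let (u1, u2) = (rand01(), rand01());
        // Box-Muller: turns two uniform samples into one standard
        // normal sample.
        let z = (-2.0 * u1.ln()).sqrt() * (std::f64::consts::TAU * u2).cos();
        *x += sigma * z;
    }
}
```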
The standard deviation has been chosen per problem. As for crossover, arithmetical crossover is used. Arithmetical crossover chooses a random $\alpha$ between 0 and 1; the offspring's $i$-th component is then $\alpha \cdot p_{1i} + (1 - \alpha) \cdot p_{2i}$, where $p_{1i}$ ($p_{2i}$) is the first (second) parent's $i$-th component.

# Making NSGA-II behave better for constrained problems

In its default form, NSGA-II is not well suited to solving constrained problems this way. All feasible individuals sit at 0 in the constraint-sum objective, so each front contains at most one feasible candidate (among the feasible solutions, the one with the best original objective dominates all the others).

## Custom solution

To overcome this limitation, an archive of feasible solutions is proposed. The idea of an archive is similar to the SPEA2 algorithm, but the solutions kept in it differ: here the archive holds only feasible solutions. The archive has a fixed size and contains the best known solutions according to the original objective. After each iteration it is updated by joining the previous archive with the new population and keeping only the best feasible candidates.

During pairing/crossover, infeasible solutions may then be replaced by archive members. The logic is as follows:

1. If exactly one of the parents is infeasible, replace it with a feasible solution with probability $p_1$.
2. If both parents are infeasible, replace the first one with probability $p_{21}$ and, if the first one was replaced, replace the second one with probability $p_{22}$.

The probabilities were chosen as $p_1 = 0.4$, $p_{21} = 0.6$, $p_{22} = 0.3$. The intent behind these particular values is that the majority of pairs end up with at least one feasible solution.

In the code this is implemented as a crossover wrapper, because that was the easiest option, even though it would logically belong to the pairing step.
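The parent-replacement decision described above can be sketched as follows (hypothetical names; the real code lives in `src/constr_hw02/feasible_crossover_wrapper.rs`):

```rust
// Probabilities from the text.
const P1: f64 = 0.4; // exactly one parent infeasible: replace it
const P21: f64 = 0.6; // both infeasible: replace the first
const P22: f64 = 0.3; // ...and, if the first was replaced, the second

/// Decides which parents to swap for a feasible archive member.
/// Returns (replace_first, replace_second); `rand01` supplies uniform
/// samples from [0, 1).
fn replace_decision(
    first_feasible: bool,
    second_feasible: bool,
    mut rand01: impl FnMut() -> f64,
) -> (bool, bool) {
    match (first_feasible, second_feasible) {
        // Both feasible: nothing to replace.
        (true, true) => (false, false),
        // Exactly one infeasible: replace it with probability P1.
        (true, false) => (false, rand01() < P1),
        (false, true) => (rand01() < P1, false),
        // Both infeasible: replace the first with probability P21 and,
        // only if it was replaced, the second with probability P22.
        (false, false) => {
            let first = rand01() < P21;
            let second = first && rand01() < P22;
            (first, second)
        }
    }
}
```

The flagged parents would then be swapped for feasible solutions drawn from the archive before the crossover runs.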
The pairing-step implementation, though, only produces indices into the original population, so it cannot introduce new individuals. The implementation is located in `src/constr_hw02/feasible_crossover_wrapper.rs`.

## Using NSGA-II for constrained problems

NSGA-II itself can also be modified for constrained problems. The modification is usually described for problems with multiple objectives plus constraints, but the idea applies to the problem solved here as well. The same two objectives are kept, but the definition of domination is changed: solution `i` constraint-dominates solution `j` if any of the following holds:

1. Solutions `i` and `j` are both feasible and `i` dominates `j`.
2. Solution `i` is feasible and solution `j` is not.
3. Solutions `i` and `j` are both infeasible, but `i` violates each constraint less than `j` does.

It would also be possible to use just the original objective as a single objective, but then the problem would no longer be multi-objective. This is implemented along with NSGA-II in `src/eoa_lib/multi_objective_evolution.rs`.

# Results

Every configuration has been run 10 times to account for the inherent randomness. Initialization is random, uniform within the bounds given in the problem definitions. Later in a run, candidates may escape those bounds; they are neither penalized for it nor disqualified from the results. Bounding the results of mutation and crossover was also tried, but it was abandoned, since the algorithms stay within the bounds anyway because the bounds contain the better solutions. This makes the task slightly more challenging for the algorithms.

For the stochastic ranking approach, the following parameters have been used: tournament selection with 5 individuals and a 0.95 probability of choosing the best one. All constraint weights have been set to `1.0`. Arithmetic crossover is used.
The offspring mutate with probability `0.1`. Mutation uses a sample from a normal distribution with mean `0.0` and standard deviation chosen per problem according to the following table:

- g04 - `1.0`
- g05 - `10.0`
- g06 - `1.0`
- g08 - `0.5`
- g09 - `1.0`
- g11 - `0.01` (epsilon is `0.00015`)
- g24 - `0.1`
- g21 - `10.0`

TODO parameters

As for the runtime parameters, the population has 500 individuals, there are 1000 iterations, and 500 parents are selected in each iteration. The number of sweeps in stochastic ranking is 2 * population size. The probability of comparing by fitness in stochastic ranking is 0.45.

For NSGA-II TODO

## Comparing the approaches

The following plot compares:

- Stochastic ranking
- NSGA-II
- NSGA-II with multiple objectives
- NSGA-II with the custom archive
- NSGA-II with the constraint-domination tournament

TODO add the plot

TODO add discussion

## Comparing g11 with g06

`g11` is special in that it has an equality constraint that is harder to satisfy, so only `g11` is compared with `g06` here. For each iteration, the fraction of feasible solutions has been recorded, as well as the average constraint violation. Here is a plot with the fraction of feasible solutions per iteration for both problems:

## Comparing TODO insert hard problem with g11

# Tasks done above minimal requirements

- 1 point - Multi-objective approach with more objectives
- 1 point - Improve NSGA with archive
- 1 point - Comparison on more complex problems
- 1 point - NSGA-II with modified tournament operator (constrained domination)

# Code structure

Rust has been chosen as the language. There are four subdirectories: `eoa_lib`, `tsp_hw01`, `constr_hw02` and `tsp_plotter`. `eoa_lib` is the library with the generic operators, together with the random search, local search and evolutionary algorithm functions. It also contains the most common representations, perturbations and crossovers for them. `constr_hw02` is the implementation of hw02.
It contains the problem definitions and a simple CLI program to run any of the problems with either stochastic ranking or MOE. `tsp_plotter` plots the graphs; it can now be used for both homeworks. `tsp_hw01` contains the source of the first homework. It is kept for convenience, because both homeworks live in the same repository and removing it would require adjusting Cargo.toml so that the build does not fail when it is absent.

Apart from the standard library, the following third-party libraries have been used (the list is the same as for hw01):

- nalgebra for working with vectors and matrices,
- plotters for plotting,
- rand,
- rand_distr,
- thiserror for easing work with errors,
- serde and serde_json for loading JSON files,
- csv for loading CSV files.

## Running the algorithms

For run instructions, including generation of the plots in this report, see README.md.

# Usage of LLM

LLMs have been used similarly to HW01. The only change is that Gemini has also been used in addition to Claude (there was an offer for students to get it for free for a year :)).

# Resources

- Lectures of EOA
- Claude, Gemini LLMs
- https://sciences.uodiyala.edu.iq/uploads/00%20Abdullah%20New%20website/Lectures/comp/%D9%85%D8%AD%D8%A7%D8%B6%D8%B1%D8%A7%D8%AA%20%D8%AF%20%D8%B2%D9%8A%D8%A7%D8%AF%20%D8%B7%D8%A7%D8%B1%D9%82/%D8%A7%D8%AD%D8%AA%D8%B3%D8%A7%D8%A8%20%D8%AA%D8%B7%D9%88%D8%B1%D9%8A%203.pdf
- https://www.researchgate.net/figure/Blend-crossover-operator-BLX_fig1_226044085
- https://cw.fel.cvut.cz/wiki/_media/courses/a0m33eoa/cviceni/2006_problem_definitions_and_evaluation_criteria_for_the_cec_2006_special_session_on_constraint_real-parameter_optimization.pdf