
initial_population not effectively used/retained for multiobjective problems? #279

Open
kaurao opened this issue Feb 19, 2024 · 4 comments
Labels
question Further information is requested

Comments

@kaurao

kaurao commented Feb 19, 2024

Hi,

Thanks for PyGAD, it's a great resource.
I am trying to use it for an multiobjective optimization task. I have a good idea for what good solutions could be so I am providing them via the initial_population argument. However, some of those solutions are bot being used, I think.

My setup is as follows.

```python
ga_instance = pygad.GA(num_genes=68,
                       sol_per_pop=100,
                       initial_population=initial_population,
                       num_generations=100,
                       num_parents_mating=np.round(sol_per_pop/2).astype(int),
                       parent_selection_type=parent_selection_type,
                       gene_space={'low': 0, 'high': 1},
                       crossover_type="uniform",
                       mutation_type=None,
                       mutation_num_genes=[5, 1],
                       keep_elitism=10,
                       fitness_func=fitness_func,
                       on_generation=on_generation)
```

The image shows the final front 0 as blue dots; the red cross marks one of the initial solutions that I provided, which apparently "disappeared".

[Image: final Pareto front (front 0) with the missing initial solution marked]

I would appreciate any tips on how to set this up properly using PyGAD, thanks!

@kaurao
Author

kaurao commented Feb 19, 2024

My guess is that the difference comes from parent_selection_type="tournament_nsga2" (used for the setup and figure above) versus parent_selection_type="nsga2", which seems to work as expected (see the figure below).

[Image: Pareto front obtained with parent_selection_type="nsga2"]

@ahmedfgad added the question (Further information is requested) label Feb 19, 2024
@ahmedfgad
Owner

tournament_nsga2 and nsga2 are different types of parent selection for multi-objective optimization using NSGA-II.

tournament_nsga2 applies a tournament between randomly sampled candidates, so the solution with the best fitness is not guaranteed to be selected. In contrast, nsga2 always selects the best solutions as parents.
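
For illustration, here is a minimal two-objective sketch along those lines. The fitness function and its targets are made up for this example (not taken from the reporter's code); it uses the deterministic nsga2 selector together with keep_elitism so that strong initial solutions are more likely to survive into the final population:

```python
import numpy as np
import pygad

# Made-up two-objective fitness for illustration: PyGAD maximizes, so the two
# (negated) squared distances pull solutions toward 0.25 and 0.75 respectively.
def fitness_func(ga_instance, solution, solution_idx):
    obj1 = -np.sum((solution - 0.25) ** 2)
    obj2 = -np.sum((solution - 0.75) ** 2)
    return [obj1, obj2]

# Stand-in for the user-supplied seed solutions (100 solutions, 68 genes each).
initial_population = np.random.uniform(low=0.0, high=1.0, size=(100, 68))

ga_instance = pygad.GA(num_generations=100,
                       num_parents_mating=50,
                       initial_population=initial_population,
                       gene_space={'low': 0, 'high': 1},
                       parent_selection_type="nsga2",  # deterministic NSGA-II selection
                       crossover_type="uniform",
                       mutation_type=None,
                       keep_elitism=10,                # copy the 10 best solutions into each new generation
                       fitness_func=fitness_func)
ga_instance.run()
```

With keep_elitism set, the best solutions of each generation are carried over unchanged, which should keep good seeds from vanishing even when selection is stochastic.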

@kaurao
Author

kaurao commented Feb 22, 2024

Makes sense. Can you please provide reference(s) for tournament_nsga2? I have not come across it before. Thanks!

@ahmedfgad
Owner

These are some resources that describe tournament selection for NSGA-II:

You can definitely find more resources.
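
For context, here is a rough sketch (illustrative only, not PyGAD's internal code) of the crowded binary tournament used in NSGA-II: two candidates are sampled at random and compared by non-domination rank first and crowding distance second.

```python
import random

# Illustrative sketch, assuming each individual is a dict carrying a
# non-domination rank (lower is better) and a crowding distance (higher is
# better), e.g. {"solution": [...], "rank": 0, "crowding": 1.7}.
def crowded_compare(a, b):
    """Crowded-comparison operator: prefer lower rank, then larger crowding distance."""
    if a["rank"] != b["rank"]:
        return a if a["rank"] < b["rank"] else b
    return a if a["crowding"] > b["crowding"] else b

def tournament_nsga2_sketch(population, num_parents):
    """Select parents through repeated binary tournaments on randomly drawn pairs."""
    parents = []
    for _ in range(num_parents):
        a, b = random.sample(population, 2)
        parents.append(crowded_compare(a, b))
    return parents
```

Because the winner of each pairing depends on which pairs happen to be drawn, a strong seeded solution can drop out if it is unlucky in the draw, which matches the behaviour reported above.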

ahmedfgad added a commit that referenced this issue Jan 7, 2025
1. The `delay_after_gen` parameter is removed from the `pygad.GA` class constructor. As a result, it is no longer an attribute of the `pygad.GA` class instances. To add a delay after each generation, apply it inside the `on_generation` callback. #283
2. In the `single_point_crossover()` method of the `pygad.utils.crossover.Crossover` class, all the random crossover points are returned before the `for` loop. This is by calling the `numpy.random.randint()` function only once before the loop to generate all the K points (where K is the offspring size). This is compared to calling the `numpy.random.randint()` function inside the `for` loop K times, once for each individual offspring.
3. Bug fix in the `examples/example_custom_operators.py` script. #285
4. While making prediction using the `pygad.torchga.predict()` function, no gradients are calculated.
5. The `gene_type` parameter of the `pygad.helper.unique.Unique.unique_int_gene_from_range()` method accepts the type of the current gene only instead of the full `gene_type` list.
6. Created a new method called `unique_float_gene_from_range()` inside the `pygad.helper.unique.Unique` class to find a unique floating-point number from a range.
7. Fix a bug in the `pygad.helper.unique.Unique.unique_gene_by_space()` method to return the numeric value only instead of a NumPy array.
8. Refactoring the `pygad/helper/unique.py` script to remove duplicate codes and reformatting the docstrings.
9. The `plot_pareto_front_curve()` method is added to the `pygad.visualize.plot.Plot` class to visualize the Pareto front for multi-objective problems (see the usage sketch after this list). It only supports 2 objectives. #279
10. Fix a bug converting a nested NumPy array to a nested list. #300
11. The `Matplotlib` library is only imported when a method inside the `pygad/visualize/plot.py` script is used. This is more efficient than using `import matplotlib.pyplot` at the module level as this causes it to be imported when `pygad` is imported even when it is not needed. #292
12. Fix a bug when minus sign (-) is used inside the `stop_criteria` parameter (e.g. `stop_criteria=["saturate_10", "reach_-0.5"]`). #296
13. Make sure `self.best_solutions` is a list of lists inside the `cal_pop_fitness` method. #293
14. Fix a bug where the `cal_pop_fitness()` method was using the `previous_generation_fitness` attribute to return the parents' fitness. This instance attribute held the fitness of the population before the last one rather than the fitness of the latest population. The issue is solved by updating the `previous_generation_fitness` attribute to the latest population fitness before the GA completes. #291
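
Regarding item 9, a minimal usage sketch, assuming `ga_instance` is a two-objective run such as the one earlier in this thread and a PyGAD release that includes the new method:

```python
# Assumes `ga_instance` was configured with a two-objective fitness function
# and that this PyGAD release ships plot_pareto_front_curve().
ga_instance.run()
ga_instance.plot_pareto_front_curve()
```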
@ahmedfgad mentioned this issue Jan 7, 2025