Parameter Tuning and Convergence | Genetic Algorithms
Bio-Inspired Algorithms

Parameter Tuning and Convergence

Note
Definition

Parameter tuning in genetic algorithms refers to selecting and adjusting key values such as mutation rate, crossover probability, and population size to control algorithm behavior and improve convergence.

Static vs. Dynamic Parameter Tuning

Two main approaches exist for controlling parameters in genetic algorithms:

  • Static parameter tuning: parameters such as mutation rate, crossover probability, and population size are fixed before the algorithm starts and remain constant during execution;
  • Dynamic (adaptive) parameter tuning: parameters are adjusted automatically based on population feedback or algorithm progress. Adaptive tuning helps maintain diversity and avoid premature convergence.

Static tuning is simple but may be suboptimal when optimal settings change over time.
Dynamic tuning provides flexibility, improving both convergence speed and robustness.
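As a minimal illustration of the difference, a static scheme fixes the rate once, while the simplest dynamic scheme varies it deterministically with the generation count. The functions and schedule values below are hypothetical examples, not prescribed settings:

```python
def static_mutation_rate(generation, rate=0.1):
    # Static tuning: the rate never changes during the run
    return rate

def decaying_mutation_rate(generation, start=0.2, end=0.01, max_gen=100):
    # Dynamic (deterministic) tuning: linearly decay the rate
    # from `start` to `end` over `max_gen` generations, then hold
    progress = min(generation / max_gen, 1.0)
    return start - (start - end) * progress

print(static_mutation_rate(0), static_mutation_rate(50))    # 0.1 0.1
print(round(decaying_mutation_rate(0), 3))    # 0.2
print(round(decaying_mutation_rate(50), 3))   # 0.105
print(round(decaying_mutation_rate(100), 3))  # 0.01
```

A decay like this encourages broad exploration early and fine-grained exploitation late, but unlike adaptive tuning it ignores what actually happens in the population.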

```python
import numpy as np

def adaptive_mutation_rate(population, min_rate=0.01, max_rate=0.2):
    """
    Adjusts the mutation rate based on population diversity.
    Diversity is measured as the average Hamming distance between individuals.
    """
    def hamming_distance(ind1, ind2):
        return sum(a != b for a, b in zip(ind1, ind2))

    n = len(population)
    if n < 2:
        # A single-individual population has no diversity, so apply
        # the maximum mutation rate to reintroduce variation
        return max_rate

    # Compute the average pairwise Hamming distance
    distances = []
    for i in range(n):
        for j in range(i + 1, n):
            distances.append(hamming_distance(population[i], population[j]))
    avg_distance = np.mean(distances)
    max_distance = len(population[0])

    # Normalize diversity to [0, 1]
    diversity = avg_distance / max_distance if max_distance else 0

    # Inverse relationship: lower diversity -> higher mutation
    mutation_rate = max_rate - (max_rate - min_rate) * diversity
    return float(np.clip(mutation_rate, min_rate, max_rate))
```

When population diversity drops, individuals become too similar, and the algorithm risks stagnation. Increasing the mutation rate in these moments injects new genetic material, helping escape local optima. When diversity is high, lowering mutation allows more focused exploitation of good solutions. This adaptive control balances exploration and exploitation dynamically, improving convergence stability.
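The inverse relationship can be seen by calling the function on two small binary populations (the populations below are illustrative, and the function is restated compactly so the example runs on its own):

```python
import numpy as np

def adaptive_mutation_rate(population, min_rate=0.01, max_rate=0.2):
    # Compact restatement of the function defined above
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    n = len(population)
    if n < 2:
        return max_rate
    d = [hamming(population[i], population[j])
         for i in range(n) for j in range(i + 1, n)]
    diversity = np.mean(d) / len(population[0])
    return float(np.clip(max_rate - (max_rate - min_rate) * diversity,
                         min_rate, max_rate))

# Nearly identical individuals -> low diversity -> higher mutation rate
similar = [[0, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]]
# Very different individuals -> high diversity -> lower mutation rate
diverse = [[0, 0, 0, 0], [1, 1, 1, 1], [0, 1, 0, 1]]

print(round(adaptive_mutation_rate(similar), 3))  # 0.137
print(round(adaptive_mutation_rate(diverse), 3))  # 0.073
```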

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulate parameter and fitness changes for illustration
generations = np.arange(1, 51)
mutation_rates = np.linspace(0.2, 0.01, 50) + 0.02 * np.random.randn(50)
avg_fitness = np.linspace(10, 90, 50) + 5 * np.random.randn(50)

plt.figure(figsize=(10, 5))

plt.subplot(1, 2, 1)
plt.plot(generations, mutation_rates, label="Mutation Rate")
plt.xlabel("Generation")
plt.ylabel("Mutation Rate")
plt.title("Adaptive Mutation Rate Over Time")
plt.legend()

plt.subplot(1, 2, 2)
plt.plot(generations, avg_fitness, color="green", label="Average Fitness")
plt.xlabel("Generation")
plt.ylabel("Average Fitness")
plt.title("Population Fitness Over Time")
plt.legend()

plt.tight_layout()
plt.show()
```

Monitoring and Convergence

To achieve reliable convergence and performance in genetic algorithms:

  • Start with parameter ranges from literature or previous experiments;
  • Use static parameters for simple tasks, but prefer adaptive tuning for complex problems;
  • Monitor metrics such as population diversity, best and average fitness;
  • Adjust parameters dynamically to maintain balance between exploration and exploitation.
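The monitoring steps above can be sketched as a per-generation logging loop. The fitness function (a simple count of ones, i.e. OneMax) and the random population are placeholders; a real GA would evolve the population between iterations:

```python
import random

def diversity(population):
    # Average pairwise Hamming distance, normalized by genome length
    n, length = len(population), len(population[0])
    total = sum(sum(a != b for a, b in zip(population[i], population[j]))
                for i in range(n) for j in range(i + 1, n))
    pairs = n * (n - 1) / 2
    return total / (pairs * length)

def fitness(ind):
    # Placeholder fitness: count of ones (OneMax)
    return sum(ind)

random.seed(0)
population = [[random.randint(0, 1) for _ in range(20)] for _ in range(10)]

for generation in range(3):  # a real GA would also evolve the population here
    fits = [fitness(ind) for ind in population]
    best, avg = max(fits), sum(fits) / len(fits)
    div = diversity(population)
    print(f"gen {generation}: best={best} avg={avg:.1f} diversity={div:.2f}")
```

Logging best fitness, average fitness, and diversity together makes it easy to spot premature convergence: diversity collapsing while the best fitness plateaus is the usual signal to raise mutation or restart part of the population.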
