Search-based engineering exercises
Metaheuristics
Definition: General-purpose strategies for efficiently exploring a complex solution space.
Examples: genetic algorithms, tabu search, simulated annealing (all covered below).
Heuristics vs. Metaheuristics: a heuristic is a problem‑specific rule of thumb with no optimality guarantee; a metaheuristic is a general‑purpose strategy for guiding the search.
Optimization in SBSE
Finding the optimal solution—or a good approximation—according to some criterion (e.g.,
minimizing test cost while maximizing bug detection).
Applications of SBSE
Example: test‑suite optimization, i.e. selecting and generating tests to minimize cost while maximizing coverage and bug detection (as in the example above).
Solution Space
Also called the search space or state space—the set of all possible solutions.
Fitness Function
The function guiding the search (evaluation function).
Example: In testing, measure how many code branches are covered.
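For instance, a minimal sketch of such a fitness function in Python, assuming coverage is recorded as a set of branch IDs (the names and data layout are illustrative, not any particular tool's API):

```python
# Sketch of a coverage-based fitness function. The data layout (sets of
# branch IDs) is an assumption made for illustration.

def branch_coverage_fitness(covered_branches: set, all_branches: set) -> float:
    """Fraction of code branches exercised by a test suite (higher is better)."""
    if not all_branches:
        return 0.0
    return len(covered_branches & all_branches) / len(all_branches)

# Example: the suite covers 3 of 4 branches -> fitness 0.75
print(branch_coverage_fitness({"b1", "b2", "b3"}, {"b1", "b2", "b3", "b4"}))
```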
Population‑Based Metaheuristic
Example: Genetic Algorithm—simulates natural selection using crossover and mutation.
Selection Schemes
Tournament example: pick k individuals at random; the fittest of them becomes a parent.
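A minimal Python sketch of this scheme (toy population and a bit-count fitness; all names are illustrative):

```python
import random

def tournament_select(population, fitness, k=3):
    """Draw k individuals at random and return the fittest of them."""
    contestants = random.sample(population, k)
    return max(contestants, key=fitness)

# Toy population of bit strings; fitness = number of 1-bits.
pop = ["1010", "1111", "0001", "0110"]
print(tournament_select(pop, fitness=lambda s: s.count("1"), k=2))
```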
Crossover Operators
Single‑Point example:
Parents “ABCDE” and “VWXYZ” → Children “ABXYZ” and “VWCDE”.
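A small sketch of single-point crossover that reproduces this example when the cut point is fixed at 2 (the function and its signature are my own, not from a library):

```python
import random

def single_point_crossover(p1, p2, point=None):
    """Cut both parents at one point and swap the tails."""
    if point is None:
        point = random.randint(1, len(p1) - 1)  # random cut inside the string
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

# Reproduces the example above with the cut fixed after position 2.
print(single_point_crossover("ABCDE", "VWXYZ", point=2))  # ('ABXYZ', 'VWCDE')
```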
Mutation Operators
● Bit‑Flip: Invert a bit (0→1 or 1→0).
Key Differentiations
a. Greedy Algorithms vs. Optimization
● Greedy: Makes the best local choice at each step (e.g., pack the heaviest item that still fits).
● Optimization: Searches for the combination that is best overall, not just the best next step.
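A tiny sketch of the greedy idea with made-up item weights, showing how the locally best choice can miss the globally best packing:

```python
def greedy_pack(weights, capacity):
    """Greedy packing: always take the heaviest item that still fits."""
    packed = []
    for w in sorted(weights, reverse=True):
        if w <= capacity:
            packed.append(w)
            capacity -= w
    return packed

# Greedy takes the 6 and then nothing else fits (total 6), while the
# globally best packing for capacity 9 is [5, 4] (total 9).
print(greedy_pack([6, 5, 4], capacity=9))  # [6]
```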
Metaheuristic Classification
A. Selection Operators
1. Roulette Wheel Selection
● Each individual is selected with probability proportional to its fitness.
Example:
If A, B, and C have fitness values of 5, 3, and 2 → A has a 50% chance, B 30%, C 20%.
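A minimal roulette-wheel sketch (illustrative code, not a library call) that checks these proportions empirically:

```python
import random

def roulette_select(fitnesses):
    """Spin a wheel where each individual owns a slice proportional to its fitness."""
    total = sum(fitnesses.values())
    spin = random.uniform(0, total)
    running = 0.0
    for individual, fit in fitnesses.items():
        running += fit
        if spin <= running:
            return individual

# A: 5/10 = 50%, B: 3/10 = 30%, C: 2/10 = 20%
counts = {"A": 0, "B": 0, "C": 0}
for _ in range(10_000):
    counts[roulette_select({"A": 5, "B": 3, "C": 2})] += 1
print(counts)  # roughly 5000 / 3000 / 2000
```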
2. Boltzmann Selection
Example:
At high T, fitness 100 and 80 are nearly equal; at low T, 100 is clearly preferred.
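A small sketch assuming the usual Boltzmann weighting, where selection probability is proportional to exp(fitness / T); the temperature values are arbitrary:

```python
import math

def boltzmann_probs(fitnesses, T):
    """Selection probabilities proportional to exp(fitness / T)."""
    weights = {ind: math.exp(f / T) for ind, f in fitnesses.items()}
    total = sum(weights.values())
    return {ind: w / total for ind, w in weights.items()}

fits = {"X": 100, "Y": 80}
print(boltzmann_probs(fits, T=1000))  # high T: ~0.505 vs ~0.495 (nearly equal)
print(boltzmann_probs(fits, T=5))     # low T: X is chosen ~98% of the time
```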
B. Mutation Operators
1. Random Resetting
● Selects one or more genes and replaces them with random values.
Example:
Binary string 1010110 → change a bit → 1011110.
2. Inversion Mutation
Example:
1234567 → invert positions 2–5 → 1543267.
3. Scramble Mutation
Example:
1234567 → scramble 2–5 → 1534267 (random permutation).
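A combined sketch of these mutation operators on strings, using 1-based positions to match the examples above (function names are my own):

```python
import random

def random_reset(s, alphabet="01"):
    """Replace one randomly chosen gene with a random value from the alphabet."""
    i = random.randrange(len(s))
    return s[:i] + random.choice(alphabet) + s[i + 1:]

def inversion(s, start, end):
    """Reverse the segment between start and end (1-based, inclusive)."""
    return s[:start - 1] + s[start - 1:end][::-1] + s[end:]

def scramble(s, start, end):
    """Randomly reorder the segment between start and end (1-based, inclusive)."""
    segment = list(s[start - 1:end])
    random.shuffle(segment)
    return s[:start - 1] + "".join(segment) + s[end:]

print(inversion("1234567", 2, 5))  # 1543267, as in the example above
print(scramble("1234567", 2, 5))   # e.g. 1534267 (one possible shuffle)
print(random_reset("1010110"))     # e.g. 1011110 (may also leave the bit unchanged)
```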
Summary Table
Operator  | Purpose
Selection | Chooses the fittest individuals to reproduce.
Mutation  | Keeps genetic diversity and helps explore new solutions.
Both are key to the success of genetic algorithms.
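Putting the operators together, a minimal GA sketch on a toy "maximize the number of 1-bits" problem; every parameter value here is an arbitrary illustrative choice:

```python
import random

# Toy problem: evolve a bit string with as many 1s as possible (OneMax).
GENES, POP_SIZE, GENERATIONS, MUT_RATE = 20, 30, 50, 0.05

def fitness(ind):
    return ind.count("1")

def tournament(pop, k=3):
    return max(random.sample(pop, k), key=fitness)

def crossover(p1, p2):
    point = random.randint(1, GENES - 1)  # single-point crossover, one child
    return p1[:point] + p2[point:]

def mutate(ind):
    # Bit-flip each gene independently with a small probability.
    return "".join(("1" if g == "0" else "0") if random.random() < MUT_RATE else g
                   for g in ind)

population = ["".join(random.choice("01") for _ in range(GENES))
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(best, fitness(best))  # usually all or nearly all 1s
```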
🌟 MAIN IDEA
Instead of tweaking one solution, keep a population of solutions and let them evolve together to find a very good (often near‑optimal) answer to hard optimization problems.
🧱 BASIC CONCEPTS
1. Population
A group of candidate solutions you improve over time.
4. Operators
Selection, crossover, and mutation rules applied each generation to evolve the population.
● Complex interactions: Changing one bit can have big ripple effects.
● Dynamic conditions: The “fitness” might change over time (e.g., traffic patterns).
● Neighborhood → Small tweaks you can make (swap two genes, flip one bit).
Approaches to hard search problems:
1. Brute force (exhaustive search)
Finds the best solution, but impossible on big problems ❌
2. Backtracking
Smart pruning of bad branches
3. Heuristic
Rules of thumb for speed (no optimality guarantee)
4. Meta‑heuristic
Population + randomness + strategy (e.g., GA, PSO)
🧪 FINAL TIPS
● Use meta‑heuristics when exact methods choke on size or complexity.
● Pick representations & operators that fit your problem (bits for binary, swaps for
ordering).
🧱 BASIC CONCEPTS
1. Trajectory (Local) Search
Work with just one solution at a time, moving it around the search space to find better spots.
2. Neighborhood
All solutions you can reach by a small tweak (e.g., flip one bit, swap two cities).
● Local Optima Risk: Without extra tricks, you can get stuck on a “hill” that isn’t the
highest.
🔍 KEY ALGORITHMS
1. Hill Climbing (HC)
○ Idea: From the current solution, jump to the best neighbor until no better
neighbor exists.
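A minimal hill-climbing sketch on a made-up two-peak landscape, showing both the basic loop and the local-optimum risk:

```python
def hill_climb(x, f, neighbors, max_steps=1000):
    """Greedy ascent: jump to the best neighbor until no neighbor improves."""
    for _ in range(max_steps):
        best = max(neighbors(x), key=f)
        if f(best) <= f(x):
            return x            # no better neighbor: a (possibly local) optimum
        x = best
    return x

# Made-up landscape with two peaks: a local one at x=2 (height 5)
# and the global one at x=8 (height 9).
f = lambda x: max(5 - (x - 2) ** 2, 9 - (x - 8) ** 2)
neighbors = lambda x: [x - 1, x + 1]

print(hill_climb(0, f, neighbors))  # 2 -> stuck on the lower hill
print(hill_climb(6, f, neighbors))  # 8 -> reaches the global peak
```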
2. Tabu Search (TS)
○ Idea: Like HC, but keep a “tabu list” of recent moves or solutions so you don’t cycle back immediately.
○ Pros: Can escape small loops and revisit promising regions later.
○ Cons: You must choose how many past moves to remember (tabu list size).
○ Example: In TSP, if you swapped city A⇄B, mark that swap taboo for the
next 5 moves—forces exploring other swaps.
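A rough tabu-search sketch for a toy TSP with made-up city coordinates; swaps stay tabu for a configurable number of iterations (tenure), and an aspiration rule lets a tabu swap through only if it beats the best tour found so far:

```python
import itertools, random

# City coordinates are made up; a tour is a list of city names.
cities = {"A": (0, 0), "B": (1, 5), "C": (4, 3), "D": (6, 1), "E": (3, 0)}

def tour_length(tour):
    dist = lambda p, q: ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    return sum(dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def tabu_search(tour, iterations=100, tenure=5):
    best = list(tour)
    tabu = {}                                  # swap -> iteration until which it stays tabu
    for it in range(iterations):
        candidates = []
        for i, j in itertools.combinations(range(len(tour)), 2):
            move = tuple(sorted((tour[i], tour[j])))
            neighbor = list(tour)
            neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
            # Skip tabu swaps unless they beat the best tour found so far (aspiration).
            if tabu.get(move, -1) >= it and tour_length(neighbor) >= tour_length(best):
                continue
            candidates.append((tour_length(neighbor), neighbor, move))
        if not candidates:
            break
        _, tour, move = min(candidates)        # best admissible neighbor, even if worse
        tabu[move] = it + tenure               # forbid undoing this swap for `tenure` moves
        if tour_length(tour) < tour_length(best):
            best = list(tour)
    return best

start = list(cities)
random.shuffle(start)
best = tabu_search(start)
print(best, round(tour_length(best), 2))
```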
3. Simulated Annealing (SA)
○ Idea: Like HC, but sometimes accept a worse neighbor, with a probability that shrinks as the “temperature” T cools; this lets the search escape local optima.
○ Cons: Needs a good cooling schedule (start high, cool slowly enough).
○ Example: For f(x)=−(x−3)²+10, at high T you might move from x=3 to x=2
even though worse—later this lets you jump to new peaks.
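A minimal simulated-annealing sketch for this same f(x), with an arbitrary geometric cooling schedule:

```python
import math, random

def simulated_annealing(x, f, T=10.0, cooling=0.95, steps=200):
    """Accept better neighbors always; accept worse ones with probability exp(-delta / T)."""
    best = x
    for _ in range(steps):
        candidate = x + random.choice([-1, 1])   # small random tweak
        delta = f(x) - f(candidate)              # how much worse the candidate is
        if delta <= 0 or random.random() < math.exp(-delta / T):
            x = candidate
        if f(x) > f(best):
            best = x
        T *= cooling                             # cool slowly: fewer bad moves accepted over time
    return best

f = lambda x: -(x - 3) ** 2 + 10
print(simulated_annealing(x=0, f=f))  # almost always ends at (or right next to) x = 3
```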
🛠 WORKFLOW TEMPLATE
1. Initialize one random solution.
2. Repeat until stopping condition (max steps, no improvement, time limit):
○ Generate a neighbor (tweak your single solution).
○ Decide whether to move:
■ HC → only if better.
■ TS → to the best neighbor whose move is not on the tabu list.
■ SA → always if better; if worse, with probability exp(−Δ/T).
3. Return the best solution seen during the run.
✏️ FINAL TIPS
● Multi‑start HC: Run hill climbing from different random seeds to avoid bad local
peaks.
● Tune parameters: Tabu list length, initial temperature & cooling rate, neighbor
generation strategy.
● Remember: the goal is good & fast, not perfect & slow.