Lecture 09 - Local Search

The lecture covers local search algorithms, focusing on their necessity, hill climbing, and simulated annealing. Local search is effective for optimization problems where the path to the goal is irrelevant, and it operates by iteratively improving a current state. Simulated annealing enhances local search by allowing the acceptance of worse solutions to escape local minima, making it suitable for large search spaces.

Lecture 7: Local Search Algorithms

Key Points from Last Lecture

Outline of today’s lecture

Local Search Algorithms


1. Why are they needed?
2. Hill Climbing
3. Simulated Annealing
Local Search Algorithms

In this lecture:

❑ Part 1: What/Why Local Search


❑ Part 2: Hill-Climbing Search

❑ Part 3: Simulated Annealing Search

❑ Part 4: Genetic Algorithms

Local Search Algorithms

In many optimization problems, the path to the goal is irrelevant; the goal state itself is the solution.

Examples:
– to reduce cost, as in cost functions
– to reduce conflicts, as in n-queens

The idea: keep a single "current" state and try to improve it according to an objective function.

Local search algorithms:


– Use little memory
– Find reasonable solutions in large or infinite spaces
Local Search Algorithms

• Local search can be used on problems that can be formulated as finding a solution maximizing a criterion among a number of candidate solutions.

• Local search algorithms move from solution to solution in the space of candidate solutions (the search space) until a solution deemed optimal is found or a time bound has elapsed.

• For example, in the travelling salesman problem a solution is a cycle containing all nodes of the graph, and the target is to minimize the total length of the cycle. Equivalently, the criterion to maximize can combine the number of nodes visited with the negative of the cycle's length.

• A local search algorithm starts from a candidate solution and then iteratively moves to a neighbor solution.
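The slides do not give code; as an illustration, here is a minimal Python sketch of a TSP candidate solution (a tour) and one way to generate a neighbor solution. The segment-reversal (2-opt-style) move is an assumed neighbor definition, not one specified in the lecture:

```python
import random

def tour_length(tour, dist):
    """Total length of a cyclic tour; dist[a][b] is the edge weight."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def random_neighbor(tour):
    """Neighbor solution: reverse a random segment of the tour."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
```

A local search would start from a random tour and repeatedly move to a neighbor whenever that shortens the tour length.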
Local Search Algorithms

Terminate on a time bound, or when the solution has not improved after a number of steps.

Local search algorithms are typically incomplete algorithms, as the search may stop even if the best solution found by the algorithm is not optimal.

Search Landscape (two-dimensional)

Birzeit University, 2018

Local Search Algorithms

In this lecture:
❑ Part 1: What/Why Local Search Algorithms

❑ Part 2: Hill-Climbing Search


❑ Part 3: Simulated Annealing Search

❑ Part 4: Genetic Algorithms in nutshell

Hill-Climbing Search
• Continually moves in the direction of increasing value (i.e., uphill).
• Terminates when it reaches a “peak”, no neighbor has a higher value.
• Only records the state and its objective function value.
• Does not look ahead beyond the immediate neighbors.

Sometimes called Greedy Local Search.

• Problem: can get stuck in local maxima or local minima


• Its success depends very much on the shape of the state-space landscape: if there are few local maxima, random-restart hill climbing will find a “good” solution very quickly.
Hill-Climbing Search Algorithm

The hill-climbing search algorithm is the most basic local search technique. At each step the current node is replaced by the best neighbor; in this version, that means the neighbor with the highest VALUE, but if a heuristic cost estimate h is used, we would find the neighbor with the lowest h.
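The algorithm described above can be sketched in Python. This is an illustrative steepest-ascent version; the `neighbors` and `value` function names are assumptions, not identifiers from the lecture:

```python
def hill_climbing(initial, neighbors, value):
    """Steepest-ascent hill climbing: keep only the current state and
    move to the best neighbor until no neighbor improves the value."""
    current = initial
    while True:
        best = max(neighbors(current), key=value, default=current)
        if value(best) <= value(current):
            return current  # peak reached (possibly only a local maximum)
        current = best
```

For example, maximizing value(x) = -(x - 3)^2 over the integers with neighbors x - 1 and x + 1 climbs from any start to x = 3.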

Example: n-queens

Put n queens on an n×n board with no two queens on the same row, column, or diagonal.

Move a queen to reduce the number of conflicts.

Example: 8-queens

Each number indicates h if we move a queen in its corresponding column.

h = number of pairs of queens that are attacking each other, either directly or indirectly (h = 17 for the above state).
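The heuristic h above (the number of attacking pairs, counting each pair once) can be computed with a short Python sketch, assuming the common encoding with one queen per column:

```python
from itertools import combinations

def h(queens):
    """Number of attacking queen pairs. queens[c] is the row of the
    queen in column c (one queen per column)."""
    return sum(1
               for (c1, r1), (c2, r2) in combinations(enumerate(queens), 2)
               if r1 == r2 or abs(r1 - r2) == abs(c1 - c2))
```

For instance, a 4-queens board with all queens on the main diagonal gives h = 6 (every pair attacks), while the solution board [1, 3, 0, 2] gives h = 0.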

Example: n-queens

A local minimum with h = 1


Local Search Algorithms

In this lecture:
❑ Part 1: What/Why Local Search Algorithms

❑ Part 2: Hill-Climbing Search

❑ Part 3: Simulated Annealing Search


❑ Part 4: Genetic Algorithms in nutshell

Simulated Annealing Search

Simulated Annealing (SA) is an optimization algorithm inspired by the annealing process in metallurgy, where a material is heated and then slowly cooled to remove defects and achieve a stable structure. It is particularly useful for finding a good (not necessarily perfect) solution in large search spaces, especially when a problem has many local optima.
Algorithm Steps
1. Initialize a random solution.
2. Compute the cost (or energy) using an objective function.
3. Perturb (modify) the solution slightly to generate a neighbor solution.
4. Compute the new cost:
   – If the new solution is better, accept it.
   – If the new solution is worse, accept it with some probability to avoid being stuck in a local minimum.
5. Gradually decrease the temperature (T):
   – The probability of accepting worse solutions decreases over time.
6. Repeat until a termination condition is met (e.g., temperature is very low or the solution is stable).
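The steps above can be sketched in Python. The geometric cooling schedule and its parameters (`t0`, `cooling`, `t_min`) are illustrative choices, not values from the lecture:

```python
import math
import random

def simulated_annealing(initial, neighbor, cost,
                        t0=1.0, cooling=0.995, t_min=1e-3):
    """SA sketch: perturb the current solution, always accept
    improvements, accept worse moves with probability e^(-delta/T),
    and cool the temperature geometrically."""
    current, t = initial, t0
    while t > t_min:
        candidate = neighbor(current)          # step 3: perturb
        delta = cost(candidate) - cost(current)
        # step 4: accept better moves; worse ones with prob e^(-delta/T)
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = candidate
        t *= cooling                           # step 5: decrease T
    return current
```

Minimizing cost(x) = x^2 with neighbor x ± 1 from a start of 10, for example, ends at or very near 0.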
Example : 8-Queens Problem

• Suppose we start with a random board where h = 17 (17 attacking queen pairs).
• We randomly move one queen, decreasing conflicts to h = 14. We accept this move.
• At another step, moving a queen increases h to 16.
• Instead of rejecting immediately, we might accept it with probability P = e^((14−16)/T) = e^(−2/T).
• As the temperature decreases, we become less likely to accept worse moves.
• Eventually, h = 0, meaning a valid solution is found.
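As a quick check of the formula, the acceptance probability e^(−2/T) for this worse move can be evaluated at a few temperatures to see the effect of cooling (the temperature values are illustrative):

```python
import math

# Worse move from h = 14 to h = 16, so delta = 2 and P = e^(-delta/T)
for t in (10.0, 1.0, 0.1):
    p = math.exp(-(16 - 14) / t)
    print(f"T = {t:5.1f}  ->  accept probability {p:.4f}")
# T = 10.0 -> 0.8187, T = 1.0 -> 0.1353, T = 0.1 -> 0.0000
```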
Simulated Annealing Search
Based on [1]

To avoid being stuck in a local maximum, it tries randomly (using a probability function) to move to another state; if this new state is better it moves into it, otherwise it tries another move, and so on.

Simulated Annealing Search
Based on [1]

Terminates when it finds an acceptably good solution in a fixed amount of time, rather than the best possible solution.

Locates a good approximation to the global minimum of a given function in a large search space.

Widely used in VLSI layout, airline scheduling, etc.


Properties of Simulated Annealing Search

The problem with the hill climbing approach is that the neighbors of a state are not guaranteed to contain any of the existing better solutions, which means that failure to find a better solution among them does not guarantee that no better solution exists.

But, here:
• Escapes local minima by accepting worse solutions early on.
• If it runs for an infinite amount of time, the global optimum will be found.
• Good for large search spaces where brute force is impractical.
Comparison of both approaches

Feature                        | Simulated Annealing     | Hill Climbing
Accepts Worse Moves?           | Yes (with probability)  | No
Local Minima Avoidance         | Strong                  | Weak
Global Optimization Potential  | Higher                  | Lower
Speed                          | Slower                  | Faster
