Lec 9 & 10 (A) Multi-Objective Optimization

The document discusses multi-objective optimization, highlighting the need to balance conflicting objectives and the concept of the Pareto front, which represents optimal trade-offs. It outlines various algorithm design issues, including fitness assignment strategies, diversity preservation, and elitism in environmental selection. Additionally, it presents techniques for solving multi-objective problems, such as a priori, a posteriori, and interactive methods, emphasizing the advantages of evolutionary algorithms in this context.


Multi-objective

Optimization
Dr. Sobia Tariq Javed
Outline
The objectives of this lecture are to:
• Introduce multi-objective optimization
• Address the design issues of evolutionary multi-objective optimization algorithms
• Explore ways to handle constraints
Multi-objective Optimization
• Multiple Objectives: The problem has several goals to optimize, often conflicting (e.g., cost vs. quality).
• Trade-offs: Improving one objective may degrade another, requiring compromises.
• Pareto Front: The set of all Pareto-optimal solutions, showing the trade-offs between objectives.
Example: Car Buying
• Problem: Optimizing car buying.
• Objectives:
  • Minimize costs.
  • Maximize comfort.
• Conflict: Increased comfort often leads to higher costs, requiring a trade-off between affordability and luxury.
Example: Supply Chain Management
• Problem: Optimizing supply chain operations.
• Objectives:
  • Minimize transportation costs.
  • Maximize delivery speed.
  • Minimize environmental impact (e.g., CO₂ emissions).
• Conflict: Faster deliveries often result in higher costs and emissions.
Example: Environmental Management
• Problem: Managing a water reservoir system.
• Objectives:
  • Maximize water storage for drought conditions.
  • Minimize flood risks by reducing water levels.
  • Ensure water availability for agriculture and energy production.
• Conflict: High water levels for storage may increase flood risk.
Single-objective optimization
Objective function: minimize f(x)
[Figure: two candidate solutions compared on the single objective f(x); the one with the smaller f(x) is better.]
Multi-objective optimization
Objective function: minimize f(x) = (f1(x), f2(x), f3(x))
[Figure: two candidate solutions compared on f1, f2 and f3; one is at least as good in every objective, so it is better.]
Multi-objective optimization
Objective function: minimize f(x) = (f1(x), f2(x), f3(x))
[Figure: two candidate solutions compared on f1, f2 and f3; each is better on some objectives and worse on others, so neither can be declared better.]
Multi-Objective Dominance Relationship
• A solution x dominates another solution y if
  • x is no worse than y in all objectives, and
  • x is strictly better than y in at least one of the objectives.
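
As a minimal sketch drawn directly from this definition (assuming every objective is to be minimized and a solution is represented by its tuple of objective values):

```python
def dominates(x, y):
    """Return True if objective vector x dominates y (all objectives minimized).

    x dominates y when x is no worse than y in every objective and
    strictly better in at least one.
    """
    no_worse = all(xi <= yi for xi, yi in zip(x, y))
    strictly_better = any(xi < yi for xi, yi in zip(x, y))
    return no_worse and strictly_better

# Hypothetical two-objective values (both minimized):
print(dominates((2, 3), (4, 3)))   # True: no worse in both, strictly better in the first
print(dominates((2, 5), (4, 3)))   # False: each solution is better in a different objective
```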
Dominated Solution
• A solution x is said to dominate another solution y if
  • x is no worse than y in all objectives, and
  • x is strictly better than y in at least one objective.
[Figure exercise: using the plotted solutions, compare 1 vs 2, 1 vs 5, and 1 vs 4.]
Pareto Optimal Front
• Non-Dominated Solution Set: The non-dominated solution set consists of all solutions within a given set that are not dominated by any other solution in the set.
• Pareto Optimal Set: The subset of the feasible decision space containing all non-dominated solutions is called the Pareto optimal set.
• Pareto Optimal Front: The boundary formed by mapping all points in the Pareto optimal set to the objective space is referred to as the Pareto optimal front.
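
A short sketch of extracting the non-dominated set from a finite set of objective vectors, reusing the dominates() helper above (the values are hypothetical; all objectives minimized):

```python
def non_dominated(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [x for x in points
            if not any(dominates(y, x) for y in points if y is not x)]

# Hypothetical population in a two-objective (minimize, minimize) space:
population = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
print(non_dominated(population))   # [(1, 5), (2, 2), (4, 1)]
```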
Pareto Dominance
[Figure: x ≺ y (x dominates y), x ≼ y (x weakly dominates y), and non-dominated (incomparable) solutions.]
2D case
[Figure: solutions plotted in the objective space with axes f1(x) and f2(x).]
Dominance in 2D
[Figure: relative to a reference solution in the (f1(x), f2(x)) plane, the regions of dominated, dominating, and incomparable solutions.]
Pareto Optimal Front
[Figure: location of the Pareto optimal front for the four combinations min-min, min-max, max-min, and max-max.]
Pareto Optimal Solutions
• Pareto optimal solutions are the non-dominated solutions.
• Most multi-objective optimization methods use this domination concept to search for non-dominated solutions.
• In the example below, solutions 3 and 5 are non-dominated.
[Figure: numbered solutions plotted with f1 (maximize) on the horizontal axis and f2 (minimize) on the vertical axis.]
Decision and Objective Space

Eckart Zitzler, A Tutorial on Evolutionary Multiobjective Optimization. ETH Zurich, 2002.


Pareto Front
[Figure: solutions plotted against f1(x) and f2(x); the Pareto front is the set of non-dominated points among them.]
Algorithm Design Issues
Optimization goal?
• Find all Pareto-optimal solutions?
  • How should the decision maker handle 10,000 solutions?
• Find a representative subset of the Pareto set?
  • Many problems are NP-hard.
  • What does "representative" really mean?
• Find a good approximation of the Pareto set?
  • What is a good approximation?
  • How to formalize the intuitive understanding:
    • close to the Pareto front
    • well distributed
Algorithm Design Issues
Evolutionary Multiobjective Optimization (EMO) Principles
Two ideal goals of multi-objective optimization:
• Convergence: Find a (finite) set of solutions which lie on the Pareto-optimal front.
• Diversity: Find a set of solutions which are diverse enough to represent the entire range of the Pareto-optimal front.
Basic aim: Guiding the search towards the Pareto set and keeping a diverse set of nondominated solutions.
Algorithm Design Issues
[Figure: the three main design issues in an evolutionary multi-objective algorithm: 1. fitness assignment, 2. diversity preservation, 3. elitism.]

Eckart Zitzler, A Tutorial on Evolutionary Multiobjective Optimization. ETH Zurich, 2002.

1. Fitness Assignment Strategies
• Unlike single-objective optimization, multiple objectives exist.
• Fitness assignment and selection go hand in hand.
• Fitness assignment can be classified into the following categories:
  • A) Aggregation-based, e.g., MOEA/D
  • B) Criterion-based (objective-based), e.g., VEGA
  • C) Dominance-based, e.g., NSGA-II
A) Aggregation-based (scalarization-based)
• Aggregation Approach: Combines multiple objectives into a single parameterized objective function to generate a trade-off surface.
• Systematic Variation: Parameters are adjusted during optimization to find a set of nondominated solutions instead of just one.
• Weighted-Sum Aggregation: Some MOEAs use a weighted-sum approach, where weights change dynamically during evolution.
• Pareto Dominance in Fitness Calculation: The concept of evaluating fitness based on Pareto dominance was introduced by Goldberg.
A) Aggregation-based (scalarization-based)
• Parameters: the weights w1, …, wk.
• Weighted sum: F(x) = w1·f1(x) + w2·f2(x) + … + wk·fk(x)
• Or the weighted Chebyshev form: F(x) = max_i wi·(fi(x) - zi), where z = (z1, …, zk) is a reference point.
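
As a small illustration of these two scalarizations (the function names, weights, and reference point below are made up for the example; all objectives are assumed to be minimized):

```python
def weighted_sum(f, w):
    """Weighted-sum scalarization: F(x) = sum_i w_i * f_i(x)."""
    return sum(wi * fi for wi, fi in zip(w, f))

def weighted_chebyshev(f, w, z):
    """Weighted Chebyshev scalarization: F(x) = max_i w_i * (f_i(x) - z_i),
    where z is a reference (ideal) point."""
    return max(wi * (fi - zi) for wi, fi, zi in zip(w, f, z))

# Hypothetical objective vector, weights, and reference point:
f = (3.0, 7.0)   # f1(x), f2(x)
w = (0.6, 0.4)   # relative importance of the objectives
z = (1.0, 2.0)   # ideal/reference point
print(weighted_sum(f, w))           # 0.6*3 + 0.4*7 = 4.6
print(weighted_chebyshev(f, w, z))  # max(0.6*2, 0.4*5) = 2.0
```

Varying the weights (or the reference point) and re-optimizing is how a single-objective search can be steered toward different parts of the trade-off surface.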
A) Aggregation-based: Advantages
• Simple Implementation:
• Easy to integrate into traditional optimization frameworks.
• Reduces multi-objective optimization to a single-objective problem.
• Computationally Efficient:
• Faster convergence compared to Pareto-based methods.
• Requires fewer function evaluations.
• Allows Fine Control Over Trade-offs:
• Weight parameters can be adjusted to prioritize specific objectives.
• Useful when decision-makers have clear preferences.
A) Aggregation-based: Disadvantages
• Difficult to Define Appropriate Weights:
• Choosing the right weights can be subjective and problem-dependent.
• Small changes in weights may lead to very different solutions.
• Cannot Capture Non-Convex Pareto Fronts:
• Weighted-sum methods struggle with non-convex trade-off surfaces.
• Some Pareto-optimal solutions may be missed.
A) Aggregation-based: Disadvantages
• Loss of Diversity in Solutions:
• May favor certain objectives over others, leading to biased results.
• Difficult to obtain a well-distributed set of Pareto-optimal solutions.

• Lack of Adaptability:
• Fixed weights do not adapt dynamically during evolution.
• Requires multiple runs with different weights to explore the entire Pareto front.
B) Criterion-based (Objective-based)
• These methods switch between objectives during the selection phase.
• Different objectives may influence the selection of individuals at different times.
• Example: Vector Evaluated Genetic Algorithm (VEGA) by David Schaffer.
• Subpopulations are created, each evaluated based on a different objective.
• Ensures that solutions for all objectives are explored separately.
B) Criterion-based (Objective-based)
• Advantages:
• Simple & Easy to Implement (can extend single-objective algorithms).
• Finds Individual Best Solutions (useful when independent optimal solutions for each objective are
needed).

• Disadvantages:
• Each solution considers only one objective (ignores the trade-offs).
• May get stuck in local optima (reduces diversity in the final solution set).
B) Criterion-based: Example
• A company wants to design a lightweight yet strong material for an aircraft component. The optimization problem has two conflicting objectives:
  1. Minimize weight (kg): reduce material usage.
  2. Maximize strength (MPa): ensure durability.
• Apply VEGA
• 1) Initialization
  • A population of 10 solutions is randomly selected from the available materials.
  • Each solution is a material selection for the component.
• 2) Subpopulation creation
  • VEGA splits the population into two equal subpopulations:
    • Subpopulation 1 optimizes weight.
    • Subpopulation 2 optimizes strength.
B) Criterion-based: Example
• 3) Selection process
  • In Subpopulation 1, solutions are ranked by minimum weight.
  • In Subpopulation 2, solutions are ranked by maximum strength.
  • The best-ranked solutions from each subpopulation are combined to form the next generation (see the sketch after this slide).
• 4) Crossover & mutation
  • Crossover: Materials can be swapped between solutions to create new combinations.
  • Mutation: A randomly selected material can be replaced with another to explore more options.
• 5) Repeat for multiple generations
  • The algorithm iterates, maintaining diversity and evolving toward a Pareto-optimal front, balancing weight vs. strength.
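
A minimal sketch of the VEGA-style selection step just described; the material encodings, objective functions, and pool size below are hypothetical, and plain truncation selection is used instead of VEGA's proportional selection for brevity:

```python
import random

def vega_mating_pool(population, objectives, pool_size):
    """Build a mating pool by splitting selection equally across the objectives.

    Each objective selects the solutions that are best for that single
    objective (truncation selection here for simplicity); the shares are
    then merged and shuffled before crossover.
    """
    share = pool_size // len(objectives)
    pool = []
    for obj in objectives:
        ranked = sorted(population, key=obj)   # lower value is better
        pool.extend(ranked[:share])
    random.shuffle(pool)
    return pool

# Hypothetical materials encoded as (weight_kg, -strength_MPa), so that
# both objectives are minimized:
materials = [(12.0, -300), (9.5, -250), (14.0, -380), (8.0, -200), (11.0, -320)]
objectives = [lambda m: m[0],   # minimize weight
              lambda m: m[1]]   # minimize -strength, i.e., maximize strength
print(vega_mating_pool(materials, objectives, pool_size=4))
```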
C) Dominance-based
• Dominance metrics in multi-objective optimization
• In Pareto-based evolutionary algorithms, dominance metrics help in ranking solutions based on how they compare with others.
• 1) Dominance Rank (DR): "How many individuals dominate a given solution?"
  • Definition: The number of individuals in the population that dominate a given solution.
  • Lower is better. A solution with DR = 0 is non-dominated (Pareto front 1).
  • Example: If Solution A is dominated by three other solutions, then DR(A) = 3.
• 2) Dominance Count (DC): "How many individuals does a solution dominate?"
  • Definition: The number of individuals that a given solution dominates.
  • Higher is better. A strong solution dominates many weaker solutions.
  • Example: If Solution B dominates five other solutions, then DC(B) = 5.
C) Dominance-based
• 3) Dominance Depth (DD) → "At which Pareto front is a solution located?"
• Definition: The layer number (Pareto front index) in which a solution lies.
• Front 1 contains non-dominated solutions.
• Front 2 contains solutions dominated only by those in Front 1, and so on.
• Example: If Solution C is in the second Pareto front, then DD(C) = 2.
C) Dominance-based
• Example population & dominance metrics
• Consider a multi-objective problem with three solutions (A, B, C) optimizing two conflicting objectives (e.g., cost vs. quality).
• Interpretation:
  • Solution A is the best: it is non-dominated (Front 1, DR = 0, DC = 2).
  • Solution B is dominated only by A: it is in Front 2.
  • Solution C is the worst: it is dominated by both A and B (Front 3).
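
Purely as an illustration, the sketch below assigns hypothetical (cost, quality-loss) values to A, B, and C that reproduce the relations described above (A dominates B and C, and B dominates C), and computes dominance rank, count, and depth; it reuses the dominates() helper from the earlier sketch:

```python
def dominance_metrics(pop):
    """Compute dominance rank (DR), dominance count (DC), and
    dominance depth (DD = Pareto front index) for each named solution."""
    names = list(pop)
    dr = {n: sum(dominates(pop[m], pop[n]) for m in names if m != n) for n in names}
    dc = {n: sum(dominates(pop[n], pop[m]) for m in names if m != n) for n in names}

    dd, remaining, front = {}, set(names), 1
    while remaining:
        # current front: solutions not dominated by any other remaining solution
        current = {n for n in remaining
                   if not any(dominates(pop[m], pop[n]) for m in remaining if m != n)}
        for n in current:
            dd[n] = front
        remaining -= current
        front += 1
    return dr, dc, dd

# Hypothetical (cost, quality-loss) values, both minimized, chosen so that
# A dominates B and C, and B dominates C:
pop = {"A": (2, 2), "B": (3, 3), "C": (4, 4)}
print(dominance_metrics(pop))
# DR: A=0, B=1, C=2   DC: A=2, B=1, C=0   DD: A=1, B=2, C=3
```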
2. Diversity Preservation
• Ensuring a well-distributed Pareto front is crucial for obtaining a diverse set of trade-off solutions. Diversity preservation prevents solutions from clustering in certain regions while maintaining coverage across all objectives.
• The chance of an individual being selected depends on the density of solutions in its vicinity:
  • Increases: if the number of nearby solutions is low (promotes diversity).
  • Decreases: if the number of nearby solutions is high (prevents overcrowding).
• Common diversity preservation methods (a crowding-distance sketch follows this list):
  • 1. Crowding Distance (NSGA-II)
  • 2. Strength Pareto Ranking (SPEA2)
  • 3. Reference-Point-Based Methods (MOEA/D)
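
As a compact sketch of the first of these methods, the crowding distance used in NSGA-II can be computed per front as follows (the objective vectors below are made up; boundary solutions receive infinite distance so they are always preferred):

```python
def crowding_distance(front):
    """Crowding distance for a list of objective vectors in one front.

    A larger distance means fewer neighbours nearby, so the solution is
    preferred when diversity has to be preserved.
    """
    n, k = len(front), len(front[0])
    dist = [0.0] * n
    for m in range(k):                                    # handle each objective separately
        order = sorted(range(n), key=lambda i: front[i][m])
        dist[order[0]] = dist[order[-1]] = float("inf")   # boundary solutions
        span = front[order[-1]][m] - front[order[0]][m] or 1.0
        for j in range(1, n - 1):
            dist[order[j]] += (front[order[j + 1]][m] - front[order[j - 1]][m]) / span
    return dist

# Hypothetical non-dominated front (both objectives minimized):
front = [(1.0, 5.0), (2.0, 3.0), (3.0, 2.0), (5.0, 1.0)]
print(crowding_distance(front))   # boundary points -> inf, interior points finite
```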
3. Elitism: Environmental Selection
• Ensures good solutions are retained even if new generations produce worse
solutions.
• Reduces the risk of genetic drift, where random variations might eliminate optimal
solutions.
• Helps maintain diversity while guiding the population toward the Pareto front.
3. Elitism: Environmental Selection
• Selection criteria for environmental selection
1. Dominance:
  • Only non-dominated (Pareto-optimal) solutions are kept.
  • Dominated solutions (no better in any objective and strictly worse in at least one) are discarded.
2. Density-based selection:
  • Solutions in less crowded regions are preferred.
  • Ensures diversity by avoiding too many solutions in one part of the search space.
  • Example: crowding distance in NSGA-II.
3. Elitism: Environmental Selection
• Selection criteria for environmental selection
3. Time-based selection:
  • Older, high-quality solutions are favored over newly introduced solutions.
  • Prevents frequent replacement of good solutions.
  • Example: used in steady-state evolutionary algorithms.
4. Chance-based selection:
  • Each solution has an equal probability of being retained in the elite archive.
  • Helps maintain exploration by not always favoring the best-known solutions.
A sketch combining the dominance and density criteria is given below.
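
As a rough illustration (not the slides' own algorithm), the sketch below combines the dominance and density criteria in an NSGA-II-style elitist environmental selection; it reuses the dominates() and crowding_distance() helpers from the earlier sketches, and the population values and mu are hypothetical:

```python
def environmental_selection(population, mu):
    """Keep the mu best objective vectors (elitism).

    Fronts are filled in order of dominance depth; if the last front does
    not fit completely, it is truncated by crowding distance (density).
    Reuses dominates() and crowding_distance() from the sketches above.
    """
    survivors, remaining = [], list(population)
    while remaining and len(survivors) < mu:
        # current non-dominated front among the remaining solutions
        front = [x for x in remaining
                 if not any(dominates(y, x) for y in remaining if y is not x)]
        if len(survivors) + len(front) <= mu:
            survivors.extend(front)                  # the whole front fits
        else:
            dist = crowding_distance(front)          # truncate by density
            order = sorted(range(len(front)), key=lambda i: dist[i], reverse=True)
            survivors.extend(front[i] for i in order[:mu - len(survivors)])
        remaining = [x for x in remaining if x not in front]
    return survivors

# Hypothetical merged parent + offspring population (both objectives minimized):
merged = [(1, 6), (2, 4), (4, 2), (6, 1), (3, 5), (5, 3), (6, 6)]
print(environmental_selection(merged, mu=3))
# Keeps the two extreme points of the first front plus one well-spread interior point.
```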
Techniques to solve MOP
• 1) A Priori Methods
• Definition: Preferences are defined before the optimization process starts.
• How it Works: The decision-maker assigns weights, goals, or priorities to objectives in
advance.
• Example: Weighted Sum Method (assigning importance to each objective).
• Pros: Simple and fast.
• Cons: Requires accurate preference estimation upfront, may miss other valuable
solutions.
Techniques to solve MOP
• 2) A Posteriori Methods:
• Definition: Optimization is performed first, and preferences are applied after generating
a set of solutions.
• How it Works: The algorithm finds the Pareto front, and the decision-maker selects the
preferred solution from this set.
• Example: NSGA-II, MOPSO (evolutionary algorithms providing diverse solutions).
• Pros: Provides a range of trade-off solutions, flexible for decision-making.
• Cons: Computationally intensive, especially for large problems.
Techniques to solve MOP
• 3) Interactive Methods:
  • Definition: Preferences are refined during the optimization process in an iterative manner.
  • How it Works: The decision-maker gives feedback after each iteration to guide the search toward more preferred regions.
  • Example: Reference Point-Based Methods, Trade-off Analysis.
  • Pros: Dynamic adjustment to evolving preferences, better alignment with the decision-maker's goals.
  • Cons: Requires continuous involvement, can be time-consuming.
Techniques to solve MOP
Optimization and Decision Making
[Figure: the relationship between optimization and decision making in the a priori, a posteriori, and interactive approaches.]

Eckart Zitzler, A Tutorial on Evolutionary Multiobjective Optimization. ETH Zurich, 2002.

Why EC algorithms?
• Evolutionary algorithms (EAs) are well-suited for solving multi-
objective optimization problems because they work with a
population of solutions simultaneously. This enables the
discovery of multiple Pareto-optimal solutions in a single run,
unlike traditional mathematical programming techniques that
require multiple runs.
• Additionally, EAs are robust and less affected by the shape or
continuity of the Pareto front, making them effective for complex,
non-linear problems.
References
• Multi-Objective Optimization using Artificial Intelligence Techniques, Chapter 2.
• A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing, Chapter 12.
