2. Exact Methods

The document discusses exact methods in combinatorial optimization, focusing on linear programming (LP) techniques like the Simplex Algorithm and mixed-integer linear programming (MILP) methods such as Branch and Bound. It highlights the Hungarian algorithm for assignment problems and various scheduling algorithms that provide optimal solutions for specific cases. Additionally, it explains the importance of bounding methods and the role of continuous relaxation in the Branch and Bound approach for solving MILP problems.

The essentials of

Combinatorial Optimization
(part 2 – Exact methods)

Prof M. Ghirardi
Exact methods
• LP with continuous variables only
  – Simplex algorithm: exponential worst-case complexity, but roughly O(n³) in practice.
• Ad-hoc methods: for some problems, methods specifically developed around the problem's properties may be available.
• MILP (mixed-integer LP, with integer/boolean variables)
  – Branch and Bound, Branch and Cut, Column Generation, Dynamic Programming, etc.
  – Complexity strongly depends on the problem's characteristics and dimensions.
Simplex method
• An initial vertex is selected.
• Repeat:
  – Find a direction in which the objective function improves.
  – Move in that direction, to an adjacent improving vertex.
  Until no more improvements are possible.
• The Simplex method's worst-case complexity is not polynomial, but in practice it almost always behaves like a polynomial-time algorithm (roughly O(n³)).
Ad-hoc methods example: the Hungarian algorithm for the assignment problem
• The Hungarian method is an algorithm that solves the assignment problem in polynomial time.
• Published in 1955 by Harold Kuhn, who gave it the name "Hungarian method" because the algorithm was largely based on the earlier works of two Hungarian mathematicians: Dénes Kőnig and Jenő Egerváry.
• Consider the following assignment problem, where every job must be assigned to a machine for processing.

         Machine 1  Machine 2  Machine 3
  Job 1      5          7          9
  Job 2     14         10         12
  Job 3     15         13         16
Ad-hoc methods example: the Hungarian algorithm for the assignment problem
Step 1: Select the smallest value in each row. Subtract this value from each value in that row.
Step 2: Do the same for the columns that do not have any zero value. Repeat from Step 1 if possible.
Ad-hoc methods example: the Hungarian algorithm for the assignment problem
Step 3: Assignments are made at zero values. Therefore, we assign Job 1 to Machine 1, Job 2 to Machine 3, and Job 3 to Machine 2.
Ad-hoc methods example: the Hungarian algorithm for the assignment problem

         Machine 1  Machine 2  Machine 3
  Job 1      5          7          9
  Job 2     14         10         12
  Job 3     15         13         16

• The total cost is 5 + 12 + 13 = 30.
• It is not always possible to obtain a feasible assignment as directly as here.
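Steps 1–3 above can be sketched in Python. This is an illustrative sketch only: it covers the row/column reductions and checks the result by brute force (viable only for tiny instances), and it omits Steps 4–5 of the full Hungarian method.

```python
from itertools import permutations

def reduce_matrix(cost):
    # Step 1: subtract each row's minimum from the row.
    rows = [[c - min(row) for c in row] for row in cost]
    # Step 2: subtract each column's minimum from the column.
    col_mins = [min(col) for col in zip(*rows)]
    return [[c - m for c, m in zip(row, col_mins)] for row in rows]

cost = [[5, 7, 9], [14, 10, 12], [15, 13, 16]]
reduced = reduce_matrix(cost)
# A feasible assignment at zero values: Job 1 -> M1, Job 2 -> M3, Job 3 -> M2.

# Brute-force check of the optimal total cost (n! permutations, tiny n only).
best = min(sum(cost[j][m] for j, m in enumerate(p))
           for p in permutations(range(3)))
```

Here `best` equals 30, the same total cost as the assignment read off the zeros of the reduced matrix.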
Ad-hoc methods example: the Hungarian algorithm for the assignment problem
Example 2:

(Figure: cost matrix after row subtractions and column subtractions)

Here row 3 and column 4 do not have any 0s.
Ad-hoc methods example: the Hungarian algorithm for the assignment problem
• A feasible assignment is not possible at this point. In such a case the algorithm has a few additional steps:
• Step 4: Draw a minimum number of lines through some of the rows and columns, such that all zero values are crossed out.
Ad-hoc methods example: the Hungarian algorithm for the assignment problem
Step 5: Select the smallest element not crossed out. Subtract this element from every element not crossed out, and add it to every element at the intersection of two lines.
Ad-hoc methods example: the Hungarian algorithm for the assignment problem
• We can now easily assign at the zero values. The solution is to assign (1 to 1), (2 to 3), (3 to 2) and (4 to 4).
• If drawing lines does not yield a feasible assignment, we repeat Steps 4 and 5 until one is possible.
Ad-hoc methods example: the Hungarian algorithm for the assignment problem
• The Hungarian method in its original formulation had a complexity of O(n⁴).
• It was later modified to achieve the same result with O(n³) complexity.
• The assignment problem is an "easy" one (it belongs to class P).
Ad-hoc methods example: scheduling
• In scheduling, a few problems are solvable with known algorithms.
• Among others, Lawler's algorithm, Moore's algorithm, Johnson's algorithm, etc. are proven to provide an optimal solution for specific scheduling problems.
Ad-hoc methods example: scheduling
• Consider the problem of scheduling n jobs on a single machine. Each job has its own duration (processing time). The machine can process one job at a time. The goal is to minimize the average completion time of all the jobs.
• Consider the following example:

  Jobs             J1  J2  J3  J4  J5
  Processing time   8   2   3   2   5
Ad-hoc methods example: scheduling

  Jobs             J1  J2  J3  J4  J5
  Processing time   8   2   3   2   5

Theorem: Any sequence that places the jobs in order of nondecreasing processing times is optimal.
(SPT rule: Shortest Processing Time)
Ad-hoc methods example: scheduling

  Jobs             J1  J2  J3  J4  J5
  Processing time   8   2   3   2   5

• In the above example, the optimal sequences are J4-J2-J3-J5-J1 and J2-J4-J3-J5-J1.

  Machine: | J4 | J2 | J3 | J5 |    J1    |
           0    2    4    7    12         20

• In both cases the total completion time is 2+4+7+12+20 = 45.
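A minimal Python sketch of the SPT rule on the example above (job indices are 0-based, so J2 is index 1):

```python
def spt_schedule(durations):
    # SPT rule: process jobs in nondecreasing processing time.
    # Returns the job order (0-based indices) and the sum of completion times.
    order = sorted(range(len(durations)), key=lambda j: durations[j])
    t = total = 0
    for j in order:
        t += durations[j]   # completion time of the job just finished
        total += t
    return order, total

order, total = spt_schedule([8, 2, 3, 2, 5])  # J1..J5 from the example
```

Python's stable sort breaks the tie between J2 and J4 in favor of J2, yielding the sequence J2-J4-J3-J5-J1 with total completion time 45, matching the slide.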
Ad-hoc methods example: scheduling
Proof:
• Minimizing the average completion time is equivalent to minimizing the sum of completion times (a function's minimum is at the same point if the function is divided by a positive constant, the number of jobs in our case).
• We build a proof by contradiction (reductio ad absurdum).
• Consider a sequence S in which the jobs are not in order of nondecreasing p_j. Then there is a job J_k that is immediately preceded by a job J_j with p_j > p_k.
• If the two jobs start at time T, J_j completes at time T + p_j, and J_k completes at time T + p_j + p_k. The contribution of the two jobs to the objective function is then 2T + 2p_j + p_k.
• The effect of interchanging these two jobs in the sequence is a contribution of the two jobs equal to 2T + 2p_k + p_j. Hence, the cost decreases by the following amount:
  [2T + 2p_j + p_k] - [2T + 2p_k + p_j] = p_j - p_k > 0
• Hence S cannot be optimal.
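The exchange argument can be checked numerically; a tiny sketch with p_j = 5, p_k = 2 and T = 0 (values chosen only for illustration):

```python
def sum_completion_times(durations):
    # Sum of completion times when jobs run in the given order.
    t = total = 0
    for d in durations:
        t += d
        total += t
    return total

# J_j (p_j = 5) immediately before J_k (p_k = 2), both starting at T = 0:
before = sum_completion_times([5, 2])  # contribution 2T + 2*p_j + p_k = 12
after = sum_completion_times([2, 5])   # contribution 2T + 2*p_k + p_j = 9
# The swap reduces the total by p_j - p_k = 3, as the proof predicts.
```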


Ad-hoc methods example: scheduling
• The algorithm's complexity is then the complexity of a sorting algorithm: O(n log n).
• The problem belongs to class P.
Ad-hoc methods
• In graph theory, many algorithms are optimal and polynomial in complexity:
  – The Fleury algorithm we saw in the introduction for the Eulerian path/cycle problem.
  – Shortest-path algorithms (Dijkstra, Bellman-Ford, A*, etc.)
  – Minimum-cost spanning tree algorithms
  – etc.
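As one illustration of these polynomial algorithms, a minimal sketch of Dijkstra's shortest-path algorithm with a binary heap (O((|V| + |E|) log |V|)); the adjacency-list representation and names are our own choices, not from the slides:

```python
import heapq

def dijkstra(graph, source):
    # graph: dict mapping node -> list of (neighbor, nonnegative edge weight)
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, already improved
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

For example, on the graph a→b (1), a→c (4), b→c (2), `dijkstra(g, 'a')` finds the shortest path to c through b with cost 3.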
Exact methods: Branch & Bound
• Consider an MILP model with only 4 binary variables.
• The following is a search tree representing the complete set of candidate solutions:

(Figure: binary search tree over x1..x4, with 16 leaves)
Exact methods: Branch & Bound
• Note that each node in the search tree represents a different problem.
• The root node represents the original problem.
• Leaf nodes represent complete solutions (all variables are assigned).
• Intermediate-level nodes represent "partial solutions": the original model where only a subset of the variables has been fixed.
Exact methods: Branch & Bound
• For example, node 9 represents the
original problem with three added
constraints (x1=1, x2=0, x3=1).
Exact methods: Branch & Bound
• In this small example with four binary variables, it is possible to evaluate all 16 alternatives.
• But as the number of variables, n, gets larger, the total number of alternatives, 2ⁿ, grows exponentially, making total enumeration computationally infeasible even for the fastest computers.
Exact methods: Branch & Bound
• B&B was proposed in the sixties (one of its first applications was the TSP).
• It consists of a systematic enumeration of candidate solutions forming a search tree.
• It explores branches of this tree. Before enumerating the candidate solutions of a branch, the branch is checked against an estimated bound on the best solution reachable by exploring further that way.
• The branch is discarded if it cannot produce a better solution than the best one found so far by the algorithm.
Exact methods: Branch & Bound
• The algorithm depends on efficient estimation of the lower and upper bounds of regions/branches of the search space.
• Strong bounds mean that branches are "pruned" at the higher levels of the search tree, saving a lot of time in enumerating that branch's solutions.
• If no bounds are available, the algorithm degenerates to an exhaustive search.
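The bounding-and-pruning loop can be sketched on a 0/1 knapsack instance (a maximization MILP), using the greedy fractional relaxation as the bound. This is an illustrative sketch with made-up data, not the LP-based variant the slides develop; for maximization, the relaxation gives an upper bound and a branch is pruned when its bound cannot beat the incumbent.

```python
def fractional_bound(i, value, cap, items):
    # Greedy fractional relaxation from item i onward: an upper bound on the
    # best value reachable in this branch (items sorted by value/weight ratio).
    bound = value
    for v, w in items[i:]:
        if w <= cap:
            cap -= w
            bound += v
        else:
            bound += v * cap / w  # take a fraction of the next item
            break
    return bound

def knapsack_bb(values, weights, capacity):
    items = sorted(zip(values, weights),
                   key=lambda t: t[0] / t[1], reverse=True)
    best = 0
    stack = [(0, 0, capacity)]  # (next item index, value so far, remaining capacity)
    while stack:
        i, value, cap = stack.pop()
        best = max(best, value)  # every node is a feasible partial solution
        if i == len(items) or fractional_bound(i, value, cap, items) <= best:
            continue  # prune: the relaxation bound cannot beat the incumbent
        v, w = items[i]
        if w <= cap:
            stack.append((i + 1, value + v, cap - w))  # branch: take item i
        stack.append((i + 1, value, cap))              # branch: skip item i
    return best
```

On the classic instance values (60, 100, 120), weights (10, 20, 30), capacity 50, the search returns 220 (take the second and third items) while pruning the branches whose fractional bound falls below the incumbent.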
Relaxed problems
• Given a (minimization) problem P, a problem R(P) is a relaxation of P if it is the same problem with one or more constraints deleted.
  – The solution space of the relaxed problem R(P) is larger than that of P.
  – It contains all the solutions of P, plus other solutions that are not feasible for P.
  – The objective value of the optimal solution of R(P) is less than or equal to that of the optimal solution of P (it is a "lower bound" on P's optimal value).
  – If the optimal solution of R(P) is feasible for P, then it is optimal for P.
  – If a solution is not feasible for R(P), it is not feasible for P.
Exact methods: Branch & Bound
Example of a bounding method:
• Consider a (minimization) MILP model with integer variables (problem P).
• Build another problem, identical to the original one, but dropping the integrality constraints (the variables are now real-valued).
• The resulting problem is a relaxed problem.
Exact methods: Branch & Bound

(Figure: feasible region of the relaxation, with the direction of objective-function improvement)
Exact methods: Branch & Bound
Notes:
• Sometimes the continuous solution may turn out to be integer.
• The relaxation is easy to model.
• It is easy to solve (the simplex algorithm behaves polynomially in practice).
Exact methods: Branch & Bound
• At every node n, a lower bound LB(n) (from a relaxation) is computed.
• Optionally, an upper bound UB(n) (from any heuristic) is computed.
• If UB(n) = LB(n), then node n can be closed (the optimal solution from n is UB(n)).
• Let z be the best solution found so far: z = min(UB(n)).
• If z ≤ LB(n), then node n can be ignored and deleted from the open-nodes list.
Continuous Relaxation

(Figure: the relaxed optimum x* at a vertex of the continuous feasible region)
Branching
• Partition the solution space of P into two or more subproblems P1, P2, …, Pn such that:
  – Every feasible solution to P appears in at least one (often exactly one) of P1, P2, …, Pn.
  – x* is infeasible for each of R(P1), R(P2), …, R(Pn).
• For the continuous relaxation, we can choose a non-integer variable x_j* and create one problem with x_j ≤ floor(x_j*) and the other with x_j ≥ floor(x_j*) + 1.
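The branching rule above can be written out directly. In this sketch, a subproblem is represented (our own hypothetical choice) as a list of extra bound constraints of the form (sense, variable index, bound):

```python
import math

def branch_on(constraints, j, xj_star):
    # Split on the fractional relaxed value xj_star of variable j:
    # one child with x_j <= floor(xj_star), one with x_j >= floor(xj_star) + 1.
    # The fractional point xj_star itself is infeasible in both children.
    f = math.floor(xj_star)
    p1 = constraints + [("<=", j, f)]
    p2 = constraints + [(">=", j, f + 1)]
    return p1, p2

p1, p2 = branch_on([], 2, 3.7)  # branch on variable x_2 with relaxed value 3.7
```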
Branching

(Figure: the relaxed region split into the two subproblems P1 and P2)
Selection
• A set of open nodes is kept in memory. Each selected node may be closed (due to the bound criteria) or generate other nodes (branch).
• The branched nodes are added to the set.
• The selection of the next node to branch is another parameter of the algorithm. Often-used strategies are:
  – Select the node with the lowest LB (best first)
  – Select the node at the highest level, exploring the tree by depth (depth first)
  – Select the node at the lowest level, exploring the tree by breadth (breadth first)
Selection

(Figure: search tree with open nodes P1 and P2, and the relaxed solution xrel)

• P2 is selected with a best-first strategy (more promising lower bound).
Branching

• P2 is branched. Note that P2_2 (the second branch of P2) is not feasible.

(Figure: the open nodes are now P1 and P2_1)
Selection

• P1 is selected with a best-first strategy (more promising lower bound).

(Figure: open nodes P1 and P2_1, with the relaxed solution xrel)
Branching

• P1 is branched. Note that P1_2 (the second branch of P1) is not feasible.

(Figure: the open nodes are now P1_1 and P2_1)
Selection

• P2_1 is selected. Its lower bound is integer (hence, an upper bound as well). The node is closed, and x* is kept as the best solution found.

(Figure: open node P1_1; incumbent solution x*)
Pruning nodes

• The lower bound of P1_1 is worse than the objective value of the incumbent solution x*. Hence, P1_1 is closed.
Final result

• x* is the optimal solution of the problem.
LP Based Branch and Bound
• Initialization: z* = +∞, x* = none; NodesList = {Root node}.
• If NodesList is empty, stop: x* is optimal.
• Choose a node P from NodesList.
• Solve the relaxed problem (i.e. the continuous relaxation): R(P) → LB(P), xlb.
  – If R(P) is infeasible, then delete P.
  – If LB(P) ≥ z*, then delete P.
  – If xlb is feasible for the original problem (i.e. integral) and z* > LB(P) (for a minimization), then set z* ← LB(P), x* ← xlb, and delete P.
  – Otherwise, branch P and add the subproblems to NodesList.
• Repeat from the emptiness test.
Branch and Bound
• Modern MILP solvers embed a Branch and Bound algorithm coupled with many additional pruning methods, based on:
  – geometrical representation of the problems;
  – detection of known sub-problem structures;
  – powerful heuristics;
  – etc.
• Solvers are very effective, but running time still grows more than linearly as the problem dimension grows, especially in the presence of "big-M" parameters.
• Other exact methods for MILP: Column Generation, Branch and Cut, Branch and Price.
B&B – Typical behaviour

(Figure: two curves over time — the best solution found until time t, and the minimal bound over all nodes of the search tree still to be explored)
