Dealing with Several Variables
Derivatives & Multivariable Functions
A 2-Variable Function
Some questions
• The surface shown on the preceding slide depicts a
function of two variables – 𝑓(𝑥, 𝑦) or 𝑧 = 𝑓(𝑥, 𝑦) .
• Some choice of 𝑥, 𝑦 will maximize 𝑧 = 𝑓(𝑥, 𝑦).
• How can we find this value?
• Can we use a similar approach to the case 𝑦 = 𝑓(𝑥)?
• What if 3 or 4 or, in general, 𝑛 variables are involved?
Some Problems with Answers
• Previously, the notions of “maximal” and “minimal” were
supported by considering the gradients of lines
touching a point (tangent lines).
• When this gradient was 0 we could justify the point
as critical since the tangent line was “horizontal”:
small changes to either side would produce positive
(increasing) or negative (decreasing) gradients.
• For two or more variables we can no longer safely
use this analogy.
A Different Approach
• Consider a function such as 𝑧 = −(3𝑥² + 2𝑦² + 𝑥𝑦).
• What are the value(s) of 𝑥 and 𝑦 which maximize 𝑧?
• Are there, in fact, any such values?
• and, if so, how do we find them?
• We use the concept of partial derivative to do this.
Partial Derivatives
Consider again the function 𝑧 = −(3𝑥² + 2𝑦² + 𝑥𝑦).
1. We need value(s) of 𝑥 and 𝑦 which maximize 𝑧.
2. We know how to find the 𝑥 that maximizes 𝑧 when 𝑦 is fixed.
3. We know how to find the 𝑦 that maximizes 𝑧 when 𝑥 is fixed.
4. But we want to do things simultaneously.
5. Assume that 𝑦 is fixed and find the “right” 𝑥.
6. Assume that 𝑥 is fixed and find the “right” 𝑦.
7. (5) and (6) yield simultaneous equations.
Deriving the system of equations
Consider again the function 𝑧 = −(3𝑥² + 2𝑦² + 𝑥𝑦).
1. Assume that 𝑦 is fixed and find the “right” 𝑥.
Meaning: differentiate 𝑓(𝑥, 𝑦) “as if 𝑦 were a constant”
𝑓𝑥(𝑥, 𝑦) = −(6𝑥 + 𝑦)
2. Assume that 𝑥 is fixed and find the “right” 𝑦.
Meaning: differentiate 𝑓(𝑥, 𝑦) “as if 𝑥 were a constant”
𝑓𝑦(𝑥, 𝑦) = −(4𝑦 + 𝑥)
3. Now find the values of 𝑥, 𝑦 for which 𝑓𝑥(𝑥, 𝑦) = 0
AND 𝑓𝑦(𝑥, 𝑦) = 0: that is, solve the simultaneous
equations.
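As a hedged illustration (a sketch, not part of the slides), the same two partial derivatives and the resulting simultaneous equations can be reproduced with the sympy library in Python; the symbols and the function below simply mirror the running example.

    import sympy as sp

    x, y = sp.symbols('x y')
    f = -(3*x**2 + 2*y**2 + x*y)

    # Differentiate "as if y were a constant", then "as if x were a constant"
    fx = sp.diff(f, x)   # -6*x - y
    fy = sp.diff(f, y)   # -x - 4*y

    # Solve fx = 0 AND fy = 0 simultaneously
    print(sp.solve([sp.Eq(fx, 0), sp.Eq(fy, 0)], [x, y]))   # {x: 0, y: 0}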
A Bit of notation
• For functions 𝑧 = 𝑓(𝑥, 𝑦) we have:
𝑓𝑥(𝑥, 𝑦): the (first) partial derivative of 𝑓(𝑥, 𝑦) wrt 𝑥
𝑓𝑦(𝑥, 𝑦): the (first) partial derivative of 𝑓(𝑥, 𝑦) wrt 𝑦
Also used are:
𝜕𝑧/𝜕𝑥 ; 𝜕𝑧/𝜕𝑦
• For 𝑧 = 𝑓(𝑥1, 𝑥2, … , 𝑥𝑘, … , 𝑥𝑛), the partial derivative wrt 𝑥𝑘 is written
𝜕𝑧/𝜕𝑥𝑘
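As a small sketch of this notation (the three-variable function below is an invented illustration, not from the slides), sympy's diff gives each 𝜕𝑧/𝜕𝑥𝑘 by holding every other variable fixed.

    import sympy as sp

    x1, x2, x3 = sp.symbols('x1 x2 x3')
    z = x1**2 + x1*x2 + sp.sin(x3)   # an illustrative z = f(x1, x2, x3)

    # dz/dx_k: differentiate wrt x_k, treating the other variables as constants
    print(sp.diff(z, x1))   # 2*x1 + x2
    print(sp.diff(z, x2))   # x1
    print(sp.diff(z, x3))   # cos(x3)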
For 𝑧 = −(3𝑥² + 2𝑦² + 𝑥𝑦)
• 𝑓𝑥(𝑥, 𝑦) = −(6𝑥 + 𝑦)
• 𝑓𝑦(𝑥, 𝑦) = −(4𝑦 + 𝑥)
−6𝑥 − 𝑦 = 0
−𝑥 − 4𝑦 = 0
• Only solution is: 𝑥 = 0 ; 𝑦 = 0
• −(3𝑥² + 2𝑦² + 𝑥𝑦) is maximized at the point (0, 0).
• Q: How do we know this is a maximum?
• A: We need an analogue of The Second Derivative Test.
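An informal check (a sketch, not a proof and not from the slides): probing a grid of points near (0, 0) shows that no nearby value of 𝑓 exceeds 𝑓(0, 0) = 0, which is consistent with a maximum there.

    def f(x, y):
        # The running example z = -(3x^2 + 2y^2 + xy)
        return -(3*x**2 + 2*y**2 + x*y)

    # Probe a small grid around (0, 0); note f(0, 0) = 0
    grid = [i / 10 for i in range(-5, 6)]
    vals = [f(a, b) for a in grid for b in grid]
    print(all(v <= 0 for v in vals))   # True: no grid point beats f(0, 0)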
The Second Derivative Test with 2 variables
• This is similar, but rather more involved.
• Problem 1: With 2 variables there are 4 possible
forms of “second order” partial derivative.
• Problem 2: with 𝑛 variables there are 𝑛² of them.
• Problem 3: how do we “combine” these?
• The 4 Forms: 𝑓𝑥𝑥 , 𝑓𝑥𝑦 , 𝑓𝑦𝑥 , 𝑓𝑦𝑦 or
𝜕²𝑧/𝜕𝑥² , 𝜕²𝑧/𝜕𝑥𝜕𝑦 , 𝜕²𝑧/𝜕𝑦𝜕𝑥 , 𝜕²𝑧/𝜕𝑦²
Interpretation
• 𝑓𝑥𝑥 means “the (partial) derivative of 𝑓𝑥 wrt 𝑥”
• 𝑓𝑥𝑦 means “the (partial) derivative of 𝑓𝑦 wrt 𝑥”
• 𝑓𝑦𝑥 means “the (partial) derivative of 𝑓𝑥 wrt 𝑦”
• 𝑓𝑦𝑦 means “the (partial) derivative of 𝑓𝑦 wrt 𝑦”
• For 𝑓(𝑥, 𝑦) = −(3𝑥² + 2𝑦² + 𝑥𝑦):
𝑓𝑥𝑥 = −6 ; 𝑓𝑥𝑦 = −1 ; 𝑓𝑦𝑥 = −1 ; 𝑓𝑦𝑦 = −4
• In general (whenever the second partial derivatives are continuous): 𝑓𝑥𝑦 and 𝑓𝑦𝑥 are identical functions.
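A sketch (using the same sympy setup as before, not part of the slides) confirming the four second-order partials of the running example and that 𝑓𝑥𝑦 = 𝑓𝑦𝑥 here.

    import sympy as sp

    x, y = sp.symbols('x y')
    f = -(3*x**2 + 2*y**2 + x*y)

    # The four second-order partial derivatives
    fxx = sp.diff(f, x, 2)   # -6
    fyy = sp.diff(f, y, 2)   # -4
    fxy = sp.diff(f, y, x)   # -1: the derivative of fy wrt x (the slide's fxy)
    fyx = sp.diff(f, x, y)   # -1: the derivative of fx wrt y (the slide's fyx)
    print(fxy == fyx)        # True: the mixed partials agree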
Using the Test
• There is a precondition: at a critical point (𝛼, 𝛽),
(𝑓𝑥𝑥 𝑓𝑦𝑦 − 𝑓𝑥𝑦²)(𝛼, 𝛽) > 0
• if 𝑓𝑥𝑥(𝛼, 𝛽) > 0 the point is a minimum.
• if 𝑓𝑥𝑥(𝛼, 𝛽) < 0 the point is a maximum.
• if (𝑓𝑥𝑥 𝑓𝑦𝑦 − 𝑓𝑥𝑦²)(𝛼, 𝛽) = 0 no conclusion can be made.
• There is a sophisticated extension to 𝑛 variables based on
the Hessian matrix. (see textbook pages 168-9)
• A more detailed two variable case may be found on
pages 169 – 173 of the module text.
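Applying the test to the running example (a sketch, not from the slides): at the critical point (0, 0) we get 𝑓𝑥𝑥 𝑓𝑦𝑦 − 𝑓𝑥𝑦² = (−6)(−4) − (−1)² = 23 > 0, and 𝑓𝑥𝑥 = −6 < 0, so (0, 0) is indeed a maximum. The same check in sympy:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = -(3*x**2 + 2*y**2 + x*y)

    fxx = sp.diff(f, x, 2)
    fyy = sp.diff(f, y, 2)
    fxy = sp.diff(f, y, x)

    # Evaluate D = fxx*fyy - fxy**2 at the critical point (0, 0)
    point = {x: 0, y: 0}
    D = (fxx*fyy - fxy**2).subs(point)   # (-6)*(-4) - (-1)**2 = 23

    if D > 0:
        # Precondition holds: the sign of fxx decides
        print("maximum" if fxx.subs(point) < 0 else "minimum")   # prints "maximum"
    else:
        print("no conclusion from this form of the test")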