CE880 Lecture9 Slides
Mohsin Ali
14 March 2023
About Myself
Why Explainable AI?
Machine Learning models are often black boxes, which makes them difficult to interpret,
even for the model's creator. To understand why a particular model returns a given
prediction, it is important to identify the main features that affect the model's output.
Source: Topbots
Clever Hans
In 1907, a horse named Clever Hans was claimed to be able to perform arithmetic and
other intellectual tasks; it later turned out he was responding to involuntary cues
from the people around him.
Source: Wikipedia
Does the highest accuracy mean the best model?
Examples:
- Covid Detection
- Cancer Detection
- Fraud Detection
Real-life Examples | Amazon
By 2015, Amazon realized its new system was not rating candidates for software
developer jobs and other technical posts in a gender-neutral way.
Source: Reuters
Real-life Examples | IBM, Microsoft
Face recognition algorithms boast high classification accuracy (over 90%), but these
outcomes are not universal. A growing body of research exposes divergent error rates
across demographic groups, with the poorest accuracy consistently found in subjects
who are female, Black, and 18-30 years old.
Source: Harvard University, BBC
Further Explanation
The documentary Coded Bias lists actual examples of algorithms already making decisions
about your credit, your health, your housing, your college or job applications, even
your access to possibility. The system holds your hope of a better life in its hands,
and there is no appeal if the computer says no.
Source: Netflix, MIT, CNET
What is Explainable AI (XAI)
Explainable artificial intelligence (XAI) is a set of processes and methods that allows
human users to understand the results generated by machine learning algorithms.
XAI helps to:
- Build trust
- Meet regulatory standards
- Ensure fairness
Explainable AI (XAI) Process
Types of Explainable AI methods
SHAP (SHapley Additive exPlanations)
Source: https://github.com/slundberg/shap
SHAP Explanation
Source: https://github.com/slundberg/shap
Game Theory
Types of Game Theory
Non-Cooperative Game Theory | Prisoner's Dilemma
Non-Cooperative Game Theory
Non-Cooperative Game Theory states that the players have reached a Nash Equilibrium
when no player can improve their outcome by unilaterally changing their own choice,
given what their opponents decide to do.
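As a sketch of this idea, the brute-force check below finds the Nash Equilibrium of the Prisoner's Dilemma. The payoff matrix (years in prison, so lower is better) uses the standard textbook values, which are an assumption here since the slide's own figures are not in the text.

```python
from itertools import product

# Standard Prisoner's Dilemma payoffs (assumed textbook values):
# (my_move, opponent_move) -> (my_years, opponent_years), lower is better.
PAYOFFS = {
    ("cooperate", "cooperate"): (1, 1),
    ("cooperate", "defect"):    (3, 0),
    ("defect",    "cooperate"): (0, 3),
    ("defect",    "defect"):    (2, 2),
}
MOVES = ("cooperate", "defect")

def is_nash(a, b):
    """(a, b) is a Nash Equilibrium if neither player can do better
    by unilaterally changing their own move."""
    my_years, their_years = PAYOFFS[(a, b)]
    # Player 1 deviates (a profitable deviation means strictly fewer years):
    if any(PAYOFFS[(a2, b)][0] < my_years for a2 in MOVES):
        return False
    # Player 2 deviates:
    if any(PAYOFFS[(a, b2)][1] < their_years for b2 in MOVES):
        return False
    return True

equilibria = [(a, b) for a, b in product(MOVES, MOVES) if is_nash(a, b)]
print(equilibria)  # both players defecting is the unique equilibrium
```

Note that mutual defection is the equilibrium even though mutual cooperation would leave both players better off, which is the point of the dilemma.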
Cooperative Game Theory
- Cooperative Games are games where all players agree to work together to achieve a
common goal
- Cooperative Game Theory tries to find how much each player has contributed to
the game and how much reward each player should receive
- Just as Non-Cooperative Game Theory has the Nash Equilibrium,
Cooperative Game Theory has Shapley values
Shapley Values
- The contribution of each player is determined by what is lost or gained by
removing them from the game (their marginal contribution)
- Interchangeable players have equal values
- Dummy players should have zero value
- If a game has multiple parts, the reward should be calculated for each part separately
Shapley Values | Example
Shapley values state that you should get $15 and your friend should get $25 of the $40
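A minimal sketch of how that split can be computed. The individual coalition values below (v(you) = 10, v(friend) = 20, v(both) = 40) are an assumption, since the slide's figures are not in the text; they are chosen so that the Shapley formula reproduces the stated $15/$25 split.

```python
from itertools import permutations

# Assumed coalition values (the slide's figures are not in the text):
# what each player earns alone, and what they earn together.
v = {
    frozenset(): 0,
    frozenset({"you"}): 10,
    frozenset({"friend"}): 20,
    frozenset({"you", "friend"}): 40,
}
players = ["you", "friend"]

def shapley(player):
    """Average the player's marginal contribution over all join orders."""
    total = 0.0
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            before = v[frozenset(coalition)]
            coalition.add(p)
            if p == player:
                total += v[frozenset(coalition)] - before
                break
    return total / len(orders)

print({p: shapley(p) for p in players})  # {'you': 15.0, 'friend': 25.0}
```

The same averaging-over-orderings idea is exactly what SHAP applies to features of a model instead of players of a game.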
Titanic
On April 15, 1912, during her maiden voyage, the widely considered “unsinkable” RMS
Titanic sank after colliding with an iceberg. Unfortunately, there weren’t enough
lifeboats for everyone onboard, resulting in the death of 1502 out of 2224 passengers
and crew.
While there was some element of luck involved in surviving, it seems some groups of
people were more likely to survive than others.
Source: Wikipedia, Kaggle
Titanic Dataset
Source: Kaggle
Install SHAP
Source: https://shap-lrjball.readthedocs.io/
Downloading the dataset
Source: Kaggle
Load Dataset
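The loading step can be sketched as below. The inline CSV is a tiny stand-in for Kaggle's train.csv: the column names follow the standard Titanic dataset, but the row values are made up for illustration.

```python
import io
import pandas as pd

# Tiny inline sample standing in for Kaggle's train.csv (rows are made up).
CSV = """PassengerId,Survived,Pclass,Name,Sex,Age,Fare,Embarked
1,0,3,"Braund, Mr. Owen",male,22,7.25,S
2,1,1,"Cumings, Mrs. John",female,38,71.28,C
3,1,3,"Heikkinen, Miss Laina",female,26,7.92,S
4,0,3,"Allen, Mr. William",male,35,8.05,S
"""

df = pd.read_csv(io.StringIO(CSV))   # in practice: pd.read_csv("train.csv")
print(df.shape)                      # (rows, columns)
print(df.head())
```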
Change the String data to Numerical data
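One way to sketch this step, assuming the usual Sex and Embarked string columns (the tiny frame below is made up for illustration):

```python
import pandas as pd

# Minimal stand-in for the Titanic frame (values made up for illustration).
df = pd.DataFrame({"Sex": ["male", "female", "female", "male"],
                   "Embarked": ["S", "C", "S", "Q"]})

# Map each string category to an integer code.
df["Sex"] = df["Sex"].map({"male": 0, "female": 1})
df["Embarked"] = df["Embarked"].map({"S": 0, "C": 1, "Q": 2})
print(df)
```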
Data Information
Fill null values
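A sketch of the filling step: a common choice (assumed here) is the median for numeric columns and the mode for categorical ones.

```python
import numpy as np
import pandas as pd

# Stand-in frame with missing values (made up for illustration).
df = pd.DataFrame({"Age": [22.0, np.nan, 26.0, np.nan],
                   "Embarked": ["S", None, "C", "S"]})

# Numeric column: fill with the median; categorical: fill with the mode.
df["Age"] = df["Age"].fillna(df["Age"].median())
df["Embarked"] = df["Embarked"].fillna(df["Embarked"].mode()[0])
print(df.isnull().sum())  # all zeros after filling
```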
Verify Data
Continuous data to categorical data
We will convert the continuous data to categorical data for better interpretability
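The binning step can be sketched with pd.cut (fixed edges) and pd.qcut (quantiles); the bin edges and labels below are assumptions for illustration.

```python
import pandas as pd

# Stand-in continuous columns (values made up for illustration).
df = pd.DataFrame({"Age": [4, 22, 38, 67], "Fare": [7.25, 8.05, 71.28, 30.0]})

# Fixed age bands (edges are an assumption) and quantile-based fare bands.
df["AgeBand"] = pd.cut(df["Age"], bins=[0, 12, 30, 60, 100],
                       labels=["child", "young", "adult", "senior"])
df["FareBand"] = pd.qcut(df["Fare"], q=2, labels=["low", "high"])
print(df)
```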
String data to Numerical
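After binning, the new string bands still need integer codes before model fitting; a sketch using pandas' categorical codes:

```python
import pandas as pd

# Stand-in banded column (ordered categories, values made up).
df = pd.DataFrame({"AgeBand": pd.Categorical(
    ["child", "young", "adult", "young"],
    categories=["child", "young", "adult"], ordered=True)})

# Replace the string categories with their integer codes.
df["AgeBand"] = df["AgeBand"].cat.codes
print(df["AgeBand"].tolist())  # [0, 1, 2, 1]
```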
Remove unnecessary features
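Identifier-like columns carry no predictive signal, so they can be dropped; the column choice below follows the usual Titanic workflow and is an assumption.

```python
import pandas as pd

# Stand-in frame (values made up for illustration).
df = pd.DataFrame({"PassengerId": [1, 2], "Name": ["A", "B"],
                   "Ticket": ["x1", "x2"], "Cabin": [None, "C85"],
                   "Sex": [0, 1], "Survived": [0, 1]})

# Drop identifier-like columns that do not help prediction.
df = df.drop(columns=["PassengerId", "Name", "Ticket", "Cabin"])
print(df.columns.tolist())  # ['Sex', 'Survived']
```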
Fit Machine Learning model
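A sketch of the fitting step with scikit-learn; the random forest is an assumption (any classifier would do, but tree models pair well with SHAP's TreeExplainer later), and the tiny frame stands in for the preprocessed Titanic data.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Tiny synthetic stand-in for the preprocessed Titanic frame.
df = pd.DataFrame({"Sex": [0, 1, 1, 0, 0, 1, 1, 0],
                   "Pclass": [3, 1, 3, 3, 2, 1, 2, 3],
                   "Age": [22, 38, 26, 35, 54, 2, 27, 20],
                   "Survived": [0, 1, 1, 0, 0, 1, 1, 0]})

X = df.drop(columns=["Survived"])
y = df["Survived"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```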
Get SHAP values
Plot SHAP | Summary Plot | Bar
Plot SHAP | Summary Plot | Beeswarm
Plot SHAP | Individual Instance | Bar
Plot SHAP | Individual Instance | Waterfall
Returning to 'Titanic'
CE880: An Approachable Introduction to Data Science