Using Artificial Intelligence to Promote Diversity

Winter 2019 Issue

AI can help us overcome biases instead of perpetuating them, with guidance from the humans who design, train, and refine its systems.

Paul R. Daugherty, H. James Wilson, and Rumman Chowdhury
Artificial intelligence has had some justifiably bad press recently. Some of the worst stories have been about systems that exhibit racial or gender bias in facial recognition applications or in evaluating people for jobs, loans, or other considerations.1 One program was routinely recommending longer prison sentences for blacks than for whites on the basis of the flawed use of recidivism data.2

But what if instead of perpetuating harmful biases, AI helped us overcome them and make fairer decisions? That could eventually result in a more diverse and inclusive world. What if, for instance, intelligent machines could help organizations recognize all worthy job candidates by avoiding the usual hidden prejudices that derail applicants who don’t look or sound like those in power or who don’t have the “right” institutions listed on their résumés? (See “New Ways to Gauge Talent and Potential,” p. 7.) What if software programs were able to account for the inequities that have limited the access of minorities to mortgages and other loans? In other words, what if our systems were taught to ignore data about race, gender, sexual orientation, and other characteristics that aren’t relevant to the decisions at hand?

AI can do all of this — with guidance from the human experts who create, train, and refine its systems. Specifically, the people working with the technology must do a much better job of building inclusion and diversity into AI design by using the right data to train AI systems to be inclusive and thinking about gender roles and diversity when developing bots and other applications that engage with the public.

Design for Inclusion
Software development remains the province of males — only about one-quarter of computer scientists in the United States are women3 — and minority racial groups, including blacks and Hispanics, are underrepresented in tech work, too.4 Groups like Girls Who Code and AI4ALL have been founded to help close those gaps. Girls Who Code has reached almost 90,000 girls from various backgrounds in all 50 states,5 and AI4ALL specifically targets girls in minority communities. Among other activities, AI4ALL sponsors a summer program with visits to the AI departments of universities such as Stanford and Carnegie Mellon so that participants might develop relationships with researchers who could serve as mentors and role models. And fortunately, the AI field has a number of prominent women — including Fei-Fei Li (Stanford), Vivienne Ming (Singularity University), Rana el Kaliouby (Affectiva), and Cynthia Breazeal (MIT) — who could fill such a need.

These relationships don’t just open up development opportunities for the mentees — they’re also likely to turn the mentors into diversity and inclusion champions, an experience that may affect how they approach algorithm design. Research by sociologists Frank Dobbin of Harvard University and Alexandra Kalev of Tel Aviv University supports this idea: They’ve found that working with mentees from minority groups actually moves the needle on bias for the managers and professionals doing the mentoring, in a way that forced training does not.6

Other organizations have pursued shorter-term solutions for AI-design teams. LivePerson, a company that develops online messaging, marketing, and analytics products, places its customer service staff (a profession that is 65% female in the United States) alongside its coders (usually male) during the development process to achieve a better balance of perspectives.7 Microsoft has created a framework for assembling “inclusive” design teams, which can be more effective for considering the needs and sensitivities of myriad types of customers, including those with physical disabilities.8 The Diverse Voices project at the University of Washington has a similar goal of developing technology on the basis of input from multiple stakeholders to better represent the needs of nonmainstream populations.

Some AI-powered tools are designed to mitigate biases in hiring. Intelligent text editors like Textio can rewrite job descriptions to appeal to candidates from groups that aren’t well represented. Using Textio, software company Atlassian was able to increase the percentage of females among its new recruits from about 10% to 57%.9 Companies can also use AI technology to help identify biases in their past hiring decisions. Deep neural networks — clusters of algorithms that emulate the human ability to spot patterns in data — can be especially effective in uncovering evidence of hidden preferences. Using this technique, an AI-based service such as Mya can help companies analyze their hiring records and see if they have favored candidates with, for example, light skin.
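An audit of this kind does not have to start with a deep neural network. As a first pass, a simple tabulation of historical selection rates by group can show whether one group has been favored; the sketch below illustrates such a check. The tiny inline data set, the group labels, and the 0.8 "four-fifths rule" threshold drawn from U.S. hiring guidance are illustrative assumptions, not details of Mya or any other vendor's product.

```python
# Illustrative sketch: tabulate selection rates by group in historical hiring
# records. The tiny inline table and the 0.8 threshold (the "four-fifths rule"
# used in U.S. hiring guidance) are assumptions made for the example.
import pandas as pd

applicants = pd.DataFrame({
    "group": ["light", "light", "light", "light", "dark", "dark", "dark", "dark"],
    "hired": [1, 1, 1, 0, 1, 0, 0, 0],
})

# Selection rate per group: share of applicants from each group who were hired.
rates = applicants.groupby("group")["hired"].mean()

# Adverse-impact ratio: each group's rate relative to the most-favored group;
# ratios below 0.8 are commonly treated as a red flag worth investigating.
impact_ratio = rates / rates.max()

for grp, ratio in impact_ratio.items():
    flag = "possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{grp}: selection rate {rates[grp]:.0%}, ratio {ratio:.2f} ({flag})")
```

A gap flagged this way is a prompt for human review of the hiring process, not proof of discrimination on its own.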
Train Systems With Better Data

Building AI systems that battle bias is not only a matter of having more diverse and diversity-minded design teams. It also involves training the programs to behave inclusively. Many of the data sets used to train AI systems contain historical artifacts of biases — for example, the word woman is more associated with nurse than with doctor — and if those associations aren’t identified and removed, they will be perpetuated and reinforced.10
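Associations like these can be observed directly in the word embeddings that many language applications are built on. The snippet below is a minimal sketch of such a check; the pretrained GloVe vectors pulled through gensim's downloader (the call fetches data from the internet) and the handful of occupation words are arbitrary choices made for illustration.

```python
# Minimal sketch: measure whether occupations sit closer to "woman" than to "man"
# in a pretrained word embedding. The GloVe vectors fetched via gensim's
# downloader are an arbitrary choice for illustration (the call downloads data).
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")  # small pretrained GloVe vectors

for occupation in ("nurse", "doctor", "engineer", "teacher"):
    gap = vectors.similarity("woman", occupation) - vectors.similarity("man", occupation)
    # Positive gap: the word is more strongly associated with "woman" than "man".
    print(f"{occupation}: woman-vs-man similarity gap = {gap:+.3f}")
```

Once a skewed association is measured, a team can decide whether to rebalance the training text, adjust the vectors, or constrain how downstream models use them.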
While AI programs learn by finding patterns in data, they need guidance from humans to ensure that the software doesn’t jump to the wrong conclusions. This provides an important opportunity for promoting diversity and inclusion. Microsoft, for example, has set up the Fairness, Accountability, Transparency, and Ethics in AI team, which is responsible for uncovering any biases that have crept into the data used by the company’s AI systems.

Sometimes AI systems need to be refined through more inclusive representation in images. Take, for instance, the fact that commercial facial recognition applications struggle with accuracy when dealing with minorities: The error rate for identifying dark-skinned women is 35%, compared with 0.8% for light-skinned men. The problem stems from relying on freely available data sets (which are rife with photos of white faces) for training the systems. It could be corrected by curating a new training data set with better representation of minorities or by applying heavier weights to the underrepresented data points.11 Another approach — proposed by Microsoft researcher Adam Kalai and his colleagues — is to use different algorithms to analyze different groups. For example, the algorithm for determining which female candidates would be the best salespeople might be different from the algorithm used for assessing males — sort of a digital affirmative action tactic.12 In that scenario, playing a team sport in college might be a higher predictor of success for women than for men going after a particular sales role at a particular company.
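Both remedies are straightforward to prototype. The sketch below illustrates them in the simplest possible form, using a synthetic data set, an arbitrary two-group split, and a plain logistic-regression model; all of these are assumptions made for the example, not the setup used by Kalai and his colleagues. The first remedy upweights examples from the underrepresented group, and the second trains a separate ("decoupled") model per group.

```python
# Illustrative sketch of the two remedies discussed above, using scikit-learn.
# The synthetic data, group labels, and logistic-regression model are assumptions
# made for the example, not the method described by the researchers cited.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: 1,000 candidates, 5 features, with group "B" underrepresented (10%).
X = rng.normal(size=(1000, 5))
group = rng.choice(["A", "B"], size=1000, p=[0.9, 0.1])
y = (X[:, 0] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

# Remedy 1: apply heavier weights to underrepresented data points, so a single
# model cannot simply ignore the smaller group during training.
values, counts = np.unique(group, return_counts=True)
freq = dict(zip(values, counts / len(group)))
weights = np.array([1.0 / freq[g] for g in group])
weighted_model = LogisticRegression(max_iter=1000).fit(X, y, sample_weight=weights)

# Remedy 2 (the decoupled idea): train a separate model per group, so the
# predictive signals are allowed to differ across groups.
per_group_models = {
    g: LogisticRegression(max_iter=1000).fit(X[group == g], y[group == g])
    for g in np.unique(group)
}

print("weighted single model accuracy:", round(weighted_model.score(X, y), 3))
print({g: round(m.score(X[group == g], y[group == g]), 3)
       for g, m in per_group_models.items()})
```

Reweighting keeps one model but forces it to pay attention to the smaller group; decoupled models let the predictive signals differ across groups, as in the team-sport example above.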
Give Bots a Variety of Voices

Organizations and their AI system developers must also think about how their applications are engaging with customers.
[…] that Amazon has begun to address in its recent version of Alexa: The intelligent bot has been reprogrammed to have little patience for harassment, for instance, and now sharply answers sexually explicit […] different versions of their bots to appeal to a diverse customer base. Apple’s Siri is now available in a male or female voice and can speak with a British, Indian, Irish, or Australian accent. It can also speak in a variety of languages, including French, German, Spanish, Russian, and Japanese. Although Siri typically defaults to a female voice, the default is male for the Arabic, French, Dutch, and British English languages.

Just as important as the way they speak, AI bots must also be able to understand all types of voices. But right now, many don’t.15 To train voice recognition algorithms, companies have relied on speech corpora, or databases of audio clips. Marginalized groups in society — low-income, rural, less educated, and […]

ULTIMATELY, WE BELIEVE that AI will help create a more diverse and better world if the humans who work with the technology design, train, and modify those systems properly. This shift requires a commitment from the senior executives setting the direction. Business leaders may claim that diversity and inclusivity are core goals, but they then need to follow through in the people they hire and the products their companies develop. The potential benefits are compelling: access to badly needed talent and the ability to serve a much wider variety of consumers effectively.

REFERENCES

5. “About Us,” girlswhocode.com.
6. F. Dobbin and A. Kalev, “Why Diversity Programs Fail,” Harvard Business Review 94, no. 7/8 (July-August 2016).
7. R. Locascio, “Thousands of Sexist AI Bots Could Be Coming. Here’s How We Can Stop Them.”
10. C. DeBrusk, “The Risk of Machine-Learning Bias (and How to Prevent It),” MIT Sloan Management Review, March 26, 2018.
11. J. Zou and L. Schiebinger, “AI Can Be Sexist and Racist — It’s Time to Make It Fair,” Nature, July 12, 2018.
12. D. Bass and E. Huet, “Researchers Combat Gender and Racial Bias in Artificial Intelligence,” Bloomberg.com, Dec. 4, 2017.
13. B. Lovejoy, “Sexism Rules in Voice Assistant Genders, Show Studies, but Siri Stands Out,” 9to5Mac.com, Feb. 22, 2017.
14. J. Elliot, “Let’s Stop Talking to Sexist Bots: The Future of Voice for Brands,” Fast Company, March 7, 2018.
15. S. Paul, “Voice Is the Next Big Platform, Unless You Have an Accent,” Wired, March 20, 2017.

Reprint 60216.
Copyright © Massachusetts Institute of Technology, 2019. All rights reserved.