
Using Artificial Intelligence to Promote Diversity

WINTER 2019 ISSUE

AI can help us overcome biases instead of perpetuating them, with guidance from the humans who design, train, and refine its systems.

Paul R. Daugherty
H. James Wilson
Rumman Chowdhury

Vol. 60, No. 2, Reprint #60216, https://mitsmr.com/2DQz2XT



Artificial intelligence has had some justifiably bad press recently. Some of the worst stories have been about systems that exhibit racial or gender bias in facial recognition applications or in evaluating people for jobs, loans, or other considerations.1 One program was routinely recommending longer prison sentences for blacks than for whites on the basis of the flawed use of recidivism data.2

But what if instead of perpetuating harmful biases, AI helped us overcome them and make fairer decisions? That could eventually result in a more diverse and inclusive world. What if, for instance, intelligent machines could help organizations recognize all worthy job candidates by avoiding the usual hidden prejudices that derail applicants who don't look or sound like those in power or who don't have the "right" institutions listed on their résumés? (See "New Ways to Gauge Talent and Potential," p. 7.) What if software programs were able to account for the inequities that have limited the access of minorities to mortgages and other loans? In other words, what if our systems were taught to ignore data about race, gender, sexual orientation, and other characteristics that aren't relevant to the decisions at hand?

AI can do all of this — with guidance from the human experts who create, train, and refine its systems. Specifically, the people working with the technology must do a much better job of building inclusion and diversity into AI design by using the right data to train AI systems to be inclusive and thinking about gender roles and diversity when developing bots and other applications that engage with the public.

Design for Inclusion

Software development remains the province of males — only about one-quarter of computer scientists in the United States are women3 — and minority racial groups, including blacks and Hispanics, are underrepresented in tech work, too.4 Groups like Girls Who Code and AI4ALL have been founded to help close those gaps. Girls Who Code has reached almost 90,000 girls from various backgrounds in all 50 states,5 and AI4ALL specifically targets girls in minority communities. Among other activities, AI4ALL sponsors a summer program with visits to the AI departments of universities such as Stanford and Carnegie Mellon so that participants might develop relationships with researchers who could serve as mentors and role models. And fortunately, the AI field has a number of prominent women — including Fei-Fei Li (Stanford), Vivienne Ming (Singularity University), Rana el Kaliouby (Affectiva), and Cynthia Breazeal (MIT) — who could fill such a need.
These relationships don't just open up development opportunities for the mentees — they're also likely to turn the mentors into diversity and inclusion champions, an experience that may affect how they approach algorithm design. Research by sociologists Frank Dobbin of Harvard University and Alexandra Kalev of Tel Aviv University supports this idea: They've found that working with mentees from minority groups actually moves the needle on bias for the managers and professionals doing the mentoring, in a way that forced training does not.6

Other organizations have pursued shorter-term solutions for AI-design teams. LivePerson, a company that develops online messaging, marketing, and analytics products, places its customer service staff (a profession that is 65% female in the United States) alongside its coders (usually male) during the development process to achieve a better balance of perspectives.7 Microsoft has created a framework for assembling "inclusive" design teams, which can be more effective for considering the needs and sensitivities of myriad types of customers, including those with physical disabilities.8 The Diverse Voices project at the University of Washington has a similar goal of developing technology on the basis of the input from multiple stakeholders to better represent the needs of nonmainstream populations.

Some AI-powered tools are designed to mitigate biases in hiring. Intelligent text editors like Textio can rewrite job descriptions to appeal to candidates from groups that aren't well represented. Using Textio, software company Atlassian was able to increase the percentage of females among its new recruits from about 10% to 57%.9 Companies can also use AI technology to help identify biases in their past hiring decisions. Deep neural networks — clusters of algorithms that emulate the human ability to spot patterns in data — can be especially effective in uncovering evidence of hidden preferences. Using this technique, an AI-based service such as Mya can help companies analyze their hiring records and see if they have favored candidates with, for example, light skin.
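As a rough illustration of the kind of pattern-spotting such an audit involves (not a description of Mya's or any vendor's actual method), the sketch below compares selection rates across applicant groups in a hypothetical hiring log and flags any group whose rate falls below four-fifths of the highest group's rate; the column names, data, and threshold are all assumptions made for illustration.

```python
# Hypothetical sketch: audit past hiring decisions for group-level disparities.
# Column names ("group", "hired") and the 0.8 threshold are illustrative assumptions.
import pandas as pd

def audit_selection_rates(df: pd.DataFrame, group_col: str = "group",
                          outcome_col: str = "hired", threshold: float = 0.8):
    """Compare each group's selection rate with the best-performing group's rate."""
    rates = df.groupby(group_col)[outcome_col].mean()   # selection rate per group
    ratios = rates / rates.max()                         # ratio to the highest rate
    flagged = ratios[ratios < threshold]                 # below the four-fifths rule of thumb
    return rates, ratios, flagged

# Example with made-up records
records = pd.DataFrame({
    "group": ["A", "A", "A", "B", "B", "B", "B"],
    "hired": [1, 1, 0, 0, 1, 0, 0],
})
rates, ratios, flagged = audit_selection_rates(records)
print(rates, ratios, flagged, sep="\n")
```

In practice, a flagged group is a prompt for human review of past decisions, not an automatic verdict of bias.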
Train Systems With Better Data

Building AI systems that battle bias is not only a matter of having more diverse and diversity-minded design teams. It also involves training the programs to behave inclusively. Many of the data sets used to train AI systems contain historical artifacts of biases — for example, the word woman is more associated with nurse than with doctor — and if those associations aren't identified and removed, they will be perpetuated and reinforced.10
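One way to see, and begin to correct, such an artifact is to measure how much closer an occupation word sits to one gendered word than to another in the embedding space. The sketch below does this with cosine similarity over a toy dictionary of vectors; the vectors and word list are made up for illustration and do not come from any particular pretrained model.

```python
# Illustrative sketch: measure a gendered association in word embeddings.
# The tiny hand-made vectors below stand in for real pretrained embeddings.
import numpy as np

embeddings = {  # hypothetical 3-d vectors; real models use hundreds of dimensions
    "woman":  np.array([0.9, 0.1, 0.3]),
    "man":    np.array([0.1, 0.9, 0.3]),
    "nurse":  np.array([0.8, 0.2, 0.4]),
    "doctor": np.array([0.3, 0.7, 0.4]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def gender_association(word: str) -> float:
    """Positive values mean the word sits closer to 'woman' than to 'man'."""
    v = embeddings[word]
    return cosine(v, embeddings["woman"]) - cosine(v, embeddings["man"])

for occupation in ("nurse", "doctor"):
    print(occupation, round(gender_association(occupation), 3))

# A debiasing step would then project such occupation vectors onto the subspace
# orthogonal to the gender direction (embeddings["woman"] - embeddings["man"]).
```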
While AI programs learn by finding patterns in data, they need guidance from humans to ensure that the software doesn't jump to the wrong conclusions. This provides an important opportunity for promoting diversity and inclusion. Microsoft, for example, has set up the Fairness, Accountability, Transparency, and Ethics in AI team, which is responsible for uncovering any biases that have crept into the data used by the company's AI systems.

Sometimes AI systems need to be refined through more inclusive representation in images. Take, for instance, the fact that commercial facial recognition applications struggle with accuracy when dealing with minorities: The error rate for identifying dark-skinned women is 35%, compared with 0.8% for light-skinned men. The problem stems from relying on freely available data sets (which are rife with photos of white faces) for training the systems. It could be corrected by curating a new training data set with better representation of minorities or by applying heavier weights to the underrepresented data points.11
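The reweighting idea can be expressed in a few lines: give each training example a weight inversely proportional to how common its demographic group is, so underrepresented groups contribute proportionally more to the training loss. The sketch below, which assumes scikit-learn and uses made-up feature, label, and group arrays, is one minimal way to do that; it is not the specific method used by any system mentioned in the article.

```python
# Minimal sketch: upweight underrepresented groups during training.
# X, y, and groups are placeholders for real features, labels, and group tags.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                       # hypothetical features
y = rng.integers(0, 2, size=1000)                    # hypothetical labels
groups = rng.choice(["majority", "minority"], size=1000, p=[0.9, 0.1])

# Weight each example by the inverse frequency of its group, so the 10% minority
# group carries as much total weight as the 90% majority group.
unique, counts = np.unique(groups, return_counts=True)
freq = dict(zip(unique, counts / counts.sum()))
sample_weight = np.array([1.0 / freq[g] for g in groups])

model = LogisticRegression(max_iter=1000)
model.fit(X, y, sample_weight=sample_weight)         # scikit-learn accepts per-example weights
```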
Another approach — proposed by Microsoft researcher Adam Kalai and his colleagues — is to use different algorithms to analyze different groups. For example, the algorithm for determining which female candidates would be the best salespeople might be different from the algorithm used for assessing males — sort of a digital affirmative action tactic.12 In that scenario, playing a team sport in college might be a higher predictor of success for women than for men going after a particular sales role at a particular company.
Give Bots a Variety of Voices

Organizations and their AI system developers must also think about how their applications are engaging with customers.



To compete in diverse consumer markets, a company needs products and services that can speak to people in ways they prefer. In tech circles, there has been considerable discussion over why, for instance, the voices that answer calls in help centers or that are programmed into personal assistants like Amazon's Alexa are female. Studies show that both men and women tend to have a preference for a female assistant's voice, which they perceive as warm and nurturing. This preference can change depending on the subject matter: Male voices are generally preferred for information about computers, while female voices are preferred for information about relationships.13

But are these female "helpers" perpetuating gender stereotypes? It doesn't help matters that many female bots have subservient, docile voices. That's something that Amazon has begun to address in its recent version of Alexa: The intelligent bot has been reprogrammed to have little patience for harassment, for instance, and now sharply answers sexually explicit questions along the lines of "I'm not going to respond to that" or "I'm not sure what outcome you expected."14

Companies might consider offering different versions of their bots to appeal to a diverse customer base. Apple's Siri is now available in a male or female voice and can speak with a British, Indian, Irish, or Australian accent. It can also speak in a variety of languages, including French, German, Spanish, Russian, and Japanese. Although Siri typically defaults to a female voice, the default is male for Arabic, French, Dutch, and British English.

Just as important as the way they speak, AI bots must also be able to understand all types of voices. But right now, many don't.15 To train voice recognition algorithms, companies have relied on speech corpora, or databases of audio clips. Marginalized groups in society — low-income, rural, less educated, and non-native speakers — tend to be underrepresented in such data sets. Specialized databases can help correct such deficiencies, but they, too, have their limitations. The Fisher speech corpus, for example, includes speech from non-native speakers of English, but the coverage isn't uniform. Although Spanish and Indian accents are included, there are relatively few British accents. Baidu, the Chinese search-engine company, is taking a different approach by trying to improve the algorithms themselves. It is developing a new "deep speech" algorithm that it says will handle different accents and dialects.
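A first step toward spotting such gaps is simply to measure recognition accuracy separately for each accent or demographic group rather than in aggregate. The sketch below assumes the jiwer package and a hypothetical list of transcripts tagged by accent, and computes word error rate (WER) per group; the data structure, accent labels, and example sentences are made up for illustration.

```python
# Illustrative sketch: compare speech-recognition word error rate (WER) by accent group.
# The transcripts below are made-up placeholders; accent tags would come from corpus metadata.
from collections import defaultdict
from jiwer import wer  # assumes the jiwer package is installed (pip install jiwer)

results = [
    # (accent group, reference transcript, recognizer output)
    ("us",      "turn on the kitchen lights",  "turn on the kitchen lights"),
    ("us",      "set a timer for ten minutes", "set a timer for ten minutes"),
    ("british", "turn on the kitchen lights",  "turn on the kitten lights"),
    ("british", "set a timer for ten minutes", "set the time for ten minutes"),
]

by_group = defaultdict(lambda: ([], []))
for group, reference, hypothesis in results:
    by_group[group][0].append(reference)
    by_group[group][1].append(hypothesis)

for group, (refs, hyps) in by_group.items():
    # Large gaps between groups signal that some accents are poorly served.
    print(group, round(wer(refs, hyps), 3))
```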
ULTIMATELY, WE BELIEVE that AI will help create a more diverse and better world if the humans who work with the technology design, train, and modify those systems properly. This shift requires a commitment from the senior executives setting the direction. Business leaders may claim that diversity and inclusivity are core goals, but they then need to follow through in the people they hire and the products their companies develop. The potential benefits are compelling: access to badly needed talent and the ability to serve a much wider variety of consumers effectively.

Paul R. Daugherty is Accenture's chief technology and innovation officer. He tweets @pauldaugh. H. James Wilson is managing director of IT and business research at Accenture Research. He tweets @hjameswilson. Rumman Chowdhury is a data scientist and social scientist, and Accenture's global lead for responsible AI. She tweets @ruchowdh. Comment on this article at http://sloanreview.mit.edu/x/60216.
REFERENCES

1. L. Hardesty, "Study Finds Gender and Skin-Type Bias in Commercial Artificial Intelligence Systems," MIT News Office, Feb. 11, 2018.
2. E.T. Israni, "When an Algorithm Helps Send You to Prison," The New York Times, Oct. 26, 2017.
3. L. Camera, "Women Can Code — as Long as No One Knows They're Women," U.S. News & World Report, Feb. 18, 2016.
4. M. Muro, A. Berube, and J. Whiton, "Black and Hispanic Underrepresentation in Tech: It's Time to Change the Equation," The Brookings Institution, March 28, 2018.
5. "About Us," girlswhocode.com.
6. F. Dobbin and A. Kalev, "Why Diversity Programs Fail," Harvard Business Review 94, no. 7/8 (July-August 2016).
7. R. Locascio, "Thousands of Sexist AI Bots Could Be Coming. Here's How We Can Stop Them," Fortune, May 10, 2018.
8. "Inclusive Design," Microsoft.com.
9. T. Halloran, "How Atlassian Went From 10% Female Technical Graduates to 57% in Two Years," Textio, Dec. 12, 2017.
10. C. DeBrusk, "The Risk of Machine-Learning Bias (and How to Prevent It)," MIT Sloan Management Review, March 26, 2018.
11. J. Zou and L. Schiebinger, "AI Can Be Sexist and Racist — It's Time to Make It Fair," Nature, July 12, 2018.
12. D. Bass and E. Huet, "Researchers Combat Gender and Racial Bias in Artificial Intelligence," Bloomberg.com, Dec. 4, 2017.
13. B. Lovejoy, "Sexism Rules in Voice Assistant Genders, Show Studies, but Siri Stands Out," 9to5Mac.com, Feb. 22, 2017.
14. J. Elliot, "Let's Stop Talking to Sexist Bots: The Future of Voice for Brands," Fast Company, March 7, 2018.
15. S. Paul, "Voice Is the Next Big Platform, Unless You Have an Accent," Wired, March 20, 2017.

Reprint 60216.
Copyright © Massachusetts Institute of Technology, 2019. All rights reserved.



