CASE SUMMARY
Data collection began in 2014, led by Aleksandr Kogan, a researcher at the
University of Cambridge, who built a personality-quiz app called "This Is
Your Digital Life." Although only about 270,000 people installed the app,
Facebook's API at the time also gave it access to those users' friends, so
data from roughly 87 million accounts was ultimately collected. The data
included user actions such as likes, check-ins, and posts, from which
people's personalities, political views, and behaviors could be inferred.
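To make the mechanism concrete, the sketch below shows in rough outline how a quiz app could walk a consenting user's friend list under Facebook's pre-2015 Graph API v1.0. It is a simplified historical illustration, not working code against any live service: the endpoint shapes approximate the long-deprecated v1.0 API, and token handling, pagination, and error handling are omitted.

```python
# Illustrative sketch of friend-data harvesting under Facebook's
# long-deprecated Graph API v1.0 (removed in 2015). Historical
# illustration only; the API no longer exists in this form.
import requests

GRAPH = "https://graph.facebook.com/v1.0"

def harvest(user_token: str) -> dict:
    """Collect the quiz-taker's friends and each friend's page likes."""
    profiles = {}
    # One consenting user exposes their entire friend list...
    friends = requests.get(f"{GRAPH}/me/friends",
                           params={"access_token": user_token}).json()
    for friend in friends.get("data", []):
        # ...and, under the old friends_likes-style permissions, each
        # friend's likes could be read without that friend ever
        # installing or even seeing the app.
        likes = requests.get(f"{GRAPH}/{friend['id']}/likes",
                             params={"access_token": user_token}).json()
        profiles[friend["id"]] = [p["name"] for p in likes.get("data", [])]
    return profiles
```

This asymmetry, one person's consent unlocking thousands of non-consenting profiles, is what turned 270,000 installs into 87 million affected accounts.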
The crisis deepened when Christopher Wylie, a former Cambridge Analytica
employee, disclosed the misuse of the data to The Guardian and The New York
Times. Facebook's response was widely seen as inadequate, and public
protests, legal probes, and U.S. congressional hearings followed. The
Federal Trade Commission ultimately fined Facebook $5 billion for its
handling of users' data.
STAKEHOLDER ANALYSIS
FACEBOOK INC.
As the host platform, Facebook was caught between two conflicting
priorities: protecting user privacy and maximizing profit from user
engagement and advertising. The company's revenue depended largely on data
acquisition and targeted advertising, which made full protection of user
privacy enormously difficult. After the breach, Facebook suffered
regulatory fines, economic losses, and long-term damage to its public
reputation.
CAMBRIDGE ANALYTICA
This political consulting firm sought to use Facebook data to offer its
clients advanced voter-targeting technology. Its core goal was to gain a
competitive edge in political consulting and to profit from its
psychographic-profiling operations.
POLITICAL CLIENTS
Political campaigns, including the 2016 Trump presidential campaign and
groups involved in the Brexit referendum, hired Cambridge Analytica with
the primary aim of winning elections through more effective voter
targeting. Many of these clients, however, claimed to have known nothing
about the unethical methods used to collect the underlying data.
REGULATORY AND GOVERNMENT BODIES
Institutions such as the U.S. Federal Trade Commission, the United
Kingdom's Information Commissioner's Office, and national legislatures were
called on to protect the public interest in privacy, election integrity,
and corporate accountability. These institutions not only had to enforce
existing law but, more significantly, had to develop new ways of protecting
digital information.
GENERAL PUBLIC AND DEMOCRATIC INSTITUTIONS
The broader public had a stake in transparent elections and in democratic
engagement founded on informed choice rather than coercion. The scandal
exposed long-standing concerns about the vulnerability of democratic
processes in an era dominated by online platforms.
ETHICAL ISSUES
Users who took the quiz were never clearly told that their friends' data
would be harvested as well; this opacity nullifies the informed consent
that responsible data stewardship requires. The harm was compounded because
the data went beyond basic profile information: it enabled the construction
of detailed psychological profiles designed to exploit user behavior for
political ends. Individuals were unwittingly exposed to data-driven tactics
that manipulated their political opinions and activism, undermining their
self-determination.
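To see how such profiles can be built at all, the sketch below trains a toy model in the spirit of published research showing that Facebook likes predict personal attributes (Kosinski, Stillwell, and Graepel, 2013). The data here are entirely synthetic and the model deliberately simple; real pipelines operated at far larger scale with richer features.

```python
# Minimal sketch of like-based trait inference. Data are synthetic;
# the point is only that a plain supervised model can recover a
# sensitive attribute from a binary "who liked what" matrix.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 200

# Binary user-by-page matrix (1 = user liked the page).
X = rng.integers(0, 2, size=(n_users, n_pages))

# Synthetic target: a political-leaning label correlated with a handful
# of "signal" pages, standing in for survey-derived ground truth.
signal = X[:, :10].sum(axis=1)
y = (signal + rng.normal(0, 1, n_users) > 5).astype(int)

model = LogisticRegression(max_iter=1000).fit(X[:800], y[:800])
print("held-out accuracy:", model.score(X[800:], y[800:]))
```

Even this toy setup classifies well above chance, which is why seemingly innocuous likes counted as sensitive data in this case.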
Manipulation of this kind calls into question whether democratic processes
can remain fair and transparent. When political messages are tailored to
voters through hidden psychological profiling, the democratic premise that
citizens make free and independent choices is undermined. The deliberate
shaping of opinion through advanced data analytics casts doubt on the
credibility of elections and erodes democratic values.
UTILITARIAN PERSPECTIVE
Utilitarian ethics evaluates actions according to their consequences and
the degree to which they enhance overall welfare. Applied to the
Facebook-Cambridge Analytica case, the analysis is complicated. Some would
argue that targeted political advertising, if it led to better leadership
or positive policy change, might be justified by the good it achieved.
This perspective collapses, however, once wider effects are taken into
account. The damage includes the large-scale invasion of privacy of 87
million users, the erosion of public trust in online platforms, the
undermining of democratic institutions, and the setting of pernicious
precedents for future data abuse. The net harm, at both the individual and
the collective level, far exceeds any conceivable benefit of more efficient
political campaigning. On these grounds, the utilitarian argument
ultimately condemns the actions of Facebook and Cambridge Analytica: the
long-term harm to privacy and democracy outweighs the short-term gains.
DEONTOLOGICAL PERSPECTIVE
From a deontological standpoint, the verdict does not depend on
consequences. Harvesting people's data without their informed consent
treats them merely as means to commercial and political ends, violating the
duty to respect persons. Judged by duties rather than outcomes, the conduct
of both Facebook and Cambridge Analytica was wrong in itself.
PREVENTIVE MEASURES
To prevent or minimize the ethical failures witnessed in this case, a
number of essential steps might have been taken:
BETTER CONSENT MECHANISMS
Consent should have been explicit, granular, and informed: users, and
crucially their friends, should have been told exactly what data an app
would collect and for what purpose before any collection took place.
STRICTER DEVELOPER OVERSIGHT
Facebook ought to have exercised tighter control over app developers,
including regular reviews and audits to ensure apps handled data in line
with platform policy and user expectations.
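Combining the two measures above, the hypothetical gate below releases a person's data only if that person, not merely a friend, granted the requesting app the relevant scope. All names and structures here are illustrative, not Facebook's actual API.

```python
# Hypothetical platform-side consent gate: an app may read a person's
# data only if that person explicitly granted the app that scope.
from dataclasses import dataclass, field

@dataclass
class ConsentRegistry:
    # grants[(user_id, app_id)] = set of scopes the user approved
    grants: dict = field(default_factory=dict)

    def grant(self, user_id: str, app_id: str, scope: str) -> None:
        self.grants.setdefault((user_id, app_id), set()).add(scope)

    def allowed(self, user_id: str, app_id: str, scope: str) -> bool:
        return scope in self.grants.get((user_id, app_id), set())

def fetch_likes(registry: ConsentRegistry, app_id: str,
                subject_id: str, store: dict) -> list:
    """Release likes only with the data subject's own consent."""
    if not registry.allowed(subject_id, app_id, "read_likes"):
        raise PermissionError(f"user {subject_id} never consented to {app_id}")
    return store.get(subject_id, [])

registry = ConsentRegistry()
registry.grant("alice", "quiz_app", "read_likes")
likes = {"alice": ["Some Page"], "bob": ["Another Page"]}
print(fetch_likes(registry, "quiz_app", "alice", likes))  # allowed
# fetch_likes(registry, "quiz_app", "bob", likes)         # PermissionError
```

Under a gate like this, the friend-harvesting pattern described in the case summary fails by construction, because no friend ever granted the app anything.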
LESSONS FOR ICT PROFESSIONALS
The Facebook-Cambridge Analytica case offers crucial lessons for
information and communication technology professionals:
ETHICAL RESPONSIBILITY BEYOND COMPLIANCE
ICT professionals have an ethical responsibility, not only a legal one.
They should consider how their work affects people and society, and they
should refuse to contribute to systems designed to mislead, manipulate, or
exploit users.
ENCOURAGING ACCOUNTABILITY
Transparency in data collection and use is critical. ICT professionals must
help build systems that enable audits and make accountability achievable,
both internally and externally.
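One concrete way to make data access auditable is a tamper-evident, append-only log. The sketch below hash-chains each access record to the previous one so that retroactive edits are detectable; it is a hypothetical design, not a description of any platform's real system.

```python
# Tamper-evident, append-only access log: each entry is hash-chained
# to the previous one, so auditors can detect retroactive edits.
import hashlib, json, time

class AuditLog:
    def __init__(self):
        self.entries = []

    def record(self, app_id: str, user_id: str, scope: str) -> None:
        prev = self.entries[-1]["hash"] if self.entries else "genesis"
        body = {"app": app_id, "user": user_id, "scope": scope,
                "ts": time.time(), "prev": prev}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})

    def verify(self) -> bool:
        prev = "genesis"
        for e in self.entries:
            body = {k: e[k] for k in ("app", "user", "scope", "ts", "prev")}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or digest != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.record("quiz_app", "alice", "read_likes")
log.record("quiz_app", "bob", "read_likes")
print(log.verify())  # True; editing any past entry breaks the chain
```

A log of this shape gives regulators and internal reviewers something concrete to audit, which is precisely what was missing when Kogan's app exported its data.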
DESIGNING FOR SOCIETY
Technical design choices answer to more than clients or corporate
interests. ICT professionals need to pay attention to how their systems
affect users, communities, and democratic processes, recognizing the wider
social responsibility inherent in their work.
This case illustrates that technical work is, by nature, ethical work, and ICT
professionals need to own the broader implications of the systems they design,
deploy, and maintain.