
Facebook - Cambridge Analytica Scandal: Something You Should Be Afraid Of?

by Pranjal Kaushiley, Content Writer


Facebook’s role in the 2016 US election grew more complicated this past weekend when bombshell reports in The New York Times and The Guardian revealed the extent to which London-based data mining and analytics firm Cambridge Analytica misused user data from as many as 50 million Facebook users.


The recently revealed Facebook data “breach” that allowed Cambridge Analytica to get access to data on millions of Facebook users has been greeted as a shocking scandal. Reporters and readers have been surprised to learn that an app could gather personal data on the friends of the people who installed it, that a personality quiz could be converted into a source of political data, that marketing messages could be targeted based on individual psychographic profiles, and that data collected surreptitiously under the guise of academic research could later be used for political purposes.
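To make that collection mechanism concrete, here is a minimal, purely illustrative Python sketch (using the requests library) of the style of Graph API call a quiz app could make once a user logged in and granted it permissions. The token and field list are placeholders, not anyone’s actual code, and the friend-data permissions that made this kind of harvesting possible were removed from the Graph API in 2014.

    # Illustrative only: the kind of profile read a Facebook quiz app could perform
    # after a user granted it permissions. Placeholder token; before Graph API v2.0
    # (2014), friend-related permissions also exposed data about the user's friends.
    import requests

    ACCESS_TOKEN = "USER_ACCESS_TOKEN"  # placeholder obtained via Facebook Login

    response = requests.get(
        "https://graph.facebook.com/me",
        params={
            "fields": "id,name,likes,friends",  # fields the app requests
            "access_token": ACCESS_TOKEN,
        },
    )
    print(response.json())  # data the app can then store and analyse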


The data were collected by Cambridge University psychologist Aleksandr Kogan and handed over to the affiliated behavioural research firm Strategic Communication Laboratories, in violation of Facebook’s terms of service. The actions of the firm, which denies any wrongdoing, have kicked up a massive debate over Facebook’s failure to police its platform and its responsibilities to both user privacy and the institution of democracy itself.


After days of prodding by the media, Mark Zuckerberg offered a mea culpa, apologising for the “breach of trust between Facebook and the people who share their data with us and expect us to protect it”. So once again, in a Snowdenish moment, we are hit by the revelation that Cambridge Analytica conducted behavioural modelling and psychographic profiling (creating personality profiles by gauging motives, interests, attitudes, beliefs, values, etc.) based on the data it collected, to (allegedly) target Americans successfully in the run-up to the 2016 presidential election.
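For readers unfamiliar with the term, here is a toy sketch of what psychographic profiling amounts to at its simplest: scoring a user on one personality trait from their page likes and bucketing them accordingly. The pages, weights and users below are entirely invented for illustration; real models are trained statistically on large datasets rather than hand-weighted.

    # Toy illustration of psychographic profiling (not Cambridge Analytica's
    # actual model): score users on a single "openness" trait from page likes,
    # then bucket them for messaging. All names and weights are made up.

    LIKE_WEIGHTS = {
        "Philosophy Daily": 0.9,
        "Travel the World": 0.6,
        "Local Sports Club": 0.1,
        "Classic Action Movies": -0.2,
    }

    def openness_score(likes):
        """Average the trait weights of liked pages; unknown pages count as neutral."""
        if not likes:
            return 0.0
        return sum(LIKE_WEIGHTS.get(page, 0.0) for page in likes) / len(likes)

    def profile(likes):
        return "high-openness" if openness_score(likes) > 0.4 else "low-openness"

    users = {
        "user_a": ["Philosophy Daily", "Travel the World"],
        "user_b": ["Local Sports Club", "Classic Action Movies"],
    }

    for name, likes in users.items():
        print(name, profile(likes))  # user_a -> high-openness, user_b -> low-openness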


But there is one group of people who are mostly unsurprised by these revelations: the market researchers and digital marketers who have known about (and in many cases, used) these tactics for years. I was one of them (or rather a lackey to one of them).


I recently overheard a conversation at the local tea stall where people were discussing the lack of development in my huge country, blaming it on the fact that all politicians are uneducated morons (pardon my French). If only they knew the half of it. There is a reason why these so-called uneducated politicians are in power, while the common honest man who thought about stepping into politics after watching too many patriotic movies faded into oblivion without anyone noticing.


The reason is that these great and powerful political personas have highly educated experts behind them, people with impressive backgrounds as marketing experts, financial experts, media experts, journalists and others with experience at Fortune 500 MNCs. The common man, meanwhile, does not think twice about the information he leaves online. Think about it: have YOU ever read the permissions you grant a smartphone app before you tap yes? For most people, the answer is most likely no.

This brings us to the question of what the social role of digital intelligence means for the future of democracy. Elections play a vital role in a robust democracy. We seek to safeguard their free and fair nature through regulations that impose restrictions on exit polls or call out parties for unduly influencing voters through the distribution of freebies. Wouldn’t, then, the nudging of voters through intimate knowledge of their behaviour be a threat to this socio-political hygiene we seek to maintain?


Can we allow the replacement of the will of the people by a market democracy in which the masses can be gamed? How should the Election Commission take due cognizance of and address such mass-scale manipulation?


Beyond electoral fairness, the rapidly expanding role of algorithms has severe repercussions for the sanctity of the public sphere. When people know that their online behaviour is being monitored, they carefully moderate how they interact online, a phenomenon referred to as social cooling.


However, if the Facebook-Cambridge Analytica scandal offers us one simple lesson and precaution, it is that we are far too dependent on our smartphones and social media platforms without even knowing what we are putting out there for the world. Facebook collects all kinds of social data about its users: their relationship status, place of work, colleagues, the last time they visited their parents, the songs they like listening to, as well as other kinds of information such as device data, websites visited from the platform, and so on. This may be information shared by the user, or information that their friends share about them on the platform.


That aside, let us not forget that Facebook has bought WhatsApp and Instagram and can tap into data from those platforms as well, on top of the data it buys from data brokers!


Many digital corporations, including Uber, Twitter and Microsoft, sell their data to third parties who build apps and provide services on top of it. With machine learning, the targeting of individuals assumes new dimensions: it becomes possible to do nano-targeting, zooming in precisely on one individual.
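As a hedged illustration of what nano-targeting means in practice, the sketch below selects a different message variant for each individual based on a per-person profile. The voters, profiles and messages are invented; in a real pipeline the profiles would come from an upstream model like the toy one sketched earlier.

    # Hypothetical sketch of nano-targeting: one tailored message per individual,
    # chosen from that individual's psychographic profile. All data is invented.

    MESSAGES = {
        "high-openness": "Imagine a bolder future. Here is the plan.",
        "low-openness": "Protect what you have built. Here is the plan.",
    }

    voter_profiles = {                 # assumed output of an upstream profiling model
        "voter_001": "high-openness",
        "voter_002": "low-openness",
    }

    def pick_message(voter_id):
        trait = voter_profiles.get(voter_id, "low-openness")  # default bucket
        return MESSAGES[trait]

    for voter_id in voter_profiles:
        print(voter_id, "->", pick_message(voter_id))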


The news that Facebook shared user data with a number of organizations, including Cambridge Analytica, seems to reflect the paradox of the surveillance society — that our data is never safe, but share it we must.


