Digital footprints and why you need to stop telling the internet everything about you

Cambridge Analytica and how our personal information is being used against us all

Propaganda has taken various forms over time. During World War II, it was through Hollywood films that Americans were “encouraged” to “support the troops”. The films released in that period carried powerful themes that not only encouraged Americans to fight, but also fuelled the hatred many Americans felt towards the Nazis and the Japanese.

The era of Big Data, as it is often called, has made the media’s influence on how we perceive the world even more pronounced. Organisations can now use browsing history, age, gender, location and other data points to tailor propaganda to specific people. Facebook, for example, curates our feeds so that we get more of the things we “like”. Instagram, Snapchat and Twitter have also reworked their algorithms to shape how content reaches users on their platforms. Considering that these are now some of the most popular news sources, this matters quite a lot. Just a few years ago, Facebook and Twitter were hailed as tools for democracy activism, and while activists can indeed use these tools to shape users’ opinions, the opposite is also true. As Samidh Chakrabarti, Facebook’s Product Manager for Civic Engagement, put it:

“If there’s one fundamental truth about social media’s impact on democracy it’s that it amplifies human intent — both good and bad. At its best, it allows us to express ourselves and take action. At its worst, it allows people to spread misinformation and corrode democracy.”

This is not a quote out of a Black Mirror episode. In the last few days, news of Facebook’s role in propaganda that may have swayed the American elections has become public. Facebook confirmed that in 2015 it learnt that Cambridge Analytica had obtained personal data that users had shared on a prediction app built by Dr Aleksandr Kogan, a psychology researcher at the University of Cambridge. According to reports, the data analytics company worked with Donald Trump’s election team and on the UK’s 2016 Brexit campaign. In an undercover investigation by the UK’s Channel 4 News, company executives were filmed describing how they skew votes in two main ways: spreading false information, and using personal data gathered from what people share on social media to manipulate how they act.

More reports in The Guardian’s Cambridge Analytica Files series revealed that Cambridge Analytica’s parent company, SCL Elections, used similar analytical tools in more than 200 elections around the world, mostly in developing democracies that the scandal’s whistleblower, Christopher Wylie, said he later realised were ill-equipped to defend themselves. Nigeria is a case in point: Cambridge Analytica was also found to have been involved in the country’s 2015 elections. The company was hired by a ‘Nigerian billionaire’ to sway voters in former President Goodluck Jonathan’s favour. A former Cambridge Analytica employee said the firm was supplied with the current President’s medical records, which it used for social media propaganda against him. The Facebook-Cambridge Analytica scandal has since been at the centre of news coverage, raising questions about the limits of privacy on the internet and how easily personal data can be misused in a world where we willingly hand our data to big corporations.

One unnamed executive confessed, “We just put information into the bloodstream of the internet and then watch it grow, give it a little push every now and again over time to watch it take shape”.

The information put “into the bloodstream of the internet” is targeted at a particular group of people and continues to spread within that group, thanks to algorithms built to surface content we are likely to be interested in. Many sites offer tailored content based on data they have gathered themselves or bought from third-party services that collect and correlate personal information from multiple respondents, like Dr Kogan’s application used by Cambridge Analytica. The result is an influx of articles, content and feeds curated from what can be inferred about the personal lives of social media users. Even when the articles are diverse, users are hardly ever exposed to anything outside their frame of thinking, not if ad-targeting algorithms can help it. Even offline, this is how our minds work: we are drawn to people who agree with us, and when these people share content with us, whether in discussion or through shared articles, it is hardly ever anything we won’t be interested in.
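To make that curation loop concrete, here is a deliberately simplified sketch in Python. Nothing in it comes from any platform’s or Cambridge Analytica’s actual systems; the topic labels, scoring rule and feedback step are assumptions made purely to illustrate how engagement-driven ranking narrows what a user sees.

```python
# Toy illustration of interest-based feed ranking.
# Posts are scored by how closely their topics match what the user has already
# engaged with; only the top-scoring posts are shown, and every click feeds
# back into the profile, narrowing it further.

from collections import Counter


def score(post_topics, interest_profile):
    """Overlap between a post's topics and the user's engagement history."""
    return sum(interest_profile.get(topic, 0) for topic in post_topics)


def rank_feed(posts, interest_profile, top_n=2):
    """Return the posts the user is most likely to engage with."""
    return sorted(
        posts,
        key=lambda post: score(post["topics"], interest_profile),
        reverse=True,
    )[:top_n]


def register_click(post, interest_profile):
    """Clicking a post reinforces the topics it carried."""
    interest_profile.update(post["topics"])


if __name__ == "__main__":
    # A user whose history already leans one way.
    profile = Counter({"candidate_a": 5, "immigration": 3})

    posts = [
        {"id": 1, "topics": ["candidate_a", "rally"]},
        {"id": 2, "topics": ["candidate_b", "policy"]},
        {"id": 3, "topics": ["immigration", "candidate_a"]},
        {"id": 4, "topics": ["local_news"]},
    ]

    feed = rank_feed(posts, profile)
    # Only the posts matching the user's existing leanings get shown.
    print("Feed shown:", [post["id"] for post in feed])

    # Each click tilts the profile further towards what was already shown.
    for post in feed:
        register_click(post, profile)
    print("Profile after one session:", profile)
```

Run repeatedly, a loop like this keeps widening the gap between the topics a user already engages with and everything else, which is the narrowing described above.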

This natural tendency to keep like-minded company, both on and off the web, alters our realities more than we realise. The reality-altering one-way mirror we build in our “safe spaces” is referred to as a ‘filter bubble’. Filter bubbles make it easy to assume that the information we take in is exactly how the world works, and that information becomes all we have to reference when we talk to people who think differently from us. At a harmless level, the bubble is why you have probably already seen the meme your friend who isn’t on Twitter is about to send you. But that same filtering, among other algorithm-driven tools, is why it has become so easy for organisations like Cambridge Analytica to reinforce and manipulate people’s opinions.

The fear when people started using social media was that strangers would find us, or that our personal information would be used against us. At the time, anonymity was the selling point of most platforms: “your content will not be shared with any third parties”. But this has since shifted. Your information is no longer your own. Any footprint you leave will either be used to sell you a product you may like, or used in some social experiment you won’t be aware of until it is over, or worse. As the author Nicholas Carr put it in his book, The Shallows, “The faster we surf across the surface of the Web, the more opportunities Google gains to collect information about us and to feed us advertisements […] And it’s in Google’s economic interest to make sure we click as often as possible”.

Breaking the cycle is difficult, mainly because our capacity to resist the manipulation is limited. To avoid leaving unnecessary trails of information, we are told to delete the social media accounts we are not using and keep our settings private where we are active, and to go back to the forums and discussion boards we have posted on and delete those comments. We are also advised to take time off these platforms and away from the bubbles our actions and the algorithms have formed.

While this may not protect us from offline interactions spawned by our individual filter bubbles, the surest remedy is to understand that what we see is not all there is, that our realities are distorted and we need to take off the frames. We have to hop out of our bubbles, perhaps by actively seeking out activities we would not normally try and asking questions that make even us uncomfortable. Either way, reducing our digital footprints is probably the best way to keep our information from becoming a tool for unlawfully toppling legitimate democracies.


“Tomiwa is figuring it out…” Tweet at her @fauxbella

