With the recent Cambridge Analytica revelations, will we finally draw the line on Facebook’s invasions of privacy, especially when it has acted as a platform of mis- and dis-information for so long?

In the last three hours on my Facebook newsfeed, I have witnessed:

  1. Two people boasting about illegal activity on community pages;
  2. A petition to end an animal welfare issue that has been illegal for 9 years;
  3. A misogynistic joke straight out of the 1950s. It wasn’t funny then, either;
  4. An overtly racist meme (adding idiocy to insult, it said burka when it meant niqab);
  5. A mainstream news outlet banging on about how undemocratic the European Parliament is, followed by 600 uninformed comments from foaming-at-the-mouth nationalists; and
  6. A picture of a cat. I think it was a cat. It could have been a rabbit. It was fluffy.

It has not been an unusually busy, or indeed stupid, morning on Facebook, so it’s not difficult to argue that the Facebook experiment is not going well. And that’s before Cambridge Analytica hit the fan.

Briefly, some background: Cambridge Analytica is a UK-based data firm that harvested personal information from around 50 million Facebook users in 2014 via a personality quiz conducted by Global Science Research (GSR). About 270,000 people took the quiz; between them, they had around 50 million friends, all of whose personal data was made available to GSR. GSR’s gathering of the data was legal and in line with Facebook’s terms of service. Lines were only crossed when that information was used by a third party (Cambridge Analytica) to identify the preferences of the quiz-takers and their friends.

Cambridge Analytica then used this data to manipulate users with targeted ads, thereby influencing the US election. In response to this breach, Facebook tweaked a couple of its data protection policies and asked for the information back. Chances are, Cambridge Analytica are only getting around to shredding the last of it this week.

We’ve known about Facebook’s algorithms for a while now. I see them in practice on my own newsfeed. When I’m having a bad day, I like to take it out in the comments sections of right-wing, mainstream news outlets. There you go, I admit it: I’m a troll. But the good kind.

After a decent session of fact-checking on the Daily Mail page, the suggested posts on my feed turn into Bigot Bingo: anti-abortion, men’s rights (not the good kind), Christian groups (the kind you feel Jesus wouldn’t have endorsed), anti-immigration, Holocaust deniers, flat Earth, pro-Brexit, anti-vax, and so on. One time I got something about Queen Elizabeth II being a lizard.

So that’s Facebook’s algorithms in action. We know about them, and, judging by the absence of legal oversight of how social media platforms operate, and the number of times users click “I agree” to terms of service, everyone is okay with them. Facebook is allowed to withhold information about how its algorithms work because, and you’ll like this, it wants to protect the privacy of its users.

Facebook also cites a concern that if it shared how its algorithms work, predatory parties would be able to exploit them for their own ends. This is effectively Facebook calling dibs because, of course, it already exploits its users for its own ends.

So apparently this is where we’re drawing the line on our love affair with Facebook. It wasn’t at the idiocy that has abounded for years. It wasn’t the proliferation of disinformation and misinformation it has facilitated, the normalisation of bigotry, or even the pictures of cats (or rabbits). It wasn’t users being sold to advertisers. It wasn’t our opinions and politics being manipulated by Facebook’s filter bubbles.

The saying goes, there’s no such thing as a free lunch. If you’re not paying, you’re the food. Facebook sells its users to advertisers. That’s where it makes its billions. Its users have been feasted on for a decade now, in an unregulated and unsupported environment. Its users, who have not been given any training in media literacy, have not been paying attention. They have become food for corporations and political think tanks.

The media literacy community is banging its head on its desk right now. This really could have been avoided with just a little bit of regulation and even more media literacy education. Empowered, informed consumers make for responsible producers.

The EU’s General Data Protection Regulation (GDPR) comes into force in May this year. It will provide protections for users against data breaches, with fines of up to €20 million or 4% of global annual turnover for any breach that does take place. Which I’m sure will make a dent in Facebook’s $500 billion market value. The Regulation will also make it easier for users to see what data is being collected about them. But given that Facebook already makes this information available, what is likely to change within its corporate culture? How effectively the Regulation is enforced remains to be seen.

So what will happen in response to this latest scandal? Well, given that Cambridge Analytica seem to have had a hand in Trump’s election, Brexit, and Kenya’s general election, with more revelations surely coming in the next few days, what exactly do we expect from investigations overseen by Trump, and indeed pro-Brexit MPs?

This is unlikely to be the end of Facebook. It’s also unlikely to be the start of effective regulation of social media platforms, despite the outcry. But it is an insight into what is happening behind the curtain.

With the manifest advantages social media provides, and our dependence on it, it would be rash to drop social media entirely. But with any luck, these revelations will prompt users to pay more attention to their social media habits, to be more mindful of the information they share and the information they absorb. It might even encourage them to make themselves media literate.
