Why Are Algorithmic Accountability and Transparency Important in the Digital Age?

The Internet and digital technologies have revolutionised our world over the last 50 years. The way we communicate and interact with others, the way we spend our free time, the way we shop, and the way we work and do business are no longer what they were before.

ICT skills and digital competences have become synonymous with success. The Digital Agenda is a policy priority in many developed countries, and the European Union is no exception. European Commission President Juncker named the Digital Single Market as his second priority, after jobs and growth, with the understanding that a growing digital single market would create hundreds of thousands of new jobs.

Digital technologies, based on the mathematical concept of binary code, have enabled huge swathes of information to be stored on small devices that can be easily preserved and transported. In the paper “The Internet of Everything is a New Economy”, Plamen Nedeltchev, PhD, wrote that more data has been created in recent years than by all humans in the previous 5,000 years: “It is estimated that 4000 exabytes of data were stored in the cloud by the end of 2012, and the contributions from Amazon and Facebook amounted to 20 petabytes of data per day.”

We live, then, in a digitised world; we are part of it, yet we rarely ask ourselves how it is even possible to make sense of this ocean of data. The answer lies in computational processes, sets of instructions, or simply “algorithms”, which make decisions of high complexity based on the data available to us. Algorithms are deployed in almost every sphere of our digital life, yet the inputs and outputs of these complex computational processes are neither visible to nor analysable by an ordinary human mind, which makes the final decision unintelligible.

Search engines and social media platforms also make heavy use of algorithms to provide customised content for users. With the increased usage of mobile devices and a rising number of social media users, thousands of data points per person are available online. Search engine companies and social media platforms use this data to build deeper profiles of individuals in order to serve more personalised and tailored content.
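To make this concrete, here is a minimal, invented sketch of how such personalisation can work: each article is scored against a profile of the user’s inferred interests, and the feed is sorted by score. The profile, topics and weights are all hypothetical; real platforms combine thousands of far richer signals.

```python
# Toy content personalisation: score each article against a user profile
# of inferred topic interests, then rank the feed by score.
# All names, topics and weights are invented for illustration only.

user_profile = {"politics": 0.8, "technology": 0.6, "sport": 0.1}

articles = [
    {"title": "Election results analysed", "topics": {"politics": 1.0}},
    {"title": "New smartphone reviewed", "topics": {"technology": 1.0}},
    {"title": "Cup final report", "topics": {"sport": 1.0}},
]

def score(article, profile):
    """Weighted overlap between an article's topics and the user's interests."""
    return sum(w * profile.get(t, 0.0) for t, w in article["topics"].items())

# Sort the feed so the best-matching articles appear first.
feed = sorted(articles, key=lambda a: score(a, user_profile), reverse=True)
for article in feed:
    print(f"{score(article, user_profile):.2f}  {article['title']}")
```

The more the profile learns about a user, the more tightly the ranking converges on what that user already engages with.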

But why should we care, if the algorithms are doing the most difficult part of the job for us by recycling our personal data and providing us with personalised solutions? Isn’t it good that search engines like Google, Yahoo and Bing, or social media platforms like Facebook, Twitter and Instagram, offer us content we are interested in based on the data we have previously generated?

There is no doubt that personalised search and customised news feeds can be convenient: users receive the content they are most likely to want to see, without spending time filtering news or actively seeking out information. According to the Reuters Institute Digital News Report 2016, a comparative study of news consumption among more than 50,000 people in 26 countries, 36% of respondents were happy with news automatically selected for them based on what they had read before, 30% were happy for news to be selected based on the judgement of editors or journalists, and 22% were happy with news selected based on what their friends had consumed. But what if this selected content and these personalised search results only reinforce our existing opinions, without offering any alternative points of view? What if key information is lost inside an algorithmically driven bubble?

So-called “filter bubbles” or “echo chambers” are recently recognised phenomena, and there is at least some evidence that they influenced the 2016 US presidential election and the result of the UK’s Brexit referendum. The algorithmically curated content of big networking platforms and search engines is not always neutral, and it can influence political decision-making processes.
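The mechanism behind such bubbles is easy to illustrate. Below is a small, invented simulation, not any platform’s actual code: in each round the feed shows a topic in proportion to the user’s current interest weights, and every item shown slightly reinforces its own topic. Even when all interests start out equal, diversity collapses over time.

```python
# Invented simulation of a filter-bubble feedback loop: the feed samples
# topics in proportion to current interest weights, and each item shown
# slightly reinforces its topic, so early random clicks get amplified.

import random

random.seed(42)
interests = {"politics": 1.0, "sport": 1.0, "technology": 1.0}

for _ in range(500):
    topics = list(interests)
    weights = [interests[t] for t in topics]
    shown = random.choices(topics, weights=weights, k=1)[0]
    interests[shown] += 0.1  # engagement reinforces what was shown

# Report what share of the feed each topic would now receive.
total = sum(interests.values())
for topic, weight in sorted(interests.items(), key=lambda kv: -kv[1]):
    print(f"{topic}: {weight / total:.0%} of the feed")
```

This runaway amplification of early, essentially random engagement is exactly the dynamic that “popping your filter bubble” advice tries to counteract.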

There are a number of ongoing concerns and discussions surrounding the role of algorithm-driven digital services in our political life. The World Wide Web Foundation, in its white paper on Algorithmic Accountability, provided evidence that algorithms can be discriminatory and harmful, underlined the need for algorithmic accountability and transparency, and proposed potential areas of action for achieving them.

For regulators and policy makers, however, implementing laws and regulations is not always that simple, as technologies are constantly changing and evolving. Moreover, big data firms and online platforms are tirelessly refining their algorithms, producing ever more sophisticated and precise computational analysis that enables a deeper level of psychological profiling.

Some big stakeholders are stepping forward and opening up about their use of algorithms. Facebook, for instance, revealed an experiment in which it monitored the effect of changing the number of negative and positive posts shown to users. The experiment provided evidence of massive-scale emotional contagion through social networks and drew severe criticism from the general public. In spite of that criticism, the revelation raised awareness about the role of algorithms and brought to light the need for greater transparency and accountability from big internet stakeholders over their use of algorithms.

There are a number of civil society organisations and research institutes advocating for algorithmic transparency. EAVI recently participated in an event hosted at the European Parliament where this topic was discussed (an article about the event can be found here), so there is hope that practical measures will emerge to make internet companies more transparent about their use of algorithms. However, we cannot expect social profiling, and the personalised content that results from it, to disappear in the future. It is more likely that the algorithms that customise content will in fact become more sophisticated.

So what can we, as individuals, do to engage with social platforms and search engines in a more constructive way, to get the fullest and most relevant information and news?

Here are some simple steps everyone can incorporate into their digital interactions:

Search engines:

  • Surf in anonymous (private) browsing mode, or use alternative search engines that offer unpersonalised search, such as DuckDuckGo or Ixquick. However, it is uncertain how long newer search engines will stay “unbiased”.
  • Explore Google’s search settings: under Personal results, choose not to use personal results, or append pws=0 to the end of a search URL (see the sketch after this list).
  • Use the uBlock Origin plugin.
  • The EFF (Electronic Frontier Foundation) offers a number of tools to defend your privacy and security online, for example:
    ◦ HTTPS Everywhere, a browser extension that encrypts your communications with many major websites.
    ◦ Privacy Badger, a browser add-on that stops advertisers and other third-party trackers from secretly tracking where you go and what pages you look at on the web.
  • Disable cookies.
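For the pws=0 tip above, here is a minimal sketch of how the parameter can be appended to a search URL. The pws parameter has historically been used to request non-personalised Google results, but there is no guarantee it is still honoured, so treat this as illustrative.

```python
# Build a Google search URL with pws=0 appended, a parameter historically
# used to request non-personalised results. Whether Google still honours
# it is not guaranteed; this sketch is illustrative only.

from urllib.parse import urlencode

def unpersonalised_search_url(query: str) -> str:
    return "https://www.google.com/search?" + urlencode({"q": query, "pws": "0"})

print(unpersonalised_search_url("algorithmic accountability"))
# -> https://www.google.com/search?q=algorithmic+accountability&pws=0
```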

News feed on social media platforms:

  • Subscribe to various news outlets, especially those whose views differ from your own.
  • Explore your news feed and notification settings; on Facebook, for example, select “Most Recent” in your News Feed preferences instead of “Top Stories”.
  • Occasionally like articles you disagree with, so the algorithm can readjust the stream of news it proposes.
  • Watch the BBC video “Election 2017: How can you pop your filter bubble?” here.

 

With thanks to Natallia Bialiayeva.
