After more than two years of discussions, beginning in 2016, the updated AVMSD was adopted in November 2018. Now it is time for Member States to transpose the Directive into their national legal frameworks and institutions.

The hot topic of the moment, and probably one of the most controversial aspects of the AVMSD, is the inclusion of video-sharing platforms (VSPs) under this directive. So, let's start with an ambiguous question: do all platforms hosting videos fall under this directive, or should it apply only to platforms whose main function is video hosting?

A few years ago, the landscape was much simpler, and so there was little difficulty in defining what should be covered by this law. But take into account the huge developments in technology and society, and the picture becomes murkier. It is no longer a straightforward matter of having linear media platforms follow these rules; we now have non-linear media platforms as well, ranging from Facebook to YouTube to Netflix. These are media that have functioned outside of the AVMSD, following their own terms and conditions and their own forms of regulation.

With the newly updated AVMSD, these platforms must now comply with certain of its provisions. This includes adopting a new role as enforcers of the law in areas including the protection of minors, hate speech and terrorism.

The question is how to reconcile their role in upholding the law under the AVMSD with their own terms and conditions.

Within this new role, VSPs will be important in setting boundaries on topics such as hate speech. This is a grey area once you add the fundamental right to freedom of expression, which highlights just how momentous the task of regulating content is. How can one determine the boundaries of political speech, or of the expression of an opinion, be it conventional or not?

Before their inclusion in the AVMSD, VSPs already had a responsibility to remove certain content under the EU's eCommerce Directive, but now they have additional work.

Within the new Directive, VSPs are given more responsibility to protect users and provide safeguards, including content regulation, something which, some could argue, is not the top priority of private companies.

What we should conclude here is that the new AVMSD rules are a good thing, as their core purpose is to protect the user. The difficulty lies in tackling certain types of content while protecting freedom of speech.

What we should also recognise is that VSPs and new platforms have become central players and facilitating platforms for broadcasting and video content, which unfortunately also means they are facilitators of hate speech and all the other negatives.

Should the argument therefore be that, regardless of whether they consider themselves subject to the same rules as linear media, they are still facilitating these tools and so have to abide by the rules?

What is clear is that certain social networking platforms are fighting this. Their argument? That videos are not their main function or purpose, and that they should therefore not have to follow the same rules. They existed before videos were integrated, and they would exist without them. However, a key point is that they are competing for the same audiences and advertising revenues.

We also need to look at this from the perspective of the user. Essentially, the point of the new rules is the protection of the user, and so the user must be put at the centre of the argument.

To put it simply, the user is consuming content on these platforms and so must be protected, regardless of whether video is the main function of the platform or not.

So, while we can easily talk about the need for protection, what does this protection involve? What are the concrete solutions and visible measures?

This can include ensuring that users can flag and report potentially harmful content, implementing age limits and control systems, making the operations and functions of the platform transparent, and providing a clear point of contact where users can raise questions or complaints.
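To make this concrete, here is a purely illustrative sketch, in Python, of what the flag-and-report mechanism described above might look like in its simplest form. The Directive prescribes no particular implementation, and every name and category below is hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class FlagReason(Enum):
    """Hypothetical report categories mirroring the AVMSD's areas of concern."""
    HARM_TO_MINORS = "harm_to_minors"
    HATE_SPEECH = "hate_speech"
    TERRORIST_CONTENT = "terrorist_content"
    OTHER = "other"


@dataclass
class ContentFlag:
    """A single user report against a piece of hosted video content."""
    video_id: str
    reporter_id: str
    reason: FlagReason
    comment: str = ""
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class FlagQueue:
    """Collects reports so platform moderators can review them."""
    flags: list[ContentFlag] = field(default_factory=list)

    def submit(self, flag: ContentFlag) -> str:
        """Accept a report and return a reference the user can follow up on."""
        self.flags.append(flag)
        return f"report-{len(self.flags)}"  # simple ticket id so the user can track the complaint


# Example: a user flags a video for hate speech and receives a reference number.
queue = FlagQueue()
ticket = queue.submit(ContentFlag("vid-123", "user-42", FlagReason.HATE_SPEECH, "slur in title"))
print(ticket)  # -> report-1
```

The returned reference is the point worth noticing: it speaks to the transparency and complaint-handling requirements, since the user can follow up on a report rather than sending it into a void.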

These are crucial technical functionalities of platforms that enable users to protect themselves. But there is, I believe, an even more crucial component.

Media literacy education.

Put simply, we must educate in order to equip individuals with the skills to recognise hate speech, fake news, propaganda, biased opinions and so forth. Only with this education can we expect to instil a healthy, safe use of the internet and all the platforms that come with it.

We need to put power into the hands of the individual, rather than only expect platforms to regulate their content. Otherwise we are just putting more power into the hands of the already powerful platforms.

It is in the interest of the European Union to focus on this type of education, especially if it is worried about the influence and growth of these platforms. But how do we implement this education, and how do we get platforms interested too? Is it in their interest to educate their users? Maybe not, but it is now their legal responsibility as well as their moral one.

So, while we are left with an important question regarding how to implement media literacy education, I will remind you that in the updated AVMSD, media literacy is mentioned in three different sections, establishing it as an important component for the future. What needs to be decided is how to really take this forward.

So, as I began, in conclusion we should also end with an ambiguous question. While we can recognise that the new AVMSD contains important, relevant changes, does one size fit all when it comes to new and old media?
