
Liability of online platforms
You can see on this slide the basis on which online platforms have a safe harbor. It is an old piece of law in cyberspace terms: Article 14 of the E-Commerce Directive 2000. When it was first adopted, the concern was the liability of internet service providers. However, you can see how the article applies to online platforms: they store information, in the words of the directive, on behalf of users, and therefore will not be liable for content on their platform unless (a) they knew it was illegal, or (b) once informed that it was illegal, they failed to move quickly, in the words of the article "expeditiously", to remove the content.
Online platforms are not under any obligation to monitor the content on their platforms. In fact, Article 15 of the E-Commerce Directive enshrines this in law: there is no duty to monitor. The effect of the Article 14 safe harbor is that online platforms operate a notice-and-takedown system. Once they are notified that content is illegal, they will remove it, but they will not proactively remove illegal or unlawful content; the onus is on users to inform them. However, it is more complex than that. When you interact with an online platform, you have a contractual relationship with it, and this relationship is defined by the platform's own terms of service.
For social media platforms, these will often include a set of community guidelines, which set out the rules you must abide by when you post content. You'll see on the slide the start of Twitter's community guidelines, which state: "Twitter's purpose is to serve the public conversation. Violence, harassment and other similar types of behavior discourage people from expressing themselves, and ultimately diminish the value of global public conversation. Our rules are to ensure all people can participate in the public conversation freely and safely." The point to flag here is that, while we may all agree with the above sentiment, note that it says our rules: Twitter makes these rules.
How it enforces these rules is up to Twitter, and it can change them. Social media platforms are therefore very much self-regulating: they regulate through their own contractual rules. This is a table produced by a recent House of Lords review of the issues relating to online platforms. It is very helpful as it demonstrates that much of the conduct on social media platforms is not subject to civil or criminal laws. Much of the activity that goes on on social media platforms falls within the two columns on the slide headed "harmful" and "anti-social".
But ultimately, these are subject only to the terms of service of the platform, and it is then at the discretion of the provider, whether Twitter, Facebook or whichever provider, as to whether it takes action. Take a moment to look at the examples in these three columns. The government is looking to address these issues by imposing a duty of care on certain online platforms. We look at these issues in more detail on the University of Law's MSc in Legal Tech. Even with unlawful content, such as copyright infringement, the effect of the safe harbor is that online platforms only have to remove content when notified about it. They do not have to monitor content and proactively remove offending digital content.
Rights holders thought this was unfair, and after much lobbying and compromise, the Directive on Copyright in the Digital Single Market was adopted. Its most controversial provision is Article 17, which reverses the notice-and-takedown position by making certain online platforms, called online content sharing service providers (OCSSPs), liable for copyright infringement, as an act of communication to the public, when they host unauthorized content. The effect of Article 17 is that platforms like YouTube, which is owned by Google's parent company Alphabet, will need to enter into licensing deals in advance with major rights holders, such as the BBC.
Where an OCSSP can't reach agreement with the rights holder, it will still have protection, a safe harbor, where it can show that it used best efforts to obtain authorization from the rights holder; that it used best efforts to ensure the unavailability of unapproved content, for example by using a content filter; and that, once notified, it acted expeditiously to take down and keep down infringing content. The final text of Article 17 contains many compromises, including a number of exceptions to the obligations on OCSSPs. In particular, very small platforms are outside the scope of the obligation; however, this will not affect the platforms most people use every day.
There is even a specific exception which means Wikipedia is not caught by Article 17. People were also very concerned that memes would be caught by Article 17, as memes potentially infringe copyright when they imitate or vary underlying copyright works. Article 17 provides an exception where use is for the purpose of parody, and it will be up to the courts to interpret this exception. Some people think Article 17 will not achieve its purpose. They feel that it will entrench the power of larger platforms such as YouTube, which have the financial power to do licensing deals with rights holders and to develop sophisticated content-filtering schemes. The other controversial new section of the directive is Article 15.
Article 15 gives press publishers (excluding scientific and academic journals) protection against online sharing of their publications by service providers, such as news aggregators, for two years. There are exceptions to this right, which mean that services are not prevented from hyperlinking to press publications or from sharing what are called "very short extracts". For anything more than that, though, service providers will need authorization from the press publisher, and may have to pay it a fee. Article 15 is therefore sometimes referred to as the "link tax", as it will require platforms that aggregate news as a service to pay a fee to publishers. Some people think this will result in these services being closed down as a consequence.
You will now watch a short video on the new directive, followed by a quiz to test your understanding of the material we have just covered.
This article is from the free online course The Laws of Digital Data, Content and Artificial Intelligence (AI), created by FutureLearn.