
What service provider responsibility means in practice

An explanation of each component of the principle
© RMIT 2023

Principle 1: Service provider responsibility

The first Safety by Design principle is based on the premise that user safety is a shared responsibility, and that the burden of safety should never fall solely on the user. The service provider must take every reasonable step to ensure that online harms are understood, assessed and addressed in the design and provision of online services, products and platforms.

The service provider can take various preventative steps to guard against their service being used to facilitate, inflame or encourage inappropriate and illegal behaviours, content and activity. This involves the service provider assessing the potential risks of online interactions upfront and taking active steps to engineer out potential misuse – at all stages of the user journey – to reduce people’s exposure to harms.

To help make sure known and anticipated harms are evaluated in the design and provision of an online service, product or platform, here are some examples of steps eSafety recommends:

Nominate

Nominate individuals or teams and make them accountable for user safety policy creation, evaluation, implementation and operations.

Good practice note: All major online services have dedicated ‘trust and safety’ teams. Most services have a dedicated Head of Safety, and Microsoft also has a dedicated Chief Digital Safety Officer.

Guidelines

Develop community guidelines, terms of service and moderation procedures that are fairly and consistently implemented.

Good practice note: French social media service Yubo uses AI and machine learning on its livestream function to flag to users when their behaviour breaches community standards. Yubo provides warnings and an opportunity for the user to rectify their actions, with cautions and penalties if the user does not change their behaviour.

Infrastructure

Put in place infrastructure that supports internal and external triaging, clear escalation pathways and reporting on all user safety concerns, alongside readily accessible mechanisms for users to flag and report concerns and violations at the point they occur.

Good practice note: YouTube has developed a ‘trusted flagger program’ that gives individuals, government agencies and non-government organisations with a strong record of identifying content that violates community standards an expedited escalation pathway for review by its content moderators.

Protocols

Ensure there are clear internal protocols for engaging with law enforcement agencies, support services and illegal content hotlines.

Good practice note: Most major service providers have a safety centre or hub which houses information about how the platform keeps its users safe. This information usually covers what privacy, security and safety features are available to users and, in some cases, links to resources and safety partners.

Processes

Put processes in place to detect, surface, flag and remove illegal and harmful behaviour, contact, activity and content with the aim of preventing harms before they occur.

Good practice note: Chip manufacturers Nvidia and Intel have developed AI chips that filter livestreamed content to expedite the identification of bad actors, instances of self-harm, suicide and other acts of violence.
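The detect, flag and act loop this step describes can be sketched in a few lines. This is an illustrative sketch only: the blocklist, class and function names are hypothetical, and real moderation systems pair trained classifiers with human review and appeal processes rather than keyword matching.

```python
# Illustrative detect -> flag -> act moderation loop.
# BLOCKLIST and all names here are hypothetical placeholders; real systems
# combine trained classifiers with human review and appeal processes.
from dataclasses import dataclass, field

BLOCKLIST = {"threat", "abuse"}  # hypothetical example terms

@dataclass
class Post:
    author: str
    text: str
    flags: list = field(default_factory=list)

def detect(post: Post) -> Post:
    """Record which blocklisted terms, if any, appear in the post."""
    post.flags.extend(term for term in BLOCKLIST if term in post.text.lower())
    return post

def act(post: Post) -> str:
    """Remove flagged posts for review; publish the rest."""
    return "remove_and_review" if post.flags else "publish"

print(act(detect(Post("demo_user", "Hello, world"))))  # -> publish
```

The point of the sketch is the aim stated above: detection runs before publication, so harmful content can be intercepted rather than cleaned up after the fact.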
   

Risk management

Prepare documented risk management and impact assessments to assess and remediate any potential online harms that could be enabled or facilitated by the product, service or platform.

Good practice note: Facebook has developed an internal privacy, safety and security review, conducted by functional experts, that every product must pass. LEGO has developed a risk assessment tool against which every new feature and architectural component is weighed.

Social contracts

Implement social contracts at the point of registration. These outline the duties and responsibilities of the service, user and third parties for the safety of all users.

Good practice note: LEGO has introduced a ‘Safety Pledge’ at registration – users are taken through an induction process where they learn the rules of the site via tutorials and tips. Users are required to sign the pledge before they are allowed to use the site, after which they earn badges as they progress.
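A registration gate of this kind reduces to a simple state check: no signed pledge, no access. A minimal sketch, in which the class and method names are hypothetical rather than taken from any real platform:

```python
# Illustrative sketch of gating site access on a signed safety pledge,
# in the spirit of the registration flow described above.
# All class, method and badge names are hypothetical.
class Account:
    def __init__(self, username: str):
        self.username = username
        self.pledge_accepted = False
        self.badges: list[str] = []

    def accept_pledge(self) -> None:
        # Signing the pledge is what unlocks the site.
        self.pledge_accepted = True

    def can_use_site(self) -> bool:
        return self.pledge_accepted

    def award_badge(self, badge: str) -> None:
        # Progress badges are only earned once the pledge is signed.
        if self.pledge_accepted:
            self.badges.append(badge)

user = Account("new_user")
assert not user.can_use_site()   # blocked before signing
user.accept_pledge()
assert user.can_use_site()       # access granted after signing
```

The design choice the sketch captures is that the social contract is enforced by the system itself, not left as a policy document the user may never read.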

Security, safety and privacy design

Balance security by design, privacy by design and user safety considerations when securing the ongoing confidentiality, integrity and availability of personal data and information.

Good practice note: Quizlet, a mobile and web-based study application, houses data protection, privacy and user programs under a unified Trust & Safety department. Product and business initiatives undergo review with this department against a set of trust and safety principles.

(Source: Adapted from eSafety Commissioner 2019, Safety by Design Overview, May 2019, pp. 20-27)

This article is from the free online course Safety By Design, created by FutureLearn.
