
What User Empowerment and Autonomy Means in Practice

Principle 2: User empowerment and autonomy

The dignity of users is of central importance. Products, services and platforms should align with the best interests of users.

This principle focuses on the dignity of users and the need to design features and functionality that preserve fundamental consumer and human rights. It means understanding that inappropriate content, activity and abuse can be intersectional, affecting a user in multiple ways and for multiple reasons, and that technology can deepen societal inequalities. To combat this, services and platforms need to give users tools to manage their own safety and privacy, and to engage in meaningful consultation with diverse and at-risk groups to ensure their features and functions are accessible to all.

To help ensure that features, functionality and an inclusive design approach give users a level of empowerment, control and autonomy that supports safe online interactions, eSafety recommends steps such as the following:

Technical measures and tools

Provide technical measures and tools that adequately allow users to manage their own safety, and that are set to the most secure privacy and safety levels by default. Include blocking, muting and reporting functions to empower users in their online interactions.

Good Practice Note: Tumblr has a safe mode filter which is turned on by default for all users. This ensures that sensitive content does not appear on a user’s dashboard or in their search results. Users are encouraged to flag content as ‘sensitive’ or ‘explicit’ at the point of upload.
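As a rough illustration of the safe-by-default idea described above, the sketch below models per-user settings in which every field starts at its most protective value, alongside simple blocking and muting tools. All class, field and value names here are hypothetical, not any platform's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class SafetySettings:
    """Per-user safety settings; every field defaults to its most protective value."""
    safe_mode: bool = True                 # sensitive content hidden by default
    profile_visibility: str = "private"    # not "public" until the user opts in
    direct_messages: str = "contacts_only"
    blocked_users: set = field(default_factory=set)
    muted_users: set = field(default_factory=set)

    def block(self, user_id: str) -> None:
        """Stop all contact from the given user."""
        self.blocked_users.add(user_id)

    def mute(self, user_id: str) -> None:
        """Hide the given user's content without notifying them."""
        self.muted_users.add(user_id)

# A new account starts at the most secure levels without any action by the user.
settings = SafetySettings()
assert settings.safe_mode and settings.profile_visibility == "private"
```

The key design choice is that the defaults live in the type itself, so a newly created account cannot accidentally start in a less protective state.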

Protocols and consequences

Establish clear protocols and consequences for service violations that serve as meaningful deterrents and reflect the values and expectations of the users.

Good Practice Note: YouTube applies a three-strikes and email notification system for violations. To better educate users and raise awareness of its community guidelines, YouTube gives first-time offenders information to help them understand why their post violates the standards before it applies a strike.
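The warn-then-strike pattern described above can be sketched as a simple state machine. This is a hypothetical illustration, not YouTube's actual implementation: a first violation triggers an educational warning only, and later violations accrue strikes up to suspension.

```python
class StrikePolicy:
    """Sketch of a three-strikes policy: first-time offenders receive an
    educational warning; subsequent violations accrue strikes up to suspension."""

    def __init__(self, max_strikes: int = 3):
        self.max_strikes = max_strikes
        self.warned: set = set()    # users who have had their first warning
        self.strikes: dict = {}     # user_id -> current strike count

    def record_violation(self, user_id: str) -> str:
        # Educate before punishing: the first violation yields a warning only.
        if user_id not in self.warned:
            self.warned.add(user_id)
            return "warning: review the community guidelines"
        self.strikes[user_id] = self.strikes.get(user_id, 0) + 1
        if self.strikes[user_id] >= self.max_strikes:
            return "account suspended"
        return f"strike {self.strikes[user_id]} of {self.max_strikes}"
```

Keeping the warning step separate from the strike count is what makes the deterrent "meaningful" in the sense the principle describes: the user is told why the content violated the rules before any penalty escalates.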

Risk and harm mitigation

Leverage the use of technical features to mitigate online risks and harms, which can be flagged to users at relevant points in the service, and which prompt and optimise safer interactions.

Good Practice Note: Roblox provides different safety settings and experiences depending on the age of the account holder.

Support and feedback

Provide built-in support functions and feedback loops that inform users of the status of their reports and the outcomes, and offer an opportunity to appeal.

Good Practice Note: YouTube has a reporting history dashboard that shows users whether their reports are active and whether the reported content has been removed or restricted.
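A feedback loop like this can be modelled as a report record with a status the reporter can always check. The sketch below is a hypothetical illustration (the status names follow the active/removed/restricted states mentioned above), not YouTube's actual dashboard code.

```python
from enum import Enum

class ReportStatus(Enum):
    ACTIVE = "under review"
    REMOVED = "content removed"
    RESTRICTED = "content restricted"

class ReportHistory:
    """Sketch of a reporting dashboard: each report keeps a visible status
    so the reporter can follow its outcome and decide whether to appeal."""

    def __init__(self):
        self._reports: dict = {}  # report_id -> ReportStatus

    def file(self, report_id: str) -> None:
        """A newly filed report starts as active (under review)."""
        self._reports[report_id] = ReportStatus.ACTIVE

    def resolve(self, report_id: str, status: ReportStatus) -> None:
        """Record the moderation outcome for the report."""
        self._reports[report_id] = status

    def status(self, report_id: str) -> str:
        """Return the human-readable status shown to the reporter."""
        return self._reports[report_id].value
```

The point of the design is transparency: the status never disappears after resolution, so the user can see what outcome was reached and has the information needed to appeal.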


Evaluation of design features

Evaluate all design and function features to ensure that risk factors for all users – particularly for those with distinct characteristics and capabilities – have been mitigated before products or features are released to the public.

Good Practice Note: Microsoft has published an inclusive design toolkit that outlines three broad principles: recognise exclusion; learn from diversity; and solve for one, extend to many.

(Source: Adapted from eSafety Commissioner 2019, Safety by Design Overview, May 2019, pp. 20-27)

© RMIT 2023
This article is from the free online course Safety By Design, created by FutureLearn.
