Philosophy of Technology: Can We Design Morality?

Moralising technology is not self-evident. In this article, Verbeek discusses the desirability of ethical design and the different approaches to it.

In the video, we introduced you to the idea that we can start “moralising things”. In this article, you can read more about the different approaches to doing so.

Nudging

In their book Nudge (Thaler and Sunstein 2008), Richard Thaler and Cass Sunstein make a case for designing our material surroundings in such a way that they influence us in a positive sense without taking control away from us. A nudge is a tiny push, a small stimulus that guides people’s behaviour in a certain direction. Our material world is full of such nudges, Thaler and Sunstein claim, varying from photocopying machines with a default setting of single-sided copies to urinals with a built-in image of a fly that entices men to aim at it. Thaler and Sunstein propose that we design these nudges in an optimal manner, so that we can guide our own behaviour in directions that are widely considered beneficial.

The central idea in their approach is that human decisions are to a considerable extent organised and pre-structured by our material surroundings. When we make choices, two systems are at work in our brains, which Thaler and Sunstein call an ‘automatic system’ and a ‘reflective system’. Most of our decisions are made automatically, without explicit reflection. But for some decisions, we really have to stop and think: they require reflection and critical distance.

To a significant degree, our automatic system is organised by our material surroundings. To use one of Thaler and Sunstein’s examples: when fried snacks are within reaching distance in a company’s canteen and the salads are hidden behind refrigerator doors, it is very likely that many people will choose the less healthy food. The layout of canteens gives nudges in a certain direction. If we want to take responsibility for such situations, we must learn to think critically about nudges. If we can design them better, we in fact design our automatic system in a more desirable way. Thaler and Sunstein, therefore, call such design activities ‘choice architecture’: the design of choice situations. We need to rewrite the default settings of our material world.

But these activities of choice architecture should never close down the reflective system. For Thaler and Sunstein, it is extremely important that nudges always remain open to reflection and discussion, and can move from the automatic to the reflective system. This is why they label their approach ‘libertarian paternalism’. It is paternalistic because it explicitly exposes people to nudges in a direction that is considered desirable. But it is libertarian as well, because these nudges can always be ignored or undone, in all freedom. Just as everyone is currently free to use both sides of the paper when copying, even though the standard setting is one side, no one should be forced to eat a salad and pass up the croquettes in a ‘re-nudged’ cafeteria.

Thaler and Sunstein’s way out of the dilemma between influencing behaviour and respecting autonomy, therefore, is the “opt-out”: by drawing on our reflective system, we should always be able to move away from the nudges. Every act of paternalism is compensated for by the explicit possibility of taking a libertarian stance toward it.
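
To make this structure concrete in software terms, the sketch below models a copier whose standard setting acts as the nudge while the user remains free to opt out. It is only an illustration in Python; the Copier class, its parameters, and the double-sided default are hypothetical and are not taken from Thaler and Sunstein or Verbeek.

```python
# A minimal sketch of 'libertarian paternalism' as choice architecture.
# The paternalistic part: the choice architect pre-selects a beneficial default.
# The libertarian part: every call can override that default without effort.
# All names and values here are illustrative.

from dataclasses import dataclass
from typing import Optional


@dataclass
class CopyJob:
    pages: int
    double_sided: bool


class Copier:
    """A copier whose default setting is the nudge."""

    def __init__(self, default_double_sided: bool = True):
        # The 're-nudged' default chosen by the choice architect.
        self.default_double_sided = default_double_sided

    def copy(self, pages: int, double_sided: Optional[bool] = None) -> CopyJob:
        # The opt-out: a user who stops to reflect can always deviate.
        if double_sided is None:
            double_sided = self.default_double_sided
        return CopyJob(pages=pages, double_sided=double_sided)


copier = Copier()
print(copier.copy(10))                       # follows the nudge (double-sided)
print(copier.copy(10, double_sided=False))   # freely opts out, no coercion
```

The default does the nudging; the optional parameter keeps the reflective route open.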

Value Sensitive Design

Another approach to ‘moralising technology’ is the method of Value-Sensitive Design (VSD). This design method, which was developed by Batya Friedman and others (Friedman et al. 2002), aims to account explicitly for human values throughout the design process. In the VSD approach, the primary focus of design activities is not technological functionality, but the moral values that need to be supported by the technology-in-design. The method has been used, for instance, to design a web browser that requires informed consent before saving a ‘cookie’ (a small file which contains personal information about the person surfing the internet). Starting from the values of privacy and autonomy, this design effort resulted in a web browser that is not only functional for surfing the internet, but also respects some important values that are often threatened by other web browsers.

Value-Sensitive Design uses an iterative methodology that integrates conceptual, empirical, and technical investigations. At the conceptual level, the values that are to be implemented are carefully analysed in all their facets. For the web browser mentioned above, the researchers analysed what the elements of “informed” and “consent” entail, such as the adequate “disclosure” of the information needed, “comprehension” of this information, and “voluntariness” and “competence” in people’s “agreement” with what they consent to, implying the “clear opportunity to accept or decline”, the actual freedom to do so, and the “capabilities needed to give informed consent” (Friedman et al. 2002, 4).
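
To illustrate how such a conceptual analysis can feed into design, the sketch below treats the components of informed consent as a simple checklist that must be satisfied before a cookie is stored. This is a hypothetical Python illustration, not the implementation described by Friedman et al.; the class and function names are invented for this example.

```python
# A sketch of the informed-consent analysis as a data structure: the
# components follow Friedman et al.'s conceptual breakdown, but the
# consent gate around cookie storage is illustrative only.

from dataclasses import dataclass


@dataclass
class InformedConsent:
    disclosure: bool      # the relevant information was adequately disclosed
    comprehension: bool   # the person understood that information
    voluntariness: bool   # agreement was given freely, without coercion
    competence: bool      # the person was capable of giving consent
    agreement: bool       # a clear opportunity to accept or decline was used to accept

    def is_valid(self) -> bool:
        return all([self.disclosure, self.comprehension,
                    self.voluntariness, self.competence, self.agreement])


def store_cookie(cookie_jar: dict, name: str, value: str,
                 consent: InformedConsent) -> bool:
    """Only save the cookie when every component of informed consent holds."""
    if not consent.is_valid():
        return False
    cookie_jar[name] = value
    return True


jar = {}
consent = InformedConsent(disclosure=True, comprehension=True,
                          voluntariness=True, competence=True, agreement=False)
print(store_cookie(jar, "session", "abc123", consent))  # False: no agreement yet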

The empirical level, subsequently, concerns “the human context in which the technical artifact is situated” (ibid., 3). Investigations here concern, for instance, how stakeholders apprehend different values, how they prioritise competing values, and how much impact these values have on their own behaviour. In the case of the privacy-sensitive web browser, this step was conducted only after work done at the third, technological, level. At the technological level, investigations concern the specific “value suitabilities that follow from properties of the technology” (ibid., 3). The central idea here, just as in mediation theory, is that technologies support certain activities and values, while discouraging others. In the VSD method, technical investigations can concern both the ‘value impacts’ of existing technologies and the explicit design of technologies that support specific values.

For the privacy-sensitive web browser, technical investigations concerned the development of the technological properties of web browsers in relation to their impact on privacy. What are the default settings for cookie use? How much information is disclosed to users about the benefits and disadvantages of cookies, and about the specific information that cookies make available about their surfing behaviour? On the basis of this analysis, a redesign of the (Mozilla) web browser was made. This browser introduced a peripheral awareness of cookies; information about individual cookies and cookies in general; and user management of cookies. Empirical investigations were subsequently directed at the ways users appreciated this redesign, in order to adapt it in the most desirable direction.
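
The sketch below gives a rough impression of those three features in code. It is a hypothetical Python illustration only; the class and method names are invented here and do not describe the actual Mozilla redesign documented by Friedman et al. (2002).

```python
# An illustrative sketch of the three features of the redesigned browser:
# peripheral awareness of cookies, information about individual cookies,
# and user management of cookies. Names are hypothetical.

class CookieManager:
    def __init__(self):
        self._cookies = {}

    def add(self, name, site, purpose):
        self._cookies[name] = {"site": site, "purpose": purpose}
        self._notify_peripherally(name)

    def _notify_peripherally(self, name):
        # Peripheral awareness: a low-key signal rather than a blocking dialog.
        print(f"[cookie indicator] new cookie set: {name}")

    def describe(self, name):
        # Information about an individual cookie.
        return self._cookies[name]

    def remove(self, name):
        # User management: cookies can be inspected and deleted at will.
        del self._cookies[name]


manager = CookieManager()
manager.add("prefs", "example.org", "remembers display settings")
print(manager.describe("prefs"))
manager.remove("prefs")
```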

Reference

Excerpt from P.P. Verbeek (2013). ‘Technology Design as Experimental Ethics’. In: S. van der Burg and Tsj. Swierstra (eds.), Ethics on the Laboratory Floor. Basingstoke: Palgrave Macmillan, pp. 83-100. ISBN 9781137002921

