
The 4 pillars of virtual worlds

In this article, we introduce the core elements used to put a virtual world together.

What would you need to know to put together a virtual world?

Pillars of virtual worlds

A VR experience depends on a few key enabling concepts. These allow the creator to implement a virtual world that achieves its intended objective.

In a way, these concepts are pillars that support the virtual world’s existence and intended experience.

These pillars are:

  • Computer graphics
  • Sounds, sensing and AI
  • Input systems and interactions
  • User

Computer graphics

This pillar deals with how virtual worlds look and how computing systems generate them. Users experience virtual worlds mainly through sight.

Computing systems create high-fidelity imagery that is shown to the user. We will spend some time familiarizing ourselves with the core concepts essential to building the VR experiences you sketched out in previous tasks.
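
To make the graphics pillar concrete, here is a minimal, hypothetical render-loop sketch in Python. The names (Object3D, Camera, render_frame) are illustrative assumptions, not the API of any particular VR engine; a real renderer produces pixels for the headset display rather than printed lines.

    from dataclasses import dataclass

    @dataclass
    class Object3D:
        name: str
        position: tuple      # (x, y, z) in world space

    @dataclass
    class Camera:
        position: tuple      # where the user's viewpoint is
        orientation: tuple   # which way the user is looking (yaw, pitch, roll)

    def render_frame(scene, camera):
        # Stand-in for the real work: projecting every object through the
        # camera and rasterizing it for the headset display.
        for obj in scene:
            print(f"Drawing {obj.name} at {obj.position} from viewpoint {camera.position}")

    # A VR system repeats this loop many times per second (typically 72-120 Hz)
    # so the imagery stays in step with the user's head movement.
    scene = [Object3D("table", (0.0, 0.0, 2.0)), Object3D("lamp", (1.0, 0.5, 2.5))]
    camera = Camera(position=(0.0, 1.7, 0.0), orientation=(0.0, 0.0, 0.0))

    for frame in range(3):   # three frames of a loop that normally never ends
        render_frame(scene, camera)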

Sounds, sensing and AI

Sound plays a significant role in VR applications. In VR experiences that simulate an environment or surroundings, users expect sound to be present.

Users may anticipate sounds when events occur (e.g. a glass falling and shattering). Sounds can trigger emotional responses too. Thus sounds are important for immersion in a virtual world.
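
To make event-triggered sound concrete, here is a minimal Python sketch. The play_sound and on_collision names are hypothetical stand-ins; real engines expose similar hooks through collision callbacks and spatial audio APIs.

    def play_sound(clip_name, position):
        # Stand-in for an engine's spatial audio call: a real system would play
        # the clip from the given 3D position so the user can localise it.
        print(f"Playing '{clip_name}' at {position}")

    def on_collision(object_a, object_b, impact_position):
        # A falling glass hitting the floor should produce the sound users expect.
        if object_a == "glass" and object_b == "floor":
            play_sound("glass_shatter", impact_position)

    on_collision("glass", "floor", (0.4, 0.0, 1.2))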

The ability of the virtual world to sense the context and intent of the user can result in powerful experiences.

Knowing where the user is located (whole body) and how they are oriented (body parts) within the virtual world allows the world to respond to the user as if they were part of it (see Figure 1).

Figure 1: Sensing the user’s position and intent to enhance the experience in the virtual world (adapted from: https://www.youtube.com/watch?v=Khoer5DpQkE)
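
As a rough sketch of the pose data such sensing produces, the Python below models whole-body position plus per-part orientation for one frame. The structure and values are illustrative assumptions; headsets and trackers expose equivalent pose data through their own SDKs.

    from dataclasses import dataclass

    @dataclass
    class Pose:
        position: tuple   # (x, y, z) of the tracked part in world space
        rotation: tuple   # orientation as (yaw, pitch, roll) in degrees

    # Hypothetical tracking snapshot for one frame.
    user_pose = {
        "head":       Pose((0.0, 1.70, 0.0), (15.0, 0.0, 0.0)),
        "left_hand":  Pose((-0.3, 1.10, 0.4), (0.0, 45.0, 0.0)),
        "right_hand": Pose((0.3, 1.15, 0.5), (0.0, -30.0, 0.0)),
    }

    def world_responds(pose):
        # With this data the world can react as if the user were part of it,
        # e.g. an NPC turning to face the user's head position.
        head = pose["head"]
        print(f"NPC turns toward the user at {head.position}")

    world_responds(user_pose)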

Immersion can be improved by sensing at many levels of detail, from coarse information such as the user’s GPS coordinates to high-precision full-body mapping and gaze tracking (see Figure 2).

Figure 2: Immersion with coarse GPS details
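
A small sketch of sensing at different granularities, from coarse GPS coordinates to fine gaze direction. The field names and feature mapping below are illustrative assumptions, not a standard API.

    # Hypothetical per-frame sensing snapshot at several levels of precision.
    sensing = {
        "gps": (51.5072, -0.1276),             # coarse: roughly where the user is
        "body_joints": {"head": (0.0, 1.7, 0.0), "right_wrist": (0.3, 1.1, 0.4)},
        "gaze_direction": (0.1, -0.2, 0.97),   # fine: where the eyes are pointing
    }

    def immersion_features(data):
        # The more precise the sensing, the more the world can tailor itself.
        features = []
        if "gps" in data:
            features.append("location-based content")
        if "body_joints" in data:
            features.append("full-body avatar mapping")
        if "gaze_direction" in data:
            features.append("gaze-aware interactions, e.g. eye contact with NPCs")
        return features

    print(immersion_features(sensing))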

Immersion also relies on conformance to the user’s pre-learned experiences: a virtual world that conforms to those experiences will be more immersive.

This is achieved by mimicking real-world behaviour and following the laws of physics. If the laws of physics are violated, immersion can be lost unless the violation is intentional and core to the experience (see Figure 3).

Figure 3 shows objects in a virtual world that violate the laws of physics: they have lost their balance or are positioned in the wrong orientation, and so do not sit convincingly in the environment. When such violations appear in a virtual world’s design, immersion can be broken.

The designer should only allow the violation to occur when it is essential for the experience. AI algorithms can help achieve these objectives.

Figure 3: Objects violating the laws of physics in a virtual world
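
As a minimal illustration of conforming to one pre-learned law, gravity, here is a Python sketch with an explicit flag for the rare case where violating it is intentional and core to the experience. The names and values are illustrative, not from a particular engine.

    GRAVITY = -9.81  # m/s^2, matching the user's real-world experience

    def update_object(height, velocity, dt, gravity_enabled=True):
        # Advance one object by one timestep. Disabling gravity should be a
        # deliberate design choice (e.g. a zero-g level); otherwise immersion breaks.
        if gravity_enabled:
            velocity += GRAVITY * dt
        height = max(0.0, height + velocity * dt)  # stop at the floor
        return height, velocity

    # A dropped glass should fall, not hover.
    height, velocity = 1.0, 0.0
    for step in range(5):
        height, velocity = update_object(height, velocity, dt=0.1)
        print(f"t={(step + 1) * 0.1:.1f}s height={height:.2f}m")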

Similarly, anthropomorphic entities are expected to demonstrate intelligent behaviour. AI algorithms are used to create “intelligent” entities.

This means the virtual world as a whole is also expected to exhibit intelligent behaviour to a certain degree. A balanced combination of sounds, sensing and AI can help create an immersive and engrossing experience in the VR environment (see Figure 4).

Figure 4: Anthropomorphic entities exhibiting intelligent behaviour (adapted from: https://www.youtube.com/watch?v=JyG9Y3BMLT0)
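
As one simple way to sketch “intelligent” behaviour, here is a tiny decision rule for an anthropomorphic entity in Python. Real projects would use an engine’s AI tooling or more sophisticated techniques; the states and distance thresholds below are assumptions.

    def npc_behaviour(distance_to_user):
        # Pick a behaviour based on how close the user is. Even a small
        # decision rule like this can make an entity feel responsive.
        if distance_to_user < 1.5:
            return "greet"           # within conversational range
        elif distance_to_user < 5.0:
            return "turn_to_face"
        else:
            return "idle_wander"

    for d in (0.8, 3.0, 12.0):
        print(f"user at {d} m -> NPC action: {npc_behaviour(d)}")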

Input systems and interaction

The user interacts with the VR experience through input systems. Input systems provide an interface through which the user can directly convey their intent and actions to the VR system.

Examples of input systems used for virtual worlds include:

  • Game controllers
    The Microsoft Xbox One and Sony PS4 console controllers fall into this category. Figure 5 (Game controller device – first image from L-R) shows an example of a game controller.
  • Hand-tracking sensors
    Figure 5 (Hand-tracking motion input sensors – second image [middle] from L-R) shows low-cost input devices that track hand motions and recognize gestures in virtual worlds.
  • Wireless hand, body and eye trackers
    These devices combine wand-like hand-motion sensing with control buttons similar to those found on game controllers (see Figure 5, Wireless hand and body tracker – third image from L-R).

Figure 5: Images of input systems and interaction
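
A hedged sketch of how a VR application might poll an input system each frame and map raw device state to user intent. The read_controller_state call below is a hypothetical stand-in; real systems expose equivalents through OpenXR or vendor SDKs.

    def read_controller_state():
        # Hypothetical stand-in for a device API call; a real SDK would return
        # live button, trigger and thumbstick values for the tracked controller.
        return {"trigger": 0.9, "grip": 0.1, "thumbstick": (0.0, 1.0)}

    def map_input_to_intent(state):
        # Translate raw input values into the user's intended action.
        if state["trigger"] > 0.8:
            return "grab_object"
        x, y = state["thumbstick"]
        if abs(x) > 0.5 or abs(y) > 0.5:
            return "move"
        return "none"

    # Polled once per frame, alongside rendering.
    state = read_controller_state()
    print(f"User intent this frame: {map_input_to_intent(state)}")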

User

The user is the key element in the virtual world experience. The entire existence of the virtual world is predicated on users being able to experience it.

It is critical to understand users’ needs and to design the virtual world experience with the sole objective of helping users achieve, perceive and experience what they want within the virtual world.

As with any software system, virtual worlds designed without users in mind are unlikely to achieve their main goals.

Figure 6 illustrates a user experiencing a virtual world. A virtual world designed without the user in mind is meaningless, so users’ needs are an important consideration when developing a virtual world application.

Figure 6: A user experiencing and interacting with a virtual world

This article is from the free online course Construct a Virtual Reality Experience.
