
Mixed Prototyping: Test your UI with AR

Alyona Morozova UX Designer

Sandra Engel Senior Communication Manager, Leadmanagement Expert

21.10.2020 • 8 minutes reading time

"The technologies that we design do not, and never will, exist in a vacuum." — Bill Buxton

Imagine this: you have an idea or a product in an early development stage. Now, you want to know if it fulfills its purpose and gain a better understanding of its future Look & Feel. You also want to test its acceptance with real users.

Not yet developed, your product leaves no chance to be tested conventionally. But you would still like to know in advance whether the design is worth further development.

What would you do?

Leaves room for improvement: classic prototyping

Typically, usability professionals start by defining the product goals and user groups, and continue by developing prototypes for testing. Prototypes, as opposed to the final product, need not be fully functional. Basically, they only need to fulfill two goals: look like the product and let researchers test a selected number of scenarios.

So far, so good.

But imagine you’re prototyping software or an embedded UI for an industrial machine or a complex environment. In this context, the interaction with and impression of the machine might be crucial to the success of your interface.

Will testing your idea in a neutral environment deliver a realistic experience?
How could you efficiently test an interface without its respective hardware?

Well… Mixed Reality might be the answer.

UX Designer from Ergosign holds HoloLens
When the project started we used the HoloLens 1 — the second generation was not available yet

Exploring new possibilities: Prototyping in Mixed Reality

There are various ways of prototyping, ranging from completely physical to completely virtual scenarios.

Before Microsoft’s HoloLens 2 was released, we explored augmenting the hardware to the testing scenario using the first generation of HoloLens.

The idea was to give the user a more realistic understanding of the size and appearance of a product, i.e. the hardware, plus a real display to interact with.

Imagine the basic workflow as follows:

  1. A designer creates a digital UI prototype in a prototyping tool and exports it to a web format (HTML, CSS, JS).
  2. The resulting export is used as an input for a hologram which represents a physical device.
  3. During the testing, the user wears a HoloLens. He or she interacts with the interface through the browser on a smartphone (or any other web-enabled device) and gets instant animation or sound feedback.
  4. The machine surrounding the interface is added as a hologram to the screen via the HoloLens.
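The data flowing through this workflow boils down to two small messages: the UI event the web client reports, and the state the hologram client reads back. A minimal sketch in plain JavaScript (all field names here are illustrative, not Ergosign’s actual protocol):

```javascript
// Sketch of the two message types in the workflow above.
// All field names are hypothetical.

// 1) The web client reports a user interaction on the UI prototype.
function buildUiEvent(elementId, action) {
  return {
    type: 'ui-event',
    elementId,             // e.g. the button the user tapped
    action,                // e.g. 'tap'
    timestamp: Date.now(),
  };
}

// 2) The server answers the HoloLens client's state request with the
//    latest event, so the hologram can play the matching animation.
function buildStateResponse(lastEvent) {
  return {
    type: 'state',
    lastEvent: lastEvent ?? null,
  };
}

const event = buildUiEvent('brew-button', 'tap');
const response = buildStateResponse(event);
```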

Both the hologram and the interface are now seen as one system. The interaction and the interface can be tested more realistically.

Hence designers and developers can test a ready-to-interact product more quickly.

Our testing scenario: a new interface for a coffee machine

Of course, we tested the concept at Ergosign. Here’s how:


Working in the background: the technology

Communication model of the interaction between the servers, clients and the HoloLens

We set up an IPC network for the interaction between the two devices (the screen and the HoloLens) and a user. It includes three major components:

  1. NodeJS Server
  2. Socket.io Webclient
  3. TCP .NET Client

The NodeJS server runs on a computer, and two clients run on the smartphone and the HoloLens.

We used WebSockets, a TCP based network protocol for a web browser client, with a Unity/UWP (Universal Windows Platform) client.

Code Snippet

The web client sends a message to the server, while the UWP client constantly requests state updates from the server.
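Transport aside, this push/poll pattern amounts to a small piece of shared state held on the server. A minimal sketch in plain JavaScript (independent of Socket.io and the .NET TCP client, with hypothetical names):

```javascript
// Minimal sketch of the server-side state shared between the two clients:
// the web client pushes events, the HoloLens client polls for the latest one.
class StateRelay {
  constructor() {
    this.lastEvent = null;
  }

  // Called when the web client sends a message.
  push(event) {
    this.lastEvent = event;
  }

  // Called on every state request from the TCP/UWP client.
  poll() {
    return this.lastEvent;
  }
}

const relay = new StateRelay();
relay.push({ elementId: 'brew-button', action: 'tap' });
// relay.poll() now returns the tap event for the hologram to animate.
```

In the real setup, `push` would sit behind a Socket.io message handler and `poll` behind the TCP endpoint the Unity/UWP client requests.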

The input that the WebSocket client receives serves as a trigger for the animation of the virtual prototype. If a user starts a machine process, the machine hologram reacts accordingly.
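The mapping from an incoming input event to a hologram animation can be as simple as a lookup table. A sketch (the element IDs and Unity trigger names here are made up for illustration):

```javascript
// Hypothetical mapping from UI events to Unity animation triggers.
const animationTriggers = {
  'brew-button': 'PlayBrewAnimation',    // e.g. coffee-pouring particle system
  'steam-button': 'PlaySteamAnimation',
  'power-button': 'PlayStartupAnimation',
};

// Resolve the trigger for an incoming event; unknown elements are ignored.
function triggerFor(event) {
  return animationTriggers[event.elementId] ?? null;
}
```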

Our 3D models are done in Blender or Autodesk 3ds Max, all interactions and particle systems are implemented in Unity.

We also used HammerJS for recognizing touch interactions from the web client and mapping a virtual model to the UI with the help of Vuforia SDK in Unity.
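On the web client, the gesture handling splits into a pure mapping step and the HammerJS wiring. A sketch under these assumptions — the element ID and server address are invented, and the wiring part only runs in a browser where HammerJS and the Socket.io client are loaded:

```javascript
// Turn a recognized gesture into the message sent to the server.
function gestureToEvent(gestureType, targetId) {
  return { elementId: targetId, action: gestureType, timestamp: Date.now() };
}

// Browser-only wiring: HammerJS recognizes the touch gesture on the UI,
// and the resulting event is emitted over the Socket.io connection.
if (typeof Hammer !== 'undefined' && typeof io !== 'undefined') {
  const socket = io('http://localhost:3000');  // server address is an assumption
  const hammer = new Hammer(document.getElementById('coffee-ui'));
  hammer.on('tap', (e) => {
    socket.emit('ui-event', gestureToEvent('tap', e.target.id));
  });
}
```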

This project started before the second generation of the HoloLens was published and delivered, so the machine hologram was rendered with the first generation of the Microsoft HoloLens. We chose these glasses because they are wireless and hands-free, offer a decent resolution, and might be considered an industry standard. And because technology can have a big impact, we’re curious about how the HoloLens 2 will change the game!

Let’s go: our showcase

We built a prototype of a coffee machine to demonstrate how the whole setup functions:

Here's how our showcase works

The hologram of a coffee machine is anchored around the physical display of a smartphone. The smartphone is mounted on a tripod, with the UI opened in its browser. Because the interface is designed for a touch display, the user interaction experience is as close as possible to a real interaction with an embedded display.

The user is welcome to play around with the prototype and is offered classic usage scenarios.

This setup also allows us to test how the digital UI accounts for possible safety issues. For example, when a user makes herself a cup of tea, we can observe whether she notices that it is poured from another tap.

Summary: the numerous benefits

Remote usability testing has a good chance of becoming the new standard, and it requires a reusable, mobile, yet robust setup for smooth and effortless interaction.

It will provide the main stakeholders, namely designers, researchers, clients and system users, with a common ground.

It will also equip designers with a tool for better understanding the underlying physical product and for testing design ideas and interaction techniques early.
At the same time, this setup allows researchers to test whether the interface meets users’ needs.

Our project is the first step towards implementing virtual prototypes in a normal environment.

The role of remote work and testing has increased enormously this year. If you’re interested in this topic, you might also like our earlier article with tips and best practices for agile remote testing.

Let's shape the future together! Meet us in our offices in Germany, Switzerland and China for an inspiring talk.

Get in touch
