How can AR be more useful in our everyday lives?


My Role —

User Experience &
User Interface Design

@Samsung Research America

Pluto is a project that explored how AR technology could play a more useful role in our everyday lives. Our popular understanding of AR is either very science fiction or locked into interfaces positioned in 3D space; neither plays a big role in people's daily usage. Using intelligence from Samsung's Bixby, we designed experiences that make AR more than just entertainment.

How it works

AR recognition view

Augmented Reality shouldn’t just be about anchoring information on an object in the camera view. It should augment the user’s whole mobile experience; that is true Augmented Reality. One way to achieve this is by providing an efficient way to pull in helpful information based on the object in view. Through the AR view, the user can simply tap and drag to select a region containing the object, and information relevant to that object appears. The information can be quick links to relevant apps, or even deep links to a function inside a particular app.
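The flow above can be sketched roughly as: a selected region is passed to a recognizer, and the resulting label maps to actionable links. This is a hypothetical illustration only; the function names, labels, and link structure are invented, not the actual Pluto implementation.

```python
def recognize(region):
    """Stand-in for an on-device vision model; returns an object label.

    In the real system this would run image recognition on the
    cropped camera region the user selected.
    """
    return region.get("label", "unknown")

# Label -> actionable items: quick links to apps, deep links to app functions.
# (Illustrative mapping, not real data.)
ACTIONS = {
    "sneaker": [
        {"type": "quick_link", "app": "Shopping"},
        {"type": "deep_link", "app": "Shopping", "target": "search?q=sneaker"},
    ],
    "plant": [
        {"type": "quick_link", "app": "Plant Guide"},
    ],
}

def actions_for_region(region):
    """Map the recognized object to its quick links and deep links."""
    label = recognize(region)
    # Fall back to a generic action when the object is not recognized.
    return ACTIONS.get(label, [{"type": "quick_link", "app": "Web Search"}])
```

The key design idea is that recognition output is just a key into a table of actions, so third-party apps could register their own quick links or deep links per object category.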


Locale is an information hot spot provided by a third-party service and accessed through the AR lens. When users enter the hot spot, they receive an icon notification on the phone. This icon takes them to a screen with helpful information about their surroundings. Any function with the AR icon opens the camera view, and users can explore their environment to discover information placed on top of objects in the real world. Locale is an open platform solution that can attract third-party business partners to provide unique AR experiences that lead users to discover more.
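The hot spot trigger described above is essentially a geofence check: when the device's position falls inside a hot spot's radius, the notification icon is surfaced. The sketch below illustrates that check with a standard haversine distance; the hot spot data and names are invented for illustration, not Pluto's actual data.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two coordinates (haversine)."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Illustrative third-party hot spot registrations.
HOT_SPOTS = [
    {"name": "Museum Tour", "lat": 37.7749, "lon": -122.4194, "radius_m": 150},
]

def active_hot_spots(lat, lon):
    """Return the hot spots whose radius contains the current position."""
    return [h for h in HOT_SPOTS
            if distance_m(lat, lon, h["lat"], h["lon"]) <= h["radius_m"]]
```

On a phone this check would typically be delegated to the platform's geofencing service rather than polled manually, but the contract is the same: enter the radius, surface the icon.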


Through Pluto, we also wanted to find a way to augment how we consume content on our mobile phones. When users find something interesting on the web, they usually perform another action with that information in another app. We wanted to streamline this experience through OCR technology. AR recognition shouldn’t just be about the camera; it should also work on the content the user is looking at. With Assistant, the user gets quick links and even deep links to relevant applications.
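The Assistant flow can be sketched as: text recognized on screen is scanned for actionable entities, and each entity maps to a deep link into a relevant app. The patterns and link formats below are illustrative assumptions, not the production logic.

```python
import re

# Entity pattern -> deep-link builder. (Illustrative; a real assistant
# would use richer entity extraction than these regexes.)
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{3}-\d{4}\b"), lambda m: f"tel:{m}"),      # phone number
    (re.compile(r"\b[\w.]+@[\w.]+\.\w+\b"), lambda m: f"mailto:{m}"),  # email address
    (re.compile(r"\bhttps?://\S+\b"), lambda m: m),                    # plain URL
]

def suggest_links(text):
    """Return deep links for actionable entities found in OCR'd text."""
    links = []
    for pattern, to_link in PATTERNS:
        for match in pattern.findall(text):
            links.append(to_link(match))
    return links
```

For example, `suggest_links("Call 555-123-4567 or email info@example.com")` yields a dialer link and a mail link, which the UI can present as the organized, actionable items described above.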



AR technology has become mainstream. People use it to take unique selfies or to position graphics on objects in an environment. The entertainment value of AR is well established, but there hasn’t been a strong use case for everyday life. We wanted to explore ways to make AR practical.

Our challenge was to expand AR beyond entertainment.

Design Direction

To make AR technology practical, we first needed to truly understand what we mean by Augmented Reality. We realized that the term is hyper-focused on what users see through the camera. This limited understanding of reality needed to change: we wanted to expand the scope to the user’s whole mobile experience. AR should augment the entire experience of a user’s mobile phone, not just the camera view.

With computer vision technology, applications can provide shortcuts when the user wishes to act upon something they encounter. This could be an object in front of them, the content they are reading, or simply the place they are at. Today, when users wish to take action, they fire up the appropriate app to perform it. With AR technology, this process can be streamlined.

Efficiency became the major focus of the project, and we settled on three scenarios where users often decide to explore more information and take action: when a user sees an object of interest, when a user is at a place of interest, and when a user consumes information on the phone. These three scenarios formed the backbone for developing the AR recognition view, Locale, and the Assistant.

When people find a subject of interest, they use a search engine and pick an option from there.

With AR vision technology, this process of information query can be streamlined.


These are some early UI iterations. My main emphasis was to use less textual information anchored to the real-world view and to provide organized, actionable items to the user.

Selected Final UIs