What is Omni

Challenges & Brainstorm

Design Process

Prototype & Deliverables

Storytelling Demo Video

Influences

Omni learns about you from your virtual data. Eventually, it can read your mind

It's hard to be a good human assistant while knowing nothing about your boss's mind, and the same is true for a virtual assistant. Omni imitates how humans come to understand others' wishes: by learning from their behaviours and experiences. With the user's permission, Omni records and searches through all of the virtual data the user creates, such as the places they have been and their social posts and collections. Omni then analyzes this data with its algorithm, building an ever-richer knowledge of the user's characteristics, preferences and habits over time. With this deep understanding of the user, Omni can eventually "read the user's mind."

AR device + smart band with motion/muscle sensors

Omni is built on AR so that it can observe the situation around the user. Because it sees, it can react quickly and accurately. Omni can also be operated by gesture controls for situations where the user can't, or prefers not to, talk. Gestures are captured by a smart band worn on the wrist with muscle sensors, so the user is not limited by the display device's sensing range.

Make eye contact with Omni to activate it whenever needed

Based on our research, people naturally gaze at the companions they wish to rely on. We thought this would be an efficient and charming interaction between the user and Omni, so we integrated it into our design. To activate Omni, the user simply gazes at Omni at the top right of their view. Different microexpressions accompanying these gazes also offer valuable feedback to Omni, for example, whether or not the user is satisfied.

Omni observes the environment, listens to you, and assists you by interacting with the physical world

Omni is always standing by, observing the surrounding environment along with the user's behaviours and physiological data. It also listens to the user, learning from the pieces of their life. When the user has a demand, Omni can react to it immediately, analyzing and deciding what assistance to offer, and how, based on its previous observations. It then uses AR to guide and indicate directly within the user's visual field. Omni may also respond smartly without being asked: for example, if the user has been depressed for a long time, Omni will find an indirect, comforting way to cheer them up.

FIRE your assistant if it performs poorly. The system will learn and "hire" you a better one

Inspired by our low-fidelity tests and interviews with experienced Siri users, I designed this UX flow. Users used to have no recourse against a poorly performing virtual assistant, but not anymore. Omni allows the user to "fire" the current assistant if its learning progress is poor. The system marks this as an important warning and learns a lesson from it to improve its experience and knowledge of the user. At the same time, a new assistant (a self-updated version under a new name and avatar) is "hired" in its place.

Named after "omniscience", Omni is a "mind-reading" AR assistant. It figures out the user's inner desires and offers the most user-centered, self-satisfying assistance.
By the way, you can fire your assistant and "hire" a better one!

A perfect human assistant in real life understands their boss so well that they can offer perfectly satisfying assistance. Omni imitates this, learning the user's characteristics to figure out their true desires and assisting them like a true friend after their own heart. Unlike current virtual assistants, Omni focuses on fulfilling the user's subjective self-satisfaction rather than objective correctness.

Our team's design flow & Strategies

We interviewed several voice assistant users and quoted some of their experiences & concerns.

It's hard to ask a virtual assistant for opinions: it delivers only what logically, but not always emotionally, fits the user's wishes

People always have a preference in mind, but hesitate because of side factors or the lack of persuasive, objective support;
+ Satisfying Assistant = Fit the user's true desire
+ Keep logical awareness, but also bring humanity

Virtual assistants know nothing about the user's true desires, preferences or demands.

Content on social networks reflects a person's personality and preferences; map, alarm and calendar applications, etc. reflect a person's habits;
+ A virtual assistant should learn about the user
+ Social networks reflect the user's personality
+ Make use of the virtual reflections of the user

The virtual assistant can't sense what is going on or proactively offer accurate assistance.

Sensors + cameras to observe the world & the ongoing situation; analyze & understand the actual demands & choices; voice, gesture or voice + gesture control for various scenarios;
+ Explore how AR can make both the virtual assistant and the user interact between two worlds

Bring hundreds of audience members into a live e-sports game interactively and enjoyably; research the various desires of potential users while they watch live games.

Build game mechanisms and in-game interactions that are attractive, interesting, playable and immersive alongside the live e-sports game.

Enrich the players' game experience with visual, physical and psychological feedback and interactions between them and the game.

Instead of having the audience watch e-sports from outside the game itself, Taunt wants to involve the audience in the live game, interacting with other fans & audience members and making the whole experience more playable and "game-like".

My responsibilities during my internship were substantial. First, I was fully involved in the design of the first-ever open beta product (launched in late August 2018); then I provided a refined UX design for our next possible iteration to make the game more interactive and interesting.

When I joined as a UX Design intern in June 2018, the team had just decided to redesign the product and its whole UX, and planned to launch the first-ever open beta in app stores in late August. I mainly worked with Kevin Hanna, co-designing the whole product and its microinteractions for fast, start-up-paced iterations. At the end of my internship, we launched the application together, and I designed a refined UX for the product. Some of its features and designs have since been built into the latest update for a better in-game experience.

Omni: an AR virtual assistant that understands your true desires, like your inner avatar

When a live e-sports match starts, a Taunt game starts, connecting hundreds of fans and audience members all over the world and bringing them into the live match. Taunt creates an online mobile card game layered over the live match, allowing players to challenge others with their predictions of the match status and their confidence in the team and players they trust. Players can also use ability cards (Taunt cards) to gain advantages and win points against other players, or simply to show off and taunt other fans. This way, fans and audience members are drawn into the match, rather than just watching motionless in front of the screen.

In the game, players first pick their preferred team, players and initial cards during the pre-game session. Then they are divided into different tables for a series of rotating game sessions until the live match ends. The goal is to win points.

Storyline 1

How Omni magically offers instant satisfying suggestions

Storyline 2

How Omni smartly comforts the user when it detects possible depression or a bad mental state

Final Presentation

We presented our UX design for Omni and our vision of balancing personal data usage with privacy protection.

This project taught me and the team a lot about what AI and future virtual assistants could become, and how we should make proper use of our digital data and identity. The balance is indeed hard to achieve, as not everyone will feel safe making their private data "open", even to a virtual assistant that computes entirely on a local database. However, the experience design here persuasively solved many of the problems we used to face. Indeed, a well-prepared educational design and clarity about how the data will be used help a great deal.