MR UX Design: Seattle AR/VR Hackathon

First Magic Leap Award at Seattle AR/VR Hackathon

Designing an Interactive Immersive Application for the Magic Leap One

Overview

This project was completed during the Seattle AR/VR Hackathon, which I was invited to participate in during September 2018. I acted as both the Lead UX Designer and 3D Artist for the application. Because this was a Magic Leap-based project, I worked with five developers and another 3D artist to create a seamless MR experience that showcased some of the most advanced features the Magic Leap One headset (ML1) offered at the time. The entire project was created over the course of a couple of days and utilized eye tracking and hand gesture technology.


Execution

As the Lead UX Designer, I wanted to work with the developers to create something we would all enjoy, and something that also utilized the advanced technology the ML1 had to offer. We assembled Friday evening and ideated on many different project types and styles before settling on a game puppeteering system that used the hand gesture software appropriately while also playing to the game development backgrounds of many of our team members. We determined this would be beneficial to everyone in the group, and also something we could reasonably create within the time span of the hackathon.


Saturday morning rolled around, and after a good night's sleep the main developer and I collaborated to figure out what would make the most logical sense for a puppeteering game. Looking at the available hand gestures, I found that many of them operated very similarly, so I mapped them to action-based properties that one main character could perform. We settled on a medieval theme, with our main character being a dragon. This dragon would have three main actions that the user could control through the hand gesture and eye tracking systems native to the ML1.


Seen below are some of the environmental models that I created. Prior to this project, I had done a little environmental modeling, but far less Unity scripting for OnCollision effects. This was a really fun experiment in learning and playing with Rigidbody physics in Unity, and I welcomed the opportunity to create new models for the interaction system.
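For readers unfamiliar with the pattern, the kind of collision scripting mentioned above looks roughly like this in Unity. This is a minimal sketch, not our actual hackathon code: the `firePrefab` field and the `"Flammable"` tag are illustrative names, and the script assumes it sits on a GameObject that has a Collider and a non-kinematic Rigidbody (otherwise Unity never calls `OnCollisionEnter`).

```csharp
using UnityEngine;

// Hypothetical example: when this Rigidbody-equipped object (e.g. a
// fireball the dragon breathes) collides with an environment model
// tagged "Flammable", spawn a fire particle effect at the contact point.
public class FireOnCollision : MonoBehaviour
{
    [SerializeField] private GameObject firePrefab; // particle-effect prefab, assigned in the Inspector

    private void OnCollisionEnter(Collision collision)
    {
        if (collision.gameObject.CompareTag("Flammable"))
        {
            // Use the first contact point so the effect appears
            // exactly where the two colliders touched.
            ContactPoint contact = collision.GetContact(0);
            Instantiate(firePrefab, contact.point, Quaternion.identity);
        }
    }
}
```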


Final Result

This project was amazingly fun to play with, and we won first place in the Magic Leap project category for our integration of hand gestures and eye tracking. It really showed me the extent of this hardware's capabilities, and how important it is to design with the hardware in mind.


Below is a video demo of the project, along with all of the individuals who made it possible (it was no small feat). I had a wonderful time learning about the ML1 and working with these developers, and I hope to do it again soon!
