The Mouse of Virtual Reality, or the Tomorrowland Orb
(Image: Tomorrowland poster)
The motivation for this project was to replicate the future technology shown in the film Tomorrowland: specifically, a scene in which the characters can travel to any location and time using a sphere in the center of a room, navigating the world by rotating and touching the controller. I wanted to recreate this controller for virtual reality, and also to explore the potential of a mouse-like device for VR instead of a “touchscreen”-like one.
Many people and companies are currently exploring gloves and gesture-controlled interfaces for navigating virtual environments. However, there may be use cases for a device that doesn’t require pointing or gesturing, but only subtle movements of the hand, like a mouse on a traditional, flat screen.
Overview of Approach
The orb is controlled by rotating the device as well as by gestures on its surface (pinching). You operate the orb standing, with a virtual reality headset on and your hands resting on top of the sphere. Throughout the explanations below, I try to give the reasons I chose each approach over the alternatives.
Hardware – Mechanical
Mechanically, a 4″-diameter PVC pipe stands vertically on a base. A flower pot, hidden from view, fits snugly into the top of the pipe. The flower pot is surrounded by a lampshade (hot-glued to the base of the pot), and the pot is filled with a confetti-like material to take up the empty space.
Therefore, the only things a user can see from the outside are the pipe and lampshade, with the orb resting on top. The externally visible components were spray-painted silver to give a futuristic look.
A hacked mouse (see below) rests on top of the confetti, and the orb sits atop the confetti and mouse. Inside the orb, an unattached half-sphere rests at the bottom. An iPhone is placed on top of the half-sphere, so that when the orb rotates, the phone stays at the bottom of the sphere without flipping over.
All of the hardware was purchased at Home Depot, and none of it required machining or power tools (since I do not have readily available access to either this semester).
Hardware – Electrical
The orb input is completely wireless. The rotation information is picked up by an upside-down hacked mouse: its buttons are removed so that the orb cannot accidentally click, and as the ball rotates, the mouse’s laser sensor picks up the ball’s movement.
A user’s touch is picked up by the camera on the iPhone inside the sphere (see Software — iPhone, below).
Software — Overview
Rotational information from the mouse is sent to the computer via Bluetooth and interpreted as translational motion. The touch information from the iPhone is sent “to the cloud,” and the Mac constantly checks whether the touch state on the server has been updated. (Direct iPhone-to-Mac communication is not officially supported by Apple’s APIs and is therefore very unreliable; the easiest workaround is UDP packets over USB, but I wanted the system to be wireless.)
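The Mac-side check is part of the Unity application described below, but the polling logic itself is simple. Here is a rough sketch in Python; the function names, the 0.25-second interval, and the `max_polls` cutoff are illustrative assumptions, not the project’s actual code:

```python
import time

def poll_for_pinch(fetch_flag, on_pinch, interval=0.25, max_polls=None):
    """Fire on_pinch whenever the server-side flag changes.

    fetch_flag: callable returning the current boolean stored on the
                server (a hypothetical stand-in for a Parse query).
    on_pinch:   callback invoked with the new value on each change.
    interval:   seconds between polls (an illustrative guess).
    max_polls:  stop after this many checks (None = poll forever).
    """
    last = fetch_flag()
    polls = 0
    while max_polls is None or polls < max_polls:
        current = fetch_flag()
        if current != last:        # the iPhone updated the flag
            on_pinch(current)
            last = current
        polls += 1
        if max_polls is None or polls < max_polls:
            time.sleep(interval)
```

A real implementation would replace `fetch_flag` with a query against Parse over HTTPS, and inside Unity this would run as a coroutine rather than a blocking loop.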
Software — iPhone
The iPhone runs an app I created to sense the pinch gesture on top of the sphere. OpenCV and Parse were the only two third-party libraries used to create this app, both of which are available for free. The software loop is explained in detail here:
- For a given frame of video:
- Run a Canny edge-detection filter (the result for two fingers is shown in the image to the right)
- Check whether the edges appear to be moving toward a common center, or away from each other
- If either is true, check that the gesture is sustained for about one second.
- If the gesture is sustained for the full second:
- Send a request to Parse to update a boolean flag stored on its servers.
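The convergence test in the loop above (everything between the Canny step and the Parse update) can be sketched as a comparison of edge-point spread between consecutive frames. This is an illustrative Python sketch, not the app’s Objective-C/OpenCV code; the spread metric and the 2-pixel threshold are assumptions:

```python
def edge_spread(points):
    """Mean distance of edge points from their centroid."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    return sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 for x, y in points) / n

def pinch_direction(prev_edges, curr_edges, threshold=2.0):
    """Classify edge motion between two frames as 'in', 'out', or None.

    prev_edges/curr_edges: (x, y) edge pixels taken from the Canny
    output of consecutive frames. Edges converging on their shared
    centroid read as a pinch-in; diverging edges read as a pinch-out.
    The threshold (in pixels) is an illustrative assumption.
    """
    delta = edge_spread(curr_edges) - edge_spread(prev_edges)
    if delta < -threshold:
        return "in"
    if delta > threshold:
        return "out"
    return None
```

The sustain step then amounts to requiring the same direction for roughly a second’s worth of consecutive frames before updating the flag on Parse.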
Software — Mac
The Mac runs a Unity application I created to handle the inputs and display the results on the Oculus VR headset. The rotation information is received via Bluetooth, and the pinch control is received by constantly polling the Parse server for changes.
Using C#, these values are converted into changes in the view.
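The actual conversion lives in a C# script in Unity; as a rough sketch of the idea (written in Python for brevity, with the axis conventions, yaw rotation, and speed factor all being assumptions rather than the project’s real values):

```python
import math

def mouse_delta_to_translation(dx, dy, yaw_degrees, speed=0.01):
    """Map raw mouse deltas (from the orb's rotation) to a world-space
    translation (x, y, z), rotated by the camera's current yaw so that
    pushing the orb forward always moves the viewer forward.
    speed is an illustrative scale factor.
    """
    yaw = math.radians(yaw_degrees)
    # Local motion: -dy pushes forward/back, dx strafes left/right.
    forward, strafe = -dy * speed, dx * speed
    # Rotate the local vector about the vertical axis by the yaw,
    # using Unity-style axes (+z forward at yaw 0, y up).
    wx = strafe * math.cos(yaw) + forward * math.sin(yaw)
    wz = -strafe * math.sin(yaw) + forward * math.cos(yaw)
    return (wx, 0.0, wz)
```

In Unity this corresponds to scaling the mouse axes and applying the result with something like `transform.Translate` each frame; the sketch keeps the vertical component fixed so the viewer stays at ground height.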
- I had not used Unity before this semester, so I learned Unity and the C# language
- This included creating scripts, moving objects in the scene, and changing camera positions
- I learned how to integrate multiple data sources into an application
- I learned OpenCV for iOS (and in general)
Things to Expand On
- In the future, this system could take more inputs (for example, touch gestures other than pinch-to-zoom).
- With better access to machine shops, the industrial design could be improved to more closely replicate the movie.