Throughout the years, humans have created many forms of entertainment. More recently, there has been a shift toward making entertainment much more interactive. With the advent of virtual reality technology such as the Oculus Rift, people can experience virtual worlds like never before. VR puts you in worlds you could only dream of. Yet there is still something missing.
Science fiction has presented us with many incredible realms that many of us would love to visit and experience. Using virtual reality, we are getting much closer to being able to do that. In books such as Ready Player One, the characters are fully immersed in virtual worlds, seemingly able to live in them. Today, though, the best we can really do is look.
Currently, in video games and virtual experiences, we can walk across fantastic landscapes, and it can feel like we really are there. Yet when we try to interact with our environment, we get no feedback, and that can really take us out of the experience.
My project is VirtuTouch, and it aims to bring our sense of touch into the virtual realm. With head-mounted displays such as the Oculus Rift, we can experience virtual reality around us. This project pairs that technology with a system that supplies haptic feedback based on what is happening in the virtual world, creating an even more immersive experience. The hope is that this project brings us one step closer to truly being able to experience the fantastic worlds we keep reading about in books.
I first looked into various ways to replicate the sense of touch. Some of the ideas that came to mind were using a motor/servo rig to physically press on the user’s arm, or using piezoelectric materials to create a finer resolution of touch. I ended up going with transcutaneous electrical nerve stimulation (TENS), a technique usually used to treat sore muscles and joints with small electric pulses, applied through electrodes placed on the skin.
I got a TENS machine and repurposed it for this project. By experimenting with the settings and the placement of the electrodes, I was able to produce a feeling similar to being tapped on the arm in different regions.
In order to control this accurately, I used an Arduino and a relay switch so that I could control each region separately.
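The control logic amounts to giving each arm region its own relay channel and pulsing it on demand. Here is a rough sketch of that idea as a hypothetical Python mock (the real version runs on the Arduino; the pin numbers, region names, and the `set_pin` callback are all invented for illustration):

```python
# Hypothetical mock of the relay-control logic. The actual project runs
# on an Arduino; pin numbers and region names here are invented examples.

# Each arm region gets its own relay channel, so the electrodes in each
# region can be pulsed independently.
REGION_TO_RELAY_PIN = {
    "upper_arm": 2,
    "forearm": 3,
    "wrist": 4,
}

def pulse_region(region, set_pin):
    """Close the relay for `region` briefly so the TENS pulse gets through.

    `set_pin(pin, state)` stands in for Arduino's digitalWrite().
    """
    pin = REGION_TO_RELAY_PIN[region]
    set_pin(pin, True)   # relay closed: TENS signal reaches the electrodes
    # ... hold for the desired pulse duration ...
    set_pin(pin, False)  # relay open again

# Example: record the pin transitions a single pulse would produce.
events = []
pulse_region("forearm", lambda pin, state: events.append((pin, state)))
```

Keeping one relay per region means the TENS unit itself never needs reconfiguring at runtime; the Arduino only decides which path the existing signal is allowed to take.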
I then turned to Unity and the Oculus Rift to create my virtual test environment. Using the Oculus SDK, I was able to get things up and running fairly easily.
I used a Leap Motion and its Unity integration to add accurate, real-time hand tracking to the application. This way, the user can use their real hands to interact with their surroundings, adding another level of immersion.
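The in-game side of this lives in Unity scripts, but the core mapping is simple: quantize the contact point along the virtual arm into one of the electrode regions. A hypothetical Python sketch of that quantization (the arm length and region count are made-up values, not the project's actual parameters):

```python
def region_for_contact(distance_from_wrist, arm_length=0.5, num_regions=3):
    """Map a contact point along the arm to an electrode region index.

    distance_from_wrist: metres from the wrist toward the shoulder,
    measured along the virtual arm. Out-of-range values are clamped.
    """
    clamped = min(max(distance_from_wrist, 0.0), arm_length)
    # Split the arm into equal-length bands, one band per electrode region.
    index = int(clamped / arm_length * num_regions)
    return min(index, num_regions - 1)  # keep the very end in the last band
```

A contact near the wrist maps to region 0 and one near the top of the forearm to the last region, so each collision event can be translated directly into which electrode pair to fire.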
The last step was to get it all working together. I wrote some code to let Unity communicate with the Arduino over a serial connection, telling it which region of the arm was being hit so it could activate the corresponding electrodes. Currently the project works well on Windows, but it has problems on Mac because of serial-connection issues between Unity and the Arduino.
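One simple way to frame that serial traffic is a single command byte per hit event, with the Unity side encoding the region index and the Arduino side decoding it. The sketch below is a hypothetical Python mock of such a protocol, not the project's actual wire format:

```python
# Hypothetical mock of a Unity->Arduino serial protocol: one command
# byte per hit event. The framing here is an invented example, not the
# project's actual code.

NUM_REGIONS = 3  # assumed number of electrode regions

def encode_hit(region_index):
    """Unity side: turn a region index into a single command byte."""
    if not 0 <= region_index < NUM_REGIONS:
        raise ValueError("unknown region")
    return bytes([region_index])

def decode_hit(payload):
    """Arduino side: recover the region index from the command byte."""
    region_index = payload[0]
    if region_index >= NUM_REGIONS:
        raise ValueError("unknown region")
    return region_index
```

Keeping the message to one byte sidesteps most framing problems: the receiver never has to resynchronise a partially read packet, which matters when the serial link itself is flaky, as on the Mac side here.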
The next steps for this project are to refine the electrodes and expand them to cover more regions. Another idea: with the electrodes placed in the right spots, it is possible to activate the muscles in that area. This could be used to simulate the weight of objects picked up in the virtual world, or to induce strain when climbing ladders and the like.
Here is a link to all of the code and Unity files used in this project: https://www.dropbox.com/sh/329kq3xykim39wa/AABQsXXyfvZ0QztoaxrPF77za?dl=0
Project By: Phil Cherner