Saccase – Turrets tracking Lasers tracking Pupils tracking Targets (Final Project – Daniel Fitzgerald)

Ideation

Normally, our eyes are our window into the world, the primary way we sense our environment, and the embodiment of our attention. Eyes evolved over millions of years to be fast, precise, and accurate, especially for tracking visual elements of interest like predators, prey, or baseballs. These are also the properties we usually seek in actuators as well as sensors, or at least in the control of such actuators. What if the eyes could be used not only to attentively focus on salient features in the world, but also as an interface to give saliency to our focus and attention? What if we added lasers?

Summarization

This project presents a wearable system that couples the pointing direction of a laser on a gimbal to the gaze direction of its user. A demonstration scenario with potential military applications is also presented, and the concept and system are evaluated and discussed.

Inspiration

Lasers are a common trope in science fiction, featured in everything from laser guns to laser screwdrivers. They are especially common on robots, notably automatic sentry turrets and “smart” weapons: laser-sighted weapons, laser-guided missiles, etc. In almost all cases, the laser is used as an indicator or control for some other process or device – the laser is an interface.

One particularly impressive example is in the 1987 sci-fi action movie Predator, starring Arnold Schwarzenegger. In one classic scene, the antagonist, an alien “big game hunter” stalking Schwarzenegger’s character, appears to use helmet-mounted laser beams to direct the aim of a shoulder-mounted automated plasma cannon. Although this is clearly a very advanced weapon that gives its user a striking advantage in fast-paced combat where the arms are needed for other tasks, like climbing jungle trees, it is not obvious exactly how this technology works in the film: How is the cannon triggered? Is it integrated with the Predator’s advanced multispectral visual targeting system? To what extent does the cannon aim on its own, and how much must it be guided by the precise alignment of its user’s head and helmet with the intended target?

 

Predator with laser sight/targeting and shoulder-mounted laser-tracking turret.

This project attempts to explore some possible solutions for laser-based interfaces, while also taking the concept of “hands-free” direction indication a step further: instead of pointing with your head, what if you could indicate a vector in space with a part of your body that you are likely already putting in perfect alignment with the subject of your attention in a natural, intuitive, fast, and precise way: your eyes?

Motivation

There are several advantages to eye-based control. When we think of “dexterity” we usually associate nimbleness, precision, and speed with the hands. However, we have other organs, the eyes, which are perhaps even faster and more precise, yet remain largely unused for interaction or control. Such an interaction method is, of course, also hands-free, another compelling property, and eye-tracking technology is becoming easy and inexpensive, with hobby-level systems exhibiting performance suitable for most tasks. Finally, when an eye-tracker is hooked up to a laser pointer, there is the undeniable cool factor of creating a system that essentially lets the user shoot lasers from their eyes.

Superman: The original laser-vision and laser-vision-based weapon.

Application

The primary exception to this underutilization is assistive technology for physically disabled users who cannot manipulate or interact with devices in the usual way.

A system developed by Artistic Realization Technologies allows an individual with cerebral palsy to paint by indicating to a human mediator with a head-mounted laser pointer.

The eyeWriter system allows people with ALS to “paint” by “drawing” with their gaze on a projected “canvas” (including walls and buildings).

However, there is also a class of tasks for which simply looking is itself a natural interaction method. These tasks usually involve targeting, pointing, selection, etc. via a 3D vector in space – the gaze direction. Although all of these actions are applicable to tasks on a standard computer UI (a 2D screen), and eye-control of common computer tasks (controlling a mouse, opening email, writing documents, etc.) has been well demonstrated, this is not the best use of our eyes for interacting with computers. Outside of accessibility cases, such tasks are usually accomplished far more efficiently with a standard keyboard and mouse or a touchscreen. Where the eyes may represent a natural and powerful interaction method, however, is in spatial interactions with objects in the real world (not directly within reach), where these standard interfaces fail.

 

Exploration

The obvious and powerful class of applications is targeting, marking, and indication. This includes interaction with laser-tracking devices, such as weapons like the Predator’s shoulder cannon, as well as interaction with any other camera-equipped device to which the user must indicate specific points or directions in the world. Examples include human-robot interaction and collaboration, and online shopping (stare at an item you’re interested in, and your phone automatically notices and finds the item from an online vendor).

Another class of applications could be psychological experimentation. Eye-tracking is already used extensively to analyze marketing material and user attention, but such studies are currently limited to highly controlled and intrusive testing environments. With wearable technology, this research could be conducted “in the field.” Furthermore, interesting crowd-psychology experiments could be conducted involving dozens of participants interacting with each other and their laser-indicated gaze directions. Will people be more or less likely to look in specific places, or at other people, if they know that everyone else knows exactly where they are looking at all times? The saliency of other people’s attention might make the individual’s attention less unique, since we feel an urge to look where others are looking. Can that urge be translated and mediated through lasers, so that it is no longer necessary to be looking at someone to feel compelled to follow their gaze?

Normalization

Any science-fiction technology can be “too weird” to be accepted, and must be designed to map the strange new experience to something that is already understood, usually through stories and metaphors, if it is to catch on. Shooting lasers from one’s eyes is a strange experience, but not a new concept – ancient cultures tended to assume that the eyes emit light and that this is the mechanism of sight (see extramission theory), a view refuted as early as the 11th century by Ibn al-Haytham, though versions of it persisted into the early modern era. Even in practice, the device only extends our existing expectations of the technology. We usually look where we point lasers anyway, and point lasers where we look (or want to indicate we’re looking). The primary common use of lasers is already pointing, and is already associated with gaze, so these uses are consistent with most people’s experience of lasers.

The “strangest” aspect of Saccase, then, is the eye-tracker and, more generally, the oddity of wearing any big, bulky technological device on one’s head. One solution is to package the system in an aesthetic reminiscent or characteristic of devices that exhibit similar modes or goals of interaction. These include telescopes, binoculars, cameras, and cellphones. Exploration of these and other potential normalizations should be the subject of future research.

Fabrication

The system consists of five primary subsystems: the eye-tracking camera, the laser-pointer gimbal, the laser-tracking turret, the driving electronics for the turret and gimbal, and a computer with software for eye tracking and vision processing.

  • Eye-Tracker: The eye tracker is based on the open-source Pupil system and consists of a standard USB webcam with the outer case and infrared filter removed. The webcam is mounted on the frame of a pair of safety glasses with an adjustable boom. In the first iteration, the webcam was positioned directly in front of the user’s eyeball, a couple of centimeters away. Although this allowed highly reliable eye tracking, it was also obtrusive. In the final version, the webcam “sees” the user’s pupil reflected off an infrared mirror placed at an angle in front of the eye. Because this mirror passes visible light, the user can look through it and hardly notice anything is there, while the webcam can be situated above the eye, looking down at the mirror, where it is less intrusive. Finally, the webcam’s status LEDs are replaced with infrared LEDs, angled to illuminate the eye area.
Pupil Eye Tracking “Glasses”

  • Laser Pointer: The laser pointer is simply a small laser module attached to a 2-degree-of-freedom (2-DOF) mini servo gimbal. This assembly is mounted to a GoPro head-strap camera mount (not shown).
Head-mounted Laser Gimbal

  • Tracking Turret: The turret consists of the functional components of a fully automatic Nerf gun, in a new case, mounted on a 2-DOF high-torque servo mechanism. A standard webcam is attached at the base. This assembly can be mounted on a modified GoPro shoulder-strap camera mount. (The turret can also be mounted on a standard camera tripod to act as a standalone automated sentry turret, with a separate laser targeting system or a Saccase-enabled human, of course.)
Automatic Laser-Guided NERF Turret


  • Control Electronics: The electronics consist of an Arduino Uno microcontroller, which controls all servo motors (four in total, through PWM), the DC motor of the Nerf gun (through a power MOSFET), the laser module on the gimbal (directly), and the illumination LEDs of the turret (through a constant-current LED driver circuit for 1 W LEDs). All servos and LEDs are powered by a separate regulated 5 V supply derived from a 7.4 V 2S 25C LiPo battery pack; the DC motor is powered directly by the battery. All electronics are housed in a protective box. (An illustrative sketch of the host-to-Arduino command link appears after this list.)
Electronics Box

  • Computer and Software: A MacBook Air coordinates the system and performs the computer vision processing. The two webcams and the Arduino are connected to the computer through a USB hub. The open-source Pupil Capture software processes the video stream from the eye-tracking webcam and broadcasts pupil positions over a local network socket. A custom Python script subscribes to these messages and commands the laser gimbal accordingly, mapping the pupil’s (x, y) image position to gimbal yaw and pitch angles. Inverse-distance-weighted interpolation is used over a set of 9 calibration points near the corners, edge-centers, and center of the eye’s field of view (FOV, assumed to be a rectangular frustum). These corresponding points are recorded during an initial calibration routine in which the laser is pointed at each point and the program records the average pupil position while the user looks at the laser dot.

After calibration, while the user is steering the laser with their gaze, a separate Python routine controls the laser-tracking turret. It processes the video stream from the turret webcam to detect laser dots: color filtering and thresholding produce a binary mask of likely laser-dot pixels, an erosion kernel is applied to eliminate single-pixel outliers, and, if enough pixels remain, the centroid of the laser dot is derived from the first moment of the mask. The (x, y) position of this centroid is linearly mapped to the (yaw, pitch) of the turret based on the webcam’s FOV. The only software connection between the eye-tracking laser gimbal and the laser-tracking turret is that winks detected by the eye-tracker activate the turret’s firing mechanism. Minimal sketches of these routines follow.
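As a concrete illustration of the calibration mapping described above, here is a minimal Python sketch of inverse-distance-weighted interpolation. The function name and array shapes are hypothetical (the actual script is not reproduced in this writeup); it assumes the nine (pupil position, gimbal angle) calibration pairs have already been recorded.

```python
import numpy as np

def idw_map(pupil_xy, cal_pupils, cal_angles, power=2.0, eps=1e-6):
    """Map a pupil (x, y) position to gimbal (yaw, pitch) by
    inverse-distance-weighted interpolation over the calibration points."""
    pupil_xy = np.asarray(pupil_xy, dtype=float)
    cal_pupils = np.asarray(cal_pupils, dtype=float)   # shape (9, 2): recorded pupil positions
    cal_angles = np.asarray(cal_angles, dtype=float)   # shape (9, 2): corresponding (yaw, pitch)

    d = np.linalg.norm(cal_pupils - pupil_xy, axis=1)  # distance to each calibration point
    if np.any(d < eps):                                # exactly on a calibration point
        return cal_angles[np.argmin(d)]
    w = 1.0 / d**power                                 # inverse-distance weights
    return (w[:, None] * cal_angles).sum(axis=0) / w.sum()
```

At run time, each new pupil sample from Pupil Capture is pushed through this mapping and the resulting angles are sent to the Arduino.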
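The turret’s laser-dot tracking routine can be sketched similarly with OpenCV. The color thresholds, frame size, and field-of-view values below are illustrative placeholders, not the project’s actual parameters.

```python
import cv2
import numpy as np

FRAME_W, FRAME_H = 640, 480              # webcam resolution (assumed)
CAM_FOV_YAW, CAM_FOV_PITCH = 60.0, 45.0  # webcam FOV in degrees (assumed)
MIN_PIXELS = 20                          # minimum mask area to accept a detection

def find_laser(frame_bgr):
    """Return (yaw, pitch) in degrees toward a red laser dot, or None."""
    # Color filter + threshold: bright, strongly red pixels (BGR order).
    mask = cv2.inRange(frame_bgr, (0, 0, 200), (100, 100, 255))
    # Erode to eliminate single-pixel outliers.
    mask = cv2.erode(mask, np.ones((3, 3), np.uint8))
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] < MIN_PIXELS:            # too few pixels: no confident detection
        return None
    # Centroid of the dot from the first moments of the mask.
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    # Linear map from image position to turret angles based on the camera FOV.
    yaw = (cx / FRAME_W - 0.5) * CAM_FOV_YAW
    pitch = (0.5 - cy / FRAME_H) * CAM_FOV_PITCH
    return yaw, pitch
```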
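Finally, the link from the Python host to the Arduino is a serial command stream. The actual wire format is not documented in this writeup, so the ASCII commands below (“G” for gimbal angles, “T” for turret angles, “F” for fire-on-wink) and the port name are purely hypothetical.

```python
import serial  # pyserial

# Port name is an assumption for a Mac-connected Arduino Uno.
arduino = serial.Serial("/dev/tty.usbmodem1411", 115200, timeout=0.01)

def send_gimbal(yaw, pitch):
    """Point the head-mounted laser gimbal (angles in degrees)."""
    arduino.write(f"G,{yaw:.1f},{pitch:.1f}\n".encode())

def send_turret(yaw, pitch):
    """Slew the shoulder turret toward the detected laser dot."""
    arduino.write(f"T,{yaw:.1f},{pitch:.1f}\n".encode())

def fire():
    """Spin up the Nerf drive motor; called when the eye-tracker detects a wink."""
    arduino.write(b"F\n")
```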

Evaluation

This project demonstrated a pupil-tracking laser interface system and one application: controlling a man-portable, laser-tracking Nerf turret.

  • Technical Challenges: The primary difficulty in this project was in the software for calibrating and mapping pupil positions to laser angles, and in detecting laser dots for the turret tracking. Running both simultaneously proved problematic, likely due to limits on processing power and communication bandwidth. Mechanical slop in the turret gimbal made it impractical to use in real Nerf wars, especially in the shoulder-mounted configuration.

Speculation 

  • Future Improvements: With sufficient refinement, a system like Saccase could achieve laser targeting even more accurate than is possible with a hand-held laser pointer. Handheld laser pointers (as well as guns and anything else manually aimed with a laser beam) are prone to shake and instability, but the eyes can maintain a tight lock on a target even as the user moves the rest of their body wildly. A system that could keep up with eye movement would be as accurate as our gaze; this may be possible with piezoelectric mirror gimbals or similarly fast actuators.
  • Far Future Improvements: Perhaps the largest limitation of eye-based control interfaces is the lack of direct feedback. The system has no way of knowing whether the user is actually looking where it thinks they are looking, other than through a small sample of calibration points. A closed-loop feedback system could theoretically achieve much better accuracy, but it would require that the system know how far off target its aim is and in what direction, and that it receive this error update very frequently. It is not impossible to imagine a neural implant in the low-level visual cortex that could register the detection of a laser dot in the visual field and provide this feedback. (Although, with that level of neural interfacing, you might as well make the entire system directly brain-controlled.)
  • Societal Adoption: What would a society in which a high percentage of the population wears Saccase look like? What happens to individuals, their privacy, and their freedom when the attention of everyone around them is made glaringly obvious (pun intended), and their own most intimate glances are broadcast for all to see in real time? One could imagine a society with vastly different norms, rules, and customs regulating where one may look, as well as vastly different patterns in where groups look and why.