Of course, the promising future of haptics owes much to its history. In the next section, we'll
examine this history to understand that computer haptics falls on a continuum of haptics
research.
HIRO, a haptic interface robot, helps a user feel a dinosaur during the Prototype Robot
Exhibition at the 2005 World Exposition in Japan.
Junko Kimura/Getty Images
Then came the development of machines and robots. These mechanical devices also had to touch
and feel their environment, so researchers began to study how this sensation could be transferred
to machines. The era of machine haptics had begun. The earliest machines that allowed haptic
interaction with remote objects were simple lever-and-cable-actuated tongs placed at the end of a
pole. By moving, orienting and squeezing a pistol grip, a worker could remotely control tongs,
which could be used to grab, move and manipulate an object.
In the 1940s, these relatively crude remote manipulation systems were improved to serve the
nuclear and hazardous material industries. Through a machine interface, workers could
manipulate toxic and dangerous substances without risking exposure. Eventually, scientists
developed designs that replaced mechanical connections with motors and electronic signals. This
made it possible to communicate even subtle hand actions to a remote manipulator more
efficiently than ever before.
The next big advance arrived in the form of the electronic computer. At first, computers were
used to control machines in a real environment (think of the computer that controls a factory
robot in an auto assembly plant). But by the 1980s, computers could generate virtual
environments -- 3-D worlds into which users could be cast. In these early virtual environments,
users could receive stimuli through sight and sound only. Haptic interaction with simulated
objects would remain limited for many years.
Then, in 1993, the Artificial Intelligence Laboratory at the Massachusetts Institute of Technology
(MIT) constructed a device that delivered haptic stimulation, finally making it possible to touch
and feel a computer-generated object. The scientists working on the project began to describe
their area of research as computer haptics to differentiate it from machine and human haptics.
Today, computer haptics is defined as the systems required -- both hardware and software -- to
render the touch and feel of virtual objects. It is a rapidly growing field that is yielding a number
of promising haptic technologies.
Before we look at some of these technologies in greater detail, let's look at the types of touch
sensations a haptic system must provide to be successful.
Though many video gamers may not know haptic technology by name, they probably know what
Force Feedback is -- it's been marketed by name on game controllers for years.
2008 HowStuffWorks
of the ball as force is applied. Even the thermal properties of the ball are sensed through tactile
receptors.
Force feedback is a term often used to describe tactile and/or kinesthetic feedback. As our
baseball example illustrates, force feedback is vastly complex. Yet, if a person is to feel a virtual
object with any fidelity, force feedback is exactly the kind of information the person must
receive. Computer scientists began working on devices -- haptic interface devices -- that would
allow users to feel virtual objects via force feedback. Early attempts were not successful. But as
we'll see in the next section, a new generation of haptic interface devices is delivering an
unsurpassed level of performance, fidelity and ease of use.
The Omni, the entry-level device in the PHANTOM line from SensAble Technologies
Courtesy SensAble Technologies
Haptic Systems
There are several approaches to creating haptic systems. Although they may look drastically
different, they all have two important things in common -- software to determine the forces that
result when a user's virtual identity interacts with an object and a device through which those
forces can be applied to the user. The actual process used by the software to perform its
calculations is called haptic rendering. A common rendering method uses polyhedral models to
represent objects in the virtual world. These 3-D models can accurately portray a variety of
shapes and can calculate touch data by evaluating how force lines interact with the various faces
of the object. Such 3-D objects can be made to feel solid and can have surface texture.
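One common way to turn a polyhedral model into touch data is a penetration-depth spring model. The sketch below is an illustrative simplification (the function name, stiffness value, and plane representation are assumptions, not any particular product's rendering engine): each face of the model is treated as a plane, and when the contact point sinks past a face, the software pushes it back out along the face normal with a force proportional to how deep it went.

```python
# Minimal haptic-rendering sketch (assumed names and values): compute the
# restoring force when a single contact point penetrates one face of a
# polyhedral model.  Each face is a plane with unit normal n and offset d;
# points with dot(n, p) > d are outside the object.

def face_force(point, normal, offset, stiffness=800.0):
    """Spring-model force for one face: proportional to penetration depth."""
    depth = offset - sum(n * p for n, p in zip(normal, point))
    if depth <= 0.0:          # point is outside this face: no force
        return (0.0, 0.0, 0.0)
    # Push the point back out along the face normal.
    return tuple(stiffness * depth * n for n in normal)

# A point 5 mm below a floor plane (normal pointing up, offset 0) feels
# an upward force; a point above the plane feels nothing.
print(face_force((0.2, -0.005, 0.1), (0.0, 1.0, 0.0), 0.0))
print(face_force((0.2, 0.01, 0.1), (0.0, 1.0, 0.0), 0.0))
```

Real renderers evaluate many faces per update and run this loop hundreds or thousands of times per second, but the core idea is the same: geometry in, force out.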
The job of conveying haptic images to the user falls to the interface device. In many respects, the
interface device is analogous to a mouse, except a mouse is a passive device that cannot
communicate any synthesized haptic data to the user. Let's look at a few specific haptic systems
to understand how these devices work.
The PHANTOM interface from SensAble Technologies was one of the first haptic
systems to be sold commercially. Its success lies in its simplicity. Instead of trying to
display information from many different points, this haptic device simulates touching at a
single point of contact. It achieves this through a stylus connected to a lamp-like
arm. Three small motors give force feedback to the user by exerting pressure on the
stylus. So, a user can feel the elasticity of a virtual balloon or the solidity of a brick wall.
He or she can also feel texture, temperature and weight. The stylus can be customized so
that it closely resembles just about any object. For example, it can be fitted with a syringe
attachment to simulate what it feels like to pierce skin and muscle when giving a shot.
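The difference between a squishy balloon and a rigid brick wall comes down largely to how stiffly the device pushes back at that single contact point. The sketch below illustrates the idea with a virtual sphere and two stiffness values; the function, the sphere model, and the numbers are assumptions for illustration, not SensAble's actual force model.

```python
# Illustrative single-point contact against a virtual sphere: the stylus
# tip is pushed back out in proportion to how far it has sunk below the
# surface, scaled by a per-object stiffness (assumed values, in N/m).
import math

def sphere_force(tip, center, radius, stiffness):
    """Outward spring force on a stylus tip penetrating a sphere."""
    dx = [t - c for t, c in zip(tip, center)]
    dist = math.sqrt(sum(d * d for d in dx))
    depth = radius - dist
    if depth <= 0.0 or dist == 0.0:
        return (0.0, 0.0, 0.0)   # tip is outside (or exactly at center)
    return tuple(stiffness * depth * d / dist for d in dx)

BALLOON, BRICK = 50.0, 5000.0    # soft vs. stiff material (assumed)
tip = (0.0, 0.09, 0.0)           # 1 cm inside a 10 cm-radius object
print(sphere_force(tip, (0, 0, 0), 0.10, BALLOON))  # gentle push
print(sphere_force(tip, (0, 0, 0), 0.10, BRICK))    # firm push
```

Same geometry, same penetration, a hundred-fold difference in force: that is roughly what makes one object feel like rubber and the other like masonry.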
The CyberGrasp system, another commercially available haptic interface from Immersion
Corporation, takes a different approach. This device fits over the user's entire hand like an
exoskeleton and adds resistive force feedback to each finger. Five actuators produce the
forces, which are transmitted along tendons that connect the fingertips to the exoskeleton.
With the CyberGrasp system, users are able to feel the size and shape of virtual objects
that only exist in a computer-generated world. To make sure a user's fingers don't
penetrate or crush a virtual solid object, the actuators can be individually programmed to
match the object's physical properties.
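The per-finger programming can be pictured as a simple rule: each finger closes freely until it reaches the virtual surface, and any attempt to press past that point is met with a resistive force along that finger's tendon. The sketch below is a rough illustration of that rule, not Immersion's actual SDK; the function name, units, and stiffness are assumptions.

```python
# Rough sketch of per-finger grasp resistance (hypothetical API): each
# finger's closure (in meters of tendon travel) is compared with the
# closure at which it would touch the virtual object, and any overshoot
# becomes a resistive force on that finger's actuator.

def grasp_forces(closures, contact_closures, stiffness=40.0):
    """One resistive force per finger: zero until the finger reaches the
    virtual surface, then proportional to how far it pressed past it."""
    forces = []
    for closed, limit in zip(closures, contact_closures):
        overshoot = closed - limit
        forces.append(stiffness * overshoot if overshoot > 0 else 0.0)
    return forces

# Five fingers closing on a virtual ball: the first two have pressed
# 2 mm and 5 mm past its surface, the rest have not yet touched it.
print(grasp_forces([0.042, 0.045, 0.030, 0.020, 0.010], [0.040] * 5))
```

Because each finger is handled independently, the hand as a whole conforms to the object's shape: only the fingers actually in contact feel resistance.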
Researchers at Carnegie Mellon University are experimenting with a haptic interface that
does not rely on actuated linkage or cable devices. Instead, their interface uses a powerful
electromagnet to levitate a handle that looks a bit like a joystick. The user manipulates
the levitated tool handle to interact with computed environments. As she moves and
rotates the handle, she can feel the motion, shape, resistance and surface texture of
simulated objects. This is one of the big advantages of a levitation-based technology: It
reduces friction and other interference so the user experiences less distraction and
remains immersed in the virtual environment. It also allows constrained motion in six
degrees of freedom (compared to the entry-level PHANTOM interface, which only allows
for three active degrees of freedom). The one disadvantage of the magnetic levitation
haptic interface is its footprint. An entire cabinet is required to house the maglev device,
power supplies, amplifiers and control processors. The user handle protrudes from a bowl
embedded in the cabinet top.
As you can imagine, systems like the ones we've described here can be quite expensive. That means the
applications of the technology are still limited to certain industries and specialized types of
training. On the next page, we'll explore some of the applications of haptic technology.
Computer scientists in Greece are incorporating haptic technology into touchable maps for the
blind. To create a map, researchers shoot video of a real-world location, either an architectural
model of a building or a city block. Software evaluates the video frame by frame to determine
the shape and location of every object. The data results in a three-dimensional grid of force fields
for each structure. Using a haptic interface device, a blind person can feel these forces and, along
with audio cues, get a much better feel of a city's or building's layout.
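The "grid of force fields" idea can be pictured very simply: the scene is rasterized into cells marked occupied or free, and the force returned at any probe position pushes away from occupied cells, so a user dragging a haptic stylus feels buildings and walls as repulsion. The sketch below assumes that representation for illustration; it is not the Greek researchers' actual software.

```python
# Sketch of a force-field grid for a touchable map (assumed design): the
# map is a 2-D occupancy grid, and occupied cells adjacent to the probe's
# cell push the probe back toward free space.

def map_force(grid, x, y, cell=1.0, gain=10.0):
    """Repulsive force felt at probe position (x, y) over the grid."""
    i, j = int(y // cell), int(x // cell)
    fx = fy = 0.0
    for di, dj in ((0, 1), (0, -1), (1, 0), (-1, 0)):
        ni, nj = i + di, j + dj
        if 0 <= ni < len(grid) and 0 <= nj < len(grid[0]) and grid[ni][nj]:
            fx -= gain * dj      # push away from the occupied neighbor
            fy -= gain * di
    return fx, fy

# A wall of occupied cells to the right of the probe pushes it left.
town = [[0, 0, 1],
        [0, 0, 1],
        [0, 0, 1]]
print(map_force(town, 1.5, 1.5))   # -> (-10.0, 0.0)
```

A real system would smooth and interpolate these forces, but the principle is the same: spatial data becomes something a hand can trace.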
long before astronomers and planet scientists actually hold and manipulate a Martian rock
through an advanced haptics-enabled telerobot -- a high-touch version of the Mars Exploration
Rover.
On the next page, we'll take a look at how haptic technology has grown in importance and is
becoming essential in some applications.
Haptic Learning: The Next Generation of Hands-on
Teachers are often tasked these days with assessing their students' learning styles so they can
adapt their teaching methods accordingly. A learning style is how a person learns best. Although
there are many learning style models, a popular model is based on sensory input. In this model,
there are three basic learning styles: auditory, visual and kinesthetic. Most students learn best
through one of these three modes, although some are multi-modal, which means they have more
than one strong learning preference.
Research is showing that even auditory and visual learners benefit greatly from activities that
involve the sense of touch. In one study, middle and high school students developed more
positive attitudes about science and achieved a deeper understanding of key concepts when they
used haptic learning techniques. Based on this and similar studies, science teachers in particular
are attracted to haptics. Many are using the technology to help students interact with objects,
such as viruses or nanoparticles, which would otherwise be too small to be touched or seen.
Others are enabling their students to probe 3-D renderings of cells. And still others are using
haptic feedback devices to teach students about invisible forces like gravity and friction more
completely.