
How Haptic Technology Works

Inside this Article


1. Introduction to How Haptic Technology Works
2. The Haptics Continuum
3. Types of Haptic Feedback
4. Haptic Systems
5. Applications of Haptic Technology
6. The Importance of Haptic Technology

If you thought the Apple iPhone was amazing, then feast your eyes -- and fingers -- on this
phone from Samsung. Dubbed the Anycall Haptic, the phone features a large touch-screen
display just like the iPhone. But it does Apple's revolutionary gadget one better, at least for now:
It enables users to feel clicks, vibrations and other tactile input. In all, it provides the user with
22 kinds of touch sensations.
Those sensations explain the use of the term haptic in the name. Haptic is from the Greek
"haptesthai," meaning to touch. As an adjective, it means relating to or based on the sense of
touch. As a noun, usually used in a plural form (haptics), it means the science and physiology of
the sense of touch. Scientists have studied haptics for decades, and they know quite a bit about
the biology of touch. They know, for example, what kind of receptors are in the skin and how
nerves shuttle information back and forth between the central nervous system and the point of
contact.
Unfortunately, computer scientists have had great difficulty transferring this basic understanding
of touch into their virtual reality systems. Visual and auditory cues are easy to replicate in
computer-generated models, but tactile cues are more problematic. It is almost impossible to
enable a user to feel something happening in the computer's mind through a typical interface.
Sure, keyboards allow users to type in words, and joysticks and steering wheels can vibrate. But
how can a user touch what's inside the virtual world? How, for example, can a video game player
feel the hard, cold steel of his or her character's weapon? How can an astronaut, training in a
computer simulator, feel the weight and rough texture of a virtual moon rock?
Since the 1980s, computer scientists have been trying to answer these questions. Their field is a
specialized subset of haptics known as computer haptics. Over the next few pages, we'll cover
how haptic technology works by:

- placing computer haptics in context with related fields of haptics research
- characterizing the types of haptic feedback required for realistic virtual touching
- examining haptics systems in development or currently on the market
- exploring current and potential applications

Of course, the promising future of haptics owes much to its history. In the next section, we'll
examine this history to understand that computer haptics falls on a continuum of haptics
research.

HIRO, a haptic interface robot, helps a user feel a dinosaur during the Prototype Robot
Exhibition at the 2005 World Exposition in Japan.
Junko Kimura/Getty Images

The Haptics Continuum


As a field of study, haptics has closely paralleled the rise and evolution of automation. Before the
industrial revolution, scientists focused on how living things experienced touch. Biologists
learned that even simple organisms, such as jellyfish and worms, possessed sophisticated touch
responses. In the early part of the 20th century, psychologists and medical researchers actively
studied how humans experience touch. Appropriately, this branch of science became known
as human haptics, and it revealed that the human hand, the primary structure associated with the
sense of touch, was extraordinarily complex.
With 27 bones and 40 muscles, including muscles located in the forearm, the hand offers
tremendous dexterity. Scientists quantify this dexterity using a concept known as degrees of
freedom. A degree of freedom is movement afforded by a single joint. Because the human hand
contains 22 joints, it allows movement with 22 degrees of freedom. The skin covering the hand is
also rich with receptors and nerves, components of the nervous system that communicate touch
sensations to the brain and spinal cord.

Then came the development of machines and robots. These mechanical devices also had to touch
and feel their environment, so researchers began to study how this sensation could be transferred
to machines. The era of machine haptics had begun. The earliest machines that allowed haptic
interaction with remote objects were simple lever-and-cable-actuated tongs placed at the end of a
pole. By moving, orienting and squeezing a pistol grip, a worker could remotely control tongs,
which could be used to grab, move and manipulate an object.
In the 1940s, these relatively crude remote manipulation systems were improved to serve the
nuclear and hazardous material industries. Through a machine interface, workers could
manipulate toxic and dangerous substances without risking exposure. Eventually, scientists
developed designs that replaced mechanical connections with motors and electronic signals. This
made it possible to communicate even subtle hand actions to a remote manipulator more
efficiently than ever before.
The next big advance arrived in the form of the electronic computer. At first, computers were
used to control machines in a real environment (think of the computer that controls a factory
robot in an auto assembly plant). But by the 1980s, computers could generate virtual
environments -- 3-D worlds into which users could be cast. In these early virtual environments,
users could receive stimuli through sight and sound only. Haptic interaction with simulated
objects would remain limited for many years.
Then, in 1993, the Artificial Intelligence Laboratory at the Massachusetts Institute of Technology
(MIT) constructed a device that delivered haptic stimulation, finally making it possible to touch
and feel a computer-generated object. The scientists working on the project began to describe
their area of research as computer haptics to differentiate it from machine and human haptics.
Today, computer haptics is defined as the systems required -- both hardware and software -- to
render the touch and feel of virtual objects. It is a rapidly growing field that is yielding a number
of promising haptic technologies.
Before we look at some of these technologies in greater detail, let's look at the types of touch
sensations a haptic system must provide to be successful.

Though many video gamers may not know haptic technology by name, they probably know what
Force Feedback is -- it's been marketed by name on game controllers for years.
2008 HowStuffWorks

Types of Haptic Feedback


When we use our hands to explore the world around us, we receive two types of feedback -- kinesthetic and tactile. To understand the difference between the two, consider a hand that
reaches for, picks up and explores a baseball. As the hand reaches for the ball and adjusts its
shape to grasp, a unique set of data points describing joint angle, muscle length and tension is
generated. This information is collected by a specialized group of receptors embedded in
muscles, tendons and joints.
Known as proprioceptors, these receptors carry signals to the brain, where they are processed
by the somatosensory region of the cerebral cortex. The muscle spindle is one type of
proprioceptor that provides information about changes in muscle length. The Golgi tendon
organ is another type of proprioceptor that provides information about changes in muscle
tension. The brain processes this kinesthetic information to provide a sense of the baseball's
gross size and shape, as well as its position relative to the hand, arm and body.
When the fingers touch the ball, contact is made between the finger pads and the ball surface.
Each finger pad is a complex sensory structure containing receptors both in the skin and in the
underlying tissue. There are many types of these receptors, one for each type of stimulus: light
touch, heavy touch, pressure, vibration and pain. The data coming collectively from these
receptors helps the brain understand subtle tactile details about the ball. As the fingers explore,
they sense the smoother texture of the leather, the raised coarseness of the laces and the hardness of the ball as force is applied. Even the thermal properties of the ball are sensed through tactile receptors.

Force feedback is a term often used to describe tactile and/or kinesthetic feedback. As our
baseball example illustrates, force feedback is vastly complex. Yet, if a person is to feel a virtual
object with any fidelity, force feedback is exactly the kind of information the person must
receive. Computer scientists began working on devices -- haptic interface devices -- that would
allow users to feel virtual objects via force feedback. Early attempts were not successful. But as
we'll see in the next section, a new generation of haptic interface devices is delivering an
unsurpassed level of performance, fidelity and ease of use.

The Omni, the entry-level device in the PHANTOM line from SensAble Technologies
Courtesy SensAble Technologies

Haptic Systems
There are several approaches to creating haptic systems. Although they may look drastically
different, they all have two important things in common -- software to determine the forces that
result when a user's virtual identity interacts with an object and a device through which those
forces can be applied to the user. The actual process used by the software to perform its
calculations is called haptic rendering. A common rendering method uses polyhedral models to
represent objects in the virtual world. These 3-D models can accurately portray a variety of
shapes and can calculate touch data by evaluating how force lines interact with the various faces
of the object. Such 3-D objects can be made to feel solid and can have surface texture.

The job of conveying haptic images to the user falls to the interface device. In many respects, the
interface device is analogous to a mouse, except a mouse is a passive device that cannot
communicate any synthesized haptic data to the user. Let's look at a few specific haptic systems
to understand how these devices work.

The PHANTOM interface from SensAble Technologies was one of the first haptic
systems to be sold commercially. Its success lies in its simplicity. Instead of trying to
display information from many different points, this haptic device simulates touching at a
single point of contact. It achieves this through a stylus which is connected to a lamp-like
arm. Three small motors give force feedback to the user by exerting pressure on the
stylus. So, a user can feel the elasticity of a virtual balloon or the solidity of a brick wall.
He or she can also feel texture, temperature and weight. The stylus can be customized so
that it closely resembles just about any object. For example, it can be fitted with a syringe
attachment to simulate what it feels like to pierce skin and muscle when giving a shot.

The CyberGrasp system, another commercially available haptic interface from Immersion
Corporation, takes a different approach. This device fits over the user's entire hand like an
exoskeleton and adds resistive force feedback to each finger. Five actuators produce the
forces, which are transmitted along tendons that connect the fingertips to the exoskeleton.
With the CyberGrasp system, users are able to feel the size and shape of virtual objects
that only exist in a computer-generated world. To make sure a user's fingers don't
penetrate or crush a virtual solid object, the actuators can be individually programmed to
match the object's physical properties.

Researchers at Carnegie Mellon University are experimenting with a haptic interface that
does not rely on actuated linkage or cable devices. Instead, their interface uses a powerful
electromagnet to levitate a handle that looks a bit like a joystick. The user manipulates
the levitated tool handle to interact with computed environments. As she moves and
rotates the handle, she can feel the motion, shape, resistance and surface texture of
simulated objects. This is one of the big advantages of a levitation-based technology: It
reduces friction and other interference so the user experiences less distraction and
remains immersed in the virtual environment. It also allows constrained motion in six
degrees of freedom (compared to the entry-level PHANTOM interface, which only allows
for three active degrees of freedom). The one disadvantage of the magnetic levitation
haptic interface is its footprint. An entire cabinet is required to house the maglev device,
power supplies, amplifiers and control processors. The user handle protrudes from a bowl
embedded in the cabinet top.

As you can imagine, systems like the ones we've described here can be quite expensive. That means the
applications of the technology are still limited to certain industries and specialized types of
training. On the next page, we'll explore some of the applications of haptic technology.

Helping the Blind Feel a City

Computer scientists in Greece are incorporating haptic technology into touchable maps for the
blind. To create a map, researchers shoot video of a real-world location, either an architectural
model of a building or a city block. Software evaluates the video frame by frame to determine
the shape and location of every object. The data results in a three-dimensional grid of force fields
for each structure. Using a haptic interface device, a blind person can feel these forces and, along
with audio cues, get a much better feel for a city's or building's layout.
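The core idea can be sketched as a grid lookup: cells occupied by a structure push back on the haptic device, while open cells do not. This is a hedged illustration only -- the 2-D grid, its contents and the force value below are invented stand-ins for the full 3-D force-field grid the researchers build.

```python
# Illustrative sketch: a 2-D occupancy grid standing in for the 3-D
# force-field grid described above. 1 = a structure the interface resists;
# 0 = open space the user's probe can trace freely.
# The grid contents and force value are invented for illustration.

CITY_BLOCK = [
    [1, 1, 0, 1],
    [0, 0, 0, 1],
    [1, 0, 0, 0],
]

def resistance_at(col, row, grid, push_back=3.0):
    """Force (N) the haptic device exerts when the probe is over a cell."""
    return push_back if grid[row][col] else 0.0

print(resistance_at(2, 0, CITY_BLOCK))  # 0.0: an open street to trace
print(resistance_at(0, 0, CITY_BLOCK))  # 3.0: a building edge the user feels
```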

Applications of Haptic Technology


It's not difficult to think of ways to apply haptics. Video game makers have been early adopters
of passive haptics, which takes advantage of vibrating joysticks, controllers and steering wheels
to reinforce on-screen activity. But future video games will enable players to feel and manipulate
virtual solids, fluids, tools and avatars. The Novint Falcon haptics controller is already making
this promise a reality. The 3-D force feedback controller allows you to tell the difference between
a pistol report and a shotgun blast, or to feel the resistance of a longbow's string as you pull back
an arrow.
Graphical user interfaces, like those that define Windows and Mac operating environments, will
also benefit greatly from haptic interactions. Imagine being able to feel graphic buttons and
receive force feedback as you depress a button. Some touchscreen manufacturers are already
experimenting with this technology. Nokia phone designers have perfected a tactile touchscreen
that makes on-screen buttons behave as if they were real buttons. When a user presses the button, he or she feels movement in and movement out and hears an audible click. Nokia engineers
accomplished this by placing two small piezoelectric sensor pads under the screen and designing
the screen so it could move slightly when pressed. Everything -- movement and sound -- is
synchronized perfectly to simulate real button manipulation.
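The coordination pattern described above -- screen dips in and a click plays on touch-down, screen springs back on release -- can be sketched as follows. The `Piezo` and `Speaker` classes and the 0.1 mm displacement are hypothetical stand-ins for the hardware layer, not Nokia's actual API.

```python
# A hedged sketch of the synchronized press/release behavior described
# above. Piezo and Speaker are hypothetical stand-ins for the hardware;
# only the coordination pattern is the point.

class Piezo:
    def __init__(self):
        self.position = 0.0            # screen displacement in mm

    def displace(self, mm):
        self.position = mm             # move the screen surface slightly

class Speaker:
    def __init__(self):
        self.played = []

    def play(self, sound):
        self.played.append(sound)

def on_button_press(piezo, speaker):
    """Touch-down: the screen dips and the click plays at the same moment."""
    piezo.displace(-0.1)               # 'movement in' (0.1 mm is illustrative)
    speaker.play("click")

def on_button_release(piezo):
    """Touch-up: the screen springs back ('movement out')."""
    piezo.displace(0.0)

piezo, speaker = Piezo(), Speaker()
on_button_press(piezo, speaker)
on_button_release(piezo)
print(speaker.played)  # ['click']
```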
Although several companies are joining Novint and Nokia in the push to incorporate haptic
interfaces into mainstream products, cost is still an obstacle. The most sophisticated touch
technology is found in industrial, military and medical applications. Training with haptics is
becoming more and more common. For example, medical students can now perfect delicate
surgical techniques on the computer, feeling what it's like to suture blood vessels in an
anastomosis or inject BOTOX into the muscle tissue of a virtual face. Aircraft mechanics can
work with complex parts and service procedures, touching everything that they see on the
computer screen. And soldiers can prepare for battle in a variety of ways, from learning how to
defuse a bomb to operating a helicopter, tank or fighter jet in virtual combat scenarios.
Haptic technology is also widely used in teleoperation, or telerobotics. In a telerobotic system, a
human operator controls the movements of a robot that is located some distance away. Some
teleoperated robots are limited to very simple tasks, such as aiming a camera and sending back
visual images. In a more sophisticated form of teleoperation known as telepresence, the human
operator has a sense of being located in the robot's environment. Haptics now makes it possible
to include touch cues in addition to audio and visual cues in telepresence models. It won't be long before astronomers and planetary scientists actually hold and manipulate a Martian rock
through an advanced haptics-enabled telerobot -- a high-touch version of the Mars Exploration
Rover.
On the next page, we'll take a look at how haptic technology has gained in its importance and is
becoming essential in some applications.
Haptic Learning: The Next Generation of Hands-on

Teachers are often tasked these days with assessing their students' learning styles so they can
adapt their teaching methods accordingly. A learning style is how a person learns best. Although
there are many learning style models, a popular model is based on sensory input. In this model,
there are three basic learning styles: auditory, visual and kinesthetic. Most students learn best
through one of these three modes, although some are multi-modal, which means they have more
than one strong learning preference.
Research is showing that even auditory and visual learners benefit greatly from activities that
involve the sense of touch. In one study, middle and high school students developed more
positive attitudes about science and achieved a deeper understanding of key concepts when they
used haptic learning techniques. Based on this and similar studies, science teachers in particular
are attracted to haptics. Many are using the technology to help students interact with objects,
such as viruses or nanoparticles, which would otherwise be too small to be touched or seen.
Others are enabling their students to probe 3-D renderings of cells. And still others are using
haptic feedback devices to teach students about invisible forces like gravity and friction more
completely.

The Importance of Haptic Technology


In video games, the addition of haptic capabilities is nice to have. It increases the reality of the
game and, as a result, the user's satisfaction. But in training and other applications, haptic
interfaces are vital. That's because the sense of touch conveys rich and detailed information
about an object. When it's combined with other senses, especially sight, touch dramatically
increases the amount of information that is sent to the brain for processing. The increase in
information reduces user error, as well as the time it takes to complete a task. It also reduces the
energy consumption and the magnitudes of contact forces used in a teleoperation situation.
Clearly, Samsung is hoping to capitalize on some of these benefits with the introduction of the
Anycall Haptic phone. Nokia will push the envelope even further when it introduces phones with
tactile touchscreens. Yes, such phones will be cool to look at. And, yes, they will be cool to
touch. But they will also be easier to use, with the touch-based features leading to fewer input
errors and an overall more satisfying experience.
