
TECHNOZION-XI

Topic of paper:

Haptic Interfacing

Name of College: MVGR

Participants:
1) M. Girish Chandra, Branch: CSE, Year: 3rd, Mobile No.: 9494421298, Mail ID: girishmgc@gmail.com
2) R. Sasimithara, Branch: CSE, Year: 3rd, Mobile No.: 9494704425, Mail ID: sasimitrar@gmail.com

Category: Artificial intelligence techniques for electrical applications

Introduction:
Haptics is poised for rapid growth. We need to develop smart devices to interface with information-rich real and virtual worlds. Given the ever-increasing types and quantities of information that surround us, and to which we need to respond rapidly, there is a critical need to explore new ways to interact with information. To be efficient in this interaction, it is essential that we utilize all of our sensorimotor capabilities. Our haptic system (its tactile, kinesthetic, and motor capabilities together with the associated cognitive processes) presents a uniquely bi-directional information channel to our brains. For example, if we add force and distributed tactile feedback of sufficient range, resolution, and frequency bandwidth to match that of our hands and other body parts, a large number of applications open up, such as haptic aids for a blind user surfing the net or a surgical trainee perfecting his trade.

The combination of high-performance, force-controllable haptic interfaces, computational geometric modeling and collision techniques, cost-effective processing and memory, and an understanding of the perceptual needs of the human haptic system allows us to assemble computer haptic systems that can display objects of sophisticated complexity and behavior. With the commercial availability of 3 degree-of-freedom haptic interfaces, software toolkits from several corporate and academic sources, and several commercial haptics-enabled applications, the field is experiencing rapid and exciting growth. Ongoing revolutions in information technology and the miniaturization of sensors and actuators are bringing this dream ever closer to reality.

The human hand is a versatile organ that is able to explore object properties such as surface texture, shape, and softness. Real or virtual environments that deprive the human user of the touch and feel of objects seem deficient and seriously handicap human interaction capabilities. It is likely that a more immersive experience in a virtual environment can be achieved by the synchronous operation of even a simple haptic interface with a visual and auditory display.

Haptics can be subdivided into three areas:
a. Human haptics: the study of human sensing and manipulation through touch.
b. Machine haptics: the design, construction, and use of machines to replace or augment human touch.
c. Computer haptics: the algorithms and software associated with generating and rendering the touch and feel of virtual objects (analogous to computer graphics).

Human Haptics :
Tactual sensory information conveyed to the brain from a hand in contact with an object can be divided into two classes:
a. Tactile information refers to the sense of the nature of contact with the object, mediated by the responses of low-threshold mechanoreceptors innervating the skin (say, the finger pad) within and around the contact region.
b. Kinesthetic information refers to the sense of position and motion of the limbs along with the associated forces, conveyed by sensory receptors in the skin around the joints, joint capsules, tendons, and muscles, together with neural signals derived from motor commands.

Only tactile information is conveyed when objects are made to contact a passive, stationary hand, and only kinesthetic information is conveyed during active, free motion of the hand (i.e., with no contact with any object or other region of skin). Apart from these two extreme cases, it is clear that all sensory and manipulatory tasks performed actively with the normal hand involve both classes of information.

Computer Haptics :
Typically, a haptic rendering algorithm is made up of two parts: a. Collision detection. b. Collision response.
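As an illustration only (no code appears in the original paper), these two parts can be sketched for the simplest possible case of a point avatar touching a virtual sphere; the sphere centre, radius, and stiffness values below are hypothetical.

```python
import numpy as np

# Hypothetical virtual object: a sphere with a simple penalty-based response.
SPHERE_CENTER = np.array([0.0, 0.0, 0.0])   # metres (assumed)
SPHERE_RADIUS = 0.05                        # 5 cm (assumed)
STIFFNESS = 800.0                           # N/m (assumed)

def collision_detection(avatar_pos):
    """Return penetration depth (m) and contact normal, or (0, None) if no contact."""
    offset = avatar_pos - SPHERE_CENTER
    dist = np.linalg.norm(offset)
    if dist >= SPHERE_RADIUS:
        return 0.0, None                    # avatar is outside the sphere
    normal = offset / dist if dist > 1e-9 else np.array([0.0, 0.0, 1.0])
    return SPHERE_RADIUS - dist, normal

def collision_response(depth, normal):
    """Penalty (spring) force pushing the avatar back out along the contact normal."""
    if normal is None:
        return np.zeros(3)
    return STIFFNESS * depth * normal

# Example: avatar slightly inside the sphere surface.
depth, normal = collision_detection(np.array([0.0, 0.0, 0.045]))
print(collision_response(depth, normal))
```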

Machine Haptics :
Haptic interfaces can be viewed as having two basic functions. The first is to measure the positions and contact forces (and their time derivatives) of the user's hand (and/or other body parts), and the second is to display contact forces and positions (and/or their spatial and temporal distributions) to the user; a minimal sketch of these two functions follows the list below. The desirable features of force-reflecting haptic interfaces are as follows:
a. Low back-drive inertia and friction, and no constraints on motion imposed by the device kinematics, so that free motion feels free.
b. The range, resolution, and bandwidth, both in terms of position sensing and force reflection, should match those of the human for the tasks for which the haptic interface is employed.
c. Ergonomics and comfort: making the human user comfortable when wearing or manipulating a haptic interface is of paramount importance, since pain, or even discomfort, supersedes all other sensations.
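The two basic functions can be pictured as a minimal device abstraction; the class and method names below are invented for illustration and do not correspond to any particular commercial device API.

```python
from abc import ABC, abstractmethod
from typing import Tuple

class HapticInterface(ABC):
    """Illustrative abstraction of a force-reflecting haptic interface (hypothetical API)."""

    @abstractmethod
    def read_pose(self) -> Tuple[float, float, float]:
        """Function 1: measure the position (and, in a fuller model, the forces and
        their time derivatives) of the user's hand via the device sensors."""

    @abstractmethod
    def command_force(self, fx: float, fy: float, fz: float) -> None:
        """Function 2: display contact forces to the user via the device actuators."""
```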

As the user manipulates the generic probe of the haptic device, the new position and orientation of the probe is sensed by the sensors in the device. If no collision between the simulated avatar of the probe and the virtual objects is detected, the haptic interface device remains passive. If the probe is detected to have collided with an object, the mechanistic model calculates the reaction force based on the penetration depth of the probe into the virtual object. The calculated force vectors may then be modified by appropriately mapping them over the object surface to take into account the surface details. The modified force vectors are then fed back to the user through the haptic device. This sequence of collision detection and response is termed a haptic loop, which needs to run continuously at around 1000 times a second; otherwise, virtual surfaces feel softer or, at worst, instead of feeling a surface, the user feels as if the haptic device is just vibrating in his/her hand.
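A minimal sketch of such a haptic loop, assuming a flat virtual wall at z = 0 and a simple linear-spring response, is shown below; the device read/write functions are placeholders and the 1 kHz timing is only approximated with a sleep.

```python
import time
import numpy as np

WALL_Z = 0.0          # hypothetical virtual wall at z = 0 (m)
STIFFNESS = 500.0     # hypothetical contact stiffness (N/m)
LOOP_RATE_HZ = 1000   # the ~1000 updates per second mentioned in the text

def read_probe_position():
    """Placeholder for the device position sensors."""
    return np.array([0.0, 0.0, -0.002])   # 2 mm below the wall, for illustration

def send_force(force):
    """Placeholder for the device actuators."""
    pass

def haptic_loop(iterations=5):
    period = 1.0 / LOOP_RATE_HZ
    for _ in range(iterations):
        pos = read_probe_position()                 # 1. sense probe position
        penetration = WALL_Z - pos[2]               # 2. collision detection
        if penetration > 0.0:                       # 3. collision response
            force = np.array([0.0, 0.0, STIFFNESS * penetration])
        else:
            force = np.zeros(3)                     #    free motion: no force
        send_force(force)                           # 4. feed the force back to the user
        time.sleep(period)                          # keep roughly 1000 updates per second

haptic_loop()
```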

Haptic devices create a closed loop between the user and the haptic rendering/simulation algorithms. x(t) and F(t) are the continuous-time position and force signals exchanged between the user and the haptic device; x(k) and F(k) are the discrete-time position and force signals exchanged between the haptic device and the virtual environment.

Touching Real and Virtual Objects :

When a human user touches a real object directly or through a tool, forces are imposed on the user's skin. The associated sensory information, mediated by sensors in the skin, joints, tendons, and muscles, is conveyed to the brain by the nervous system and leads to haptic perception. The subsequent motor commands issued by the brain activate the muscles. This sensorimotor loop continues to operate during both exploration and manipulation of objects. In order to create the sensation of touching virtual objects, we need to generate the reaction force that the objects would apply to the skin. Touching a real object through a tool is mimicked by the use of a force-reflecting haptic interface device. When the human user manipulates the end-effector of the haptic interface device, the position sensors on the device convey its tip position to the computer. The models of objects in the computer calculate, in real time, the torque commands to the actuators on the haptic interface, so that appropriate reaction forces are applied to the user, who thereby feels the virtual object as if it were real.
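The paper does not spell out how the desired tip force is converted into actuator torque commands; one common approach (an assumption here, not stated in the source) is the Jacobian-transpose mapping tau = J^T F, sketched below for a made-up two-joint planar linkage.

```python
import numpy as np

# Hypothetical 2-link planar arm used only to illustrate tau = J^T * F.
L1, L2 = 0.2, 0.15   # link lengths in metres (assumed)

def jacobian(q1, q2):
    """Geometric Jacobian of the end-effector position w.r.t. the two joint angles."""
    j11 = -L1 * np.sin(q1) - L2 * np.sin(q1 + q2)
    j12 = -L2 * np.sin(q1 + q2)
    j21 =  L1 * np.cos(q1) + L2 * np.cos(q1 + q2)
    j22 =  L2 * np.cos(q1 + q2)
    return np.array([[j11, j12], [j21, j22]])

def joint_torques(q1, q2, desired_force_xy):
    """Map a desired tip force (N) to joint torques (N*m) via the Jacobian transpose."""
    return jacobian(q1, q2).T @ np.asarray(desired_force_xy)

# Example: push back on the user's hand with 1 N along +x at a given configuration.
print(joint_torques(0.4, 0.6, [1.0, 0.0]))
```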

Haptic interfaces are devices that enable manual interactions with virtual environments or teleoperated remote systems. In general, they receive motor action commands from the human user and display appropriate tactual images to the user. Success has been achieved in simulating 3D object-object interactions where one of the objects is viewed as a collection of points, and the haptic display of the shapes, textures, and friction of rigid and deformable objects has been achieved. Haptic rendering of the dynamics of rigid objects, and to a lesser extent the linear dynamics of deformable objects, has also been accomplished. Methods for recording and playing back haptic stimuli, as well as algorithms for haptic interactions between multiple users in shared virtual environments, are beginning to emerge.

We split haptic rendering into three main blocks. Collision-detection algorithms provide information about contacts S occurring between an avatar at position X and objects in the virtual environment. Force-response algorithms return the ideal interaction force Fd between the avatar and the virtual objects. Control algorithms return a force Fr to the user, approximating the ideal interaction force to the best of the device's capabilities.
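A sketch of how these three blocks might be composed is given below; the cap on the commanded force, used here to stand in for "the best of the device's capabilities", is a hypothetical device limit, and the single flat plane is an assumed environment.

```python
import numpy as np

MAX_DEVICE_FORCE = 3.0   # hypothetical peak force (N) the device can render

def collision_detection(x):
    """Return the set S of contacts for avatar position x (here: one flat plane at z = 0)."""
    depth = -x[2]
    return [{"depth": depth, "normal": np.array([0.0, 0.0, 1.0])}] if depth > 0 else []

def force_response(contacts, stiffness=700.0):
    """Ideal interaction force Fd from the contact set (simple penalty model)."""
    fd = np.zeros(3)
    for c in contacts:
        fd += stiffness * c["depth"] * c["normal"]
    return fd

def control(fd):
    """Force Fr actually sent to the device: Fd clipped to the device's capability."""
    magnitude = np.linalg.norm(fd)
    if magnitude <= MAX_DEVICE_FORCE or magnitude == 0.0:
        return fd
    return fd * (MAX_DEVICE_FORCE / magnitude)

x = np.array([0.0, 0.0, -0.01])                   # avatar 1 cm below the plane
print(control(force_response(collision_detection(x))))
```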

Brain Machine Interfaces :


Real-time control of a robot using signals from about 100 neurons in the motor cortex of a monkey was demonstrated at Duke. It was shown that this could be done not only with a robot within Duke, but also across the Internet with a robot in our lab. This work opens a whole new paradigm for studying sensorimotor functions in the central nervous system. In addition, a future application is the possibility of implanted brain-machine interfaces that let paralyzed patients control external devices such as smart prostheses, similar to pacemakers or cochlear implants.

Haptic interfaces :

Phantom : The Phantom is a device which enables tactile, or rather haptic, interaction with a computer. Using haptic technology it is also possible to extend the range of touch from the length of an arm to a virtually unlimited distance. The Phantom is a small robot which acts as a haptic interface between a human and a computer, adding a new dimension to human-computer interaction: haptic interaction. Haptic interaction uses both the sense of touch on a small scale and movement on a slightly larger scale. In the Phantom setup, a robot is connected to a computer; the special thing is that both movement and the sense of touch are used for interaction between the human and the computer. At the same time, one can use movement to give commands and to get feedback from the program. When activated, the Phantom works together with the computer to interpret the user's finger position in three-dimensional space and apply an appropriate and variable resisting force. This process is completed 1000 times per second. When the Phantom is extended to meet the needs of disabled persons, it becomes a complete system; it also includes a lot of ideas and thoughts about what can be done for people with special needs using this hardware and software.

The power of haptics : For the blind :

Today, computers are important tools for blind people, but they are mostly used as text machines. With a haptic computer interface, a blind person can play haptic computer games, learn mathematics by tracing touchable mathematical curves, and gain better access to graphical user interfaces like Windows. Three concept studies in the area of haptic Windows access are presented: haptics combined with synthetic speech and Braille for general Windows access; small haptic devices; and tools to be used as aids for searching disordered virtual objects such as icons on the desktop. Windows is designed to be explored: when you see it you will understand it. The drawback is that Windows makes the computer harder to use for a blind person. The structure of the computer system is represented by pictures, and if you cannot see the pictures it is very hard to grasp this underlying structure; it becomes hard to use the computer at all. Despite this fact, many blind users prefer Windows to older computer systems. Most blind computer users use a screen reader that gives them access to text on the screen via either synthetic speech or a Braille display. With a haptic interface, it is possible to access at least some of the graphical interface elements, and that in turn will enable users with visual disabilities to benefit more from their computer system.

IT Potentials :

The potential of Information Technology and its applications has already given people with different disabilities many new opportunities. Haptic user interfaces have an important IT potential that will probably become better known and more widely used in the near future. Another potential area to be explored is haptic robot control. Robots that help physically disabled people are still an important part of this work; the control must be done with precision and accuracy, and without taking a long time. It is important to enable a user to move a robotic hand freely in order to reach areas that he cannot reach on his own, and the Phantom can be a great help here. It gives a natural conversion between the movements of the hand and the movements of the robot, and it also gives feedback so that the user can feel what is happening. Force feedback is essential for good robot control in these cases. An extra feature of using virtual touch for the feedback is that it becomes possible to use personalized settings to control the size of the movements and the force exerted on the finger.
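A sketch of such personalized scaling, with made-up gain values, might look like this: the operator's hand motion is scaled before being sent to the robot, and the measured contact force is scaled before being reflected back to the finger.

```python
from dataclasses import dataclass

@dataclass
class UserSettings:
    """Hypothetical per-user preferences for teleoperated robot control."""
    motion_scale: float = 2.0   # 1 cm of hand motion -> 2 cm of robot motion
    force_scale: float = 0.5    # robot contact force is halved before being felt

def hand_to_robot(hand_displacement_m, settings):
    """Scale the operator's hand displacement into a robot displacement command."""
    return [settings.motion_scale * d for d in hand_displacement_m]

def robot_to_finger(robot_force_n, settings):
    """Scale the force measured at the robot before reflecting it to the finger."""
    return [settings.force_scale * f for f in robot_force_n]

settings = UserSettings(motion_scale=3.0, force_scale=0.25)  # e.g. a user with limited reach
print(hand_to_robot([0.01, 0.0, 0.0], settings))
print(robot_to_finger([2.0, 0.0, 0.0], settings))
```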

Surgical Simulators and medical training :

A primary application area for haptics has been in surgical simulation and medical training. The multimodal virtual environment system developed can be used in building virtual reality surgical simulators that enable a medical trainee to see, touch, and manipulate realistic models of biological tissues and organs. The haptic interface creates a virtual environment in which the trainee is presented with a real-time situation. It can also be applied in manipulating micro and macro robots for minimally invasive surgery, and in remote diagnosis for telemedicine.

Military Applications :

Haptics can also be used in aerospace and military training and simulations. For certain applications, where terrain or texture information needs to be conveyed, haptics may be the most efficient communication channel. Haptics is an alternative modality to sound and vision that can be exploited to provide situation information, commands, and threat warnings, and it can function as a supplemental information source to sound or vision. Users can be alerted haptically to interesting portions of a military simulation, learning quickly and intuitively about objects, their motion, and how people may interact with them in reality. One application has been a simulation of a distributed environment in which workers at remote locations collaborate in reconfiguring a single vehicle chassis with different weapons components, using instrumented force-feedback gloves to manipulate the three-dimensional components. The Naval Aerospace Medical Research Laboratory has developed a Tactile Situation Awareness System for providing accurate orientation information in land, sea, and aerospace environments. One application of the system is to alleviate problems related to the spatial disorientation that occurs when a pilot incorrectly perceives the attitude, altitude, or motion of his aircraft; some of this error may be attributable to momentary distraction, reduced visibility, or an increased workload. Because the system (a vibrotactile transducer) can be attached to a portable sensor, it can also be used in such applications as extravehicular space exploration.

Medical stitching and robotic surgery:


Delicate stitching or puncturing of interior structures can be done with great accuracy using robot-assisted procedures. The task is completed best, and in the least time, when physical force feedback is combined with augmented visual feedback. Robotics can increase the precision and accuracy of the surgeon's fine motor performance by stabilizing tremor, or by scaling down the motions made by the surgeon's hands. This robotic technology, with its promise of extending human capabilities and enhancing performance, will enable more difficult procedures to be performed.
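As a purely illustrative sketch (the paper names no specific algorithm), tremor stabilization is often approximated by low-pass filtering the hand motion, and motion scaling by multiplying displacements with a factor smaller than one; the filter coefficient and scale factor below are assumed values.

```python
import math

def stabilize_and_scale(hand_positions, alpha=0.1, scale=0.2):
    """Suppress high-frequency tremor with a first-order low-pass filter (exponential
    smoothing, coefficient alpha) and scale the motion down by 'scale' for the tool."""
    filtered = []
    prev = hand_positions[0]
    for p in hand_positions:
        prev = alpha * p + (1.0 - alpha) * prev   # low-pass: damp fast tremor
        filtered.append(prev)
    start = filtered[0]
    # Scale displacements about the starting point so 1 mm of hand motion
    # becomes 0.2 mm of instrument motion.
    return [start + scale * (p - start) for p in filtered]

# Example: a 10 mm reach with a superimposed 1 mm, 8 Hz tremor (synthetic data).
hand = [10.0 * i / 99 + 1.0 * math.sin(2 * math.pi * 8 * i / 100) for i in range(100)]
print(stabilize_and_scale(hand)[-1])
```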

Conclusion :
Haptics moves beyond the buzzes and thumps of today's video games: the technology will enable increasingly believable and complex physical interaction with virtual or remote objects, for example letting doctors train for simple procedures without endangering patients. It has opened up a new dimension in the field of science and technology, and it is making humans believe in the unbelievable. This technology, teaming up with Artificial Intelligence, nanotechnology, and MEMS, can bring about a revolution in which the winner will be mankind. It is now only a matter of time before physically handicapped people get back hands, limbs, and vision. It is not unreasonable to expect that future advancements in haptics will have equally deep effects. Though the field is still in its infancy, hints of vast, unexplored intellectual and commercial territory add excitement and energy to a growing number of conferences, courses, product releases, and invention efforts. For the field to move beyond today's state of the art, however, new business models are needed: device- and software-tool-oriented corporate efforts have provided the tools we need to step out of the laboratory. For example, can we create haptic content and authoring tools that will make the technology broadly attractive? Can the interface devices be made practical and inexpensive enough to make them widely accessible, even to the common man?

The bright future :


Smart Homes :
Haptics has made us believe that we can mobilize and control objects in our house according to our thoughts. Researchers at MIT Labs concluded that a monkey sitting nearby and reacting in response to another monkey's behavior reacted in the same way when the other monkey was absent as it would have in the other's presence. This was made possible only by capturing and sending the same electrical signals through the monkey's brain. This small experiment points the way to our vision: in our homes, we could have our thought signals act on machines and make them work according to our thoughts.

For Physically Disabled :


A modular mobility aid system based on haptics technology can be attached to most commercial electric wheelchairs to increase the independence and safety of the wheelchair user. The system provides fall avoidance, obstacle avoidance, and beaconless navigation functions. Map information can be combined with environmental perception in order to give the wheelchair user a better awareness of his/her whereabouts. This navigation system can also be applied to other mobile robot applications.


Such a wheelchair system can operate in three modes:
Autonomous Control: The vehicle navigates solely with the sensory information of the five cameras. The user is not able to interfere except for initiating an emergency stop.
Assisted Navigation: Primarily, the wheelchair is driven by the user. His actions are supervised by the computer to prevent accidents due to unintentional misuse.
Manual Operation.
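A minimal sketch of the assisted-navigation idea, with invented thresholds and a generic list of distance readings standing in for the cameras and other sensors, could look like this:

```python
def supervise_command(user_speed, distances_m, stop_dist=0.3, slow_dist=1.0):
    """Limit the user's forward speed command based on the nearest obstacle distance,
    so unintentional misuse cannot drive the chair into an obstacle (illustrative only)."""
    nearest = min(distances_m)
    if nearest < stop_dist:
        return 0.0                                   # emergency stop
    if nearest < slow_dist:
        # Scale the speed down linearly between the slow and stop distances.
        return user_speed * (nearest - stop_dist) / (slow_dist - stop_dist)
    return user_speed                                # path is clear: obey the user

print(supervise_command(0.8, [2.5, 0.6, 1.9]))       # slowed down near an obstacle
```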
