
EYEGAZE SYSTEMS

1. INTRODUCTION
The Eyegaze System is a communication and control system for people with complex physical disabilities. You run the system with your eyes. By looking at control keys displayed on a screen, a person can synthesize speech, control his environment (lights, appliances, etc.), type, operate a telephone, run computer software, operate a computer mouse, and access the Internet and e-mail. Eyegaze Systems are being used to write books, attend school and enhance the quality of life of people with disabilities all over the world. Imagine yourself as an intelligent, motivated, working person in the fiercely competitive information technology market, with just one problem: you cannot use your hands, or you cannot speak. How do you do your job? How do you stay employed? You can, thanks to a remarkable gift from the computer industry: the Eyegaze, a communication and control system you run with your eyes.

The Eyegaze System is a direct-select, vision-controlled communication and control system. It was developed in Fairfax, Virginia, by LC Technologies, Inc. An Eyegaze Desktop system is shown in Fig 1.

Fig 1 Eyegaze System

2. WHO IS USING THE EYEGAZE SYSTEM?

This system is mainly developed for those who lack the use of their hands or voice. The only requirements for operating the Eyegaze System are control of at least one eye with good vision and the ability to keep the head fairly still. Eyegaze Systems are in use around the world. Its users are adults and children with cerebral palsy, spinal cord injuries, brain injuries, ALS, multiple sclerosis, brainstem strokes, muscular dystrophy, and Werdnig-Hoffman syndrome. Eyegaze Systems are being used in homes, offices, schools, hospitals, and long-term care facilities.

2.1 THE SKILLS NEEDED BY THE USER:


2.1.1 GOOD CONTROL OF ONE EYE: The user must be able to look up, down, left and right. He must be able to fix his gaze on all areas of a 15-inch screen that is about 24 inches in front of his face. He must be able to focus on one spot for at least 1/2 second. Several common eye movement problems may interfere with Eyegaze use. These include:

Nystagmus (constant, involuntary movement of the eyeball):


The user may not be able to fix his gaze long enough to make eyegaze selections.

Alternating strabismus (eyes cannot be directed to the same object, either one deviates):
The Eyegaze System is constantly tracking the same single eye. If, for example, a user with alternating strabismus is operating the Eyegaze System with the right eye, and that eye begins to deviate, the left eye will take over and focus on the screen. The Eyegaze camera, however, will continue to take pictures of the right eye, and the System will not be able to determine where the user's left eye is focused.

When the left eye deviates and the right eye is again fixed on the screen, the Eyegaze System will resume predicting the gazepoint. Putting a partial eye patch over the nasal side of the eye not being observed by the camera often solves this tracking problem. Since only the unpatched eye can see the screen, it will continuously focus on the screen. By applying only a nasal-side patch to the other eye, the user will retain peripheral vision on that side.

2.1.2 ADEQUATE VISION: Several common vision problems may affect a user's ability to see text clearly on the Eyegaze monitor. These include the following:

INADEQUATE VISUAL ACUITY: The user must be able to see text on the screen clearly. If, prior to his injury or the onset of his illness, he wore glasses, he may need corrective lenses to operate the Eyegaze System. If he is over 40 years old and has not had his vision checked recently, he might need reading glasses in order to see the screen clearly. In most cases, eyetracking works well with glasses. The calibration procedure accommodates the refractive properties of most lenses. Hard-line bifocals can be a problem if the lens boundary splits the image of the pupil, making it difficult for the system's image processing software to determine the pupil center accurately. Graded bifocals, however, typically do not interfere with eyetracking. Soft contact lenses that cover all or most of the cornea generally work well with the Eyegaze System; the corneal reflection is obtained from the contact lens surface rather than the cornea itself. Small, hard contacts can interfere if the lens moves around considerably on the cornea and causes the corneal reflection to move across the discontinuity between the contact lens and the cornea.

BLURRED VISION: Blurred vision is associated with some brain injuries and is also a side effect of some medications. A blurred image of the screen decreases the accuracy of eye fixations.


DIPLOPIA (DOUBLE VISION):

Diplopia may be the result of an injury to the brain, or a side effect of many commonly prescribed medications, and may make it difficult for the user to fix his gaze on a given point. Partially patching the eye not being tracked may alleviate double vision during Eyegaze System operation.

CATARACTS (CLOUDING OF THE LENS OF THE EYE): If a cataract has formed on the portion of the lens that covers the pupil, it may prevent light from passing through the pupil to reflect off the retina. Without a good retinal reflection the Eyegaze System cannot accurately predict the user's eye fixations. The clouded lens may also make it difficult for a user to see text on the screen clearly. Surgical removal of the cataracts will normally solve the problem and make Eyegaze use possible.

HOMONYMOUS HEMIANOPSIA (blindness or defective vision in the right or left halves of the visual fields of both eyes): This may make calibration almost impossible if the user cannot see calibration points on one side of the screen.

2.2 ABILITY TO MAINTAIN A POSITION IN FRONT OF THE EYEGAZE MONITOR:


It is generally easiest to run the System from an upright, seated position, with the head centered in front of the Eyegaze monitor. However, the Eyegaze System can be operated from a semi-reclined position if necessary.

Continuous, uncontrolled head movement can make Eyegaze operation difficult, since the Eyegaze System must relocate the eye each time the user moves away from the camera's field of view and then returns. Even though the System's eye search is completed in just a second or two, it will be more tiring for a user with constant head movement to operate the System.


ABSENCE OF MEDICATION SIDE EFFECTS THAT AFFECT EYEGAZE OPERATION:

Many commonly prescribed medications have potential side effects that can make it difficult to operate Eyegaze. Anticonvulsants (seizure drugs) can cause nystagmus, blurred vision, diplopia, dizziness, drowsiness, headache and confusion. Some antidepressants can cause blurred vision and mydriasis (abnormally dilated pupil). Baclofen, a drug commonly used to decrease muscle spasms, can cause dizziness, drowsiness, headache, disorientation, blurred vision and mydriasis. Mydriasis can be severe enough to block eyetracking: if the retinal reflection is extremely bright, and the corneal reflection is sitting on top of a big, bright pupil, the corneal reflection may be indistinguishable and therefore unreadable by the computer.

2.3 MENTAL ABILITIES THAT IMPROVE THE PROBABILITY FOR SUCCESSFUL EYEGAZE USE:
2.3.1 ABILITY TO READ: At present, the Eyegaze System is configured for users who are literate; the System is text-based. A young child with average intelligence may not be reading yet, but probably has the capability to learn to read at an average age. He may be able to recognize words, and may be moving his eyes in a left-to-right pattern in preparation for reading. As an interim solution many teachers and parents stick pictures directly onto the screen. When the child looks at the picture he activates the Eyegaze key that is located directly underneath it.

2.3.2 MEMORY: Memory deficits are a particular concern in considering the Eyegaze System for someone with a brain injury. A user who cannot remember from one day to the next how to operate the system may find it too difficult to use effectively.


3. EYEGAZE SYSTEM WORKING


3.1 WORKING:
As a user sits in front of the Eyegaze monitor, a specialized video camera mounted below the monitor observes one of the user's eyes. Sophisticated image-processing software in the Eyegaze System's computer continually analyzes the video image of the eye and determines where the user is looking on the screen. Nothing is attached to the user's head or body. The working principle can be clearly understood from the Eyegaze System architecture shown in Fig 2.

Fig 2 Eyegaze System Architecture

In detail, the procedure can be described as follows. The Eyegaze System uses the pupil-center/corneal-reflection method to determine where the user is looking on the screen. An infrared-sensitive video camera, mounted beneath the System's monitor, takes 60 pictures per second of the user's eye. A low power, infrared light emitting diode (LED), mounted in the center of the camera's lens, illuminates the eye. The LED reflects a small bit of light off the surface of the eye's cornea. The light also shines through the pupil and reflects off the retina, the back surface of the eye, causing the pupil to appear white. This bright-pupil effect enhances the camera's image of the pupil and makes it easier for the image processing functions to locate the center of the pupil. The computer calculates the person's gazepoint, i.e., the coordinates of where he is looking on the screen, based on the relative positions of the pupil center and corneal reflection within the video image of the eye. Typically the Eyegaze System predicts the gazepoint with an average accuracy of a quarter inch or better.
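The mapping from the pupil-to-glint vector to a screen coordinate can be pictured with a simple sketch. The following Python fragment is an illustration only, assuming a linear mapping whose coefficients come from calibration; the function name, coefficient layout and sample numbers are hypothetical and are not LC Technologies' actual algorithm.

```python
# Illustrative sketch only: a simplified pupil-center/corneal-reflection gaze
# estimate. The linear model and coefficient names are assumptions.

def estimate_gazepoint(pupil_center, corneal_reflection, coeffs):
    """Map the pupil-to-glint vector (in camera pixels) to screen coordinates."""
    dx = pupil_center[0] - corneal_reflection[0]
    dy = pupil_center[1] - corneal_reflection[1]
    ax0, ax1, ax2, ay0, ay1, ay2 = coeffs          # learned during calibration
    screen_x = ax0 + ax1 * dx + ax2 * dy            # horizontal gaze coordinate
    screen_y = ay0 + ay1 * dx + ay2 * dy            # vertical gaze coordinate
    return screen_x, screen_y

# Example: pupil at (312, 240), glint at (318, 252), hypothetical coefficients
print(estimate_gazepoint((312, 240), (318, 252),
                         (512, -38.0, 2.0, 384, 1.5, -35.0)))
```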


Prior to operating the eyetracking applications, the Eyegaze System must learn several physiological properties of a user's eye in order to be able to project his gazepoint accurately. The system learns these properties by performing a calibration procedure. The user calibrates the system by fixing his gaze on a small yellow circle displayed on the screen, and following it as it moves around the screen. The calibration procedure usually takes about 15 seconds, and the user does not need to recalibrate if he moves away from the Eyegaze System and returns later.
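For illustration, such a calibration can be thought of as fitting the mapping coefficients used in the previous sketch from the pupil-to-glint offsets recorded while the user follows the calibration target. The least-squares fit below is an assumed, simplified model, not the vendor's actual procedure.

```python
# Illustrative sketch only: fitting linear gaze-mapping coefficients from
# calibration samples by least squares (numpy assumed available).
import numpy as np

def fit_calibration(glint_vectors, screen_points):
    """glint_vectors: N x 2 pupil-to-glint offsets; screen_points: N x 2 known targets."""
    d = np.asarray(glint_vectors, dtype=float)
    s = np.asarray(screen_points, dtype=float)
    A = np.column_stack([np.ones(len(d)), d[:, 0], d[:, 1]])   # regressors [1, dx, dy]
    cx, *_ = np.linalg.lstsq(A, s[:, 0], rcond=None)            # x-mapping coefficients
    cy, *_ = np.linalg.lstsq(A, s[:, 1], rcond=None)            # y-mapping coefficients
    return np.concatenate([cx, cy])                              # (ax0, ax1, ax2, ay0, ay1, ay2)
```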

3.2 EYE TRACKER/EYEGAZE COMPONENTS:


An eye tracker consists of several parts; a general overview of these is provided in Figure 3. A video-based eye tracker obtains its information from one or more cameras (Image Data). The first step of an eye tracker is to find the initial eye position (Detection component) in the images. The position is used for initializing the Eye Tracking component, which, in turn, aims at following the eye over time. Based on information obtained from the eye region and possibly head pose, the Gaze Estimation component will then determine where the user is looking. This information is then used in the gaze-based application.

The objective is to review all possible feature combinations and to evaluate the ability of the resulting models to estimate gaze.
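As a rough illustration of how these components fit together, the following Python sketch shows the Detection, Eye Tracking and Gaze Estimation stages as a minimal per-frame loop. The class and method names are hypothetical placeholders, not part of any particular eye tracker.

```python
# Illustrative sketch only: the eye-tracker pipeline of Fig 3 as a minimal loop.

class EyeTracker:
    def __init__(self, detector, tracker, gaze_estimator):
        self.detector = detector                  # finds the initial eye position
        self.tracker = tracker                    # follows the eye from frame to frame
        self.gaze_estimator = gaze_estimator      # maps eye features to a gazepoint
        self.eye_region = None

    def process_frame(self, image):
        if self.eye_region is None:
            self.eye_region = self.detector.detect(image)                   # Detection
        else:
            self.eye_region = self.tracker.update(image, self.eye_region)   # Eye Tracking
        if self.eye_region is None:
            return None                           # eye lost; re-detect on the next frame
        return self.gaze_estimator.estimate(self.eye_region)                # Gaze Estimation
```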

Fig 3: Eye Tracker Components

Eye tracker software components were analysed in deliverable D5.2 Report on new approaches to Eye Tracking (Daynys et al. 2006). The focus of the current deliverable is mainly on the hardware components of the system. The hardware components must ensure good quality image data with required features such as the eye pupil and the glints produced by infrared light sources. The main hardware component of an eye tracker is the camera.


4. HOW TO RUN THE EYEGAZE SYSTEM?

A user operates the Eyegaze System by looking at rectangular keys that are displayed on the control screen. To "press" an Eyegaze key, the user looks at the key for a specified period of time. The gaze duration required to visually activate a key, typically a fraction of a second, is adjustable. An array of menu keys and exit keys allows the user to navigate around the Eyegaze programs independently.
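This dwell-based "look to press" behaviour can be sketched as a small state machine. The fragment below is a simplified illustration; the 0.5-second threshold, class name and hit-testing input are assumptions, not the product's implementation.

```python
# Illustrative sketch only: dwell-time ("look to press") key activation.

DWELL_TIME = 0.5   # seconds of continuous gaze required to activate a key (adjustable)

class DwellSelector:
    def __init__(self, dwell_time=DWELL_TIME):
        self.dwell_time = dwell_time
        self.current_key = None
        self.gaze_start = None

    def update(self, key_under_gaze, timestamp):
        """Call once per gaze sample; returns a key when it has been 'pressed'."""
        if key_under_gaze != self.current_key:
            self.current_key = key_under_gaze     # gaze moved to a different key
            self.gaze_start = timestamp
            return None
        if key_under_gaze is not None and timestamp - self.gaze_start >= self.dwell_time:
            self.gaze_start = timestamp           # re-arm so the key is not repeated at once
            return key_under_gaze                 # the key is activated
        return None
```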

1. The Edge Analysis System uses the Pupil-Center/Corneal-Reflection method to determine the eye's gaze direction.
2. A video camera located below the computer screen remotely and unobtrusively observes the subject's eye.
3. No attachments to the head are required.
4. A small, low power, infrared light emitting diode (LED) located at the center of the camera lens illuminates the eye. The LED generates the corneal reflection and causes the bright pupil effect, which enhances the camera's image of the pupil.


5. USES OF EYEGAZE
Every year more than 100,000 people are diagnosed with motor neurone diseases. Typically, even when all other ways of communicating are either severely damaged or completely lost, the eyes still function. Communication by Gaze Interaction (COGAIN) is a Network of Excellence designed specifically to help people with these disabilities to communicate more effectively with eye gaze. At the COGAIN stand you can see how this technology is used by a person who relies on it.

Current eye tracking equipment allows users to generate text on a computer by using eye gaze. Users are able to select letters and numbers by looking at a keyboard on a screen with their eyes, and can construct words and sentences that can be spoken aloud by the system. Using these systems both empowers and enables people with disabilities as they can now communicate without the need for an assistant or helper, giving the users greater freedom in their lives.

Eye tracking systems that allow text entry by eye gaze have been in existence for about two decades, but the technology is still only available to a small portion of the potential user population. Obstacles for more wide-spread use currently include: the high cost of eye tracking equipment, the limitation that gaze communication applications may only work with a particular dedicated eye tracking device, and finally that eye tracking devices are often hard to use and require experts to operate them.

5.1 The Basic Eyegaze Can:


1. ADJUST TO A NEW USER in about 15 seconds. (Calibration)
2. TYPE with one of four keyboards, then print or speak. (Typewriter)
3. TURN pages on the computer screen by looking at "up" or "down". (Read Text)
4. PLAY games, two "Paddle" games, plus Solitaire and Slot Machine.
5. TEACH new users with simplified screens. (Teach Screens)


5.2 With Options the Eyegaze Can:


1. BE AT TWO SITES!! The portable computer has a handle to hand-carry between two sites. Two sets of other components and cables allow access to the Eyegaze System at school, work or home. Dimensions approximately 9" x 5" x 17", weight approximately 16 lbs. (Transportable Computer)

2. BE A KEYBOARD to a second computer to run any keyboard-controlled software, by means of the T-TAM connector. (Second Computer Mode)

3. SPEAK 100 "canned phrases" through a speech synthesizer, with a single glance of the eye. Phrases can be changed by the caregiver or user. (Phrases)

4. CONTROL appliances anywhere in the home or office from one Eyegaze screen. No special wiring. (Lights and Appliances)

5. DIAL and answer a speaker phone from one screen. The "Phone Book" stores 16 frequently used numbers. (Telephone)


6. MENUS OF EYEGAZE SYSTEM


THE MAIN MENU:
The Main Menu appears on the screen as soon as the user completes a 15-second calibration procedure. The Main Menu shown in Fig 4 presents a list of available Eyegaze programs. The user calls up a desired program by looking at the Eyegaze key next to his program choice.

Fig 4 Main Menu Screen

MAIN MENU OPTIONS:


6.1 THE PHRASE PROGRAM: The Phrases program, shown in Fig 5 along with the speech synthesizer, provides quick communications for non-verbal users. Looking at a key causes a preprogrammed message to be spoken. The Phrases program stores up to 126 messages, which can be composed and easily changed to suit the user.

Fig 5 Phrases Screen

6.2 TYPEWRITER PROGRAM: Simple word processing can be done using the Typewriter Program. The user types by looking at keys on visual keyboards. Four keyboard configurations, simple to complex, are available. Typed text appears on the screen above the keyboard display.

Fig 6 Alpha Keyboard

The user may "speak" or print what he has typed. He may also store typed text in a file to be retrieved at a later time. The retrieved text may be verbalized, edited or printed. The Alpha Keyboard used for typing is shown in Fig 6.

6.3 THE TELEPHONE PROGRAM: The telephone program allows the user to place and receive calls. Frequently used numbers are stored in a telephone "book". Non-verbal users may access the speech synthesizer to talk on the phone. The telephone control screen is shown in Fig 7.

Fig 7 Telephone Control Screen

6.4 RUN SECOND PC:


The Run Second PC program permits the Eyegaze Communication System to act as a peripheral keyboard and mouse interface to a Windows computer. The user can run any off-the-shelf software he chooses on the second computer. He can access the Internet, and send e-mail by looking at keyboard and mouse control screens on the Eyegaze monitor.


The programs being run are displayed on the second computer's monitor. Typed text appears simultaneously on the Eyegaze and second PC's screens. For children, two new Eyegaze programs have been added to the Eyegaze System; both run with the Second PC option. Eye Switch is a big, basic on-screen switch to run "cause & effect" software programs on a Second PC. Simple Mouse is an easy mouse control program that provides simplified access to educational software on a Second PC.

Fig 8 and Fig 9 show the keyboard and mouse control screens used in the Eyegaze System, respectively.

Fig 8 Frequency Keyboard

Fig 9 Mouse Control Screen


6.5 PADDLE GAMES & SCORE FOUR: These are visually controlled games.

6.6 READ TEXT PROGRAM: The Read Text Program allows the user to select text for display and to "turn pages" with his eyes. Any ASCII format text can be loaded for the user to access. Books on floppy disk are available from Services for the Blind.

6.7 TELEVISION: Television programs can be displayed directly on the desktop Eyegaze System screen. On-screen volume and channel controls provide independent operation. (Not available on the Portable Eyegaze System.)

6.8 THE LIGHTS & APPLIANCES PROGRAM: The Lights & Appliances Program, which includes computer-controlled switching equipment, provides Eyegaze control of lights and appliances anywhere in the home or office. No special house wiring is necessary. The user turns appliances on and off by looking at a bank of switches displayed on the screen. The Lights and Appliances control screen is shown in Fig 10.

Fig 10 Lights And Appliances Screen

7. FOR PEOPLE WITH LIMITED EYE CONTROL


The Scanning Keyboard is a row/column keyboard with an on-screen eye "switch" for people with limited eye movement. The switch can be placed on either side, above, or below the keyboard to accommodate users with only horizontal movement, or only vertical movement. The user may "speak" what he has typed.


8. ENVIRONMENT REQUIRED FOR AN EYEGAZE SYSTEM


The environment required for an Eyegaze System is as follows:
1. Because eyetracking is done using infrared light, the Eyegaze System must take account of light sources in the room in order to ensure the best accuracy.

2. The Eyegaze System must be operated in an environment with limited ambient infrared light.

3. Common sources of infrared light are sunlight and incandescent light bulbs. The System makes its predictions based on the assumption that the only source of infrared light shining on the user's eye is coming from the center of the camera.

4. Therefore, stray sources of infrared may degrade the accuracy or prevent Eyegaze operation altogether. The System works best away from windows, and in a room lit with fluorescent or mercury-vapor lights, which are low in infrared.


9. NEW PORTABLE EYEGAZE SYSTEM

The Portable Eyegaze System, shown in Fig 10 and Fig 11, can be mounted on a wheelchair and run from a 12-volt battery or a wall outlet. It weighs only 6 lbs (2.7 kg) and its dimensions are 2.5" x 8" x 9" (6.5 cm x 20 cm x 23 cm). The Portable Eyegaze System comes with a flat screen monitor and a table mount for its monitor. The monitor can be lifted off the table mount and slipped into a wheelchair mount.

Fig 10 Portable Eyegaze System Mounted On Wheelchair

Fig 11 Monitor Of Portable Eyegaze System


10. EYEGAZE SYSTEM PERFORMANCE SPECIFICATIONS

10.1 SPECIFICATIONS:

10.1.1 Accuracy


Eyegaze Measurement                                            Angular Gaze Orientation   Spatial Gaze Point (head 20" / 51 cm from camera)
Typical Average Bias Error* (over the monitor screen range)    0.45 degree                0.15 inch (0.38 cm)
Maximum Average Bias Error* (over the monitor screen range)    0.70 degree                0.25 inch (0.63 cm)
Frame-to-frame variation+ (1-sigma, eye fixed on a point)      0.18 degree                0.06 inch (0.15 cm)

Table 1 Accuracy Of The System

* Bias errors result from inaccuracies in the measurement of head range, asymmetries of the pupil opening about the eye's optic axis, and astigmatism. They are constant from frame to frame and cannot be reduced by averaging or smoothing.
+ Frame-to-frame variations result from image brightness noise and pixel position quantization in the camera image and may be reduced by averaging or smoothing.

10.1.2 Speed:
Sampling Rate: 60 Hertz


10.1.3 Angular Gaze-track Range:


Gaze Cone Diameter: 80 degrees

As the eye's gaze axis rotates away from the camera, the corneal reflection moves away from the center of the cornea. Accurate gaze angle calculation ceases when the corneal reflection "falls off" the edge of the cornea. The eye's gaze axis may range up to 40 degrees away from the camera, depending on the arc of the person's cornea. The lower 15 degrees of the gaze cone, however, is generally clipped due to the upper eyelid blocking the corneal reflection when the eye is looking down below the camera.

10.1.4 Tolerance To Head Motion (Fixed-camera Eyegaze Systems)


Lateral Range:       1.5 inch (3.8 cm)
Vertical Range:      1.2 inch (3.0 cm)
Longitudinal Range:  1.5 inch (3.8 cm)

Table 2 Tolerance To Head Motion

In fixed-camera Edge Analysis Systems, the eye must remain within the field of view of the camera. However, if the subject moves away from the camera's field of view, eyetracking will resume once he returns to a position where his eye is again visible to the camera.

10.1.5 COMPUTER USAGE:
Memory Consumption: 6 MB
CPU Time Consumption: 30-50%

10.1.6 Light Emitting Diode:
Wave Length: 880 nanometers (near infrared)
Beam Width: 20 degrees, between half power points
Radiated Power: 20 milliwatts, radiated over the 20 degree beam width
Safety Factor: 5 -- at a range of 15 inches the LED illumination on the eye is 20% of the HEW maximum permissible exposure.

Table 3 Specifications Of LED


10.2 OPERATIONAL REQUIREMENTS:


Low Ambient Infrared Light: There must be low levels of ambient infrared light falling on the subject's eye. Stray IR sources obscure the lighting from the Edge Analysis System's light emitting diode and degrade the image of the eye. The sun and incandescent lamps emit high levels of infrared light. The environment may be brightly illuminated with lights such as fluorescent or mercury-vapor lamps, which do not emit in the infrared region of the spectrum. The Edge Analysis System also works well in the dark.

Eye Visibility: The camera(s) must have a clear view of the subject's eye(s). If either the pupil or the corneal reflection is occluded, there may be insufficient image information to make an accurate gaze measurement. The camera's view of the eye can be obstructed by, for example, 1) an object between the camera and the eye, 2) the person's nose or cheek if his head is rotated too much with respect to the camera, or 3) excessive squinting. Alternative Edge Analysis software is included to accommodate an obstructed image of the top of the pupil, usually caused by a droopy eyelid or an unusually large pupil. The software returns a "false" condition for the EyeFound flag whenever an adequate image of the eye is not present.

Glasses and Contact Lenses: In most cases, eyetracking works with glasses and contact lenses. The calibration procedure accommodates the refractive properties of the lenses. Glasses must not be tilted significantly downward, or the reflection of the LED off the surface of the glasses will be reflected back into the camera and obscure the image of the eye. The lens boundary in hard-line bifocal or trifocal glasses often splits the camera's image of the eye, and the discontinuity in the image invalidates the image measurements. Soft contact lenses that cover all or most of the cornea generally work well with the Edge Analysis System; the corneal reflection is obtained from the contact lens surface rather than the cornea itself. Small, hard contacts can cause problems, however, if the lenses move around considerably on the cornea and the corneal reflection moves across the discontinuity between the contact lens and the cornea.


10.3 CAMERAS FOR EYE TRACKING/EYEGAZE


A camera is a key hardware component of an eye tracker; other technical solutions are mainly influenced by the choice of camera. The camera's technical properties have a considerable impact on the performance and stability of the employed algorithms. Furthermore, the influence of several external conditions, e.g. lighting conditions, can be minimized by a proper choice. The aim of the current section is to share good practice and provide criteria for camera selection.

10.3.1 CLASSES OF CAMERAS


The majority of cameras on the market can be classified into five classes:
1. Machine vision cameras;
2. CCTV (Closed Circuit Television) cameras;
3. Webcams;
4. Camcorders;
5. Digital still cameras.

1. Machine vision cameras: Machine vision cameras are developed to transfer video data to a computer for further analysis by machine vision algorithms. They come in a wide variety of frame rates, resolutions, connectivity options, image sensor types and spectral responses. Functionally, machine vision cameras are the most suitable for eye tracking, as they transfer uncompressed digital data. A disadvantage is the high price of such cameras.

2. CCTV (closed circuit television) cameras: CCTV cameras are mainly used in surveillance systems. They deliver an analogue video signal in the NTSC, PAL or SECAM standard. The analogue signal can be transferred over long cables to monitors or recording devices. Because the target is visual output, CCTV cameras deliver interlaced frames at a frame rate of 30 or 25 fps (frames per second). Most CCTV cameras operate in low light conditions and have sensitive sensors; some cameras are optimised for the near infrared range. The last two features are attractive for eye tracking. Prices are also lower than for machine vision cameras.


3. Webcams: Webcams were introduced relatively recently. Their purpose is to deliver video information over the Internet in real time. Most web cameras have a frame rate of 30 fps. To reduce the data transfer volume, they deliver compressed video data. Webcams have built-in optical lenses and colour image sensors. Infrared light is an artefact for webcams; usually they have filters to remove infrared illumination. Functionally, webcams are not the best choice for eye tracking. Attractive features include the low price and digital output over USB connectivity, which is available on most PCs.

4. Camcorders: Camcorders are portable devices for recording video images on an internal storage device. Nowadays camcorders are digital, and some of them can be used as webcams. Their advantage over webcams is a better optical system with the possibility of optical zooming. Sony camcorders also have a Night Vision function, in which a built-in infrared light source is used to light the scene. Most camcorders also have FireWire (i-Link for Sony camcorders) connectivity, which has advantages over USB. Though the prices for camcorders are significantly higher than for webcams, camcorders are preferable to webcams for eye tracking because of the above-mentioned features. A disadvantage, as for most other cameras, is the low frame rate of NTSC or PAL video.

5. Digital still cameras: Digital still cameras are devices used to capture and store photographs in a digital format. More advanced cameras also have a video function. The benefit of still cameras is their higher resolution combined with high quality optics. However, this benefit comes with a disadvantage: a lower frame rate than camcorders.

10.4 IMAGE SENSOR:


Every camera that can be connected to a computer in real time has the following elements:
1. Optical system;
2. Image sensor;
3. Interface circuit.


The optical system projects the scene onto the image sensor. The function of the image sensor is to convert the optical image information into an electrical signal. The interface circuit ensures delivery of the electrical signal from the camera to a computer.

Many technical features of a camera are significantly influenced by its image sensor. The features are: image resolution, windowing, scan type, frame rate, shuttering, responsiveness, spectral response, dynamic range.

Until recently, charge-coupled devices (CCDs) were the only solid state image sensors used in digital cameras. In 1993, NASA's Jet Propulsion Laboratory succeeded with a new type of sensor, the CMOS image sensor (http://ntrs.nasa.gov). Both image sensors are pixelated metal oxide semiconductors. They accumulate the signal charge in each pixel, proportional to the local illumination intensity, serving a spatial sampling function. When the exposure is complete, a CCD transfers each pixel's charge packet sequentially to a common output structure, which converts the charge to a voltage, buffers it and sends it off-chip. In a CMOS imager, the charge-to-voltage conversion takes place in each pixel. Such pixels are called active pixels.

Image resolution corresponds to the size of the image sensor matrix in pixels. Windowing is the capability to read out a portion of the data from the image sensor. There are two methods for scanning the pixel matrix: progressive scan and interlaced scan. During a progressive scan all pixels are read in order. In an interlaced scan only every second line of the pixel matrix is read; in that case, one frame consists of two fields, one with the odd lines and a second with the even lines. CMOS sensors allow windowing: it is possible to read only a selected region of interest (ROI) from the image sensor. Windowing thus reduces the amount of data that must be transferred from the image sensor, and as a consequence the frame rate can be increased. Shuttering controls the exposure time of the pixels, during which the conversion of light photons into charge in the pixels occurs. A longer shutter time ensures that more light reaches the sensor under the same illumination conditions. However, long shutter times are problematic if the target moves.
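The frame-rate benefit of windowing can be illustrated with a rough, assumed proportional model (real sensors also have fixed per-frame overheads); the Python sketch below is illustrative only.

```python
# Illustrative sketch only: estimating the frame rate gained by reading out
# a region of interest, assuming read-out time scales with pixel count.

def windowed_frame_rate(full_width, full_height, roi_width, roi_height, full_fps):
    full_pixels = full_width * full_height
    roi_pixels = roi_width * roi_height
    return full_fps * full_pixels / roi_pixels

# Example: a 640x480 sensor at 30 fps, windowed to a 320x120 eye region
print(windowed_frame_rate(640, 480, 320, 120, 30))   # roughly 240 fps in this model
```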


During a saccade the speed of the eye is high. In that case, long exposures blur the contour of the moving object in the direction of movement. Hence, the exposure time must be optimised together with the frame rate and the illumination of the eye.

Responsiveness defines the relation between the incident light energy and the pixel output. CMOS imagers are marginally superior to CCDs in general, because gain elements are easier to place on a CMOS image sensor. Their complementary transistors allow low-power high-gain amplifiers, whereas CCD amplification usually comes at a significant power penalty. Some CCD manufacturers are challenging this conception with new readout amplifier techniques.

Dynamic range is the ratio of a pixel's saturation level to its signal threshold. It gives CCDs an advantage by about a factor of two in comparable circumstances (Dalsa, 2007). CCDs still benefit from significant noise advantages over CMOS imagers because of quieter sensor substrates (less on-chip circuitry), inherent tolerance to bus capacitance variations and common output amplifiers with transistor geometries that can be easily adapted for minimal noise. Externally coddling the image sensor through cooling, better optics, more resolution or adapted off-chip electronics still cannot make CMOS sensors equivalent to CCDs in this regard. The dynamic range can be expressed in decibels or as an effective number of bits; this parameter determines the number of output bits per pixel, usually 8, 10, or 12.

An important image sensor parameter for selecting the optical system is the sensor's format, because the lens must produce an image of approximately the same size. The format is evaluated from the sensor width. There are sensor formats of 1", 2/3", 1/2", 1/3", 1/4", and 1/6". Their geometrical size is given in Table 4.

Table 4: Geometrical Size Of Image Sensors
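As a side note on the dynamic range figures above, the conversion between a saturation-to-threshold ratio, decibels and an approximate effective number of bits (about 6.02 dB per bit) can be sketched as follows; the numeric values are hypothetical.

```python
# Illustrative sketch only: dynamic range in decibels and effective bits.
import math

def dynamic_range_db(saturation_level, noise_threshold):
    return 20 * math.log10(saturation_level / noise_threshold)

def effective_bits(dr_db):
    return dr_db / 6.02          # ~6.02 dB per bit

dr = dynamic_range_db(40000, 10)     # hypothetical saturation and noise levels
print(round(dr, 1), "dB ->", round(effective_bits(dr), 1), "bits")
```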


10.5 RESOLUTION AND FRAME RATE


The quality of a digital image depends in part on the number of pixels used to create the image. The maximum number that one can capture depends on how many pixels there are on the image sensor used to capture the image.

Resolution can be of two kinds: optical and interpolated. The optical resolution of a camera is an absolute number because an image sensor's pixels are physical entities that can be counted. To improve resolution in certain limited respects, the resolution can be increased using software. This process, called interpolated resolution, adds pixels to the image.

To do so, the software evaluates the pixels surrounding each new pixel to determine what its colours should be. What is important to keep in mind is that interpolated resolution does not add any new information to the image; it just adds pixels and makes the file larger. Interpolation is often used for coloured images. An example is given in Figure 12 (Micron 2007). The following algorithm is used:
Have blue, need green and red: G = average of 4 neighbouring greens; R = average of 4 neighbouring reds.
Have green, need blue and red: B = average of 2 neighbouring blues; R = average of 2 neighbouring reds.
Have red, need green and blue: G = average of 4 neighbouring greens; B = average of 4 neighbouring blues.

Fig 12. Interpolation Of Pixels (Micron, 2007)
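The neighbour-averaging rules above can be sketched in a few lines of Python; the array layout, the helper function and the toy frame are illustrative assumptions, not any particular sensor vendor's pipeline.

```python
# Illustrative sketch only: bilinear Bayer-style interpolation of one missing
# colour at one pixel, following the neighbour-averaging rules quoted above.
import numpy as np

def interpolate_missing(raw, y, x, offsets):
    """Average the raw values at the given neighbour offsets around (y, x)."""
    h, w = raw.shape
    vals = [raw[y + dy, x + dx] for dy, dx in offsets
            if 0 <= y + dy < h and 0 <= x + dx < w]
    return sum(vals) / len(vals)

# At a blue pixel: green from the 4 edge neighbours, red from the 4 diagonals
GREEN_AT_BLUE = [(-1, 0), (1, 0), (0, -1), (0, 1)]
RED_AT_BLUE = [(-1, -1), (-1, 1), (1, -1), (1, 1)]

raw = np.random.randint(0, 256, (6, 6))      # toy single-channel mosaic frame
g = interpolate_missing(raw, 2, 2, GREEN_AT_BLUE)
r = interpolate_missing(raw, 2, 2, RED_AT_BLUE)
print(g, r)
```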

More pixels add detail and sharpen edges. If any digital image is enlarged enough, the pixels begin to show, an effect called pixelization. The more pixels there are in an image, the more it can be enlarged before pixelization occurs.

10.6 RECOMMENDATIONS FOR CAMERA SELECTION:


Natural light as well as infrared light approaches have been tested by COGAIN partners. While the former still poses problems to be solved, the latter (IR) gave promising results. At partner UNI KO-LD, different cameras were tested for the corneal reflection approach:
1. A FireWire camera from UniBrain, Inc. (Unibrain 2007) named Fire-I, offering 640x480 pixels in colour at up to 30 fps, at a price of 109.
2. A low cost USB 2 based PC camera (webcam) named SN9C201 from an unknown manufacturer, providing 640x480 at 25 fps for approx. 15.

3. A small high sensitivity camera with a conventional analogue video output, named 2005XA, from RFConcepts, Ltd. (RF-Concepts, 2007), at approx. 90.

While it is almost impossible to get detailed technical information concerning the low cost webcam, the other cameras are suitably documented. The Fire-I camera employs a Sony ICX098BQ sensor chip. The device complies with the IIDC-1394 Digital Camera protocol, V1.04 (DCAM).

The RF-Concepts 2005XA camera was mainly chosen for its ability to work under weak illumination conditions. It is based on the Sony CXD2463R controller and an ICX059CL sensor, which employs Sony's Exview HAD CCD technology. The camera produces images with a resolution of up to 768x576 pixels at a frame rate of 25 fps. Due to its good image quality and its sensitivity to infrared light, the camera had already been used in an earlier project. The standard lens with 3.6 mm focal length was replaced with an 8 mm lens. For this camera, an illumination of only 0.003 lux is sufficient to produce images, while usual cameras need at least 1-2 lux. This sensitivity is of major importance when using infrared light only.


Alternatively, a Pan-Tilt-Zoom (PTZ) camera might be considered; however, such cameras need considerably more space, are more expensive and require active control of the positional parameters by implementing appropriate head tracking, and therefore were not chosen for these tests. Furthermore, the available models do not meet the sensitivity requirements, which turned out to be crucial.

As tracking in visible light lacks a geometric reference for estimating the gaze direction, it has to be accompanied by geometrically accurate head pose estimation. Such approaches remain under investigation because they do not require an additional infrared light source and thus meet the intention of setting up a system from COTS hardware. In parallel, an infrared-based system was set up using a minimum of extra hardware; this, however, requires a camera with sufficiently high IR sensitivity.

10.7 EYEGAZE SYSTEM FEATURES:


1. Calibration: Fully automated calibration is fast (15 seconds), easy and long lasting.
2. Accuracy: Highly accurate and tolerant of many variations, such as pupil drift and head range variation; typical average bias error 0.45 degrees. Good tolerance of ambient infrared light and of high-speed head motion.
3. Remote Video Tracking: Unobtrusive and non-distracting, with nothing attached to the subject.
4. Accommodates Most Human Eye Variations: Usable by 90% of the people who try it; functional with most glasses and contacts.
5. Binocular Eye Tracking: Tracks both of the subject's eyes to increase gaze point accuracy by adjusting for head rotation.
6. Trace Suite: This suite of programs captures a subject's gaze when presented with static or moving images or during computer operations. In addition to generating the standard data of the Analysis System, the user's gaze and the stimulus are linked so they can be replayed and studied in depth. The observer can view the subject's screen, eye images and gaze point in real time on a separate monitor.


7. Eyegaze Software Development Tool Kit: EgWin API - Eyegaze Function Library (DLL and import library), Gaze Tracking Demonstration Program (executable and C source code), EgTraceImage and EgTraceText Programs (executable and C source code), Fixation/Saccade Analysis Function (C source code), EgServer and EgClientDemo Programs (C source code).
8. NYAN Analysis Tool: NYAN is an analysis tool based on screen recording technology with synchronized gaze and mouse data. It can be used instead of the Trace Suite to analyze web pages and intranet applications as well as standard application software and still images.


11. APPLICATIONS


A wide variety of disciplines use eye tracking techniques, including cognitive science, psychology (notably psycholinguistics and the visual world paradigm), human-computer interaction (HCI), marketing research and medical research (neurological diagnosis). Specific applications include tracking eye movements in language reading, music reading, human activity recognition, the perception of advertising, and the playing of sport. Uses include:
Cognitive Studies
Medical Research
Laser refractive surgery
Human Factors
Computer Usability
Translation Process Research
Vehicle Simulators
In-vehicle Research
Training Simulators
Virtual Reality
Adult Research
Infant Research
Adolescent Research
Geriatric Research
Primate Research
Sports Training
fMRI / MEG / EEG
Commercial eye tracking (web usability, advertising, marketing, automotive, etc.)
Communication systems for the disabled
Improved image and video communications
Computer Science: Activity Recognition

Fig 13: Eyegaze System Communication System


12. THE EYEGAZE SYSTEM: COMPONENTS & PRICES

Desktop Eyegaze System: $14,900 (US$)

Software Programs:
Main Menu
Keyboard
Games
Read Text
Teach
Settings Program

Hardware:
Desktop computer with Windows 2000, video frame grabber, sound, CD and floppy drives
15" LCD Flat Panel Monitor
Adjustable monitor tray with camera bracket
High-speed infrared sensitive camera and lens
Surge protector, cables and connectors

Upgrades and Options:
Portable computer (in place of desktop computer): $1000
Computer access (hardware and software to run a PC): $500
Lights & Appliances: $350
Telephone: $350
Television: $350

Options for use outside the U.S. are slightly different. All prices are in U.S. Dollars. The above prices do not include shipping costs or travel-related installation expenses.


13. ADVANTAGES AND LIMITATIONS OF THE SYSTEM

13.1 ADVANTAGES
1. Easy to use.
2. Enhances the quality of life of people with disabilities all over the world.
3. Moving the eyes is natural, requires little conscious effort, and frees the hands for other tasks.
4. Experiments show that eye gaze selection is faster than selecting with a mouse.

13.2 LIMITATIONS
1. The obvious one is the price.
2. Stray sources of infrared light may degrade the accuracy or prevent Eyegaze operation altogether.
3. Not all applications can be controlled using Eyegaze.


CONCLUSION

Today, the human eye-gaze can be recorded by relatively unremarkable techniques. This report argues that it is possible to use the eye-gaze of a computer user in the interface to aid the control of the application. Care must be taken, though, that eye-gaze tracking data is used in a sensible way, since the nature of human eye-movements is a combination of several voluntary and involuntary cognitive processes. The main reason eye-gaze based user interfaces are attractive is that the direction of the eye-gaze can express the interests of the user; it is a potential porthole into the current cognitive processes, and communication through the direction of the eyes is faster than any other mode of human communication. It is argued that eye-gaze tracking data is best used in multimodal interfaces where the user interacts with the data instead of the interface, in so-called non-command user interfaces.


BIBLIOGRAPHY

www.eyegaze.com
http://www.diku.dk/~panic/eyegaze/node29.html
http://www.gschlosser.de/eyegaze_english.htm
www.abilityhub.com/mouse/eyegaze.htm
www.eyetechds.com
Pullivelli, A. (2005). Low-Cost Digital Cameras: Calibration, Stability Analysis, and Applications. Thesis, Department of Geomatics Engineering, University of Calgary. Available online at http://www.geomatics.ucalgary.ca/Papers/Thesis/AH/05.20216.Anoop Pullivelli.pdf
RF-Concepts (2007). CCTV cameras. http://www.rfconcepts.co.uk

