
Michael Mangialardi

Dan Widing
Erik Johnson

SEED Proposal for Multi-Touch Screen using
Frustrated Total Internal Reflection

Overview

The goal of this project is to develop a simple, effective, and inexpensive multi-touch screen for the IEEE student chapter at UIUC. The planned screen will allow users to interact directly with the computer display surface using multiple simultaneous touches. This will be accomplished using the technology and techniques detailed by Jefferson Han of New York University in his paper “Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection.” Once completed, the project will provide a unique platform for IEEE students to implement signal processing and human-computer interface projects.

Sample of multi-touch screen, from “Low-Cost Multi-Touch Sensing through Frustrated Total Internal
Reflection” (J. Han).

Implementation

Diagram of proposed system, from “Low-Cost Multi-Touch Sensing through Frustrated Total Internal
Reflection” (J. Han).
The multi-touch screen will be implemented using a combination of hardware and software. The system consists of a touch surface (which is also the display surface), an infrared camera, a projector, and a personal computer.
For the touch surface, the core technique used in this project is Frustrated Total Internal Reflection. The surface will consist of a sheet of Plexiglas edged with several infrared LEDs powered by a simple DC circuit. The infrared light is confined within the Plexiglas by the phenomenon of total internal reflection. When contact is made with the touch surface, the total internal reflection is frustrated at the point of contact, creating an infrared pattern visible to the infrared camera.
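As a rough sizing sketch for the LED drive circuit (assuming, purely for illustration, a 5 V supply and infrared LEDs rated for a 1.5 V forward drop at 100 mA; the actual values will come from the data sheets of the LEDs purchased), each LED would need a series resistor of about R = (5 V − 1.5 V) / 0.1 A = 35 Ω, dissipating roughly 0.35 W.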
For this project, it is best if the touch surface and the display surface are the same, allowing the user to interact directly with the displayed images. This can be accomplished by stretching a layer of vinyl rear-projection screen material over the touch surface. Because the total internal reflection occurs in the infrared spectrum, visible-spectrum light can be projected onto the rear-projection screen without interfering with camera operation.
The infrared camera used in the initial implementation of this project will be a standard Microsoft LifeCam VX-1000 webcam with its infrared-blocking filter removed, making it sensitive to infrared light. This USB device will interface directly with the computer.
In this project, the computer will provide the signal processing and programming necessary to create applications. The image processing will be done in C/C++, which allows the use of Intel’s OpenCV project, a robust open-source image processing library written in C/C++. Position data produced by the image processing can then be used by applications running on the PC. Potential applications could use OpenGL or Ogre3D for graphics and be projected directly onto the touch surface, allowing for a unique and powerful user interface.
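As an illustration of the planned image processing step, the following is a minimal sketch of a touch-detection loop, assuming OpenCV’s C++ interface is used (cv::VideoCapture for the webcam, a brightness threshold, and contour detection). The camera index, threshold level, and minimum blob area are placeholder values that would be tuned against the data sets produced during the hardware phase.

    // Minimal touch-detection sketch: threshold the IR image and report
    // the centroid of each bright blob as a touch point.
    #include <opencv2/opencv.hpp>
    #include <cstdio>
    #include <vector>

    int main() {
        cv::VideoCapture cam(0);             // IR-sensitive VX-1000 webcam
        if (!cam.isOpened()) return 1;

        cv::Mat frame, gray, binary;
        while (cam.read(frame)) {
            cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
            // Touches appear as bright IR blobs; 200 is a placeholder level.
            cv::threshold(gray, binary, 200, 255, cv::THRESH_BINARY);

            std::vector<std::vector<cv::Point> > contours;
            cv::findContours(binary, contours, cv::RETR_EXTERNAL,
                             cv::CHAIN_APPROX_SIMPLE);

            for (size_t i = 0; i < contours.size(); ++i) {
                if (cv::contourArea(contours[i]) < 50.0) continue; // noise
                cv::Moments m = cv::moments(contours[i]);
                double x = m.m10 / m.m00;    // blob centroid = touch position
                double y = m.m01 / m.m00;
                std::printf("touch at (%.1f, %.1f)\n", x, y);
            }
        }
        return 0;
    }

The centroids reported here are in camera pixels; mapping them to display coordinates would be part of the basic API developed later in the timeline.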

Materials Needed

For Initial Implementation:

Touch Screen:
4’ x 4’ x 1/2” Acrylic Sheet - $50
20 IR LEDs - $20
Battery, resistors, and wire for DC circuit - $15
Baffle Material - $15
Vinyl Rear-Projection Surface - $20
Mounting Material - $20

Camera:
0.3-megapixel Microsoft LifeCam VX-1000 USB webcam (already available)

PC:
Laptop or Desktop computer (already available)

Total - $140

Growth Options (for consideration after successful implementation):

Camera:
IR optical cut-off filter - $50
1.1-megapixel IR camera - $100
Projector:
LCD projector (borrowed for demonstration)

Brief Timeline:

This project will consist of three main phases: hardware development, image
processing development, and initial application development. The timeline of this project
runs throughout the fall semester. By the end of the semester, the hardware platform and
image processing software should be developed.
Once the hardware and image processing phases are complete, the application development phase becomes open-ended. The goal is to create a platform where new users can develop new applications and add to the image processing library indefinitely.

Hardware Development:
Procuring materials: 1.5 weeks
Building: 3 weeks
Debugging: 1.5 weeks
Producing data sets: 1 week

Image Processing Development:
Analyzing data sets: 1 week
Developing initial touch recognition software: 4 weeks
Debugging developed software: 2 weeks
Creating basic API: 2 weeks

Initial application:
Create OpenGL code: 3 weeks
Debug OpenGL code: 1 week
Demo OpenGL code: 1 week

People:

To begin this project, there is already a core group of three people to lead the
project: Erik Johnson (ECE), Dan Widing (MatSE/Comp Sci), Michael Mangialardi (Eng
Phys).
Volunteers will be welcome, especially for the image processing and application development, and once the platform is in place they will be able to lead and develop their own image processing and application projects. Around five volunteers could be used during the hardware development phase; after that, any number of volunteers could participate through a content management system to develop applications and image processing code.

Documentation Plan:

The thorough documentation of this project will be critical for allowing further
projects to be done on this platform.
The progress of this project will be documented in a Word file, updated as the project advances.
To document the code, we will do several things. First, the code will be fully commented. Second, the code functionality will be documented in a wiki-style format, along with example code and applications. This will be tied to a source-control server (using a program such as CVS) to allow and manage further contributions to the image processing and application code.
The hardware details and construction will be fully photographed and documented, also in a wiki-style format. Ideally, this documentation will be hosted online, allowing further dissemination of the technology and design.

Ownership

The IEEE student chapter of the University of Illinois will hold the rights to any hardware purchased and produced by this project. Ideally, the finished screen will remain in the IEEE lab to promote further development on the platform.

Facilities and Equipment

This project will use the IEEE lab regularly and may need storage space in the form of an IEEE locker.
