
SIM UNIVERSITY SCHOOL OF SCIENCE AND TECHNOLOGY

DEVELOPMENT OF A TOUCHLESS INTERACTIVE SCREEN

STUDENT: ANJU SEBASTIAN (Q0806388)
SUPERVISOR: DR CAI ZHI QIANG
PROJECT CODE: JAN2011/ENG/0089

A project report submitted to SIM University in partial fulfilment of the requirements for the degree of Bachelor of Engineering in Electronics November 2011


ABSTRACT

A touchless interactive screen or system is an emerging technology; it is poised to be the next major advance after touch-screen technologies such as those in iPads and iPhones. A touchless interactive screen makes it possible to control displays such as PC monitors and TV screens without a stroke or touch.

The aim of this project is to develop a touchless interactive screen that eases the control of any kind of monitor. Development was carried out on the MATLAB platform. The overall objective is to apply image processing techniques to track finger gestures captured by a webcam and convert them into mouse movements that control the screen.

Working on this project gave me a good knowledge of MATLAB-based programming and a good opportunity to explore various scopes and dimensions of MATLAB, such as how to use Java classes within it. The development of a touchless interactive screen helps to ease and redefine the interaction between humans and computers.

Various analysis results are included in the report to show the feasibility of this development. A Graphical User Interface (GUI) was developed to let the user choose whether to perform calibration and how many times to perform it, and to trigger the program upon the user's request.

This opportunity will be used to explore the extended possibilities of bringing touchless interactive screens to the market in the very near future.


ACKNOWLEDGEMENTS

I would like to take this opportunity to express my deepest appreciation to the following people for their valuable contributions and assistance with this project.

First, I would like to thank my project supervisor, Dr Cai Zhi Qiang, for his guidance and support, and especially for the valuable ideas and knowledge he provided throughout the project. His expertise and experience with the image processing tools in MATLAB, and his valuable comments and suggestions, were very useful in solving the problems encountered during the project.

I would like to thank Mr Tertius Koshy Thomas, Design Officer at Hitachi, for sharing his knowledge on this project.

I would like to extend my deepest appreciation to the MathWorks tutors and community members for sharing their knowledge and guidance throughout the project.

I would also like to extend my warm appreciation to my colleagues and friends for sharing their knowledge and for their valuable contributions and help with this project.

Despite my hectic workload at my company, I would like to thank my supervisor and manager, who showed understanding and helped arrange my tight schedule so that the project could be completed on time.

Finally, my special thanks go to my family for their continuous support and help throughout my academic years and for their encouragement during this project.


TABLE OF CONTENTS
ABSTRACT
ACKNOWLEDGEMENTS
LIST OF FIGURES
LIST OF TABLES

CHAPTER ONE INTRODUCTION
1.1 Background
1.2 Motivation
1.3 Project Objectives
1.4 Project Scope

CHAPTER TWO THEORY AND LITERATURE REVIEW
2.1 Introduction
2.2 Digital Image Processing
2.3 Data Types
2.4 Image Acquisition in MATLAB
2.5 Aspects of Image Processing in MATLAB
2.6 Image Enhancement
    2.6.1 Nonlinear Spatial Filters to Remove Salt and Pepper Noise
2.7 Image Segmentation
    2.7.1 Binary Image Labeling
2.8 Image Representation in MATLAB
    2.8.1 RGB Image
    2.8.2 HSI Color Space
    2.8.3 RGB to HSI Conversion
2.9 Introduction to MATLAB
    2.9.1 Graphical User Interface (GUI)

CHAPTER THREE PROJECT MANAGEMENT
3.1 Project Plan
3.2 Project Tasks and Schedules

CHAPTER FOUR DESIGN AND METHODOLOGY
4.1 Project Approach and Method
4.2 Software
4.3 Calibration of Color Code Tracker
    4.3.1 Webcam Initialization
    4.3.2 Image Cropping
    4.3.3 RGB to HSI Conversion Codes
    4.3.4 Minimum and Maximum of HUE
4.4 Object Tracking and Screen Interactivity
    4.4.1 HSI to Binary Image Conversion
    4.4.2 Object Detection
    4.4.3 Window Screen Synchronization
    4.4.4 Mouse Controls and Functions
4.5 MATLAB GUI Design

CHAPTER FIVE RESULTS AND ANALYSIS
5.1 Calibration of Various Color Code Trackers
5.2 GUI Analysis

CHAPTER SIX RECOMMENDATIONS AND CONCLUSIONS
6.1 Recommendations
6.2 Conclusion

REFERENCES
APPENDIX A: MATLAB GUI CODES
APPENDIX B: GANTT CHART


LIST OF FIGURES
Figure 1.1      Process Flow diagram
Figure 2.1      Digital Image Processing flow
Figure 2.2      Image before and after filtering
Figure 2.3      Sample of Labeling of connected components
Figure 2.4      Connected binary image matrix
Figure 2.5      Additive Color Model
Figure 2.6      RGB Color space cube
Figure 2.7      HSI Color space representation of colors
Figure 2.8      Separated Color Space of an HSI Image
Figure 2.9      A Blank GUI
Figure 4.3.1(a) Matlab Command Window
Figure 4.3.1(b) Preview Video
Figure 4.3.2    Image Cropping
Figure 4.3.3(a) Cropped Image
Figure 4.3.3(b) Normalized HUE Value
Figure 4.4      Object Detection
Figure 4.5      GUIDE Quick Start
Figure 4.6      Initial Draft of GUI
Figure 4.7      Property Inspector window of Matlab GUI components
Figure 4.8      Example of m-file coding in GUI
Figure 5.1      Image Cropping of Various Color trackers
Figure 5.2      Initial Screen
Figure 5.3      Preview Window and Calibration Number
Figure 5.4      GUI Instruction Guide
Figure 5.5      Cropping of Snapshot Image
Figure 5.6      Calibration Completion Message
Figure 5.7      Simulate Tracking
Figure 5.8      Touchless Interactions on Screen


LIST OF TABLES
Table 2.1  Data Class and Descriptions
Table 3.1  Summary of Gantt chart
Table 3.2  Detailed Project Tasks and Deadlines
Table 5.1  HUE value of various colors on different positioning
Table 5.2  Minimum and Maximum of HUE values of various colors
Table 5.3  Simulated test results for the mouse action using hardware mouse and touchless interactive screen


CHAPTER 1 INTRODUCTION

1.1 Background
An interactive touchless screen or system is a technology still under development. It is a new direction following the recently developed touch-screen technologies found in devices such as iPads and iPhones. As the name suggests, the screen or system is controlled via touchless gestures: with the help of touchless technology, anyone can control a screen or system without a touch or stroke. Touchless gestures are the next frontier of touch technology, and the concept has the potential to change much of what we know about touch. Touchless technologies currently available include flash scanning of fingerprints, vision-based input interfaces, and touchless control technology.

1.2 Motivation
Touchless technology helps people who work in production plants, engineering sectors and research centres, and especially disabled users, who no longer have to touch any kind of hardware. Existing touchless technologies are built on temperature measurement, sensors, flash scanning of fingerprints, and finger-gesture recognition via image processing.

In this work, the screen can be controlled using the additive primary colours Red, Green and Blue; beyond that, the system has been enhanced to handle any colour besides the additive primaries. The system can calibrate any colour the user desires, in order to identify the Hue, Saturation and Intensity values used in the subsequent image processing. Once the designated colour has been identified via image processing, the system is ready to perform touchless interactions.

1.3 Project Objectives

- The main objective of the project is to develop a touchless interactive screen, which allows a person to control a system such as a PC or laptop within a coverage area without touching any hardware.
- Since the project is developed on the MATLAB platform, one of the goals of the project is to analyze and implement the various possibilities and results of the Image Acquisition and Image Processing Toolboxes in MATLAB.


- Another objective of the project is to develop a Graphical User Interface (GUI) that lets the user perform the desired object calibration, and choose the number of calibrations to perform, upon the user's request.

- The academic goal of this project is to develop research, programming and analysis skills.

1.4 Project Scope

The project includes the following proposed schemes:

- To acquire the minimum and maximum values of the captured frames, as part of the calibration process, using the Image Acquisition Toolbox in MATLAB
- To convert the RGB (additive primary colours) values of images captured via webcam into HSI (Hue, Saturation, Intensity) values, so that processing is not affected by variations in light intensity, as part of pre-processing of the image data
- To enhance the software codes to filter noise and remove unwanted components from the pre-processed image data
- To synchronize gesture or colour movements with mouse movements on the system screen
- To test and study the use of Java class functionality on the MATLAB platform to perform the desired mouse clicks on the screen
- To analyze the vast and varied possibilities and functionalities of image processing using MATLAB


The following block diagram shows the process flow of the project:

Initialisation of webcam → Snapshot of the captured video → Cropping of the required color from the video → RGB to HSI conversion → Extraction of minimum and maximum of HUE values → Tracking the movement of the color based on HUE values → Mouse movement synchronization via mapping of color co-ordinates to window screen co-ordinates

Figure 1.1 Process Flow diagram


CHAPTER 2 THEORY AND LITERATURE REVIEW

2.1 Introduction
This chapter provides a review of existing knowledge relevant to the aims of the research. It begins in Section 2.2 with a review of existing image enhancement methods, including specialized image acquisition methods, image conversions and digital image processing techniques more generally. From the theory and literature review we may conclude that digital image processing offers a flexible and practical means of improving image quality. The following sections briefly cover the various data classes, how to perform image acquisition, aspects of image processing, and image representation in MATLAB.

2.2 Digital Image Processing

Digital image processing refers to the processing of digital images by means of a digital computer. A digital image is composed of a finite number of elements, each with a particular location and value, referred to as pixels. An image may be described as a two-dimensional function f(x,y), where x and y are spatial coordinates, and the amplitude of f at any pair of coordinates (x,y) is called the intensity or gray level of the image at that point. One useful way to organize image processing is as a continuum of three types of computerized processes: low-, mid- and high-level. Low-level processing means image pre-processing: primitive operations such as noise reduction, contrast enhancement and image sharpening. Mid-level processing involves image segmentation, categorization of individual objects, and conversion into a form suitable for computer processing. Finally, high-level processing involves making sense of the ensemble of recognized objects [3]. A process flow diagram is included below to give a better understanding of the multiple steps in digital image processing.


Digital Data → Pre-Processing → Feature Extraction → Image Enhancement → Image Restoration → Image Segmentation → Manual Interpretation

Figure 2.1 Digital Image Processing flow

2.3 Data Types

Although numerous data types are available in MATLAB, the most frequently used data class in image processing applications is double. The first eight entries below are referred to as numeric data classes, the ninth is the char class, and the last is the logical data class. The various data classes are listed below [3].

Anju Sebastian (Q0806388)

JAN2011/ENG/0089

Name      Description
double    Double-precision floating-point numbers
uint8     Unsigned 8-bit integers
uint16    Unsigned 16-bit integers
uint32    Unsigned 32-bit integers
int8      Signed 8-bit integers
int16     Signed 16-bit integers
int32     Signed 32-bit integers
single    Single-precision floating-point numbers
char      Characters
logical   Values are 0 or 1

Table 2.1 Data Class and Descriptions [3]
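As a quick illustration, the minimal sketch below (assuming the Image Processing Toolbox sample image peppers.png is available) shows how the same pixel data behaves in the uint8 and double classes:

A = imread('peppers.png'); % sample truecolor image, class uint8, values in [0, 255]
class(A)                   % returns 'uint8'
B = im2double(A);          % convert to class double, values scaled into [0, 1]
max(B(:))                  % no value exceeds 1 after conversion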

2.4 Image Acquisition in MATLAB

The Image Acquisition Toolbox in MATLAB helps to acquire images from live video and grab frames for further image processing. The toolbox allows the acquisition process to be enhanced or customized for tasks such as object identification, image enhancement, and building panoramic views and mosaics. It acquires frames according to the computer's performance and automatically handles memory and buffer management, and it supports any color space provided by the image acquisition device, including RGB, YUV and grayscale.
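A minimal acquisition sketch is shown below; the adaptor name 'winvideo' and the device ID 1 are assumptions that depend on the installed hardware (they can be checked with imaqhwinfo):

vid = videoinput('winvideo', 1); % create a video input object for the first device
frame = getsnapshot(vid);        % grab a single frame from the live video
imshow(frame);                   % display the captured frame
delete(vid);                     % release the device when finished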

2.5 Aspects of Image Processing in MATLAB

Image processing is a vast area; to explain the algorithms conveniently, we shall subdivide them into broad subclasses. Each subclass may contain numerous tasks, each with its own algorithms [1]. These are:

Image Enhancement - processing the image so that the result is more appropriate for a certain application or experiment. Image enhancement includes:
- Sharpening of a damaged image
- Edge highlighting
- Image brightening
- Noise removal

Anju Sebastian (Q0806388)

JAN2011/ENG/0089

Image Restoration - restoring images degraded by a known cause. Image restoration includes:
- Optical distortion removal
- Periodic interference removal

Image Segmentation - subdividing an image into its constituent components or locating a certain part of an image. Image segmentation includes:
- Finding certain shapes in an image
- Identifying specific objects in photographs

More details about the image enhancement method and the image segmentation flow are included below.

2.6 Image Enhancement

Image enhancement processes the image so that the result is more appropriate for a certain application or experiment; its purpose is to improve the quality of images using software. Filtering is one of the tasks in an image enhancement algorithm.

2.6.1 Nonlinear Spatial Filters to Remove Salt and Pepper Noise
Commonly used order-statistic filters are nonlinear spatial filters. Their response is based on ranking: the central pixel in the neighbourhood is replaced with a value determined by the ranking result. In digital image processing, the median filter is the best-known order-statistic filter [3].

Figure 2.2 Image before and after median filtering: noisy image (left) and filtered image (right) [10]
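A minimal sketch of this filter is given below, assuming the toolbox sample image cameraman.tif is on the path; imnoise adds the salt-and-pepper noise and medfilt2 applies the 3x3 median filter:

I = imread('cameraman.tif');           % sample grayscale image
J = imnoise(I, 'salt & pepper', 0.05); % contaminate 5% of the pixels
K = medfilt2(J, [3 3]);                % replace each pixel by its 3x3 neighbourhood median
subplot(1,2,1); imshow(J); title('Noisy');
subplot(1,2,2); imshow(K); title('Median filtered');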


2.7 Image Segmentation

Image segmentation is the task of partitioning an image into multiple segments, also known as superpixels. The objective of image segmentation is to represent or convert the image into a form that is easier to interpret. In general, the process locates specific objects in images, such as lines, boundaries and curves.

2.7.1 Binary Image Labeling
Labeling the connected components of a binary image consists of two passes over the image with an in-between step called equivalence class resolution. The process scans the image row by row, moving from left to right, and assigns numeric labels to pixels; these labels identify the connected components of the binary image. For explanatory purposes, a pixel that belongs to an object is referred to as a foreground pixel and a pixel that does not is referred to as a background pixel. A short sketch of this labeling follows the two figures below.

Figure 2.3 Sample of Labeling of connected components [4]

The matrix shown below is formed when labeling of the connected components of the binary image in the above sample is complete.

Figure 2.4 Connected binary image matrix [4]
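As a minimal sketch of the idea, bwlabel assigns a number to each connected component of a small hand-made binary matrix (the matrix itself is only an example):

BW = logical([1 1 0 0 0;
              1 1 0 1 1;
              0 0 0 1 1;
              0 1 0 0 0]);
[L, num] = bwlabel(BW); % label matrix and component count (num is 3 here)
disp(L);                % pixels of the same component carry the same label
stats = regionprops(L, 'Area', 'Centroid'); % per-component properties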


2.8 Image Representation in MATLAB

The Image Processing Toolbox in MATLAB is powerful enough to handle various image representations. Listed below are the image types in MATLAB:
- Grayscale
- Truecolor RGB
- Indexed
- Binary
- uint8

There are also other color spaces available in the image toolbox, referred to as color models. These color spaces can be more convenient, or faster to process, in image processing applications, and conversion functions are available in the toolbox from the RGB model to specific color spaces and vice versa. The color spaces are as follows [3]:
- The NTSC color space
- The YCbCr color space
- The HSV color space
- The CMY and CMYK color spaces
- The HSI color space

In this project the focus is on Truecolor RGB and the conversion from RGB to the HSI color space (Hue, Saturation, Intensity).

2.8.1 RGB Image
The RGB color model is also called the additive color model; its three additive primary colors are red, green and blue. An M x N x 3 array represents an RGB color image, with each of the three planes at a spatial location referred to as a component image. The data class of the component images determines their range of values: the value range of a class double RGB image is [0, 1]. The bit depth of an RGB image is determined by the number of bits used to represent the pixel values of the component images. The graphical representation of an RGB color space is the RGB color cube.

To make the final color spectrum, the three light beams are added together, and their light spectra add, wavelength for wavelength [2].


Figure 2.5 Additive Color Model [5]

Figure 2.6 RGB Color space cube [7]
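A minimal sketch of this M x N x 3 representation, again assuming the toolbox sample image peppers.png:

rgb = imread('peppers.png'); % M x N x 3 truecolor array, class uint8
size(rgb)                    % e.g. 384 x 512 x 3
R = rgb(:,:,1);              % red component image
G = rgb(:,:,2);              % green component image
B = rgb(:,:,3);              % blue component image
rgbd = im2double(rgb);       % class double version, component values in [0, 1]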

2.8.2 HSI Color Space
The HSI color space describes a color by its hue, saturation and brightness (intensity). It is an attractive model for image processing applications because it represents colors similarly to how the human eye senses them [11]. Hue is the attribute that depicts a pure color, whereas saturation gives the degree to which a pure color is diluted by white light [2]. Brightness (intensity) is a subjective factor that is practically impossible to measure, yet it is a key factor of color sensation. A monochromatic image can be described by its intensity level alone, and this quantity is easily interpretable and measurable. The HSI color space separates the intensity component from the hue and saturation in an image; accordingly, the HSI color space model is a suitable tool for developing image processing algorithms. The figure below illustrates the representation of colors in HSI color space.


Figure 2.7 HSI Color space represents colors [11]

The hue component describes the color itself as an angle in the range [0, 360] degrees. The saturation component describes the degree of dilution of the color by white light and ranges over [0, 1]. The intensity component also ranges over [0, 1], where 0 stands for black and 1 stands for white.

2.8.3 RGB to HSI Conversion
The following formulas describe how RGB values are converted into HSI; they are reconstructed here in their standard form, which matches the implementation in Section 4.3.3. Normalization is performed first:

\[ r = \frac{R}{R+G+B}, \qquad g = \frac{G}{R+G+B}, \qquad b = \frac{B}{R+G+B} \]

From the normalized values, the three components H, S and I are derived by

\[ \theta = \cos^{-1}\!\left(\frac{\tfrac{1}{2}\,[(r-g)+(r-b)]}{\sqrt{(r-g)^2+(r-b)(g-b)}}\right), \qquad H = \begin{cases}\theta, & b \le g\\ 2\pi-\theta, & b > g\end{cases} \]

\[ S = 1 - 3\min(r,g,b), \qquad I = \frac{R+G+B}{3\times 255} \]


The H, S and I values are then scaled into the respective ranges [0, 360], [0, 100] and [0, 255].

Figure 2.8 shows the separate color planes (hue, saturation and intensity) of an image converted from RGB to HSI. The hue plane shows the transition from high to low values: deep blue has the lowest hue value and deep red the highest. Saturation represents the purity of the color; the figure clearly shows that the most saturated regions have the highest values and appear white, while the gray shades in the centre of the saturation plane indicate mixtures of colors. Intensity is equivalent to the brightness of colors, and the brightest areas in the intensity plane correspond to the brightest colors in the original image [9].

Figure 2.8 Separated Color Space of an HSI Image [9]
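The planes in Figure 2.8 can be reproduced with a short sketch that applies the same conversion used later in Section 4.3.3 (the sample image peppers.png is an assumption):

rgb = double(imread('peppers.png'));
r = rgb(:,:,1); g = rgb(:,:,2); b = rgb(:,:,3);
I = (r + g + b)/3;                                  % intensity, range [0, 255]
S = 1 - 3.*min(min(r,g),b)./(r + g + b + eps);      % saturation, range [0, 1]
th = acos((0.5*((r-g)+(r-b)))./(sqrt((r-g).^2+(r-b).*(g-b))+eps));
H = th; H(b>g) = 2*pi - H(b>g); H = H/(2*pi);       % hue normalized to [0, 1]
subplot(1,3,1); imshow(H);     title('Hue');
subplot(1,3,2); imshow(S);     title('Saturation');
subplot(1,3,3); imshow(I/255); title('Intensity');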

2.9 Introduction to MATLAB

The name MATLAB stands for Matrix Laboratory. MATLAB is a data analysis and visualization tool with strong support for matrices and matrix operations. It has its own powerful programming language and exceptional graphical capabilities.


MATLAB is an important tool because it is designed to support special-purpose sets of tasks, packaged as toolboxes; the one used here is the Image Processing Toolbox. Rather than using all of MATLAB's capabilities, this project uses the functions relevant to image handling [1]. MATLAB also supports other operations such as:
- Data acquisition
- Algorithm development
- Application development

MATLAB has many applications in itself, but one of its most powerful toolboxes is the Image Processing Toolbox. It provides various functions that extend the usability of the programming environment, including:
- Image analysis and enhancement
- Image morphological operations
- Image topology

2.9.1 Graphical User Interface (GUI)
A graphical user interface is also referred to as a pictorial interface. A GUI gives a program a consistent, intuitive appearance and makes it easier to navigate with instinctive controls such as menus, boxes and pushbuttons. A GUI should be designed to help the user navigate and use the application. The hardest part of GUI-based programming is accepting hardware inputs such as mouse or keyboard actions; these inputs are called events, and a program that acts in response to them is called event-driven. The prime elements involved in creating a MATLAB graphical user interface are [6]:
- Components such as boxes, sliders and pushbuttons
- Figures
- Callbacks - program code that responds to user clicks


A MATLAB GUI is developed in a development environment called GUIDE.

Figure 2.9 A Blank GUI


CHAPTER 3 PROJECT MANAGEMENT

3.1 Project Plan

A Gantt chart (Appendix B), maintained in Microsoft Excel, is used to track the progress of the different phases of project development. The chart indicates the timeline of each task so that the phases of the project can be followed without delay, and it enables assessment of progress at each level, since it lays out the timeline of the entire project. The chart guides the process of project exploration, the background material required for analysis and implementation, and project delivery up to the final presentation. The project was planned with the help of my supervisor, Dr Cai, drawing on his knowledge and experience from previous projects.

The project is divided into five main parts, listed below.

Project Research and Planning - the base and first stage of the project. Proper planning is required to make sure the entire project work can be done within the desired time and meets the project objectives. This stage includes the software development logic, the literature review and the gathering of relevant information.

Software Development - the second stage of the project, which spans a longer period. Software development is carried out based on the understanding and conclusions of the project research.

Debugging and Troubleshooting - the third stage, which also spans a long period, though shorter than software development; in fact it runs alongside the second stage. The project requirements and specifications become clearer at this stage.

Final Testing - the fourth stage, an end-to-end test spanning a shorter period. Based on previous project experience, final testing requires only a short period if the project objectives have been met.


Documentation - not a stage in itself, as it runs alongside the whole project development. Proper documentation is required to capture all problems encountered. The documentation consists of the Project Proposal Report, the Interim Report, the Final Year Report and the Meeting Logs.

The table below summarizes the project plan and schedule; the Gantt chart is attached as Appendix B.

Development of a Touchless Interactive Screen

S/N  Task                                Duration           Resources
1    Research and Planning               Week 4 - Week 10   Library facilities and journals
     1.1 Software Research               Week 9
     1.2 Literature Review               Week 6
     1.3 Gathering Information           Week 10
2    Software Development                Week 11 - Week 24  Digital Image Processing Using MATLAB
     2.1 Codes Development               Week 24
     2.2 Codes Testing and Debugging     Week 24
3    Debugging and Troubleshooting       Week 25 - Week 35  MATLAB software
     3.1 Code Enhancement                Week 29
     3.2 Graphical User Interface        Week 31
     3.3 Codes Finalization              Week 35
4    Final Testing                       Week 28 - Week 37
     4.1 End-to-End Test                 Week 37
     4.2 Evaluation                      Week 37
5    Documentation                       Week 4 - Week 35   Previous project reports and theses
     5.1 Project Proposal Report         Week 7
     5.2 Project Interim Report          Week 16
     5.3 Project Final Report            Week 41
     5.4 Oral Presentation               Week 45
     5.5 Meeting Logs                    Week 45

Table 3.1 Summary of Gantt chart


3.2 Project Task and Schedules

The project tasks are listed below, segmented into different sections:
1. Literature Research and Review
2. Project Planning
3. Drafting and submission of Project Proposal Report
4. Software Research
5. Understanding Digital Image Processing
6. Research on various image processing technologies
7. Drafting and submission of Interim Report
8. Calibration Coding
9. Object Tracking codes development
10. Debugging and testing
11. Development and Interfacing of Graphical User Interface (GUI)
12. Preparation of Final Year Report and Poster
13. Preparation of Oral Presentation


The table below lists each project task and its allocated deadline.

S/N  Project Task                                        Deadline
1    Literature Research and Review                      5 March 2011
2    Project Planning                                    4 April 2011
3    Submission of Project Proposal                      7 March 2011
4    Software Research                                   29 March 2011
5    Understanding Digital Image Processing              14 April 2011
6    Research on various image processing technologies   24 April 2011
7    Submission of Interim Report                        9 May 2011
8    Calibration Coding                                  10 July 2011
9    Object tracking codes development                   24 August 2011
10   Debugging and Testing                               30 August 2011
11   Development and Interfacing of GUI                  15 October 2011
12   Submission of Final Year Report and Poster          14 November 2011
13   Oral Presentation                                   3 December 2011

Table 3.2 Detailed Project Tasks and Deadlines


CHAPTER 4 DESIGN AND METHODOLOGY

4.1 Project Approach and Method

The project objective is to develop an interactive touchless screen that helps a person navigate the mouse on the screen or system without any touch or stroke on hardware items such as a mouse or keyboard. There are many interactive technologies available on the market, and most are at the development stage, based on image processing and sensing methods such as temperature measurement, sensors, flash scanning of fingerprints and finger-gesture recognition via image processing.

In this project, the interactivity of the touchless screen is developed using digital image processing methods on the MATLAB platform. Two processes have to be performed in order to achieve the project scope.

1. Perform color code calibration. A function performs color code calibration via the webcam to decide which object the user will use for tracking; the user may choose any additive primary coloured object for this purpose. The calibration codes detect the object color in the RGB color model, convert it into the HSI color space, and then extract the HUE value and pass it down to the object tracking program for further processing. Since the calibration takes the HUE value of the color code tracker, it is not deeply affected by variations in light intensity.

2. Perform object tracking and mouse interactivity on screen. Live video is captured continually through the webcam, and the captured data is pre-processed based on the HUE value passed down from the calibration function. The program detects the object, and once the given HUE value conditions are satisfied in the object tracking program, the movement of the object is synchronized with the mouse movement on screen. This interactivity enables the user to navigate the mouse on the screen and perform mouse functions such as open/close, save and edit.

4.2 Software
The software design is carried out on MATLAB version R2008b because of its powerful Image Acquisition and Image Processing Toolboxes. MATLAB provides various functions that support smooth image acquisition and the pre-processing, enhancement and analysis of images.


4.3 Calibration of Color Code Tracker

Calibration of the color code tracker consists of several image processing phases, such as capturing an image from live video and cropping the color code tracker. Live video is captured continually via webcam for an unlimited period. Calibration allows the user to choose the total number of calibrations to perform; the user can opt to start calibration, and set the required calibration count, via the Graphical User Interface.

4.3.1 Webcam Initialization
Webcam initialization is performed differently on different PCs or laptops; it has to be configured according to the webcam configuration of each system. To retrieve the properties of the webcam configured on a system, type the MATLAB command imaqhwinfo in the MATLAB command window, as shown in Figure 4.3.1(a); the system returns the hardware information cached by the toolbox.

Figure 4.3.1 (a) Matlab Command Window

The following code initializes the webcam and previews the live video with an image size of 320x240, as shown in Figure 4.3.1(b):


vid = videoinput('winvideo', 1, 'RGB24_320x240'); % video input
set(vid, 'FramesPerTrigger', 3);                  % maximum frames per trigger
set(vid, 'TriggerRepeat', Inf);
triggerconfig(vid, 'Manual')
start(vid);                                       % start capture
preview(vid);                                     % preview

Figure 4.3.1(b) Preview Video

4.3.2 Image Cropping
The image cropping functionality crops the desired color code object tracker out of the frames captured from live video. The MATLAB function getsnapshot captures an image from the video, and imcrop enables the cropping functionality on the image, as shown in Figure 4.3.2. The cropped data is converted into the double data type for further image pre-processing. The MATLAB code below performs the image cropping process:
data = getsnapshot(vid);            % retrieves a frame from the video
imshow(data);
dataCropped = imcrop;               % interactive cropping of the displayed image
dataCropped1 = double(dataCropped); % convert to class double for further processing


Figure 4.3.2 Image Cropping

4.3.3 RGB to HSI Conversion Codes
The captured and cropped image, in the RGB color model as shown in Figure 4.3.3(a), needs to be converted into the HSI color space to extract the HUE value. The HUE value of each color is not deeply affected by variations in the light intensity of the background. The following code performs the RGB to HSI conversion and normalizes the HUE value before passing it on for subsequent processing, as shown in Figure 4.3.3(b):
r = dataCropped1(:,:,1);
g = dataCropped1(:,:,2);
b = dataCropped1(:,:,3);
S = 1-3.*(min(min(r,g),b))./(r+g+b+eps);
I = (r+g+b)/3;
th = acos((0.5*((r-g)+(r-b)))./((sqrt((r-g).^2+(r-b).*(g-b)))+eps)); % H value conversion
H = th;
H(b>g) = 2*pi-H(b>g);
H = H/(2*pi); % performs normalization


Figure 4.3.3(a) Cropped Image

Figure 4.3.3(b) Normalized HUE Value

4.3.4 Minimum and Maximum of HUE
The minimum and maximum HUE values can be extracted with the simple MATLAB expressions min(J(:)) and max(J(:)). The calibration codes allow from one up to 100 calibrations to be performed. The codes have been enhanced to remove noise and to exclude very minor values such as 0.0010 or 0.0020 before extracting the minimum and maximum of HUE, so that zero and near-zero values are not picked up as the minimum. The minimum and maximum for the red color shown above are given below; these values are saved into a .TXT file to be retrieved by the object tracking program at a later stage.

z = 0.9005 0.9976
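A minimal sketch of this extraction step is shown below; the 0.002 cutoff stands in for the "very minor values" mentioned above, and the file names z1.txt and z2.txt follow the GUI code in Appendix A:

J = H;                      % normalized hue values of the cropped patch
J(J(:) <= 0.002) = [];      % drop zero and near-zero noise values before ranging
z = [min(J(:)) max(J(:))];  % e.g. [0.9005 0.9976] for the red tracker above
fid = fopen('z1.txt','w'); fprintf(fid,'%f',z(1)); fclose(fid); % save minimum
fid = fopen('z2.txt','w'); fprintf(fid,'%f',z(2)); fclose(fid); % save maximum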

4.4 Object Tracking and Screen Interactivity

The object tracking codes continuously track the movement of the color code tracker through the webcam, and the movement is synchronized with mouse movements on the screen. As explained in Section 4.3, the object tracking codes also perform webcam initialization and RGB to HSI conversion before reaching the stage of HSI to binary image conversion. The object tracking codes have been enhanced to perform mouse functions such as right click and left click, which in turn support open/close, save and edit operations.


4.4.1 HSI to Binary Image Conversion
HSI to binary image conversion is performed after computing a global threshold level with graythresh; that level is then used to convert an intensity or RGB image into a binary image with im2bw. The threshold is a normalized intensity value in the range [0, 1], and the global threshold computation uses Otsu's method, which chooses the level that minimizes the interclass variance of the thresholded black and white pixels. The MATLAB function im2bw converts an RGB image into a binary image in two steps: it first converts the image into grayscale format and then converts the grayscale image into binary format using the threshold level. The MATLAB code below performs the HSI to binary image conversion:
% Convert the resulting HSI_live image into a binary image.
level = graythresh(H_live_up);
H_live_up = im2bw(H_live_up, level);

4.4.2 Object Detection
The converted binary image goes through morphological processing to perform object detection. The first step removes pixels: bwareaopen removes the unwanted pixels from the connected components and produces another binary image. The second step labels the connected objects: bwlabeln returns a label matrix containing labels for the connected components in the binary image, together with the total number of connected objects found, which is used to restrict multiple detections of objects having the same HUE value. The last step finds the properties of the binary image, such as Area, FilledArea and Perimeter: regionprops returns the properties of each labelled region in the label matrix. The MATLAB code below performs the object detection process:
% Remove all connected components smaller than 300 pixels
H_live_up = bwareaopen(H_live_up, 300);
% Label all the connected components in the image.
% Restrict multiple detection via num
[L,num] = bwlabeln(H_live_up);
obj = regionprops(L, 'BoundingBox', 'Centroid', 'Area', 'FilledArea', 'Perimeter', 'Orientation');
len = length(obj);
bb = obj(object).BoundingBox;
bc = obj(object).Centroid;
rectangle('Position', bb, 'EdgeColor', 'r', 'LineWidth', 2)
plot(bc(1), bc(2), '-m+');
% Display the X and Y co-ordinates next to the detected object
a = text(bc(1)+15, bc(2), strcat('X:', num2str(round(bc(1))), ' Y:', num2str(round(bc(2)))));
set(a, 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 12, 'Color', 'Yellow');
set(0, 'pointerLocation', [(bc(1)+250) (bc(2)+250)])

Figure 4.4 Object Detection

4.4.3 Window Screen Synchronization
Based on the co-ordinates of the identified color calibrated earlier, the window screen mouse co-ordinates (x, y) are mapped accordingly. This is done by mathematically relating the MATLAB video frame co-ordinates to the window screen co-ordinates. The MATLAB code below performs the window screen synchronisation that moves the mouse pointer based on this relation:
% Calculation of X value:
if bc(1)==0,
    t_x = 0;
else
    t_x = bc(1)*4.666;
end
round(t_x);
% Calculation of Y value:
if bc(2)==300,
    t_y = 0;
else
    t_y = (bc(2)-300)*-1*2.66;
end
round(t_y);
set(0, 'pointerlocation', [t_x t_y]);
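As a worked example of this mapping, an object centroid at bc = (160, 150) in the 320x240 video frame gives t_x = 160 x 4.666 ≈ 747 and t_y = (150 - 300) x (-1) x 2.66 = 399, so the pointer is placed at roughly (747, 399) on the window screen.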

4.4.4 Mouse Controls and Functions
Java class functions are used to perform the mouse controls and functions. The statement import package_name adds the specified package name to the current import list.


In this work only two packages are used for the mouse controls and functions: java.awt.Robot and java.awt.event. The MATLAB code below performs the mouse controls and functions:
% Calling Java functions to perform left click and right click
import java.awt.Robot;
import java.awt.event.*;
mouse = Robot;
mouse.mousePress(InputEvent.BUTTON3_MASK);   % press the right button
mouse.mouseRelease(InputEvent.BUTTON3_MASK); % release the right button
mouse.mousePress(InputEvent.BUTTON1_MASK);   % press the left button
mouse.mouseRelease(InputEvent.BUTTON1_MASK); % release the left button

4.5 MATLAB GUI Design

The MATLAB GUI is developed using the MATLAB graphical user interface design environment, GUIDE, which is both a directory and a function. Typing guide in the MATLAB command window opens the GUI design environment. GUIDE initiates tools that allow a GUI to be created or edited interactively from FIG-files or from handles to figures. Calling GUIDE by itself opens the GUIDE Quick Start dialog, where one can open a previously created GUI or create a new one from the provided GUI templates, as shown in Figure 4.5.

Figure 4.5 GUIDE Quick Starts


GUIDE(filename) opens the FIG-file named 'filename' for editing if it is on the MATLAB path, and GUIDE(fullpath) opens the FIG-file at 'fullpath' even if it is not on the MATLAB path.

Here a new GUI is created from the default blank template, as shown in Figure 2.9. Using the palette on the left side of the GUI, static text boxes, axes and pushbuttons are added to create the MATLAB GUI, as shown in Figure 4.6.

Figure 4.6 Initial Draft of GUI


Each component has its own default properties, editable using the Property Inspector. Select the component to edit and right-click on it; a pop-up window then opens for editing the properties, as shown in Figure 4.7.

Figure 4.7 Property Inspector windows of Matlab GUI components


Once the layout of the GUI is completed, the next step is to integrate the program into each component. An m-file is auto-generated when the GUI is saved. Each component has callback functions that can be edited in the m-file. For example, code has been added so that the video is triggered and previewed when pushbutton2 (START PREVIEW) is pressed, as shown in Figure 4.8 and sketched after the figure below.

Figure 4.8 Example of m.file coding in GUI
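As a sketch of how such a callback looks (condensed from the pushbutton2_Callback in Appendix A), the START PREVIEW handler creates the video input object and routes the live preview into one of the GUI axes:

function pushbutton2_Callback(hObject, eventdata, handles)
% Condensed from Appendix A: start the live video preview inside axes1.
axes(handles.axes1);
vid = videoinput('winvideo', 1, 'YUY2_320X240'); % device settings are system dependent
vidRes = get(vid, 'VideoResolution');
nBands = get(vid, 'NumberOfBands');
hImage = image(zeros(vidRes(2), vidRes(1), nBands)); % image object sized to the video
preview(vid, hImage);      % show the live video in the GUI axes
handles.vid = vid;         % keep the video object for the other callbacks
guidata(hObject, handles); % store the updated handles structure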


CHAPTER 5 RESULTS AND ANALYSIS

5.1 Calibration of Various Color Code Trackers

The figure below shows the image cropping of the various color trackers.

Figure 5.1 Image Cropping of Various Color trackers, panels (a)-(d)


Table 5.1 summarizes the calibrated values of four different colors (red, green, orange and yellow): the minimum and maximum HUE values of each color at different positions of the tracker. The table shows the variation in HUE values caused by light intensity, which can be analysed from the different positions of the color tracker.

Color    HUE - Bottom Right   HUE - Bottom Left    HUE - Top Right      HUE - Top Left       HUE - Centre
Red      [0.8801 0.9757]      [0.8622 0.9852]      [0.8363 0.9670]      [0.7551 0.9697]      [0.8333 0.9844]
Green    [0.4605 0.5476]      [0.4722 0.5125]      [0.4741 0.5107]      [0.4469 0.5672]      [0.3843 0.5216]
Orange   [0.1010 0.1474]      [0.1003 0.1003]      [0.1273 0.1667]      [0.1002 0.1667]      [0.1002 0.1667]
Yellow   [0.1795 0.2035]      [0.1667 0.2068]      [0.1667 0.2143]      [0.1667 0.2500]      [0.1509 0.1667]

Table 5.1 HUE value of various colors on different positioning

Calibration at various positions is required because the HUE values may vary slightly with the intensity of the light. It is necessary to grab several HUE values in order to determine the overall minimum and maximum of HUE; this helps track the object movement accurately, makes synchronization with window screen movements easier, and limits, up to a point, the HUE variation caused by light intensity.

Table 5.2 summarizes the minimum and maximum HUE values of each calibrated color, taken across the positions in the table above.

Color    Minimum and Maximum of HUE values
Red      [0.7551 0.9852]
Green    [0.3843 0.5672]
Orange   [0.1002 0.1667]
Yellow   [0.1509 0.2500]

Table 5.2 Minimum and Maximum of HUE values of various colors


5.2 GUI Analysis

The graphical user interface allows the user to perform color code tracker calibration and to trigger the program from the GUI. The steps below illustrate, using red as an example, how to perform calibration; the GUI's instruction guide leads the user through the various steps and shows how to trigger the tracker program.

Step 1: Start-up GUI. When the GUI starts, the user sees the following screen, which allows calibration of the designated color code tracker, as shown in Figure 5.2.

Figure 5.2 Initial Screen


Step 2: Required calibration number. Live video capture is performed once the START PREVIEW pushbutton is clicked, and the video is shown in the PREVIEW WINDOW. When the user has decided the total number of calibrations to perform, this number is entered into the free text field "Enter Calibration No:" (for example, 1), as shown in Figure 5.3. This tells the system the total number of calibrations to perform.

Figure 5.3 Preview Window and Calibration Number


Step 3: Perform calibration. When the user clicks the CALIBRATE COLOR pushbutton, the system prompts an instruction message in the IMAGE PROCESSING OUTPUT panel on how to position the color code tracker before calibration starts, as shown in Figure 5.4. The screen also tells the user how many calibrations are left to perform.

Figure 5.4 GUI Instruction Guide

Step 4: Image cropping. Align the color code tracker in the top left corner of the window and press Enter to continue the calibration; a snapshot of the live video is then taken, and a right click brings up the cropping functionality, as shown in Figure 5.5.


Figure 5.5 Cropping of Snapshot Image

Step 5: Simulate tracking. The system prompts a message upon successful completion of the calibrations. The instruction panel indicates that clicking the SIMULATE TRACKING pushbutton will display the color code tracking, and the GUI shows the identified color on the screen, as shown in Figure 5.6.

Figure 5.6 Calibration Completion Message


The Image Processing Output panel shows the tracking of the designated color code, as shown in Figure 5.7. The system prompts a message that the tracking simulation is completed; clicking the TOUCHLESS SCREEN pushbutton then shows the touchless interaction on the window screen, which allows mouse functionality to be performed, as shown in Figure 5.8.

Figure 5.7 Simulate Tracking

Figure 5.8 Touchless Interactions on Screen


Table 5.3 summarizes test results conducted under incandescent light; the simulated task was to track the mouse and open a folder on the Windows desktop. The time taken for each successful trial was recorded and compared with the response time of an actual mouse for the same action.

The average time taken to open a folder on the Windows desktop using an external hardware mouse was found to be 2.5 to 3 seconds. The same action was simulated using the touchless interactive screen 10 times to measure the responsivity of the system. The test results are shown below, and the responsivity of the mouse action via the touchless interactive screen was calculated.
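The two derived columns in the table are computed as Variance = |y - x| and Responsivity = (x / y) x 100%. For instance, trial 1 gives |3 - 4.3| = 1.3 seconds and (4.3 / 3) x 100 ≈ 143.33%.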

Number of trials   TIS*1 (x), s   Hardware mouse (y), s   Variance |y - x|, s   Responsivity (x/y) %
1                  4.3            3                       1.3                   143.33
2                  1.8            3                       1.2                   60.00
3                  3.7            3                       0.7                   123.33
4                  4.4            3                       1.4                   146.67
5                  3.9            3                       0.9                   130.00
6                  0.9            3                       2.1                   30.00
7                  3.6            3                       0.6                   120.00
8                  3.0            3                       0                     100.00
9                  1.8            3                       1.2                   60.00
10                 4.6            3                       1.6                   153.33

*1 TIS = Touchless Interactive Screen

Table 5.3 Simulated test results for the mouse action using a hardware mouse and the touchless interactive screen

Based on the above test results, the highest variance is observed at trial 6, with 2.1 seconds relative to the reference value of 3 seconds, and the least variance is at trial 8, with a value of 0. Based on the responsivity, 3 of the 10 trials responded faster than the external hardware mouse. The remaining 7 trials showed a slower response, with the largest deviation being 1.6 seconds slower than the external hardware mouse.


CHAPTER 6 RECOMMENDATIONS AND CONCLUSION

6.1 Recommendations
The project objectives and scope were achieved using image processing techniques and their functionality on the MATLAB platform. In this work we analyzed and concluded that the window screen can be controlled by gesture analysis; the system is not developed purely on gesture movement but on a color code tracker, although the mouse movements are controlled by gesture movements.

The following recommendations can be included in future work:
- Add multicolour detection, so that, for example, Internet Explorer is opened and closed when a specific color code tracker is used for interactions on the window screen.
- Since the project works with a color code tracker, it would be good if detection could be performed based on finger movements; however, calibration of skin color will be a challenge, since skin tone variation is quite high.
- In this project the inbuilt webcam is used for object tracking, which showed limitations in camera focusing. Image processing capability can be improved with better cameras offering good resolution and dedicated image processing chips.

In this work the GUI is designed with a limited instruction guide. The GUI could be improved for academic use in teaching image processing techniques, with added features such as a step-by-step process and output plots for video capture, image calibration, RGB to HSI conversion, the binary image, the bounding box, plotting of the detected color and its co-ordinates, and the mapping of video frame co-ordinates into window screen co-ordinates. It would also be good to improve the codes with a color data bank, which would remove the calibration step before triggering the object tracking program; the color data bank would contain the minimum and maximum HUE values of listed colors, and the user could use any of the listed colors for object tracking.

In this work we have tried to keep object tracking unaffected by variations in light intensity, though at times the variation still had a significant impact. It would therefore be good to offer a theme selection in the GUI before webcam initialization.


This would set the camera mode to nullify the effect of ambient light intensity, with modes such as night mode, bedroom mode, sunny mode, gloomy mode and incandescent mode.

6.2 Conclusion
The project objectives were achieved using image processing techniques and their functionality on the MATLAB platform. In this work we analyzed and concluded that the window screen can be controlled by gesture analysis; the system is not developed purely on gesture movement but on a color code tracker, although the mouse movements are controlled by gesture movements. Since image processing and the MATLAB platform were new to me, it took a great deal of time to meet the project objectives within the scheduled timing. Understanding the programming structures took considerable effort, and I was able to overcome these difficulties with the help of my supervisor, Dr Cai Zhi Qiang, and the technical guidance of my friends and technical support staff.

REFERENCES


[1] Alasdair McAndrew, 2004, Introduction to Digital Image Processing with MATLAB, Course Technology.
[2] Charles A. Poynton, 2003, Digital Video and HDTV: Algorithms and Interfaces, Morgan Kaufmann, ISBN 1558607927.
[3] Rafael C. Gonzalez, Richard E. Woods, Steven L. Eddins, 2004, Digital Image Processing Using MATLAB, Pearson, Upper Saddle River, New Jersey.
[4] http://blogs.mathworks.com/steve/2007/05/11/connected-component-labeling-part-5/
[5] http://en.wikipedia.org/wiki/RGB_color_model
[6] http://ewh.ieee.org/r8/uae/GUI.pdf
[7] http://software.intel.com
[8] http://web2.clarkson.edu/class/image_process/RGB_to_HSI.pdf
[9] http://www.mathworks.com/help/toolbox/images/f8-20792.html
[10] http://www.mathworks.com/help/toolbox/images/f11-12251.html
[11] http://www.scribd.com/doc/16617255/HSI-Color-Space
[12] http://www.mathworks.com/matlabcentral/fileexchange/30546-tracking-yellow-color
[13] http://www.mathworks.com/help/toolbox/imaq/f7-78082.html#f7-76006
[14] http://pratapvardhan.blogspot.com/2010/03/using-webcam-in-matlab-for-image.html
[15] http://www.mathworks.com/help/toolbox/images/ref/rgb2gray.html
[16] http://www.mathkb.com/Uwe/Forum.aspx/matlab/158837/Executable-Java-Undefined-Function
[17] http://www.mathworks.com/matlabcentral/newsreader/view_thread/285933
[18] http://download.oracle.com/javase/1.4.2/docs/api/java/awt/Robot.html

APPENDIX A: MATLAB GUI CODES


function varargout = trial(varargin)
% TRIAL MATLAB code for trial.fig
% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @trial_OpeningFcn, ...
                   'gui_OutputFcn',  @trial_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end
if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT

% --- Executes just before trial is made visible.
function trial_OpeningFcn(hObject, eventdata, handles, varargin)
axes(handles.axes5); imshow('SIM.png');
axes(handles.axes1); imshow('spectrum.png');
axes(handles.axes2); imshow('spectrum.png');
axes(handles.axes7); imshow('matlab.png');
handles.output = hObject;
guidata(hObject, handles);

% --- Outputs from this function are returned to the command line.
function varargout = trial_OutputFcn(hObject, eventdata, handles)
varargout{1} = handles.output;

% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
calib_no = str2num(get(handles.edit1, 'String'));
global z;
global vid;
global t;
t = 1;             % Set initial value of t as 1
Mi = ones(1,100);  % Define a new array for min values
Ma = zeros(1,100); % Define a new array for max values
n = calib_no;
while (t<=n)
    if t==1,
        axes(handles.axes4); imshow('topleft.png');
        set(handles.text9,'String','NOW YOU ARE PERFORMING CALIBRATION NO:1');
        pause;
    end
    calib_vid = handles.vid;
    calib_vid.returnedcolorspace = 'rgb';
    data1 = getsnapshot(calib_vid);


    axes(handles.axes2); imshow(data1);
    dataCropped = imcrop;
    axes(handles.axes6); imshow(dataCropped);
    dataCropped1 = double(dataCropped);
    r = dataCropped1(:,:,1);
    g = dataCropped1(:,:,2);
    b = dataCropped1(:,:,3);
    S = 1-3.*(min(min(r,g),b))./(r+g+b+eps);
    I = (r+g+b)/3;
    th = acos((0.5*((r-g)+(r-b)))./((sqrt((r-g).^2+(r-b).*(g-b)))+eps)); % H value conversion
    H = th;
    H(b>g) = 2*pi-H(b>g);
    H = H/(2*pi);        % performs normalization
    J = H;
    remove = [(J(:)<0.2)];
    J(remove) = [];
    J_lower = min(J(:)); % Minimum of H value
    J_upper = max(J(:)); % Maximum of H value
    Mi(t) = J_lower;     % Store minimum of H values
    Ma(t) = J_upper;     % Store maximum of H values
    t = t+1;
    if t==2 && n>=2,
        set(handles.text9,'BackgroundColor','blue');
        set(handles.text9,'String','NOW YOU ARE PERFORMING CALIBRATION NO:2');
        axes(handles.axes4); imshow('topright.png');
        pause;
    end
    if t==3 && n>=3,
        set(handles.text9,'BackgroundColor','blue');
        set(handles.text9,'String','NOW YOU ARE PERFORMING CALIBRATION NO:3');
        axes(handles.axes4); imshow('bottomright.png');
        pause;
    end
    if t==4 && n>=4,
        set(handles.text9,'BackgroundColor','blue');
        set(handles.text9,'String','NOW YOU ARE PERFORMING CALIBRATION NO:4');
        axes(handles.axes4); imshow('bottomleft.png');
        pause;
    end
    if t==5 && n>=5,
        set(handles.text9,'BackgroundColor','blue');
        set(handles.text9,'String','NOW YOU ARE PERFORMING CALIBRATION NO:5');
        axes(handles.axes4); imshow('centre.png');
        pause;
    end
    if t>6 && n>=6,
        set(handles.text9,'BackgroundColor','blue');
        set(handles.text9,'String','NOW YOU ARE PERFORMING EXTRA CALIBRATIONS');
        axes(handles.axes4); imshow('anywhere.png');
        pause;
    end
end

z1 = min(Mi(:));
z2 = max(Ma(:));
%--------------------------------------------------------------------
% Write z values to storez.txt file
%--------------------------------------------------------------------
fid = fopen('z1.txt','w'); fprintf(fid,'%f',z1);
fid = fopen('z2.txt','w'); fprintf(fid,'%f',z2);
%--------------------------------------------------------------------
stop(calib_vid)
delete(calib_vid)
axes(handles.axes1); imshow('spectrum.png');
axes(handles.axes2); imshow('spectrum.png');
axes(handles.axes4); imshow('finalmsg.png');
set(handles.text9,'BackgroundColor','blue');
set(handles.text9,'String','CALIBRATIONS ARE SUCCESSFULLY COMPLETED');
guidata(hObject, handles);

% --- Executes on button press in pushbutton2.
function pushbutton2_Callback(hObject, eventdata, handles)
global vid;
global z;
axes(handles.axes1);
vid = videoinput('winvideo',1,'YUY2_320X240');
vidRes = get(vid, 'VideoResolution');
imWidth = vidRes(1);
imHeight = vidRes(2);
nBands = get(vid, 'NumberOfBands');
hImage = image( zeros(imHeight, imWidth, nBands) ); % Specify the size of the axes that contains the image
preview(vid,hImage)
handles.vid = vid;
guidata(hObject, handles);

% --- Executes on button press in pushbutton3.
function pushbutton3_Callback(hObject, eventdata, handles)
% No special codes to be written here.
close;
vid = videoinput('winvideo',1);
set(vid,'TriggerRepeat',Inf);
vid.returnedcolorspace = 'rgb';
nFrame = 200;
right_click = 0;
buffer_x = zeros(1,nFrame); % buffer array for storing bc(1)
buffer_y = zeros(1,nFrame); % buffer array for storing bc(2)
range_x = zeros(1,nFrame);  % buffer array for storing X co-ordinate range
range_y = zeros(1,nFrame);  % buffer array for storing Y co-ordinate range
start(vid)

%************************************************************************
% RUN THE DATA ACQUISITION FOR 'nFRAME' TIMES
%************************************************************************
while (vid.FramesAcquired <= nFrame)
    frame_no = vid.FramesAcquired;   % Assign acquired frames
    data1 = getdata(vid,1);          % Store the captured image to the data buffer each cycle
    data = double(data1);
    r = data(:,:,1);                 % Extract RED component
    g = data(:,:,2);                 % Extract GREEN component
    b = data(:,:,3);                 % Extract BLUE component
    % RGB-to-HSI conversion:
    %   theta = acos( 0.5*((r-g)+(r-b)) / sqrt((r-g)^2 + (r-b)(g-b)) )
    S_live = 1 - 3.*(min(min(r,g),b))./(r+g+b+eps);
    I_live = (r+g+b)/3;
    th_live = acos((0.5*((r-g)+(r-b)))./((sqrt((r-g).^2+(r-b).*(g-b)))+eps));
    H_live = th_live;                % H value conversion
    H_live(b>g) = 2*pi - H_live(b>g);
    H_live = H_live/(2*pi);          % performs normalisation
    %--------------------------------------------------------------------
    J_live = medfilt2(H_live,'symmetric');
    %--------------------------------------------------------------------
    % Read the calibrated hue limits back from z1.txt and z2.txt
    fid = fopen('z1.txt');
    pd_l = fscanf(fid, '%f');
    fclose(fid);
    fid = fopen('z2.txt');
    pd_u = fscanf(fid, '%f');
    fclose(fid);
    %--------------------------------------------------------------------
    J_live_up = ((J_live >= pd_l) & (J_live <= pd_u));
    H_live_up = J_live_up.*J_live;
    %--------------------------------------------------------------------
    % Convert the resulting HSI_live image into a binary image
    level = graythresh(H_live_up);
    H_live_up = im2bw(H_live_up, level);
    %--------------------------------------------------------------------
    % Remove all connected components smaller than 300 pixels
    H_live_up = bwareaopen(H_live_up, 300);
    %--------------------------------------------------------------------
    % Label all the connected components in the image;
    % restrict multiple detection via num
    [L,num] = bwlabeln(H_live_up);
    %--------------------------------------------------------------------
    % Compute the total area of the labelled components
    total = bwarea(L);
    if num == 1
        obj = regionprops(L, 'BoundingBox', 'Centroid', 'Area', ...
                          'FilledArea', 'Perimeter', 'Orientation');
        len = length(obj);
        object = 1;
        if (object <= len)
            bc = obj(object).Centroid;
            % Calculation of X value
            if bc(1) == 0
                t_x = 0;
            else
                t_x = bc(1)*4.666;
            end
            t_x = round(t_x);        % assignment added; the original discarded round()'s result
            % Calculation of Y value
            if bc(2) == 300
                t_y = 0;
            else
                t_y = (bc(2)-300)*-1*2.66;
            end
            t_y = round(t_y);
            set(0,'PointerLocation',[t_x t_y]);
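            % NOTE (added): the constants 4.666 and 2.66 above appear to be
            % hand-tuned scale factors for one particular monitor, and
            % 'PointerLocation' is measured from the bottom-left of the
            % screen. A portable alternative (a sketch, not part of the
            % original code) would derive the mapping from the screen size:
            %   scr = get(0,'ScreenSize');               % [left bottom width height]
            %   t_x = round(bc(1) * scr(3)/320);         % frame is 320 px wide
            %   t_y = round(scr(4) - bc(2)*scr(4)/240);  % frame is 240 px high; flip Y
            %   set(0,'PointerLocation',[t_x t_y]);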


            %----------------------------------------------------------------
            % Implement timer functionality
            pre_xx = round(bc(1));        % Assign X co-ordinate
            pre_yy = round(bc(2));        % Assign Y co-ordinate
            buffer_x(frame_no) = pre_xx;  % Store each X co-ordinate against its frame
            buffer_y(frame_no) = pre_yy;  % Store each Y co-ordinate against its frame
            %----------------------------------------------------------------
            % Range computation: check whether the current co-ordinates lie
            % within +/- tol pixels of the previous frame's, i.e. the finger
            % is being held still. A range of 10 was used during testing;
            % the value below still requires fine tuning.
            tol = 5;
            %----------------------------------------------------------------
            % Implement right-click functionality
            if ((frame_no >= 10) && ...
                (buffer_x(frame_no) <= buffer_x(frame_no-1)+tol) && ...
                (buffer_x(frame_no) >= buffer_x(frame_no-1)-tol) && ...
                (buffer_y(frame_no) <= buffer_y(frame_no-1)+tol) && ...
                (buffer_y(frame_no) >= buffer_y(frame_no-1)-tol))
                %------------------------------------------------------------
                % Call the Java Robot class to perform a right click
                import java.awt.Robot;
                import java.awt.event.*;
                mouse = Robot;
                mouse.mousePress(InputEvent.BUTTON3_MASK);
                mouse.mouseRelease(InputEvent.BUTTON3_MASK);
                right_click = 1;          % flag turns on upon right click
                %------------------------------------------------------------
                if (right_click == 1)     % condition to perform left click
                    % Call the Java Robot class to perform a left click
                    import java.awt.Robot;
                    import java.awt.event.*;
                    mouse = Robot;
                    mouse.mousePress(InputEvent.BUTTON1_MASK);
                    pause(0.5);           % pause to select the function
                    mouse.mouseRelease(InputEvent.BUTTON1_MASK);
                    right_click = 0;      % flag turns off upon left click
                end
            end
            object = object + 1;
        end
    end
end
stop(vid);
delete(vid);
close all;
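% NOTE (added): the dwell test above compares each frame only with the one
% immediately before it, so a slow drift can still fire a click. A possible
% refinement -- sketched here as a local helper; the function name and the
% window length n are illustrative and not part of the original program --
% is to require the point to stay within +/- tol pixels over the last n
% frames before clicking:
function held = dwell_detect(buffer_x, buffer_y, k, tol, n)
% True when the last n buffered points all lie within +/- tol pixels of
% point k, i.e. the tracked finger has been held still for n frames.
held = (k > n) && ...
       all(abs(buffer_x(k-n+1:k) - buffer_x(k)) <= tol) && ...
       all(abs(buffer_y(k-n+1:k) - buffer_y(k)) <= tol);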


% --- Executes during object creation, after setting all properties.
function edit1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), ...
                   get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in pushbutton4.
function pushbutton4_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton4 (see GCBO)
% Callback for displaying the colour tracking
%************************************************************************
global vid;
global z;
set(handles.text9,'BackgroundColor','blue');
set(handles.text9,'String','NOW YOU ARE TRACKING CALIBRATED COLOUR');
vid = videoinput('winvideo',1,'YUY2_320X240');
vid.ReturnedColorSpace = 'rgb';
set(vid,'TriggerRepeat',Inf);
vid.FrameGrabInterval = 2;
start(vid);
nFrame = 200;
while (vid.FramesAcquired <= nFrame)
    data1 = getdata(vid,1);
    data = double(data1);
    r = data(:,:,1);                 % Extract RED component
    g = data(:,:,2);                 % Extract GREEN component
    b = data(:,:,3);                 % Extract BLUE component
    S_live = 1 - 3.*(min(min(r,g),b))./(r+g+b+eps);
    I_live = (r+g+b)/3;
    th_live = acos((0.5*((r-g)+(r-b)))./((sqrt((r-g).^2+(r-b).*(g-b)))+eps));
    H_live = th_live;                % H value conversion
    H_live(b>g) = 2*pi - H_live(b>g);
    H_live = H_live/(2*pi);          % performs normalisation
    %--------------------------------------------------------------------
    J_live = medfilt2(H_live,'symmetric');
    %--------------------------------------------------------------------
    % Read the calibrated hue limits back from z1.txt and z2.txt
    %--------------------------------------------------------------------
    fid = fopen('z1.txt');
    pd_l = fscanf(fid, '%f');
    fclose(fid);
    fid = fopen('z2.txt');
    pd_u = fscanf(fid, '%f');
    fclose(fid);
    %--------------------------------------------------------------------
    J_live_up = ((J_live >= pd_l) & (J_live <= pd_u));
    H_live_up = J_live_up.*J_live;
    %--------------------------------------------------------------------
    % Convert the resulting HSI_live image into a binary image
    level = graythresh(H_live_up);
    H_live_up = im2bw(H_live_up, level);
    %--------------------------------------------------------------------
    % Remove all connected components smaller than 300 pixels
    H_live_up = bwareaopen(H_live_up, 300);
    %--------------------------------------------------------------------
    % Label all the connected components in the image;
    % restrict multiple detection via num
    [L,num] = bwlabeln(H_live_up);
    %--------------------------------------------------------------------
    % Compute the total area of the labelled components
    total = bwarea(L);
    obj = regionprops(L, 'BoundingBox', 'Centroid');
    axes(handles.axes4);
    imshow(data1);
    hold on
    len = length(obj);
    object = 1;
    if (object <= len)
        bb = obj(object).BoundingBox;
        bc = obj(object).Centroid;
        rectangle('Position',bb,'EdgeColor','r','LineWidth',2)
        plot(bc(1),bc(2),'-m+');
        % Display the X and Y co-ordinates next to the centroid
        a = text(bc(1)+15, bc(2), strcat('X:', num2str(round(bc(1))), ...
                 ' Y:', num2str(round(bc(2)))));
        set(a, 'FontName', 'Arial', 'FontWeight', 'bold', ...
               'FontSize', 12, 'Color', 'Yellow');
    end
    hold off
end
stop(vid);
delete(vid);
axes(handles.axes4);
imshow('complete.png');
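% NOTE (added): if a run aborts before reaching delete(vid), the stale
% video object can keep the camera locked. One recovery option -- a
% suggestion, not part of the original GUI -- before re-running:
%   delete(imaqfind)   % delete all existing image acquisition objects
%   imaqreset          % reset the Image Acquisition Toolbox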



APPENDIX B: GANTT CHART
