
ROBOT CONTROL USING ANDROID PHONE

OVER BLUETOOTH

Submitted by

Md. Touhidul Islam      ID: 32160200558
Md. Ahosan Habib        ID: 32160200560
Md. Jahidul Islam Samim ID: 32160200559
Md. Yeaqub Ali          ID: 32160200573
Md. Jewel Rana          ID: 32160200591

Supervised by

Abidur Rahman

In partial fulfillment of the requirements
for the Degree of Bachelor of Science

December 2018

Department of Electrical & Electronic Engineering


Northern University Bangladesh
Declaration
We hereby declare that the work presented in this project is the outcome of the
investigation performed by us under the supervision of Abidur Rahman, Lecturer,
Dept. of EEE, Northern University Bangladesh. We also declare that no part of this report
has been or is being submitted elsewhere for the award of any degree or diploma.

……………………………………………..
Md. Touhidul Islam
ID: 32160200558

……………………………………………..
Md. Ahosan Habib
ID:32160200560

……………………………………………..
Md. Jahidul Islam Samim
ID:32160200559

……………………………………………..
Md. Jewel Rana
ID: 32160200591

………………………………………
Md. Yeaqub Ali
ID:32160200573

Board of Examiners

1. Abidur Rahman
(Supervisor)
Lecturer, Dept. of EEE
Northern University Bangladesh (NUB)

2. Mohammad Shafiul Alam


(Examiner)
Senior Lecturer, Dept. of EEE
Northern University Bangladesh (NUB)

3. Tareq Rahman
(Examiner)
Lecturer, Dept. of EEE
Northern University Bangladesh (NUB)

…………………………..
Prof. Dr. Mohd. Ekram Ali Shaikh
Dean, Faculty of Science & Engineering
Head of the Department, EEE (Additional charge)
Northern University Bangladesh (NUB)

ABSTRACT

Smartphones have become essential in our daily life. Android-based smartphones are
becoming ever more powerful and are equipped with several accessories that are useful
for robots. This project describes how to control a robot from a mobile phone through
Bluetooth communication, presents some features of Bluetooth technology, and describes
the components of both the mobile application and the robot. The robot can be moved
forward, backward, left and right by an Android application communicating with an
Arduino over Bluetooth. Bluetooth has changed how people use digital devices at home
and in the office, turning traditional wired digital devices into wireless ones. Here we use
Bluetooth communication to interface a microcontroller with an Android application,
and we use the Arduino software to interface the Bluetooth module with the
microcontroller. According to the commands received from the Android device, the
robot's motion is controlled. We derived simple solutions to provide a framework for
building robots at very low cost, but with the high computation and sensing capabilities
provided by the smartphone that is used as a control device.
ACKNOWLEDGEMENT
At the very outset, all our prayers and thankfulness go to Allah the Almighty for facilitating
this work and for granting us the opportunity to be surrounded by great and helpful people
at Northern University Bangladesh. We would like to express our everlasting gratitude
to our supervisor, Mr. Abidur Rahman, for his valuable encouragement, guidance and
monitoring, without which this work would not have reached the point of fruition; we
ask Allah to reward him on our behalf. From our warm hearts, our fathers and mothers
deserve all the credit here; they have been a source of inspiration to us for years, and we
will never forget their continuous prayers for the sake of our success. No acknowledgement
would be complete without expressing our appreciation and thankfulness to the
maintenance laboratory workers for their support and help.

CONTENTS

Introduction
Overview
……………………………………………………………………………………………14
Motivation ……………………………………………………………………….............14
Project Objectives ……………………………………………………………………… 15
Organization of the report ……………………………………………………………….15

CHAPTER 1 Theoretical Background

1.1 The differential wheeled robot ………………………………………………………16


1.2 Smartphone…………………………………………………………………………..16
1.3 Microcontrollers……………………………………………………………………...17

1.3.1 Arduino ………………………………………………………………………..17

I. Arduino History ………………………………………………………………..17

II. Arduino Uno…………………………………………………………………...17

II.1 The Arduino Platform ……………………………………………….........17

II.2 Hardware …………………………………………………………………..20

1.4 Bluetooth Module……………………………………………………………………20

1.4.1 HC-06 Bluetooth module ………………………………………………….......20

1.5 Motor Driver ………………………………………………………………………...21

1.5.1 Principle of operation ……………………………………………………….....21

1.5.2 The H-bridge …………………………………………………………………..22

1.5.3 Criteria of Selection……………………………………………………………23

1.6 Motors ……………………………………………………………………………….24

1.6.1 DC Motors …………………………………………………………………….24
1.7 Distance Sensors……………………………………………………………………..25

1.7.1 Types of Distance Sensors.................................................................................25

1.7.2 Ultrasonic Sensors.............................................................................................25

I. Principle of operation…………………………………………………………..26

1.7.3 Photoelectric Sensor........................................................................................ ..27

I. TCRT5000 Reflective optical sensor………………………………… ………..27

I.1 Description…………………………………………………………………… 27

I.2 Principle of Operation………………………………………………….. …….28

1.8 Power Supply……………………………………………………………………….. 29

CHAPTER 2 System Hardware Design


Description of the system ………………………………………………………………. 30

2.1 Communication Unit ……………………………………………………………….. 31

2.1.1 UART Protocol……………………………………………………………...…...31

2.1.2 Arduino-Bluetooth connection.............................................................................. 31

2.2 Sensing Unit ………………………………………………………………………....32

2.2.1 Obstacle detection…………………………………………………………... …32

I. Arduino-Ultrasonic connection……………………………………….. ………..32

2.2.2 Colored line detection ………………………………………………………....33

I. Arduino-TCR connection ……………………………………………. ………...34

2.3 Motor Driving Unit …………………………………………………………………35

2.3.1 Arduino, H-Bridge and motors connection…………………………………. ...35

2.4 Overall System…………………………………………………………………… …36

CHAPTER 3 System Software Design
3.1. Android ……………………………………………………………………………. 39

3.1.1. Introduction to Android ………………………………………………….... ….39

3.1.2. Overview on MIT App Inventor Software…………………………………. …39

a. App Inventor Designer……………………………………………………. …….41

b. App Inventor Block Editor………………………………………………............ 42

c. Designing and testing the application…………………………………….. …….42

3.1.3. The mobile application design description……………………………………45

I. Non autonomous approach ……………………………………………………….. 48

I.1 Modes of operation……………………………………………………... …….49

I.1.1 Mode one (Touch Screen command)………………………………. ………49

I.1.2 Mode two (Voice Command)……………………………………… ……….50

I.1.3 Mode three (Phone tilt command)………………………………………….. 51

II. Autonomous approach …………………………………………………… ……….51

3.2. Arduino IDE ……………………………………………………………………...... 51

3.2.1. The Arduino software description…………………………………………. ...53


3.3. Interaction between the application and the Arduino……………………………… 55

3.3.1. Non-autonomous approach interactions…………………………………. 56

a. Touch screen command mode………………………………………………. 56

b. Voice command mode……………………………………………………. …57

c. Phone tilt mode…………………………………………………………........ 58

3.3.2. Autonomous approach interactions………………………………………..59

3.4. Necessary Programs………………………………………………………………...59

3.5. Results and discussion……………………………………………………………... 66
Conclusion……………………………………………………………………………... .69
References

List of Figures
Figure 1.1 Arduino Uno board…………………………………………………………………………………………………..18

Figure 1.2 Arduino Uno Board Specifications…………..……………………………………………………………….20

Figure 1.3 HC-06 Bluetooth module………………..………………………………………………………………………..21

Figure 1.4 H-Bridge Configuration………………………………………………….…………………………………………..22

Figure 1.5 L293D pin outs………………………………………………………………………………………………………….23

Figure 1.6 The used Geared DC motor ………………………………………………………………………………………24

Figure 1.7 HC-SR04 Ultrasonic sensor……………………………………….……………………………………………...25

Figure 1.8 Ultrasonic sensor principle of operation………………………………………………………………...26

Figure 1.9 TCRT5000 sensors and its Schematic diagram…………………………………………………………..27

Figure 1.10 TCRT5000 sensor principle of operation……………………………………………………………..….29

Figure 2.1 Block diagram of Android Smart differential robot ……………………………………………..….31

Figure 2.2 Schematic diagram of Arduino-Bluetooth module interfacing. …….….……………………..32

Figure 2.3 Schematic diagram of Arduino-Ultrasonic interfacing...……………………………………….....33

Figure 2.4 Schematic diagram of Arduino-TCR sensors interfacing ……………………………………..…..34

Figure 2.5 Schematic diagram of Arduino, H-Bridge and motors interfacing………………………….….35

Figure 2.6 Schematic diagram of the overall system circuit………………………………………………….………37

Figure 2.7 Top view of the final differential robot……………………………………………………………..……..38

Figure 3.1 MIT App Inventor…………………………………………………………….……………………………………....40

Figure 3.2 App Inventor Designer…………………………………………………………………………………………....41

Figure 3.3 App Inventor Blocks Editor………………………………………………………………………………….……42

Figure 3.4 QR code……………………………………………………………………………………………………………….…..43

Figure 3.5 Option one-connecting phone or tablet over Wi-Fi-………………………………………………...43

Figure 3.6 Option two-installing and running the emulator in AI2-……………………………..……………44

Figure 3.7 Option three-connecting a phone or tablet with USB cable-………………………………..…..45

Figure 3.8 User interface before connecting to any device……………………………………………………....46

Figure 3.9 Non-visible Component at the bottom of the designer …………………………………….……..46

Figure 3.10 Logic work of the application in non-autonomous routine…..……………………….…….....49

Figure 3.11 canvas positions…………………………………………………………………………………………………..…50

Figure 3.12 View of the Arduino development environment …………………………………..…………......52

Figure 3.13 Flow chart of the app in non-autonomous approach………………………………...……………53

Figure 3.14 Ultrasonic detecting obstacle…………………………………………………………………………………..54

Figure 3.15 Methods of the four basic directions………………………………………………………………………54

Figure 3.16 Flow chart of the robot in autonomous approach…………………………………….………....55

Figure 3.17 Interaction between mobile app and Arduino ………………………………………………………..…..56

Figure 3.18 Interaction between Android application and Arduino in touch screen mode………....57
Figure 3.19 Interaction between app and Arduino in voice command mode…………………………….57
Figure 3.20 Interaction between app and Arduino in phone tilt mode……………………………….……..58

Figure 3.21 Interaction between app and Arduino in non-autonomous approach…………………...59


Figure 3.22 The corresponding application icon………………………………………………………………………...66

Figure 3.23 The final user interface…………………………………………………………………………………………...66

Figure 3.24 The differential robot tested under the non-autonomous mode…………………………….68

LIST OF TABLE

Table1.1 Specifications and parameters of Arduino Uno board…………………………19

Table1.2 Truth table of the H-Bridge switching ………..……………………………….23

Table 3.1 Non- visible component and functions…………..……………………………47

Table 3.2 Visible components…………………………………………...........................48

Table 3.3 Testing the functionality of the system using different modes …………….....67

LIST OF ACRONYMS
PLC: Programmable logic controllers.

SCADA: Supervisory control and data acquisition.

I/P: Input

O/P: Output.

T-On: On timer.

T-Off: Off timer.

M: Memory Bits.

Q: Output in program.

MW: Memory words.

NO: Normally open.

NC: Normally closed.

DCS: Distributed Control System.

HMI: Human machine interface.

VFD: Variable Frequency Drive.

MD: Memory Double Word.

MB: Memory Byte

INTRODUCTION
The following introduction gives a general description of our work, including both the
project's motivation and objectives.

Overview
A robot is usually an electro-mechanical machine guided by computer and electronic
programming. Many robots have been built for manufacturing purposes and can be found
in factories around the world. A smartphone is a mobile phone that performs many of a
computer's functions, typically having a touch-sensitive screen interface, Internet access,
and an operating system capable of running downloaded apps.

Recent statistics show that a quarter of the world is using this sophisticated device,
running under different operating systems such as Android, Symbian, Apple iOS and
BlackBerry OS. Android ranks first, powering hundreds of millions of mobile devices in
most countries around the world. In addition, Google has made the Android development
platform open to everyone around the world, which has drawn millions of developers to
the platform. Although some developers just focus on building apps or games for Android
devices, there are numerous other possibilities as well. One of the possibilities of Android
development is its fusion with Arduino (a microcontroller-based system), which is itself
open-source hardware. The combination of these two development platforms pushed us
to take advantage of the smartphone and its hardware and software features to design and
implement industrial or home applications. One of these applications is to build a smart
Android robot.

Motivation

The several software and hardware features provided with smartphones, and their use in
controlling robots, make the control system more flexible, extendable and low cost. For
that reason, we were motivated to build a smartphone-based robot and merge its
autonomous and non-autonomous approaches under one application. This project focuses
on the outcome of the possible combination of Android and Arduino to make a differential
robot operating under two approaches, a non-autonomous and an autonomous one, which
can be controlled by any Android device using three possible modes: touch screen, voice
command and mobile tilt control.

Project objectives

The purpose of this project is to take advantage of the availability of Android

smartphones to build a smart Android differential robot. Its remote control is implemented

using a purpose-developed Android mobile application.

The mobile phone application enables the user to:

Control the robot using touch screen option

Control the robot using voice commands

Control the robot using mobile tilt

The differential robot is designed such that it can:

Block any user command when an obstacle is in front of the robot

Follow a predefined path on the floor

Detect specific obstacles (objects or walls) at a given specified distance and alert the
user in case of their presence.

Display how far the obstacle is from the robot.

Organization of the report

This report is divided into three chapters. The first chapter describes the theoretical
background, presenting all the components used and their different internal diagrams.
The second chapter describes our hardware system design. The description of our system
software design is given in chapter three. Finally, our report ends with some applications
and suggestions for future work.

Chapter 1

Theoretical Background

This chapter introduces the theory and technical details of the different components used
in our hardware design.

1.1 The differential wheeled robot

A differential wheeled robot is a mobile robot whose movement is based on two separately
driven wheels placed on either side of the robot body. It can thus change its direction by
varying the relative rate of rotation of its wheels. In order to balance the robot, additional
passive wheels or casters may be needed. In the differential wheeled robot, if both wheels
are driven in the same direction at the same speed, the robot goes in a straight line. If both
wheels are turned at equal speed in opposite directions, the robot rotates about the central
point of its axis. Otherwise, the motion depends on the speed and direction of rotation of
each wheel, i.e. if the wheels turn in the same direction but at different speeds, the robot
turns toward the wheel with the smaller speed [1].

1.2 Smartphone

A smartphone is a mobile phone built on a mobile computing platform, with more
advanced computing ability and connectivity than a feature phone. It has many advanced
capabilities, like a powerful CPU, large memory, sensors such as an accelerometer,
proximity sensor and Global Positioning System (GPS), and advanced connectivity
options like Bluetooth and Wi-Fi. Smartphones are powered by different operating
systems such as Symbian, Bada and Android OS. Among all available mobile operating
systems, Android OS has gained significant popularity since its launch in 2008,
overtaking all previous competitors thanks to its open architecture. The Android platform
has revolutionized the application development field for cellphones, opening new doors
for technical exploration [2].

1.3 Microcontrollers

Microcontrollers, sometimes abbreviated μC or MCU, are integrated circuits that act
as small computers used in embedded, automatically controlled products or devices.
Microcontrollers have a structure similar to that found in regular computers, integrating in
a single circuit a processor core, memory, and programmable input and output peripherals.

1.3.1 Arduino

Arduino is a microcontroller board that contains an on-board power supply, a USB port
to communicate with a PC, and an Atmel microcontroller chip. It simplifies the process of
creating any control system by providing a standard board that can be programmed and
connected to the system without the need for any sophisticated PCB (printed circuit
board) design and implementation. It is open-source hardware. [4]

I. Arduino History

The Arduino project first began in 2005 at the Interaction Design Institute Ivrea (IDII),
but its dawn goes back to 2002, when Massimo Banzi, co-founder of Arduino, was
appointed as an associate professor at IDII to promote modern ways of interactive design.
Banzi wanted to offer his students something modern and inexpensive, so everybody
could carry out their work without many obstacles. By then, the most used tool in the
market was the BASIC Stamp (Parallax 2012), which was expensive, so Banzi wanted to
develop a better alternative. Banzi was also involved in Processing, the programming
language. With the help of a Colombian student, Hernando Barragán, who was working
on the Wiring platform, they tried to bring Processing to hardware and make it simpler
and easier to use. After working on the project, they came up with a prototype, which was
the birth of Arduino. With IDII funding running out, Banzi and the co-founders decided
to make the project open source (Open Source 2012), so that the product would live on
and improve. The hardware was then complete; the only remaining part was the software,
which was later built in collaboration with other team members. [3]

II. Arduino Uno

II.1 The Arduino Platform

The Arduino platform is an open source electronic prototyping system. It is composed of
two parts: the Arduino Uno board and the Arduino IDE (Integrated Development
Environment).

The Uno board (Figure 1.1) is designed to provide an easy-to-use, human-changeable pin
interface to the Atmel AVR ATmega microcontroller, the heart of the Arduino hardware.
Arduino builds on this by adding simplicity to the hardware interface and an easy-to-use
software package. Arduino is meant to be used as a physical computing platform, that is,
to use the electronic hardware to interface with humans through sensors and actuators
controlled by software executed by a computer.

Figure 1.1 Arduino Uno board

The Arduino IDE is the software environment used to create the programs, called
“sketches,” that will be executed by the Arduino hardware. The IDE uses a modified C
language compiler to build, translate, and transmit the code to the microcontroller board
[3].

II.2 Hardware

The Arduino Uno is a microcontroller board based on the ATmega328. It has 14 digital
input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs (A0-A5) that
provide an Analog-to-Digital Converter (ADC) with 10-bit resolution, a 16 MHz crystal
oscillator, a USB connection, a power jack, an ICSP header, and a reset button, as shown
in figure 1.2. It contains everything needed to support the microcontroller; simply connect
it to a computer with a USB cable, or power it with an AC-to-DC adapter or battery, to get
started. Regarding memory, the ATmega328 has 32 kB of flash memory for storing code
(of which 0.5 kB is used by the bootloader), 2 kB of SRAM and 1 kB of EEPROM. The
Arduino Uno board specifications are summarized in table 1.1 [5].

Table 1.1 Specifications and parameters of the Arduino Uno board [5]

Parameter                  Specification
Microcontroller            ATmega328
Operating Voltage          5 V
Input Voltage              7-12 V
Digital I/O Pins           14
Analogue Input Pins        6
DC Current per I/O Pin     40 mA
DC Current for 3.3 V Pin   50 mA
Flash Memory               32 kB
SRAM                       2 kB
EEPROM                     1 kB
Clock Speed                16 MHz

Figure 1.2 Arduino Uno Board Specifications [6]

1.4 Bluetooth Module

Bluetooth is a technology developed to eradicate the need for wires to communicate
among different devices. It is a wireless technology that has been a major innovation, as
it makes communication robust, easy, low-cost and energy-efficient. Most devices today
have adopted this technology. It stands out among other wireless technologies because it
gives developers both link-layer and application-layer definitions, allowing the support of
both data and voice communications [3].

1.4.1 HC-06 Bluetooth module

The HC-06 module (figure 1.3) is an easy-to-use Bluetooth SPP (Serial Port Protocol)
module, designed for transparent wireless serial connection setup. It is a fully qualified
Bluetooth V2.0+EDR (Enhanced Data Rate) module, offering 3 Mbps modulation with a
complete 2.4 GHz radio transceiver and baseband [7].

Figure1.3 HC-06 Bluetooth module

Pin description:

VCC: supply voltage, in the range of 3.6 V to 6 V.

GND: Ground.

TXD: serial output of the module, to be connected to the RX of the microcontroller.

RXD: serial input of the module, to be connected to the TX of the microcontroller.

1.5 Motor Driver

Generally, even the simplest robot requires a motor to rotate a wheel or perform a
particular action. Since motors require more current than a microcontroller pin can
typically provide, we need some type of switch (transistor, MOSFET, relay, etc.)
which can accept a small current, amplify it, and generate a larger current. This entire
process is done by what is known as a motor driver.

1.5.1 Principle of operation

A motor driver is basically a current amplifier which takes a low-current signal from the
microcontroller and gives out a proportionally higher-current signal which can control
and drive a motor. In most cases, a transistor can act as a switch and perform this task,
driving the motor in a single direction. Turning a motor ON and OFF requires only one
switch to control a single motor in a single direction. If we want to reverse the direction
of the motor, we need to reverse its polarity. This can be achieved by using four switches
arranged in an intelligent manner such that the circuit not only drives the motor but also
controls its direction. One of the most common such designs is the H-bridge circuit,
where transistors are arranged in a shape that resembles the English alphabet "H".

Figure 1.4 H-Bridge Configuration

1.5.2 The H-bridge

An H-bridge is an electronic circuit that enables a voltage to be applied across a load
in either direction. These circuits are often used in robotics and other applications to
allow DC motors to run forwards and backwards.

An H-bridge is built with four switches (Figure 1.4): A, B, C and D. Turning these
switches ON and OFF can drive a motor in different ways, as summarized in Table 1.2:

Table 1.2 Truth table of the H-Bridge switching.
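The switching logic can also be sketched in code. Since the exact mapping depends on how switches A, B, C and D are placed in Figure 1.4, the sketch below assumes one common layout, where A and B form the left leg (high side and low side) and C and D the right leg; it illustrates the truth-table logic, not the project's exact wiring.

```cpp
#include <string>

// H-bridge state classifier (assumed layout: A = left high-side,
// B = left low-side, C = right high-side, D = right low-side).
std::string hBridgeState(bool A, bool B, bool C, bool D) {
    // Closing both switches of one leg shorts the supply to ground.
    if ((A && B) || (C && D))
        return "short circuit (forbidden)";
    if (A && D) return "motor runs forward"; // current: left-high -> right-low
    if (C && B) return "motor runs reverse"; // current: right-high -> left-low
    return "motor off";                      // no complete current path
}
```

The "forbidden" combinations are exactly why motor driver ICs such as the L293D, discussed next, gate the switch pairs internally.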

1.5.3 Criteria of Selection

The choice of the H-bridge has a very important impact on the success of driving any
kind of motor. The L293D (figure 1.5) is a dual H-bridge motor driver integrated circuit
(IC) which contains two built-in H-bridge driver circuits. In its common mode of
operation, two DC motors can be driven simultaneously, both in forward and reverse
direction. The L293D has an output current of 600 mA and a peak output current of 1.2 A
per channel. Moreover, output diodes are included within the L293D to protect the circuit
from back EMF. The external output supply has a wide range, from 4.5 V to 36 V, which
has made the L293D a good choice for driving DC motors [8].

1.6 Motors

When coming to build a robot, the designer always faces two options: whether to use a
DC motor or a stepper motor. When it comes to speed, weight, size and cost, DC motors
are always preferred over stepper motors. There are many things one can do with a DC
motor when it is interfaced with a microcontroller; for example, one can control the speed
of the motor as well as its direction of rotation.

1.6.1 DC Motors

An electric motor is an electromechanical device that converts electrical energy into
mechanical energy. Electric motors can be powered by a direct current source, such as
batteries. The microcontroller commands these motors through the driver circuit to take
the necessary action. A geared DC motor (figure 1.6) has a gear assembly attached to the
motor. The gear assembly helps in increasing the output torque and reducing the speed.

Figure 1.6 The used Geared DC motor

Geared DC motor advantages and disadvantages:

The advantage of using gear motors is that they are readily available in many sizes,
provide a lot of torque for the power consumed, and are available with a wide choice of
output speeds.

The main disadvantage is that gear motors are not precise. That is, two motors of the
same model, manufactured on the same day and operated with identical currents and
voltages, will NOT turn at exactly the same rate. Thus a robot with two drive motors will
not move in a straight line without some way of controlling the individual motor
speeds [9].

1.7 Distance Sensors

One kind of sensor used extensively in robotics and the automotive industry is the
proximity sensor, also known as a distance sensor. A proximity sensor is a sensor able to
detect the absence, presence or distance of an object in a predefined range without any
physical contact [10].

1.7.1 Types of Distance Sensors

Different targets demand different sensors. Depending on the principle of operation, each
type of sensor will have different performance levels for sensing different types of
objects. Common types of non-contact proximity sensors include inductive proximity
sensors, capacitive proximity sensors, ultrasonic proximity sensors, and photoelectric
sensors.

1.7.2 Ultrasonic Sensors

Ultrasonic sensors work on a principle similar to radar or sonar, which evaluate attributes
of a target by interpreting the echoes from radio or sound waves respectively. The
HC-SR04 ultrasonic sensor (figure 1.7) uses sonar to determine the distance to an object;
it offers excellent non-contact range detection, with an accuracy that can reach 3 mm and
stable readings, in an easy-to-use package, over a range from 2 cm to 400 cm.

Figure 1.7 HC-SR04 Ultrasonic sensor

Pin description

Vcc: 5V power supply.

Trig: Trigger pin.

Echo: Receive pin.

GND: Power ground.

I. Principle of Operation

The ultrasonic sensor transmits an ultrasonic wave and produces an output pulse that
corresponds to the time required for the burst echo to return to the sensor. By measuring
the echo pulse width, the distance to the target can easily be calculated, since the pulse
width of the echo is proportional to the distance travelled, as shown in Figure 1.8 [11].

The detected distance can be calculated by formula (1.1):

distance = (echo pulse width × speed of sound) / 2        (1.1)

Figure 1.8 Ultrasonic sensor principle of operation [10]

1.7.3 Photoelectric Sensor

A photoelectric sensor is a device that detects a change in light intensity. Typically, this
means either detection or non-detection of the sensor's emitted light source. The type of
light and the method by which the target is detected vary depending on the sensor.

I. TCRT5000 Reflective optical sensor

I.1 Description

The TCRT5000 (figure 1.9) is a reflective sensor which includes an infrared emitter and
a phototransistor in a leaded package which blocks visible light.

The TCRT5000 has a compact construction in which the emitting light source and the
detector are arranged facing in the same direction, so as to sense the presence of an object
by means of the IR beam reflected from the object. The operating wavelength is 950 nm.
The detector consists of a phototransistor [12].

Figure 1.9 TCRT5000 sensors and its Schematic diagram

The TCRT5000 applications include:

Position sensor for shaft encoders

Detection of reflective material such as paper, IBM cards, magnetic tapes, etc.

Limit switch for mechanical motions in VCRs

General purpose, wherever space is limited.

I.2 Principle of Operation

Reflective sensors incorporate an infrared emitter and a photodetector adjacent to each
other. As shown in figure (1.10), when an object is in the sensing area, the emitted light is
reflected back towards the photodetector and the amount of light energy reaching the
detector increases. This change in light energy, or photocurrent, is then used as an input
signal in the application.

Figure 1.10 TCRT5000 sensor principle of operation

1.8 Power Supply

A regulated power supply is an embedded circuit whose function is to supply a stable
voltage to a circuit or device that must be operated within certain power supply limits.
It is used here to supply power to the microcontroller and the device circuits. Two power
supplies are used: a 9 V supply for the microcontroller (Arduino), and a 6 V supply
(4 × 1.5 V batteries in series) to drive the two DC motors.

CHAPTER 2

System hardware design

This chapter discusses the system hardware design and the interconnection between all
the parts included in that design, in order to satisfy the requirements of the system.

Description of the system

The system consists of the following entities: a mobile device and a differential robot,
which includes the following parts: a microcontroller-based circuit (Arduino), two DC
motors, a motor driving circuit, and a sensing circuit.

The Android smartphone acts as the remote control of the system, and Bluetooth acts as
the connection link between the differential robot and the Android smartphone. The
microcontroller, the brain of the robot, acts as an intermediary between the mobile device
and both the actuators and the sensing circuits. The block diagram of the system is shown
in Figure 2.1. Instructions are sent from the Android mobile phone to the microcontroller
via a Bluetooth module. Then, based on both the type of the received instruction and the
actual values given by the external sensors, the microcontroller controls the two DC
motors accordingly. Once the robot successfully obeys the given instruction and runs in
the desired mode of operation, the microcontroller sends the actual state of the robot back
to the mobile device, which then displays it on its screen. The different parts of the system
can be divided into three units: the communication, motor driving and sensing units.

Figure 2.1 Block diagram of Android Smart differential robot

2.1 Communication Unit

2.1.1 UART Protocol

The mobile application is connected to the hardware (robot) through a continuous
Bluetooth connection using the UART protocol. A universal asynchronous
receiver-transmitter is a piece of computer hardware that translates data between
parallel and serial forms. The UART takes bytes of data and transmits their
individual bits from the source (the mobile robot) in a sequential fashion. At the
destination (the mobile device), a second UART reassembles the bits into complete
bytes. Each UART contains a shift register, which is the fundamental method of
conversion between serial and parallel forms. A similar communication process is
performed in the reverse direction.

The Arduino hardware has built-in support for serial communication on pins 0 (RX)
and 1 (TX). This native serial support is provided by a piece of hardware built into
the chip. The UART is used for communication between the Arduino board and the
mobile device.
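The parallel-to-serial conversion described above can be sketched in plain C++. This is an illustrative simulation of the shift-register idea, not the actual Arduino UART hardware; the LSB-first bit order matches common UART framing:

```cpp
#include <vector>

// Simulate a UART shift register: shift one byte out LSB-first,
// one bit per clock, as the transmitting side does.
std::vector<int> uart_transmit(unsigned char byte) {
    std::vector<int> bits;
    for (int i = 0; i < 8; ++i)
        bits.push_back((byte >> i) & 1);
    return bits;
}

// The receiving side shifts the bits back in and reassembles the byte.
unsigned char uart_receive(const std::vector<int>& bits) {
    unsigned char byte = 0;
    for (int i = 0; i < 8; ++i)
        byte |= static_cast<unsigned char>(bits[i] << i);
    return byte;
}
```

A transmitted byte round-trips unchanged, which is exactly the property the two UARTs at either end of the Bluetooth link rely on.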

2.1.2 Arduino-Bluetooth connection

The connection between the Bluetooth module and the Arduino board is shown in figure
2.2.

Figure 2.2 Schematic diagram of Arduino-Bluetooth module interfacing

The Bluetooth module needs to be powered through its Vcc pin and, to operate safely,
should be supplied with +3.3 V. The module's Vcc is therefore connected to the
Arduino's 3.3 V power pin so that the module is not damaged.

The TX pin is used to send data from the module to the Arduino. It needs to be
connected to the serial receive pin (RX) of the Arduino board, located at pin 0.

The RX pin is used to receive data from the Arduino. It needs to be connected to the
serial transmit pin (TX) of the Arduino board, located at pin 1.

2.2 Sensing Unit

2.2.1 Obstacle detection

The ultrasonic sensor is used in the system to detect the presence of obstacles in
the path of the robot. The sensor is placed at the front of the robot, and any
object faced at a distance of 15 cm or less is considered an obstacle.

a. Arduino-Ultrasonic connection

The interfacing of the ultrasonic sensor to the Arduino is shown in figure 2.3.

Figure 2.3 Schematic diagram of Arduino-Ultrasonic interfacing
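The distance computation behind the 15 cm threshold can be sketched as follows. This assumes an HC-SR04-style sensor that reports an echo pulse width in microseconds; the speed-of-sound constant and the threshold check are the only ingredients:

```cpp
// Convert an ultrasonic echo pulse width (microseconds) to a distance in cm.
// Sound travels roughly 0.0343 cm per microsecond; the echo covers the
// round trip to the object and back, so the result is halved.
double echo_to_cm(double duration_us) {
    return duration_us * 0.0343 / 2.0;
}

// Apply the robot's obstacle rule: anything at 15 cm or closer is an obstacle.
bool is_obstacle(double duration_us) {
    return echo_to_cm(duration_us) <= 15.0;
}
```

For example, a 580 microsecond echo corresponds to roughly 10 cm and is flagged as an obstacle, while a 1000 microsecond echo (about 17 cm) is not.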

2.2.2 Colored line detection

The robot is designed to follow any predefined colored line on a colored background
(a black line on a white surface, for simplicity). To distinguish the black surface
from the white one, any light-sensitive sensor can be used so that the robot follows
the track.

The TCRT5000 reflective optical sensor is what we use to detect black lines. The
sensor has an infrared emitter (LED) and a phototransistor inside the module.

The TCRT5000's working principle is to use the difference in infrared reflectivity
between colors to convert the strength of the reflected signal into a current
signal. The black-and-white tracing module outputs a high logic level when it
detects a black area and a low logic level when it detects a white area.

a. Arduino-TCR connection

The interfacing of TCR sensors to the Arduino is shown in figure 2.4.

Figure 2.4 schematic diagram of Arduino-TCR sensors interfacing

To differentiate the black line from the white surface, four TCR sensors are placed
under the robot so that it can navigate and follow the predefined track. The emitter
and detector are mounted side by side, facing the same direction; a
phototransistor's output current is determined by the amount of light falling on it.
The TCRT5000 module therefore works as follows: the infrared LED emits light, which
is reflected by the surface it points at, and the output current of the
phototransistor changes based on the amount of light reflected, which depends on the
color of the surface. To read the voltage across the phototransistor, which varies
with the color of the surface, digital inputs on the Arduino (pins 4, 5, 6 and 13)
are used.
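A hypothetical steering rule built on the four sensor readings might look as follows. The exact decision table depends on the sensor spacing relative to the line width, so this is a sketch of the idea, not the sketch used on the robot; readings are ordered left to right, with true meaning black detected:

```cpp
#include <string>

// Hypothetical line-follower decision from four TCRT5000 readings
// (s0..s3 ordered left-to-right under the robot; true = black detected).
std::string steer(bool s0, bool s1, bool s2, bool s3) {
    if (s1 && s2) return "forward"; // line centred under the robot
    if (s0 || s1) return "left";    // line drifting toward the left sensors
    if (s2 || s3) return "right";   // line drifting toward the right sensors
    return "stop";                  // line lost
}
```

The outer sensors (s0, s3) act as an early warning for sharp curves, while the inner pair keeps the robot centred on the track.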

2.3 Motor Driving Unit

Since our robot should move in all directions at different speeds, a motor driving
circuit is needed for the robot to operate properly. The two DC motors in the system
are controlled using both the microcontroller and the L293D H-bridge circuit. The
speed of a DC motor can be controlled using a technique known as Pulse Width
Modulation (PWM). When PWM is used, power is not supplied continuously; it is
provided as a square wave with a frequency ranging from around 60 Hz to 50 kHz.

By changing the pulse width, the power supplied to the motor can be adjusted. At a
duty cycle of zero percent the motor is at rest; at one hundred percent the motor
runs at full speed. The PWM percentage used depends on the application at hand.
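The relationship between duty cycle and delivered power can be sketched numerically. On the Arduino, analogWrite() maps the range 0 to 255 onto 0 to 100 percent duty, so the average voltage seen by the motor follows directly:

```cpp
// Average voltage delivered by a PWM signal: Vavg = Vsupply * duty cycle.
// Arduino's analogWrite() maps 0..255 onto a 0..100% duty cycle.
double pwm_average_voltage(double v_supply, int analog_write_value) {
    double duty = analog_write_value / 255.0;
    return v_supply * duty;
}
```

With the robot's 6 V motor supply, analogWrite(255) delivers the full 6 V on average, analogWrite(0) delivers nothing, and a mid-range value such as 128 delivers roughly 3 V.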

2.3.1 Arduino, H-Bridge and motors connection

The interfacing of the Arduino, L293D and the two dc motors is shown in figure 2.5.

Figure 2.5 Schematic diagram of Arduino, H-Bridge and motors interfacing

To drive the two DC motors, a dual H-bridge (L293D) circuit is connected between the
microcontroller (Arduino) and the two motors. The possible directions in which the
two motors rotate are controlled by the input logic at pins 2 and 7 for the left
motor and pins 10 and 15 for the right one. The speed of the motors is controlled by
connecting the H-bridge enable pins 1 and 9 to Arduino PWM pins 9 and 10,
respectively.

2.4 Overall System

The overall system is built by interconnecting all the parts used. The system
transmits and reads data, processes it, and controls the motors to accomplish the
desired task. Before implementing the final prototype, each hardware component's
functionality was first tested through the Arduino integrated development
environment.

The overall system circuit is shown in Figure 2.6.

Figure 2.6 Schematics diagram of the overall system circuit

The following figure 2.7 shows the final built differential robot:

Figure 2.7 Top view of the final differential robot

CHAPTER 3

System Software Design

This chapter deals with the description of our developed mobile application, and the built

Arduino sketch that is used to translate the user application commands into actions to
control the robot. The obtained results are discussed.

3.1 Android

3.1.1 Introduction to Android

Android is an operating system based on the Linux kernel, with a user interface
based on direct manipulation, designed primarily for touchscreen mobile devices such
as smartphones and tablet computers. The operating system uses touch inputs that
loosely correspond to real-world actions, like swiping, tapping, pinching, and
reverse pinching to manipulate on-screen objects, and a virtual keyboard. Despite
being primarily designed for touchscreen input, it has also been used in
televisions, games consoles, digital cameras, and other electronics [13].

3.1.2 An Overview on MIT App Inventor Software

The software, i.e. the Android application for our work, was designed using a very
innovative product initially provided by Google and now maintained by the
Massachusetts Institute of Technology (MIT), known as MIT App Inventor (App Inventor
2012). The software was previously called Google App Inventor; it was released
publicly on December 15, 2010, only to be terminated one year later in December
2011. The product is now under the MIT Centre for Mobile Learning under the name MIT
App Inventor. App Inventor allows its users to develop different kinds of Android
apps in just a web browser. A user needs a Google account to get started, and the
App Inventor servers store and keep track of all user uploads [3].

As shown in figure 3.1, the application building process in App Inventor involves
three aspects:

(i) App inventor designer

(ii) App Inventor Blocks editor

(iii) An emulator or Android Phone

The set-up process for the software is very easy. The system requirements are very basic
and it is compatible with Mac OSX, Windows and Linux Operating systems. Browsers
required for the software are Mozilla Firefox 3.6 or higher, Apple Safari 5.0 or higher,
Google Chrome 4.0 or higher and Microsoft Internet Explorer 7.0 or higher [14].

Figure 3.1 MIT App Inventor

a) App Inventor Designer

The first phase of application design goes through the App Inventor Designer; a
snapshot of its general view is shown in figure 3.2 below. The Designer is
accessible through the web page, and all the ingredients for the app are available
on the left side of the window. These include elements such as a screen for the app,
buttons for tapping, text boxes, images, labels, animations and many more. The right
side of the Designer allows users to view the screen and the components added to it.

Additionally, the properties section of the window allows users to modify the
properties of components. Adding components to the screen is a simple drag-and-drop
process, and the alignment of components can be managed through the alignment
options. The figure below shows the features added to our mobile application.
Several non-visible components are also added to the screen; these are explored
later in the Blocks Editor.

Figure 3.2 App Inventor Designer

b) App Inventor Block Editor

After the design process is complete, for the app to function as desired, users go
through the Blocks Editor. The App Inventor Blocks Editor uses the Open Blocks Java
library; these open blocks combine to create visual blocks of a programming
language. The blocks are thus the program code, which can be dragged and cemented
together with other blocks to create the desired functional program, as shown in
figure 3.3 below. The Editor can be opened from the options available in the App
Inventor Designer, which launches a Java applet for the Blocks Editor [14].

Figure 3.3 App Inventor Blocks Editor

c) Designing and testing the application

The final part of application design is testing the application. Three options for
testing the application are available:

Option one (Connecting a Phone or Tablet over Wi-Fi):

Figure 3.4 QR code

App Inventor can be used without downloading anything to the computer; apps are
developed on the website ai2.appinventor.mit.edu. To do live testing on the Android
device, the MIT App Inventor Companion app needs to be installed on the Android
phone or tablet from the Play Store. The project is then opened in App Inventor on
the web, the Companion is opened on the device, and the built app can be tested by
scanning the QR code (Quick Response Code) generated for it with a QR scanner. An
example of the QR code of a developed App Inventor application, which is a 2D
version of the traditional barcode, is shown in figure 3.4.

Figure 3.5 Option one: Connecting a Phone or Tablet over Wi-Fi

Option two (Installing and Running the Emulator in AI2)

For users without Android handsets, App Inventor offers the option of testing the
application in an emulator, a virtual mobile device very similar to a real one but
with some limitations. These include the inability to use the touch screen or feel
the phone vibrate, and no support for USB and Bluetooth connections.

Figure 3.6 Option two: Installing and Running the Emulator in AI2

Option three (Connecting to a phone or tablet with a USB cable):

The user can also connect the Android phone directly to the computer via a USB
connector and test the application.

Figure 3.7 Option three: Connecting to a phone or tablet with a USB cable

3.1.3 The mobile application design description

The mechanism of our project is made up of two approaches, a non-autonomous and an
autonomous one. The application interface is designed to offer the user easy control
of the robot using three different command modes under the non-autonomous approach,
i.e. touch screen mode, voice command mode and phone tilt (moving the phone in the
plane) mode. In addition, a simple click on a check button is enough to put the
robot into the autonomous approach remotely. The final user interface of the created
app is shown in figure 3.8.

Figure 3.8 User interface before connecting to any device

To create the user interface of the app, some visible and non-visible components
were used; the non-visible components used are shown in figure 3.9.

Figure 3.9 Non-visible Component at the bottom of the designer

Table 3.1 lists the non-visible components, the Designer palette each belongs to,
and the function it performs on the robot:

Table 3.1 Non-visible components and their functions

Communication between the mobile phone and the robot is done serially over
Bluetooth, so a correct Bluetooth connection must first be made. For this task, a
button at the top of the application interface is designed so that, once clicked, a
list of the available Bluetooth MAC addresses appears. When a particular MAC address
is selected, the status "Connected" is shown on the screen; all the buttons then
become active, the app is connected to the robot, and the mobile phone can control
it. Once the robot is properly connected, three modes of operation are allowed:
touch screen, voice command and phone tilt. The visible components used in the two
approaches, the Designer palette they belong to, and the functions they perform are
summarized in table 3.2.

Table 3.2 Visible components

I. Non-autonomous approach

The principal sequence of the application in the non-autonomous approach is
described in the flowchart given in figure 3.10.

Figure 3.10 The logic work of the application in non-autonomous approach

I.1 Modes of operation

I.1.1 Mode one (Touch Screen command)

To make the mobile application operate in this mode, a Canvas is designed as shown
in figure 3.11. When the user's finger is placed on the Canvas, the position is
sensed and the related robot direction is asserted accordingly. The phone vibrates
simultaneously with each performed action (a vibration 50 ms in length) and displays
the direction the robot is driven in on the phone's screen.

To control the differential robot's speed, the mobile application is provided with
sliders. When the slider is scrolled, the corresponding speed is displayed next to
the slider on the screen.

Figure 3.11 Canvas positions
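The Canvas-position-to-direction mapping can be sketched as follows. The actual region boundaries are defined in the App Inventor blocks, so the bands below are an assumption for illustration, using the direction opcodes ('F', 'B', 'L', 'R') sent to the robot:

```cpp
// Hypothetical mapping of a touch point on the Canvas to a direction opcode.
// The Canvas is assumed to be w x h pixels with (0,0) at the top-left; the
// real region layout is set in the App Inventor blocks editor.
char touch_to_opcode(double x, double y, double w, double h) {
    if (y < h / 3.0) return 'F';       // top band: forward
    if (y > 2.0 * h / 3.0) return 'B'; // bottom band: back
    if (x < w / 2.0) return 'L';       // middle-left: left
    return 'R';                        // middle-right: right
}
```

Each touch is thus reduced to a single opcode character before being sent over the Bluetooth link.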

I.1.2 Mode two (Voice Command)

Speech recognition is a technology by which the system understands the spoken word
(not its meaning). When the user chooses the voice command option by checking its
corresponding box, the desired direction should be spoken. The speech recognizer,
which is already built into Android devices, converts the voice command into text,
and the corresponding task is performed on the robot.

I.1.3 Mode three (Phone Tilt command)

When the user chooses the third option, after checking its box, the phone is
oriented and positioned toward the desired direction; the accelerometer in the phone
then senses the coordinates of the position, assigns the direction, and performs the
task on the robot.
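The tilt-to-direction rule can be sketched as follows. The dead-band threshold and axis conventions are assumptions for illustration; the actual coordinate ranges are chosen in the App Inventor blocks editor:

```cpp
#include <cmath>
#include <string>

// Hypothetical tilt-to-direction rule: the accelerometer's x/y readings
// are compared against a dead-band threshold. Within the dead band the
// robot stays stopped; otherwise the dominant axis picks the direction.
std::string tilt_to_direction(double ax, double ay, double threshold = 3.0) {
    if (std::fabs(ax) < threshold && std::fabs(ay) < threshold)
        return "stop";
    if (std::fabs(ay) >= std::fabs(ax))
        return ay > 0 ? "forward" : "back";
    return ax > 0 ? "left" : "right";
}
```

The dead band keeps the robot from reacting to the small tilts of a hand simply holding the phone level.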

II. Autonomous approach

When the corresponding autonomous box is checked, the robot enters its autonomous
approach and follows a black path with the ability to detect obstacles. If an
obstacle is detected, an alert message is displayed accordingly.

3.2 Arduino IDE

The Arduino IDE is the software environment used to create the programs, called
"sketches", that are executed by the Arduino hardware. The IDE uses a modified
C-language compiler to build, translate, and transmit the code to the
microcontroller board. The Arduino software runs on Windows, Macintosh OS X, and
Linux operating systems, whereas most microcontroller systems are limited to
Windows. A snapshot of the Arduino IDE is shown in figure 3.12.

Figure 3.12 View of the Arduino development environment

To create a runnable cyclic-executive program in Arduino, a sketch should be written
consisting of two main functions, setup() and loop(). The setup() function is called
when a sketch starts and is used to initialize variables, pin modes, libraries, etc.
It runs only once, after each power-up or reset of the Arduino board. After setup()
has initialized and set the initial values, the loop() function does precisely what
its name suggests and loops consecutively, allowing the program to change and
respond.

3.2.1 The Arduino software description

The sketch built for the non-autonomous approach is summarized in the following flow chart.

Figure 3.13 Flow chart of the app in non-autonomous approach

Figure 3.14 illustrates the tasks performed by the ultrasonic sensor to detect
obstacles in the robot's path.

Figure 3.14 Ultrasonic sensor detecting an obstacle

Figure 3.15 illustrates the basic tasks of the implemented methods Go_straight(),
Go_reverse(), Turn_left(), Turn_right() and Stop(), respectively, in the
non-autonomous approach.

Figure 3.15 Methods of the four basic directions

The sketched program for the autonomous mode, and the basic instructions assigned
for the robot to detect and follow the black path, are illustrated by the following
flow chart given in figure 3.16.

Figure 3.16 Flow chart of the robot in autonomous approach

3.3 Interaction between the application and the Arduino

In the following, a set of figures and block diagrams shows the relation between the
mobile application and the Arduino. They demonstrate how tasks performed on the
mobile are translated into actions through the Arduino for both the autonomous and
non-autonomous approaches. Figure 3.17 shows the interaction between the mobile app
and the Arduino.

Figure 3.17 Interaction between mobile app-Arduino

3.3.1 Non-autonomous approach interactions

The working mechanism of the robot is based on the information passed from the
Android mobile phone to the differential robot via the Bluetooth connection. To
translate user commands into actions assigned to the differential robot, opcodes are
assigned to every user command. These opcodes are interpreted by the microcontroller
into methods, each built to perform a specific task on the motors and bring the
differential robot into different actions. The task performed on the motors is
displayed on the mobile screen.
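The opcode-to-method mapping described above can be sketched as a look-up table. The direction opcodes below are the ones used by the sketch in section 3.4; representing the table as a map is an illustrative choice, since the real sketch uses a switch statement:

```cpp
#include <map>
#include <string>

// Illustrative opcode table: each single-character opcode sent over
// Bluetooth names the method the microcontroller should run.
std::string opcode_to_action(char opcode) {
    static const std::map<char, std::string> table = {
        {'F', "forward"},     {'B', "back"},
        {'L', "left"},        {'R', "right"},
        {'G', "forwardleft"}, {'I', "forwardright"},
        {'H', "backleft"},    {'J', "backright"},
    };
    auto it = table.find(opcode);
    return it != table.end() ? it->second : "unknown"; // unrecognized opcode
}
```

An unrecognized opcode maps to "unknown" rather than an action, so a corrupted byte on the serial link leaves the motors untouched.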

a. Touch screen command mode

The opcodes generated from each user command in the touch screen mode, and the
corresponding interpretation by the Arduino, are illustrated in figure 3.18 below.

Figure 3.18 Interaction between Android application and Arduino in touch screen mode

b. Voice command mode

The speech of the user is first converted to text using the speech recognition
engine built into the mobile device. The corresponding text is sent to the Arduino,
where it is compared with the set of texts already stored in a look-up table. Once a
match is found, the corresponding action is assigned and the appropriate actions are
performed on the robot's motors; otherwise the user is asked to re-enter the vocal
command. This interaction is illustrated in figure 3.19.

Figure 3.19 Interaction between app and Arduino in voice command mode
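The text look-up step can be sketched as follows. The exact vocabulary stored in the table is an assumption for illustration; the '?' result models the "ask the user to re-enter the command" branch:

```cpp
#include <map>
#include <string>

// Sketch of the look-up described above: the recognized text is matched
// against stored command words. A word with no match yields '?', which
// corresponds to asking the user to repeat the vocal command.
// The vocabulary below is a hypothetical example.
char text_to_opcode(const std::string& text) {
    static const std::map<std::string, char> table = {
        {"forward", 'F'}, {"back", 'B'},
        {"left", 'L'},    {"right", 'R'},
    };
    auto it = table.find(text);
    return it != table.end() ? it->second : '?';
}
```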

c. Phone tilt mode

If an inclination is applied to the mobile device, the accelerometer in the Android
mobile asserts the position's coordinates. According to the range those coordinates
fall in, opcodes are sent to the Arduino, which interprets them into methods applied
to the robot's motors in order to perform the desired task. An illustration of this
mode is shown in figure 3.20.

Figure 3.20 Interaction between app and Arduino in phone tilt mode

3.3.2 Autonomous approach interactions

When the corresponding checkbox in the Android app is selected to enable the
autonomous mode, a corresponding opcode is sent to the Arduino. The code is
translated into the corresponding methods to be performed on the robot's motors. An
illustration of this mode is shown in figure 3.21.

Figure 3.21 Interaction between app and Arduino in autonomous approach

3.4 Necessary Program

#define in1 5 // H-bridge motor driver input pins.
#define in2 6
#define in3 10
#define in4 11
#define LED 2

int command;          // Stores the command received from the app.
int Speed = 204;      // Motor speed (0-255); updated by the speed opcodes.
int Speedsec;         // Reduced speed applied to one side while turning.
int Turnradius = 200; // Controls the radius of a turn: the higher the value,
                      // the smaller the turn. Must not exceed 255.

void setup() {
  pinMode(in1, OUTPUT);
  pinMode(in2, OUTPUT);
  pinMode(in3, OUTPUT);
  pinMode(in4, OUTPUT);
  pinMode(LED, OUTPUT); // Set the LED pin.
  Serial.begin(9600);   // Baud rate must match the Bluetooth module.
}

void loop() {
  if (Serial.available() > 0) {
    command = Serial.read();
    Stop(); // Start each command with the motors stopped.
    Speedsec = Speed - Turnradius; // Turning speed, computed before use.
    switch (command) {
      case 'F': forward();      break;
      case 'B': back();         break;
      case 'L': left();         break;
      case 'R': right();        break;
      case 'G': forwardleft();  break;
      case 'I': forwardright(); break;
      case 'H': backleft();     break;
      case 'J': backright();    break;
      case '0': Speed = 128; break;
      case '1': Speed = 140; break;
      case '2': Speed = 153; break;
      case '3': Speed = 165; break;
      case '4': Speed = 178; break;
      case '5': Speed = 191; break;
      case '6': Speed = 204; break;
      case '7': Speed = 216; break;
      case '8': Speed = 229; break;
      case '9': Speed = 242; break;
      case 'q': Speed = 255; break;
      case 'W': digitalWrite(LED, HIGH); break; // LED on.
      case 'w': digitalWrite(LED, LOW);  break; // LED off.
      case 'X': Auto();   break;
      case 'x': Manual(); break;
    }
  }
}

void forward() {
  analogWrite(in1, Speed);
  analogWrite(in3, Speed);
}

void back() {
  analogWrite(in2, Speed);
  analogWrite(in4, Speed);
}

void left() {
  analogWrite(in4, Speed);
  analogWrite(in1, Speed);
}

void right() {
  analogWrite(in3, Speed);
  analogWrite(in2, Speed);
}

void forwardleft() { // Curve: one side runs at the reduced speed Speedsec.
  analogWrite(in1, Speed);
  analogWrite(in3, Speedsec);
}

void forwardright() {
  analogWrite(in1, Speedsec);
  analogWrite(in3, Speed);
}

void backright() {
  analogWrite(in2, Speedsec);
  analogWrite(in4, Speed);
}

void backleft() {
  analogWrite(in2, Speed);
  analogWrite(in4, Speedsec);
}

void Stop() { // Stop both motors.
  analogWrite(in1, 0);
  analogWrite(in2, 0);
  analogWrite(in3, 0);
  analogWrite(in4, 0);
}

void Auto() {
  // Reserved for future use.
}

void Manual() {
  // Reserved for future use.
}

3.5 Results and discussion

The icon of the designed application, after being installed on an Android device, is
shown in figure 3.22.

Figure 3.22 The corresponding application icon

The final user interface of the mobile application, after launching the app and
connecting to the selected MAC address, is shown in figure 3.23.

Figure 3.23 Final user interface

The only way to evaluate the designed application is by testing it in all modes of
operation under the two approaches used.

The results collected from the non-autonomous approach test are summarized in table
3.3.

Table 3.3 Testing the functionality of the system using different modes

When the slider position is changed from small to larger values, the speed of the
motors changes considerably until it reaches the maximum. One snapshot taken while
testing the commands is shown in figure 3.24 below.

Figure 3.24 The differential robot tested under non-autonomous mode

When the robot is tested under the autonomous approach, by checking the autonomous
check box, it follows the path and detects obstacles at a distance of 15 cm or less.
One snapshot taken while testing this approach is shown in figure 3.25 below.

CONCLUSION
To conclude, building a smart Android robot that combines two approaches of
operation, i.e. autonomous and non-autonomous, was a great and novel challenge. The
robot is controlled via a flexible, simple-to-use mobile app that gives the user
several possible ways to exercise that control. The implementation of the smart
Android robot was successfully realized and the objectives of our requirements
document were accomplished. Its functionality under the non-autonomous approach was
fully verified: the differential robot performs all user commands in the three
different modes, i.e. touch screen, voice command and phone tilt. The display on the
mobile device is updated automatically according to the state in which the robot is
running. While navigating ahead, the robot is able to detect the presence of any
obstacle in front of it at a distance of 15 cm or less. If an obstacle is present,
any attempt by the user to move the robot ahead is blocked.

Under the autonomous approach, the robot can navigate a predefined path successfully
and is able to detect any obstacles along the way. Some problems were encountered
during the implementation, especially on the hardware side. The speed of the motors
had to be adjusted so that it was fast enough to move the mobile robot. The
dissimilar nature of the two motors was one of the most critical problems we faced:
the motors ran at different speeds under identical operating conditions, mainly
because of the high friction in one motor unit. A correction method that equalizes
the two motor speeds using only the microcontroller's PWM would be needed. In
addition, when using the voice command with a weak Wi-Fi/Internet signal, a delay in
converting speech into text was sometimes encountered.

For future work, the system can be enhanced with more features offering further
functionality to the user. A camera can be added to record the path of the robot and
send a real-time feed to the mobile phone.

We believe such a system would find wide variety of applications in research work as well
as in industrial applications.

Home automation

The "Home Automation" concept has existed for many years. Based on our project,
controlling and monitoring electronic security systems, lighting, climate,
appliances, audio or video equipment, etc. can be done remotely using a single
Smartphone.

Wheelchairs

Based on our project, a wheelchair can be controlled by giving voice commands
through an Android mobile. The wheelchair can be moved easily by giving direction
commands to the Android device, without any hand movement.

References
1. Web link: http://us.wow.com/wiki/Differential_wheeled_robot (last access: 21.08.18)
2. Web link: www.cl.cam.ac.uk/~cm542/phds/kiranrachuri.pdf (last access: 23.12.18)
3. Kishan Raj KC, "Controlling a robot using android interface and voice", Turku University of Applied Sciences, Finland, written report, 2012.
4. Web link: http://www.arduino.cc/en/Guide/Introduction (last access: 22.12.18)
5. Web link: http://www.arduino.cc/en/Main/ArduinoBoardUno (last access: 23.12.18)
6. Web link: http://www.zenbike.co.uk/arduino/uno_components/index.html (last access: 22.12.18)
7. Web link: http://wiki.iteadstudio.com/Serial_Port_Bluetooth_Module (last access: 22.12.18)
8. Web link: http://www.robotplatform.com/howto/L293/motor_driver_1.html (last access: 05.05.18)
9. John Piccirillo, "The Art and Science of Selecting Robot Motors", University of Alabama in Huntsville, written report, 19 February 2009.
10. Web link: http://www.solarbotics.net/ (last access: 15.08.18)
11. Web link: http://accudiy.com/ (last access: 12.10.18)
12. Web link: www.vishay.com/docs/83760/tcrt5000.pdf (last access: 21.11.18)
13. Web link: https://hub.scaleway.com/android.html (last access: 18.12.18)
14. Jason Tyler, "App Inventor for Android", Second Edition, Wiley, pp. 100-102.
15. Electronics For You magazine, June 2013.
16. Keil µVision IDE, http://www.keil.com/uvision (last access: 22.12.18)
17. Serial Bluetooth module, Tiny OS Electronics, http://www.tinyosshop.com
18. AT89S52 8-bit microcontroller, Atmel Corporation, http://www.atmel.com
19. The official Bluetooth website from Bluetooth SIG: http://www.bluetooth.com
20. Muhammad Ali Mazidi and Janice Gillispie Mazidi, "The 8051 Microcontroller and Embedded Systems".

