OVER BLUETOOTH
Submitted by
Md. Touhidul Islam
ID: 32160200558
Supervised by
Abidur Rahman
December 2018
……………………………………………..
Md. Touhidul Islam
ID: 32160200558
……………………………………………..
Md. Ahosan Habib
ID: 32160200560
……………………………………………..
Md. Jahidul Islam Samim
ID: 32160200559
……………………………………………..
Md. Jewel Rana
ID: 32160200591
……………………………………………..
Md. Yeaqub Ali
ID: 32160200573
Board of Examiners
1. Abidur Rahman
(Supervisor)
Lecturer, Dept. of EEE
Northern University Bangladesh (NUB)
3. Tareq Rahman
(Examiner)
Lecturer, Dept. of EEE
Northern University Bangladesh (NUB)
…………………………..
Prof. Dr. Mohd. Ekram Ali Shaikh
Dean, Faculty of Science & Engineering
Head of the Department, EEE (additional charge)
Northern University Bangladesh (NUB)
ABSTRACT
Smartphones have become essential in daily life. Android-based smartphones are becoming ever more powerful and are equipped with several accessories that are useful for robots. This project describes how to control a robot from a mobile phone through Bluetooth communication, and presents some features of Bluetooth technology and the components of the mobile phone and the robot. We review robots controlled by mobile phone, in which an Android application moves the robot forward, backward, left and right over a Bluetooth link to an Arduino board. Bluetooth has changed how people use digital devices at home and in the office, turning traditional wired digital devices into wireless ones. Here we use Bluetooth communication to interface a microcontroller with an Android application, and the Arduino software to interface the Bluetooth module with the microcontroller. The robot's motion is controlled according to the commands received from the Android device. We derive simple solutions that provide a framework for building robots at very low cost, while retaining the high computation and sensing capabilities of the smartphone used as the control device.
ACKNOWLEDGEMENT
At the very outset, all our prayers and thankfulness are to Allah the Almighty for facilitating this work and for granting us the opportunity to be surrounded by great and helpful people at Northern University Bangladesh. We would like to express our everlasting gratitude to our supervisor, Mr. Abidur Rahman, for his valuable encouragement, guidance and monitoring, without which this work would not have reached fruition; we ask Allah to reward him on our behalf. Our fathers and mothers deserve warm-hearted credit here; they have been a source of inspiration to us for years, and we will never forget their continuous prayers for our success. No acknowledgement would be complete without expressing our appreciation to the maintenance laboratory workers for their support and help.
CONTENTS
Introduction
Overview ……………………………………………………………………………………... 14
Motivation …………………………………………………………………………………….. 14
Project Objectives ……………………………………………………………………………. 15
Organization of the Report …………………………………………………………………… 15
1.6.1 DC Motors ……………………………………………………………………………… 24
1.7 Distance Sensors ………………………………………………………………………… 25
I. Principle of Operation ……………………………………………………………………… 26
I.1 Description ………………………………………………………………………………… 27
CHAPTER 3 System Software Design
3.1 Android …………………………………………………………………………………… 39
3.5 Results and Discussion ………………………………………………………………….. 66
Conclusion ……………………………………………………………………………………. 69
References
List of Figures
Figure 1.1 Arduino Uno board ………………………………………………………………... 18
Figure 3.7 Option three - connecting a phone or tablet with a USB cable ………………... 45
Figure 3.18 Interaction between Android application and Arduino in touch screen mode .. 57
Figure 3.19 Interaction between app and Arduino in voice command mode ……………... 57
Figure 3.20 Interaction between app and Arduino in phone tilt mode ……………………... 58
Figure 3.24 The differential robot tested under the non-autonomous mode ……………… 68

LIST OF TABLES
Table 3.3 Testing the functionality of the system using different modes …………………... 67
LIST OF ACRONYMS
PLC: Programmable logic controller
I/P: Input
O/P: Output
T-On: On timer
M: Memory bits
Q: Output in program
INTRODUCTION
The following introduction gives a general description of our work, including both the project's motivation and its objectives.
Overview
A robot is usually an electro-mechanical machine guided by computer and electronic programming. Many robots have been built for manufacturing purposes and can be found in factories around the world. A smartphone is a mobile phone that performs many of a computer's functions, typically having a touch-sensitive screen interface, Internet access, and an operating system capable of running downloaded apps.
Recent statistics show that a quarter of the world uses such sophisticated devices, running under different operating systems such as Android, Symbian, Apple iOS and BlackBerry OS. Android ranks first, powering hundreds of millions of mobile devices in most countries around the world. In addition, Google has made the Android development platform open to everyone, which has attracted millions of developers to the platform. Although some developers focus only on building apps or games for Android devices, there are numerous other possibilities. One of these is the fusion of Android with Arduino (a microcontroller-based system), which is itself open-source hardware. The combination of these two development platforms pushed us to take advantage of the smartphone's hardware and software features to design and implement industrial or home applications. One such application is to build a smart Android robot.
Motivation
The many software and hardware features provided with smartphones, and their use in controlling robots, make the control system more flexible, extendable and low-cost. For that reason, we were motivated to build a smartphone-based robot and merge its autonomous and non-autonomous approaches under one application. This project focuses on the outcome of combining Android and Arduino to make a differential robot operating under two approaches, a non-autonomous and an autonomous one, which can be controlled by any Android device using three possible modes: touch screen, voice command and mobile tilt control.
Project objectives
The purpose of this project is to take advantage of the availability of Android smartphones to build a smart Android differential robot. Its remote control is made through an Android application, and the robot can detect specific obstacles (objects or walls) at a given distance and alert the user to their presence.

This report is divided into three chapters. The first chapter describes the theoretical background, exposing all the components used and their internal diagrams. The second chapter describes our hardware system design. The description of our system software design is given in chapter three, which ends with the results, their discussion and a conclusion.
Chapter 1
Theoretical Background
This chapter introduces the theory and technical details of the different components used in our hardware design.
1.1 Differential Wheeled Robots
A differential wheeled robot is a mobile robot whose movement is based on two separately driven wheels placed on either side of the robot body. It can thus change its direction by varying the relative rate of rotation of its wheels. To keep the robot balanced, additional passive wheels or casters may be needed. If both wheels are driven in the same direction at the same speed, the robot moves in a straight line. If both wheels turn at equal speed in opposite directions, the robot rotates about the central point of its axis. Otherwise, the motion depends on the speed and direction of rotation: if the wheels turn in the same direction but at different speeds, the robot turns toward the wheel with the smaller speed [1].
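These rules can be summarized in a small decision function; this is only an illustrative sketch, with names and units of our own choosing rather than anything from the actual robot firmware:

```cpp
#include <string>

// Classify the motion of a differential robot from its two wheel speeds.
// Positive speed means forward rotation; units are arbitrary but equal.
std::string motion(double leftSpeed, double rightSpeed) {
    if (leftSpeed == rightSpeed) {
        if (leftSpeed == 0) return "stopped";
        return "straight";                        // equal speed, same direction
    }
    if (leftSpeed == -rightSpeed)
        return "spin about center";               // equal speed, opposite directions
    // Otherwise the robot arcs toward the slower wheel.
    return (leftSpeed > rightSpeed) ? "turn right" : "turn left";
}
```

For example, `motion(1.0, 0.5)` reports a right turn, since the right wheel is the slower one.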
1.2 Smartphone
1.3 Microcontrollers
Microcontrollers, sometimes abbreviated μC or MCU, are integrated circuits that act as small computers for embedded, automatically controlled products or devices. Microcontrollers have a structure similar to that of regular computers, integrating in a single circuit a processor core, memory, and programmable input and output peripherals.
1.3.1 Arduino
An Arduino microcontroller board contains an on-board power supply, a USB port to communicate with a PC, and an Atmel microcontroller chip. It simplifies the process of creating a control system by providing a standard board that can be programmed and connected to the system without any sophisticated PCB (printed circuit board) design and implementation. It is open-source hardware [4].
I. Arduino History
The Arduino project began in 2005 at the Interaction Design Institute Ivrea (IDII), but its dawn traces back to 2002, when Massimo Banzi, co-founder of Arduino, was appointed associate professor to teach the students of IDII modern ways of interactive design. Banzi wanted to offer his students something modern and inexpensive so everybody could carry out their work without many obstacles. At the time, the most used tool on the market was the BASIC Stamp (Parallax 2012), which was expensive, so Banzi wanted to develop a better alternative. Banzi was also involved in Processing, the programming language. With the help of a Colombian student, Hernando Barragán, who was working on the Wiring platform, they tried to bring Processing to hardware and make it simpler and easier to use. After working on the project, they came up with a prototype, which was the birth of Arduino. With IDII funding running out, Banzi and the co-founders decided to make the project open source (Open Source 2012) so that the product would improve. The hardware was then complete, and only the software remained, which was later built in collaboration with other team members [3].
The Uno board (figure 1.1) is designed to provide an easy-to-use, human-changeable pin interface to the Atmel AVR ATmega microcontroller, the heart of the Arduino hardware. Arduino builds on this by adding simplicity to the hardware interface and an easy-to-use software package. Arduino is meant to be used as a physical computing platform, that is, to use electronic hardware to interface with humans through sensors and actuators controlled by software executed on a computer.
The Arduino IDE is the software environment used to create the programs, called
“sketches,” that will be executed by the Arduino hardware. The IDE uses a modified C
language compiler to build, translate, and transmit the code to the microcontroller board
[3].
II.2 Hardware
The Arduino Uno is a microcontroller board based on the ATmega328. It has 14 digital input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs (A0-A5) feeding an analog-to-digital converter (ADC) with 10-bit resolution, a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header, and a reset button, as shown in figure 1.2. It contains everything needed to support the microcontroller; simply connect it to a computer with a USB cable, or power it with an AC-to-DC adapter or battery, to get started. Regarding memory, the ATmega328 has 32 kB of flash memory for code storage (of which 0.5 kB is used by the bootloader), 2 kB of SRAM and 1 kB of EEPROM. The Arduino Uno board specifications are summarized in table 1.1 [5].
Table 1.1 Specifications and parameters of the Arduino Uno board [5]

Parameter            Specification
Microcontroller      ATmega328
Operating Voltage    5 V
SRAM                 2 kB
EEPROM               1 kB
Figure 1.2 Arduino Uno board specifications [6]
The HC-06 module (figure 1.3) is an easy-to-use Bluetooth SPP (Serial Port Protocol) module designed for transparent wireless serial connection setup. This serial-port Bluetooth module is fully qualified for Bluetooth V2.0+EDR (Enhanced Data Rate) with 3 Mbps modulation, and includes a complete 2.4 GHz radio transceiver and baseband [7].
Figure 1.3 HC-06 Bluetooth module
Pin description:
GND: Ground.
Generally, even the simplest robot requires a motor to rotate a wheel or perform a particular action. Since motors require more current than a microcontroller pin can typically provide, we need some type of switch (transistor, MOSFET, relay, etc.) which can accept a small current, amplify it, and deliver a larger current. This entire task is performed by what is known as a motor driver.

A motor driver is basically a current amplifier: it takes a low-current signal from the microcontroller and gives out a proportionally higher-current signal which can control and drive a motor. In the simplest case, a single transistor acting as a switch can turn the motor ON and OFF in a single direction. To reverse the direction of the motor, we need to reverse its polarity. This can be achieved with four switches arranged in such a way that the circuit not only drives the motor but also controls its direction. One of the most common such designs is the H-bridge circuit, in which the transistors are arranged in a shape that resembles the letter "H". An H-bridge allows a voltage to be applied across the motor in either direction. These circuits are often used in robotics and other applications to allow DC motors to run forwards and backwards.
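The switching logic of such a four-switch H-bridge can be sketched as a truth-table function. The switch labels s1 to s4 are illustrative (s1/s2 the high- and low-side switches of the left leg, s3/s4 of the right leg), not names from any datasheet:

```cpp
#include <string>

// Evaluate one row of the H-bridge truth table.
// Closing both switches of one leg shorts the supply, which is forbidden.
std::string hBridge(bool s1, bool s2, bool s3, bool s4) {
    if ((s1 && s2) || (s3 && s4)) return "short circuit (forbidden)";
    if (s1 && s4) return "motor forward";   // current: left high side to right low side
    if (s2 && s3) return "motor reverse";   // current: right high side to left low side
    return "motor free-running (off)";      // no complete path through the motor
}
```

Closing the diagonal pair s1 and s4 drives the motor one way; the opposite diagonal, s2 and s3, reverses it.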
Table 1.2 Truth table of the H-bridge switching
The choice of H-bridge has a very important impact on the success of driving any kind of motor. The L293D (figure 1.5) is a dual H-bridge motor driver integrated circuit (IC) which contains two built-in H-bridge driver circuits. In its common mode of operation, two DC motors can be driven simultaneously, both in forward and reverse directions. The L293D has an output current of 600 mA and a peak output current of 1.2 A per channel. Moreover, to protect the circuit from back EMF, output diodes are included within the L293D. The external output supply has a wide range, from 4.5 V to 36 V, which makes the L293D a good choice for driving DC motors [8].
1.6 Motors
When building a robot, the designer always faces the choice between a DC motor and a stepper motor. When it comes to speed, weight, size and cost, DC motors are preferred over stepper motors. Many things can be done with a DC motor when it is interfaced with a microcontroller; for example, one can control the speed of the motor and its direction of rotation.
1.6.1 DC Motors
The advantage of using gear motors is that they are readily available in many sizes, provide a lot of torque for the power consumed, and come with a wide choice of output speeds.

The main disadvantage is that gear motors are not precise: two motors of the same model, manufactured on the same day and operated with identical currents and voltages, will NOT turn at exactly the same rate. Thus, a robot with two drive motors will not move in a straight line without some way of controlling the individual motor speeds.
1.7 Distance Sensors
One kind of sensor used extensively in robotics and the automotive industry is the proximity sensor, also known as a distance sensor. A proximity sensor is able to detect the absence, presence or distance of an object in a predefined range without any physical contact [10].
Different targets demand different sensors. Depending on the principle of operation, each
type of sensor will have different performance levels for sensing different types of
objects. Common types of non-contact proximity sensors include inductive proximity
sensors, capacitive proximity sensors, ultrasonic proximity sensors, and photoelectric
sensors.
Ultrasonic sensors work on a principle similar to radar or sonar, which evaluate attributes of a target by interpreting the echoes from radio or sound waves, respectively. The HC-SR04 ultrasonic sensor (figure 1.7) uses sonar to determine the distance to an object; it offers excellent non-contact range detection, with accuracy that can reach 3 mm and stable readings over a range of 2 cm to 400 cm, in an easy-to-use package.
Pin description
I. Principle of Operation
The ultrasonic sensor transmits an ultrasonic wave and produces an output pulse that corresponds to the time required for the burst echo to return to the sensor. By measuring the echo pulse width, the distance to the target can easily be calculated, since the pulse width of the echo is proportional to the distance travelled, as shown in figure (1.8) [11]:

Distance = (Echo pulse width × Speed of sound) / 2        (1.1)
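This relation can be checked numerically with a small helper; it is a sketch assuming the usual speed of sound of about 340 m/s (0.034 cm/µs), and divides by two because the wave travels to the target and back:

```cpp
// Distance in centimeters from an HC-SR04 echo pulse width in microseconds.
double distanceCm(double echoMicros) {
    const double speedCmPerUs = 0.034;       // speed of sound, ~340 m/s
    return echoMicros * speedCmPerUs / 2.0;  // round trip, so halve the path
}
```

For instance, an echo pulse of 1000 µs corresponds to a target about 17 cm away.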
1.7.3 Photoelectric Sensor
A photoelectric sensor is a device that detects a change in light intensity. Typically, this means either detection or non-detection of the sensor's emitted light source. The type of light and the method by which the target is detected vary depending on the sensor.
I.1 Description
The TCRT5000 (figure 1.9) is a reflective sensor which includes an infrared emitter and a detector. It has a compact construction in which the emitting light source and the detector are arranged facing the same direction, to sense the presence of an object using the IR beam reflected from that object. The operating wavelength is 950 nm. The detector consists of a phototransistor [12].
The TCRT5000 applications include:
I.2 Principle of Operation
Reflective sensors incorporate an infrared emitter and a photodetector adjacent to each other. As shown in figure (1.10), when an object is in the sensing area, the emitted light is reflected back towards the photodetector, and the amount of light energy reaching the detector increases. This change in light energy, or photocurrent, is then used as an input signal in the application.
1.8 Power Supply
A power supply provides a stable voltage to a circuit or device that must be operated within certain power limits. It is used here to supply the microcontroller and the device circuits. Two supplies are used: a 9 V battery to supply the microcontroller (Arduino), and a 6 V supply (4 × 1.5 V batteries in series) to drive the two DC motors.
CHAPTER 2
System Hardware Design
This chapter discusses the system hardware design and the interconnection between all the parts included in that design, in order to satisfy the system's requirements.
The system consists of the following entities: a mobile device and a differential robot, which includes a microcontroller-based circuit (Arduino), two DC motors, a motor driving circuit, and a sensing circuit.

The Android smartphone acts as the remote control of the system, and Bluetooth acts as the connection link between the differential robot and the smartphone. The microcontroller, the brain of the robot, acts as an intermediary between the mobile device and both the actuators and the sensing circuits. The block diagram of the system is shown in figure 2.1. Instructions are sent from the Android mobile phone to the microcontroller via a Bluetooth module. Then, based on both the type of the received instruction and the actual values given by the external sensors, the microcontroller controls the two DC motors accordingly. Once the robot successfully obeys the given instruction and runs in the desired mode of operation, the microcontroller sends the actual state of the robot back to the mobile device, where it is displayed on the screen. The different parts of the system can be divided into three units: communication, motor driving and sensing.
Figure 2.1 Block diagram of Android Smart differential robot
The Arduino hardware has built-in support for serial communication on pins 0 (RX) and 1 (TX). This native serial support happens via a piece of hardware built into the chip, the UART, which is used for communication between the Arduino board and the mobile device. The connection between the Bluetooth module and the Arduino board is shown in figure 2.2.
Figure 2.2 Schematic diagram of Arduino-Bluetooth module interfacing
The Bluetooth module needs to be powered through its Vcc pin and, to operate safely, should be supplied with +3.3 V. The module's Vcc is therefore connected to the Arduino 3.3 V power pin so that the module is not damaged.

The module's TX pin sends data from the module to the Arduino. It is connected to the serial receive pin (RX) of the Arduino board, located at pin 0. The module's RX pin receives data from the Arduino. It is connected to the Arduino board's serial transmit pin (TX), located at pin 1.
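As an illustration of how the robot can act on bytes arriving over this serial link, the following dispatch function maps single-character commands to actions. The command letters F/B/L/R/S and the action names are assumptions for the sketch, not necessarily the protocol used by our app:

```cpp
#include <string>

// Map one command byte received over the Bluetooth serial link to an action.
// On the real board, the byte would come from Serial.read().
std::string dispatch(char cmd) {
    switch (cmd) {
        case 'F': return "forward";
        case 'B': return "backward";
        case 'L': return "left";
        case 'R': return "right";
        case 'S': return "stop";
        default:  return "ignore";   // unknown bytes are silently discarded
    }
}
```

In an actual sketch, the returned action would translate into setting the motor driver's input and enable pins.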
The ultrasonic sensor is used in the system to detect the presence of obstacles in the path of the robot. The sensor is placed at the front of the robot, and any object faced at a distance of 15 cm or less is considered an obstacle.
a. Arduino-Ultrasonic connection
Figure 2.3 Schematic diagram of Arduino-Ultrasonic interfacing
The robot is designed to follow any predefined colored line on a colored background (a black one on a white surface, for simplicity). To distinguish the black surface from the white one, any light-sensitive sensor can be used so that the robot follows the track. The TCRT5000 reflective optical sensor is what we use to detect black lines. The sensor has an infrared emitter (LED) and a phototransistor inside the module.
The TCRT5000 infrared pair works by exploiting the difference in infrared reflectivity between colors, converting the strength of the reflected signal into a current signal. The black-and-white tracing module outputs a high logic level when it detects a black area, and a low logic level when it detects a white area.
a. Arduino-TCR connection
To differentiate the black line from the white surface, four TCRT5000 sensors are placed under the robot so that it can navigate along the predefined track. In each sensor, the emitter and detector are mounted side by side, facing the same direction, and the phototransistor's output current is determined by the amount of light falling on it. The way the TCRT5000 module works is that the infrared LED emits light, which is reflected by the surface it points at, and the output current of the phototransistor changes with the amount of light reflected (which is determined by the color of the surface). To read the voltage across the phototransistor, which depends on the color of the surface, digital inputs on the Arduino (pins 4, 5, 6 and 13) are used.
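The steering decision from the four sensors can be sketched as a simple rule set. This is an illustrative model, not the exact logic of our sketch; here true means the module reads black, and the sensors are ordered left to right:

```cpp
#include <string>

// Decide steering from four reflective sensors mounted under the robot.
// s0 is the leftmost sensor, s3 the rightmost; true = black line detected.
std::string steer(bool s0, bool s1, bool s2, bool s3) {
    if (s1 && s2) return "straight";      // line centered under the robot
    if (s0 || s1) return "steer left";    // line has drifted to the left side
    if (s2 || s3) return "steer right";   // line has drifted to the right side
    return "line lost";                   // no sensor sees the line
}
```

A steering correction would then slow the wheel on the side the line drifted toward, using the differential-drive behavior from chapter 1.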
2.3 Motor Driving Unit
Since our robot should move in all directions at different speeds, a motor driving circuit is needed for the robot to operate properly. The two DC motors in the system are controlled using both the microcontroller and the L293D H-bridge circuit. The speed of a DC motor can be controlled using a technique known as Pulse Width Modulation (PWM). With PWM, power is not supplied continuously; it is provided as a square waveform with a frequency ranging from around 60 Hz to 50 kHz.

By changing the pulse width, the power supplied to the motor can be adjusted. If the duty cycle is zero percent, the motor is at rest; at one hundred percent, the motor runs at full speed. The PWM percentage to use depends on the application at hand.
The interfacing of the Arduino, the L293D and the two DC motors is shown in figure 2.5. To drive the two DC motors, the dual H-bridge (L293D) circuit is connected between the microcontroller (Arduino) and the two motors. The directions in which the two motors rotate are controlled by the input logic at pins 2 and 7 for the left motor, and pins 10 and 15 for the right one. The speed of the motors is controlled by connecting enable pins 1 and 9 of the H-bridge to Arduino PWM pins 9 and 10, respectively.
The overall system is built by interconnecting all the parts described above. The system is used to transmit and read data, process it, and control the motors to accomplish the desired task. Before implementing the final prototype, each hardware component's functionality was first tested and verified through the Arduino integrated development environment.
Figure 2.6 Schematic diagram of the overall system circuit
The following figure 2.7 shows the final built differential robot:
Figure 2.7 Top view of the final differential robot
CHAPTER 3
System Software Design
This chapter describes our mobile application and the Arduino sketch built to translate the user's application commands into actions that control the robot. The obtained results are then discussed.
3.1 Android
Android is an operating system based on the Linux kernel, with a user interface based on direct manipulation, designed primarily for touchscreen mobile devices such as smartphones and tablet computers. The operating system uses touch inputs that loosely correspond to real-world actions, like swiping, tapping, pinching and reverse pinching, to manipulate on-screen objects, and provides a virtual keyboard. Despite being primarily designed for touchscreen input, it has also been used in televisions, game consoles, digital cameras and other electronics [13].
The Android application for our work was designed using a very innovative product initially provided by Google, now maintained by the Massachusetts Institute of Technology (MIT), known as MIT App Inventor (App Inventor 2012). The software, previously called Google App Inventor, was released publicly on December 15, 2010, and discontinued by Google a year later, in December 2011. The product is now run by the MIT Center for Mobile Learning under the name MIT App Inventor. App Inventor allows its users to develop different kinds of Android apps directly in a web browser. A user needs a Google account to get started, and the App Inventor server stores and keeps track of all user uploads [3].
As shown in figure 3.1, the application building process in App Inventor involves three aspects: (i) the App Inventor Designer, (ii) the App Inventor Block Editor, and (iii) testing the application.
The set-up process for the software is very easy. The system requirements are very basic, and it is compatible with Mac OS X, Windows and Linux operating systems. Supported browsers are Mozilla Firefox 3.6 or higher, Apple Safari 5.0 or higher, Google Chrome 4.0 or higher, and Microsoft Internet Explorer 7.0 or higher [14].
a) App Inventor Designer
The first phase of application design goes through the App Inventor Designer; a snapshot of its general view is shown in figure 3.2 below. The Designer is accessible through the web page, and all the ingredients for the app are available on the left side of the window. These ingredients include elements like a screen for the app, buttons for tapping, text boxes, images, labels, animations and many more. The right side of the Designer lets users view the screen and the components added to it.
Additionally, the properties section of the window allows users to modify the properties of components. Adding components to the screen is a simple drag-and-drop process, and their alignment can be managed through the alignment options on the left side of the window. The figure below shows the features added to our mobile application. Several non-visible components are also added to the screen; these are explored later in the Block Editor.
b) App Inventor Block Editor
After the design process is complete, for the app to function as desired, users go through the Block Editor. The App Inventor Block Editor uses the OpenBlocks Java library; these open blocks combine into a visual block programming language. The blocks are thus the program code, and they can be dragged and cemented to other blocks to create the desired functional program, as shown in figure 3.3 below. The Editor can be opened from the options available in the App Inventor Designer, which launches a Java applet for the Block Editor [14].
The final part of the application design is testing the application. Three options are available to test it:
Option one (connecting a phone or tablet over Wi-Fi):
App Inventor can be used without downloading anything to the computer; apps are developed on the website ai2.appinventor.mit.edu. To do live testing on the Android device, the MIT App Inventor Companion app, available from the Play Store, needs to be installed on the Android phone or tablet. Then the project is opened in App Inventor on the web, the Companion is opened on the device, and the built app can be tested by scanning the QR code (Quick Response code) generated for it with a QR scanner. An example of the QR code of a developed App Inventor application, which is a 2D version of the traditional barcode, is shown in figure 3.4.
Option two (Installing and Running the Emulator in AI2)
For users without Android handsets, App Inventor gives the option of testing the application in an emulator, a virtual mobile device very similar to a real one but with some limitations, such as the inability to use the touch screen or feel the phone vibrate, and no support for USB and Bluetooth connections.
Figure 3.6 Option two - installing and running the emulator in AI2
The user can also directly connect the Android phone to the computer via a USB cable (option three).
Figure 3.7 Option three - connecting a phone or tablet with a USB cable
The application offers the user two approaches for controlling the robot: a non-autonomous and an autonomous one. The application interface is designed to offer the user easy control of the robot using three different command modes under the non-autonomous approach: touch screen mode, voice command mode, and phone tilt (moving the phone in the plane) mode. In addition, a simple click on a check button is enough to initialize the robot into the autonomous approach remotely. The final user interface of the created app is shown in figure 3.8.
Figure 3.8 User interface before connecting to any device
To create the user interface of the app, some visible and non-visible components were created; the non-visible components used are shown in figure 3.9.
Table 3.1 lists the non-visible components, the Designer palette each belongs to, and the function it performs on the robot.

The communication between the mobile phone and the robot is done serially over Bluetooth, so a correct Bluetooth connection must first be made. For this task, a button at the top of the application interface is designed so that, once clicked, a list of the available Bluetooth MAC addresses appears. When a particular MAC address is selected, the status "Connected" is shown on the screen. All the buttons then become active, the app is connected to the robot, and the mobile phone can control it. Once the robot is properly connected, three modes of operation are allowed: touch screen, voice command and phone tilt. The visible components used in the two approaches, the Designer palettes they belong to, and the functions they perform are summarized in table 3.2.
Table 3.2 Visible components
Figure 3.10 The logic work of the application in non-autonomous approach
To make the mobile application operate in this mode, a Canvas is designed as shown in figure 3.11. When the user's finger is placed on the Canvas, its position is sensed and the related robot direction action is asserted accordingly. The phone will also vibrate.
To control the differential robot's speed, the mobile application is provided with sliders. When a slider is scrolled, the corresponding speed is displayed next to it on the screen.
Speech recognition is a technology by which the system recognizes the spoken word (not its meaning). When the user chooses the voice command option by checking its corresponding box, the desired direction should be spoken. The speech recognizer, already built into Android devices, converts the voice command into text, and the corresponding task is performed on the robot.
When the user chooses the third option, after checking its box, the phone is oriented and tilted toward the desired direction; the accelerometer built into the phone then senses the coordinates of that position, assigns the direction, and performs the task on the robot.
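A host-side sketch of how tilt readings might be classified into direction commands follows (plain C++; the axis sign conventions and the 3 m/s² threshold are illustrative assumptions, not values taken from the report):

```cpp
#include <cassert>

// Classify accelerometer x/y readings (m/s^2) into a single-character drive
// command of the kind the app sends over Bluetooth. The thresholds and axis
// orientation here are hypothetical; a real app would calibrate them.
char tiltToCommand(float x, float y) {
    const float T = 3.0f;          // dead-zone threshold (assumed)
    if (y >  T) return 'F';        // tilted forward
    if (y < -T) return 'B';        // tilted back
    if (x >  T) return 'L';        // tilted left
    if (x < -T) return 'R';        // tilted right
    return 'S';                    // roughly level: stop
}
```

The dead zone around the level position prevents the robot from twitching when the phone is held approximately flat.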
When the corresponding autonomous box is checked, the robot enters its autonomous approach and follows a black path, with the ability to detect obstacles. If an obstacle is detected, an alert message is displayed accordingly.
The Arduino IDE is the software environment used to create the programs, called "sketches," that are executed by the Arduino hardware. The IDE uses a modified C language compiler to build, translate, and transmit the code to the microcontroller board. The Arduino software runs on Windows, Mac OS X, and Linux operating systems. A view of the development environment is shown in figure 3.12.
Figure 3.12 View of the Arduino development environment
To build a runnable cyclic-executive program in Arduino, a sketch is written consisting of two main functions, setup() and loop(). The setup() function is called when a sketch starts and is used to initialize variables, pin modes, libraries, etc.; it runs only once, after each power-up or reset of the Arduino board. After setup() has set the initial values, the loop() function does precisely what its name suggests: it loops consecutively, allowing the program to change and respond.
3.2.1 The Arduino software description
The sketch built for the non-autonomous approach is summarized in the following flow chart.
Figure 3.14 illustrates the tasks associated with the ultrasonic sensor as it operates and detects obstacles in the robot's path.
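The distance computation behind this obstacle check can be sketched as follows (plain C++, host-side; an HC-SR04-style sensor is assumed, and the 15 cm alert threshold is the one reported in the results section):

```cpp
#include <cassert>

// Convert an ultrasonic echo pulse width (microseconds) to distance in cm.
// Sound travels roughly 0.034 cm/us in air; the echo covers the round trip
// to the obstacle and back, so the result is halved.
float pulseToCm(unsigned long echoMicros) {
    return echoMicros * 0.034f / 2.0f;
}

// Obstacle flag as used by the robot: alert when 15 cm or closer.
bool obstacleDetected(unsigned long echoMicros) {
    return pulseToCm(echoMicros) <= 15.0f;
}
```

On the Arduino itself the pulse width would come from pulseIn() on the sensor's echo pin; here the conversion is isolated so it can be checked on its own.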
The sketched program for the autonomous mode, and the basic instructions assigned for the robot to detect and follow the black path, are illustrated by the flow chart given in figure 3.16.
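The decision step of such a path-following flow chart can be sketched as follows (plain C++, host-side; a two-sensor reflective IR layout is assumed, since the sensor arrangement is not specified at this point in the report, and the return strings are ours):

```cpp
#include <cassert>
#include <string>

// One decision step of a two-sensor line follower: each sensor reads true
// when it sees the black path. The mapping below is the usual differential
// correction; it is illustrative, not the report's exact routine.
std::string lineFollowStep(bool leftOnBlack, bool rightOnBlack) {
    if (leftOnBlack && rightOnBlack) return "forward";     // centered on line
    if (leftOnBlack)                 return "turn_left";   // line drifted left
    if (rightOnBlack)                return "turn_right";  // line drifted right
    return "stop";                                         // line lost
}
```

Calling this once per loop() iteration, and steering by slowing the wheel on the side the line drifted toward, is enough for a simple black-path follower.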
In the following, a set of figures and block diagrams shows the relation between the mobile application and the Arduino. They demonstrate how tasks performed on the mobile are translated into actions through the Arduino for both the autonomous and non-autonomous approaches. Figure 3.17 shows the interaction between the mobile app and the Arduino.
Figure 3.17 Interaction between the mobile app and the Arduino
The working mechanism of the robot is based on the information passed from the Android mobile phone, via the Bluetooth connection, to the differential robot. To translate user commands into actions for the differential robot, an opcode is assigned to every user command. These opcodes are interpreted by the microcontroller into methods, each built to perform a specific task on the motors and bring the differential robot into a different action. The task performed on the motors is displayed on the mobile screen.
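The opcode interpretation described above can be modeled on the host side as a simple dispatch (plain C++; the single-character opcodes match those in the Arduino sketch, while the routine name interpretOpcode and the returned strings are ours, standing in for the motor methods):

```cpp
#include <cassert>
#include <string>

// Host-side model of the microcontroller's opcode interpretation: each
// character received over Bluetooth selects one motor routine. Unknown
// opcodes are ignored rather than acted on.
std::string interpretOpcode(char opcode) {
    switch (opcode) {
        case 'F': return "forward";
        case 'B': return "back";
        case 'L': return "left";
        case 'R': return "right";
        default:  return "ignored";
    }
}
```

Keeping the protocol to single bytes means the Arduino can dispatch on one Serial.read() result without any parsing or buffering.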
The opcodes generated from each user command in the touch screen mode, and the corresponding interpretation by the Arduino, are illustrated in figure 3.18 below.
Figure 3.18 Interaction between Android application and Arduino in touch screen mode
The speech of the user is first converted to text using the speech recognition engine built into the mobile device. The corresponding text is sent to the Arduino, where it is compared with the set of texts already stored in a lookup table. Once a match is found, the corresponding action is assigned and performed on the robot's motors; otherwise, the user is asked to re-enter the voice command. This interaction is illustrated in figure 3.19.
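The lookup-table step can be sketched as follows (plain C++, host-side; the table contents and the '?' sentinel for an unrecognized phrase are illustrative assumptions):

```cpp
#include <cassert>
#include <map>
#include <string>

// Match recognized speech against a lookup table of known phrases and
// return the corresponding robot opcode. '?' signals an unknown phrase,
// so the caller can ask the user to repeat the command.
char voiceToOpcode(const std::string& text) {
    static const std::map<std::string, char> table = {
        {"forward", 'F'}, {"back", 'B'}, {"left", 'L'}, {"right", 'R'}};
    auto it = table.find(text);
    return it == table.end() ? '?' : it->second;
}
```

Reusing the same opcodes as the touch screen mode means the motor-control code on the Arduino needs no changes to support voice input.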
Figure 3.19 Interaction between app and Arduino in voice command mode
c. Phone tilt mode
Figure 3.20 Interaction between app and Arduino in phone tilt mode
3.3.1 Autonomous approach interactions
When the corresponding checkbox in the Android app is selected to enable the autonomous mode, a corresponding opcode is sent to the Arduino. The opcode is translated into the corresponding methods to be performed on the robot's motors. An illustration of this mode is shown in figure 3.21.
The main Arduino sketch serving both approaches is listed below. A few lines were lost from the original listing (the #define for in1, the command variable, Serial.begin(), the case label before the LED-on command, and several closing braces); they are reconstructed here and marked as such in the comments:

#define in1 5   // reconstructed: this define was missing from the listing
#define in2 6
#define in3 10
#define in4 11
#define LED 2

int Speed;
int Speedsec;
int command;            // reconstructed: holds the byte read from Bluetooth
int Turnradius = 200;   // controls the radius of a turn: the higher the value,
                        // the smaller the turn. Should not exceed 255.

void setup() {
  Serial.begin(9600);   // reconstructed: the Bluetooth module talks over serial
  pinMode(in1, OUTPUT);
  pinMode(in2, OUTPUT);
  pinMode(in3, OUTPUT);
  pinMode(in4, OUTPUT);
  pinMode(LED, OUTPUT); // reconstructed
}

void loop() {
  if (Serial.available() > 0) {
    command = Serial.read();
    Speedsec = Turnradius;  // reconstructed: inner-wheel speed used in turns
    Stop();                 // reconstructed: release the previous command's pins
    switch (command) {
      case 'F': forward();      break;
      case 'B': back();         break;
      case 'L': left();         break;
      case 'R': right();        break;
      case 'G': forwardleft();  break;
      case 'I': forwardright(); break;
      case 'H': backleft();     break;
      case 'J': backright();    break;
      // speed levels sent by the app's slider
      case '0': Speed = 128; break;
      case '1': Speed = 140; break;
      case '2': Speed = 153; break;
      case '3': Speed = 165; break;
      case '4': Speed = 178; break;
      case '5': Speed = 191; break;
      case '6': Speed = 204; break;
      case '7': Speed = 216; break;
      case '8': Speed = 229; break;
      case '9': Speed = 242; break;
      case 'q': Speed = 255; break;
      case 'W':               // reconstructed: this case label was lost in the listing
        digitalWrite(LED, HIGH);
        break;
      case 'w':
        digitalWrite(LED, LOW);
        break;
      case 'X': Auto();   break;
      case 'x': Manual(); break;
    }
  }
}

void forward() {
  analogWrite(in1, Speed);
  analogWrite(in3, Speed);
}

void back() {
  analogWrite(in2, Speed);
  analogWrite(in4, Speed);
}

void left() {
  analogWrite(in4, Speed);
  analogWrite(in1, Speed);
}

void right() {
  analogWrite(in3, Speed);
  analogWrite(in2, Speed);
}

void forwardleft() {
  analogWrite(in1, Speed);
  analogWrite(in3, Speedsec);
}

void forwardright() {
  analogWrite(in1, Speedsec);
  analogWrite(in3, Speed);
}

void backright() {
  analogWrite(in2, Speedsec);
  analogWrite(in4, Speed);
}

void backleft() {
  analogWrite(in2, Speed);
  analogWrite(in4, Speedsec);
}

void Stop() {
  analogWrite(in1, 0);
  analogWrite(in2, 0);
  analogWrite(in3, 0);
  analogWrite(in4, 0);
}

void Auto()   { /* autonomous line-following routine; body omitted in the original listing */ }

void Manual() { /* returns control to the app; body omitted in the original listing */ }
3.5 Results and discussion
The icon of the designed application, after installation on an Android device, is shown in figure 3.22. The final user interface of the mobile application, after launching the app and connecting to the selected MAC address, is shown in figure 3.23.
The only way to evaluate the designed application is to test it in all modes of operation under the two approaches used. The results collected from the non-autonomous approach test are summarized in table 3.3.
Table 3.3 Testing the functionality of the system using different modes
When the slider position is changed from smaller to larger values, the speed of the motors changes considerably until it reaches the maximum. One snapshot taken while testing the commands is shown in figure 3.24 below.
When the robot is tested under the autonomous approach, by checking the autonomous check box, it follows the path and detects obstacles at a distance of 15 cm or less. One snapshot taken while testing this approach is shown in figure 3.25 below.
CONCLUSION
To conclude, building a Smart Android Robot that combines two approaches of operation, autonomous and non-autonomous, was a great new challenge. The robot is controlled through a flexible and simple-to-use mobile app that gives the user several different ways to exercise that control. The implementation of the smart Android robot was realized successfully, and the objectives of our requirement document were accomplished. Its functionality under the non-autonomous approach was fully achieved: the differential robot performs all the user commands in the three different modes, i.e. touch screen, voice command, and phone tilt. The display on the mobile device is updated automatically according to the state in which the robot is running. While navigating ahead, the robot is able to detect the presence of any obstacle in front of it at a distance of 15 cm or less; if an obstacle is present, any attempt by the user to move the robot ahead is blocked.
Under the autonomous approach, the robot can navigate through a predefined path successfully and is able to detect any obstacles during the trip. Some problems were encountered during the implementation, especially on the hardware side. The speed of the motors needed to be adjusted so that they were fast enough to move the mobile robot. The dissimilar nature of the two motors was one of the most critical problems we faced: the motors ran at different speeds under identical operating conditions, mainly because of the high friction in one motor unit. A correction method would be needed for this, using only the PWM from the microcontroller to equalize the speeds of the two motors. In addition, when using the voice command, a delay in converting the speech into text was sometimes encountered because of a weak Wi-Fi internet signal.
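One way such a PWM-only correction could look is sketched below (plain C++, host-side; the trim-factor approach and the 1.1 calibration value are our own illustrative assumptions, since the report does not give a concrete correction method):

```cpp
#include <cassert>

// Compensate for a slower motor by scaling its commanded PWM duty with a
// calibration factor measured once for the weaker unit, clamped to the
// 8-bit duty range. The factor would come from comparing wheel speeds
// under identical commands.
int trimmedDuty(int duty, float trim) {
    int out = static_cast<int>(duty * trim);
    return out > 255 ? 255 : out;
}
```

Applying the trim only to the high-friction motor keeps both wheels at matched speed without any extra hardware.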
For future work, the system can be enhanced with more features offering other functionality to the user. A camera can be added to record the path of the robot and send a real-time feed to the mobile phone.
We believe such a system would find a wide variety of applications in research work as well as in industry.
Home automation
The "home automation" concept has existed for many years. Based on our project, controlling and monitoring electronic security systems, lighting, climate, appliances, audio or video equipment, etc. can be done remotely using only one smartphone.
Wheelchairs
The same control scheme could drive a wheelchair through an Android mobile: the chair can be moved easily by giving direction commands to the Android app.
References
1. Web link: http://us.wow.com/wiki/Differential_wheeled_robot (last accessed 21.08.18)
3. Kishan Raj KC, "Controlling a robot using Android interface and voice," Turku University of Applied Sciences, Finland, written report, 2012.
9. John Piccirillo, "The Art and Science of Selecting Robot Motors," University of
14. Jason Tyler, App Inventor for Android, 2nd edition, Wiley, pp. 100-102.
18. AT89S52 8-bit microcontroller, Atmel Corporation, http://www.atmel.com
19. The official Bluetooth website from the Bluetooth SIG: http://www.bluetooth.com
20. Muhammad Ali Mazidi and Janice Gillispie Mazidi, The 8051 Microcontroller and Embedded Systems.