
Design Proposal

4/13/16

ECEN 403 Senior Design Project


Texas A&M University, College Station, TX
Team 15
Kevin Bradshaw
Fuhua Song
Yuan Tian
Zhengshuai Zhang

Modularized Robotic Arm

TABLE OF CONTENTS
List of Figures
Figure 1: System Block Diagram
Figure 2: System Sketch
Figure 3: Gantt Chart
Figure 4: Power Block Diagram
Figure 5: PWM Driver Connection
Figure 6: Software Flowchart
Figure 7: FLTK GUI Window
Figure 8: Robotic Arm Servo Motor and Movements Definition
Figure 9: Servo Motor 1 Mapping
Figure 10: Servo Motor 2 Mapping
Figure 11: Servo Motor 3 Mapping
Figure 12: General Communication Network
Figure 13: Raspberry Pi 2 Communication Network
Figure 14: Kinect Communication Network
Figure 15: Myo Communication Network
Figure 16: Physical Structure of Hand
Figure 17: Hand Control Flowchart
Figure 18: EasyDriver Board
Figure 19: Stepper Motor
Figure 20: Stepper Motor Connection Diagram
Figure 21: Flex Sensor Circuit
List of Tables
Table 1: System and Subsystem Overview
Table 2: Preliminary Test Plan Overview
Table 3: Final Test Plan Overview
Table 4: Team Member Responsibility
Table 5: Robotic Arm Servo Motor Minimum Supply
Table 6: Proposed Power Supplies
Table 7: Motor Mapping
Table 8: Data Transfer Estimation

Abstract
The focus of this project is to design and build a modularized robotic arm that can mimic a
user's arm movements. The overall system is composed of Power, Control, Software,
Communications, and Hand/Tools subsystems. The robotic arm system utilizes Kinect
technology to send the arm movement sampling signal to the mechanical robotic arm. A
developed Graphical User Interface (GUI) using the C++ Fast Light Toolkit (FLTK) library
outputs the main skeletal tracking data of the user. In the sampling stage, the real-time data of
human joints captured by the Kinect will be displayed on a window. The data is further
processed to define the Pulse Width Modulation (PWM) used to drive several servo motors on
the robotic arm. Five servo motors in total are integrated on the robotic arm. The movements
of the robotic arm are defined by the rotation speed and direction of these servo motors. The
modularization of the robotic arm is achieved by integrating the Myo gesture controller so that
the user can control the robotic arm to select and replace tools depending on the scenario.
When a specific gesture is performed, the robotic arm executes automated movements,
including detaching the current tool and fetching the next tool.
In addition, all data and instructions are transmitted wirelessly among the subsystems. The main
costs associated with this project include the Raspberry Pi 2, the Kinect Sensor, and a
Lynxmotion Robotic Arm.

1. INTRODUCTION

1.1. Project Overview

The Modularized Robotic Arm is a proof-of-concept system that is designed and constructed to
create a better user interface for controlling robotics by wirelessly mimicking a user. Instead of
having just a hand on the end of the robotic arm, it will consist of a clamp with multiple tools to
be used in a variety of applications. The first main focus of this project was a bomb squad
vehicle because of its wireless and mobile characteristics. The second main focus was an
application for medical professionals to perform surgery or treat highly infectious patients in a
hospital without actually being in the same room as the patient. Now, the focus has shifted to
making the arm (for the most part) modular so that it can work in different types of applications.
This is done by adding the tools necessary for a specific application directly to a tool assembly in
front of the arm. Although the system won't be mounted on a mobile platform, the aim is still
to make it wireless in order to demonstrate the possible mobile concept in future designs.
1.2. Problem Statement

The field of automated robotics is constantly improving, but there are certain high-stakes situations
where a person's instincts and perspective are superior to those of current automated
processes. Additionally, when a situation can mean life or death, an automated process can't be
predicted. By creating a robotic interface solely controlled by human interaction, movements,
and gestures, a highly trained professional can act accordingly in such situations. This also keeps
these professionals in a safe environment while a robot is sent into dangerous situations.
1.3. Previous Solutions

In the public service application, current bomb squad vehicles have no user-friendly interface
that maps a user's movements directly to the operation of the robot. In the medical field,
current precision surgeries by robotics are controlled by mechanical controllers that
aren't wireless. Robotic arms that have been designed to be wirelessly controlled generally
don't have modularized tools for specific applications, nor are they wireless enough to be made
into a mobile system. The Modularized Robotic Arm system is designed to combine all these
areas into one conceptual design.
1.4. Background

This project was founded on the idea of controlling a robot by only movements and gestures so
that a safe environment could be created for the user. In order to do this, a completely wireless
system that could rely on different types of communications was designed. For this proof
of concept, Wi-Fi is used so that the system can be demonstrated easily in different areas of the
university.

2. SYSTEM DESIGN

2.1. Proposed System Diagrams

The proposed system will consist of the major components listed in Figure 1. Several more
components are used in the lower-level designs, but these are defined later in the subsystem
sections.

Figure 1. System Block Diagram

Figure 2. System Sketch (Physical layout)

2.1.1. Definition of Interfaces


Table 1 describes the main inputs and outputs to the defined system. Each subsystem has its
own specific requirements in order to be able to connect into the whole system.
Table 1: System & Subsystem Interfaces

Interface | Description | Input | Input Medium | Output | Output Medium
Control | 5 servo motors (5 V, 1.75 A); 1 stepper motor minimum | PWM duty cycle | PCA9685 | Position | X, Y, Z coordinate
Power | Raspberry Pi 2 | 5 V, 1.2 A minimum | USB 2.0 Micro-B (external power supply) | 5 V and 3.3 V at 100 mA | GPIO pins
Software | GUI | Skeletal tracking Kinect data | USB 2.0 Micro-B, 300 Mbs | PWM duty cycle | Wireless
Communication | PAU06 Wi-Fi; Myo Bluetooth; Kinect USB | Bits/latency | USB Raspberry Pi 2 attachment | | Transmitter/receiver
Hand Motion Recognition | Myo | User's gesture | Gyro, accelerometer | Gesture recognition | Bluetooth

2.1.2. Relevant Standards and Constraints


IEEE 802.11n
WPA2 Enterprise
AWG standard
Metric system
2.1.3. Justification of Design Strategy

This system was designed in order to be highly modifiable for different uses. This is so that the
focus would be on the interfaces, control, and physical movement of the robotic arm.
2.2. Analysis of Design Concepts

The overall system should be able to mimic the user's movements within 10° accuracy and a
3-inch radial span. Since the Kinect's skeletal tracking is more accurate than this and every
driver being used can handle high frequencies with this amount of data, the final system
should be able to meet this requirement.

2.2.1. Power Subsystem

There are several different power connections throughout the overall system. On the actual
robotic arm, there will be two battery power supplies. One will power all the motors and control
while the other powers the Raspberry Pi 2 and its peripherals. The supply that powers the motors
will consist of a buck converter that can support the current draw needed by each motor. Lastly,
the laptop, Myo, and Kinect will simply be powered through a standard wall outlet. Any other
sensor that may be added to the system will be connected through the Raspberry Pi 2 or the main
external supply. The supplies were calculated with extra power-draw capability for this, and also
so that extensive tests can be run on a single charge.
2.2.2. Control Subsystem

This subsystem consists of the controls and coding necessary on the Raspberry Pi 2 in order to
control steady movements for the robotic arm. Each motor on the arm is driven by individual
PWM signals that have different pulse length ranges. By matching the minimum and maximum
pulse length ranges to physical movements of the user, a duty cycle can be created. This control
is dependent on the data received from the Kinect, the speed of the data, and the file format.
Individually, this subsystem will be able to control the robotic arm to any specific movement
(within the limits of the physical system) the user inputs from the computer. This can be verified
by measuring the actual position of each motor with respect to the mounting of the system.
Accurate positions of the input should closely match the position of the system. Lastly, the speed
ratio of duty cycle ranges and PWM frequency can be verified. This is done by measuring the
physical position of the arm and the speed it takes to move at different set frequencies. Overall,
the Raspberry Pi 2 should be able to output the correct position and time of the robotic arm
independently of wireless communication to the arm. When the system is complete, the data from
the control system should closely match the data sent from the Kinect in order to provide
accurate movements between the two.
2.2.3. Hand/Tool System

The Hand and Tool System consists of a mechanical hand structure with a stepper motor in the
middle which controls the movement of three fingers. The stepper motor is driven by the PWM
signals produced by the driver board and the Raspberry Pi 2. When the stepper motor
rotates clockwise, the fingers grab a specific tool; when it rotates counterclockwise, the fingers
release the tool to its original position. To control the hand to grab or release tools, the Myo
armband reads the user's specific gesture and sends the control information to the Raspberry Pi.
The Myo triggers specific code to activate and perform a particular motion to replace the current
tool with the selected tool.
2.2.4. Wireless Subsystem

The wireless subsystem integrates all of the hardware and software components of the project.
This is vital in order to have everything communicating with everything else. This subsystem is
divided into three sections: the Raspberry Pi 2, the computer, and the Microsoft Kinect. Between the
Raspberry Pi 2 and the laptop, the selected communication method is Wi-Fi due to its long-range
capabilities in the testing environment. Between the Myo Gesture Controller and the laptop, the
method of communication is Bluetooth, since the Myo is only compatible with its included
Bluetooth module. Between the Microsoft Kinect and the laptop, the method of communication is
USB.
2.2.5. Software Subsystem

The software subsystem mainly serves to analyze and display data. The subsystem
includes user interface design, data analysis methods, and multi-subsystem cooperation. The
entire software subsystem is based on the C++ language, using Microsoft Visual Studio as the
development platform. An FLTK window is adopted for the user interface. The user can start
and terminate the entire system through the buttons on the window. Also, the system status and
the sampling data of tracked user arm movements are displayed on the window with minimal
latency. The sampling data obtained from the Kinect is further processed to obtain the data (duty
cycle) that is sent to the Raspberry Pi 2 to define the PWM signals which drive
the servo motors. As for hand tool control, interrupts can be sent to the software subsystem
by performing specific gestures on the hand controller. After an interrupt is recognized by the
software system, the control system exits the mimicking stage and enters the subroutines that
drive the robotic arm to perform automated movements to select and replace the hand tools. The
control system returns to the mimicking stage after the subroutine finishes. The PAU06 Wi-Fi
adaptor is used to perform wireless data transmission in the processes mentioned above. The
code for the PAU06 has to be integrated into the software subsystem. In addition, the software
subsystem plays the most important role in defining the interactions and cooperation of the
other subsystems. The control, communication, and power subsystems all rely on the software to
perform tasks and cooperate correctly. A detailed description of the control methods performed
by the software subsystem, the flow chart of the defined protocols, and an example of a
basic FLTK GUI window can be seen in Appendix D.
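
To make the interrupt hand-off concrete, the following is a minimal sketch of the mimic/interrupt control flow, written in Python for illustration (the actual subsystem is C++). The queue-based gesture interface and the callbacks get_kinect_sample, send_duty_cycle, and run_tool_swap are assumed placeholders, not names from the real code.

import queue

gestures = queue.Queue()  # filled elsewhere by a hypothetical Myo gesture handler

def control_loop(get_kinect_sample, send_duty_cycle, run_tool_swap):
    # Mimic the user until a gesture interrupt arrives, then run the
    # tool-replacement subroutine and resume the mimicking stage.
    while True:
        try:
            gesture = gestures.get_nowait()  # non-blocking interrupt check
        except queue.Empty:
            gesture = None
        if gesture is not None:
            run_tool_swap(gesture)           # automated select/replace movements
            continue                         # return to the mimicking stage
        sample = get_kinect_sample()         # skeletal tracking data from the Kinect
        send_duty_cycle(sample)              # processed duty cycle out to the Pi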
2.3. Project Deliverables
2.3.1. Deliverables

Preliminary Subsystem Test

1. Power:
a. Robotic arm, the Raspberry Pi 2, and the peripherals all powered by regulated
external battery supplies.
b. An electrical load analysis at 80% of the full load should match the time of use
defined.
2. Control:
a. Robotic arm set to specific positions by user input in its defined range of motion.
b. Speed measurements and PWM relationship to robotic movement.
3. Wireless Communications:
a. Accurate data representation of camera modules.
b. Accurate data representation of packets the same size as Kinect data points.
4. Software:
a. A user-friendly graphical user interface that can organize skeletal tracking data,
display system status, and deliver an easy control over the entire system.

b. An algorithm that can analyze skeletal tracking data to yield the correct position
of each joint.
c. An interrupt code routine that ensures correct operations in the stage of selecting
and replacing tools.
5. Hand System:
a. Robotic hand that grabs/releases a specific tool with a stepper motor rotating
clockwise/counterclockwise.
b. The Myo armband sends the control information to the hand by reading the user's
gestures.
2.3.2. Project functions and features

At the end of the project, the entire system should work according to the user's movements and
gestures. The system should be able to show the data measurements of position and speed of
movement for both the user and the robotic arm.
2.3.3. Preliminary Test Plan (Mini-Demo) for each subsystem

Table 2: PRELIMINARY TEST PLAN OVERVIEW

Subsystem | Design Parameter | Units | Condition | Specification
Control | Duty cycle ranges | Percentage and degrees of motion | Maximum and minimum control ranges of each motor, defined by the range of motion of an arm | < 10° accuracy for angle inputs, 3-inch radial for position
Control | Speed | Inches/sec | Low-frequency test ranges for arm movement | Able to define at least three different speeds of movement in inches/sec
Communications | Wi-Fi, Bluetooth, USB | Latency, distance | Correct data sending at a minimum distance of 10 m | > 10 meters for Wi-Fi
Software | Algorithm system status | Percentage and degrees of motion | Maximum and minimum control ranges of each point; correct/clear system status and data display | < 10° accuracy
Hand | Finger movement | Degrees | The fingers can close and grab small tools with the motor rotating a specific degree of motion | Open gesture ~0°, half gesture ~360°, closed gesture ~720°
Power | Output power | Watts | Full power draw of motors and peripherals | < 20 watts constantly over 2.5 hours
2.3.4. Final Test Plan (Demo Day)

Table 3: FINAL TEST PLAN OVERVIEW

Subsystem | Design Parameter | Units | Condition | Specification
Control | Duty cycle ranges | Percentage and degrees of motion | Maximum and minimum control ranges of each motor synced to the user's arm movements | < 5° accuracy, 1.5-inch radial
Communications | Wi-Fi, Bluetooth, USB 2.0 | Latency, distance | Maximum bit usage of 21 Mbs; control at a minimum distance of 10 m | > 10 meters, < 0.5 seconds
Software | Algorithm system status | Percentage and degrees of motion | Maximum and minimum control ranges of each point synced to the robotic arm; correct/clear system status and data display | < 5° accuracy, 1.5-inch radial
Hand | Myo gesture | Degrees of finger motion | The fingers can grab or release a specific tool following the user's specific gesture | Open gesture ~0°, half gesture ~360°, closed gesture ~720°
System | User movement | Position data | The user should be able to fully control the robotic arm and hand tools within the designed specifications set for each subsystem | Position observation
Power | Output power | Watts | Full power draw of motors and peripherals | < 20 watts

The final test plan ensures correct overall system performance by testing the system with all
subsystems combined. The test results should meet the expectation that the robotic arm can be
controlled by a user. Using arm and gesture movements to perform specific tasks, the system
should run with minimal latency, resulting in close to real-time movement with the user and
robotic arm at least 10 m apart.
2.4. Project Impact (ABET Considerations)
Economical: Although the cost of the current project is low, since it is a proof of concept with
much room to grow, the costs will grow accordingly. It would not be feasible to
manufacture this commercially for the general public, but there would still be much potential for
public service or medical field applications.
Environmental: This project doesn't create a negative impact on the environment unless its
electrical components are improperly disposed of.
Social: Since this project can be used for a variety of applications, this can be a huge benefit for
the open source world of Kinect users. It can help push others to improve upon robotics
controlled by user movements.
Manufacturability: It's not feasible to manufacture and distribute all the parts of this project
together, such as the laptop, Kinect, Raspberry Pi 2, and mounting structure. It would be feasible
to construct the circuits used and the connections to the Raspberry Pi 2.
Health and Safety: Before this project could be introduced into actual high-stakes situations, it
would need to be modified to have the best possible accuracy and calibration.

3. PROJECT MANAGEMENT
3.1. Responsibility Matrix

Table 4: Team Member Responsibilities

Team Member | Responsibility
Kevin Bradshaw | Project Management; Robotic Arm Control and Positioning; Power Systems
Fuhua Song | Wireless Communications; Raspberry Pi 2 Expert
Yuan Tian | Coding Expert; Raspberry Pi 2 Expert
Zhengshuai Zhang | Hand/Tools Controls; Feedback Sensors

3.2. Gantt Chart

Figure 3. Gantt Chart

3.3. Budget

Subsystem | Budget
Power and Control System | $125
Wireless Communication System | $75
Mechanical System | $150
Tools/Hand Control System | $35
Total | $385

4. CONCLUSION

The Modularized Robotic Arm is a proof-of-concept project that demonstrates the feasibility of
wireless movement synchronization between the user's arm and the robotic arm. This project has
high value in various work and academic fields because the robotic arm is modularized and
allows users to perform specific tasks depending on the scenario. In addition, the
robotic arm is controlled wirelessly, so the risks of the user being in a dangerous work
environment are minimized.

REFERENCES

"Force Sensor." FlexiForce A201 Sensor. Tekscan Cooperation. Web. 13 Apr. 2016.

<https://www.tekscan.com/products-solutions/force-sensors/a201?tab=specifications> .
"EasyDriver - Stepper Motor Driver." Drivers. Sparkfun. Web. 13 Apr. 2016.
<https://www.sparkfun.com/products/12779>.

Admin. "Multi Camera Adapter Module for Raspberry Pi." Arducam Home. Arducam. Web. 13
Apr. 2016. <https://www.sparkfun.com/products/12779>.

"Is It Possible to Predict How Long a Surgery Will Last?" Medascape. Web. 13 Apr. 2016.
<medscape.com/viewarticle/724756>.

APPENDIX A, BILL OF MATERIALS

Lynxmotion Robotic Arm - $0
Keyboard - $10.70
Wood - $20.20
Screws - $2.48
Corner Bracket - $2.98
Drill Bits - $15.98
Spray Pin - $5.88
Buck Converter - $12.60
Panda Wireless PAU06 - $19.99
Raspberry Pi 2 Model B - $49.99
AD/DA Expansion Board - $38.99
External Power Supply for Raspberry Pi - $19.99

APPENDIX B, POWER SUBSYSTEM

According to Medscape, a web resource for physicians and healthcare professionals, the average
surgical time, which depends on a variety of different cases, is typically around 2.5 hours. This
can vary by about 1.2 hours. Since there are two main external power supplies connected
to the arm and the MCU, the requirement for the entire system is set to a maximum of 3 hours to
allow for a 30-minute allowance over the average run time. This allows for a realistic power draw,
including losses, that can match the average run time of 2.5 hours. The following electrical load
analysis calculates the minimum power supply needed to run both major systems.
According to the total current drain and time requirement, the total power supply should have
a capacity above 4.83 Ah over a range of 2.2 V to 5 V. In order to test the entire system under full
load for longer periods and to account for power losses, a battery with approximately three times
the calculated capacity will be used. The current drain and power for each individual motor can
be seen in Table 5. The specific supply to be used is listed in Table 6 along with its ratings.
The Raspberry Pi 2 requires a minimum 5 V, 1.2 A supply. In order for this module to have at
least a 3-hour run time, the total power supply should have a capacity above 3.6 Ah. Thus a 5 V,
4.6 Ah supply should work exceedingly well for powering the Raspberry Pi 2. The specific supply
to be used is listed in Table 6 along with its ratings, and the sizing arithmetic is checked below.
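
As a quick check, the capacity figures above follow directly from capacity = current drain x run time:

arm_current_a = 1.61  # total servo current drain, from Table 5
pi_current_a = 1.20   # Raspberry Pi 2 minimum supply current
runtime_h = 3.0       # 2.5 h average run time plus the 30-minute allowance

print(arm_current_a * runtime_h)  # 4.83 Ah minimum capacity for the arm supply
print(pi_current_a * runtime_h)   # 3.6 Ah minimum capacity for the Pi supply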
Table 5: Robotic Arm Servo Motors - Minimum Supply

 | Shoulder Motor 1 | Shoulder Motor 2 | Elbow Motor | Wrist Motor 1 | Wrist Motor 2 | Total
Current Drain | 700 mA | 350 mA | 230 mA | 180 mA | 150 mA | 1.61 A
Power | 3.36 W | 1.68 W | 1.104 W | 0.864 W | 0.72 W | 7.73 W

Table 6: Proposed Power Supplies

Module | Power Consumption | Proposed System Power Supply
Raspberry Pi 2 | 6 W minimum | PB4600, rated at 5 V, 4.6 Ah
5 Servo Motors | 7.75 W minimum | WKDC12-14NB, rated at 12 V, 14 Ah

Using a 12 V battery for the servo motors, a voltage regulator should be used to bring the
voltage down into the motors' allowable range. This can be set to approximately 5 V with an
adjustable buck converter, which can be connected directly to the motor power supply. Lastly,
the PWM driver can be powered through the Raspberry Pi's 3.3 V output pin. Figure 4 shows
the power connections throughout the entire system.
Fig 4: Power Block Diagram

APPENDIX C, CONTROL SUBSYSTEM

There are two possible methods of syncing the data from the Microsoft Kinect to the robotic arm.
Both methods can be tested for position and speed accuracy by matching the user's arm to the
robotic arm. This is to be done next semester when all the systems are integrated together.
Currently, the robotic arm can be tested for independent position, movement, and speed. Each
motor can be tested for minimum and maximum pulse lengths. These ranges are numbers out
of 4096 because of the specific PWM driver (Figure 5) being used. A physical origin defined by
a PWM signal can also be set in order to calibrate to the user. This origin signal can be the pulse
length that needs to be set for the motors to move the robotic arm to a relaxed hanging position
matching that of the user. The end result for ECEN 404 will be approximately 10°
accuracy (3-inch radial offset) movement according to the user's position.

Figure 5: PWM Driver Connections

In order for each motor to be mapped to a skeletal joint on the human body, certain ranges must
be defined for the system. Since the data from the software is three points in 3D space for each
joint, two vectors can be defined for each part of the robotic arm. One vector points from the
shoulder to the elbow and the other from the elbow to the wrist. With a defined origin,
two 3D angles can be defined using the following formula:

θ = arccos[(X · Y) / (||X|| ||Y||)]

Each angle would be split into a shadowed angle on the coordinate plane listed in Table 7. The
constantly changing angle from the user is sent to the Raspberry Pi and matched to the nearest
angle to which the robotic arm can be set. That angle is mapped to a duty cycle for a motor,
which essentially follows these formulas:

Duty Cycle = (Coordinate Input Theta - Coordinate Origin) / (Range of Motion Span)
Pulse Length = Duty Cycle × 4096

Table 7: Motor Mapping

Motor | Coordinate Mapping | Range of Motion
Shoulder Motor 1 | X-Y | 0° ~ 180°
Shoulder Motor 2 | Y-Z | 0° ~ 135°
Elbow Motor | Y-Z | 0° ~ 150°
Wrist Motor 1 | X-Z | 0° ~ 90°
Wrist Motor 2 | Y-Z | 0° ~ 180°
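
A minimal Python sketch of this mapping is shown below, using the elbow motor's 0° to 150° range from Table 7. The helper names and the sample joint coordinates are illustrative placeholders, not values from the actual system.

import math

def joint_angle(shoulder, elbow, wrist):
    # Angle (degrees) between the shoulder-to-elbow and elbow-to-wrist vectors.
    x = [e - s for s, e in zip(shoulder, elbow)]
    y = [w - e for e, w in zip(elbow, wrist)]
    dot = sum(a * b for a, b in zip(x, y))
    norms = math.sqrt(sum(a * a for a in x)) * math.sqrt(sum(b * b for b in y))
    return math.degrees(math.acos(dot / norms))

def pulse_length(theta, origin_deg, span_deg):
    # Map an angle to a 12-bit PCA9685 pulse length (a number out of 4096).
    duty_cycle = (theta - origin_deg) / span_deg
    duty_cycle = max(0.0, min(1.0, duty_cycle))  # clamp to the motor's range
    return min(4095, int(duty_cycle * 4096))

# Example: elbow motor with a 0-150 degree range of motion
theta = joint_angle((0.0, 0.0, 0.0), (0.3, -0.2, 0.1), (0.6, -0.1, 0.1))
print(pulse_length(theta, origin_deg=0.0, span_deg=150.0))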

These formulas would be used for all 5 motors on the robotic arm. The most difficult part to
match correctly is the speed of the user's arm to the speed of the robotic arm. This is
done by first testing the PWM frequency and finding the relationship of frequency versus actual
timed movement. In order to test this, a low frequency will be set on the duty cycles of each
motor. Then, the time it takes to move across the minimum-to-maximum range of motion will be
timed with a stopwatch. By increasing the frequency and timing each range, a relationship of
set frequency to speed of movement (time over the range of motion) can be mapped, as in the
sketch below. By taking the speed of the user's arm from the software subsystem, that speed can
be set on the robotic arm after a linear relationship is defined.
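
The sketch below illustrates that calibration step; the frequency/time pairs and the 12-inch sweep length are hypothetical stand-ins for the stopwatch measurements, and a simple least-squares line gives the frequency-to-speed relationship.

pwm_hz = [50, 100, 150]  # hypothetical test frequencies
secs = [4.0, 2.1, 1.4]   # hypothetical stopwatch times over the full range
span_in = 12.0           # hypothetical arc length of the sweep, in inches
speeds = [span_in / t for t in secs]

# Least-squares fit of speed = a * frequency + b
n = len(pwm_hz)
mean_f = sum(pwm_hz) / n
mean_s = sum(speeds) / n
a = sum((f - mean_f) * (s - mean_s) for f, s in zip(pwm_hz, speeds)) \
    / sum((f - mean_f) ** 2 for f in pwm_hz)
b = mean_s - a * mean_f
print(f"speed = {a:.3f} * frequency + {b:.2f} inches/sec")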
With tested data for position, movement, and speed, a comparison can then be made between the
robotic arm and the users movement. With this comparison, clear offsets of these variables can
be observed and modified in order to provide proper connection when these subsystems start to
work together.

APPENDIX D, SOFTWARE SUBSYSTEM


Software Flow Chart

Fig 6. Software Flow Chart


The protocols defined by the chart above include several check procedures to prevent
system failures. The automated movements mentioned in previous sections are defined in the
interrupt (Fetching Tools) protocol.

Graphical User Interface Example


Fig 7. FLTK GUI Window

The GUI shown in Figure 7 is a preliminary window developed using FLTK. Its main purpose in
the early development stages is to display the skeletal tracking data from the Kinect. Three duty
cycle out_boxes(*) will be added to this window to show the real-time duty cycles sent to the
Raspberry Pi 2. The algorithms used to calculate the three duty cycles are defined by "Hand X
Coordinate Difference", "Shoulder Y Coordinate Difference", and "Theta Difference". The last
two out_boxes are yet to be added to the window. Also, more functions and buttons will be
added to the window as the project proceeds.
(*) An out_box is a white rectangular space, differentiated from the grey background of the
window, that has data displayed in it.

Arm Synchronization Strategies by Software Subsystem


Fig 8. Robotic Arm Servo Motors and Movements Definition

As mentioned in other sections, the robotic arm movements are defined by the simultaneous
rotation of the three servo motors labeled in Figure 8 above with their specific rotation
movements. The rotation movements of servo motor 2 and servo motor 3 are in the Y-Z plane,
and the rotation movement of servo motor 1 is in the X-Y plane.
The detailed strategies and methods used to map real human arm movements to the robotic arm
movements are explained as follows:

(1) Servo Motor 1 (Movement 1)


Fig 9. Servo Motor 1 Mapping

Figure 9 shows that a value difference results from the user's hand movements along the X
coordinate in the X-Z plane. This difference is used to define servo motor 1 (movement 1), as the
rotation movement of the servo motor in the X-Y plane can be easily represented by the
difference in X-coordinate value. This mapping method is legitimate and effective because
the robotic arm mimics user arm movements that are limited to under 180 degrees (the front
side of the body) in the X-Y plane.
(2) Servo Motor 2 (Movement 2)

Fig 10. Servo Motor 2 Mapping

It's tricky to map the user's arm movements to servo motor 2 (movement 2) in an easy fashion.
In general, when the elbow points (indicated in Figure 10 above as B1 and B2) rise, servo
motor 2 has to respond accordingly by rotating clockwise or counterclockwise. However,
rotating in either direction can result in an increase in the Y-coordinate value of the elbow points.
To differentiate those two scenarios, another measurement has to be taken into consideration. As
shown in the figure, stretching the elbow forward results in a negative difference in the
depth of the elbow point (Z-coordinate value), while stretching the elbow backward results in a
positive difference. Based on these two cases, a clear mapping method is generated to synchronize
the robotic arm movements with the user's arm movements for servo motor 2.

(3) Servo Motor 3 (Movement 3)


Fig 11. Servo Motor 3 Mapping

Servo motor 3 (movement 3) is relatively easy to define. As indicated in Figure 11, theta
(an angle) is used to track the relative position of the upper limb and lower limb of the user's arm.
This is achieved by calculating the angle between two vectors, where the first vector is defined by
the shoulder point and the elbow point, and the other is defined by the elbow point and wrist point.
The difference in theta is therefore used to define the servo motor 3 movements.

APPENDIX E, COMMUNICATIONS SUBSYSTEM


The communication network is divided into three sections: the Raspberry Pi 2, the Microsoft
Kinect, and the Myo Gesture Controller. The laptop is the central port through which most of the
information passes.
Fig 12. General Communication Network

Raspberry Pi 2 to Laptop
The Raspberry Pi 2 communicates directly with the computer through Wi-Fi. The Raspberry Pi 2 is
to be attached to the mechanical arm mount. The goal is for the Raspberry Pi 2 to be able to
communicate from one end of a building to another. This is dependent upon the range
of the communication network, thus the Raspberry Pi 2 requires the method of communication
with the maximum range. Wi-Fi is the most practical method of communication for this scenario
since it allows for significantly longer range than Bluetooth or other forms of wireless
communication. Furthermore, Wi-Fi permits a large amount of data to be transmitted at a time.
For the Raspberry Pi 2, a PAU06 Wi-Fi adaptor is attached to the USB port since the module
itself does not have onboard Wi-Fi. Since the university network uses the WPA2
Enterprise interface, the wpa_supplicant file has to be configured to authenticate with the
network. WPA2 is the most common modern Wi-Fi interface used by public service buildings.
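
A sketch of such a wpa_supplicant network block for a WPA2 Enterprise (PEAP/MSCHAPv2) network is shown below; the SSID and credentials are placeholders, not the university's actual values.

# Hypothetical WPA2 Enterprise entry in /etc/wpa_supplicant/wpa_supplicant.conf
network={
    ssid="ExampleEnterpriseSSID"
    key_mgmt=WPA-EAP
    eap=PEAP
    identity="username"
    password="password"
    phase2="auth=MSCHAPV2"
}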
From the computer, PuTTY will be utilized to connect to the Pi. Unfortunately, there is not a
campus wireless system with static IPs available for students, so in order to connect to a known
host, a third-party DNS service such as DynDNS or NoIP is needed. The service chosen for
this application is DynDNS, due to cost. The Pi, when powered, would register with the
DNS service, and the laptop would then connect to the Pi through that hostname using PuTTY.
On the Raspberry Pi, two camera modules will be connected to
an IVPORT module, which is a camera module multiplexer allowing for multiple camera
attachments. I2C would also need to be enabled to allow for hardware component
communication, such as the motor driver attachments to the Raspberry Pi 2 and the IVPORT
module. Furthermore, three tactile sensors will be mounted directly to each of the three
grippers of the mechanical extensions. These sensors will be connected directly to the GPIO pins
of the Raspberry Pi. Five servo motors are connected to the PCA9685, which is then connected
to the Raspberry Pi 2 GPIO pins. One stepper motor is connected to the EasyDriver, which is also
connected to the GPIO pins on the Raspberry Pi. Figure 13 shows the general communication
network layout for the Raspberry Pi 2.
Fig 13: Raspberry Pi 2 Communication Network

Microsoft Kinect to laptop

The Kinect sends the visual data it gathers through a USB connection. Direct
USB connection permits a faster transfer rate. This visual feedback information
is extracted and transmitted to the computer. USB is more secure and allows for the
greater transfer capacity necessary for the large amounts of data sent by the
Kinect. This information is then processed, and code converts the visual information to
coordinates and duty cycles for skeletal tracking. The information is then transmitted
through the Wi-Fi transmitter and receiver and turned into PWM in order to control the motor
movements. Figure 14 shows the general layout of the communication network for the Kinect.
Fig 14: Kinect Communication Network

Myo Gesture Controller to Laptop

The Myo Gesture Controller communicates with the laptop through Bluetooth. The gesture
controller sends basic instructions that trigger certain code to activate when processed
by the computer, which results in the mechanical arm moving in a predefined manner. The
user won't be very far from the computer due to the Kinect's limited range, and
the instructions and data received/transmitted by the controller require very little bandwidth.
Furthermore, the Myo Gesture Controller is only compatible with the Bluetooth module which
comes with it. The information sent to the Bluetooth module is then converted into a
package that can be sent over Wi-Fi to the Pi. From there, the processed information is
shortened and sent off to the drivers, which in turn control the motors. The Raspberry Pi 2
would also feed information back to the Myo Gesture Controller. Tactile sensors
attached to the mechanical extension would send a signal from the Raspberry Pi 2 to the
computer. This information would in turn trigger coded instructions transported
through the Bluetooth module, causing the Gesture Controller to react accordingly. Figure 15
shows the general communication network layout of the Myo Gesture Controller network.
Fig 15. Myo Gesture Controller Communication Network

Table 8. Data Transfer Estimation

 | Raspberry Pi 2 | Kinect | MYO Hand Controller
Communication | Wi-Fi | USB 2.0 | Bluetooth
Bandwidth Capacity | 300 Mbs | 480 Mbs | 3 Mbs
Max Transmitted Bandwidth | Cameras: 13.06 Mbs; Sensors: 60 Bps | 7.741 Mbs* | < 1 Mbs
Max Received Bandwidth | 21 Mbs | N/A | < 1 Mbs
Latency | 0.045 s | 0.033 s | 0.33 s
Range | 100 meters | 1-3 meters | 15 meters

* Video bit rate = pixel count × frame rate × motion factor × 0.07 ÷ 1000 = bit rate in kbps

In Table 8, observe that the transmitted/received bandwidth for each module is well below its
maximum capacity. As for the Bluetooth module, the bandwidth depends on the
efficiency/length of the C code, but it is estimated to be less than 1 Mbs.
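
For example, applying the bit-rate rule of thumb from the table footnote to a hypothetical 1280x720 stream at 30 frames per second with a motion factor of 4 (these input values are illustrative assumptions):

pixels_per_frame = 1280 * 720
kbps = pixels_per_frame * 30 * 4 * 0.07 / 1000
print(kbps)  # ~7741 kbps, on the order of the 7.741 Mbs listed for the Kinect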

APPENDIX F, HAND/TOOL SUBSYSTEM

Fig 16. Physical Structure of Hand

Hand movement control

Using a stepper motor in the middle part of the structure, a mechanical hand can be built with
precision movements. Since a stepper motor defines positions by utilizing multiple toothed
electromagnets arranged around a central gear, it can provide a lot of torque at the low end.
The stepper motor is controlled to rotate clockwise (to grab) or counterclockwise (to release);
it rotates the middle gear, which drives the fingers to grab or release.

Fig 17. Hand Control Flowchart

Stepper Motor Control: Hardware

The EasyDriver is the stepper driver chosen for this part. The driver is connected between the
Raspberry Pi 2 and the stepper motor as shown in the image below. Between the Pi and the driver
board there are power lines (3.3 V and GND) and four control lines used for switching the stepper
motor phases. The motor is connected to the driver with four wires.
Fig 18. EasyDriver Board

Fig 19. Stepper Motor (XY42STH34-0354A)

Fig 20. Stepper Motor Connection Diagram

Stepper Motor Control: Connection


Motor - Driver: The red and yellow wires form one loop and connect to loop A on the driver
board. The grey and green wires form one loop and connect to loop B on the driver board.

Driver - RPi: The two GND pins are connected in parallel to the GND pin on the Raspberry Pi 2,
and the M+ pin is connected to the 5 V pin on the Raspberry Pi 2. The STEP pin is connected to
pin 16 and the DIR pin to pin 18.

Stepper Motor Control: Software


Server side (Raspberry Pi)
To move the motor shaft, a sequence of square-wave pulses has to be generated from the GPIO
pins on the Pi. The stepper control program is written in Python; a minimal sketch follows.
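
The sketch below uses the RPi.GPIO library and the pin assignments from the connection section (STEP on pin 16, DIR on pin 18, BOARD numbering); the step count and pulse delay are illustrative placeholders, not tuned values from the actual program.

import time
import RPi.GPIO as GPIO

STEP_PIN = 16  # STEP line to the EasyDriver
DIR_PIN = 18   # DIR line to the EasyDriver

GPIO.setmode(GPIO.BOARD)
GPIO.setup(STEP_PIN, GPIO.OUT)
GPIO.setup(DIR_PIN, GPIO.OUT)

def move(steps, clockwise=True, delay=0.005):
    # Send a square-wave pulse train; the direction selects grab or release.
    GPIO.output(DIR_PIN, GPIO.HIGH if clockwise else GPIO.LOW)
    for _ in range(steps):
        GPIO.output(STEP_PIN, GPIO.HIGH)  # EasyDriver steps on the rising edge
        time.sleep(delay)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(delay)

try:
    move(400, clockwise=True)   # e.g. grab; move(400, clockwise=False) releases
finally:
    GPIO.cleanup()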
Client side (Myo)

The Myo triggers specific code to activate and perform a specific motion to replace the current
tool with the selected tool. The Myo band reads the user's gestures, which are mapped to
specific motions.
Sensor

We will use a tactile sensor to detect and display how much force the user has applied to the
finger. The sensor will be attached to the fingertip, and the connection of the sensor is shown in
the figure below (Tekscan).
Fig 21. Flex Sensor Circuit
