
Modularized Robotic Arm

Team 15
Final (Progress) Report
ECEN 403 Capstone (Senior) Design Project
Texas A&M University, College Station, TX

Kevin Bradshaw
Fuhua Song
Yuan Tian
Zhengshuai Zhang


Abstract
The focus of this project is to design and build a modularized robotic arm that can mimic a person's
arm movements. The overall system is composed of Power, Control, Software, Communications,
and Hand/Tools Subsystems. The robotic arm system utilizes Kinect technology to send the
arm movement sampling signal to the mechanical robotic arm. A Graphical User
Interface (GUI) developed with the C++ Fast Light Toolkit (FLTK) library displays the main skeletal tracking
data of the user. In the sampling stage, the real-time data of human joints captured by the Kinect
is displayed on a window. The data is further processed to define the Pulse Width Modulation
(PWM) signals used to drive several servo motors on the robotic arm. Five servo motors
in total are integrated on the robotic arm, and its movements are defined by the rotation speed and
rotation direction of these servo motors. The modularization of the robotic arm is
achieved by integrating the Myo gesture controller so that the user can control the robotic arm
to select and replace tools depending on different scenarios. When a specific gesture is
performed, the robotic arm executes automated movements including detaching the current
tool and fetching the next tool. In addition, all data and instructions are transmitted wirelessly
among the subsystems. The main costs associated with this project include the Raspberry Pi 2,
the Kinect Sensor, and a Lynxmotion Robotic Arm.


Table of Contents
Abstract
Table of Contents
1 Project Overview
1.1 Current System Design
1.2 Project Deliverables
1.3 Definition of Interfaces
1.4 Task Ownership
2 Individual Contributions - Power Subsystem - Kevin Bradshaw
2.1 Significant Changes in Direction
2.2 Subsystem Status
2.3 Subsystem Challenges and Solutions
2.4 Subsystem Technical Details
2.5 Subsystem Testing
3 Individual Contributions - Control Subsystem - Kevin Bradshaw
3.1 Significant Changes in Direction
3.2 Subsystem Status
3.3 Subsystem Challenges and Solutions
3.4 Subsystem Technical Details
3.5 Subsystem Testing
4 Individual Contributions - Software Subsystem - Yuan Tian
4.1 Significant Changes in Direction
4.2 Subsystem Status
4.3 Subsystem Challenges and Solutions
4.4 Subsystem Technical Details
4.5 Subsystem Testing
5 Individual Contributions - Hand/Tool Subsystem - Zhengshuai Zhang
5.1 Significant Changes in Direction
5.2 Subsystem Status
5.3 Subsystem Challenges and Solutions
5.4 Subsystem Technical Details
5.5 Subsystem Testing
6 Individual Contributions - Communications Subsystem - Fuhua Song
6.1 Significant Changes in Direction
6.2 Subsystem Status
6.3 Subsystem Challenges and Solutions
6.4 Subsystem Technical Details

6.5 Subsystem Testing

7 Conclusions
References
Appendix A Budget Table
Appendix B Power Subsystem Schematic and BOM
Appendix C Control Subsystem Code
Appendix D Communications Subsystem Code
Appendix E Software Subsystem Code
Appendix F Hand/Tool Subsystem


1 Project Overview
The Modularized Robotic Arm is a proof of concept system that's designed and constructed to create a better user
interface for controlling robotics by wirelessly mimicking a user. Instead of having just a hand on the end of the robotic
arm, it will consist of a clamp with multiple tools to be used in a variety of applications. The first main focus of this
project was a bomb squad vehicle because of its wireless and mobile characteristics. The second main focus was then
an application for medical professionals to perform surgery or treat highly infectious patients in a hospital without
actually being in the same room as the patient. Now, the focus has shifted to making the arm (for the most part)
modular so it is able to work in different types of applications. This is done by adding the tools necessary for a specific
application directly to a tool assembly in front of the arm. Although the entire system won't be mounted on a mobile
platform, the aim is still to make it wireless in order to demonstrate the possible mobile concept in future designs.

1.1 Current System Design

The major components of this system include a Microsoft Kinect, MYO Armband Controller, Raspberry Pi 2,
Lynxmotion Robotic Arm, two external power supplies, and a 3D printed hand. A main goal of this project
is to make the robotic arm control wireless from the Kinect data and user interface. Figure 1.1 shows how this is
possible. The user's movements and gestures are captured by both the Kinect and the MYO and sent to a laptop. This
data is processed and sent to the Raspberry Pi 2, which in turn drives the motors that produce controlled movement to
replicate the user's arm.

Figure 1.1: System block diagram.


The primary system can be divided into two separate physical parts. Figure 1.2 shows the annotated parts of the control
and power. Figure 1.3 shows the annotated parts of the user and the interface.

Figure 1.2: Annotated Picture - Control and Power

Figure 1.3: Annotated Picture - User and Interfaces


1.2 Project Deliverables

The delivered system will meet these main goals:


1. A robotic arm that correctly mimics user arm movements
2. GUI that displays user movements, angles, and status of the connection between the control and the user.
3. Communication protocol for the control and interface systems to be linked properly.
4. Software packages with the code necessary to run the system.
The major components produced in this project include three software packages. The first consists of the Software
Development Kit (SDK) provided by Microsoft, modified with an algorithm which calculates all the angles and
vectors needed for the controls. The second is the communication code and instructions that allow the computer
to send the correct data packets to the robotic arm. Lastly, the third consists of the physical mount, power system
circuitry, and the control program used for the desired output. The components purchased or borrowed include the
Microsoft Kinect with its downloaded SDK, the Myo Armband with its downloaded SDK, the Lynxmotion Robotic
Arm, the Raspberry Pi 2, Adafruit's PWM Driver with its downloaded functions, a Stepper Motor Driver, and the power
supplies. The modification of the code to enable the combination of all this hardware and software is a significant
part of this project. It constitutes the actual movement synchronization with the replicated movement of the robotic
arm. The accuracy of the output should be observable with physical movement tests. The overall benefit of this system
is the wireless control of a system interface independent of the robotics. This means that, with further refinement and
financial investment, this machine could be used in areas where high stake situations require a human's perspective,
instincts, and movements.

1.2.1 Subsystem Specifications

The following specifications for each subsystem are required for later subsystem integration. These are summarized in Table 1.1:
Power Specification
The requirement for this subsystem is that the full power draw of each motor and the peripherals connected to the main
external supply be less than 20 Watts for over 2.5 hours. For the second external supply, the full power draw should
be less than 12 Watts for over 2.5 hours.
Control Specification
In order to correctly translate user movement to robotic movement, the arm control system has to
have specific position boundaries. This is so that each movement looks comparatively similar when it's later
synchronized to the user. When the arm is set to a specific position, the output data displayed must be within 10°
for angle inputs. The output data should also be within a 3 inch radius of each position it's set to.
Software Specification
In order to correctly interpret user movement, the software system must be able to interpret each position with less
than 10° of error for angle inputs. The output data should also be within a 3 inch radius of each position it's set
to. In addition, the graphical user interface, which is included in the software design, has to provide easy control of the
entire system throughout the sampling and mimicking stages.
Communication Specification
Communication between the various components of the system must be rapid yet secure. The total maximum bit
transfer rate from the Raspberry Pi to the computer (largest data package sent) is estimated to be 13.06 Mb/s with a
maximum time estimation of 0.045 s, assuming the transfer is instantaneous between devices and not accounting for
disruptions. For range, the communication between the robotic extension and the laptop must be taken into highest
consideration. The range is dependent on the wireless range of the router. However, for this application, the range is
approximated to be at least 10 meters.


Hand/Tools Specification
To close and open the fingers, the stepper motor should be able to rotate in the proper direction and by the proper
number of degrees according to the user's input. The motor rotates 1.8 degrees per step. The designed rotation is
270 steps (270 steps × 1.8°/step = 486° of shaft rotation), so the stepper motor should rotate 270 steps when the user
inputs a direction instruction.
Table 1.1: Subsystem Specification Compliance Matrix

Specification | Min | Nominal | Max
Power: The full power draw of motors and peripherals is less than 30 Watts for over 2.5 hours. | 14 W | 22 W | 30 W
Control: Less than 10° accuracy for angle inputs, 3 inch radial for position. | ~0° | | 10°
Communication: The system sends data at a minimum distance of 10 m. | 10 m | 15 m | 20 m
Software: The accuracy of the software system is less than 10°. | -10° | | 10°
Hand/Tools: The stepper motor in the hand system rotates in 1.8°/step from the user's input. | | 270 steps |

1.3 Definition of Interfaces

The following describes the interfaces for each subsystem. These are also shown in Table 1.3:
Power Specification
One main input is the 12V, 20Ah battery which connects to the buck converter and then each servo motor in a daisy
chain design. The second input is the 5 V, 4.6Ah battery which connects to the Raspberry Pi 2 and in turn, drives all
of its peripherals and modules.
Control Specification
The input to the control is all code generated on the Raspberry Pi 2. This code produces varying PWM signals, each
independent of one another, to the servo motors of the robotic arm. The output is the movement and positioning of the
arm.
Software Specification
The input to this subsystem is the user movement interpreted through the Kinect. The outputs are the calculated angles
necessary to mimic the same movements and the system messages outputted on the GUI window.
Communication Specification
For the communication network, the input to the subsystem is data packages ranging from visual feedback to basic
motor control inputs. The output of the communication network is a replica of the input on the other end, whether in
the form of a video or a text file.
Hand/Tools Specification
The Myo armband reads the user's gestures and sends the information to the laptop. Every gesture is designed to match one
instruction for rotating the stepper motor. The laptop sends the instruction to the stepper motor and the motor performs
the instruction.


Table 1.3: Responsibility Matrix

Subsystem | Responsible Member | Responsibilities
Power | Kevin Bradshaw | Design and build all power connections through the system.
Control | Kevin Bradshaw | Interpret the angle data packets from the communications subsystem. Have control of each PWM signal sent to the robotic arm.
Software | Yuan Tian | Interpret the user movements and output angle data. Construct a user-friendly GUI.
Communication | Fuhua Song | Construct a communication protocol necessary to integrate the physical system by allowing the flow and replication of data packets.
Hand Motion Recognition | Zhengshuai Zhang | Write code on both the MYO and Raspberry Pi 2 to allow it to read the user's gestures and send instructions to the hand system.

1.4 Task Ownership

Kevin Bradshaw is responsible for both the power and control subsystems. For the power subsystem, this includes the
overall design that ensures each component draws the amount of power needed to run safely under testing conditions.
This also includes building the mount and wiring the physical system together. To test, the main battery will be directly
and continuously monitored for current draw. The second battery will be tested by checking the length of time it can run
under normal conditions while operating the control subsystem. For the control subsystem, this includes the proper connections
for the Raspberry Pi 2, PWM Driver, and each individual servo motor. This also includes coding on the Pi which
enables the proper movement of the arm. Lastly, the position of the arm will be tested by physical observation and
position measurement to match the code's position calculation.
Yuan Tian is responsible for taking the Microsoft Kinect SDK and incorporating it in a GUI designed for this project.
This GUI includes the algorithm necessary for calculating the vectors from the skeletal tracking done by the Kinect,
then finding the specific angles for each joint compared to a defined origin. These angles are then displayed on the
GUI and sent through the Communications subsystem. The GUI is also capable of identifying the interrupts sent by
the hand and tool subsystem, then responding and outputting system messages accordingly.
Fuhua Song is responsible for designing the communication protocol between the various components making up the
physical system. The physical system consists of the mechanical extension, laptop, Myo gesture controller, and the
Kinect. Between these components, different protocols have to be followed for the components to efficiently send
data packages between one another. In order to test the adequacy of the communication protocol between components,
the latency of the data packets being transmitted as well as an estimation of the size of future packages had to be
determined.
Zhengshuai Zhang is responsible for the hand and tool subsystem. The physical structure of the hand needs to be
designed and built. For the control of the hand, the stepper motor is the core. The stepper motor is connected to the
Raspberry Pi 2 through a stepper motor driver. The driver is powered by an external power supply and it draws control
information from two GPIO pins on the Raspberry Pi. His responsibility is to write the code that's programmed on


the Raspberry Pi which allows it to control the stepper motor with a user's input. To control the stepper
motor wirelessly, the Myo armband is used. The Myo reads the gestures and controls the stepper motor through the
communication between the two. Writing this code with the Myo SDK is a big part of this subsystem.
Table 1.4: Responsibility Matrix

Subsystem | Responsible Member | Responsibilities
Power | Kevin Bradshaw | Design and build the entire power requirements for the project.
Control | Kevin Bradshaw | Design and build the control requirements to move the robotic arm accordingly.
Software | Yuan Tian | Design and build the software requirements needed to interpret the data from the user.
Communication | Fuhua Song | Design and build the communication requirements needed to send and receive the proper data packets through the system.
Hand | Zhengshuai Zhang | Build the physical structure of the hand. Control the hand wirelessly through the MYO armband. Set up force sensors and video systems. Design tools for the hand to grab and use.


2 Individual Contributions - Power Subsystem


The power subsystem's responsibility is to fully operate the robotic arm and its peripherals for a minimum of 2.5
hours. This includes powering the robotic arm motors, the Pi, camera module, PWM driver, stepper driver, and any
sensors attached by the other subsystems. Figure 2.1 shows the subsystem block diagram. The left side of the diagram
portrays where the largest portion of power is consumed. The right side shows the necessary circuitry that is driven
by the microcontroller's power.

Figure 2.1 Power Subsystem Block Diagram

2.1 Significant Changes in Direction

Originally, a 12 V, 14 Ah battery was going to be used instead of the 20 Ah battery. This changed because the
calculated minimum current draw turned out to be about half of what was actually being drawn. Since the first power
supply calculation was already oversized to approximately three times the minimum for testing purposes, the first battery could
still have been used, but the larger one was chosen so that testing procedures could last longer when the system is fully
connected. The same 2.5 hour run time requirement still applies.

2.2 Subsystem Status

According to Medscape, a web resource for physicians and healthcare professionals, the average surgical time, which
depends on a variety of different cases, is typically around 2.5 hours. This can vary by about ±1.2 hours. Since
there are two main external power supplies connected to the arm and the MCU, the requirement for the entire
system is set to a maximum of 3 hours to allow a 30 minute allowance over the average run time. This allows for a realistic
power draw, including losses, that can match the average run time of 2.5 hours. The following electrical load analysis
calculates the minimum power supply needed to run both major systems.


According to the total current drain and time requirement, the total power supply should have a capacity above 4.83 Ah
over a range of 4.8 V to 6.0 V. In order to test the entire system under full load for longer periods and to account for
power losses, a battery with approximately three times the calculated capacity will be used.
The minimum current drain and power for each individual motor can be seen in Table 2.1. The specific supply to be
used is listed in Table 2.2 along with its ratings.
The Raspberry Pi 2 requires a minimum 5 V, 1.2 A supply. In order for this module to have at least a 3 hour run
time, the total power supply should have a capacity above 3.6 Ah. Thus a 5 V, 4.6 Ah supply should work exceedingly well
for powering the Raspberry Pi 2. The specific supply to be used is listed in Table 2.2 along with its ratings.
Table 2.1: Calculated Robotic Arm Servo Motors (At No Load) - Minimum Supply

 | Shoulder Motor 1 | Shoulder Motor 2 | Elbow Motor | Wrist Motor 1 | Wrist Motor 2 | Total
Current Drain | 350 mA | 700 mA | 230 mA | 180 mA | 150 mA | 1.61 A
Power | 1.68 W | 3.36 W | 1.104 W | 0.864 W | 0.72 W | 7.73 W

Table 2.2: Proposed Power Supplies

Module | Power Consumption | Proposed System Power Supply
Robotic Arm - Motors | 7.75 W minimum | WKDC12-14NB - Rated at 12 V, 14 Ah
Raspberry Pi 2 | 6 W minimum | PB4600 - Rated at 5 V, 4.6 Ah

Using a 12 V battery for the servo motors, a buck converter will be used to bring the voltage down into the
motors' allowable range. This can be set to approximately 5 V with an adjustable buck converter. This converter can
be directly connected to the motor power supply. Then this is connected to a terminal block with daisy chained power
cables that run through the mount to power the arm. Lastly, the PWM and Stepper Drivers can be powered through the
Raspberry Pi's 3.3 V output pin.
Table 2.3: Power Specification Compliance Matrix

Specification | Min | Nominal | Max
Robotic Arm Power Supply | 6 W | 20 W | 25 W
Raspberry Pi 2 Power Supply | 9 W | 10 W | 11 W
Robotic Arm Run Time | 2 hours | 2.5 hours | 3 hours
Raspberry Pi 2 Run Time | 6 hours | 8 hours | 10 hours


2.3 Subsystem Challenges and Solutions

The first of two main challenges was figuring out how to connect all the motors to one external supply rather than each having
its own supply. The first couple of configurations did not work, but the daisy chain approach ultimately allowed all
the motors to receive power. The second challenge was that the buck converter used did not work properly
and there wasn't enough time to purchase a new one that can handle at least 5 amps.

2.4 Subsystem Technical Details

In order to test the major power requirements necessary for the robotic arm, it was connected to a DC power supply
set to a 5 V input. The current drain for each individual motor at three different positions was observed and
recorded. Each position recorded was the minimum, middle, or maximum position that the servo motors could be
set to that would match the orientation of a human arm. The following images in the test section show each test and the
power supply readings for each position. The results recorded are instantaneous, and it should be made clear that when
the motor is under load and the arm moves, the current draw spikes and varies from this number for a couple of
seconds. The instantaneous currents shown in each image are summarized in Table 2.5.1.

2.5 Subsystem Testing

By disconnecting the daisy chain and only powering one motor at a time at several different positions, an electrical
load analysis can be done for the entire system. All five motors were tested for each position specified in the technical
details.

2.5.1 Electrical Load Analysis

The purpose of this test is to obtain the current draw for all the motors at any given point in time. Since the current
draw varies depending on the position of the arm (because of the load on each motor), the full power draw also varies
at any given time. In order for the system to be able to run for a maximum run time of 2.5 hours, the system can't
draw over 8 A continuously (20 Ah / 2.5 h = 8 A). Since the peak current draw ranges from 3.5 A to 4.5 A at any given time, this requirement
is met. This also leaves a 3.5 to 4.5 A allowable range for all the other modules that may be added to the system
while still meeting the requirement. The battery was chosen much larger than the system currently needs so that further
design change implementation can be done over the following semester.
Test Setup
The DC power supply was set to 5 V for each motor tested in each position. The origin was set to the
relaxed position of the arm hanging completely downward. The minimum, middle, and maximum positions are
defined in the Control Subsystem section. The supply current for each motor was measured and recorded in Table 2.5.1.
Figure 2.5.1 shows the rotation of Shoulder Motor 1, which replicates approximately 180 degrees of left and right
motion at the joint. This is a very small motor compared to the rest and does not need much current.


Figure 2.5.1 Shoulder Motor 1 - Position Power Tests


Figure 2.5.2 shows the rotation of Shoulder Motor 2, which replicates approximately 180 degrees of forward and
backward motion at the joint. This is the largest motor because it carries the largest load and must be able to move the
entire robotic arm precisely with varying amounts of power.

Figure 2.5.2: Shoulder Motor 2 - Position Power Tests


Figure 2.5.3 shows the rotation of Elbow Motor 1, which replicates approximately 150 degrees of upward and
downward motion from the origin at the joint. Currently, this motor is the hardest to replicate a specific position
with because it doesn't seem strong enough when doing the angle tests mentioned in the Control section. This motor may
be replaced with the same size servo as Shoulder Motor 2. The issue with that is there would be more stress from the
shoulder motor's torque toward the elbow motor. This joint has a lot of variation when it comes to current draw
depending on how far it is from the origin defined for the elbow. It also depends on the position of the shoulder joint
because the movement may be in an awkward position.

Figure 2.5.3 Elbow Motor 1 - Position Power Tests


Figure 2.5.4 shows the rotation of Wrist Motor 1, which replicates approximately 180 degrees of forward and
downward motion from the origin at the joint. This motor works great at the moment but may have several stress
points when the hand is fully attached to the arm with all of its sensors next semester.

Figure 2.5.4: Wrist Motor 1 - Position Power Tests


Figure 2.5.5 shows the rotation of Wrist Motor 2, which replicates approximately 180 degrees of left and right motion
from the origin at the joint. This motor has had an issue with a faulty connection. The power and signal wires may
be completely replaced before next semester.

Figure 2.5.5 Wrist Motor 2 - Position Power Tests


Figure 2.5.6 shows the power requirements for the stepper motor, which replicates the hand movements mentioned in
the Hand/Tools Subsystem. This motor is not fully connected mechanically to the arm so the power requirements will
change.

Figure 2.5.6: Stepper Motor - Position Power Tests


Figure 2.5.7 shows the power requirements for all the motors being energized at the same time. This is just one
out of many possible positions, but this example shows one of the maximum current draws for the entire robotic arm.
It ranges from approximately 3.5 A to 4.3 A when fully operated.

Figure 2.5.7 Full Power Test


Data
Table 2.5.1 summarizes the load analysis tests for each motor. The total currents for each position are also shown. This
can be compared to the final test in Figure 2.5.7, where all the motors were energized at one time. Depending on
how much torque is required for an individual motor to stay in a certain position, the current draw can change
dramatically for each total power test. This image does show the largest draw recorded.
Table 2.5.1: Motor Positions - Current Draw

Motor | Minimum Position Current (Amps) | Middle Position Current (Amps) | Maximum Position Current (Amps)
Shoulder Motor 1 | 0.007 | 0.007 | 0.007
Shoulder Motor 2 | 1.041 | 1.617 | 2.070
Elbow Motor 1 | 2.030 | 0.203 | 0.152
Wrist Motor 1 | 1.001 | 0.968 | 1.005
Wrist Motor 2 | 0.326 | 0.373 | 0.016
Stepper | 0.007 | 0.178 | 0.007
Total | 4.412 | 3.346 | 3.257


Conclusion
The original calculations show the minimum supply needed for each motor at no load. It was expected that much more
power would be required so the larger battery was chosen for the system implementation. More power calculations
will need to be done as next semester unfolds with more subsystem requirements.


3 Individual Contributions - Control Subsystem


Currently, the robotic arm can be tested for independent movement and position. Each motor was tested to find
the particular minimum and maximum pulse lengths that control the motors with respect to positions of a human arm.
These ranges are pulses out of 4096 because the specific PWM driver being used has a 12-bit resolution. A
physical origin defined by a PWM signal can also be set in order to calibrate to the user. This origin signal is the
pulse length that needs to be set for the motors to move the robotic arm to a relaxed hanging position matching that
of the user's. The end result for ECEN 404 will be to have approximately ±5° accuracy (1.5 inch radial offset) in
movement according to the user's position. Figure 3.1 shows the flowchart for Mode 1 written in the code. This
code enables the user to control each individual motor's direction with keyboard inputs. Figure 3.2 shows
the flowchart for Mode 2 written in the code. This code enables the user to input particular angles, and the robotic
arm will be set to that position. This mode will continue to be refined to correspond with the Kinect's continuous
angle output data.

Figure 3.1: Mode 1 Flow Chart

Figure 3.2 Mode 2 Flow Chart


3.1 Subsystem Status

This subsystem is responsible for the code necessary to control the robotic arm with movements similar to those of
a human arm. In order to control the arm, it needs the proper drive through a microcontroller with PWM signals. The
current physical connections that make this happen are shown in Figure 3.3. This includes the Raspberry Pi 2, the PWM
Driver, and each servo motor of the robotic arm.

Figure 3.3: Control Subsystem Physical Connections


The software subsystem defines the movements of the user's arm for the robotic arm. The control subsystem
utilizes these definitions and portrays them to each individual motor in order to create multi-directional movement.
Table 3.1 summarizes the movement capabilities for each motor.


Table 3.1: Range of Motion Coordinate Mapping

Motor | Coordinate Mapping | Range of Motion
Shoulder Motor 1 | X-Y | 0° ~ 180°
Shoulder Motor 2 | Y-Z | 0° ~ 150°
Elbow Motor | Y-Z | 0° ~ 150°
Wrist Motor 1 | X-Z | 0° ~ 180°
Wrist Motor 2 | Y-Z | 0° ~ 180°

In order for the robotic arm movement to be closely similar to that of the user's arm, the angle of each motor should
not be more than 10 degrees off from each joint portrayed by the Kinect. This is because an offset in one motor can mean
an offset in the motors connected to it. This offset of 10 degrees corresponds to approximately a 3 inch offset.
The minimum is as close to zero as possible for proper user synchronization. Table 3.2 summarizes these
specifications.
Table 3.2: Control Specification Compliance Matrix

Specification | Min | Nominal | Max
Angle Accuracy | ~0° | | 10°
Measured Offset | ~0 in. | 1.5 in. | 3 in.

3.2 Subsystem Challenges and Solutions

The major challenge was that the servo motors had some chatter when set to any position. This was partly
because the elbow motor seems to require more torque in order to stay in its designated position. The other reason
is that all five PWM signal wires were joined in one harness without EMI protection. This caused some high
frequency noise on each signal. In order to get rid of this, the signals were shunted to ground with small capacitors
(0.1 µF). After this application, there was still noticeable chatter, but not as much as before. Next semester, the
wires will be moved away from each other and the shunts will be moved as close as possible to the servo motors in
order to completely get rid of any high frequency noise. Lastly, next semester, a motor with a higher torque rating
may be used for the elbow joint.

3.3 Subsystem Technical Details

Since the data from the software is three points in 3D space for each joint, two vectors can be defined for each part
of the robotic arm. One vector points from the shoulder to the elbow, and the other from the elbow to the
wrist. With a defined origin, two angles can be defined using Equation 3.1.


θ = arccos[(X · Y) / (||X|| * ||Y||)]    (3.1)

The continuously changing angle from the user is sent to the Raspberry Pi, and this is matched to the
approximate angle to which the robotic arm can be set.
There are two modes of operation designed for this subsystem. The first is keyboard input, which has five if
statements controlling each motor. Each branch is similarly calculated and set. Looking at one block individually, an
if statement checks whether a character is being pressed. If that character is being pressed, execution enters a while loop
so that the user can continuously press that character and continue to rotate the motor. Code 3.1 shows this implementation.
Code 3.1: Keyboard Input Implementation

In the while loop, the pulse variable called Shoulder_1 is incremented by a variable named StepSize that is also
defined in the code. This step size corresponds to how many pulses increase or decrease while the input is being
pressed. The pulse is then passed to a function called getDTY(), which computes the duty cycle of each PWM
signal. Equation 3.2 shows this calculation.

Duty Cycle = (Pulse Length) / 4096    (3.2)

This outcome is later displayed to the user through the terminal window. The same variable that contains
the pulse is then passed to another function called getangle1(), which computes an angle mapped to the pulse
range based on Equation 3.3.

Angle = [(Pulse - Minimum Pulse) / (Maximum Pulse - Minimum Pulse)] * 180°    (3.3)

This angle calculation is then displayed to the user through the terminal window. Lastly, another if statement
checks whether the pulse maximum has been reached. This same procedure is used for each letter. The only
differences that arise are for the letters that correspond to the opposite movement, which decrement the pulse by the
step size in every loop. Table 3.3 shows all the keyboard inputs and the movements they correspond with.


Table 3.3: Corresponding Movements for Keyboard Input

Motor | Direction | Keyboard Input
Shoulder Motor 1 | Left |
Shoulder Motor 1 | Right |
Shoulder Motor 2 | Forward - Up |
Shoulder Motor 2 | Backward - Up |
Elbow Motor 1 | Up |
Elbow Motor 1 | Down |
Wrist Motor 1 | Forward - Up |
Wrist Motor 1 | Backward - Up |
Wrist Motor 2 | Left |
Wrist Motor 2 | Right |

The second mode of operation takes in five angles, one corresponding to each motor. Each angle is then mapped to a
pulse using Equation 3.4, which in turn defines a duty cycle through Equation 3.2.

Pulse = [(Angle / 180°) * (Maximum Pulse - Minimum Pulse)] + Minimum Pulse    (3.4)

This pulse is then directly used in driving the PWM signal for each motor. This requires knowing the minimum and
maximum pulses for each motor that correspond with the range of motion required for that motor. Code 3.2 shows
this implementation.
Code 3.2: Angle Input Implementation
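A minimal sketch of this angle-to-pulse mapping (the per-motor pulse bounds are taken from Table 3.4.4; setPWM() is a hypothetical stand-in for the driver call):

#include <cstdio>

// Measured pulse bounds per motor (Table 3.4.4), ordered:
// Shoulder 1, Shoulder 2, Elbow 1, Wrist 1, Wrist 2.
static const int MinPulse[5] = {415, 695, 565, 425, 365};
static const int MaxPulse[5] = {2110, 1680, 2000, 2180, 1965};

static void setPWM(int channel, int pulse) { (void)channel; (void)pulse; } // stub

// Equation 3.4: map an angle in [0, 180] degrees onto the motor's pulse range.
static int angleToPulse(int motor, double angleDeg) {
    return (int)(angleDeg / 180.0 * (MaxPulse[motor] - MinPulse[motor])) + MinPulse[motor];
}

int main() {
    double angles[5] = {90, 90, 90, 0, 0}; // one user-entered angle per motor
    for (int m = 0; m < 5; m++) {
        int pulse = angleToPulse(m, angles[m]);
        setPWM(m, pulse);
        // Equation 3.2: duty cycle = pulse / 4096
        printf("Motor %d: pulse %d, duty %.2f%%\n", m, pulse, pulse / 4096.0 * 100.0);
    }
    return 0;
}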


These formulas are used for all five motors on the robotic arm. The most difficult part to match correctly is the
speed of the user's arm to the speed of the robotic arm. This is done by first testing the speed variations in the
changes of the continuous data. Then, a delay variable can be defined to add a very small delay (on the order of a
hundredth of a second) between each change of the PWM signals. With tested data for position, movement, and
speed, a comparison can then be made between the robotic arm and the user's movement. With this comparison,
clear offsets of these variables can be observed and modified in order to provide a proper connection when these
subsystems start to work together.
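A minimal sketch of that pacing idea, again assuming a hypothetical setPWM() driver call:

#include <chrono>
#include <cstdlib>
#include <thread>

static void setPWM(int channel, int pulse) { (void)channel; (void)pulse; } // stub

// Step a motor toward a target pulse, pausing ~10 ms between increments so
// the arm's speed can be tuned to approximate the user's.
void moveTo(int channel, int current, int target, int stepSize = 5) {
    while (current != target) {
        int diff = target - current;
        int step = (std::abs(diff) < stepSize) ? diff                      // final partial step
                                               : (diff > 0 ? stepSize : -stepSize);
        current += step;
        setPWM(channel, current);
        std::this_thread::sleep_for(std::chrono::milliseconds(10));       // the delay variable
    }
}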

3.4 Subsystem Testing

The first mode of operation for the code was tested by measuring the pulses for each physical angle range of the
robotic arm. The code was run, each motor was moved to its defined minimum, middle, and maximum positions,
and the pulse measurements were recorded.

3.4.1 Mode 1 Operation Testing - Keyboard Movement

The purpose of this test is to clearly define the ranges of pulses within which the robotic arm can move such that it
matches the movement of a human arm. This test passes if each pulse range can be clearly defined within the resolution of
the PWM driver (12 bits, 4096 pulses). This test was done three different times in order to get the most precise pulse
ranges to clearly define what the range of motion for the robotic arm should be. Each position can also be physically
seen in the Power Subsystem testing section.
Test Setup
The power supply was set to 5 V and connected to only the motors. The corresponding decrementing keyboard input was
pressed until the motor moved to the minimum position that was measured to be the angle range minimum. The same
was done for the incrementing keyboard input to the measured angle range maximum. The pulse minimum was
subtracted from the maximum, that value was divided by two, and the result was added back to the minimum pulse;
this was set to be the middle pulse (e.g., for Shoulder Motor 1 in Table 3.4.1, (640 - 120)/2 + 120 = 380) and then
compared visually to the middle angle for each position. Lastly, the step size for each was set to 5 pulses, and the
angles for the first three tests were approximated in the interest of time.
Data
This test was done several times because the pulse ranges change depending on the power input and the load on the
power supply, since it was configured in a daisy chain fashion. The results from the first test are shown in Table
3.4.1. In this test, each motor was run independently with none of the wiring configurations mentioned in the
block diagrams. There was only one direct connection at a time for each motor, which included the power, signal,
and ground. These wires were also as close to each motor as possible without being in the same harness.


Table 3.4.1: Motor Positions - Pulse Test 1

Motor | Minimum Position (Out of 4096) | Middle Position (Out of 4096) | Maximum Position (Out of 4096)
Shoulder Motor 1 | 120 | 380 | 640
Shoulder Motor 2 | 250 | 385 | 550
Elbow Motor 1 | 170 | 400 | 500
Wrist Motor 1 | 150 | 400 | 650
Wrist Motor 2 | 130 | 390 | 650

The second test showed interestingly different data compared to the first. What changed was adding all the motors as
one load on a terminal block and then powering them through a daisy chain connection. This resulted in the pulses in
Table 3.4.2.
Table 3.4.2: Motor Positions - Pulse Test 2

Motor | Minimum Position (Out of 4096) | Middle Position (Out of 4096) | Maximum Position (Out of 4096)
Shoulder Motor 1 | 65 | 180 | 305
Shoulder Motor 2 | 110 | 190 | 275
Elbow Motor 1 | 80 | 170 | 260
Wrist Motor 1 | 70 | 205 | 320
Wrist Motor 2 | 50 | 185 | 320

The third test showed an even larger change. What changed was tying together all the PWM signals and wiring them
as one harness through the mount. The results are shown in Table 3.4.3. This means that there is some sort of high
frequency disturbance when tying the signals together without EMI protection. This also produced the servo chatter
mentioned before.
Table 3.4.3: Motor Positions - Pulse Test 3

Motor | Minimum Position (Out of 4096) | Middle Position (Out of 4096) | Maximum Position (Out of 4096)
Shoulder Motor 1 | 550 | 1230 | 2020
Shoulder Motor 2 | 620 | 1200 | 1780
Elbow Motor 1 | 570 | 1135 | 1700
Wrist Motor 1 | 450 | 1230 | 2200
Wrist Motor 2 | 360 | 1160 | 1960


The same configuration was kept and the third test was redone with a step size of 1 and the angles measured as
close to their ranges as possible. This produced some variation in a few of the positions, but for the most part the values
are close to each other, since the ranges from the third test correspond to approximately 10 pulses per degree of movement.
Table 3.4.4: Motor Positions - Refined Test 3

Motor | Minimum Position (Out of 4096) | Middle Position (Out of 4096) | Maximum Position (Out of 4096)
Shoulder Motor 1 | 415 | 1263 | 2110
Shoulder Motor 2 | 695 | 1188 | 1680
Elbow Motor 1 | 565 | 1283 | 2000
Wrist Motor 1 | 425 | 1303 | 2180
Wrist Motor 2 | 365 | 1165 | 1965

Conclusion
Using Equation 3.2 and the pulses recorded in Table 3.4.4, the duty cycles for each position were calculated;
they are shown in Table 3.4.5. The pulses and duty cycles obtained show that the tests pass for being able to find
definite ranges for each in order to map to physical ranges of motion. The high frequency noise observed through the servo
chatter was very unexpected, but there are physical ways to fix this while still keeping the same configuration for the
loads. The PWM signals themselves may be separated if the servo chatter still does not seem to be reduced (keeping
in mind that it is also partially affected by motor torque requirements).
Table 3.4.5: Motor Positions - Refined Test 3 Duty Cycles

Motor | Minimum Position Duty Cycle | Middle Position Duty Cycle | Maximum Position Duty Cycle
Shoulder Motor 1 | 10.13 % | 30.83 % | 51.51 %
Shoulder Motor 2 | 16.96 % | 29.00 % | 41.01 %
Elbow Motor 1 | 13.79 % | 31.32 % | 48.82 %
Wrist Motor 1 | 10.37 % | 31.81 % | 53.22 %
Wrist Motor 2 | 8.91 % | 28.44 % | 47.97 %

3.4.2 Mode 2 Operation Testing - Angle Inputs

The purpose of this test is to see how accurate the positioning of the robotic arm is when the user inputs the angles for
each motor directly. The test passes if each motor can accurately be placed in the position set by the user. This position
is entirely determined by the angles defined relative to the origin.


Test Setup
Each motor was powered through the terminal block together. Six random user positions were tested. Each position
consisted of five different angles. Each physical angle was measured from the respective origin of the motor and
approximated.
Data
Table 3.4.6 shows the corresponding angles tested and the physical angles measured for the first test. Figure 3.4.1
shows the position of the robotic arm for those inputs. This position corresponds to the origin position defined in the
initialization of the code.
Table 3.4.6: Angle Inputs - Test 1

Motor | Angle Set | Angle Measured
Shoulder Motor 1 | 90° | 95°
Shoulder Motor 2 | |
Elbow Motor 1 | |
Wrist Motor 1 | |
Wrist Motor 2 | |

Figure 3.4.1 Angle Test 1

Table 3.4.7 shows the corresponding angles tested and the physical angles measured for the second test. Figure 3.4.2
shows the position of the robotic arm for those inputs. This position corresponds to having the elbow pointed directly
out, parallel with the ground and the forearm being perpendicular to the ground.
Table 3.4.7: Angle Inputs - Test 2

Motor | Angle Set | Angle Measured
Shoulder Motor 1 | 90° | 95°
Shoulder Motor 2 | 90° | 93°
Elbow Motor 1 | 90° | 95°
Wrist Motor 1 | |
Wrist Motor 2 | |

Figure 3.4.2 Angle Test 2

Table 3.4.8 shows the corresponding angles tested and the physical angles measured for the third test. Figure 3.4.3
shows the position of the robotic arm for those inputs. This position corresponds to having the upper arm extended
out with the forearm parallel to the ground. This was the first test where the torque limitations were noticed, because
the elbow motor couldn't supply continuous torque to move to that position without a little help from the user moving
it there. After it was moved there, it would slowly drift away from the desired position. The angle input could also
be increased to set it to the proper position (which means that higher torque is being supplied), but this would throw
the angle measurements off.
Table 3.4.8: Angle Inputs - Test 3

Motor | Angle Set | Angle Measured
Shoulder Motor 1 | 90° | 95°
Shoulder Motor 2 | 45° | 35°
Elbow Motor 1 | 45° | 25°
Wrist Motor 1 | 22.5° | 20°
Wrist Motor 2 | 90° | 90°

Figure 3.4.3 Angle Test 3

Table 3.4.9 shows the corresponding angles tested and the physical angles measured for the fourth test. Figure 3.4.4
shows the position of the robotic arm for those inputs. This position corresponds to having the arm sticking out as if
it were flexing, but forward. Compared to the last test, it was expected not to be as accurate, but the results were much
closer to the values entered into the terminal window.


Table 3.4.9: Angle Inputs - Test 4

Motor | Angle Set | Angle Measured
Shoulder Motor 1 | 90° | 95°
Shoulder Motor 2 | 45° | 30°
Elbow Motor 1 | 22.5° | 15°
Wrist Motor 1 | 90° | 92°
Wrist Motor 2 | 45° | 43°

Figure 3.4.4 Angle Test 4

Table 3.4.10 shows the corresponding angles tested and the physical angles measured for the fifth test. Figure 3.4.5
shows the position of the robotic arm for those inputs. This position corresponds to having the elbow coming inwards
while lifting the forearm close to parallel with the ground, but not exactly.


Table 3.4.10: Angle Inputs - Test 5

Motor | Angle Set | Angle Measured
Shoulder Motor 1 | 45° | 35°
Shoulder Motor 2 | -45° | -37°
Elbow Motor 1 | 90° | 93°
Wrist Motor 1 | 45° | 40°
Wrist Motor 2 | 90° | 92°

Figure 3.4.5 Angle Test 5


Table 3.4.11 shows the corresponding angles tested and the physical angles measured for the sixth test. Figure 3.4.6
shows the position of the robotic arm for those inputs. This position corresponds to having the arm sticking out as if
it were flexing but slanted outwards instead of forward.


Table 3.4.11: Angle Inputs - Test 6

Motor | Angle Set | Angle Measured
Shoulder Motor 1 | 90° | 95°
Shoulder Motor 2 | 45° | 40°
Elbow Motor 1 | 115° | 110°
Wrist Motor 1 | |
Wrist Motor 2 | |

Figure 3.4.6 Angle Test 6


Conclusion
Overall the subsystem passed, but there were several positions where the motors were too inaccurate, especially for
Test 3. This shows that further calibration needs to be done for the motors when mapping the pulse ranges to the specific
ranges of motion. This calibration must also take into account the torque under heavy load conditions. This also showed
us that the elbow joint needs to be modified with a motor that has a larger torque rating but is not necessarily any heavier.
Another modification that could be done is to reduce the weight of the arm by replacing some of the metal brackets with
3D printed brackets. Lastly, the torque characteristics for each motor were a bit unexpected, which showed that the duty
cycle mapping to angles isn't exactly a one-to-one relationship.


4 Individual Contributions - Software Subsystem


The software subsystem in the project mainly serves to analyze and display data. The subsystem includes user
interface design, data analysis methods, and multi-subsystem cooperation. The entire software subsystem is
based on the C++ language, using Microsoft Visual Studio as the development platform. An FLTK window is adopted
for the user interface. The user can start and terminate the entire system through the buttons on the window. Also,
the system status and the sampling data of tracked user arm movements are displayed on the window with minimal
latency. The sampling data obtained from the Kinect is further processed to obtain the data (duty cycle) that can be
sent to the Raspberry Pi 2 to define the PWM signals used to drive the servo motors. As for hand tool
control, interrupts can be sent to the software subsystem by performing specific gestures on the hand controller.
After the interrupts are recognized by the software system, the control system exits the mimicking stage and enters
the subroutines that drive the robotic arm to perform automated movements to select and replace the hand tools.
The control system resumes the mimicking stage after the subroutine finishes. The PAU06 Wi-Fi adaptor is used
to perform wireless data transmission in the processes mentioned above. The code for the PAU06 has to be
integrated into the software subsystem. In addition, the software subsystem plays the most important role in defining the
interactions and cooperation of the other subsystems. The control, communication, and power subsystems all
rely on the software to perform tasks and cooperate correctly.

4.1 Significant Changes in Direction

The movements of the robotic arm are defined by the three servo motors. Currently, the analyzed data output from
the software subsystem (Kinect) meets the requirements to map most human arm movements to the robotic arm.
However, the human arm has more degrees of freedom, which results in ambiguity when mapping some special human
arm movements whose degrees of freedom exceed those of the robotic arm. After some research, a mapping
method involving the use of a T-matrix was recommended by field experts to improve the mapping accuracy. However,
adopting this method requires reworking all the mapping algorithms, not to mention the time required for learning and
understanding the concepts of the method. This method is under evaluation due to the risk of progress delay, and the
decision will be made over the summer break.
For now, boundaries and constraints are added to prevent the robotic arm from mimicking human arm movements
whose degrees of freedom are out of range.

4.2 Subsystem Status

As mentioned in the previous sections, the software subsystem mainly consists of user interface design and
skeletal data analysis.
In terms of interface design, the developed FLTK window has to be able to clearly and correctly display the
system status and the system data. As illustrated in the figure below, the window is designed to be as simple as possible,
containing only the essential information that the user needs to know. More system messages and function buttons will
be added as the project development proceeds.


Figure 4.2.1. Graphic User Interface


As for skeletal data analysis, the goal of this process is to output accurate angle representations of the user's arm
positions. The requirement for tracking accuracy in the early development stages was set to be less than 10°, as
indicated in Table 4.1. To test the accuracy of the tracking system, we have to measure the actual human arm angles and
compare them to what is output by the tracking system. The accuracy requirement is met if the difference between the
two is within the error tolerance range.
Table 4.1: Software System Specification Compliance Matrix

Specification | Min | Nominal | Max
Difference Between the Tracked Angle and the Actual Angle | -10° | 0° | 10°

4.3 Subsystem Challenges and Solutions
The major challenges encountered for the software subsystem are mainly focused in the process of designing a correct
mapping algorithm. The sampling accuracy of the Kinect does not satisfy the requirements for tracking all positions
of the user's arm movements, especially in the boundary areas in which the mathematical equations used are close to


undefined. In addition, the tracked angle of servo motor 2 output by the Kinect does not have the correct range; for
example, the correct range is 0° ~ 180° while the Kinect outputs a range of 30° ~ 180°.
For the boundary problems, a buffer area is added to replace the boundary area with ambiguous definitions. In the
buffer area, the sampling routines execute a different mapping algorithm which gives a clear definition in the boundary
area; thus the ambiguity is eliminated. In terms of the range problem of servo motor 2, we noticed that the tracked
angle in the error range is still in a linear relationship with the change of the user's arm positions. Thus a linear map is
used to match the error range to the correct range, and the desired tracking accuracy is obtained.
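A minimal sketch of that remapping, assuming the 30° ~ 180° raw range quoted above:

// Linearly remap the Kinect's raw servo-2 angle (observed 30..180 degrees)
// onto the correct 0..180 degree range.
double remapServo2(double rawDeg) {
    const double rawMin = 30.0, rawMax = 180.0; // observed Kinect output range
    return (rawDeg - rawMin) / (rawMax - rawMin) * 180.0;
}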

4.4 Subsystem Technical Details

As mentioned in previous sections, the robotic arm movements are defined by the simultaneous rotation movements
of the three servo motors, which are labeled in Figure 4.4.1 below with their specific rotation movements. The
rotation movements of servo motors 2 and 3 are in the Y-Z plane, and the rotation movements of servo
motor 1 are in the X-Y plane.

Figure 4.4.1. Robotic Arm Servo Motors and Movements Definition


The software subsystem (Kinect) is responsible for outputting the angle representations for the rotation movements of
the three servo motors indicated in Figure 4.4.1. Referring back to Figure 4.2.1, the numerical theta values
highlighted with red boxes can be directly sent to the control subsystem to define the duty cycles for each motor.
The detailed strategies and methods used to map real human arm movements to the robotic arm movements are
explained as follows:
(1) Servo Motor 1 (Movement 1)
Servo motor 1 rotates to change the direction of the robotic arm vectors in the X-Y plane, and the defined range for the
direction vector in the X-Y plane is 0° ~ 180°. Thus a 2-D direction vector is used to output the numerical values of theta
1 indicated in Figure 4.2.1. The 2-D vector is defined by the difference between the wrist point X-Y coordinates and the


elbow point X-Y coordinates. After the direction vector is defined, the angle can be found using the formula shown
below, where 57.2 ≈ 180/π converts radians to degrees:

θ1 = 57.2 * arctan(Δy / Δx)

In the case that the angle is over 90°, the formula shown below is used instead:

θ1 = 180° - 57.2 * arctan(Δy / Δx)
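A sketch of this computation, folding the two cases above into a single atan2 call (the coordinate arguments stand in for the Kinect joint data):

#include <cmath>

// Theta 1: direction of the elbow->wrist vector in the X-Y plane, in degrees,
// folded into the 0..180 degree range used by servo motor 1.
double theta1(double wristX, double wristY, double elbowX, double elbowY) {
    double dx = wristX - elbowX;
    double dy = wristY - elbowY;
    double deg = std::atan2(dy, dx) * 57.2958; // 180/pi; atan2 covers all quadrants
    if (deg < 0) deg += 180.0;                 // fold into the servo's 0..180 range
    return deg;
}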

(2) Servo Motor 2 (Movement 2)

Fig 4.4.2. Servo Motor 2 Mapping


It's tricky to map the user's arm movements to servo motor 2 (movement 2) in an easy fashion. In general, when
the elbow point (indicated in Figure 4.4.2 above as B1 and B2) rises, servo motor 2 has to respond
accordingly by rotating clockwise or counterclockwise. However, rotating in either direction can result in an
increase in the Y coordinate value of the elbow point. To differentiate those two scenarios, another measurement has to
be taken into consideration. As shown in the figure, the user stretching the elbow forward results in a negative
difference in the depth of the elbow point (Z coordinate value), while stretching the elbow backward results in a positive
difference. Based on these two cases, a clear mapping method is generated to synchronize the robotic arm movements
with the user's arm movements for servo motor 2.
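A minimal sketch of that disambiguation, assuming consecutive elbow-point samples from the Kinect (which rotation direction corresponds to each case is an assumption here):

// Decide servo 2's rotation direction from the elbow point's depth change.
// A rising elbow (dY > 0) with decreasing depth (dZ < 0) is treated as a
// forward stretch; increasing depth as a backward stretch.
enum class Servo2Dir { Clockwise, CounterClockwise, Hold };

Servo2Dir servo2Direction(double prevY, double prevZ, double currY, double currZ) {
    double dY = currY - prevY;
    double dZ = currZ - prevZ;
    if (dY <= 0) return Servo2Dir::Hold;           // elbow not rising
    return (dZ < 0) ? Servo2Dir::Clockwise         // forward stretch (assumed mapping)
                    : Servo2Dir::CounterClockwise; // backward stretch
}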

(3) Servo Motor 3 (Movement 3)


Servo motor 3 (movement 3) is relatively easy to define; the theta 3 indicated in Figure 4.2.1 is used to track the
relative position of the upper and lower limbs of the user's arm. This is achieved by calculating the angle between
two vectors, where the first vector is defined by the shoulder point and the elbow point, and the other is defined
by the elbow point and wrist point.


(4) Code Design:

The most difficult part of the interface design is creating an infinite loop that can support the continuous sampling
process of the Kinect. The default FLTK window library doesn't support such a feature, so a new window class had
to be designed from scratch to make sure the window is compatible with the sampling code segments of the
Kinect. The overall code structure is shown below.

Note: Code explanations are shown in the commentary section. The rest of the code detail of each function is
omitted due to its length. The overall code length is more than 800 lines.
class MyTimer : public Fl_Box {                                    // (1)*
    void draw() { Fl_Box::draw(); }                                // (2)*

    static void Timer_CB(void *userdata) {                         // (3)*
        MyTimer *o = (MyTimer *)userdata;
        NuiSkeletonGetNextFrame(0, &ourframe);                     // (4)*
        for (int i = 0; i < 6; i++) {
            if (ourframe.SkeletonData[i].eTrackingState == NUI_SKELETON_TRACKED) { // (5)*
            }
        }
        o->redraw();                                               // (6)*
        Fl::repeat_timeout(0.05, Timer_CB, userdata);              // (7)*
    }

    static void start_cb(Fl_Widget *o, void *userdata);            // (8)*
    static void End_cb(Fl_Widget *o, void *userdata);              // (9)*

public:                                                            // (10)*
    MyTimer(int X, int Y, int W, int H, const char *L = 0)
        : Fl_Box(X, Y, W, H, L) {                                  // (11)*
        Fl::add_timeout(0.05, Timer_CB, (void *)this);             // start the sampling timer
    }
};

int main() {                                                       // (12)*
    Graph_lib::Window win(Point(200, 100), 800, 600, "");
    win.begin();
    MyTimer tim(100, 100, 600, 400);
    NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON);              // (13)*
    win.end();
    win.show();
    while (win.shown()) Fl::wait();
}

(5) Commentary Section:

(1)* Class declaration.
(2)* Tell the base widget to draw its background using Fl_Box::draw().
(3)* Callback function for the Kinect sampling process.
(4)* Create a place in memory to hold the data the Kinect sends to us.
(5)* Check if there's data on the location of a person in this slot. The Kinect can track 6 people at the same time.
(6)* Redraw the entire window.
(7)* Re-enter the sampling callback function every 0.05 seconds.
(8)* Callback function for the Start button.
(9)* Callback function for the End button.
(10)* The public section mainly serves to initialize and store variables.
(11)* Constructor for drawing the window; it also starts the sampling timer.
(12)* Main function for initializing the display window and passing control to the window.
(13)* Initialize and power up the Kinect.

4.5 Subsystem Testing

The tests of the software subsystem include the graphic user interface tests and the sampling algorithm tests. The user
interface tests mainly include the system message display test and the display logic test. The sampling algorithm
tests include the sampling accuracy test and the calibration process test.

4.5.1 Graphic User Interface

The graphic user interface test ensures that the FLTK window outputs correct system messages. It also makes sure the
instructions given to the user are clear, so that the user can go through the entire sampling process without confusion.
Test Setup
Before the test starts, the Kinect has to be powered up and connected to the laptop. The laptop needs to execute the file
containing the sampling process code. Moreover, the user (limited to one person at a time) has to stand in front of
the Kinect within 1 m ~ 5 m in distance.
Data

1. The initial system message is displayed before any user action.

Figure 4.5.1.1. Initial Stage System Message


2. After the user presses start, the Kinect checks the user's distance and gives the user instructions to guide them
into the suggested sampling range with the best accuracy. (If the user leaves the range, the system re-enters this
step regardless of which step it is currently on.)

Figure 4.5.1.2. Range Check Stage System Message


3. After the user enters the suggested sampling range, the calibration process starts. The information obtained from the
individual is saved for later use.

Figure 4.5.1.3. Calibration Stage System Message

4. After the calibration process, the system enters the mimicking stage, in which the continuous sampling data is
output on the window and also sent to the control system.

Figure 4.5.1.4. Mimicking Stage System Message


Conclusion
Currently, the GUI system meets the preliminary requirements. It can assist the project development with feedback
and also provides clear instructions for the user to conduct the sampling process. However, as development
proceeds, more functions and system feedback displays will be added to the FLTK window due to the increasing
complexity of the project.

4.5.2 <Sampling Algorithm>

The sampling algorithm test is conducted to check whether the differences between the angle representations output
by the software subsystem and the actual angles of the human arm are within the error tolerance.
Test Setup
The test setup is the same as the setup described in Section 4.5.1.
Data
In this section, a specific human arm position is sampled by the software subsystem, and the results are shown below:

(1) Servo Motor 1 (Movement 1)

Figure 4.5.2.1. Output Angle and Actual Angle Comparison (Theta 1)


(2) Servo Motor 2 (Movement 2)

Figure 4.5.2.2. Output Angle and Actual Angle Comparison (Theta 2)


(3) Servo Motor 3 (Movement 3)

Figure 4.5.2.3. Output Angle and Actual Angle Comparison (Theta 3)


Conclusion
It can be observed in the figures above that the differences between the output angles and the actual angles are less
than 10°. As a result, the accuracy of the sampling algorithm meets the preliminary requirement of our project.


5 Individual Contribution - Hand/Tool System


Overview
The Hand and Tool System consists of a mechanical hand structure with a stepper motor in the middle, which controls
the movement of three fingers. The stepper motor is driven by the PWM signals produced by the driver board and the
Raspberry Pi 2. When the stepper motor rotates clockwise, the fingers perform the motion of grabbing a specific tool;
otherwise, the fingers perform the motion of releasing the tool to its original position. To control the hand to grab or
release tools, the Myo armband reads the user's specific gesture and sends the control information to the Raspberry Pi.
The Myo triggers specific code to activate and perform a particular motion to replace the current tool with the selected tool.
The stepper motor is connected to a stepper motor driver, which is connected to the Raspberry Pi 2. The driver is
connected to 2 GPIO pins on the Raspberry Pi 2: one controls the direction of the motor's rotation and the other
controls the steps of the rotation. The input power of the driver comes from an external power supply.
Stepper Motor Control Flowchart

5.1 Significant Changes in Direction

The previous requirement for the hand system was to be able to perform medical surgery by grabbing and using surgical
tools. However, that relies on very accurate movement of the hand and fingers, so the requirement is now for the
hand system to be able to grab and use several small, purpose-designed tools.


5.2 Subsystem Status

In the hand system, the stepper motor controls the whole system. The rotation direction and rotation degree of the
stepper motor determine the status of the fingers. When the stepper motor rotates clockwise, the fingers are in the
process of closing; otherwise, the fingers are in the process of opening. To close the fingers, the rotation should be
in the range of 0°~500°; at 500° the fingers are totally closed. The stepper motor rotates 1.8° per step, so the input
step count has a range of 0~278 steps. To test whether the stepper motor can rotate over its designed range, it is set
to rotate to half-closed positions (50 steps, 100 steps, 150 steps) and to the fully closed position (270~278 steps).
In the communication between the Myo and the stepper motor, there is a latency that is specified in the
Communications Subsystem.
Table 5.1
Hand/Tool System Specification Compliance Matrix

Specification                                          Min    Nominal    Max
Rotation degree of the stepper motor (0°~500°)         0°     250°       500°
Latency between the stepper motor and the Myo          (see Section 6.2)
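For reference, the degree-to-step conversion implied by the 1.8°/step motor can be written as a one-line helper (a sketch only; the project code passes step counts directly):

def degrees_to_steps(degrees, step_angle=1.8):
    # e.g., 500 degrees / 1.8 degrees per step ~= 278 steps (fully closed)
    return int(round(degrees / step_angle))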

5.3 Subsystem Challenges and Solutions

The major challenge of the hand system was the physical structure. There might be too much friction between the
joints of the fingers; too much friction causes a problem with the flexibility of the fingers, and they might not move
as expected. To resolve this challenge, we will keep refining the physical structure or change to smoother materials
for the joints.
The second challenge of this subsystem is the protection of the stepper motor. The input voltage of the stepper motor
is 12 V maximum, and the input current of each phase should be limited to below 0.35 A. If either the input voltage or
the current exceeds these limits, the stepper motor will be burnt out. To protect the stepper motor, we can use a voltage
regulator and a current regulator to keep the input voltage and current under the limits.

5.4 Subsystem Technical Details

The main focus of the hand system is on controlling the stepper motor wirelessly with the Myo armband. The stepper
motor is wired to the motor driver, and the driver is wired to the Raspberry Pi. As stated in the Communications
Subsystem, the Raspberry Pi 2 communicates with the laptop through Wi-Fi and the Myo armband communicates with
the laptop through Bluetooth. Since both the Raspberry Pi 2 and the Myo communicate with the laptop, communication
between the Raspberry Pi 2 and the Myo is realized through it. Controlling the motor wirelessly requires programming
on both the server side and the client side.

For the server side, the code is written in Python. GPIO pins 23 and 24 on the Raspberry Pi are used as outputs
(board pins 16 and 18 under the GPIO.BOARD numbering used in the code): pin 23 outputs the step information and
pin 24 outputs the direction instruction. Only the RPi library is imported, as shown in Code 5.1.
Code 5.1 Imports of Python library on server side
import RPi.GPIO as GPIO, time
GPIO.setmode(GPIO.BOARD)   # board pin numbering (pin 16 = GPIO23, pin 18 = GPIO24)
GPIO.setup(16, GPIO.OUT)   # step output
GPIO.setup(18, GPIO.OUT)   # direction output
p = GPIO.PWM(16, 500)      # 500 Hz PWM on the step pin

For the control part, the function SpinMotor() is used to read the direction and step inputs. In the function, a while
loop makes the motor rotate for the given number of steps; the motor pauses for 0.01 seconds between every two
steps.
Code 5.2 Control stepper motor on server side
def SpinMotor(direction, num_steps):
    GPIO.output(18, direction)   # set rotation direction
    while num_steps > 0:
        p.start(1)               # pulse the step pin
        time.sleep(0.01)         # 0.01 s pause between steps
        num_steps -= 1
    p.stop()
    return True

direction_input = raw_input("Please enter O or C for Open or Close:")
num_steps = input("Please enter the number of steps:")
if direction_input == "C":
    SpinMotor(False, num_steps)
else:
    SpinMotor(True, num_steps)

Then the function getchar() is used to get input from the laptop's keyboard. When the user presses 'a' or 'A' on the
keyboard, the stepper motor rotates 270 steps clockwise. When the user presses 'd' or 'D' on the keyboard, the
stepper motor rotates 270 steps counterclockwise.
Code 5.3 Get keyboard input
def getchar():
    # Return a single character from standard input
    import tty, termios, sys
    fd = sys.stdin.fileno()
    old_settings = termios.tcgetattr(fd)
    try:
        tty.setraw(sys.stdin.fileno())
        ch = sys.stdin.read(1)   # get input from keyboard
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)
    return ch

print 'press q to exit'
while 1:
    ch = getchar()
    if ch == 'a' or ch == 'A':
        SpinMotor(False, 270)
    elif ch == 'd' or ch == 'D':
        SpinMotor(True, 270)
    elif ch == 'q' or ch == 'Q':
        break
    else:
        print 'you pressed', ch

After realizing control of the stepper motor through keyboard input, the next step is to control the keyboard from the
client side (the Myo armband). The code is written in Lua and run on the Myo software development kit (SDK).
Code 5.4 Keyboard control from Myo
scriptTitle = "MotorControl"
scriptDetailsUrl = ""
function onPoseEdge(pose, edge)
    myo.debug("onPoseEdge:" .. pose .. "," .. edge)
    if pose == "fist" and edge == "on" then
        myo.debug("FIST")
        myo.keyboard("a", "down")
    end
    if pose == "fingerSpread" and edge == "on" then
        myo.debug("FINGERSPREAD")
        myo.keyboard("d", "down")
    end
end
function onForegroundWindowChange(app, title)
    wantActive = false
    myo.debug("onForegroundWindowChange:" .. app .. "," .. title)
    if app == "putty.exe" and title == "pi@raspberrypi: -" then
        myo.debug("is putty")
        wantActive = true
    end
    return wantActive
end
function activeAppName()
    return "Output Everything"
end
function onActiveChange(isActive)
    myo.debug("onActiveChange")
end

5.5 Subsystem Testing

Generally, the tests of this subsystem check whether the Myo can read the user's gesture and whether the motor
rotates properly in both direction and degree.

5.5.1 Gesture Reading

The purpose of this test is to check whether the Myo armband can read the user's gestures and send the information
correctly to the laptop. The debug console in the Myo SDK software shows the data of the user's gestures when the
user performs the gestures designed for this system.
Test Setup
Connect the USB Bluetooth adaptor to the laptop and open the debug console in the developer mode of the Myo
software. Check that it shows the Myo armband is connected.
Data
The debug console shows the data being read from the Myo armband. When the user makes a fist, the console
outputs "fist"; when the user spreads their fingers, the console outputs "fingers spread".
Conclusion
When the debug console outputs the data as shown in the Data section, the test passes. If the console doesn't output
anything, the test fails; the user should check that the Myo armband is connected to the laptop and that the proper
designated gestures are being used.


5.5.2 Stepper Motor Rotation

Test Setup
Connect the stepper motor as shown in the picture below:

After setup and proper connection, the driver's light should be on. Open putty on the laptop with the Myo already set up.
Data
When the user makes a fist, the motor rotates clockwise through 500°. When the user spreads their fingers, the
motor rotates counterclockwise through 500°.
Conclusion
If the stepper motor doesn't rotate after the user performs a gesture, the test fails; check the connection and make sure
the user is performing the proper designated gesture. If the stepper motor doesn't rotate through the correct degrees,
check whether the stepper motor is burnt out.


6 Individual Contribution - Communications


Communication is fundamental to the integration of all the different subsystems. The primary objective of this
subsystem is to allow the sending of data packages between the Raspberry Pi and the computer, between the Kinect
and the computer, and between the Myo gesture controller and the computer. Below is a general block diagram of the
communication network interface of the physical system.

Figure 6.1.1


Table 6.1.1 Inputs

Pointer   Communication         Data Type
1         USB 3.0               Skeletal Tracking
2         Bluetooth             Motor Commands
3         Bluetooth             Haptic Feedback
4         WIFI (IEEE 802.11)    Control
5         WIFI (IEEE 802.11)    Visual, Haptic
6         GPIO                  Motor Control
7         GPIO                  Motor Control
8         Mount                 Visual Feedback
9         GPIO                  Motor Control
10        GPIO                  Motor Control
11        GPIO                  Haptic Feedback
12        GPIO                  Visual Feedback

The table above corresponds to Figure 6.1.1; it illustrates the type of communication between the varying components.

6.1 Significant Changes in Direction

Many changes have occurred in the communication between the various components of this project. Socket
communication is now employed for sending packages between the Raspberry Pi and the laptop. Originally, a 3rd-party
DNS server was intended to be used for package sending. However, due to the NAT IPv4 network design, it was not
possible (or at least very difficult) to send or transmit anything through the Pi over a third-party network such as
No-IP.com or dynamicDNS.com, because the IP address supplied to these sites is the public IP rather than the private
IP (the private IP is what's needed to remotely access the Pi). Socket communication is now applied for transferring
all data packages between the Wi-Fi modules on the laptop and the Raspberry Pi.
Another change in direction is the abandonment of headless access (for now). As with the problem of setting up the
3rd-party DNS communication, the problem lies in the inability to access the IPv4 address and the fact that the IP is
dynamic. Originally, a static IP was assigned to the Pi; however, this failed because the WPA2 Enterprise router
rejected the assigned IP while simultaneously trying to provide the Pi with a dynamic IP, and the Pi ended up failing
to connect to the network. The new requirement is simply to connect a monitor to the Raspberry Pi 2, observe the IP
manually, and store this IP in putty to be used for several sessions.


Unless a method is determined to get access to the IPv4 address, the Pi will first have to be connected to a monitor
in order to remotely access it.
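One standard technique that might eventually remove the monitor requirement (a hedged sketch, not something the team has adopted): a UDP "connect" on the Pi reveals which private IPv4 address the OS would use for outbound traffic, which the Pi could then display or log at boot.

import socket

def local_ip():
    # Connecting a UDP socket sends no packets, but it makes the OS pick
    # the outgoing interface, whose private IP getsockname() then reports.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(('8.8.8.8', 80))   # any routable address works
        return s.getsockname()[0]
    finally:
        s.close()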

6.2 Subsystem Status

For communication, latency and range are the two main criteria that have to be considered. Latency correlates with
the amount of bit information to be transported, while different distances are necessary for different circumstances.
For motor control from the Myo controller to the laptop, the latency is calculated to be 0.33 s, assuming the control
code is 1 Mb; this code can be larger or smaller depending on how sophisticated the code is programmed to be. The
duration is attributed to the Bluetooth module sending information between the Myo and the computer, which only
has a bandwidth of 3 Mb/s. This assumes a perfect-condition scenario with no disruptions. Visual feedback, which
has the largest bit transfer, would take approximately 0.045 s to transport at full load back to the computer. For
sensor response transfer from the arm to the Myo, the latency would be less than 0.01 s due to the very small amount
of data transported. As for the ranges of the various communication methods, the Bluetooth module has a limited
range of 15 m; beyond this distance, the computer can no longer adequately receive the Bluetooth signal transmitted
by the Myo controller. Between the Raspberry Pi and the laptop, the range is beyond 10 m; the 10 m figure is an
estimate of the minimum required for this specific application, but ultimately the range is directly linked to the
range of the wireless network it's connected to. Lastly, the Kinect has only around a 1-3 m detection range in order
to accurately map the user's arm and transmit the information to the laptop for processing.
Table 6.1
Communication Specification Compliance Matrix

Specification                             Min     Nominal     Max
Latency (Motor Control, 1 Mb)                     0.33 s
Latency (Visual Feedback, 13.06 MB)*              0.045 s
Latency (Sensor Feedback, 60 B)                   <0.01 s
Distance (Myo)                                                15 m
Distance (Raspberry Pi)                   10 m+
Distance (Kinect)                         1 m                 3 m

*Video bit rate = pixel count x motion factor x 0.07 / 1000 = bit rate in kbps
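The latency figures above follow directly from payload size divided by link bandwidth, and the visual-feedback size comes from the footnoted bit-rate rule of thumb. A sketch of the arithmetic (values taken from this section; 3 Mb/s is the stated Bluetooth bandwidth):

def latency_seconds(size_bits, bandwidth_bps):
    # Ideal-case transfer time: payload / bandwidth, assuming no disruptions
    return float(size_bits) / bandwidth_bps

def video_kbps(pixel_count, motion_factor):
    # Footnote rule of thumb: pixel count x motion factor x 0.07 / 1000
    return pixel_count * motion_factor * 0.07 / 1000

print(latency_seconds(1e6, 3e6))   # motor control: 1 Mb over Bluetooth ~ 0.33 s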

6.3 Subsystem Challenges and Solutions

There were many challenges faced with the communications. The school network is WPA2 Enterprise, as opposed to
the typical WPA used by home networks, and thus many difficulties arose. One of the challenges encountered early on
was figuring out how to configure the Pi for wireless control. Despite trying to access the network by setting the
netmask, network, broadcast, and gateway IP acquired from the helpdesk, the Wi-Fi dongle still refused to connect.
Eventually, it was determined that the WPA supplicant file on the Pi had to be configured, which finally enabled
wireless connection of the Wi-Fi dongle. Another major challenge was the dynamic IP of the Pi: the school router kept
supplying the Pi with a dynamic IP that's inaccessible remotely. Attempts at forwarding the IP address out of the Pi to
a 3rd-party DNS failed, as did an attempt to program the Pi to email the IPv4 address to a user account. At the present
moment, this difficulty has yet to be resolved.

6.4 Subsystem Technical Details

For the communication network, 3 subsections have to be considered: the communication between the Myo gesture
controller and the computer, between the Kinect and the computer, and between the Raspberry Pi 2 and the computer.
For the Kinect, the communication is through USB 3.0, which allows a secure flow of data at 3 MHz; since this is a
wired USB connection, both the transmission size and the speed are very high. Between the Myo hand gesture
controller and the laptop, the communication method is Bluetooth; the Myo Connect software is necessary to act as
the linker that permits the exchange of data packages. As for the connection between the Raspberry Pi 2 and the
computer, socket communication is the primary method of transporting data packages. Since the Kinect sends
information directly and the Myo utilizes the Myo Connect software to send and receive information, the Pi 2 is the
primary focus for setting up the communication interface. The Raspberry Pi is set up as the client while the laptop is
set up as the server. Using putty to first remotely access the Pi, it can then be commanded to run the client while the
server code is active on the laptop. This establishes the connection between the laptop and the Pi, and packages of
information (whether JPEG or text files) can then be sent.
Logic For Basic Text File Exchange
Server Code
import socket

HOST = '0.0.0.0'
PORT = 3820
socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
socket.bind((HOST, PORT))
socket.listen(1)
conn, addr = socket.accept()
with open('myTransfer.txt', 'wb') as file_to_write:
    while True:
        data = conn.recv(1024)
        print ('data')
        if not data:
            break
        file_to_write.write(data)
socket.close()
For the server portion of the code, the host is assigned the 0.0.0.0 address so that all local IP addresses are able to
access and link to this file. The port of access is 3820.
Client Code
import socket

HOST = '10.202.117.93'  # server name goes in here
PORT = 3820
socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
socket.connect((HOST, PORT))
fileToSend = open('myUpload.txt', 'rb')
while True:
    data = fileToSend.readline()
    if data:
        socket.send(data)
    else:
        break
fileToSend.close()
print ('end')
socket.close()
exit()
The host address for the client is the IPv4 address of the host machine (set up as the server) that the client is trying to
connect to.

6.5 Subsystem Testing

The testing of the communications network between the computer and the Raspberry Pi evaluates the types of data
packages as well as the sizes of the data packages that can be transported across this network. Furthermore, a distance
test is conducted to make sure that the range is sufficient for this method of communication.

Test Setup
In order to test TCP socket communication, the server and the client have to be set up. It's best to set the Raspberry
Pi 2 as the client and the laptop as the server. When running the test, be sure to run the server program first, followed
by the client program, for a successful connection. Then further test the communication capabilities by sending text
files and JPGs after modifying the client/server code. When sending more sophisticated files, it's advisable to create
another file alongside the client and have it call functions defined in the client. Make sure blank files, whether JPG
or text, are created so that the transmitted data can be written into them.

Data

Figure 6.5.1: Camera photo data being transmitted.


There are still minor issues when transmitting JPEG files through socket communication that have yet to be resolved.
The issue at the moment is attributed to the fact that the received data is not converted into the correct format: due to
the UTF-8 text layer, it is written out as hexadecimal instead. This issue will certainly be resolved before the beginning
of 404.

Figure 6.5.2: Incorrectly received JPG

Figure 6.5.3: Correctly received JPG


When transmitting files such as JPGs, be sure that the encoded data package is decoded properly. If the decoder fails
to decode the encoded blocks of data in the correct format, the outcome may be undesirable, such as hexadecimal
output instead of the intended bytes, resulting in an inaccessible file.
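One likely fix, sketched below under the assumption that the root cause really is the UTF-8 text layer, is to keep the transfer binary end to end: open the destination file in 'wb' mode and write the raw received bytes, with no codecs/UTF-8 step in between (a hedged sketch, not the team's final code):

def receive_file(conn, out_path):
    # Binary-safe receive loop: write raw bytes, never decode them
    with open(out_path, 'wb') as f:
        while True:
            chunk = conn.recv(1024)
            if not chunk:
                break
            f.write(chunk)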

Conclusion
As described above, JPEG transmission through socket communication still has one minor unresolved issue: the
received data is written through a UTF-8 text layer and comes out as hexadecimal rather than raw bytes. This will be
resolved before the beginning of 404.


7 Conclusions
This report shows the progress of the entire Modular Robotic Arm system. Many milestones have been accomplished
for each subsystem presented. The project combines electronics, software, and mechanics heavily, and it's important
to document the progression of each side because of how much the team has learned from designing, coding, and
building each subsystem. Currently, all the subsystems are working and are ready for integration. Each subsystem
still needs refinement in some aspects, but early in the upcoming semester, driving the system from user input all the
way through to the control is definitely possible. Further designs for the mechanical side of the system will continue
to be created before the upcoming semester, so that the main focus of the following semester will be primarily on the
programs necessary to run the system.


References
AL5D Assembly Guide (Lynxmotion.com), 7 Oct 2009
How Does the Kinect 2 Compare to the Kinect 1 (Zugara.com), 9 Dec 2014
Is It Possible to Predict How Long a Surgery Will Last? (Medscape.com), 14 Jul 2010
Kinect for Windows Sensor Components and Specifications (Microsoft.com), Mar 2016
Methods and Apparatus for Maintaining Secure Connections in a Wireless Communication Network (US Patent No. 20110010539), 13 Jan 2011
Panda Wireless 802.11n Wireless USB Adapter (PandaWireless.com), Feb 2014
PCA9685 Datasheet (Adafruit.com), 16 Apr 2015
Raspberry Pi 2 GPIO Electrical Specifications (Mosaic Documentation Web), Mar 2016
Raspberry Pi 2 FAQ (RaspberryPi.org), Mar 2016
Single Chip 2.4 GHz Transceiver - Nordic Semiconductor (Sparkfun.com), Mar 2008


Appendix A
Budget Table

Table A-1: Project Budget

Description                       Quantity    Item Cost    Shipping    Subtotal
Lynxmotion Robotic Arm            1           $0           $0          $0
Keyboard                          1           $10.70       $0          $10.70
Wood                              4           $20.20       $0          $20.20
Screws                            50          $2.48        $0          $2.48
Corner Bracket                    4           $2.98        $0          $2.98
Drill Bits                        10          $15.98       $0          $15.98
Spray Pin                         20          $5.88        $0          $5.88
Buck Converter                    1           $12.60       $0          $12.60
Panda Wireless PAU06              1           $19.99       $0          $19.99
Raspberry Pi 2 Model B            1           $49.99       $0          $49.99
AD/DA Expansion Board             1           $38.99       $0          $38.99
PB4600 External Power Supply      1           $19.99       $0          $19.99
EXP12200 12 Volt 20 Ah Battery    1           $40.00       $0          $40.00
12 V Battery Charger              1           $19.88       $0          $19.88
Total                                                                  $259.66

Appendix B
Power Subsystem Schematic and BOM - Kevin Bradshaw

B-1 Power Subsystem Schematic

Value             Description
PCA9685A          PWM Driver
HS-485HB          Servo Motor
HS-805BB          Servo Motor
HS-755HB          Servo Motor
HS-645MG          Servo Motor
HS-422            Servo Motor
Raspberry Pi 2    MCU
0.1 µF            Capacitors (5)
LynxMotion        Robotic Arm
EXP12200          Main System Power Supply
PB4600            MCU Power Supply

Appendix C
Control Subsystem Code - Kevin Bradshaw

#!/usr/bin/python
from __future__ import division
import time
import Adafruit_PCA9685
# Initialise the PWM device using the default address
pwm = Adafruit_PCA9685.PCA9685()
#=========================================================================
# ORIGIN INITIALIZATION AND DEFINITIONS
#=========================================================================
# Adafruit's Servo Motor Definition found at:
#https://learn.adafruit.com/downloads/pdf/adafruit-16-channel-servo-driver-with-raspberry-pi.pdf
def setServoPulse(channel, pulse):
    pulseLength = 1000000                  # 1,000,000 us per second
    pulseLength /= 60                      # 60 Hz
    print "%d us per period" % pulseLength
    pulseLength /= 4096                    # 12 bits of resolution
    print "%d us per bit" % pulseLength
    pulse *= 1000
    pulse /= pulseLength
    pwm.set_pwm(channel, 0, pulse)
# All pulse lengths are out of 4096
Shoulder_Origin_1 = 1263 # Middle pulse = 1262.5
Shoulder_Origin_2 = 1188 # Middle pulse = 1187.5
Elbow_Origin_1 = 565 # Minimum pulse
Wrist_Origin_1 = 1303 # Middle pulse
Wrist_Origin_2 = 1165 # Middle pulse
#Initialize Pulses at the specified origin of the User
StepSize = 10   # Decrease for precision, increase for speed
Shoulder_1 = Shoulder_Origin_1
Shoulder_2 = Shoulder_Origin_2
Elbow_1 = Elbow_Origin_1
Wrist_1 = Wrist_Origin_1
Wrist_2 = Wrist_Origin_2
#Initialize Duty Cycles
Shoulder_1_DTY = 0
Shoulder_2_DTY = 0
Elbow_1_DTY = 0
Wrist_1_DTY = 0
Wrist_2_DTY = 0


# Set PWM Driver Frequency


SlowFreq = 550
MediumFreq = 600
FastFreq = 650
# Set Arm to origin position until user calibration
pwm.set_pwm(15, 0, Shoulder_Origin_1)
pwm.set_pwm(14, 0, Shoulder_Origin_2)
pwm.set_pwm(13, 0, Elbow_Origin_1)
pwm.set_pwm(12, 0, Wrist_Origin_1)
pwm.set_pwm(11, 0, Wrist_Origin_2)
#pwm.set_pwm_freq(SlowFreq)
#pwm.set_pwm_freq(MediumFreq)
#pwm.set_pwm_freq(FastFreq)
#=========================================================================
# ARM MOVEMENT DEFINITIONS
#=========================================================================
# Returns a single character from standard input
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# This definition was obtained from:
# http://stackoverflow.com/questions/22397289/finding-the-values-of-the-arrow-keys-in-python-why-are-they-triples
def getchar():
    import tty, termios, sys
    fd = sys.stdin.fileno()
    old_settings = termios.tcgetattr(fd)
    try:
        tty.setraw(sys.stdin.fileno())
        ch = sys.stdin.read(1)   # get input from keyboard
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)
    return ch
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
print 'Press Q to exit into Mode 2'
# Returns a single Duty Cycle for the specific motor excited
def getDTY(Pulse):
    DTY = float((Pulse/4095)*100)
    return DTY
# ----------------------------------------------------------------------
# Returns a single angle for the position of the specific motor with respect to the origin
def angle1(Pulse):
    Min_Pulse = 415
    Max_Pulse = 2110
    Min_Range = 0
    Max_Range = 180
    theta = ((Pulse - Min_Pulse)/(Max_Pulse - Min_Pulse))*180
    return theta
def angle2(Pulse):
    Min_Pulse = 695
    Max_Pulse = 1680
    Min_Range = -90
    Max_Range = 90
    theta = ((Pulse - Min_Pulse)/(Max_Pulse - Min_Pulse))*180
    return theta
def angle3(Pulse):
    Min_Pulse = 565
    Max_Pulse = 2000
    Min_Range = 0
    Max_Range = 150
    theta = ((Pulse - Min_Pulse)/(Max_Pulse - Min_Pulse))*180
    return theta
def angle4(Pulse):
    Min_Pulse = 425
    Max_Pulse = 2180
    Min_Range = -90
    Max_Range = 90
    theta = ((Pulse - Min_Pulse)/(Max_Pulse - Min_Pulse))*180
    return theta
def angle5(Pulse):
    Min_Pulse = 360
    Max_Pulse = 1960
    Min_Range = -90
    Max_Range = 90
    theta = ((Pulse - Min_Pulse)/(Max_Pulse - Min_Pulse))*180
    return theta
# ----------------------------------------------------------------------
# Specific Angles are calculated
def getPulse1(Angle):
    Min_Pulse = 415
    Max_Pulse = 2110
    Min_Range = 0
    Max_Range = 180
    Pulse = ((Angle/180)*(Max_Pulse - Min_Pulse)) + Min_Pulse
    return Pulse
def getPulse2(Angle):
    Min_Pulse = 695
    Max_Pulse = 1680
    Min_Range = -90
    Max_Range = 90
    Pulse = ((Angle/180)*(Max_Pulse - Min_Pulse)) + Min_Pulse
    return Pulse
def getPulse3(Angle):
    Min_Pulse = 565
    Max_Pulse = 2000
    Min_Range = 0
    Max_Range = 150
    Pulse = ((Angle/180)*(Max_Pulse - Min_Pulse)) + Min_Pulse
    return Pulse
def getPulse4(Angle):
    Min_Pulse = 425
    Max_Pulse = 2100
    Min_Range = -90
    Max_Range = 90
    Pulse = ((Angle/180)*(Max_Pulse - Min_Pulse)) + Min_Pulse
    return Pulse
def getPulse5(Angle):
    Min_Pulse = 360
    Max_Pulse = 1960
    Min_Range = -90
    Max_Range = 90
    Pulse = ((Angle/180)*(Max_Pulse - Min_Pulse)) + Min_Pulse
    return Pulse
#=========================================================================
# Set Pulse, Get Angle Mode
while 1:
    ch = getchar()
    # Shoulder Motor 1
    if ch=='a' or ch=='A':
        # Left
        while ch == 'a' or ch == 'A':
            pwm.set_pwm(15, 0, Shoulder_1)
            Shoulder_1 = Shoulder_1 + StepSize
            Shoulder_1_DTY = round(getDTY(Shoulder_1), 2)
            Shoulder_Angle_1 = round(angle1(Shoulder_1), 2)
            print 'Shoulder Motor 1 Pulse is', Shoulder_1
            print 'Shoulder Motor 1 Duty Cycle is', Shoulder_1_DTY
            print 'Shoulder Motor 1 is at', Shoulder_Angle_1, 'degrees from the origin'
            ch = 'r'   # r for reset
            time.sleep(0.01)
            if Shoulder_1 >= 2115:
                Shoulder_1 = Shoulder_1 - StepSize
                print 'Shoulder Motor 1 Upper Bound Met'
            # print 'Testing keypress A', ch
    elif ch=='d' or ch=='D':
        # Right
        while ch == 'd' or ch == 'D':
            pwm.set_pwm(15, 0, Shoulder_1)
            Shoulder_1 = Shoulder_1 - StepSize
            Shoulder_1_DTY = round(getDTY(Shoulder_1), 2)
            Shoulder_Angle_1 = round(angle1(Shoulder_1), 2)
            print 'Shoulder Motor 1 Pulse is', Shoulder_1
            print 'Shoulder Motor 1 Duty Cycle is', Shoulder_1_DTY
            print 'Shoulder Motor 1 is at', Shoulder_Angle_1, 'degrees from the origin'
            ch = 'r'   # r for reset
            time.sleep(0.01)
            if Shoulder_1 <= 410:
                Shoulder_1 = Shoulder_1 + StepSize
                print 'Shoulder Motor 1 Lower Bound Met'
            # print 'Testing keypress D', ch

    # Shoulder Motor 2
    if ch=='w' or ch=='W':
        # Left
        while ch == 'w' or ch == 'W':
            pwm.set_pwm(14, 0, Shoulder_2)
            Shoulder_2 = Shoulder_2 + StepSize
            Shoulder_2_DTY = round(getDTY(Shoulder_2), 2)
            Shoulder_Angle_2 = round(angle2(Shoulder_2), 2)
            print 'Shoulder Motor 2 Pulse is', Shoulder_2
            print 'Shoulder Motor 2 Duty Cycle is', Shoulder_2_DTY
            print 'Shoulder Motor 2 is at', Shoulder_Angle_2, 'degrees from the origin'
            ch = 'r'   # r for reset
            time.sleep(0.01)
            if Shoulder_2 >= 1680:
                Shoulder_2 = Shoulder_2 - StepSize
                print 'Shoulder Motor 2 Upper Bound Met'
            # print 'Testing keypress W', ch
    elif ch=='s' or ch=='S':
        # Right
        while ch == 's' or ch == 'S':
            pwm.set_pwm(14, 0, Shoulder_2)
            Shoulder_2 = Shoulder_2 - StepSize
            Shoulder_2_DTY = round(getDTY(Shoulder_2), 2)
            Shoulder_Angle_2 = round(angle2(Shoulder_2), 2)
            print 'Shoulder Motor 2 Pulse is', Shoulder_2
            print 'Shoulder Motor 2 Duty Cycle is', Shoulder_2_DTY
            print 'Shoulder Motor 2 is at', Shoulder_Angle_2, 'degrees from the origin'
            ch = 'r'   # r for reset
            time.sleep(0.01)
            if Shoulder_2 <= 695:
                Shoulder_2 = Shoulder_2 + StepSize
                print 'Shoulder Motor 2 Lower Bound Met'
            # print 'Testing keypress S', ch

    # Elbow Motor 1
    if ch=='t' or ch=='T':
        # Left
        while ch == 't' or ch == 'T':
            pwm.set_pwm(13, 0, Elbow_1)
            Elbow_1 = Elbow_1 + StepSize
            Elbow_1_DTY = round(getDTY(Elbow_1), 2)
            Elbow_Angle_1 = round(angle3(Elbow_1), 2)
            print 'Elbow Motor 1 Pulse is', Elbow_1
            print 'Elbow Motor 1 Duty Cycle is', Elbow_1_DTY
            print 'Elbow Motor 1 is at', Elbow_Angle_1, 'degrees from the origin'
            ch = 'r'   # r for reset
            time.sleep(0.01)
            if Elbow_1 >= 2000:
                Elbow_1 = Elbow_1 - StepSize
                print 'Elbow Motor 1 Upper Bound Met'
            # print 'Testing keypress T', ch
    elif ch=='g' or ch=='G':
        # Right
        while ch == 'g' or ch == 'G':
            pwm.set_pwm(13, 0, Elbow_1)
            Elbow_1 = Elbow_1 - StepSize
            Elbow_1_DTY = round(getDTY(Elbow_1), 2)
            Elbow_Angle_1 = round(angle3(Elbow_1), 2)
            print 'Elbow Motor 1 Pulse is', Elbow_1
            print 'Elbow Motor 1 Duty Cycle is', Elbow_1_DTY
            print 'Elbow Motor 1 is at', Elbow_Angle_1, 'degrees from the origin'
            ch = 'r'   # r for reset
            time.sleep(0.01)
            if Elbow_1 <= 565:
                Elbow_1 = Elbow_1 + StepSize
                print 'Elbow Motor 1 Lower Bound Met'
            # print 'Testing keypress G', ch

    # Wrist Motor 1
    if ch=='j' or ch=='J':
        # Left
        while ch == 'j' or ch == 'J':
            pwm.set_pwm(12, 0, Wrist_1)
            Wrist_1 = Wrist_1 + StepSize
            Wrist_1_DTY = round(getDTY(Wrist_1), 2)
            Wrist_Angle_1 = round(angle4(Wrist_1), 2)
            print 'Wrist Motor 1 Pulse is', Wrist_1
            print 'Wrist Motor 1 Duty Cycle is', Wrist_1_DTY
            print 'Wrist Motor 1 is at', Wrist_Angle_1, 'degrees from the origin'
            ch = 'r'   # r for reset
            time.sleep(0.01)
            if Wrist_1 >= 2180:
                Wrist_1 = Wrist_1 - StepSize
                print 'Wrist Motor 1 Upper Bound Met'
            # print 'Testing keypress J', ch
    elif ch=='l' or ch=='L':
        # Right
        while ch == 'l' or ch == 'L':
            pwm.set_pwm(12, 0, Wrist_1)
            Wrist_1 = Wrist_1 - StepSize
            Wrist_1_DTY = round(getDTY(Wrist_1), 2)
            Wrist_Angle_1 = round(angle4(Wrist_1), 2)
            print 'Wrist Motor 1 Pulse is', Wrist_1
            print 'Wrist Motor 1 Duty Cycle is', Wrist_1_DTY
            print 'Wrist Motor 1 is at', Wrist_Angle_1, 'degrees from the origin'
            ch = 'r'   # r for reset
            time.sleep(0.01)
            if Wrist_1 <= 425:
                Wrist_1 = Wrist_1 + StepSize
                print 'Wrist Motor 1 Lower Bound Met'
            # print 'Testing keypress L', ch

    # Wrist Motor 2
    if ch=='i' or ch=='I':
        # Left
        while ch == 'i' or ch == 'I':
            pwm.set_pwm(11, 0, Wrist_2)
            Wrist_2 = Wrist_2 + StepSize
            Wrist_2_DTY = round(getDTY(Wrist_2), 2)
            Wrist_Angle_2 = round(angle5(Wrist_2), 2)
            print 'Wrist Motor 2 Pulse is', Wrist_2
            print 'Wrist Motor 2 Duty Cycle is', Wrist_2_DTY
            print 'Wrist Motor 2 is at', Wrist_Angle_2, 'degrees from the origin'
            ch = 'r'   # r for reset
            time.sleep(0.01)
            if Wrist_2 >= 1960:
                #Wrist_2 = Wrist_2 - StepSize
                print 'Wrist Motor 2 Upper Bound Met'
            # print 'Testing keypress I', ch
    elif ch=='k' or ch=='K':
        # Right
        while ch == 'k' or ch == 'K':
            pwm.set_pwm(11, 0, Wrist_2)
            Wrist_2 = Wrist_2 - StepSize
            Wrist_2_DTY = round(getDTY(Wrist_2), 2)
            Wrist_Angle_2 = round(angle5(Wrist_2), 2)
            print 'Wrist Motor 2 Pulse is', Wrist_2
            print 'Wrist Motor 2 Duty Cycle is', Wrist_2_DTY
            print 'Wrist Motor 2 is at', Wrist_Angle_2, 'degrees from the origin'
            ch = 'r'   # r for reset
            time.sleep(0.01)
            if Wrist_2 <= 360:
                #Wrist_2 = Wrist_2 + StepSize
                print 'Wrist Motor 2 Lower Bound Met'
            # print 'Testing keypress K', ch

    # Vary PWM Frequency
    if ch=='b' or ch=='B':
        # Left
        while ch == 'b' or ch == 'B':
            FastFreq = FastFreq + 1
            pwm.set_pwm_freq(FastFreq)
            print 'Current Frequency is', FastFreq
            ch = 'r'   # r for reset
            time.sleep(0.01)
            # print 'Testing keypress B', ch
    elif ch=='n' or ch=='N':
        # Right
        while ch == 'n' or ch == 'N':
            FastFreq = FastFreq - 1
            pwm.set_pwm_freq(FastFreq)
            print 'Current Frequency is', FastFreq
            ch = 'r'   # r for reset
            time.sleep(0.01)

    # Reset Arm and Change Step Size
    if ch=='o' or ch=='O':
        while ch == 'o' or ch == 'O':
            print 'Current StepSize:', StepSize
            StepSize = input("Enter a new StepSize: ")
            pwm.set_pwm(15, 0, Shoulder_Origin_1)
            pwm.set_pwm(14, 0, Shoulder_Origin_2)
            pwm.set_pwm(13, 0, Elbow_Origin_1)
            pwm.set_pwm(12, 0, Wrist_Origin_1)
            pwm.set_pwm(11, 0, Wrist_Origin_2)
            ch = 'r'   # r for reset
            time.sleep(0.01)
    # Quit Mode
    elif ch=='q' or ch=='Q':
        break
    else:
        print '===================================================='

#==========================================================================
# Set Angle, Get a Pulse Mode
while (True):
    # Quit
    ch2 = getchar()
    if ch2 == 'q' or ch2 == 'Q':
        break
    else:
        print '===================================================='
        print 'Mapping Angles from the Kinect are as follows:'
        print 'Shoulder 1 Motor ranges from 0 to 180 degrees'
        print 'Shoulder 2 Motor ranges from -90 to 90 degrees'
        print 'Elbow 1 Motor ranges from 0 to 150 degrees'
        print 'Wrist 1 Motor ranges from 0 to 180 degrees'
        print 'Wrist 2 Motor ranges from 0 to 180 degrees'
        print ''
        print 'Shoulder 1 Origin = 90 degrees'
        print 'Shoulder 2 Origin = 0 degrees'
        print 'Elbow 1 Origin = 0 degrees'
        print 'Wrist 1 Origin = 0 degrees'
        print 'Wrist 2 Origin = 0 degrees'
        print ''
        print ''
        Shoulder_Angle_1 = input("Enter an Angle for Shoulder 1: ")
        Shoulder_Angle_2 = input("Enter an Angle for Shoulder 2: ") + 90
        Elbow_Angle_1 = input("Enter an Angle for Elbow 1: ")
        Wrist_Angle_1 = input("Enter an Angle for Wrist 1: ") + 90
        Wrist_Angle_2 = input("Enter an Angle for Wrist 2: ") + 90
        Shoulder_Pulse_1 = int(round(getPulse1(Shoulder_Angle_1), 2))
        Shoulder_Pulse_2 = int(round(getPulse2(Shoulder_Angle_2), 2))
        Elbow_Pulse_1 = int(round(getPulse3(Elbow_Angle_1), 2))
        Wrist_Pulse_1 = int(round(getPulse4(Wrist_Angle_1), 2))
        Wrist_Pulse_2 = int(round(getPulse5(Wrist_Angle_2), 2))

        pwm.set_pwm(15, 0, Shoulder_Pulse_1)
        pwm.set_pwm(14, 0, Shoulder_Pulse_2)
        pwm.set_pwm(13, 0, Elbow_Pulse_1)
        pwm.set_pwm(12, 0, Wrist_Pulse_1)
        pwm.set_pwm(11, 0, Wrist_Pulse_2)
        print 'Shoulder 1 Pulse:', Shoulder_Pulse_1
        print 'Shoulder 2 Pulse:', Shoulder_Pulse_2
        print 'Elbow 1 Pulse:', Elbow_Pulse_1
        print 'Wrist 1 Pulse:', Wrist_Pulse_1
        print 'Wrist 2 Pulse:', Wrist_Pulse_2
        time.sleep(1)


Appendix D

Communications Subsystem Code - Fuhua Song

Server Portion:
import socket
import os.path
import codecs

host = '0.0.0.0'
port = 5560
filePath = ('C:\\Users\\Asian\\Desktop\\403\\client\\demo\\version 2\\a.jpg')

def setupServer():
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    print("Socket created.")
    try:
        s.bind((host, port))
    except socket.error as msg:
        print(msg)
    print("Socket bind complete.")
    return s

def setupConnection():
    s.listen(1)   # Allows one connection at a time.
    conn, address = s.accept()
    print("Connected to: " + address[0] + ":" + str(address[1]))
    return conn
#def storeFile(filePath):
#    picFile = open(filePath, 'wb')
#    print("Opened the file.")
#    pic = conn.recv(1024)
#    while pic:
#        print("Receiving picture still.")
#        picFile.write(pic)
#        pic = conn.recv(1024)
#    picFile.close()

def dataTransfer(conn):
    # A big loop that sends/receives data until told not to.
    while True:
        # Receive the data
        data = conn.recv(1024)   # receive the data
        data = data.decode('utf-8')
        # Split the data such that you separate the command
        # from the rest of the data.
        dataMessage = data.split(' ', 1)
        command = dataMessage[0]
        if command == 'GET':
            reply = GET()                 # GET() lives in the (currently inaccessible) Main file
        elif command == 'REPEAT':
            reply = REPEAT(dataMessage)   # REPEAT() is likewise defined elsewhere
        elif command == 'STORE':
            print("Store command received. Time to save a picture")
            picFile = codecs.open(filePath, 'w', 'utf-8')   # text mode; see Section 6.5
            print("Opened the file.")
            pic = conn.recv(5000)   # original 1024
            while pic:
                print("Receiving picture still.")
                picFile.write(str(pic))
                pic = conn.recv(1024)
            picFile.close()
            # image = header.split(b'keep-alive\r\n\r\n', 1)[-1]
            # open(filePath, 'wb').write(image)
            # storeFile(filePath)
            reply = "File stored."
        elif command == 'EXIT':
            print("Client exited :(")
            break
        elif command == 'KILL':
            print("Shut server.")
            s.close()
            break
        else:
            reply = 'Unknown Command'
        # Send the reply back to the client
        conn.sendall(str.encode(reply))
        print("Data has been sent!")
    conn.close()

s = setupServer()
while True:
    try:
        conn = setupConnection()
        dataTransfer(conn)
        # storeFile(filePath)
    except:
        break
The Client and Main files are unfortunately inaccessible at the moment.


Appendix E

Software Subsystem <Yuan Tian>


Code Flow Chart:

Note: The grey boxes indicate the protocols that the current code can support, and the white boxes represent the features
that are under development.

The GUI and Kinect Code (Full Version):

#include"Simple_window.h"
#include"Graph.h"
#include"function.h"
#include <Windows.h>
#include <NuiApi.h>
#include <FL/Fl.H>
#include <FL/Fl_Double_Window.H>


#include <FL/Fl_Box.H>
#include <FL/fl_draw.H>
#include <math.h>
#include <stdio.h>
#include <time.h>
#include "function.h"
#include <stdlib.h>
#include <FL/Fl_Button.H>
#include "Window.h"
#include "Graph.h"
#include "Point.h"
#include <FL/Fl_Text_Display.H>
#include <string>

#define BG_COLOR 45
#define TICK_COLOR 50
#define CIRC_COLOR 0

//upper limb of arm


class vector_1{
double x;
double y;
double z;
public:
void set_x(double xval){
x = xval;
}
void set_y(double yval){
y = yval;
}
void set_z(double zval){
z = zval;
}
double get_x(void){
return x;
}
double get_y(void){
return y;
}
double get_z(void){
return z;
}
};
//lower limb of arm
class vector_2{
double x;
double y;
double z;


public:
void set_x(double xval){
x = xval;
}
void set_y(double yval){
y = yval;
}
void set_z(double zval){
z = zval;
}
double get_x(void){
return x;
}
double get_y(void){
return y;
}
double get_z(void){
return z;
}
};

// bool functions
class logic{
bool a1;
bool a2;
bool a3;
bool a4;
bool a5;
bool a6;
public:
logic::logic() :a1(false), a2(false), a3(false),a4(false),a5(false),a6(false){}
bool get_a1(void){ return a1;}
bool get_a2(void){ return a2;}
bool get_a3(void){ return a3;}
bool get_a4(void){ return a4;}
bool get_a5(void){ return a5;}
bool get_a6(void){ return a6;}
void set_a1(bool xa1){ a1 = xa1; }
void set_a2(bool xa2){ a2 = xa2; }
void set_a3(bool xa3){ a3 = xa3; }
void set_a4(bool xa4){ a4 = xa4; }
void set_a5(bool xa5){ a5 = xa5; }
void set_a6(bool xa6){ a6 = xa6; }
};
class stage{
int StageNumber;
public:
stage::stage() : StageNumber(0){};


void set_stage(int x){ StageNumber = x; }


int get_stage(void){ return StageNumber; }
};
class calibr_data{
double user_hand_x_left;
double user_hand_x_right;
double user_origin_elbow_x;
double user_origin_elbow_y;
double user_origin_elbow_z;
double user_origin_wrist_x;
double user_origin_wrist_y;
double user_origin_wrist_z;
double user_origin_shoulder_x;
double user_origin_shoulder_y;
double user_origin_shoulder_z;

//Cali_step=0 right boundary
//Cali_step=1 left boundary
//Cali_step=2 origin position

double vector3_x;
double vector3_y;
double vector3_z;
double vector4_x;
double vector4_y;
double vector4_z;
public:
calibr_data::calibr_data() :user_hand_x_left(0),
user_hand_x_right(0), user_origin_elbow_x(0),
user_origin_elbow_y(0), user_origin_elbow_z(0),
user_origin_wrist_x(0), user_origin_wrist_y(0),
user_origin_wrist_z(0),user_origin_shoulder_x(0),
user_origin_shoulder_y(0),user_origin_shoulder_z(0),
vector3_x(0),vector3_y(0),vector3_z(0),
vector4_x(0), vector4_y(0), vector4_z(0){}
void set_uhxr(double x){ user_hand_x_right = x; }
void set_uhxl(double x){ user_hand_x_left = x; }
void set_uoex(double x){ user_origin_elbow_x = x; }
void set_uoey(double x){ user_origin_elbow_y = x; }
void set_uoez(double x){ user_origin_elbow_z = x; }
void set_uowx(double x){ user_origin_wrist_x = x; }
void set_uowy(double x){ user_origin_wrist_y = x; }
void set_uowz(double x){ user_origin_wrist_z = x; }
void set_uosx(double x){ user_origin_shoulder_x = x; }
void set_uosy(double x){ user_origin_shoulder_y = x; }
void set_uosz(double x){ user_origin_shoulder_z = x; }
void set_v3_x(double x){ vector3_x = x; }
void set_v3_y(double x){ vector3_y = x; }
void set_v3_z(double x){ vector3_z = x; }


void set_v4_x(double x){ vector4_x = x; }


void set_v4_y(double x){ vector4_y = x; }
void set_v4_z(double x){ vector4_z = x; }

double get_uhxr(void){ return user_hand_x_right; }


double get_uhxl(void){ return user_hand_x_left; }
double get_uoex(void){ return user_origin_elbow_x;}
double get_uoey(void){ return user_origin_elbow_y;}
double get_uoez(void){ return user_origin_elbow_z;}
double get_uowx(void){ return user_origin_wrist_x;}
double get_uowy(void){ return user_origin_wrist_y;}
double get_uowz(void){ return user_origin_wrist_z;}
double get_uosx(void){ return user_origin_shoulder_x; }
double get_uosy(void){ return user_origin_shoulder_y; }
double get_uosz(void){ return user_origin_shoulder_z; }
double get_v3_x(void){ return vector3_x; }
double get_v3_y(void){ return vector3_y; }
double get_v3_z(void){ return vector3_z; }
double get_v4_x(void){ return vector4_x; }
double get_v4_y(void){ return vector4_y; }
double get_v4_z(void){ return vector4_z; }
};
class theta_x{
double thetaX;
double angleX;
double arm_extend_length;
public:
theta_x::theta_x() : thetaX(0), angleX(0), arm_extend_length(0){}
void set_thetaX(double x){
thetaX = x;
}
void set_angleX(double x){
angleX = x;
}
void set_arm_extend_length(double x){
arm_extend_length = x;
}
double get_thetaX(void){
return thetaX;
}
double get_angleX(void){
return angleX;
}
double get_arm_extend_length(void){
return arm_extend_length;
}


};
class MyTimer : public Fl_Box {
void draw() {
// TELL BASE WIDGET TO DRAW ITS BACKGROUND
//Fl_Box::draw();
buff_Hand_X->text(to_string(res_Hand_X).c_str());
buff_Hand_Y->text(to_string(res_Hand_Y).c_str());
buff_Hand_Z->text(to_string(res_Hand_Z).c_str());
buff_Elbow_X->text(to_string(res_Elbow_X).c_str());
buff_Elbow_Y->text(to_string(res_Elbow_Y).c_str());
buff_Elbow_Z->text(to_string(res_Elbow_Z).c_str());
buff_Shoulder_X->text(to_string(res_Shoulder_X).c_str());
buff_Shoulder_Y->text(to_string(res_Shoulder_Y).c_str());
buff_Shoulder_Z->text(to_string(res_Shoulder_Z).c_str());
buff_thetaX->text(to_string(thetaX.get_angleX()).c_str());
buff_theta1->text(to_string(res_theta1_angle).c_str());
buff_theta2->text(to_string(res_theta2).c_str());
buff_diff_thetaX->text(to_string(diff_thetaX).c_str());
buff_diff_theta1->text(to_string(diff_theta1_angle).c_str());
buff_diff_theta2->text(to_string(diff_theta2_angle).c_str());
buff_SystemStutas->text(Sys_status.c_str());
buff2->text(to_string(Count_sec/20).c_str());
buff3->text(to_string(Count_min).c_str());
buff4->text(to_string(Count_hr).c_str());
}
static void Timer_CB(void *userdata) {
MyTimer *o = (MyTimer*)userdata;
if (o->flag){
NUI_SKELETON_FRAME ourframe;
if ((o->Count_sec) == 1199)
{
o->Count_sec = 0;
if (o->Count_min == 59)
{
o->Count_min = 0;
o->Count_hr++;
}
else{
o->Count_min++;
}
}
else{
o->Count_sec++;        //3
o->Count_sec_cali++;   //4
}
NuiSkeletonGetNextFrame(0, &ourframe);
for (int i = 0; i < 6; i++){
if (ourframe.SkeletonData[i].eTrackingState == NUI_SKELETON_TRACKED)
{
if (o->stage.get_stage() == 0){
o->Shoulder_center = ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_CENTER].z;
//cout << o->Shoulder_center << endl;

if (o->Shoulder_center<=1.8&&o->Shoulder_center>=1.5)
{
o->logic.set_a1(true);
//cout << "1" << endl;
}
else {
o->logic.set_a1(false);
o->stage.set_stage(0);
}
if (o->logic.get_a1() == true){
//cout << "3" << endl;
switch (o->Cali_step){
case 0:
if (o->logic.get_a2() == false){
o->temp_count = o->Count_sec_cali;
o->logic.set_a2(true);
//cout << "4" << endl;
}
else{
if (o->Count_sec_cali == (o->temp_count + 100))
{
o->user.set_uhxr(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_WRIST_RIGHT].x);
o->Cali_step = 1;
//cout << "5" << endl;
}
else if (o->Count_sec_cali < (o->temp_count + 100)){
o->Sys_status = "Stay where you are. Raise your arm straight and stretch to the right side";
}
else{ //cout << "6" << endl;
}
}
break;
case 1:
if (o->logic.get_a3() == false){
o->temp_count = o->Count_sec_cali;
o->logic.set_a3(true);
//cout << "3" << endl;
}
else{
if (o->Count_sec_cali == (o->temp_count + 100))
{
o->user.set_uhxl(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_WRIST_RIGHT].x);
o->Cali_step = 2;
}
else if (o->Count_sec_cali < (o->temp_count + 100)){
o->Sys_status = "Success. Now, please raise your arm straight and stretch to the left side";
}
else{}
}
break;
case 2:
if (o->logic.get_a4() == false){
o->temp_count = o->Count_sec_cali;
o->logic.set_a4(true);
}
else{
if (o->Count_sec_cali == (o->temp_count + 100))
{
o->user.set_uoex(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_ELBOW_RIGHT].x);
o->user.set_uoey(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_ELBOW_RIGHT].y);
o->user.set_uoez(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_ELBOW_RIGHT].z);
o->user.set_uowx(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_WRIST_RIGHT].x);
o->user.set_uowy(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_WRIST_RIGHT].y);
o->user.set_uowz(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_WRIST_RIGHT].z);
o->user.set_uosx(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_RIGHT].x);
o->user.set_uosy(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_RIGHT].y);
o->user.set_uosz(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_RIGHT].z);
/* test version 1: doesn't work well */
o->user.set_v3_x(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_ELBOW_RIGHT].x - ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_RIGHT].x);
o->user.set_v3_y(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_ELBOW_RIGHT].y - ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_RIGHT].y);
o->user.set_v3_z(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_ELBOW_RIGHT].z - ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_RIGHT].z);
o->stage.set_stage(1);
}
else if (o->Count_sec_cali < (o->temp_count + 100)){
o->Sys_status = "Success. Now, please lay down your arm naturally to the side of your body";
}
else{}
}
break;
default:
break;
}
}
else{
if (o->Shoulder_center > 1.8){
o->Sys_status = "Please move forward a little bit to get into sampling range";
}
else if (o->Shoulder_center < 1.5){
o->Sys_status = "Please back up a little bit to get into sampling range";
}
}
}
if (o->stage.get_stage() == 1){
o->Sys_status = "Enter mimicking stage";
o->res_Hand_X = ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_WRIST_RIGHT].x;
o->res_Hand_Y = ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_WRIST_RIGHT].y;
o->res_Hand_Z = ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_WRIST_RIGHT].z;
o->res_Elbow_X = ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_ELBOW_RIGHT].x;
o->res_Elbow_Y = ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_ELBOW_RIGHT].y;
o->res_Elbow_Z = ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_ELBOW_RIGHT].z;
o->res_Shoulder_X = ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_RIGHT].x;
o->res_Shoulder_Y = ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_RIGHT].y;
o->res_Shoulder_Z = ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_RIGHT].z;
o->user.set_v4_x(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_HIP_CENTER].x - ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_CENTER].x);
o->user.set_v4_y(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_HIP_CENTER].y - ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_CENTER].y);
o->user.set_v4_z(ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_HIP_CENTER].z - ourframe.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_SHOULDER_CENTER].z);
//o->thetaX.set_arm_extend_length(sqrt(pow((o->res_Hand_X - o->user.get_uosx()), 2) + pow((o->res_Hand_Y - o->user.get_uosy()), 2)));

// plan 1 (Using Origin User Coordinates.)
/*
double diff = o->res_Hand_X - o->user.get_uosx();
double diff1 = abs(o->res_Hand_X - o->user.get_uosx());
double diff2 = abs(o->res_Hand_Y - o->user.get_uosy());
*/
// plan 2 (Use user's instant Coordinates)
double diff = o->res_Hand_X - o->res_Shoulder_X;
double diff1 = abs(o->res_Hand_X - o->res_Shoulder_X);
double diff2 = abs(o->res_Hand_Z - o->res_Shoulder_Z);

/*************************************************************************************/
//servo motor 1 (tested)
if (diff1 < 0.1 && diff2 < 0.1){
o->thetaX.set_thetaX(0);
o->thetaX.set_angleX(0);
}
else {
if (diff > 0){
//plan 1:
/*o->thetaX.set_thetaX(atan(abs(o->res_Hand_Y - o->user.get_uosy()) / abs(o->res_Hand_X - o->user.get_uosx())));
o->thetaX.set_angleX(atan(abs(o->res_Hand_Y - o->user.get_uosy()) / abs(o->res_Hand_X - o->user.get_uosx()))*57.29);
*/
//plan 2:
//o->thetaX.set_thetaX(atan(abs(o->res_Hand_Z - o->res_Shoulder_Z) / abs(o->res_Hand_X - o->res_Shoulder_X)));
//o->thetaX.set_angleX(atan(abs(o->res_Hand_Z - o->res_Shoulder_Z) / abs(o->res_Hand_X - o->res_Shoulder_X))*57.29);
//plan 3:
o->thetaX.set_thetaX(atan(abs(o->res_Hand_Z - o->res_Elbow_Z) / abs(o->res_Hand_X - o->res_Elbow_X)));
o->thetaX.set_angleX(atan(abs(o->res_Hand_Z - o->res_Elbow_Z) / abs(o->res_Hand_X - o->res_Elbow_X))*57.29);
}
else if (diff == 0){
o->thetaX.set_thetaX(1.57);
o->thetaX.set_angleX(90);
}
else
{
// plan 1:
/*
o->thetaX.set_thetaX(1.57 + atan(abs(o->res_Hand_Y - o->user.get_uosy()) / abs(o->res_Hand_X - o->user.get_uosx())));
o->thetaX.set_angleX(90 + atan(abs(o->res_Hand_Y - o->user.get_uosy()) / abs(o->res_Hand_X - o->user.get_uosx()))*57.29);
*/
// plan 2:
//o->thetaX.set_thetaX(1.57 + atan(abs(o->res_Hand_Z - o->res_Shoulder_Z) / abs(o->res_Hand_X - o->res_Shoulder_X)));
//o->thetaX.set_angleX(180 - atan(abs(o->res_Hand_Z - o->res_Shoulder_Z) / abs(o->res_Hand_X - o->res_Shoulder_X))*57.29);
// plan 3:
o->thetaX.set_thetaX(1.57 + atan(abs(o->res_Hand_Z - o->res_Elbow_Z) / abs(o->res_Hand_X - o->res_Elbow_X)));
o->thetaX.set_angleX(180 - atan(abs(o->res_Hand_Z - o->res_Elbow_Z) / abs(o->res_Hand_X - o->res_Elbow_X))*57.29);
}
}

/*************************************************************************************
*********************************************/
//servo motor 2 (tested)
o->v1.set_x(o->res_Elbow_X - o->res_Shoulder_X);
o->v1.set_y(o->res_Elbow_Y - o->res_Shoulder_Y);
o->v1.set_z(o->res_Elbow_Z - o->res_Shoulder_Z);
o->v2.set_x(o->res_Hand_X - o->res_Elbow_X);
o->v2.set_y(o->res_Hand_Y - o->res_Elbow_Y);
o->v2.set_z(o->res_Hand_Z - o->res_Elbow_Z);

o->res_theta1 = acos(((o->v1.get_x()*o->v2.get_x()) + (o->v1.get_y()*o->v2.get_y()) + (o->v1.get_z()*o->v2.get_z())) / (sqrt(pow(o->v1.get_x(), 2) + pow(o->v1.get_y(), 2) + pow(o->v1.get_z(), 2))*sqrt(pow(o->v2.get_x(), 2) + pow(o->v2.get_y(), 2) + pow(o->v2.get_z(), 2))));

o->res_theta1_angle = o->res_theta1*57.2*(-15)/8+255;
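// Convert res_theta1 from radians to degrees (*57.2), then remap linearly
// (gain -15/8, offset 255) into this servo's command range.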
//double res_theta1_angle = (o->res_theta1*57.2);
/**************************************************************************************************/
//servo motor 3
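// theta2 (shoulder elevation): angle between the upper-arm vector v1 and the
// torso reference vector v4, again via acos of the normalized dot product.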
//1.
o->res_theta2 = 57.2*acos(((o->v1.get_x()*o->user.get_v4_x()) + (o->v1.get_y()*o->user.get_v4_y()) + (o->v1.get_z()*o->user.get_v4_z())) / (sqrt(pow(o->v1.get_x(), 2) + pow(o->v1.get_y(), 2) + pow(o->v1.get_z(), 2))*sqrt(pow(o->user.get_v4_x(), 2) + pow(o->user.get_v4_y(), 2) + pow(o->user.get_v4_z(), 2))));
//2.
//double res_theta2_mid = 57.2*acos(((o->v1.get_x()*o->user.get_v4_x()) + (o->v1.get_y()*o->user.get_v4_y()) + (o->v1.get_z()*o->user.get_v4_z())) / (sqrt(pow(o->v1.get_x(), 2) + pow(o->v1.get_y(), 2) + pow(o->v1.get_z(), 2))*sqrt(pow(o->user.get_v4_x(), 2) + pow(o->user.get_v4_y(), 2) + pow(o->user.get_v4_z(), 2))));
//o->res_theta2 = res_theta2_mid*2 - 90;
//3.
/*
double arm_extend_length = (sqrt(pow((o->res_Elbow_X - o->res_Shoulder_X), 2) + pow((o->res_Elbow_Y - o->res_Shoulder_Y), 2) + pow((o->res_Elbow_Z - o->res_Shoulder_Z), 2)));
o->res_theta2 = 57.2*acos(sqrt(pow((o->res_Elbow_X - o->res_Shoulder_X), 2) + pow((o->res_Elbow_Y - o->res_Shoulder_Y), 2)) / (arm_extend_length));
double diff3 = o->res_Elbow_Z - o->user.get_uoez();
*/

/**************************************************************************************************/
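// Smooth the three joint angles with a 3-sample moving average, then record
// the frame-to-frame changes (diff_*) that the GUI reports.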
o->temp_1 = o->array_angleX[0];
o->temp_2 = o->array_theta2_angle[0];
o->temp_3 = o->array_theta1_angle[0];
if (o->array_index != 2) {
o->array_angleX_diff[o->array_index] = o->thetaX.get_angleX();
o->array_theta1_angle_diff[o->array_index] = o->res_theta1_angle;
o->array_theta2_angle_diff[o->array_index] = o->res_theta2;
o->array_index++;
}
else{
o->array_angleX_diff[o->array_index] = o->thetaX.get_angleX();
o->array_theta1_angle_diff[o->array_index] = o->res_theta1_angle;
o->array_theta2_angle_diff[o->array_index] = o->res_theta2;
o->array_angleX[0] = (o->array_angleX_diff[0] + o->array_angleX_diff[1] + o->array_angleX_diff[2]) / 3;
o->array_theta1_angle[0] = (o->array_theta1_angle_diff[0] + o->array_theta1_angle_diff[1] + o->array_theta1_angle_diff[2]) / 3;
o->array_theta2_angle[0] = (o->array_theta2_angle_diff[0] + o->array_theta2_angle_diff[1] + o->array_theta2_angle_diff[2]) / 3;
o->diff_thetaX = (o->array_angleX[0] - o->temp_1);
o->diff_theta1_angle = (o->array_theta1_angle[0] - o->temp_3);
o->diff_theta2_angle = (o->array_theta2_angle[0] - o->temp_2);
o->array_index = 0;
}
}
}
o->redraw();
}
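// Re-arm the timer: Timer_CB fires every 50 ms (~20 ticks per second), which
// is why End_cb divides Count_sec_cali by 20 to report seconds.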
Fl::repeat_timeout(0.05, Timer_CB, userdata);
}

static void start_cb(Fl_Widget* o, void* userdata){


MyTimer *v = (MyTimer*)userdata;
//v->result = 214;
v->flag = true;
}
static void End_cb(Fl_Widget* o, void* userdata){

MyTimer *v = (MyTimer*)userdata;
//v->result = 214;
v->flag = false;
cout << ((v->Count_sec_cali) / 20) << " seconds" << endl;
//cout << ""
}
public:
double array_angleX[1];
double array_theta2_angle[1];
double array_theta1_angle[1];
double array_angleX_diff[3];
double array_theta1_angle_diff[3];
double array_theta2_angle_diff[3];
double Shoulder_center = 0;
vector_1 v1;
vector_2 v2;
logic logic;
stage stage;
calibr_data user;
string Sys_status = "Please press the 'start' button when you are ready";
theta_x thetaX;
int Cali_step = 0;
double temp_1 = 0;
double temp_2 = 0;
double temp_3 = 0;
double cali_hand_x;

double diff_thetaX = 0;
double diff_theta1_angle = 0;
double diff_theta2_angle = 0;
//double diff_theta1 = 0;
//double theta1;
bool flag = false;
double Count_sec = 0;
double Count_min = 0;
double Count_hr = 0;
double Count_sec_cali = 0;
double temp_count = 0;
int array_index = 0;
Fl_Button* start = NULL;// button
Fl_Button* End = NULL;
double res_Hand_X = 0;
double res_Hand_Y = 0;

double res_Hand_Z = 0;
double res_Elbow_X = 0;
double res_Elbow_Y = 0;
double res_Elbow_Z = 0;
double res_Shoulder_X = 0;
double res_Shoulder_Y = 0;
double res_Shoulder_Z = 0;
double cali_Hand_X = 0;
double res_theta1 = 0;
double res_theta1_angle = 0;
double res_theta2 = 0;
Fl_Text_Buffer *buff_Hand_X;
Fl_Text_Buffer *buff_Hand_Y;
Fl_Text_Buffer *buff_Hand_Z;
Fl_Text_Buffer *buff_Elbow_X;
Fl_Text_Buffer *buff_Elbow_Y;
Fl_Text_Buffer *buff_Elbow_Z;
Fl_Text_Buffer *buff_Shoulder_X;
Fl_Text_Buffer *buff_Shoulder_Y;
Fl_Text_Buffer *buff_Shoulder_Z;
Fl_Text_Buffer *buff_thetaX;
Fl_Text_Buffer *buff_theta1;
Fl_Text_Buffer *buff_theta2;

Fl_Text_Buffer *buff_diff_thetaX;
Fl_Text_Buffer *buff_diff_theta1;
Fl_Text_Buffer *buff_diff_theta2;
//Fl_Text_Buffer *buff_diff_Elbow_Y;
Fl_Text_Buffer *buff_SystemStutas;

Fl_Text_Buffer *buff2;
Fl_Text_Buffer *buff3;
Fl_Text_Buffer *buff4;
Fl_Text_Display *disp_Hand_X;
Fl_Text_Display *disp_Hand_Y;
Fl_Text_Display *disp_Hand_Z;
Fl_Text_Display *disp_Elbow_X;
Fl_Text_Display *disp_Elbow_Y;
Fl_Text_Display *disp_Elbow_Z;
Fl_Text_Display *disp_Shoulder_X;
Fl_Text_Display *disp_Shoulder_Y;
Fl_Text_Display *disp_Shoulder_Z;
Fl_Text_Display *disp_thetaX;
Fl_Text_Display *disp_theta1;
Fl_Text_Display *disp_theta2;

Fl_Text_Display *disp_diff_thetaX;
Fl_Text_Display *disp_diff_theta1;
Fl_Text_Display *disp_diff_theta2;
//Fl_Text_Display *disp_diff_Elbow_Y;
Fl_Text_Display *disp_SystemStutas;

Fl_Text_Display *disp2;
Fl_Text_Display *disp3;
Fl_Text_Display *disp4;
// CONSTRUCTOR
MyTimer(int X, int Y, int W, int H, const char*L = 0) : Fl_Box(X, Y, W, H, L) {
//
box(FL_FLAT_BOX);
color(BG_COLOR);
Fl::add_timeout(0.1, Timer_CB, (void*)this);
start = new Fl_Button(x() + 190, y() + h() + 10, 100, 50, "start");
start->callback(start_cb, (void*)this); // define button
End = new Fl_Button(x() + 290, y() + h() + 10, 100, 50, "End");
End->callback(End_cb, (void*)this);
buff_Hand_X = new Fl_Text_Buffer();
buff_Hand_Y = new Fl_Text_Buffer();
buff_Hand_Z = new Fl_Text_Buffer();
buff_Elbow_X = new Fl_Text_Buffer();
buff_Elbow_Y = new Fl_Text_Buffer();
buff_Elbow_Z = new Fl_Text_Buffer();
buff_Shoulder_X = new Fl_Text_Buffer();
buff_Shoulder_Y = new Fl_Text_Buffer();
buff_Shoulder_Z = new Fl_Text_Buffer();
buff_thetaX = new Fl_Text_Buffer();
buff_theta1 = new Fl_Text_Buffer();
buff_theta2 = new Fl_Text_Buffer();
buff_diff_thetaX = new Fl_Text_Buffer();
buff_diff_theta1 = new Fl_Text_Buffer();
buff_diff_theta2 = new Fl_Text_Buffer();
//buff_diff_Elbow_Y = new Fl_Text_Buffer();
buff_SystemStutas = new Fl_Text_Buffer();
buff2 = new Fl_Text_Buffer();
buff3 = new Fl_Text_Buffer();
buff4 = new Fl_Text_Buffer();

disp_Hand_X = new Fl_Text_Display(x() + 70, y() + h() - 250, 100, 30, "Hand X Coordinate");
disp_Hand_Y = new Fl_Text_Display(x() + 70, y() + h() - 200, 100, 30, "Hand Y Coordinate");
disp_Hand_Z = new Fl_Text_Display(x() + 70, y() + h() - 150, 100, 30, "Hand Z Coordinate");
disp_Elbow_X = new Fl_Text_Display(x() + 240, y() + h() - 250, 100, 30, "Elbow X Coordinate");
disp_Elbow_Y = new Fl_Text_Display(x() + 240, y() + h() - 200, 100, 30, "Elbow Y Coordinate");
disp_Elbow_Z = new Fl_Text_Display(x() + 240, y() + h() - 150, 100, 30, "Elbow Z Coordinate");
disp_Shoulder_X = new Fl_Text_Display(x() + 410, y() + h() - 250, 100, 30, "Shoulder X Coordinate");
disp_Shoulder_Y = new Fl_Text_Display(x() + 410, y() + h() - 200, 100, 30, "Shoulder Y Coordinate");
disp_Shoulder_Z = new Fl_Text_Display(x() + 410, y() + h() - 150, 100, 30, "Shoulder Z Coordinate");
disp_thetaX = new Fl_Text_Display(x() + 70, y() + h() - 100, 100, 30, "Theta 1");
disp_theta1 = new Fl_Text_Display(x() + 240, y() + h() - 100, 100, 30, "Theta 2");
disp_theta2 = new Fl_Text_Display(x() + 410, y() + h() - 100, 100, 30, "Theta 3");
disp_diff_thetaX = new Fl_Text_Display(x() + 70, y() + h() - 300, 100, 30, "Theta 1 Diff");
//disp_diff_theta1 = new Fl_Text_Display(x() + 200, y() + h() - 300, 80, 30, "Theta 1 Difference");
disp_diff_theta1 = new Fl_Text_Display(x() + 240, y() + h() - 300, 100, 30, "Theta 2 Diff");
disp_diff_theta2 = new Fl_Text_Display(x() + 410, y() + h() - 300, 100, 30, "Theta 3 Diff");
disp_SystemStutas = new Fl_Text_Display(x()+20, y() + h() - 380, 550, 30, "System Message");
disp2 = new Fl_Text_Display(x() + 170, y() + h() - 30, 80, 30, "Second");
disp3 = new Fl_Text_Display(x() + 250, y() + h() - 30, 80, 30, "Minute");
disp4 = new Fl_Text_Display(x() + 330, y() + h() - 30, 80, 30, "Hour");
disp_Hand_X->buffer(buff_Hand_X);
disp_Hand_Y->buffer(buff_Hand_Y);
disp_Hand_Z->buffer(buff_Hand_Z);
disp_Elbow_X->buffer(buff_Elbow_X);
disp_Elbow_Y->buffer(buff_Elbow_Y);
disp_Elbow_Z->buffer(buff_Elbow_Z);
disp_Shoulder_X->buffer(buff_Shoulder_X);
disp_Shoulder_Y->buffer(buff_Shoulder_Y);
disp_Shoulder_Z->buffer(buff_Shoulder_Z);
disp_thetaX->buffer(buff_thetaX);
disp_theta1->buffer(buff_theta1);
disp_theta2->buffer(buff_theta2);
disp_diff_thetaX->buffer(buff_diff_thetaX);
disp_diff_theta1->buffer(buff_diff_theta1);
disp_diff_theta2->buffer(buff_diff_theta2);
//disp_diff_Elbow_Y->buffer(buff_diff_Elbow_Y);
disp_SystemStutas->buffer(buff_SystemStutas);
disp2->buffer(buff2);
disp3->buffer(buff3);

disp4->buffer(buff4);
buff_Hand_X->text(to_string(res_Hand_X).c_str());
buff_Hand_Y->text(to_string(res_Hand_Y).c_str());
buff_Hand_Z->text(to_string(res_Hand_Z).c_str());
buff_Elbow_X->text(to_string(res_Elbow_X).c_str());
buff_Elbow_Y->text(to_string(res_Elbow_Y).c_str());
buff_Elbow_Z->text(to_string(res_Elbow_Z).c_str());
buff_Shoulder_X->text(to_string(res_Shoulder_X).c_str());
buff_Shoulder_Y->text(to_string(res_Shoulder_Y).c_str());
buff_Shoulder_Z->text(to_string(res_Shoulder_Z).c_str());
buff_thetaX->text(to_string(thetaX.get_angleX()).c_str());
buff_theta1->text(to_string(res_theta1_angle).c_str());
buff_theta2->text(to_string(res_theta2).c_str());
buff_diff_thetaX->text(to_string(diff_thetaX).c_str());
buff_diff_theta1->text(to_string(diff_theta1_angle).c_str());
buff_diff_theta2->text(to_string(diff_theta2_angle).c_str());
//buff_diff_Elbow_Y->text(to_string(diff_Elbow_Y).c_str());
buff_SystemStutas->text(Sys_status.c_str());
buff2->text(to_string(Count_sec).c_str());
buff3->text(to_string(Count_min).c_str());
buff4->text(to_string(Count_hr).c_str());
}
};
// MAIN
int main() {
Graph_lib::Window win(Point(200, 100), 800,600 , "");
win.begin();
MyTimer tim(100, 100, 600, 400);
NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON);
win.end();
win.show();
while (win.shown()) Fl::wait();
return 0;
}


APPENDIX F HAND/TOOL SUBSYSTEM

Physical Structure of the Hand

The fingers in open status

The fingers in closed status


Hand movement control


Using a stepper motor in the middle of the structure, a mechanical hand can be built with precise movements. Because a stepper motor sets discrete positions using multiple toothed electromagnets arranged around a central gear, it provides substantial torque at low speeds. The stepper motor is driven clockwise (to grab) or counterclockwise (to release); rotating the middle gear moves the fingers to grab or release. The control code is listed under Stepper Motor Control: Software below.

Hand Control Flowchart


Stepper Motor Control: Hardware
EasyDriver is the stepper driver chosen for this part. The driver is connected between the Raspberry Pi 2 and the stepper motor as shown in the image below. Between the Pi and the driver board there are power lines (3.3V and GND) and control lines used to switch the stepper motor phases. The motor is connected to the driver with four wires.
EasyDriver Board

Stepper Motor (XY42STH34-0354A)

Stepper Motor Connection Diagram


Stepper Motor Control: Connection


Motor - Driver: The red and yellow wires form one loop and connect to loop A on the driver board; the grey and green wires form the other loop and connect to loop B.
Driver - RPi: The two GND pins are connected in parallel to a GND pin on the Raspberry Pi 2, and the M+ pin is connected to the 5V pin. The STEP pin is connected to pin 16 and the DIR pin to pin 18.
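As a quick check of this wiring before running the full program, a minimal script along the following lines can step the motor a few times in each direction (a sketch assuming the BOARD-mode pin 16/18 wiring above; the step count and delay values are arbitrary):

import RPi.GPIO as GPIO, time

GPIO.setmode(GPIO.BOARD)
GPIO.setup(16, GPIO.OUT)  # STEP
GPIO.setup(18, GPIO.OUT)  # DIR

def pulse(direction, steps, delay=0.005):
    # Latch the direction, then toggle STEP once per step.
    GPIO.output(18, direction)
    for _ in range(steps):
        GPIO.output(16, True)
        time.sleep(delay)
        GPIO.output(16, False)
        time.sleep(delay)

pulse(True, 50)   # a few steps one way
pulse(False, 50)  # and back
GPIO.cleanup()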

Stepper Motor Control: Software


Server side (Raspberry Pi)
To move the motor shaft, a sequence of square-wave pulses has to be generated on the GPIO pins of the Pi. The stepper control program is written in Python; the full code is shown below:
Code 1: Full code on server side
import RPi.GPIO as GPIO, time

GPIO.setmode(GPIO.BOARD)
GPIO.setup(16, GPIO.OUT)
GPIO.setup(18, GPIO.OUT)
p = GPIO.PWM(16, 500)

def SpinMotor(direction, num_steps):
    GPIO.output(18, direction)
    while num_steps > 0:
        p.start(1)
        time.sleep(0.01)
        num_steps -= 1
    p.stop()
    return True

def getchar():
    # Return a single character from standard input.
    import tty, termios, sys
    fd = sys.stdin.fileno()
    old_settings = termios.tcgetattr(fd)
    try:
        tty.setraw(sys.stdin.fileno())
        ch = sys.stdin.read(1)  # get input from keyboard
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)
    return ch

print 'press q to exit'
while 1:
    ch = getchar()
    if ch == 'a' or ch == 'A':
        SpinMotor(False, 270)
    elif ch == 'd' or ch == 'D':
        SpinMotor(True, 270)
    elif ch == 'q' or ch == 'Q':
        break
    else:
        print 'you pressed', ch
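Note that p.start(1) gates a 500 Hz PWM train for the 10 ms sleep in each loop pass, so SpinMotor(direction, 270) emits roughly five pulses per pass rather than exactly one step per iteration; when an exact step count matters, toggling the STEP pin directly (as in the wiring-check sketch above) is more predictable.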

Client side (Myo)


The Myo triggers specific code that activates an automated motion to replace the current tool with the selected one. The Myo band reads the user's gestures, each of which is mapped to a specific motion. The code is shown below:
Code 2: Client side (Myo SDK)
scriptTitle = "MotorControl"
scriptDetailsUrl = ""

function onPoseEdge(pose, edge)
    myo.debug("onPoseEdge:" .. pose .. "," .. edge)
    if pose == "fist" and edge == "on" then
        myo.debug("FIST")
        myo.keyboard("a", "down")
    end
    if pose == "fingerSpread" and edge == "on" then
        myo.debug("FINGERSPREAD")
        myo.keyboard("d", "down")
    end
end

function onForegroundWindowChange(app, title)
    wantActive = false
    myo.debug("onForegroundWindowChange:" .. app .. "," .. title)
    if app == "putty.exe" and title == "pi@raspberrypi: ~" then
        myo.debug("is putty")
        wantActive = true
    end
    return wantActive
end

function activeAppName()
    return "Output Everything"
end

function onActiveChange(isActive)
    myo.debug("onActiveChange")
end
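Together, the two programs close the loop: with the PuTTY session to the Pi in the foreground, a fist or finger-spread pose injects an 'a' or 'd' keystroke, which the server-side getchar() loop reads and turns into the corresponding SpinMotor() call.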

Sensor
A tactile sensor will be used to detect and display how much force the user applies with the finger. The sensor will be attached to the fingertip, and its connection is shown in the figure below. (Tekscan)


Flex Sensor Circuit
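While the sensor interface is not finalized, a force-sensing resistor of this kind is commonly read as the variable leg of a voltage divider, so Vout = Vcc * Rfixed / (Rfixed + Rsensor) and the sensor resistance (hence the applied force) can be recovered from the measured voltage. The helper below is only a sketch under that assumption; the supply voltage, fixed-resistor value, and function name are illustrative, and the Raspberry Pi would also need an external ADC to sample Vout:

def fsr_resistance(v_out, v_cc=3.3, r_fixed=10000.0):
    # Voltage divider: v_out = v_cc * r_fixed / (r_fixed + r_sensor)
    # Solving for the sensor resistance:
    return r_fixed * (v_cc - v_out) / v_out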
