
Sonoma State University
Department of Engineering Science

AutoCart

Jorge Inocencio
Chanbora Uch
Richard Duong

May 19, 2016

Adviser: Dr. Farid Farahmand

http://projectautocart.wix.com/mysite

Submitted to the Department of Engineering Science in partial fulfillment of the requirements for the degree of: Bachelor of Science in Electrical Engineering

Abstract
Autonomous vehicle technology is the future of transportation because it will
allow the movement of people and goods without the need for human drivers,
increasing the speed at which both can be moved. Currently, all autonomous
vehicles require an expensive array of sensors and powerful computers to
navigate autonomously. The AutoCart is a golf cart that has been converted
into a semi-autonomous vehicle; the project investigates the possibility of a
low-cost semi-autonomous shuttle that navigates specific pre-defined routes on
a campus using a variety of sensors, including a GPS receiver, a magnetometer,
a LIDAR, and ultrasonic sensors.

Acknowledgments
We would like to thank Dr. Farid Farahmand, Tyler Spott, and Yikealo
Abraha. This project would not have been possible without their continued
support and assistance.

Contents

1 Introduction 7
  1.1 Literature Review & Existing Patents 7
  1.2 Problem Statement 8
  1.3 Methodology 9
  1.4 Challenges 9
  1.5 Key Components 10
  1.6 Marketing Requirements 10
  1.7 Engineering Requirements 10

2 Implementation 11
  2.1 Project Schedule 13
  2.2 Bill Of Materials 14
  2.3 Module Testing 14
  2.4 Test Results 16
3 System Testing 18
  3.1 Build 19
4 Design Process 21
  4.1 Hardware Design 23
    4.1.1 Motion Control 23
    4.1.2 Object Detection 23
    4.1.3 Path Tracking 24
  4.2 Software Design 24
    4.2.1 MCM 24
    4.2.2 Arduino Mega 2560 24
    4.2.3 LabVIEW 25

5 Survey Responses 28

6 Future Work 28
  6.1 Steering System 29
  6.2 User Interface 29
  6.3 Object Detection 29
7 Conclusion 30

8 Source Of Funding 30

9 Appendix 33

A Instructions For Use 33
  A.1 Turning the AutoCart System ON 33
  A.2 Turning the AutoCart System OFF 33

B Motion Control Module 34
  B.1 Circuit Schematic 35
  B.2 C code 36

C Power Distribution 36

List of Figures

1 AutoCart System Architecture 12
2 Schedule 13
3 Sensor Network Test Plan 15
4 GPS Path Tracking System Test Plan 16
5 Test Setup to Characterize LIDAR and Ultrasonic Sensors 16
6 Ultrasonic test results: distance vs distance read 17
7 LIDAR test results: distance vs distance read 17
8 GPS to Computer Wiring 18
9 Raw GPS Data in RealTerm 18
10 Testing results for various system tests 18
11 Front View of the Sensor Array 19
12 Field of View of the Sensor Array 20
13 Angle of Bumper 20
14 Finished cart 22
15 Inside of the finished cart 22
16 Flowchart for MCM implementation code 25
17 Block diagram of how the data is processed 26
18 Haversine: Distance between two points 27
19 Forward Azimuth: Heading between two points 27
20 LabVIEW User Interface 28
21 Circuit Schematic of the Motion Control Module 35
22 Power Distribution and Safety Switches 36

List of Tables

1 List of key components 10
2 Bill of Materials 14
3 MCM packet format 34
4 MCM example packet 34

1 Introduction

A quick mode of transportation is a necessity in a large facility so that
students and faculty can move quickly across large areas that are inaccessible
to ordinary vehicles. An autonomous mode of transportation will allow people
to move around campus without needing to know the campus layout, and will
facilitate the transportation of materials and equipment around the facilities.
Creating an autonomous system presents many technical challenges, such as
finding the right systems to detect people and objects and creating a program
that keeps the vehicle on track by controlling its systems. An autonomous system
can drastically reduce the time it takes to traverse large campuses and
facilities; in addition, it can move people and equipment without the need for a
driver. It can also transport people who are unfamiliar with the campus, such
as visiting parents and lecturers. An autonomous system such as this can
showcase the university's engineering program. On a broader scope, autonomous
vehicle technology can be used to drive through areas that are too dangerous
for humans, to relieve truck drivers for hours at a time, and to caravan a
group of autonomous vehicles together with only one human driver leading the way.

1.1 Literature Review & Existing Patents

The current automobile market has already implemented many automated safety
systems known as Advanced Driver Assist Systems (ADAS). Current ADAS features
include adaptive cruise control, emergency braking, lane drift warnings,
adaptive speed control, blind spot detection, and self-parking. All of these
systems add some level of autonomy to vehicles and are beginning to be widely
adopted by vehicle manufacturers. The two main approaches to implementing
autonomous functions are sensor based and connectivity based. Sensor based
solutions use artificial intelligence along with cameras, GPS, RADAR, LIDAR,
ultrasonic sensors, and inertial measurement units to continuously monitor the
vehicle's position and surroundings. Connectivity based solutions use Vehicle
to Vehicle (V2V) communications to relay traffic, road, and driving conditions
to surrounding vehicles. With this information it is possible for vehicles to
be aware of traffic patterns and vehicle behavior on city roads where
360-degree views are difficult or impossible to achieve. The convergence of
these two technologies is what will enable autonomous vehicle technology [2].
Most current autonomous systems use sensor based solutions; because of this,
fully autonomous cars are still in their infancy and are mostly being designed
and tested at universities and by large companies like Google and Tesla. Only
one fully autonomous vehicle is ready for sale on the market, and it is not
intended for use on public roads [1].
One of the most successful and complex autonomous golf carts is the SMART
golf cart, the product of the Singapore-MIT Alliance for Research and
Technology. The SMART cart was designed and built by a team of 19 professors
and students and was tested over a six day period in a large public park,
where it successfully ferried over 500 passengers across the gardens. The
key difference in the design of this cart is the conscious choice to reduce
the number of sensors used. Instead of complex 3D LIDAR systems, the team
used multiple laser rangefinders mounted at various heights along with a
stereo camera to obtain data on the vehicle's surroundings. This choice
vastly reduced the amount of complex calculation needed to map the
surrounding environment. In addition to using simpler sensors, the SMART golf
cart used a new algorithm for obstacle avoidance: the algorithm maps a
cylinder onto the vehicle's planned trajectory, and if an object intrudes
into this cylinder the software redraws the cylinder to exclude the object,
rerouting the golf cart [6].
There is an abundance of research available on autonomous operation, but
most of the information applies to a small scale, such as remote control cars.
Because these projects are smaller, the availability of possible components
is much greater, as they don't require as much power or force. If we were
simply to upscale the parts used in a small scale project, the price would
become too great or the availability of the components would be very limited,
so we will opt for a DC motor as opposed to a servo or stepper motor [5]. The
object detection system, however, should be scalable, and we will research
how viable a camera and laser system would be. One such project achieved
object detection and avoidance with a camera and three lasers: when any of
the lasers breaks line of sight with the others, the system detects it and
makes the appropriate movements [11].
Autonomous transportation is an exciting field, and it is easy to see why
companies and universities are investing time and resources to further develop
it. According to nature.com there are about 1.24 million traffic fatalities
every year worldwide, and 90% are due to driver error. If autonomous vehicles
replaced all of these drivers, that number could drop to as low as 124,000
fatalities a year. States are already beginning to approve the use of
autonomous vehicles on roads, paving the way for their inevitable debut on
the consumer market [3].

1.2 Problem Statement

The purpose of this project is to design and implement a semi-autonomous
transportation system that uses GPS waypoints and collision avoidance to safely
transport passengers across a predefined path on campus. This system faces
many large challenges: designing the high power circuits to control the
steering, braking, and acceleration; designing everything with safety in mind
(for example, our code should never have ambiguous states where the vehicle's
accelerator would stay depressed); creating the best routes around campus;
designing a system to control the steering; stopping the golf cart if it ever
loses its direction and GPS signal; and, if time permits, designing an
adequate collision avoidance algorithm. Our accuracy target is 3 meters, which
should be enough to safely travel across a campus. Power will not be an issue
in our design because the entire system will run off of the golf cart's 36 V
electrical system. The size of our system is also of minimal concern. The
mechanical structures needed to control the steering will be our biggest
challenge because of a lack of mechanical engineering skills. Finally, the
limited budget is a large challenge because many of the sensors needed for
implementation are extremely expensive.

1.3 Methodology

Our team will solve the proposed problem by using GPS waypoints to move the
cart through campus. While traveling from waypoint to waypoint, the cart will
scan for objects and calculate whether a collision is imminent or probable; if
the AutoCart detects an approaching object it will try to gently steer away
from it or, if it can't, it will stop altogether. To do this our cart will
use a GPS receiver, an object detection system, and LabVIEW to perform the
controls and calculations needed for the semi-autonomous functions. To make
our project as affordable as possible we will attempt to use readily
available, low priced parts.
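The waypoint-hopping decision described above can be sketched as a small pure function. This is an illustrative sketch rather than the project's actual code: the function name is ours, and the 3 m arrival radius is borrowed from our stated accuracy target.

```c
/* Sketch of the waypoint-advance decision: keep steering toward the
 * current target waypoint, and move on to the next one once the cart
 * is within the arrival radius. The 3 m radius mirrors the project's
 * GPS accuracy target; the function name is illustrative only. */
int next_waypoint(int current, int total, double dist_to_target_m)
{
    const double arrival_radius_m = 3.0;

    if (dist_to_target_m <= arrival_radius_m && current + 1 < total)
        return current + 1;   /* close enough: advance to next waypoint */
    return current;           /* keep heading for the current target */
}
```

The last waypoint is never advanced past, so reaching it effectively signals the end of the route.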

1.4 Challenges

One challenge we face with this project is bypassing the fail-safes in the
cart without compromising the cart's features: if we remove the fail-safe
preventing the cart from moving without a driver, the cart must still be
usable without a driver. Issues raised previously concerned installing the
mechanism that applies the cart's brakes. Another system that needs to be
implemented is manual control. This requires either installing the system so
that a user can operate the vehicle without risk of damaging the installed
system, or adding a simple set of electronic controls that act as emergency
overrides. It would also be ideal for the system to be small enough not to
take up the driver's seat.
The system will be exposed to the elements, so a level of weatherproofing
will be required. This means the system must be rugged enough to take some
hits, withstand wind, and resist water damage. Ideally the system should last
as long as the cart. As for cost and installation, both will be fairly high
because the system is invasive and requires some time to prepare.
This vehicle will use GPS, so the accuracy of the data is fairly important.
Since GPS can be accurate to within a few feet, this should not be a problem
for navigation, though it would be ideal for the system to have object
detection to improve its accuracy and increase safety for passengers and
pedestrians. The steering, braking, and acceleration also need to be very
precise so that the cart can navigate properly and avoid dangerous situations.
Finally, the system should not significantly affect the battery life of the golf cart.

1.5 Key Components

The AutoCart system uses the following key components.

Table 1: List of key components

Component Name
Motion Control PCB
Arduino Mega
Arduino Shield
Feedback Rod Linear Actuators
Adafruit Ultimate GPS Breakout - 66 channel w/10 Hz
HC-SR04 Ultrasonic Sensor
Triple-axis Magnetometer (Compass) Board - HMC5883L
LIDAR-Lite 2 Laser Rangefinder
Stepper Motor

1.6 Marketing Requirements

As a requirement for the project, we had to include ten marketing requirements
that emphasize some of the key aspects of the AutoCart system.

- The AutoCart system should cost under $1000.
- The AutoCart should be app controlled.
- The AutoCart system will be modular.
- The AutoCart system will not significantly reduce battery life.
- The AutoCart system will be safe for the riders and for pedestrians.
- The AutoCart system will have warnings and lights to alert pedestrians that it is in use.
- The AutoCart system will be well documented so future students can resume the work.
- The AutoCart will stop if it does not know how to proceed.
- The AutoCart system will be well documented.
- The AutoCart system can be a platform for future Senior Design Projects.

1.7 Engineering Requirements

- The AutoCart will have an emergency kill switch that turns the AutoCart off.
- The AutoCart will acquire a GPS signal within 10 seconds of start-up.
- The AutoCart will STOP within 5 seconds of losing GPS connection.
- The AutoCart will not exceed 5 mph during peak pedestrian traffic hours.
- The AutoCart's steering will be controlled with a PI controller.
- The AutoCart will have a GPS receiver and antenna to find its position.
- The AutoCart system will be dustproof.
- The AutoCart system will be waterproof.
- The AutoCart GPS will be accurate to 3 meters.
- The AutoCart will be able to stop within 5 seconds of being commanded to stop.
- The AutoCart will be able to detect moving objects up to 10 ft away using ultrasonic sensors.

2 Implementation

The AutoCart system is implemented by breaking the key systems down into
smaller subsystems: a motion control module (MCM), an object detection
system, and a GUI and path tracking system. A description of each system is
included below.
Motion Control Module
The Motion Control Module (MCM) controls the electric motors and actuators
that drive the cart; in particular, it controls the accelerator, the brake,
and the steering. The MCM is commanded by a laptop running LabVIEW and
interfaces with the existing electrical system, speed controller, and safety
switches on the cart. In addition, the MCM is its own self-contained PCB,
which can be used for other projects in the future. The Motion Control Module
is implemented using a PIC18F14K22.
Object Detection System
The cart features three forward facing sensors. Two HC-SR04 ultrasonic
sensors are mounted low on the left and right of the front of the cart. The
third sensor, a LIDAR-Lite 2 laser rangefinder, is mounted at the top middle
of the bumper and provides a 90 degree sweep. This sensor array provides data
on nearby objects and sends a message to the MCM dictating what course of
action to take: braking, decelerating, or turning to avoid objects in the
cart's path. The purpose of the two low ultrasonic sensors is to cover the
area that the LIDAR cannot see. Rear and side sensors are not necessary
because the cart does not use the reverse function and side sensors would not
make much of a difference. A button is enclosed with the LIDAR and serves as
its starting point: when the system starts up, the LIDAR sweeps into the
button, and when the button is pressed the system knows the LIDAR is at its
starting position.
GPS and Path Tracking
The GPS path tracking system consists of a GPS receiver and a magnetometer
(digital compass). The path the AutoCart travels is pre-defined, and the
distance between the current location and the next waypoint is monitored at
all times. The compass provides the direction of travel. The outputs include
speed, straight-line distance to the next waypoint, target heading, current
heading, and the error in heading. The speed and the decision to make a turn
are sent to the motion control module.
All of these systems have to interface with each other for the AutoCart to
function correctly; they communicate via different serial interfaces. The
overall system architecture is shown below.

Figure 1: AutoCart System Architecture


2.1 Project Schedule

Figure 2: Schedule
The schedule above was our projected schedule at the beginning of the
project. Overall we did not meet the schedule very effectively. The mechanical
modifications took a lot longer than expected because we lacked the
appropriate tools and equipment to make them. The steering in particular took
much longer than we initially assumed and was ultimately not implemented
successfully. Finally, our projected schedule allotted four weeks for the
inter-module communications; in practice, we completed this part of the
project in less than two weeks.


2.2 Bill Of Materials

Table 2: Bill of Materials

Component                                      Quantity   Unit Price ($)
100W DC-to-DC Power Supply                     1          45.95
DC Motor Driver H-Bridge                       2          6.88
DC Motor                                       1          72.99
Drive Sprocket                                 1          13.45
Roller Chain                                   1          14.99
Chain Links                                    1          6.99
PIC16F877A-E/P                                 2          5.44
H-Bridge Board                                 1          6.99
Compass and Accelerometer                      1          20.82
Arduino Mega 2560                              1          45.76
Arduino Mega Shield                            1          8.32
GPS Antenna                                    1          12.95
SMA to uFL/u.FL/IPX/IPEX RF Adapter Cable      1          3.95
LIDAR-Lite 2 Laser Rangefinder                 1          114.89
Enclosures                                     1          30.00
TOTAL                                                     410.37

2.3 Module Testing

For proof of concept we developed the following tests.

Motion Control System Test Plan
A PIC18 is used to control two DC motors while outputting a PWM signal. One
DC motor simulates the brake actuator, and the PWM signal is used to control
the accelerator pedal. The second DC motor simulates the steering control;
the steering motor circuit also has a feedback mechanism so the
microcontroller knows the angle of the steering wheel, allowing for accurate
steering. The test will be considered successful if all three outputs can be
controlled by sending commands via a serial interface and if the steering
angle can be controlled accurately.

Object Detection Test Plan
A PIC18F45K20 will be tested to observe the data coming from an ultrasonic
sensor; after testing with one sensor, a second sensor will be introduced and
tested to determine how accurately it can place an object within both
sensors' fields of view. From there more sensors will be introduced to test
whether they can all be active at the same time and whether there are enough
ports to run all of them. Measuring the battery levels will also be tested,
with the readings sent to the LCD screen.


Figure 3: Sensor Network Test Plan


GPS Path Tracking System Test Plan
The components include a 3.3/5 V power supply, an Arduino Uno R3, a GPS
receiver, a triple-axis magnetometer, and an LCD. The Arduino is connected to
the GPS receiver via UART, to the magnetometer via I2C, and to the LCD via
I/O ports. As the AutoCart travels, the distance from the current location to
the next waypoint is monitored at all times. As soon as the cart reaches, or
comes very close to, the waypoint, the compass's bearing is used to decide
whether to make a turn. The decision is sent to the Motion Control Module.
The speed, current location, and heading are displayed on the LCD.


Figure 4: GPS Path Tracking System Test Plan

2.4 Test Results

Object Detection Test Result
The ultrasonic sensor did not work on the PIC18F45K20 because it required
additional power to drive it: the echo pin needed to be stepped down from 5 V
to 3.3 V, and the trigger pin needed to be stepped up from 3.3 V to 5 V. The
datasheet states that the sensor's range is between 2 and 400 centimeters
with a 15 degree angle. When transmitting data, an 8 MHz crystal was
necessary, as testing showed that using USART or any other transmission
protocol affected the clock speed. After trying all of these adjustments, we
were still unable to get the ultrasonic sensors to operate off of the chip.
We then tried a PIC16F877A, but further testing and research revealed that it
was not an ideal component, as it did not support the serial transmission we
needed, so an Arduino Uno, followed by an Arduino Mega, was used to process
all the data coming from the two ultrasonic sensors and the LIDAR-Lite 2
laser rangefinder.

Figure 5: Test Setup to Characterize LIDAR and Ultrasonic Sensors


The LIDAR-Lite 2 laser rangefinder read approximately 10 cm above the actual
distance (refer to Figure 7), with zero degrees of drift, because the LIDAR's
sensing method is a single point of light. The LIDAR easily reads out past
10 m and is rated at about 40 m, but was not tested beyond 10 m, as we intend
to have the system begin reacting to any object within 10 m of the cart. The
HC-SR04 ultrasonic sensors read about 5 cm below the actual distance (refer
to Figure 6) and had an approximately 15 degree scan range, but as the range
increased to about 2.5-3 m, the ultrasonic signal attenuated too much and the
sensors could no longer reliably read at those angles. The ultrasonic sensors
reach out to just over 4 m, about 4.3 m, when the object is directly in front
of the sensor.

Figure 6: Ultrasonic test results: distance vs distance read
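The measured biases can be folded back into the readings as constant offsets. This is a minimal sketch, assuming the +10 cm (LIDAR) and -5 cm (ultrasonic) errors hold across the usable range; the macro and function names are hypothetical, not the project's code.

```c
/* Constant-bias correction from the characterization tests: the LIDAR
 * read about 10 cm high (Figure 7) and the HC-SR04 about 5 cm low
 * (Figure 6). Names are illustrative only. */
#define LIDAR_OFFSET_CM       10
#define ULTRASONIC_OFFSET_CM  -5

int lidar_corrected_cm(int raw_cm)      { return raw_cm - LIDAR_OFFSET_CM; }
int ultrasonic_corrected_cm(int raw_cm) { return raw_cm - ULTRASONIC_OFFSET_CM; }
```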

Figure 7: LIDAR test results: distance vs distance read


GPS Test Result: To display GPS coordinates and time on a computer.
Components: Adafruit Ultimate GPS Breakout - 66 channel w/10 Hz, TTL
serial-to-USB converter module, and a computer.

Figure 8: GPS to Computer Wiring


RealTerm: Time(UTC)=07:40:33, Latitude: 3825.3849N, Longitude: 12243.3128W

Figure 9: Raw GPS Data in RealTerm

3 System Testing

To verify the completion of our engineering requirements, we developed and
performed the following tests.

Figure 10: Testing results for various system tests


3.1 Build

This section describes the design of the three key systems: Object Detection,
the GPS system, and Motion Control.
Object Detection
Originally, the sensor array used four HC-SR04 ultrasonic sensors, two in
the front of the cart and two in the back. This setup did not have the
required range to stop in time if an object showed up in the cart's path, and
we do not have a way to switch the cart into reverse automatically. We
removed the rear sensors and included a LIDAR-Lite 2 laser rangefinder to
remedy the range issue. Because of the LIDAR's single-point scanning, we
placed it on a stepper motor sweeping through 90 degrees. Issues arose with
the stepper motor's rotation speed being affected by the additional sensors:
the motor rotated at the designated rate when all sensors were triggered, but
rapidly declined in speed when one or more of the ultrasonic sensors went out
of range. This problem was due to the ultrasonic sensors having to wait for
the pulse to return, resulting in longer delays depending on the range of the
object, or the lack of an object. To correct this, we reduced the number of
times the ultrasonic sensors pulse to twice every cycle. To establish the
position of the LIDAR without an encoder, we introduced code and a button
that is pressed when the system boots up; this code runs once and deactivates
as soon as the button is pressed.

Figure 11: Front View of the Sensor Array


Figure 12: Field of View of the Sensor Array


The cart's bumper is sloped at a very small angle, so the enclosure needs to
be crafted and mounted with this small angle in mind.

Figure 13: Angle of Bumper


The enclosure for the ultrasonic sensors is made from a pre-built junction
box, modified to allow the speakers and wires to come out of the box. The
LIDAR enclosure is made from a different pre-made box with a window that
allows the LIDAR to read properly. The only modifications needed are those
that secure the motor to the box: adding a shelf to create a level surface to
mount the motor on and a wall to mount the button on.
GPS System

The guidance system of the cart relies on a GPS module for location and a
compass module for the cart's direction. A GPS module is the obvious choice
for a system like this, over manually writing in directions, as it allows for
more accurate and more reliable instructions; GPS also lets us program
additional paths more easily. The compass assists in maintaining an accurate
bearing, ensuring that the cart reaches its proper destination.
Motion Control
The motion control system controls the acceleration, steering, and braking of
the cart. To manage the acceleration, we use pulse width modulation, because
the accelerator essentially serves as a variable resistor, varying the
voltage allowed through when the pedal is depressed. The steering was
originally going to be controlled by a stepper motor or a servo, but was
changed to a DC motor because acquiring a stepper or servo strong enough to
turn our wheel would have been beyond our budget. The DC motor is attached to
a chain to provide more torque, allowing our weaker, more affordable motor to
turn the wheel. The DC motor and chain are mounted on the dashboard of the
cart, with the chain mounted underneath the steering wheel, allowing the user
to assume manual control if necessary. The brake is handled by a linear
actuator pushing on a shaft that has replaced the brake pedal; the shaft is
needed because the actuator alone could not supply enough force to engage the
brakes, and it also allows the user to apply the brakes manually. The
actuator is attached to the underside of the dashboard and retracts to apply
the brakes. A bracket has been mounted to the actuator so that the user can
apply the brakes without the actuator's shaft preventing braking.
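Since the accelerator acts as a variable resistor, a PWM output smoothed to DC can stand in for the pedal voltage. A hedged sketch of the duty-cycle mapping; the 8-bit resolution and function name are assumptions for illustration, not the MCM's actual parameters.

```c
#include <stdint.h>

/* Map a requested throttle fraction (0.0-1.0) to an 8-bit PWM duty
 * value. The PWM signal, low-pass filtered to DC, replaces the voltage
 * the pedal's variable resistor would have produced. The 8-bit
 * resolution is an assumption for illustration. */
uint8_t throttle_to_duty(double throttle)
{
    if (throttle < 0.0) throttle = 0.0;   /* clamp to the valid range */
    if (throttle > 1.0) throttle = 1.0;
    return (uint8_t)(throttle * 255.0 + 0.5);   /* round to nearest step */
}
```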

4 Design Process

The following section describes our build process, design choices, and
trade-offs in the project.


Figure 14: Finished cart

Figure 15: Inside of the finished cart


4.1 Hardware Design

Due to the size of the design, a description of the hardware used to
implement each of the individual systems is included below.

4.1.1 Motion Control

The MCM is its own self-contained module on its own PCB, implemented using a
PIC18F14K22. The MCM also includes a 5 volt regulator used to step the 12
volt input down to 5 volts, a PWM-to-DC filter implemented with two MOSFETs
and a filter stage, an H-bridge used to control the braking actuator, and one
UART serial line in.
A more in depth description is available in the appendix as well as on the
project CD.
4.1.2 Object Detection

LIDAR
The LIDAR-Lite 2 laser rangefinder was the obvious choice of sensor, as this
sensor type is popular in current autonomous systems and this specific
component is popular with hobby drones. The characteristics that make this
sensor ideal are its reliability and the speed with which it scans its field
of view. Our sensor is single point and capable of reaching up to 40 meters,
and because it uses light to determine distance it receives data extremely
quickly, much faster than the HC-SR04 ultrasonic sensors, which can take over
a full second to receive the return signal. To make up for the single-point
scanning, the sensor was mounted on a stepper motor and swept through a
ninety degree angle. To act as a starting point, a button was installed that
must be pressed by the sweeping sensor before it begins scanning the area.
The LIDAR's repetition rate is between 1 and 500 Hz. This sensor can be
powered by the Arduino Mega because it requires only 5 volts and
approximately 100 mA to operate, and it offers two interfaces for reading
distance: I2C and PWM.
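In the LIDAR-Lite's PWM mode, distance is encoded as pulse width at a nominal 10 µs per centimeter (per the LIDAR-Lite documentation). A minimal decoding sketch, with a function name of our own choosing:

```c
/* Decode a LIDAR-Lite PWM pulse into centimeters. The nominal scale
 * factor in PWM mode is 10 microseconds per centimeter. */
int lidar_pwm_to_cm(unsigned long pulse_us)
{
    return (int)(pulse_us / 10UL);
}
```

At the 40 m rated maximum, the pulse would be about 40 ms wide, which is one reason the repetition rate drops for distant targets.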
Ultrasonic Sensor
We use HC-SR04 ultrasonic sensors as a fail-safe for the LIDAR, since it is
not always looking forward. We placed two ultrasonic sensors, one on the left
and one on the right of the front of the cart, to ensure that any object in
front of the cart is caught while the LIDAR is looking away. Because the
ultrasonic sensors rely on sound waves for distance measurement, their range
is much shorter, approximately 4 meters, due to wave attenuation. This also
results in a much longer return time, making this sensor fall short as the
main sensor but suitable as a secondary sensor due to its low cost. These
sensors can also be powered off of the Arduino Mega because they require 5
volts and about 15 mA. The sensor works by emitting a 40 kHz burst in
response to a 10 microsecond trigger pulse and recording the time it takes
for the echo to return to the receiving transducer.
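The echo time maps to distance through the speed of sound: the burst travels to the object and back, so the one-way distance is half the round trip at roughly 0.0343 cm/µs. A sketch of the conversion (the function name is ours):

```c
/* Convert an HC-SR04 echo pulse width to distance. Sound travels at
 * about 343 m/s (0.0343 cm per microsecond), and the echo covers the
 * distance twice, so the one-way distance is half the round trip. */
double hcsr04_distance_cm(double echo_us)
{
    return echo_us * 0.0343 / 2.0;
}
```

The roughly 4.3 m maximum we observed corresponds to an echo of about 25 ms, which is why a missing echo stalls a naive polling loop for so long.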

4.1.3 Path Tracking

GPS Receiver
The GPS receiver that we are using is an Adafruit Ultimate GPS Breakout
(66 channel, 10 Hz). It is built around the MTK3339 chipset, a high-quality
GPS module that can track up to 22 satellites on 66 channels, has an excellent
high-sensitivity receiver (-165 dBm), and includes a built-in antenna. It can
produce up to 10 location updates per second for high-speed, high-sensitivity
logging or tracking, with an accuracy of 3 m in radius. It requires a 5 V input
and draws only 20 mA. An external antenna can be connected to the board via a
uFL connector. The receiver has a serial interface that provides data such as
longitude, latitude, and speed.
Digital Compass
The digital compass that we are using is a triple-axis magnetometer based on
the HMC5883L chip. It uses I2C to communicate with the Arduino Mega. The
HMC5883L chip requires 3.3 V, but the breakout board accepts 5 V. The
magnetometer measures the earth's magnetic field in three directions; in this
case we use the field in the x and y directions to calculate the heading
relative to north. The digital compass is required in this project because the
GPS receiver only provides a heading when the cart is moving.

4.2 Software Design

4.2.1 MCM

The MCM's code is available on the project CD, and a more in-depth explanation
is given in the appendix; only a flowchart is included in this section, in
Figure 16. Figure 16 shows how the MCM is implemented as a finite state
machine (FSM). The main control FSM is shown on the left side. The right side
of the figure shows the interrupt-routine FSM used to store received data in a
buffer; the MCM waits until an entire packet is received before it begins the
control loop.
4.2.2 Arduino Mega 2560

The Mega 2560 is a microcontroller board based on the ATmega2560. It has 54
digital input/output pins (of which 15 can be used as PWM outputs), 16 analog
inputs, 4 UARTs (hardware serial ports), a 16 MHz crystal oscillator, a USB
connection, a power jack, an ICSP header, and a reset button. On each loop of
the code, the sensor data is collected and sent to the laptop for processing.
The microcontroller collects data from the GPS, the magnetometer, the
LIDAR-Lite 2 laser rangefinder, and the two HC-SR04 ultrasonic sensors. The
packet of data is:

Figure 16: Flowchart for MCM implementation code

(latitude, longitude, speed, current heading, lidar's angle, lidar's distance, ultrasonic's distance 1, ultrasonic's distance 2)


4.2.3 LabVIEW

Overview
LabVIEW (short for Laboratory Virtual Instrument Engineering Workbench) is a
system-design platform and development environment for a visual programming
language from National Instruments. It is used for processing because it is a
user-friendly program that can interact with external serial devices such as
the Arduino board [12]. Its parallel programming allows the multitasking that
this project requires. The version of LabVIEW we use is 2014 Service Pack 1
with the VISA driver installed in order to communicate through the USB port.
Data Processing
When the data is coming in the USB port, LabVIEW reads buffer one byte
at a time and translate them into an array of numbers. Each index represents
each data from each sensor. These data is then displayed and plotted on graphs
for visualization.The unique feature about labView is that it can understand C
script or Matlab script using script function [13]. In this case, we use C script
and haversin formula to calculate distance to waypoint and its bearing using
our predefined waypoint and GPS location. Figure 17 shows the block diagram
of data processing.

Figure 17: Block diagram of how the data is processed

Formulae
In navigation, the haversine formula is very useful for finding the geographic
distance between two locations on Earth. The formula assumes that the Earth is
a perfect sphere; since it is not, the result may be off slightly [14]. Given
an earth radius of approximately 6,371 km, the distance between the two points
(lat1, lon1) and (lat2, lon2) can be calculated as shown in Figure 18.
Since the path between the two points is treated as a straight line, the
heading between them can be calculated using a formula known as the forward
azimuth. Azimuth is the angle a line makes with a meridian, measured clockwise
from north [15]. The formula is shown in Figure 19.
Features
Figure 20 is a screenshot of the LabVIEW front panel that serves as our user
interface. Its features include:
- Selecting a predefined waypoint from a dropdown menu; its location is
displayed in longitude and latitude


Figure 18: Harversin: Distance between two points

Figure 19: Forward Azimuth: Heading between two points


- Data visualization: current time and date, speedometer, distance to waypoint, heading, live path-tracking plot, live object-detection plot
- An LED indicating incoming data from the Arduino
- LEDs for the LIDAR and ultrasonic sensors indicating detected objects
- Manual remote-control mode using a keyboard
- Autonomous mode
Autonomous Mode
The unique feature of this user interface is the ability to switch between
manual mode and autonomous mode at any time, using a keyboard shortcut (Esc)
that we implemented. While in autonomous mode, the cart moves forward at a
medium speed. The cart will stop only under the three following conditions:
- when the cart is within a 3-meter radius of the waypoint, OR
- when an object is detected by the LIDAR within 5 meters at angles +/-20
degrees from center, OR
- when the ultrasonic sensors detect an object within 3 m at a lower level
Another unique feature is that when the cart stops during autonomous mode, it
automatically switches to manual mode; the cart remains still, waiting

Figure 20: LabVIEW User Interface


for user input. The user can either switch back to autonomous mode or control
the cart using the keyboard in manual mode.

Survey Responses

The survey was conducted on hard copies. Ten people took part in the survey on
the Autonomous Golf Cart project. Eight out of ten were students and faculty
from SSU; the others were people from outside the school. 80% of the
respondents had heard of a similar product on the market, while the 20% who
were not on campus had never heard of one before. The majority thought that
this device would be beneficial to society, but only half showed interest in
this project. 30% would pay less than $500, 40% would pay in the range of
$500-$1000, and the other 30% would pay in the range of $1000-$2500. 70% would
purchase this or a similar device for their workplace or campus. The majority
thought that this device should ideally have a range of more than 1 km.
Suggestions to improve the device included: GPS, a speedometer, a cellphone
interface, solar power, object detection, and voice activation. We thought the
overall responses were good; they gave us a green light to work on this
project, especially since the majority would purchase this device for their
campuses.

Future Work

The AutoCart system was a massive project that required a great deal of time
and funding. We have left the project in a state where future work could be
done by students from various fields and disciplines. We have identified three
key areas where the AutoCart could be improved: the steering system, the user
interface, and object detection. In addition to identifying these areas, we
have also created a basic implementation plan.

6.1 Steering System

The steering system should be implemented using a 12-36 VDC electric motor and
a gearing system that can provide at least 15 ft-lbs of torque. Such a high
torque is required because the cart does not have power-assisted steering. In
addition, the steering has to allow the user to take control at any time; this
requirement makes worm-gear motors unsuitable for this application. Finally,
because of the lack of assisted steering, the system should NOT steer when the
cart is not in motion. Our research has shown that a stepper motor would be
ideal for the system. Controlling the motor will require a closed-loop control
system governing both the speed of rotation and the steering angle; to
accomplish this, a rotary encoder or some other form of feedback must be used.
Finally, the steering has to have a way to disengage; if a stepper motor is
used, this can be accomplished by simply turning the power OFF.

6.2 User Interface

The LabVIEW interface could be improved by adding features such as live video.
This can be done with a USB camera connected to the computer; using the Vision
Acquisition software in LabVIEW, a user would be able to see the front view of
the cart on the computer screen. In addition, any useful data on the screen
can be deployed to a web server, provided the computer is connected to the
internet. Using the Data Dashboard app (currently available in iTunes and the
Google Play Store) and the computer's IP address, one can create custom
dashboards that display the live data and deployed LabVIEW Web services on
indicators such as charts, gauges, textboxes, and LEDs. This would be
convenient for sharing data with other people without the need to be at the
cart. For manual control mode, we could add an external USB device such as a
joystick (recognized by LabVIEW) to control the cart instead of using a
keyboard, making the user feel like they are driving an RC cart. Turn
Left/Turn Right buttons could also be added to the user interface to give full
control of the cart's navigation.

6.3 Object Detection

The current object detection system could be improved by giving it its own
processor, a faster processor, or by replacing the Arduino Mega with something
that supports multiprocessing. Any of these changes would increase the rate at
which the stepper motor moves, allowing fewer objects to slip past the LIDAR
as it sweeps, and would increase the number of scans we can perform per sweep;
with more reads per sweep we also get a smoother sweep from the motor,
potentially reducing false positives. As of now, the stepper motor moves
slowly because the system waits for multiple processes to complete before it
can make the next step: it has to retrieve the GPS data, compass data, and
LIDAR data on each step, and because the GPS and compass take time to read,
they severely reduce the speed at which the motor can sweep.
The system could also be improved by replacing the sensors with better, more
accurate ones. There are LIDARs that can cover a larger area and could remove
the need for the ultrasonic sensors. Other options involve replacing the
ultrasonic sensors with additional LIDARs, or using an Xbox Kinect for object
detection. Along with optimizing the front sensors, side and rear sensors will
also be required as the cart approaches fully autonomous operation.

Conclusion

The AutoCart was a very ambitious project that was constrained and slowed by a
lack of funding and time. In spite of this, we met over 90% of our engineering
requirements. Unfortunately, the engineering requirement we failed to achieve
was automated steering; as of now, the AutoCart automates only the braking and
acceleration. The future work for this project is very exciting, as we have at
least successfully laid the necessary groundwork for a future project.

Source Of Funding

Our project proposal was selected to receive $750 in funding from the SOURCE
award of Sonoma State University. This award was spent carefully on electrical
components and necessities for our project. We, as a group, would like to
thank the SOURCE award and its committees for their generous support in
helping our project move forward to accomplish its goal. We hope to see SOURCE
awards given to students' future projects.


References
[1] Navyatech. (2015). Navya. Retrieved 22 October, 2015, from http://navya.tech/?lang=en
[2] Center For Automotive Research. (2012). KPMG.com. Retrieved 25 February, 2016, from https://www.kpmg.com/US/en/IssuesAndInsights/ArticlesPublications/Documents/selfdriving-cars-next-revolution.pdf
[3] Google.com. (2015). Google Self-Driving Car Project. Retrieved 22 October, 2015, from https://www.google.com/selfdrivingcar/how/
[4] Nature.com. (2015). Nature.com Retrieved 22 October, 2015, from
http://www.nature.com/polopoly fs/1.16832!/menu/main/topColumns/topLeftColumn/
pdf/518020a.pdf
[5] Csuchicoedu.com (2015). Csuchicoedu. Retrieved 22 October, 2015, from
http://isl.ecst.csuchico.edu/DOCS/Logs/Michael/Files/web link files/low cost.pdf
[6] Mitedu.com (2015). MIT News. Retrieved 22 October, 2015, from
http://news.mit.edu/2015/autonomous-self-driving-golf-carts-0901
[7] Webjcli.org (2015). Webjcliorg. Retrieved 22 October, 2015, from
http://webjcli.org/article/view/344/471
[8] Technologyreview.com (2015). MIT Technology Review. Retrieved 22 October, 2015, from http://www.technologyreview.com/news/540751/startup-aims-to-beat-google-to-market-with-self-driving-golf-cart/
[9] Aurobots.com. (2015). Aurobotscom. Retrieved 22 October, 2015, from
http://aurobots.com/
[10] Getcruise.com. (2015). Getcruisecom. Retrieved 22 October, 2015, from
http://www.getcruise.com/
[11] youtube.com. (2012). obstacle detection / collision avoidance- lego car
with NXT camera sensor and laser.m4v. Retrieved 1 November 2015, from
https://www.youtube.com/watch?v=56TdjEBesuM
[12] ni.com. (2016). LabVIEW System Design Software. Retrieved 18 May 2016,
from http://www.ni.com/labview/
[13] ni.com. (2016). Scripting Languages and NI LabVIEW. Retrieved 18 May
2016, from http://www.ni.com/white-paper/7671/en/
[14] movable-type.co.uk (2016). Calculate distance, bearing and more between Latitude/Longitude points. Retrieved 18 May 2016, from
http://www.movable-type.co.uk/scripts/latlong.html


[15] mathworks.com (2016). Directions and Areas on the Sphere and Spheroid. Retrieved 18 May 2016, from http://www.mathworks.com/help/map/directions-and-areas-on-the-sphere-and-spheroid.html


Appendix

The following appendix contains more detailed information about the AutoCart
system. For further details, including datasheets and test results, please
refer to the Project CD or the website.

A Instructions For Use

A.1 Turning the AutoCart System ON

1. Insert key into ignition and turn on.


2. Switch AutoCart power switch to ON position.
3. Switch the Safety switch to the FORWARD ON position.

A.2 Turning the AutoCart System OFF

1. Switch Safety switch to REAR ON position.


2. Switch AutoCart power switch to OFF position.
3. After completion of previous steps, the user will have complete control of
the cart.


B Motion Control Module

The motion control module (MCM) drives the controls of the AutoCart. The MCM
is commanded over a 9600-baud serial connection. The module accepts a 6-byte
packet formatted in the following way.
Table 3: MCM packet format

Content          Start of Packet  Acceleration  Brake  Steering  End of Packet
Length in Bytes         1              1          1       2            1

Start of Packet: start-of-packet character
  0x24: the '$' symbol denotes the start of a packet
Acceleration: acceleration control
  0x00: stop / no acceleration
  0x0F: slowest possible speed
  0xF0: medium speed
  0xFF: full speed
Brake: brake control
  0xXX: brake OFF
  0xFF: brake ON
Steering: steering control
  0x00FF: turn left
  0xFF00: turn right
  0x0000: hold position
End of Packet: end-of-packet character
  0x23: the '#' symbol denotes the end of the packet
Example packet:

Table 4: MCM example packet
0x24 0xFF 0x00 0xFF00 0x23

This packet commands the module to full speed, brake off, and turn right.


B.1 Circuit Schematic

Figure 21: Circuit Schematic of the Motion Control Module


B.2 C Code

The code for the MCM implements a finite state machine (FSM) to run through
the controls of the cart. The MCM is controlled via a serial UART connection
running at 9600 baud. In addition to being serially controlled, the MCM can be
controlled through a GUI in LabVIEW. Only a code flowchart is included here;
for the actual code, refer to the website or to the documentation CD.

C Power Distribution

The existing golf cart uses six 6 V lead-acid batteries in series to create a
36 V source. These batteries are tapped at 24 V to create a 12 V source for
the lights and horn; because of this, the chassis is grounded at 24 V with
respect to the batteries. The golf cart had two safety switches installed.
One was a pressure switch underneath the driver's seat that opened if no one
was sitting in the seat; this safety mechanism had to be removed for our
system. The other safety mechanism is a microswitch installed in the
acceleration pedal; this switch unlocks the electric motor when the
accelerator is slightly depressed. To overcome this issue, we installed a
SPDT switch that can bypass the microswitch as needed.

Figure 22: Power Distribution and Safety Switches

