AutoCart

Jorge Inocencio
Chanbora Uch
Richard Duong
Abstract
Autonomous vehicle technology is the future of transportation because it will
allow the movement of people and goods without the need for human drivers,
increasing the speed and efficiency with which people and goods can be moved.
Currently, all autonomous vehicles require an expensive array of sensors and
powerful computers to navigate autonomously. The AutoCart is a golf cart that
has been converted into a semi-autonomous vehicle; the project investigates
the feasibility of a low-cost semi-autonomous shuttle that navigates specific
pre-defined routes on a campus. It uses a variety of sensors, including a GPS
receiver, a magnetometer, a LIDAR, and ultrasonic rangefinders.
Acknowledgments
We would like to thank Dr. Farid Farahman, Tyler Spott, and Yikealo Abraha.
This project would not have been possible without their continued support and
assistance.
Contents

1 Introduction
  1.1 Literature Review & Existing Patents
  1.2 Problem Statement
  1.3 Methodology
  1.4 Challenges
  1.5 Key Components
  1.6 Marketing Requirements
  1.7 Engineering Requirements
2 Implementation
  2.1 Project Schedule
  2.2 Bill Of Materials
  2.3 Module Testing
  2.4 Test Results
3 System Testing
  3.1 Build
4 Design Process
  4.1 Hardware Design
    4.1.1 Motion Control
    4.1.2 Object Detection
    4.1.3 Path Tracking
  4.2 Software Design
    4.2.1 MCM
    4.2.2 Arduino Mega 2560
    4.2.3 LabVIEW
5 Survey Responses
6 Future Work
  6.1 Steering System
  6.2 User Interface
  6.3 Object Detection
7 Conclusion
8 Source Of Funding
9 Appendix
1 Introduction

1.1 Literature Review & Existing Patents
The current automobile market has already implemented many automated safety
systems known as Advanced Driver Assist Systems (ADAS). Current ADAS features
include adaptive cruise control, emergency braking, lane-drift warnings,
adaptive speed control, blind-spot detection, and self-parking. All of these
systems add some level of autonomy to vehicles and are being widely adopted by
vehicle manufacturers. The two main approaches to implementing autonomous
functions are sensor-based and connectivity-based. Sensor-based solutions use
artificial intelligence along with cameras, GPS, RADAR, LIDAR, ultrasonic
sensors, and inertial measurement units to continuously monitor the vehicle's
position and surroundings. Connectivity-based solutions use Vehicle-to-Vehicle
(V2V) communications to relay traffic, road, and driving conditions to
surrounding vehicles. With this information, vehicles can be aware of traffic
patterns and vehicle behavior on city roads, where 360-degree views are
difficult or impossible to achieve. The convergence of these two technologies
is what will enable autonomous vehicle technology [2]. Most current autonomous
systems are sensor-based; because of this, fully autonomous cars are still in
their infancy and are mostly being designed and tested at universities and by
large companies like Google and Tesla. Only one fully autonomous vehicle is
ready for sale on the market, and it is not intended for use on public
roads [1].
One of the most successful and complex autonomous golf carts is the SMART golf
cart, a product of the Singapore-MIT Alliance for Research and Technology. The
SMART cart was designed and built by a team of 19 professors and students and
was tested over a six-day period in a large public park, where it successfully
ferried over 500 passengers across the gardens. The key difference in the
design of this cart is the conscious choice to reduce the number of sensors
used. Instead of complex 3D LIDAR systems, the team used multiple laser
rangefinders mounted at various heights, along with a stereo camera, to gather
data on the vehicle's surroundings. This choice vastly reduced the amount of
computation needed to map the surrounding environment. In addition to using
simpler sensors, the SMART golf cart used a new obstacle-avoidance algorithm
that maps a cylinder onto the vehicle's planned trajectory; if an object
intrudes into this cylinder, the software redraws the cylinder to exclude the
object, rerouting the golf cart [6].
There is an abundance of research available on autonomous operation, but most
of it applies to small-scale platforms such as remote-controlled cars. Because
these projects are smaller, the selection of possible components is much
greater, as they do not require as much power or force. If we were simply to
upscale the parts used in a small-scale project, the price would become too
high or the component's availability too limited, so we will opt for a DC
motor as opposed to a servo or stepper motor [5]. The object detection
approach, however, should be scalable, and we will research how viable a
camera-and-laser system would be. One project achieved object detection and
avoidance with a camera and three lasers: when any of the lasers breaks line
of sight with the others, the system detects it and makes the appropriate
movements [11].
Autonomous transportation is an exciting field, and it is easy to see why
companies and universities are investing time and resources to develop it
further. According to nature.com, there are about 1.24 million traffic
fatalities every year worldwide, and 90% are due to driver error. If
autonomous vehicles replaced all of these drivers, that number could drop to
as low as 124,000 fatalities a year. States are already beginning to approve
the use of autonomous vehicles on roads, paving the way for their inevitable
debut on the consumer market [3].
1.2 Problem Statement
our design because the entire system will run off of the golf cart's 36V
electrical system. The size of our system is also of minimal concern. The
mechanical structures needed to control the steering will be our biggest
challenge because of our lack of mechanical engineering experience. Finally,
the limited budget is a large challenge because many of the sensors needed for
implementation are extremely expensive.
1.3 Methodology
Our team will solve the proposed problem by using GPS waypoints to move the
cart through campus. While traveling from waypoint to waypoint, the cart will
scan for objects and calculate whether a collision is imminent or probable; if
the AutoCart detects an approaching object, it will try to steer gently away
from it, and if it cannot, it will stop altogether. To do this, the cart will
use a GPS receiver, an object detection system, and LabVIEW to perform the
controls and calculations needed for the semi-autonomous functions. To make
the project as affordable as possible, we will attempt to use readily
available, low-priced parts.
1.4 Challenges
One challenge we face with this project is bypassing the fail-safes in the
cart without compromising the cart's features: if we remove the fail-safe that
prevents the cart from moving without a driver, the cart must still be usable
without a driver. Issues raised earlier included difficulty installing the
mechanism that applies the cart's brakes. Another system that needs to be
implemented is manual control. This requires either that the system be
installed so a user can operate the vehicle without risk of damaging it, or a
simple set of electronic controls that act as emergency overrides. It would
also be ideal for the system to be small enough not to take up the driver's
seat.
The system will be exposed to the elements, so a level of weatherproofing will
be required: it must be rugged enough to take some hits, withstand wind, and
resist water damage. The system should ideally last as long as the cart
itself. Cost and installation effort will both be fairly high, because the
system is invasive and takes time to prepare.
The vehicle relies on GPS, so the accuracy of the position data is important.
Since GPS can be accurate to within a few feet, this should not prevent the
cart from navigating, though object detection would further improve accuracy
and increase safety for passengers and pedestrians. The steering, braking, and
acceleration also need to be very precise so that the cart can navigate
properly and avoid dangerous situations.
Finally, the system should not significantly affect the battery life of the
golf cart.
1.5 Key Components

1.6 Marketing Requirements

1.7 Engineering Requirements

The AutoCart will have an emergency kill switch that turns the AutoCart off.
2 Implementation
The AutoCart system is implemented by breaking the key systems down into
smaller modules: a motion control module (MCM), an object detection system,
and a GUI and path tracking system. A description of each system is included
below.
Motion Control Module
The Motion Control Module (MCM) controls the electric motors and actuators
that drive the cart; in particular, it controls the accelerator, the brake,
and the steering. The MCM is commanded by a laptop running LabVIEW, and it
interfaces with the cart's existing electrical system, speed controller, and
safety switches. In addition, the MCM is a self-contained PCB that can be
reused in other projects in the future. The Motion Control Module is
implemented using a PIC18F14K22.
Object Detection System
The cart features three forward-facing sensors. Two HC-SR04 ultrasonic
sensors are mounted low to the ground on the left and right of the front of
the cart. The third sensor, a LIDAR-Lite 2 laser rangefinder, is mounted in
the middle of the bumper at the top and provides a 90-degree sweep. This
sensor array provides data on nearby objects and sends a message to the MCM
dictating what course of action to take; the options include braking,
decelerating, and turning to avoid objects in the cart's path. The purpose of
the two low ultrasonic sensors is to cover the area that the LIDAR cannot
see. Rear and side sensors are unnecessary, because the cart does not use the
reverse function and side sensors would not make much of a difference. A
button is enclosed with the LIDAR and serves as the LIDAR's starting point:
when the system starts up, the LIDAR sweeps into the button, and when the
button is pressed the system knows the LIDAR is at its home position.
GPS and Path Tracking
The GPS path tracking system consists of a GPS receiver and a magnetometer
(digital compass). The path the AutoCart travels is pre-defined, and the
distance between the current location and the next waypoint is monitored at
all times. The compass provides the direction of travel. The outputs include
speed, straight-line distance to the next waypoint, target heading, current
heading, and the heading error. The speed and the decision to make a turn are
sent to the motion control module.
All of the previous systems have to interface with each other for the AutoCart
to function correctly. The different systems communicate via different serial
interfaces. An overall system architecture is shown below.
2.1 Project Schedule
Figure 2: Schedule
The schedule above was our projected schedule at the beginning of the
project. Overall, we did not meet the schedule very effectively. The
mechanical modifications took much longer than expected, owing to a lack of
appropriate tools and equipment for making the modifications. The steering in
particular took much longer than we initially assumed and was ultimately not
implemented successfully. Finally, our projected schedule allotted four weeks
for the inter-module communications; in practice, we completed this part of
the project in less than two weeks.
2.2 Bill Of Materials
[Bill of Materials table: 15 line items with unit prices from $3.95 to
$114.89; total cost $410.37.]

2.3 Module Testing
2.4 Test Results
3 System Testing
3.1 Build
This section describes the design of the three key systems: Object Detection,
GPS, and Motion Control.
Object Detection
Originally, the sensor array was to use four HC-SR04 ultrasonic sensors, two
at the front of the cart and two at the back. This setup did not have the
range required to stop in time if an object appeared in the cart's path, and
we had no way to shift the cart into reverse automatically. We removed the
rear sensors and added a LIDAR-Lite 2 laser rangefinder to remedy the range
issue. Because the LIDAR scans a single point, we placed it on a stepper motor
sweeping through 90 degrees. Issues arose with the stepper motor's rotation
speed being affected by the additional sensors: the motor rotated at the
designated rate when all sensors were triggered, but slowed rapidly when one
or more of the ultrasonic sensors went out of range. This happened because the
ultrasonic sensors must wait for their pulse to return, producing longer
delays depending on the range of the object, or the absence of one. To correct
this, we reduced the ultrasonic pulse rate to twice per cycle. To establish
the position of the LIDAR without an encoder, we added a button and code that
runs once when the system boots; the code deactivates as soon as the button is
pressed.
The guidance system of the cart relies on a GPS module for location and a
compass module for the cart's direction. A GPS module is the obvious choice
over manually written directions for a system like this, as it allows for more
accurate and more reliable instructions, and it makes additional paths easier
to program. The compass assists in maintaining an accurate bearing, ensuring
that the cart reaches its proper destination.
Motion Control
The motion control system controls the acceleration, steering, and braking of
the cart. To manage the acceleration, we use pulse width modulation, because
the accelerator essentially serves as a variable resistor, varying the voltage
allowed through as the pedal is depressed. The steering was originally going
to be controlled by a stepper motor or a servo, but was changed to a DC motor
because a stepper or servo strong enough to turn the wheel was beyond our
budget. The DC motor drives a chain to provide more torque, allowing our
weaker, more affordable motor to turn the wheel. The DC motor and chain are
mounted on the dashboard of the cart, with the chain mounted underneath the
steering wheel, allowing the user to assume manual control if necessary. The
brake is handled by a linear actuator pushing on a shaft that has replaced the
brake pedal; the shaft is needed because the actuator alone could not supply
enough force to engage the brakes. The shaft also allows the user to apply the
brakes manually. The actuator is attached to the underside of the dashboard
and retracts to apply the brakes. A bracket mounted on the actuator lets the
user apply the brakes without the actuator's shaft preventing braking.
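As a sketch of the PWM acceleration idea, the helper below maps a throttle
percentage to an 8-bit duty value. The MAX_DUTY cap is a made-up software
speed limit for illustration, not a value from the actual MCM firmware.

```c
#include <stdint.h>

/* Cap on the 8-bit PWM duty cycle, acting as a software speed limit.
 * The value 200/255 (~78%) is an assumption for illustration. */
#define MAX_DUTY 200

/* Map a throttle request (0-100 percent) to a PWM duty value. */
uint8_t throttle_to_duty(uint8_t percent) {
    if (percent > 100) percent = 100;            /* clamp bad input */
    uint16_t duty = (uint16_t)percent * MAX_DUTY / 100;
    return (uint8_t)duty;
}
```

The intermediate 16-bit multiply avoids overflow before the divide; on the
cart this duty value would be written to the microcontroller's PWM register.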
4 Design Process
The following section describes our build process and the design choices and
trade-offs made in the project.
4.1 Hardware Design
Because of the size of the design, a description of the hardware used to
implement each of the individual systems is included below.
4.1.1 Motion Control
The MCM is a self-contained module on its own PCB, implemented using a
PIC18F14K22. The MCM also includes a 5 V regulator that steps the 12 V input
down to 5 V, a PWM-to-DC filter implemented using two MOSFETs and a filter
network, an H-bridge used to control the braking actuator, and one UART serial
input.
A more in-depth description is available in the appendix as well as on the
project CD.
4.1.2 Object Detection
LIDAR
The LIDAR-Lite 2 laser rangefinder was the obvious choice of sensor: this
sensor type is popular in current autonomous systems, and this specific
component is popular with hobby drones. The characteristics that make it ideal
are its reliability and the speed with which it scans. Our sensor is
single-point and has a range of up to 40 meters; because it uses light to
determine distance, it receives data extremely quickly, much faster than the
HC-SR04 ultrasonic sensors, which can take over a full second to receive a
return signal. To make up for the single-point scanning, the sensor was
mounted on a stepper motor and swept through a ninety-degree angle. To act as
a home position, a button was installed that must be pressed by the sweeping
sensor before it begins scanning the area. The LIDAR's repetition rate is
between 1 and 500 Hz. This sensor can also be powered by the Arduino Mega,
since it requires only 5 volts and approximately 100 mA to operate, and it
offers two interfaces for reading distance: I2C and PWM.
Ultrasonic Sensor
We used HC-SR04 ultrasonic sensors as a fail-safe for the LIDAR, which is not
always looking forward. We placed two ultrasonic sensors, one on the left and
one on the right of the front of the cart, to ensure that any object in front
of the cart is caught while the LIDAR is looking away. Because the ultrasonic
sensors rely on sound waves for distance measurement, their range is much
shorter, at approximately 4 meters, due to wave attenuation. This also results
in a much longer return time, which makes the sensor fall short as a main
sensor but suitable as a secondary sensor given its low cost. These sensors
can also be powered from the Arduino Mega, because they require 5 volts and
about 15 mA. The sensor operates at 40 kHz and works by sending out a
10-microsecond trigger pulse and recording the time the echo takes to return.
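The distance computation behind that pulse-and-echo scheme is simple: convert
the measured echo round-trip time to a distance using the speed of sound. The
helper below is our own sketch, not the project firmware, and assumes sound
travels at roughly 343 m/s (about 0.0343 cm per microsecond).

```c
/* Convert an HC-SR04 echo pulse width (microseconds) to distance in cm.
 * Sound travels ~0.0343 cm/us at room temperature; the echo time covers
 * the round trip, so divide by two. */
double echo_us_to_cm(double echo_us) {
    const double speed_cm_per_us = 0.0343;
    return echo_us * speed_cm_per_us / 2.0;
}
```

At the sensor's roughly 4 m limit the round trip takes about 23 ms, which is
why the ultrasonic readings are so much slower than the LIDAR's.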
4.1.3 Path Tracking
GPS Receiver
The GPS receiver we are using is an Adafruit Ultimate GPS Breakout (66
channel, 10 Hz). It is built around the MTK3339 chipset, a high-quality GPS
module that can track up to 22 satellites on 66 channels, has an excellent
high-sensitivity receiver (-165 dBm), and includes a built-in antenna. It can
produce up to 10 location updates a second for high-speed, high-sensitivity
logging or tracking, with an accuracy of 3 m in radius. It requires a 5 V
input and draws only 20 mA. An external antenna can be connected to the board
via a uFL connector. The receiver has a serial interface that provides data
such as longitude, latitude, and speed.
Digital Compass
The digital compass we are using is a triple-axis magnetometer built on the
HMC5883L chip. It uses I2C to communicate with the Arduino Mega. The HMC5883L
chip requires 3.3 V, but the breakout board accepts 5 V. The magnetometer
measures the earth's magnetic field in three directions, but in this case we
use the field in the x and y directions to calculate the heading relative to
north. The digital compass is required in this project because the GPS
receiver only provides a heading while the vehicle is moving.
4.2 Software Design

4.2.1 MCM
The MCM's code is available on the project CD, and a more in-depth
explanation is given in the appendix; only a flowchart is included in this
section, in Figure 16. Figure 16 shows how the MCM is implemented as a finite
state machine. The main control FSM is shown on the left side. The right side
of the figure shows the interrupt-routine FSM used to store received data in a
buffer; the MCM waits until an entire packet is received before it begins the
control loop.
4.2.2 LabVIEW
Overview
LabVIEW (short for Laboratory Virtual Instrument Engineering Workbench) is a
system-design platform and development environment for a visual programming
language from National Instruments. It is used for processing because it is a
user-friendly program that can interact with external serial devices such as
an Arduino board [12]. Its parallel programming model allows the multitasking
needed in this project. We use LabVIEW version 2014 Service Pack 1 with the
VISA driver installed in order to communicate over the USB port.
Data Processing
As data arrives on the USB port, LabVIEW reads the buffer one byte at a time
and translates it into an array of numbers, where each index corresponds to a
reading from one sensor. The data are then displayed and plotted on graphs for
visualization. A unique feature of LabVIEW is that it can run C or MATLAB code
through its script functions [13]. In this case, we use a C script and the
haversine formula to calculate the distance to a waypoint and its bearing from
our predefined waypoint and the GPS location. Figure 17 shows the block
diagram of the data processing.
Formulae
In navigation, the haversine formula is very useful for finding the
great-circle distance between two locations on Earth. The formula assumes the
Earth is a perfect sphere; since it is not, the result may be off slightly
[14]. Given an Earth radius of approximately 6,371 km, the distance between
two points (lat1, lon1) and (lat2, lon2) can be calculated as shown in
Figure 18.
Since the distance between the two points is measured along a straight line,
the heading between them can be calculated using a formula known as the
forward azimuth. Azimuth is the angle a line makes with a meridian, measured
clockwise from north [15]. The formula is shown in Figure 19.
Features
Figure 20 is a screenshot of the LabVIEW front panel that serves as our user
interface. A user can select a predefined waypoint from a dropdown menu, and
its location will be displayed in longitude and latitude.
5 Survey Responses
The survey was conducted on hard copies. Ten people took part in the survey on
the autonomous golf cart project; eight of the ten were students and faculty
from SSU, and the others were people from outside the school. 80% of
respondents had heard of a similar product on the market, while the 20% who
were not on campus had never heard of one. The majority thought this device
would benefit society, but only half expressed interest in the project. 30%
would pay less than $500, 40% would pay in the range of $500 to $1,000, and
the other 30% would pay in the range of $1,000 to $2,500. 70% would purchase
this or a similar device for their workplace or campus. Most thought the
device should ideally have a range of more than 1 km. Suggested improvements
included GPS, a speedometer, a cellphone interface, solar power, object
detection, and voice activation. We thought the overall responses were
positive, giving us a green light to work on this project, especially since a
majority would purchase the device for their campuses.
6 Future Work
The AutoCart was a large project that required considerable time and funding.
We have left it in a state where future work can be done by students from
various fields and disciplines. We have identified three key areas where the
AutoCart could be improved: the steering system, the user interface, and
object detection. In addition to identifying these areas, we have also created
a basic implementation plan.
6.1 Steering System
The steering system should be implemented using a 12-36 VDC electric motor
and a gearing system that can provide at least 15 ft-lbs of torque. Such high
torque is required because the cart has no power-assisted steering. In
addition, the steering must allow the user to take control at any time; this
requirement makes worm-gear motors unsuitable for the application. Finally,
because of the lack of assisted steering, the system should NOT steer while
the cart is stationary. Our research has shown that a stepper motor would be
ideal for the system. Controlling the motor will require a closed-loop control
system governing both the speed of rotation and the steering angle; to
accomplish this, a rotary encoder or some other form of feedback must be used.
Finally, the steering has to have a way to disengage; if a stepper motor is
used, this can be accomplished simply by turning the power off.
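The recommended closed-loop control could start as a simple proportional loop
on the encoder reading. The sketch below shows one iteration of such a loop;
the gain, units, and saturation limit are placeholders, not tested values for
this cart.

```c
/* One iteration of a proportional steering controller.
 * target_deg and measured_deg are steering angles in degrees (from the
 * rotary encoder); the return value is a motor command clamped to
 * [-max_cmd, max_cmd]. kp is a placeholder gain to be tuned on the cart. */
double steer_p_step(double target_deg, double measured_deg,
                    double kp, double max_cmd) {
    double cmd = kp * (target_deg - measured_deg);
    if (cmd >  max_cmd) cmd =  max_cmd;   /* saturate motor command */
    if (cmd < -max_cmd) cmd = -max_cmd;
    return cmd;
}
```

Running this at a fixed rate drives the error toward zero; integral and
derivative terms could be added later if proportional control alone
oscillates or leaves a steady-state error.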
6.2 User Interface
The LabVIEW interface can be improved by adding features such as live video.
This can be done with a USB camera connected to the computer; using the Vision
Acquisition software in LabVIEW, a user can see the front view of the cart on
the computer screen. In addition, any useful data on the screen can be
deployed to a web server, provided the computer is connected to the internet.
Using the Data Dashboard app (currently available on iTunes and the Google
Play Store) and the computer's IP address, a user can create custom dashboards
that display live data from deployed LabVIEW web services on indicators such
as charts, gauges, textboxes, and LEDs. This would be convenient for sharing
data with other people without their needing to be at the cart. For manual
control mode, we could add an external USB device such as a joystick
(recognized by LabVIEW) to control the cart instead of using a keyboard,
making the user feel like they are driving an RC car. Turn Left/Turn Right
buttons could also be added to the user interface to give full control of the
cart's navigation.
6.3 Object Detection
The current object detection system can be improved by giving it its own
processor, a faster processor, or by replacing the Arduino Mega with something
that can handle multiple processes. Any of these changes would increase the
rate at which the stepper motor moves, allowing fewer objects to slip past the
LIDAR as it sweeps, and would increase the number of scans per sweep; with
more reads per sweep we also get a smoother scan.
7 Conclusion
The AutoCart was a very ambitious project that was constrained and slowed by a
lack of funding and time. In spite of this, we met over 90% of our engineering
requirements. The engineering requirement we failed to achieve was the
automated steering; as of now, the AutoCart automates only the braking and
acceleration. The future work for this project is exciting, as we have
successfully laid the groundwork for a future project.
8 Source Of Funding
Our project proposal was selected to receive $750 in funding from the SOURCE
award at Sonoma State University. This award will be spent carefully on
electrical components and necessities for our project. We, as a group, would
like to thank the SOURCE award and its committees for their generous support
in helping our project move forward to accomplish its goal. We hope to see the
SOURCE award continue to be given to future student projects.
References
[1] Navya. (2015). Retrieved 22 October 2015, from http://navya.tech/?lang=en
[2] Center for Automotive Research. (2012). KPMG.com. Retrieved 25 February 2016, from https://www.kpmg.com/US/en/IssuesAndInsights/ArticlesPublications/Documents/selfdriving-cars-next-revolution.pdf
[3] Google.com. (2015). Google Self-Driving Car Project. Retrieved 22 October 2015, from https://www.google.com/selfdrivingcar/how/
[4] Nature.com. (2015). Retrieved 22 October 2015, from http://www.nature.com/polopoly_fs/1.16832!/menu/main/topColumns/topLeftColumn/pdf/518020a.pdf
[5] isl.ecst.csuchico.edu. (2015). Retrieved 22 October 2015, from http://isl.ecst.csuchico.edu/DOCS/Logs/Michael/Files/web_link_files/low_cost.pdf
[6] MIT News. (2015). Retrieved 22 October 2015, from http://news.mit.edu/2015/autonomous-self-driving-golf-carts-0901
[7] Webjcli.org. (2015). Retrieved 22 October 2015, from http://webjcli.org/article/view/344/471
[8] MIT Technology Review. (2015). Retrieved 22 October 2015, from http://www.technologyreview.com/news/540751/startup-aims-to-beat-google-to-market-with-self-driving-golf-cart/
[9] Aurobots.com. (2015). Retrieved 22 October 2015, from http://aurobots.com/
[10] Getcruise.com. (2015). Retrieved 22 October 2015, from http://www.getcruise.com/
[11] YouTube.com. (2012). Obstacle detection / collision avoidance: Lego car with NXT camera sensor and laser.m4v. Retrieved 1 November 2015, from https://www.youtube.com/watch?v=56TdjEBesuM
[12] ni.com. (2016). LabVIEW System Design Software. Retrieved 18 May 2016, from http://www.ni.com/labview/
[13] ni.com. (2016). Scripting Languages and NI LabVIEW. Retrieved 18 May 2016, from http://www.ni.com/white-paper/7671/en/
[14] movable-type.co.uk. (2016). Calculate distance, bearing and more between Latitude/Longitude points. Retrieved 18 May 2016, from http://www.movable-type.co.uk/scripts/latlong.html
[15] mathworks.com. (2016). Directions and Areas on the Sphere and Spheroid. Retrieved 18 May 2016, from http://www.mathworks.com/help/map/directions-and-areas-on-the-sphere-and-spheroid.html
9 Appendix
The following appendix contains more detailed information about the AutoCart
system. For further details, including datasheets and test results, please
refer to the project CD or the website.
A
A.1
A.2
The motion control module implements the driving controls of the AutoCart.
The MCM is controlled via a 9600-baud serial connection and accepts a six-byte
packet formatted as follows.
Table 3: MCM packet format

Content          Start of Packet  Acceleration  Brake  Steering  End of Packet
Length in Bytes  1                1             1      2         1
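Based on Table 3, a command packet can be assembled as six bytes: start,
acceleration, brake, two steering bytes, and end. The sketch below is our
illustration of this framing; the specific start/end byte values and the
steering byte order are assumptions, as the real values are defined in the
MCM firmware on the project CD.

```c
#include <stdint.h>

#define PKT_START 0x7E  /* assumed framing bytes, for illustration only */
#define PKT_END   0x7F

/* Fill a 6-byte MCM packet: start, accel, brake, steering (assumed
 * big-endian 16-bit), end. Returns the packet length in bytes. */
int mcm_build_packet(uint8_t out[6], uint8_t accel, uint8_t brake,
                     uint16_t steering) {
    out[0] = PKT_START;
    out[1] = accel;
    out[2] = brake;
    out[3] = (uint8_t)(steering >> 8);    /* steering high byte */
    out[4] = (uint8_t)(steering & 0xFF);  /* steering low byte  */
    out[5] = PKT_END;
    return 6;
}
```

The host would write these six bytes to the 9600-baud serial port; on the MCM
side, the interrupt routine buffers bytes until a complete packet (start
through end) has arrived.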
B.1 Circuit Schematic
B.2 C code
The code for the MCM implements a finite state machine (FSM) to run through
the controls of the cart. The MCM is controlled via a serial UART connection
running at 9600 baud. In addition to being serially controlled, the MCM can be
driven through a GUI in LabVIEW. Only a code flowchart is included here; for
the actual code, refer to the website or the documentation CD.
Power Distribution
The existing golf cart uses six 6 V lead-acid batteries in series to create a
36 V source. The battery string is tapped at 24 V to create a 12 V source for
the lights and horn; because of this, the chassis is grounded at 24 V with
respect to the batteries. The golf cart had two safety switches installed. One
was a pressure switch underneath the driver's seat that opened if no one was
sitting in the seat; this safety mechanism had to be removed for our system.
The other is a microswitch installed in the acceleration pedal that unlocks
the electric motor when the accelerator is slightly depressed. To overcome
this, we installed a SPDT switch that can bypass the microswitch as needed.