This document summarizes a team-based project to design an autonomous robot at California Polytechnic State University. It discusses the hardware and software design of the robot, called C-P30, which is intended to navigate the campus. The project provides undergraduate computer engineering students experience with interdisciplinary systems by assigning them to subsystems of the robot. It uses a motorized wheelchair frame and batteries, and interfaces additional hardware including sensors, motors, and a microcontroller through modular circuit boards. The overall goal is to give students practical engineering experience through a challenging design project.
Team-based Project Design of an Autonomous Robot
Thomas J. Norrie and John S. Seng
Dept. of Computer Science, California Polytechnic State University, San Luis Obispo, CA 93407
{tnorrie, jseng}@calpoly.edu
Abstract

In this paper, we discuss the design and engineering of the C-P30, a custom robot designed at Cal Poly State University, San Luis Obispo. The robot is designed by undergraduate computer engineering students at the university. The project is intended to be continually developed as students enter and leave it. In addition, working on the project allows the students to fulfill a university senior design experience requirement. As this project is a continual work-in-progress, this paper outlines the current state of the hardware and software design.

This paper covers the technical aspects of the robot design, as well as the educational objectives that are achieved.

1. Introduction

Robots serve as effective educational tools for teaching engineering in the university. Because robot design requires the interfacing of software, computer hardware, and mechanical systems, robot design projects often challenge students by presenting problems beyond what they typically encounter in the classroom. The multidisciplinary nature of the field of robotics often simulates the complexity of projects students will encounter when they leave school and enter the workforce.

In addition to the experience the students obtain in areas outside of computer hardware and software engineering, team projects also mimic the development model of industry. It is important for undergraduate students to obtain real experience in managing team dynamics, defining hardware/software interfaces, and developing teamwork and communication skills.

As a means of providing students with an opportunity to apply their engineering education in a practical setting, we developed the C-P30 project at Cal Poly. The overarching goal for the project is to develop a robot system that is capable of navigating areas of our campus. The project is developed and maintained by a team of undergraduate computer engineering students. Creating such an opportunity allows these engineering students to work on a large project that involves mechanical, electrical, and software systems. Each student is assigned a particular subsystem of the robot and is required to develop that subsystem and define the interfaces necessary to connect it with the subsystems of the other students.

This paper outlines the design, construction, and management of this student-built robot. The project was started as an on-going project for the computer engineering program. At Cal Poly, seniors are required to complete a senior project that lasts two quarters (six months). The purpose of the robot project is two-fold: to provide students with an additional means of fulfilling the senior project course requirement and to provide students with a project which they find motivating and challenging.

In this paper, we document the hardware and software technologies we utilize. As this is a continual work-in-progress, all systems are described as implemented at the time of this writing.

Because Cal Poly State University is primarily a teaching-focused institution, it is difficult for us to assemble the manpower and financial budget to compete in larger competitions such as the DARPA Grand Challenge [1]. We view this endeavor as a challenging intermediate project that provides a similar educational experience for the students at a much lower cost. The challenges in building the robot include interfacing with low-level electronics, vision algorithms, control algorithms, and higher-level planning and decision making.

This paper is organized as follows: Section 2 outlines work that is related to this project. Section 3 outlines the hardware components used in the construction of the robot. Section 4 describes the software architecture and tools used for this project. Section 5 outlines the budget for this project. Section 6 discusses the project management tools used in our development. Section 7 covers the technical and teamwork learning experience. Section 8 describes future work for this project. Section 9 concludes the paper.
2. Related Work
This section briefly describes projects which are related to this work. Related projects cover a wide spectrum of designs that range from individual student projects to larger robots such as contestants in the most recent DARPA Grand Challenge [1]. In our project, we focus on providing this project as an experimental platform for undergraduate projects and research.

Our robot is similar in size to the robots provided by the DARPA-funded Learning Applied to Ground Robots [2] program. In this grant program, the robots are provided by the funding agency and used in periodic competitions to test vision and control algorithms when driving over rural terrain. The focus of the program is to develop robots which can function on unstructured off-road terrain using only passive sensors (i.e., no LIDAR or sonar).

The robot described by Ulrich et al. [3] is a motorized wheelchair platform with a single camera mounted on-board. The wheelchair navigates using a monocular color vision algorithm. We have experimented thus far with similar algorithms based on the work described by Ulrich et al.

3. Hardware Architecture

This section outlines the hardware components used in the construction of the robot. We cover the hardware systems that are currently present.

Figure 1 provides an overview of the hardware systems present on the robot. The hardware subsystems are designed to be as modular as possible to allow for ease of part replacement and also to ease the task of subdividing projects among students.

Figure 1. This is a block diagram of the hardware architecture. The primary controller is based on an Intel Core 2 Duo motherboard. The sensor and actuator interface consists of two PolyBot boards along with an IBC dual motor controller which controls two DC permanent magnet motors.

For this project, we chose to use a frame and drive system from a motorized wheelchair. Selecting a stable and pre-built drive system advances the project so students can work on subsystems much earlier. A stock motorized wheelchair comes with a frame, motors, batteries, and a motor controller. For this project, we opted to use the frame, motors, and batteries, while substituting the standard motor controller with one that we built ourselves. We keep the original connector in place so that we can reinstall the original motor controller when necessary.

The chassis for the robot is a motorized wheelchair (Pride Mobility Jet 1). The wheelchair is powered by two 12V 55AH lead-acid batteries wired in series to provide 24V, which give the wheelchair a range of 25 miles. The frame of the chassis is constructed from tubular steel and is very strong. The entire chassis, including the batteries, weighs approximately 100 lbs. A picture of the wheelchair can be seen in Figure 3.

Figure 3. This is a picture of the robot chassis. The housing has been removed and the base mounting plate is shown.

The motors are bi-directional DC permanent magnet motors with an integrated disk brake. The actuation of the brake is performed using a 24V solenoid. We use a p-type MOSFET, controlled by one of the I/O pins on the micro-controller, to activate and deactivate the brake.

The connection from the original motor controller to the wheelchair consists of a 9-pin Beau plug. We interface an IBC dual motor controller [4] through this plug. This motor controller was originally designed for BattleBot-type robots and is designed to control two permanent magnet DC motors. The motor controller was designed to interface with the receiver from an R/C radio setup. Fortunately, the PolyBot board provides servo control outputs, and three of these outputs on the PolyBot board are connected to the motor controller.

We use a 12V automotive-style relay to turn on power to the entire system. This relay is energized when the main power switch is toggled.

The main controller board is supplied by Intel Corp. and is a reference laptop motherboard design. This board features an Intel Core 2 Duo (T7400) running at 2.16 GHz. This controller board is used as the core computational unit of the robot.

Two PolyBot boards are used as secondary controller boards [5]. One of the boards is used to read sensor information; the other is used to perform motor control.

Figure 2. There are two PolyBot boards present on the current robot. One is used to manage the task of reading the sonar and infrared sensors. The other is used to generate control signals for the motor controller.

One primary goal in the arrangement of our hardware was to have all sensor input and motor output run through a single USB connector. Our PolyBot boards use a USB-serial interface which emulates the RS-232 interface over the USB bus. The GPS receiver we use is USB-based, as is our camera. By running all of these devices into a single hub, we can easily switch out laptops for debugging.

For our vision system, we use a single color web cam. In addition to using vision, we have a number of active ranging devices. Currently we are using both sonar sensors and infrared sensors for ranging.

4. Software Architecture

This section describes the approach we take in developing a software architecture for the robot. In addition, we describe the technical details of the software design and discuss the trade-offs we consider.

The philosophy behind our software design is to implement the higher-level 'glue' code in Python and to implement the functionality requiring high performance in C. The Python code is used to connect a number of system-level modules, such as the vision module, the PolyBot board interfaces, and the GPS module. Additionally, Python is used to provide a GUI for debugging purposes.

Figure 4. A block diagram of the software architecture along with the language used to implement each module. The vision code is written in C and uses the OpenCV library. Lower-level sensor interface code is also in C, while higher-level code is written in Python. (The diagram shows a client application (wxPython) communicating with a Google Maps server over the Internet, and a network communicator (Python) connected via a Unix socket to the GPS module (C), vision processing (C), planning and control (Python), the PolyBot sensor interface (Python + pySerial), and the PolyBot motor interface (Python + pySerial).)

Writing some of the code in Python provides the advantage of simplifying the implementation due to the high-level constructs available in the language. This allows for faster modifications to the software architecture without the need to worry about things such as memory management. Additionally, using multiple languages (Python and C) gives students exposure to each language and to the details of interfacing cross-language code. The performance penalty of using Python versus C is minimal because most of the Python code interfaces with I/O-bound systems and is not computationally intensive. In comparison with other high-level scripting languages, Python is a natural fit due to its large standard library (allowing easy access to serial interfaces and the like) and its ability to interface cleanly with C.

In order to improve performance, we include the Psyco module [6]. This module dynamically compiles Python bytecode into native code. For our current implementation, the Psyco module does not seem to provide much added performance.

The vision code is written in C and uses the OpenCV library extensively [7]. We have found this library to be highly robust and practical for our purposes. We are currently testing a ground detection algorithm with our monocular vision system. Thus far, we are using algorithms similar to the work by Ulrich et al. [3] and Dahlkamp et al. [8].

The high-level control of the robot is designed to be modular to allow for the testing of a variety of path-planning and control algorithms. Currently, we are implementing a system that will perform path-planning based on GPS data until an obstacle is detected. At that point, the system will enter a reactive mode to safely navigate around the obstacle. Our current low-level control algorithm is a PID controller.

The code running on the PolyBot boards is written in C and uses the PolyBot board library extensively. One PolyBot board is used to read the sonar and infrared ranging sensors. The sonar modules are connected over an I2C bus. The IR ranging sensors are connected through an ADC. The other PolyBot board is used as an interface with our motor controller and also monitors on-board voltages.

The GUI code is written in wxPython [9]. wxPython provides cross-platform Python bindings to the wxWidgets [10] toolkit. The GUI provides a visual display of the sensor inputs and robot outputs.

The underlying operating system on the primary controller board is Linux. Our code is cross-platform because it is written in standard C and Python.

5. Budget

In this section we discuss the budget for the project. Cost was a major concern because the project was not supported by a grant. The project thus far has been funded by the university as well as corporate sponsors.

In order to minimize costs, the wheelchair was purchased used; the remainder of the parts for the robot were purchased new. In addition, we found that using components that have a built-in USB interface not only simplifies interconnection, but also reduces cost. The GPS module and 802.11b module both connect to the USB bus.

Item               Cost
Chassis            $700
Motherboard/CPU    $1000
PolyBot boards     $120
Motor controller   $250
Camera             $50
Sonar              $40
IR                 $50
GPS                $75
Wireless module    $50
Total              $2335

6. Project Management

Software development is managed using the open-source program Trac [11]. Trac provides an excellent front-end to our Subversion source code repository. By using Trac, we are able to view source code changes in a graphical manner, which greatly simplifies tracking modifications. In addition, Trac provides a wiki environment which is very useful for documenting the project.

7. Experience

This project was initiated by a faculty adviser and currently involves six undergraduate students. The undergraduate students are all computer engineering majors with senior-level experience. Each student is responsible for a particular subsystem of the robot: two students work on the robot chassis, two students work on the vision system and high-level control, one student works on sensor data acquisition, and one student works on a web interface to the robot.

From a technical standpoint, the project has greatly benefited the students. There are a number of computer engineering topics which the students encounter when working on the robot. In terms of software experience and exposure, the students have gained experience with developing embedded software, control algorithms, vision algorithms, and multi-language interfacing. In terms of hardware, the project has required circuit analysis, power analysis, mechanical design, and modular design.

In addition to the development of technical skills, the students were also exposed to issues faced by industrial engineering teams. The students were required to define their own software and hardware interfaces as well as document those interfaces.

Using the Trac program has greatly alleviated the challenges of managing a large software project with multiple developers. Through the wiki-based program, the students are able to share information regarding the subsystem interfaces, as well as share information via comments in the source code. This kind of flexibility and ease of information sharing has greatly benefited the project.

As this is the initial group of students working on the project, it is unknown how long it will take the next group of students to understand the previous group's work. Previous work will be documented in the Trac program and in the source code. This project will be constantly evolving as students enter and leave the project team.

8. Future Work

There are a number of areas where our current implementation can be improved. Because this project is on-going, we intend to continually add improvements.

One current limitation of our robot is that it uses a single camera for vision. Monocular vision has advantages such as lower computational requirements, but stereo vision can provide much more information. In the future, we would like to add stereo vision to the robot. Adding a second camera will provide the potential for disparity analysis.

The motherboard we are currently using has a dual-core CPU, but our code is not currently optimized for multiple CPUs. Currently, we plan on executing the vision algorithms on one core while executing control algorithms on the other. In the future, it would be beneficial to properly distribute the workload across the cores after analyzing the computation requirements of each thread of execution.

One area we have not explored is the human-robot interface. Eventually, we would like the robot to interact with people, and thus the robot would need a simple, user-friendly interface. This interface could be as simple as some LEDs and push-buttons or as complex as voice recognition and speech generation.

9. Conclusion

This paper describes the hardware and software design of a robot along with the pedagogical experience obtained through the design project. The project was developed by six undergraduate students. The students have found the project simultaneously challenging and motivating. A strength of this project is its flexibility in providing multiple generations of students with a wide range of engineering projects.

The project goal is to develop a robot that can autonomously navigate the Cal Poly campus. The robot will do this via sonar, infrared, and vision sensors. The primary controller for the robot is an Intel Core 2 Duo motherboard which is connected to PolyBot boards. The PolyBot boards perform low-level sensor interfacing and motor control interfacing.

10. Acknowledgments

We would like to thank Intel for donating the primary controller board and Raytheon for funding the motor controller. We would also like to thank the Computer Engineering program at Cal Poly for funding the purchase of the motorized wheelchair.

References

[3] I. Ulrich and I. Nourbakhsh, "Appearance-Based Obstacle Detection with Monocular Color Vision," in The AAAI National Conference on Artificial Intelligence, July 2000.

[4] "IBC Dual Motor Controller Manual." http://robotcombat.com/marketplace robowars.html.

[5] J. Seng, "PolyBot Board: a Controller Board for Robotics Applications and Education," in The 2006 International Conference on Frontiers in Education: Computer Science and Computer Engineering, June 2006.

[6] A. Rigo, "Representation-Based Just-In-Time Specialization and the Prototype for Python," in Symposium on Partial Evaluation and Program Manipulation, Aug. 2004.

[7] "Open Source Computer Vision Library." http://www.intel.com/technology/computing/opencv/index.htm.

[8] H. Dahlkamp, A. Kaehler, D. Stavens, S. Thrun, and G. Bradski, "Self-supervised Monocular Road Detection in Desert Terrain," in Robotics: Science and Systems, Aug. 2006.

[9] "wxPython." http://www.wxpython.org.

[10] "wxWidgets - Cross-Platform GUI Library." http://www.wxwidgets.org.

[11] "The Trac Project." http://trac.edgewall.org/.
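The appearance-based approach of Ulrich et al. classifies pixels as traversable ground when their color falls into histogram bins that were common in a known-safe reference area in front of the robot. The following pure-Python sketch illustrates only the core idea; the bin count, threshold, and hue-only simplification are our illustrative assumptions, not the project's actual vision code.

```python
def learn_reference_histogram(reference_hues, bins=16):
    """Histogram the hue values (0-255) of a known-safe area in front of the robot."""
    hist = [0] * bins
    for hue in reference_hues:
        hist[hue * bins // 256] += 1
    return hist

def is_ground(hue, hist, threshold, bins=16):
    """Classify a pixel as ground if its hue bin was common in the reference area."""
    return hist[hue * bins // 256] >= threshold
```

In the full algorithm this comparison runs per frame over an HSV image, and pixels whose bins fall below the threshold are flagged as potential obstacles.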
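Because the PolyBot boards appear to the primary controller as USB-serial devices, the Python side can talk to them with pySerial. A sensor-polling wrapper of the kind shown in Figure 4 might look like the sketch below; the port name, baud rate, and one-character ASCII request protocol are invented for illustration and are not the board's documented firmware protocol.

```python
class PolyBotSensorInterface:
    """Polls a PolyBot board over its USB-serial link.

    The port name, baud rate, and ASCII request/response protocol here
    are illustrative assumptions, not the actual PolyBot firmware protocol.
    """

    def __init__(self, port="/dev/ttyUSB0", baud=115200):
        import serial  # pySerial; imported lazily so parsing is testable without hardware
        self.link = serial.Serial(port, baud, timeout=0.1)

    def read_ranges(self):
        # Hypothetical protocol: send 'R', receive one comma-separated line of readings.
        self.link.write(b"R\n")
        line = self.link.readline().decode("ascii")
        return self.parse_ranges(line)

    @staticmethod
    def parse_ranges(line):
        # "118,92,47,33" -> [118, 92, 47, 33] (raw sonar/IR readings)
        return [int(field) for field in line.strip().split(",") if field]
```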
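The paper does not state which mechanism binds the Python glue code to the C modules; the standard library's `ctypes` is one common option. Purely for a self-contained illustration, the sketch below loads the system C math library rather than one of the robot's own C modules.

```python
import ctypes
import ctypes.util

# Load the system C math library as a stand-in for a project-specific C module.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# Declare the C signature so ctypes converts arguments and results correctly.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

def c_sqrt(x):
    """Call the C library's sqrt() from Python through ctypes."""
    return libm.sqrt(x)
```

A project-specific vision or sensor routine would be exposed the same way: compile the C code into a shared library, load it with `CDLL`, and declare each function's argument and result types before calling it.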
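The low-level PID controller mentioned in Section 4 can be sketched in a few lines of Python; the class name, gains, and wheel-speed usage below are illustrative assumptions, not the robot's tuned parameters.

```python
class PID:
    """Textbook discrete PID controller (gains here are illustrative, not the robot's)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt                  # accumulate I term
        derivative = (error - self.prev_error) / self.dt  # finite-difference D term
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Each control tick, something like `command = pid.update(target_speed, measured_speed)` would then feed the motor interface.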