Elcano Student Projects
Tyler C. Folsom, PhD, PE
Affiliate Professor, University of Washington, Bothell
Project Manager, QUEST Integrated, Inc., Kent, WA
The Elcano project is an attempt to build a low-cost, highly energy-efficient, self-driving electric tricycle that can form the backbone of a 21st-century urban transportation system. The mechanical, electrical, and software components of a first-iteration prototype have been assembled, but much work remains. The AI systems can be worked on in parallel without access to the vehicle. Open-source code is maintained at https://github.com/elcano/elcano. All code is in the Arduino dialect of C++. A broader discussion of the technology is available at http://www.qi2.com/index.php/transportation.
Project 1. Complete mechanical and electrical configuration.
Vehicle #1 is drive-by-wire: a joystick and microcontroller implement throttle, brakes, and steering. The actuators and batteries need to be mounted and wired on Vehicle #2. The joystick for Vehicle #2 will be remote and needs to be connected. Electrical design work is also needed to implement the emergency stop and to produce the next iteration of the low-level printed circuit board.
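As an illustration of the drive-by-wire idea, the sketch below maps a raw joystick reading to a signed actuator command. The 0..1023 input range matches analogRead() on an Arduino; the dead-band width and the -100..+100 output range are placeholder values, not measurements from the vehicle.

```cpp
#include <cassert>
#include <cstdlib>

// Map a raw joystick ADC reading (0..1023, center ~512) to a signed
// command in the range -100..+100, with a small dead band around center.
// Dead-band width and output scale are illustrative values only.
int joystickToCommand(int raw) {
    const int CENTER = 512;
    const int DEAD_BAND = 20;            // ignore jitter near center
    int offset = raw - CENTER;
    if (offset > -DEAD_BAND && offset < DEAD_BAND) return 0;
    long span = 1023 / 2 - DEAD_BAND;    // usable travel on each side
    long cmd = (long)(offset > 0 ? offset - DEAD_BAND
                                 : offset + DEAD_BAND) * 100 / span;
    if (cmd > 100) cmd = 100;
    if (cmd < -100) cmd = -100;
    return (int)cmd;
}
```

In a real sketch the result would be rescaled to whatever PWM or servo range the throttle, brake, and steering actuators expect.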
Project 2. Software.
Complete some or all of steps 4 through 10 on http://www.elcanoproject.org/tutorial/; see that page for details of the tasks.
Project 3. Sensor fusion for localization. [Step 8]
Elcano uses an Arduino Mega microcontroller to obtain a real-time fix on the current location. The implemented code uses GPS and a Kalman filter, but the result has limited accuracy. The system should be expanded to use wheel odometry, visual odometry, an inertial measurement unit, and dead reckoning, as described in http://www.enviroteach.com/LandNavigation.pdf. The Kalman filter will perform the sensor fusion. All sensors are currently available.
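To show the fusion idea in its simplest form, here is a scalar Kalman filter along one axis: the odometry displacement drives the predict step and the GPS reading drives the update step. The variances are illustrative, not tuned values from the vehicle, and the real filter would track at least 2-D position, heading, and speed.

```cpp
#include <cassert>

// Scalar (1-D) Kalman filter: odometry supplies the predict step,
// GPS supplies the update step. All variances are illustrative.
struct Kalman1D {
    double x;  // position estimate (m)
    double p;  // estimate variance (m^2)

    // Predict: advance by odometry displacement d, adding process variance q.
    void predict(double d, double q) {
        x += d;
        p += q;
    }
    // Update: blend in GPS measurement z with measurement variance r.
    void update(double z, double r) {
        double k = p / (p + r);   // Kalman gain: trust in the measurement
        x += k * (z - x);
        p *= (1.0 - k);
    }
};
```

With a large initial variance the first GPS update dominates; as odometry and GPS agree over time, the estimate variance shrinks below that of either sensor alone.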
Project 4. Map reading. [Expansion of Step 9]
Elcano performs continuous path planning, using the A* algorithm on an Arduino Duemilanove. The possible paths consist of a road network, with decision points only at the intersections. This restricts the scope of the search so that it can run in real time on limited power. At present, the road map is hardcoded. This task is to build a PC-based system that will let a user select the intended operating region, download all latitudes and longitudes, select appropriate information, and format it properly. The result will be written as a C++ header file, which can then be compiled for the current mission.
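The generated header might look something like the toy map below. The struct layout, field names, and millimeter units are hypothetical; the real planner's data structures may differ.

```cpp
#include <cassert>

// Hypothetical layout for the generated road-map header. A junction is a
// decision point (intersection); neighbors index into the same array.
struct Junction {
    long east_mm;     // position east of the map origin, in millimeters
    long north_mm;    // position north of the map origin, in millimeters
    int neighbor[4];  // indices of connected junctions; -1 = unused slot
};

// A toy map: three junctions along one road segment.
const Junction RoadMap[] = {
    {      0,      0, { 1, -1, -1, -1 } },
    {  50000,      0, { 0,  2, -1, -1 } },
    { 100000,  20000, { 1, -1, -1, -1 } },
};
const int MAP_POINTS = sizeof(RoadMap) / sizeof(RoadMap[0]);
```

Keeping the map as a const array in a header lets the microcontroller store it in flash and lets A* expand only the handful of neighbors at each junction.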
Project 5. Vision system: detect cones. [Step 11]
The vehicle satisfies the weight and size limits for the Seattle Robotics Society’s Robo-Magellan contest: http://www.robothon.org/robothon/robo-magellan.php. The contest requires navigating between several points marked by orange cones. Elcano presently has no vision system. The first stage is a PC-based demonstration. The suggested approach and sample images are at http://sourceforge.net/p/urbanchallenge/code/111/tree/Elcano/Vision/.
The second stage would be selection of an embedded computer and camera, and porting the code to the embedded platform. All code must be self-contained, since there will be no operating system on the embedded computer. If development is done in OpenCV, the selected algorithm can be extracted as stand-alone C++ source code and the data structures simplified.
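As a sense of what self-contained, OS-free cone detection can look like, the fragment below classifies pixels as "cone orange" with a fixed RGB threshold. The threshold values are rough guesses; real values would be calibrated against the Robo-Magellan sample images.

```cpp
#include <cassert>

// Classify one RGB pixel as "cone orange" using fixed thresholds.
// Threshold values are illustrative guesses, not calibrated constants.
bool isConeOrange(unsigned char r, unsigned char g, unsigned char b) {
    return r > 150 && g > 40 && g < 140 && b < 100
        && r > g + 50 && g > b;
}

// Count candidate pixels in a packed RGB image (3 bytes per pixel).
// A real detector would then group candidates into connected blobs.
int countConePixels(const unsigned char *rgb, int width, int height) {
    int count = 0;
    for (int i = 0; i < width * height; ++i) {
        const unsigned char *px = rgb + 3 * i;
        if (isConeOrange(px[0], px[1], px[2])) ++count;
    }
    return count;
}
```

Nothing here depends on OpenCV or an operating system, which is the property the embedded port needs.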
Project 6. Vision system: detect lane markings and edges. [Step 12]
This project has some similarities to Project 5, but the application is visual detection of lanes. The current vehicle position and attitude are known, and a map is available. Time of day and geographical position are known, so shadows can be predicted. Thus the expected scene is known, and can be correlated with the sensed visual input. The lane position will eventually be part of the input to localization in Project 3. Lane detection should work under any lighting and weather conditions.
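A minimal starting point, well short of the full predicted-scene correlation described above, is to find the strongest intensity step in a grayscale scan line, since painted lane markings usually stand out against asphalt. This is a sketch of one possible first step, not the project's prescribed method.

```cpp
#include <cassert>
#include <cstdlib>

// Return the column of the largest absolute difference between adjacent
// pixels in one grayscale scan line, or -1 if the row is too short.
// A crude stand-in for lane-edge detection on a single image row.
int strongestEdge(const unsigned char *row, int width) {
    if (width < 2) return -1;
    int best = 1, bestMag = 0;
    for (int x = 1; x < width; ++x) {
        int mag = abs((int)row[x] - (int)row[x - 1]);
        if (mag > bestMag) { bestMag = mag; best = x; }
    }
    return best;
}
```

The map and predicted shadows would then be used to decide which rows to scan and which of several candidate edges is actually the lane boundary.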
Project 7. Configure simulator.
Elcano is designed to be compatible with the USARsim simulator (http://sourceforge.net/projects/usarsim/), which is a mod to the Unreal Tournament game. In both the vehicle and the simulator, the link from the AI to the low level control is a text string giving wheel rotation rate and steering angle. The most important configuration is to port Elcano’s AI code (running on three Arduinos) to a PC. The ideal method is to create a PC system that can compile Arduino code. USARsim has at least one road-based environment equipped with instrumented cars. When this is downloaded, installed, and configured, it will be possible to use the PC version of the Elcano AI for efficient testing of the vehicle. In the simulator, execution of the AI produces real-time graphics of the vehicle’s behavior, which is controlled by simulated sensors and actuators.
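Since the AI-to-low-level link is a text string carrying wheel rotation rate and steering angle, a ported PC build needs only to format and parse that string. The "DRIVE speed angle" layout below is a placeholder to show the shape of the interface; the actual Elcano/USARsim wire format is not reproduced here.

```cpp
#include <cassert>
#include <cstdio>
#include <cstring>

// Format a drive command as a text string. "DRIVE {rev/s} {degrees}" is a
// hypothetical layout standing in for the real Elcano/USARsim format.
int formatDriveCommand(char *buf, size_t len,
                       double wheel_rev_per_s, double steer_deg) {
    return snprintf(buf, len, "DRIVE %.2f %.2f",
                    wheel_rev_per_s, steer_deg);
}
```

Because both the vehicle's low-level board and the simulator consume the same string, the AI code cannot tell whether it is driving hardware or Unreal Tournament, which is what makes simulator-based testing faithful.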
A secondary configuration is to create the graphics and physics of a simulated Elcano vehicle.