Proceedings of the Multi-Disciplinary Senior Design Conference Page 9
Project Number: P14215
Copyright © 2014 Rochester Institute of Technology
Autonomous Wandering Ambassador
Apurva Shah, Electrical Engineering (Project Leader)
Nick Nguyen, Electrical Engineering (Software Specialist)
Peichuan Yin, Electrical Engineering (Electrical Specialist)
Michael Gambino, Electrical Engineering (Electrical Specialist)
Abstract
The objective of this project is to modify an existing remote-controlled robot so that it can operate both autonomously and under remote control. The robot should be able to navigate autonomously without causing harm to any person or damaging any object in its path. To improve upon the previous version of the robot and make it more interactive, sonar and IR sensors will be implemented to avoid obstacles. In addition to the sensors, the robot will navigate autonomously using an RFID reader to move on its own through the third floor of the Kate Gleason Engineering Building. Successful navigation of the third floor will act as a proof of concept and will lead the way for future wandering ambassador projects.
Introduction
The Autonomous Wandering Ambassador has been a mascot for the Electrical Engineering Department at Rochester Institute of Technology. It began as a sustainability project and has been modified over the years to serve various purposes, such as plowing snow and squirting water.
The objective of this project was to add autonomous navigation to a robot that was originally remote-controlled. In doing so, the robot should have the capability to navigate through the third floor of the Kate Gleason Engineering Building without causing harm to any person or damaging any objects in its path. The robot is equipped with several types of distance-detecting sensors, a PS2 controller interface for remote-controlled access, and a remote terminal so that team members can control the robot from a remote location. The robot can also read RFID tags placed at the entrances of the rooms on the third floor. Each tag carries information the robot retrieves, which also allows the robot's location to be reported through the remote terminal.
The purpose of the project was to design new features that can be passed down to future engineering students. In addition, the work completed here served as a template for team members moving into the engineering workforce as they progress in their careers.
Process
The process of this project entailed substantial learning from the teams that previously worked on it. Several graduated seniors came in for code reviews and explained the existing boards and schematics of the old project.
As creative and innovative students at the Rochester Institute of Technology, the team created a list of customer requirements that was presented to Professor George Slack, our faculty guide. Below is a table of customer requirements that differentiates our project from the previous ones.
Table 1: Customer Requirements
The objective of creating this list was to define the set of accomplishments the team wanted to achieve as a whole. RFID was used in this project to identify classrooms and to function as a navigation system for the third floor of the engineering building. Another major part of this project was ensuring the robot could be controlled via remote control: a wireless PlayStation 2 controller sent signals to an Arduino microcontroller wired on board the robot. A further customer requirement was to investigate all the parts already existing on the robot from the previous team. The team discovered that the encoders and a few other parts were not functioning, which was a setback, but all non-functioning parts were replaced.
A table of engineering requirements was created to address the customer requirements. From a technical standpoint, a large amount of expertise in programming, machining, and hardware engineering was required, and all of it was used in creating and accomplishing this project. The biggest goals for this robot were to ensure object avoidance and to identify its current location. Object avoidance was implemented with IR sensors, sonar sensors, and other sensors such as an IMU. Location was determined in two ways: RFID tags acting as waypoints, and encoders plus an accelerometer tracking distance and direction while traveling between tags.
Table 2: Engineering Requirements
Figure 1: Functional Decomposition
Figure 2: System Block Diagram
The Functional Decomposition and the System Block Diagram of Figures 1 and 2 represent the overall flow of the project. The Functional Decomposition relates objectives with goals for completing that objective. The System Block Diagram connects hardware and software and shows how the system interfaces with the user.
The robot is equipped with six sonar sensors and eight IR sensors, calibrated according to the data shown below. Note that the raw IR values are the analog readings obtained from an Arduino microcontroller at each test distance. Both tests were performed by suspending a solid, flat object in front of the sensor at the given distance.
Table 3: Values of Sonar Calibration
Figure 3: Sonar Sensor Graph with Equations
Table 4: IR Calibration Values
Figure 4: IR Sensor Calibration Equations
The table below outlines the risk assessment devised by the team. Each team member took ownership of a risk associated with completing the wandering ambassador project.
Table 5: Risk Assessment
During the planning stages of the project, the team analyzed possible constraints while collaborating with the faculty guide. No constraints were found other than the budget of $500, which was negotiable if the team required more money to complete the project. The team also voted on how to split up the required work. Apurva Shah was named Project Lead, responsible for the overall project and for supporting any technical aspects as needed. Nick Nguyen was Hardware Specialist, in charge of hardware and circuit-related work while also supporting software. Michael Gambino was Software Specialist, in charge of software-related decisions while supporting hardware engineering. Peichuan Yin was Electrical Specialist, in charge of wiring diagrams and schematics and providing general support for the team. These roles changed during the course of development; by the end, Nick focused primarily on the software for detecting objects and controlling the robot.
As a team there was a need for collaboration on all parts of the project, and as aspiring engineers this acted as a stepping stone and learning experience for different areas of work that can be applied in real-world scenarios.
Results and discussion
Most customer requirements were met. One item that was not met involved the detection distance of the RFID reader; this problem will likely be encountered by a future team as well. The robot was successfully able to avoid running into obstacles using the sonar and IR sensors. It was originally planned to have a custom PCB that would place all eight IR sensor readings on the I2C bus line; however, this was too costly and time consuming. The alternative was to cascade the IR sensors using OR gates. The disadvantage of this method is that the specific tripped IR sensor is unknown, and the proper resistance (required for setting the threshold) must be computed for each IR sensor. Figure 5 shows the connections made using the OR gates in a cascaded format. The robot was equipped with ten IR sensors, but two were connected to an ADS1015 on the microcontroller's I2C bus line to handle stair detection. The cascaded IR sensors handled close-range object detection; they were aimed at the floor to prevent the robot from hitting objects close to the floor and out of range of the sonar sensors. If one tripped, the robot would cease movement until the object was removed.
Figure 5: Cascaded IR Sensor Pin Connections
As seen in the above figure, resistors were used as voltage dividers to calibrate the IR sensors to show 'Lo' when only the floor is detected. Unused pins were pulled down to ensure the ICs functioned as expected.
A PlayStation 2 controller was used for manual control of the robot. Below is a snapshot of the code that implements the basic functions for moving forward, backward, left, and right.
Figure 6: Arduino Code for Manual Control
Further work is required to improve the functionality of the RFID reader on the robot. The current reader can detect an RFID tag up to 4 inches away, which requires that the reader be placed on the underside of the robot to detect tags placed on the floor, leaving very little room for error. One task for future teams could be to develop an antenna that increases the RFID read range; the current implementation is a work in progress intended for test purposes. The RFID tags were intended to be the primary tool for determining the robot's current location, but there is a chance that a tag will not be detected if the robot does not reach the exact location necessary.
To extend the distance of RFID tag detection, the current antenna on the RFID reader must be replaced with a larger antenna of similar shape, a loop antenna. In addition to replacing the antenna, a new matching circuit would need to be designed. Another potential solution would be to replace the passive RFID reader and tags with active ones; the range of active RFID is typically significantly greater than that of passive RFID.
Figure 7: Processing Remote Screen
Figure 7 is a snapshot of the remote-access terminal screen used to obtain the sensor readings and to find the location of the robot. Note that the live feed is captured using a script stored on the PandaBoard, initiated by logging into the PandaBoard and issuing the "runcam" command; the live feed will be blank if "runcam" has not been initiated. The map shows the third floor of the Kate Gleason Engineering Building, which is used to test the robot's navigation and path finding. The robot avoids obstacles while navigating using RFID, sonar, and IR sensors. The encoders are used to measure distance, particularly when traveling between RFID tags. The webcam on the robot allows a remote view of what is in front of the robot. The robot acts as an ambassador, allowing users to remotely view the area around it; this would be particularly useful for viewing an area dangerous to people.
Conclusions and recommendations
Throughout this project the team was fortunate to have great support from the Electrical Engineering Department as well as help from various people and departments in the Kate Gleason College of Engineering. There was a rather large learning curve in developing the different schematics as well as in understanding what was already built and implemented on the robot. During MSD I, multiple code reviews were held just to understand the basis of connecting the PandaBoard to the server at RIT. While MSD I was mostly documentation and learning about the robot, a large amount of benchmarking was also completed: boards were tested, circuits were made, and prototyping was performed. This was all part of the process of figuring out how best to design modifications to the robot.
The objective of the robot is to navigate through the third floor of the engineering building at RIT. Although most customer requirements were met, one major goal for a future team would be to create an antenna for the RFID reader. This antenna should allow a detection range for RFID cards of one foot, permitting less exact movement to detect a tag. Rather than purchasing an active RFID system, developing an antenna from scratch would give the future team a challenge in soldering, circuit design, and PCB design, as well as an exercise in teamwork.
Given the success of this project in meeting the customer specifications, another recommendation for a future team would be to develop an application for Android or iPhone users to control the robot. In addition, due to a lack of time and the required skill sets, autonomous navigation to a desired end location was not completed; however, a good code base with AI path planning was implemented. It is highly recommended that a future team continue to improve the code and make autonomous navigation to a desired end location possible. It would also be highly encouraged to have a computer or software engineer on the team to refine and improve the code.
Acknowledgments
This project could not have been completed without the help of Professor George Slack. In addition, thank you to Celal Savur (CE) for dedicating his time and expertise to help with interfacing with the PandaBoard and creating the server and client programs. Thank you also to Jim Stefano and Ken Snyder for providing the necessary equipment, and to the Electrical Engineering Department for funding this project.