Friday, December 24, 2010

Progress!

It's been an extremely productive couple of weeks!

Most of the Robobuggy's hardware is complete, but we're still waiting on interface boards in the mail. Luckily, I have it set up so that you can simply plug each component in where it's needed and it'll be ready to go!

We've actually been road testing the Robobuggy. We walked out onto Forbes and Morewood at around 2 AM EST to test the camera's vision in the dark. People were definitely confused, but the testing went extremely well. Nate's algorithms responded well, even with little light.

In addition, we actually pushed the Robobuggy down the free roll, with Nate and me close behind it. We gained some valuable sensor data that Nate was able to use to test the algorithms.

After going through at least six sensors, tonight we finally got a hall effect sensor tested and mounted, and we're ready to play in traffic tomorrow. This time we'll be around Maggie Mo, to test the different types of street lines and the camera's response to them.
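For the curious, the hall effect sensor really just gives us wheel speed: count pulses, divide by time, multiply by the wheel circumference. Here's a minimal sketch of that math; the one-pulse-per-revolution and 0.5 m wheel numbers are placeholders, not our actual calibration:

```python
# Minimal sketch: turn hall effect pulses from the front wheel into speed.
# One pulse per revolution and a 0.5 m wheel are placeholder values,
# not RoboBuggy's actual calibration.
import math

PULSES_PER_REV = 1
WHEEL_DIAMETER_M = 0.5
WHEEL_CIRCUMFERENCE_M = math.pi * WHEEL_DIAMETER_M


def speed_from_pulses(pulse_count, elapsed_s):
    """Return speed in m/s given a pulse count over an elapsed time."""
    if elapsed_s <= 0:
        return 0.0
    revolutions = pulse_count / float(PULSES_PER_REV)
    return revolutions * WHEEL_CIRCUMFERENCE_M / elapsed_s


# Example: 10 pulses in 2 seconds on a 0.5 m wheel is roughly 7.9 m/s.
print(speed_from_pulses(10, 2.0))
```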

AEPi's an absolute mess, but we'll be cleaning that up tomorrow as well. In addition, here's a snap from our sketchy night run:



Also, we'd love to hear from the buggy community! Leave us feedback.

-Alex

Friday, December 17, 2010

Winter Break Update

Nate and I have been working until around 4 AM every day this week on Robobuggy. It's been pretty stressful, but it's still a lot of fun. We've accomplished a lot so far, but we still have a ways to go.

A few nights ago, we took the buggy out on Forbes and on Morewood at around 2AM. Nate needed to gather data to ensure his algorithms were robust enough to work in the dark as well. There weren't many cars on the road, but the drivers that did see us pushing a buggy in the middle of the street must have been extremely confused. The drivers familiar with the area probably realized that this was typical CMU behavior.

The hardware side of Robobuggy is fairly small. One main computer sits right in the middle of Robobuggy and controls all of its peripherals. A Phidgets I/O board hangs off the main computer and controls and reads data from the hardware components in parallel. The steering encoder runs through the I/O board, as does the hall effect sensor on the front wheel. In addition to the I/O board, there is a servo controller that also plugs into the main computer via USB.
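To make that layout concrete, here's a rough sketch of how the main computer's loop might tie those pieces together. The wrapper classes below are hypothetical stand-ins, not the real Phidgets or servo controller APIs, and the planner is just Nate's algorithm treated as a black box:

```python
# Rough sketch of the main computer's control loop. The wrapper classes are
# hypothetical stand-ins for the I/O board and servo controller interfaces,
# NOT the actual Phidgets or servo controller APIs.
import time


class SteeringEncoder(object):
    """Hypothetical wrapper over the encoder channel on the I/O board."""
    def read_angle(self):
        raise NotImplementedError  # would return the steering angle in degrees


class HallSensor(object):
    """Hypothetical wrapper over the front-wheel hall effect sensor."""
    def read_speed(self):
        raise NotImplementedError  # would return wheel speed in m/s


class ServoController(object):
    """Hypothetical wrapper over the USB servo controller."""
    def set_steering(self, angle_deg):
        raise NotImplementedError  # would command the steering servo


def control_loop(encoder, hall, servo, planner, period_s=0.05):
    """Poll the sensors, ask the planner for a steering command, send it out."""
    while True:
        angle = encoder.read_angle()
        speed = hall.read_speed()
        command = planner(angle, speed)  # Nate's algorithm, treated as a black box
        servo.set_steering(command)
        time.sleep(period_s)
```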

Although we realize the analog camera from the old Robobuggy is still very good, it's difficult to interface with our new computer brains. So instead, in true Nate Barshay image processing style, we replaced that camera with a PlayStation Eye (the newer, PS3 version of the EyeToy). I could go into specifics of why we chose that camera, but the bottom line is that it was made for image processing. In addition, Sony prices it to push games rather than the camera itself, so it's pretty cheap for its value.
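For a sense of how we'll pull frames off it: on Ubuntu the Eye should show up as an ordinary webcam, so something like the OpenCV sketch below ought to work. The device index and resolution here are assumptions, not our final settings:

```python
# Minimal sketch: grab frames from the PlayStation Eye with OpenCV.
# Assumes the camera shows up as a standard webcam at device index 0;
# the index and resolution are assumptions, adjust as needed.
import cv2

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # ...hand `gray` off to the vision code here...
    cv2.imshow("eye", gray)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```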

Two relays will control the braking valves. The relays are attached almost directly to the RC receiver, so we don't have to run the brake through the computer. In addition, the Robobuggy is set up so that we can literally plug the steering servo into the RC receiver and steer it by remote control. We may do this in order to gather data more effectively.

Three 6V sealed lead acid batteries will power the Robobuggy. I built an enclosure for them last night.

Some setbacks we've had:

We thought we could get away with building our own simple circuit for the quadrature decoder. It turns out it's not that simple, and we need to buy specialized hardware to make it work (see the sketch after this list for what the decoder actually has to do).

The hall effect sensor broke. I got new ones; all I have to do is attach them.
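On the quadrature decoder: the encoder outputs two square waves (A and B) offset by 90 degrees, and the decoder has to catch every single transition to keep position and direction straight. Here's a small sketch of the decoding logic; the catch is that missing even a few transitions at wheel speed corrupts the count, which is why dedicated decoder hardware exists:

```python
# Sketch of quadrature decoding: track position from the A/B channel states.
# States are encoded as 2-bit numbers: (A << 1) | B. The table maps
# (previous state, current state) -> count change; anything else is a miss.
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}


def decode(samples):
    """Accumulate an encoder count from a sequence of (A, B) samples."""
    position = 0
    prev = None
    for a, b in samples:
        state = (a << 1) | b
        if prev is not None and prev != state:
            position += TRANSITIONS.get((prev, state), 0)  # 0 = skipped/invalid step
        prev = state
    return position


# Example: one full forward cycle of the encoder advances the count by 4.
print(decode([(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]))  # -> 4
```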

Nate's been chugging away at the code, and in our next update I'll provide a more detailed view of how the algorithms might work. Nate's having a bit of trouble with the methods being too slow, and he's working to fix that. In addition, I'll try to provide some stills from our sketchy night run.

Until next time,

Alex

Sunday, December 12, 2010

The Next Two Weeks.....

As of right now, Nate has a pretty clear idea of how the navigation algorithm is going to work. It's essentially an image processing algorithm coupled with navigation algorithms based on position, time, and velocity, where the two play off each other depending on which data is readily available. The buggy will rely on clues from its previous position and velocity, and make a guess as to where it is with a given confidence. If there is a white line visible in the road, the confidence will increase. The buggy will compare its projected position on the road with a "map" of the buggy course already in its brain. We predict that map will be represented as a string of GPS coordinates that trace a line through the track.
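Here's a very rough sketch of what that "dead-reckon, then correct with vision" loop could look like in code. The correction gain, the confidence rules, and the structure are hypothetical illustrations of the idea, not Nate's actual algorithm:

```python
# Hypothetical sketch of the "predict, then correct with vision" idea.
# The 0.5 correction gain and the confidence rules are illustrative only,
# not Nate's actual algorithm.
import math


def predict(position, heading, speed, dt):
    """Dead-reckon the next position from the previous position, heading, and speed."""
    x, y = position
    return (x + speed * dt * math.cos(heading),
            y + speed * dt * math.sin(heading))


def update_with_vision(estimate, confidence, line_offset):
    """If the camera reports a white-line offset, nudge the estimate toward it
    and raise the confidence; with no line, let the confidence decay."""
    if line_offset is None:
        return estimate, confidence * 0.95
    x, y = estimate
    corrected = (x - 0.5 * line_offset, y)  # pull laterally toward the line
    return corrected, min(1.0, confidence + 0.1)


# Each cycle: predict from the last state, correct with vision when available,
# then compare the estimate against the stored GPS "map" of the course.
```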

I have been working on the hardware and getting it all to interface together. There are circuit boards to print and components to solder, so I definitely have my work cut out for me. Right now I'm working on setting up the emergency brake to stop the buggy with an RC transmitter and integrating an I/O board to interact with the computer.

Nate and I will be in Pittsburgh working on Robobuggy until Christmas day. If you're in Pittsburgh - Stop by! We're at the corner house on Forbes and Morewood, the AEPi house.

-Alex

Friday, November 19, 2010

And we're off!

The SURG committee approved the RoboBuggy grant I wrote, so now we have $1000 to spend!

We have been busy making system diagrams of all the components we need, and we have begun formulating a list of various other electrical components, in addition to setting up our Zotac motherboard.

Here is the system diagram for the entire RoboBuggy:



The only change is that the power supply now runs at around 18 volts, as we bought three 6V 5Ah SLA batteries. The voltage regulator board will be adjusted accordingly.

In addition, an R/C controller will be used as an emergency stop for the RoboBuggy if it goes off the course. The circuit for that setup is below:


Ubuntu is now installed on the motherboard! The GUI will be removed, and startup scripts will be implemented in order for it to function reliably. We will be accessing this computer via SSH in the meantime.
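Since the box will run headless, the startup script mostly just needs to launch the control software and bring it back if it ever dies. Here's a minimal sketch of that idea; the robobuggy_main.py name is a placeholder, and the real thing will probably just be wired into Ubuntu's normal init scripts:

```python
# Minimal supervisor sketch: launch the control process at boot and restart it
# if it ever exits. "robobuggy_main.py" is a placeholder name, not a real file.
import subprocess
import time


def supervise(command, restart_delay_s=2.0):
    """Run `command`, and relaunch it after a short pause whenever it exits."""
    while True:
        process = subprocess.Popen(command)
        process.wait()                # block until the control process dies
        time.sleep(restart_delay_s)   # brief pause, then relaunch


if __name__ == "__main__":
    supervise(["python", "robobuggy_main.py"])
```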

Tomorrow Nate and I are bussing to the hardware store bright and early in order to make a casing for the computer!

Stay tuned!

-Alex


Monday, October 25, 2010

Breaking News!

AEPi has been looking for a brother to drive for a while now. Just in case we don't have drivers one weekend, we want a brother we know we'll be able to drag out there. I personally have been able to fit inside Kamikaze, but have been unable to steer effectively. Our current buggy chair, Jake Yosafat, was able to fit inside Kamikaze, but wasn't able to see well.

I'm not sure why we didn't think of it sooner, but Nathaniel Barshay is the perfect candidate. Sure enough, he slides into Kamikaze perfectly, and steers and sees really well.

Since he'll be heading up the vision algorithms for RoboBuggy, driving an actual buggy will give him a perfect idea of the course.

Stay tuned for added hilarity.

Hiatus

Hi All - it's been a while!

I've spent the past couple weeks writing the RoboBuggy proposal. Right now, Nate and I estimate that we can build RoboBuggy for around $800. That's pretty cheap for an autonomous vehicle, and we'll see how long our budget holds to that.

For all interested parties, I will attach the RoboBuggy proposal! It's not my finest work, but hopefully it will get the point across. The point being: this is an awesome project and needs funding.

Enjoy!



Project: RoboBuggy
Collaborators: Alex Klarfeld and Nathaniel Barshay
Introduction
    Robotics and the sport of Buggy have been trademarks of Carnegie Mellon since the university's inception. The bridge between these two ideas was first built in 2000 as Arne Suppe's senior research project. Partnered with the Computer Science department, Arne created a fully autonomous robotic buggy that rolled exhibitions on Race Day 2001. It successfully navigated the free roll but did not get past the monument at the bottom of the hill. Due to technical issues, the RoboBuggy did not roll after that and lay dormant for many years.
Research Question and Significance
    We propose to resurrect the RoboBuggy project with newer technologies. We would like to prove that it is possible to build a vehicle capable of autonomously navigating a regular street at a low cost and with relatively limited resources. The idea of autonomous vehicles has been at the forefront of today's technical world, but it is usually tackled by companies or large research groups, such as Red Whittaker's autonomous SUVs. Most of these projects have been funded with extremely large budgets ranging into the millions of dollars. We are looking to create a scaled-down version of an autonomous street vehicle for roughly one thousand dollars. This project would prove interesting not only to members of the Carnegie Mellon community, but to the robotics community as a whole, as we are planning to build an autonomous street vehicle for about a third of the cost of a regular, manned buggy. In addition, our main fabrication shop will be the basement of the Alpha Epsilon Pi fraternity. As one can infer, this workshop has significantly fewer resources than one of Carnegie Mellon's research laboratories. The main point of interest is that such a sophisticated robotics project is feasible with very limited resources.
Project Design and Feasibility
    The equipment for RoboBuggy will be gathered and tested during the Fall Semester of 2010. All of the equipment from the last RoboBuggy attempt has already been donated to the current project. Since the last project took place ten years ago, much of the equipment inside the RoboBuggy is obsolete. This hardware has been removed and preserved in a safe location. The hardware that will remain with the current project is the metallic buggy shell and braking mechanism, the high resolution camera, the large steering servos, and the turning radius encoder. Once the buggy has been fully assembled and tested in house, we will attempt to navigate the free roll using computer vision coupled with a powerful line following algorithm. This algorithm will be adapted from the winning Mobot that Nathaniel and I built for the 2010 Spring Mobot Race. The code will be tweaked so that the RoboBuggy will be able to follow the white street lines. We are planning to incorporate the RoboBuggy into Alpha Epsilon Pi's regular fleet of manned buggies, which will allow us to complete field tests every day of the weekend between the hours of 4:30 and 9:30 AM. Theoretically, the RoboBuggy will be able to be tested two to three times a day during that time span. All of the field testing will occur at the start of buggy rolls, Spring Semester 2011.
    This project will be split into five milestones. These milestones correspond to different parts of the course. The first milestone will be if the buggy successfully navigates down the free roll using line navigation and successfully stops when it reaches the bottom of the hill or if it's prompted to stop in case of an emergency. The second milestone will be if the buggy can navigate from the start of the race on Tech Street to the start of the free roll with the aid of pushers. The third milestone will be if the buggy can navigate along Schenley Drive, maneuvering around the monument and successfully recognizing the flaggers. The fourth milestone will occur if the RoboBuggy successfully takes the sharp turn at the edge of the chute. This milestone will be the most difficult out of all of them. Finally, the fifth milestone will be if the buggy can navigate to the finish line with the aid of the pushers. We will strive to achieve all milestones by Race Day 2011, though we realize this is a very daunting project and realistically we will have to work over multiple years. For Race Day 2011, our goal will be to have the RoboBuggy roll up to the sharp turn in the chute.

Above: The five milestones of the RoboBuggy project mapped out on the Schenley Course.


    In addition to methodically splitting up the course, supplementary markers will be stationed along the sides of the course for added guidance. A remote control will also be on hand so that RoboBuggy can be stopped in case of an emergency. RoboBuggy will roll last in Alpha Epsilon Pi's fleet of manned buggies, and a follow car will be right behind it. Inside the car, in addition to the driver, will be two individuals capable of picking up the RoboBuggy and moving it into the car. These individuals will also carry a toolbox and the emergency remote control.
    The RoboBuggy will be treated as a regular AEPi buggy. We will receive support from our fraternity brothers in order to make this project a success. We will delegate tasks to competent members and work together as a group. The schedule for rolls has already been planned, and because this buggy will be treated like a regular AEPi buggy, the work that goes into development outside of the race course will be split up amongst brothers. The RoboBuggy will receive equal, if not more, attention than a regular buggy, and the time spent on RoboBuggy will amount to around 15 hours a week.
Background

    Both Nathaniel Barshay and Alex Klarfeld have had vast experience with engineering projects considering their young age. Alex Klarfeld worked for NASA at the age of 17 on the first and last launch vehicle of the Constellation program, the Ares I-X rocket. There he was able to see a large-scale engineering project in a professional setting. In addition, during his high school career, Alex placed second in a national engineering competition called CANSAT. This competition required high school students to build a sub-orbital satellite inside of a soda can, and it gave Alex the opportunity to lead an engineering project from start to finish. Alex was also one of the few interns selected to work at SpaceX for the summer of 2011.
Nathaniel has been building robots since a young age, with high school successes including World Champion at the Trinity College International Fire Fighting Home Robot Contest and third place in the iRobot Create challenge. He also spent two summers working for a LEGO engineering lab at Tufts University, and one summer as an intern at Qualcomm.
    Both Alex and Nathaniel worked on the winning Mobot in the Spring 2010 race. Both are sophomores; Alex is studying Electrical and Computer Engineering, and Nathaniel is studying Computer Science.
Feedback and Evaluation
     A blog will be kept to document the progress of RoboBuggy (http://www.robobuggy.blogspot.com). I have sent this blog to all interested parties with the intention of receiving feedback on the project. We will be checking in frequently with our faculty adviser, Mark Stehlik, and our other adviser, Arne Suppe. A copy of our projected schedule will be forwarded to them as well. Our progress will be judged on how closely we follow the projected schedule, as well as how well we document the issues that arise so that we do not face them again. Our goals are lofty, as this is quite obviously a multi-year project, but we will hold to the schedule as best as possible.
Dissemination of Knowledge
     The results of our project will be demonstrated through many different media, including pictures and video. A poster will also be created documenting the system-level design of the RoboBuggy's hardware. In addition, the blog will be updated frequently with all of the progress made on RoboBuggy and will be available for all interested parties to see. A final engineering report will also be written, documenting all the successes and failures of the RoboBuggy project.

Budget


Item | Vendor | URL | Price
Mini-ITX Mainboard | Logic Supply | http://www.logicsupply.com/ | $150
PCI-Express TV Decoder | Newegg | http://www.newegg.com/ | $100
picoPSU power supply | Logic Supply | http://www.logicsupply.com/ | $60
RAM | Newegg | http://www.newegg.com/ | $80
Solid State Hard Drive | Logic Supply | http://www.logicsupply.com/ | $120
Computer case and Shock Mounting | Logic Supply | http://www.logicsupply.com/ | $80
IO Coprocessors | Pololu Robotics | http://www.pololu.com/ | $100
Misc. Components | Mouser Electronics | http://www.mouser.com/ | $100
Total: $790




Budget Narrative


     The Mini-ITX Mainboard will be purchased as a replacement for the motherboard that was used in the year 2000. This is an industrial-strength computer designed to withstand the bumps along the course, and it is necessary for interacting with the rest of the hardware. The PCI-Express TV Decoder will be used to integrate with the RoboBuggy's original camera. That camera, donated to us by the previous project, integrated very well with the RoboBuggy's original motherboard; since we are replacing that motherboard, we need a TV decoder to utilize the camera. The power supply will power the motherboard and provide supplementary power for the other components. It was recommended to us by the previous project owners and should also fulfill its role as an industrial-strength component. The RAM will allow the image processing algorithms to run quickly and effectively; the RAM that comes with the motherboard is not enough for the programs to run quickly. The solid state hard drive will hold all the programs and the operating system we need to operate the RoboBuggy. The solid state characteristic is important so that bumps on the road do not interfere with our data. The computer case and shock mounting are important to protect the computer inside of the RoboBuggy. This buggy should be able to roll continuously without repair, and the computer case and shock mounting should ensure that. The IO coprocessors will help us communicate with some of the original hardware that we would not be able to interface with otherwise; this legacy hardware includes the brake, the servos, and the front wheel encoder. The miscellaneous components include items such as the watchdog timer, which will detect if the computer crashes, various voltage regulators and power supplies to supplement the hardware, as well as wires and other connectors.

-Alex

P.S. I hope you all are impressed with my immense HTML table skills.




Tuesday, October 5, 2010

All Shook Up

To all my loyal RoboBuggy followers,

I apologize for the delay in updating the blog. Work has been busy for all of us here at Alpha Epsilon Pi, but we're slowly getting into the swing of things.

A few weekends ago, Nate and I attempted to get better footage of a roll with a better camera. So we spent the greater part of our Saturday evening building a solid camera mount for the buggy to hold Nate's camera.

Here it is:

Nate's Fancy Camera on Kamikaze

Nate adjusting the Camera

This mount was strong. The camera did not move a single inch! Unfortunately, in the end, that was a bad thing, as the footage we got of the roll almost made me throw up. The camera shook with every single bump in the road, and it was impossible to see anything. Guess it's back to the drawing board.

Also, I will be writing SURG grants this weekend to get money for this project! Not that exciting but I will update the blog when I finish!

-Alex

Sunday, September 19, 2010

Data Collection at the Barricades

Nate and I were assigned to buggy chores today. Fortunately, we made the best of our situation. We were stationed at the chute, and we were given a perfect view of the buggies maneuvering around the corner. We snapped some videos of the buggies on my Droid. There are about 19 videos of buggies going around the chute. We have decided that this is going to be one of the hardest parts of RoboBuggy.

Here's an experienced driver successfully taking the curve: http://www.youtube.com/watch?v=IRjjDDHCIeM

This is going to be a really difficult task, as there are few landmarks around the curve that the RoboBuggy would be able to pick up on. Our current solution is to put additional flaggers (or just unmanned flags) around the hay bales as visual cues for RoboBuggy.

In addition to filming the other buggies, we put a camera on our own buggy, Kamikaze, and our driver Emily provided us with a nice perspective of what RoboBuggy will see. I've seen a few first person views of buggies, but this one is pretty cool.


We're open to any suggestions on how we should deal with the curve. It looks like we're going to tackle the free roll the same way they did it in the past, with line following (a rough sketch of that is below). Also, I apologize for providing links instead of embedding video. Blogspot didn't want to upload them for some unknown reason.
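For anyone curious what line following looks like on the camera side, here's a minimal OpenCV sketch of the general technique: threshold for bright pixels, fit line segments, and turn them into a left/right offset. The threshold and Hough parameters are placeholder guesses, not values tuned on the actual course:

```python
# Minimal white-line detection sketch with OpenCV. The threshold value and the
# Hough parameters below are placeholder guesses, not tuned for the course.
import cv2
import numpy as np


def find_white_lines(frame_bgr):
    """Return line segments that look like bright road markings."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    _, mask = cv2.threshold(blurred, 200, 255, cv2.THRESH_BINARY)  # keep bright pixels
    lines = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=10)
    return [] if lines is None else [tuple(l[0]) for l in lines]


def lateral_offset(lines, frame_width):
    """Rough steering cue: average horizontal distance of the segment midpoints
    from the image center (positive means the line is to the right)."""
    if not lines:
        return None
    centers = [(x1 + x2) / 2.0 for x1, y1, x2, y2 in lines]
    return sum(centers) / len(centers) - frame_width / 2.0
```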

-Alex

Saturday, September 18, 2010

Gutted

It's Friday night at 11 PM. The basements of normal fraternities are filled with libations and attractive women.

AEPi at Carnegie Mellon marches to the beat of a different drummer. You know, that drummer in marching band, who is really nice and quiet but kind of smells weird. Yeah. That guy.

The boys of AEPi decided that Friday night would be a perfect opportunity to whip out the power tools and begin exploring the depths of RoboBuggy.

Mike Zankel and Brandon Sherman took lead of this endeavor.

Mike Zankel is in the stripes, Brando is in the light blue, and Nate is hunched over

Nate and Alex watched in horror as Brando took an angle grinder to the rusted bolts in order to free the brains of the RoboBuggy.

After 3 hours of work, when the sparks and debris settled, the ancient brains of the RoboBuggy were unearthed.

Here is the full Inventory List:


  • 1 – 150MB Flash Hard Drive and Ribbon

  • 1 – Magnetic Compass

  • 1 – Unknown homemade relay component

  • 1 – Intel motherboard

    • 1 – Pentium III processor

    • 1 – ethernet card

    • 1 – graphics card

    • 1 – video decoder

    • 2 – Expansion cards labeled I and II

    • 1 – Shockmounted board

    • 3 – sticks of RAM

  • 1 – 12V 18Ah Lead Acid Battery

  • 1 – 12V Computer Power Supply

  • 1 – VGA Adapter

  • 1 – 25 pin to ethernet and 4 pin adapter

  • 1 – Optical Encoder

  • 1 – Enormous Seiko Servo for the front wheel

  • 1 – Pneumatic Braking System

  • 1 – 6mm Color Camera

  • 1 – Video Transmitter

  • 1 – Logitech Wireless Mouse Adapter

  • 2 – 37 pin input boards labeled CTR #1 and 2

  • 1 – Cigarette Lighter Adapter and Power Panel

  • 1 – 8 Channel RF Receiver

  • 3 – CIA Buggy Wheels

  • 1 – Steel Push Bar
  • Assorted RF, Ethernet and Power Wires

Here is a picture of the homemade relay component we found. If any of our loyal readers have any idea what this is, please comment.
???


The final result:


RoboBuggy ain't got no smarts no more



The Next Steps:

1. The two components that we are focusing on right now are the laser gyroscope and the optical encoder. Nate is currently playing with them in order to figure out their basic function.

2. We're also trying to figure out how to mount a camera to our normal buggies in order to get data during rolls. We're trying to find a camera that is cheap but would provide usable data, so we're looking for one that will be very close to the cameras we'll be using on the real deal.

Although we're planning on rolling RoboBuggy in exhibitions during this year's races, we want this buggy to adhere to all the rules of a normal buggy. Our goal is to make RoboBuggy a competitive buggy in the near future.

Here is the complete rulebook that I found online from 2009:


One idea to give the buggy more visual cues is to station extra flaggers around the course to help tell RoboBuggy where it is. This is within the rules (see the last paragraph of section 8.1.3).

A downside to adhering to the rules is that we would like to put a GPS unit in the buggy. This is a problem because section 10.5.2 under "Buggies" states that the driver can't have any communication with the outside world that isn't already accessible to them, and it specifically states that telemetry units are prohibited. Although a lot of our other components may sit in a gray area within the rules, we can make the argument that we're simply mimicking the senses of the driver, i.e. sight, speed, and direction. Unless Sweepstakes says otherwise, we will need to figure out what we want to do with the GPS, as it could be incredibly helpful.

In addition, if you would like to learn more about RoboBuggy, see the project in person, or learn about regular buggies, feel free to either email me at alexklarfeld@gmail.com or simply stop by AEPi! We don't mind sharing our buggy "secrets" with interested parties.

-Alex

Monday, September 13, 2010

It Has Arrived




It's official. The RoboBuggy is now parked in the basement of AEPi. We retrieved it from Professor Kosbie's office this afternoon.

Here are pictures:
Side of the Case -SCS Tribute

Naked RoboBuggy
Clothed RoboBuggy and one of our lovely drivers in the background

The Brains


The Drop Brake

The Vision


The next step will be a full inventory of the parts we currently have, all documented online, followed by a list of necessary components and their estimated costs. After that, we will be applying for SURG grants. If you are interested in participating in the RoboBuggy project, please email me at alexklarfeld@gmail.com - we'd love to have you!

There will be a RoboBuggy interest meeting this Saturday in the AEPi Office.

Stay Tuned!


The Revival

Buggy has been the official sport of Carnegie Mellon University ever since Carnegie Tech was founded in the early 1900s. Robotics has been CMU's claim to fame ever since the Robotics Institute was established in 1979.

At the dawn of the 21st century, the School of Computer Science put two and two together, and thus RoboBuggy was born. The idea had been around for over 10 years, but it wasn't until Arne Suppe's undergraduate thesis that it was actually implemented.

The result:



In the spring of 2010, two young, enterprising freshmen won the Mobot race, becoming the first undergraduates to ever fully complete the course. Nathaniel Barshay was the project lead, and Alex Klarfeld was assistant to the project lead, providing necessary support including: providing a netbook, making sure the mobot didn't run off the course, and monitoring the sunlight that entered Nate's eyes.

Alex and Nate with the winning mobot and Greg Armstrong


Their win sparked a drive to go onto bigger and better things.

Shortly after the competition, Alex mentioned the idea of RoboBuggy to Nate after seeing the project in Professor Kosbie's office. Nate was immediately intrigued. After a meeting with Arne Suppe, the previous project lead, it was settled that RoboBuggy would be revived.

A ragtag team of short Jewish men was immediately assembled when news of a RoboBuggy project spread to Alex and Nate's fraternity, Alpha Epsilon Pi (AEPi). Alex Klarfeld took on the role of Chief Logistics Officer and Nate assumed the title of Chief Technical Officer. Meetings were scheduled, workshops were cleaned, and free time was deemed obsolete as preparation for RoboBuggy began.

This blog will document the RoboBuggy project and its associated endeavors.

Tonight, Alex Klarfeld and Maxwell Hutchinson (AEPi's president) will be meeting with the new Sweepstakes Chair in order to discuss incorporating RoboBuggy into AEPi's regular fleet of buggies.

RoboBuggy itself should arrive in the AEPi workshop later this week as soon as we get the "go ahead" from the CS department.

Stay tuned.