Weekly Updates

Week of 24 January 2006

  • Administrative tasks: I turned in the Senior Expo Intent to Participate paperwork to OTEFD. The current Expo date is 13 April 2006.
  • Laser Distance Meter: Over break, I looked into the laser meter idea that Nick found (http://www.pages.drexel.edu/~twd25/webcam_laser_ranger.html), which uses a laser pointer and a webcam and calculates distance by finding the brightest pixel in the image (see the sketch after this list). So far all of the equipment works. I found a way to power the laser pointer without a battery; next week it should be able to start running off of robot power. Dr. Malinowski brought down a new webcam that will auto-install itself on the laptop and screw onto the pan-tilt unit, and I was able to capture a few images of the laser dot to start playing with. Next week I will attempt to create a red filter for the camera to make the dot more visible.
  • Pan-Tilt Unit: The pan-tilt unit (PTU) that will be used to turn and tilt the laser meter assembly is now up and running off of robot power. Dr. Malinowski helped configure HyperTerminal to work with the PTU to send and receive data.
  • Plans for next week: Get the robot moving, and continue with image processing.
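    A minimal MATLAB sketch of the brightest-pixel idea from the Drexel page, assuming a captured frame is already saved to disk (the file name here is just a placeholder):

        % Locate the laser dot as the brightest pixel in the red plane.
        img = imread('laser_frame.jpg');        % one RGB frame from the webcam
        red = double(img(:,:,1));               % red plane, where the dot shows up brightest
        [maxVal, idx] = max(red(:));            % brightest value and its linear index
        [row, col] = ind2sub(size(red), idx);   % convert to pixel coordinates
        fprintf('Brightest pixel at (%d, %d), value %g\n', row, col, maxVal);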

Week of 31 January 2006

  • Image Processing and Capture: The camera has a red filter now (red cellophane over the lens) to make the laser dot look brighter. I am still trying to find a good way to get images from the camera into a format that I can process in C++. The code from http://www.pages.drexel.edu/~twd25/webcam_laser_ranger.html was written for a Logitech Quickcam, which uses its own software and is not the webcam that we're using for this project. Dr. Malinowski and I found a Microsoft application that sounds like it does exactly what we want, but it only runs on the most recent version of Visual Studio, which I don't have. There are a couple more options that could be promising: two previous senior projects that used stereo cameras to track migrating birds developed software for capturing frames from digital cameras, and Dr. Malinowski also suggested Microsoft's DirectX and a library called TWAIN, both for image processing. I am looking into the bird tracking project code first.
  • Plans for next week: Continue with image capture and processing.

Week of 07 February 2006

  • Image Processing and Capture: The bird tracking software used a frame-grabber card, which I will not be installing on the laptop, since it would just add complication and there's already enough equipment coming out of the USB ports as it is. Nick suggested I look into the code from the HOMERS project of 2002 (Lovitt, Barngrover, and Knaub), since they essentially did the same thing we're trying to do, but with stereo vision and looking for a red ball. If I can get their capture software working, then I can fill in my own image processing. They used the TWAIN library that Dr. Malinowski had mentioned last week. Problem: the program requires a header file called twain.h, which I had to download from twain.org instead of their website, and the compiler won't recognize some of the typedefs it uses. Next option: DirectX from Microsoft?
  • Plans for next week: Continue with image capture and processing and try not to get further behind schedule.

Week of 14 February 2006

  • Image Processing and Capture: Switching to MATLAB. Andy Lovitt answered an email to tell me that the TWAIN interface was essentially deprecated at the time he was using it, and in any case, our camera would have to have TWAIN drivers installed, which it doesn't. I talked to Dr. H., who mentioned that Rob Scherbinske used the MATLAB Simulink Image Acquisition Toolbox and Image Processing Toolbox on his group's project last year. Since I'm running low on options, and since the whole image processing part of this project is taking much longer than I had originally planned, I'm willing to go with this approach. The disadvantage is that MATLAB is slow to run. The advantage is that I actually prefer MATLAB to C++ for the image processing. Maybe I'll be able to get back on schedule! I created the Simulink model for the image capture process (see the sketch after this list) and removed the red cellophane from the camera. It is no longer needed, and was probably distorting the image more than it ever helped. The image is captured in *.jpg format, which means that there are three color planes (red, green, blue) when it gets through Simulink.
  • Next issue: How to communicate between MATLAB, the PTU, and the robot?
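    A rough command-line equivalent of the image capture step, using the Image Acquisition Toolbox directly (the 'winvideo' adaptor name and device ID 1 are assumptions; the actual project uses a Simulink model):

        vid = videoinput('winvideo', 1);   % hook into the webcam
        frame = getsnapshot(vid);          % grab one RGB frame
        R = frame(:,:,1);                  % the three color planes
        G = frame(:,:,2);
        B = frame(:,:,3);
        imwrite(frame, 'capture.jpg');     % save in *.jpg format, as in the Simulink model
        delete(vid);                       % release the camera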

Week of 28 February 2006

  • Administrative Issues: Missed last week due to illness. This Thursday morning I have a project progress report presentation. I am now officially behind schedule.
  • Image Processing: I'm trying to work out an algorithm to identify the laser dot. This is a problem because several pixels share the brightest value in the image. Windowing may help (see the sketch after this list), but the laser pointer and camera will have to be permanently placed before a window can be established.
  • Plans for next week: Continue with Image Processing, look into mapping software (reference http://www.jonh.net/%7Ejonh/robots/mapping/submitted-paper.html)
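    A sketch of the windowing idea, assuming the dot can only appear in a known band of rows once the laser pointer and camera are fixed in place (the window bounds here are made up):

        red = double(img(:,:,1));                 % red plane of a captured frame
        win = red(200:280, :);                    % only search inside the window
        [rows, cols] = find(win == max(win(:)));  % every pixel tied for the maximum
        dotRow = mean(rows) + 199;                % centroid, shifted back to full-image rows
        dotCol = mean(cols);                      % column of the dot's centroid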

Week of 7 March 2006

  • Image Processing: On hold until the system is mounted on the robot. Hopefully this will happen over spring break.
  • Mapping: Due to the inaccuracy of the digital compass in the presence of EMI (building, robot), it won't be used in the project. The ultrasonic sensors may still be used, though, to detect objects at very close range in an emergency stopping routine. I developed a set of dummy data for creating a test map in MATLAB (sketched below). I'll be using a probabilistic approach (no obstacle = 0, definite obstacle = 1, maybes somewhere in between) and trying to grow the map as new data is taken.
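    A minimal sketch of the probabilistic map with dummy data (the grid size and the fake obstacle list are made up):

        map = 0.5 * ones(40);                    % 40 x 40 grid, everything starts "unknown"
        dummyObstacles = [10 12; 10 13; 25 30];  % [row col] pairs for fake walls
        for k = 1:size(dummyObstacles, 1)
            map(dummyObstacles(k,1), dummyObstacles(k,2)) = 1;   % definite obstacle
        end
        map(20, 20) = 0;                         % the robot's own cell is known to be free
        imagesc(map); axis equal tight;          % quick look at the test map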

Week of 14 March 2006

  • Spring Break: Nick Schmidt and Mr. Mattus mounted the PTU, laser pointer, webcam, and laptop frame on the robot while I was gone. It looks even cuter than it did before, not to mention that having all of the parts connected and powered by the robot really helps keep the project progressing.

Week of 23 March 2006

  • Laser Meter: I took measurements to calibrate the distance meter, following the Drexel web page, and came up with an initial equation. Later, when I tried to take long-distance measurements, I discovered that the laser pointer was tilting upwards. After Nick fixed that problem, I added the new data to the old set, and came up with a slightly different equation, which appears to work very well.
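    A sketch of the calibration fit, following the relationship on the Drexel page: distance D = h / tan(pfc*gain + offset), where pfc is the dot's pixel offset from the image center and h is the laser-to-camera separation. The numbers below are placeholders, not the real calibration data:

        h   = 3.5;                            % laser-to-camera separation (assumed, inches)
        pfc = [5 12 25 40 60 85]';            % pixels from center at each test distance
        D   = [96 48 24 16 12 9]';            % measured distances (inches)
        theta = atan(h ./ D);                 % angle implied by each measurement
        p = polyfit(pfc, theta, 1);           % linear fit: theta = gain*pfc + offset
        gain = p(1); offset = p(2);
        Dest = h ./ tan(gain*pfc + offset);   % sanity check against the measured data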

Week of 30 March 2006

  • Administrative Issues: This week I had a practice presentation with the speech coach, and there are several other presentations and deliverables due shortly. The next big event is the Student Expo, coming up in a couple of weeks.
  • Servers: Dr. Malinowski wrote a server to control the PTU, which I tested this week. Shom helped me set it up, and it runs through telnet. The robot server still has issues with the robot software: Saphira is no longer supported and doesn't work with newer operating systems like XP.
  • Mapping Software: I spent some time working on the mapping routine, setting up the initial map array and colormap, and converting from polar to Cartesian coordinates (sketched below).
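    A sketch of the polar-to-Cartesian step when placing a single reading on the map (the angle, distance, grid size, and robot position are placeholders):

        map = 0.5 * ones(100);                    % initial map array, all "unknown"
        panDeg = 35;  dist = 18;                  % one reading: PTU pan angle and laser range
        [x, y] = pol2cart(panDeg*pi/180, dist);   % polar reading -> Cartesian offsets
        robotRow = 50;  robotCol = 50;            % robot's cell in the grid
        r = robotRow - round(y);                  % rows grow downward in the array
        c = robotCol + round(x);
        map(r, c) = 1;                            % mark the obstacle cell
        imagesc(map, [0 1]);                      % plot on a fixed 0-1 scale
        colormap(flipud(gray));                   % white = free/unknown, black = obstacle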

Week of 6 April 2006

  • Administrative Issues: Expo is coming up, and I need to get posters printed off ASAP.
  • Robot Software: Since Saphira is no longer supported, ActivMedia sent Dr. Malinowski a password for the newer version, called Aria, which he is trying to get working with a server.
  • Mapping Software: Meanwhile I am still trying to keep the mapping software progressing. This week: growing the map to accommodate new obstacles (sketched below).
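    A sketch of one way to grow the map when a reading lands outside the current grid: pad the array with "unknown" (0.5) cells on whichever side is too small. The function name and padding value are my own placeholders, not the project's actual routine:

        function [map, r, c] = growMap(map, r, c)
            % Expand the map so that cell (r, c) fits inside it.
            if r < 1                                       % reading above the top edge
                pad = 1 - r;
                map = [0.5*ones(pad, size(map,2)); map];
                r = r + pad;
            end
            if c > size(map, 2)                            % reading past the right edge
                pad = c - size(map, 2);
                map = [map, 0.5*ones(size(map,1), pad)];
            end
            % (the bottom and left edges are handled the same way)
        end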

Week of 11 April 2006

  • Expo Week: MapBot and I went to Expo on Friday. Check out the photos! It went really well, and we took a few mappings and got to talk to a lot of people who came through.
  • Servers and Software: The PTU server can now be controlled through MATLAB using an IP Toolbox created by Peter Rydesater. The program maps completely once, but has a few minor issues with things like a mirror image effect and some data points that could be classified as wild anomalies.
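    A rough sketch of sending a command to the PTU server from MATLAB using the pnet function from Peter Rydesater's toolbox. The host name, port, and command string are placeholders; the real protocol is whatever the server expects:

        con = pnet('tcpconnect', 'mapbot-laptop', 2000);  % open a TCP connection to the server
        pnet(con, 'printf', 'pan 45\n');                  % send a (hypothetical) pan command
        reply = pnet(con, 'readline');                    % read the server's reply
        disp(reply);
        pnet(con, 'close');                               % clean up the connection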

Week of 20 April 2006

  • Tuesday: My practice presentation with the speech coach was cancelled, so I worked on fixing the mapping software instead. It seems to work pretty well now.
  • Wednesday: MapBot and I took a field trip to the Institute for Learning in Retirement at the Student Center. He was a hit, and he got to wear his 2nd Place ribbon from the Expo. :-)
  • Thursday: Working on fine-tuning the mapping software to allow for multiple mappings, i.e., the robot maps the same area twice and plots both runs on the same map (one possible merge is sketched below).
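    One possible way to merge two mappings of the same area, assuming both runs produce probability grids of the same size (map1 and map2 are placeholder names): average them cell by cell, so repeated detections stay near 1 and one-off anomalies get pulled back toward 0.5.

        combined = (map1 + map2) / 2;              % cell-by-cell average of the two runs
        imagesc(combined, [0 1]);                  % plot the merged map on a 0-1 scale
        colormap(flipud(gray));                    % white = free/unknown, black = obstacle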

Week of 25 April 2006

  • Administrative Issues: Final Presentation Thursday, Poster Presentation for alumni on Friday.
  • Robot Server: Dr. Malinowski came down to the lab and set the server up the right way (so that it didn't give the 88 linking errors I was getting...) and the robot moves around! We had a great time almost running into the professors on Friday during the poster presentation. :-)

8 May 2006

  • Demo Day: I got the robot running from a remote PC, took a mapping, and played around for a few hours before the Demo. Then, when I switched from maneuvering to mapping mode, the computer crashed. When it came back up, it was reading the serial port as a mouse, which is not good for trying to run anything else. So MapBot's performance at the Expo will be accepted as a demo instead.
  • Wrap Up: The final reports and presentations will be posted on the website. I've commented all of my code, and I hope to post that on the website too, just in case anyone wants to build off of the project next year.
