Latest News


We added this new template to the website.


Other Information

Doom 4 should be out this coming September.

Project Summary

David - The human interface portion of the project began with determining the specifications for the sensors needed to track hand and head motion. After reviewing several related previous senior projects, I was able to obtain some specifications for hand movement; these later became less critical because I found a glove with sensors already integrated. For the head, I found a chip with a three-axis accelerometer ranging up to 3 g, enough to capture rapid acceleration, and a gyroscope with a 500-degree-per-second range, which provides good resolution for both slow and rapid head movements. I also decided to use a three-axis digital compass to help compensate for the drift inherent in MEMS devices such as the accelerometer and gyroscope.

To fulfill the specifications for the hand control sensors, I needed a way to measure finger flexure and hand orientation. I originally searched for a glove with potentiometers on the fingers to measure flexure and found one for $1200; however, it had only a USB interface, and I wanted the flexibility to switch to a serial interface as well. Eventually I found a glove, the DG5-V, that had sensors on the fingers and, as mentioned earlier, an accelerometer already mounted on the back. In addition to being half the price of the other glove, it can switch between USB and RS-232 simply by adding or removing a cable adapter. Thus, I was able to obtain devices that met all specifications.

Next, I studied sensor fusion and device testing techniques. I plan to use a servomotor to determine device parameters and then implement either a Kalman or an adaptive filter. Of course, as experimental results are obtained, my final sensor fusion techniques may adapt as well.
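As a rough illustration of the Kalman filtering approach under consideration, the sketch below fuses a gyroscope rate with an accelerometer-derived tilt angle for a single axis. The noise variances, sample period, and initialization are illustrative stand-ins, not measured parameters of the actual devices.

```python
# Minimal one-dimensional Kalman filter fusing a gyro rate with an
# accelerometer-derived angle. All constants (dt, q, r) are illustrative
# assumptions, not measured device parameters.

def kalman_fuse(gyro_rates, accel_angles, dt=0.01, q=0.001, r=0.1):
    """Estimate one orientation angle from gyro rates and accel angles."""
    angle = accel_angles[0]  # initialize from the accelerometer
    p = 1.0                  # estimate variance
    estimates = []
    for rate, meas in zip(gyro_rates, accel_angles):
        # Predict: integrate the gyro rate and inflate the uncertainty.
        angle += rate * dt
        p += q
        # Update: blend in the accelerometer measurement.
        k = p / (p + r)             # Kalman gain
        angle += k * (meas - angle)
        p *= (1.0 - k)
        estimates.append(angle)
    return estimates
```

Because the gyro dominates over short spans and the accelerometer anchors the long-term estimate, a scheme like this can suppress the MEMS drift described above; the real filter would be tuned from the servomotor test data.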

Currently, I have interfaced both the glove and the robot arm, and am working on integrating them into a single system. After that, I will tune the correlation between glove motion and arm motion to give the user a more intuitive control interface.
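One small piece of that correlation can be sketched as a calibration mapping from a raw finger-flexure potentiometer reading to a claw servo angle. The ADC range and servo limits here are hypothetical placeholders for whatever the tuning produces.

```python
# Sketch of glove-to-arm correlation tuning: map a raw finger-flexure
# potentiometer reading onto a claw servo angle. The pot range (120-900)
# and servo range (0-90 degrees) are hypothetical calibration values.

def pot_to_servo(raw, pot_min=120, pot_max=900, servo_min=0.0, servo_max=90.0):
    """Linearly map a raw ADC reading to a servo angle in degrees."""
    raw = max(pot_min, min(pot_max, raw))         # clamp out-of-range readings
    frac = (raw - pot_min) / (pot_max - pot_min)  # 0.0 = open, 1.0 = clenched
    return servo_min + frac * (servo_max - servo_min)
```

Adjusting the endpoints (or substituting a nonlinear curve) is what "tuning the correlation" would amount to in practice.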

Final Progress - The sensor glove interfaces with the human-mounted computer, which can continually read the status of the accelerometer and potentiometers on the dataglove. The potentiometer data is stable, and the feature recognition algorithms that rely on it are reliable. However, the accelerometer data contains large spikes caused by the user's imperfect gestures when providing motions for the feature recognition algorithm to recognize; the output of the dataglove-mounted accelerometer will require significant statistical filtering before it can be made useful. The mode-switching algorithm, which relies on a clenched or unclenched fist to change the mode of the agent, has been written and can be implemented, as have the parts of the algorithm that change the degree to which the claw attached to the robotic arm is open or closed. However, manipulating the position of the robotic arm via motion measured by the accelerometer is impossible due to the high noise content inherent in the accelerometer signal. The algorithm for manipulating the servomotors mounted on the robot arm is complete and can be incorporated into the control programs to be loaded onto the microcontroller on the mobile agent.

The specifications for reliable positioning of the LCD headpiece require sensor fusion of an accelerometer, a gyroscope, and a digital compass. Because stabilizing these devices requires a control scheme too time-consuming for the scope of this project, a narrower problem was addressed: instead of designing a Kalman filter to produce accurate position estimates from the three sensors, a Wiener filter was developed to reduce gyroscopic drift. When tested in MATLAB against simulated nonlinear gyroscopic drift, the 11-tap filter reduced the effect of the drift to the order of 10^-12. Finally, preliminary statistics have been gathered for the design of a filter to compensate for the dataglove-mounted accelerometer problems mentioned above.
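The Wiener-filter idea can be sketched in Python (rather than the MATLAB used in the project): solve a least-squares fit for 11 FIR tap weights that map a drift-corrupted gyro signal back onto the clean signal. The synthetic signals and the degree of drift reduction here are illustrative only and do not reproduce the reported 10^-12 result.

```python
import numpy as np

# Least-squares (Wiener-style) design of an 11-tap FIR filter that maps
# a drift-corrupted signal onto the clean one. Signals are synthetic
# stand-ins for the gyro data, not the project's actual measurements.

def wiener_taps(noisy, clean, n_taps=11):
    """Solve for FIR tap weights in the least-squares (Wiener) sense."""
    rows = len(noisy) - n_taps + 1
    # Data matrix: one window of lagged input samples per row.
    X = np.array([noisy[i:i + n_taps] for i in range(rows)])
    d = clean[n_taps - 1:]                    # aligned desired output
    w, *_ = np.linalg.lstsq(X, d, rcond=None)
    return w

# Synthetic test signal: a sinusoidal "true" rate plus slowly growing
# nonlinear drift.
t = np.linspace(0.0, 10.0, 2000)
clean = np.sin(2.0 * np.pi * 0.5 * t)
drift = 0.3 * t + 0.05 * t**2
noisy = clean + drift

w = wiener_taps(noisy, clean)
# Applying the taps (reversed for np.convolve) gives the filter output
# over the valid region.
filtered = np.convolve(noisy, w[::-1], mode="valid")
mse_before = float(np.mean((noisy[10:] - clean[10:]) ** 2))
mse_after = float(np.mean((filtered - clean[10:]) ** 2))
```

The least-squares solve plays the role of the Wiener-Hopf equations; the fitted taps trade off cancelling the low-frequency drift against preserving the signal band.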

Brian - The majority of the work done thus far has been learning how to program in OpenGL. Currently, I am able to create a map in OpenGL based on a MobileSim map traditionally used for Bradley's RoboNAV class. In this map, I can navigate a virtual robot from a first-person 3D perspective. I am using methods that will allow me to easily convert between the physical robot's position and angle and the OpenGL robot's.
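The conversion between the two coordinate systems might look like the sketch below: a 2-D robot pose (millimetres and a heading in degrees, as ARIA-style maps typically use) mapped to a first-person OpenGL camera on an x-z ground plane with y up. The axis convention, scale factor, and eye height are my assumptions, not the project's documented mapping.

```python
import math

# Hedged sketch of converting an ARIA-style 2-D robot pose (x, y in mm,
# theta in degrees) to a first-person OpenGL camera (x, z ground plane
# in metres, y up), and back. Conventions here are assumptions.

MM_PER_M = 1000.0

def robot_to_gl(x_mm, y_mm, theta_deg, eye_height=0.5):
    """Return (gl_x, gl_y, gl_z, yaw_rad) for a camera at the robot."""
    gl_x = x_mm / MM_PER_M
    gl_z = -y_mm / MM_PER_M          # map robot +y onto OpenGL -z (forward)
    yaw = math.radians(theta_deg)
    return gl_x, eye_height, gl_z, yaw

def gl_to_robot(gl_x, gl_z, yaw_rad):
    """Inverse mapping back to robot map coordinates."""
    return gl_x * MM_PER_M, -gl_z * MM_PER_M, math.degrees(yaw_rad)
```

Keeping the two mappings exact inverses of each other is what makes it cheap to move positions back and forth between the physical robot and the virtual one.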

ARIA is currently set up at my lab station, and much of my time and effort has been devoted to building a program that allows manual override and potential field navigation. I am also working on multi-threading to incorporate the OpenGL support.
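The potential field idea can be sketched independently of ARIA: the goal exerts an attractive force, nearby obstacles exert repulsive forces, and the robot drives along the summed vector. The gains and influence radius below are hypothetical, and a real implementation would feed the result to the robot's velocity commands.

```python
import math

# Minimal potential-field step: attraction toward the goal plus
# repulsion from nearby obstacle points. Gains (k_att, k_rep) and the
# obstacle influence radius are hypothetical tuning values.

def potential_field_step(pos, goal, obstacles,
                         k_att=1.0, k_rep=0.5, influence=1.0):
    """Return a unit (dx, dy) heading vector for one control step."""
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 1e-9 < d < influence:
            # Repulsion grows sharply as the obstacle gets closer.
            mag = k_rep * (1.0 / d - 1.0 / influence) / d**2
            fx += mag * dx
            fy += mag * dy
    norm = math.hypot(fx, fy) or 1.0
    return fx / norm, fy / norm
```

Manual override then amounts to letting joystick input replace the computed heading whenever the operator takes control.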

Final Progress - I was able to set up communication between the robot navigation program and my OpenGL program. The position is transferred, and after a slight delay, OpenGL updates its view of the world. The system has joystick override and can create maps.
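The pose hand-off between the two programs could be as simple as the sketch below: serialize the pose as a small text message on the navigation side and parse it back on the OpenGL side. The message format is my assumption; the actual transport used in the project is not specified here.

```python
# Sketch of a pose hand-off between the navigation program and the
# OpenGL viewer. The "x y theta" plain-text message format is an
# assumption, not the project's documented protocol.

def encode_pose(x, y, theta):
    """Serialize a robot pose for transmission to the viewer."""
    return f"{x:.1f} {y:.1f} {theta:.1f}\n".encode("ascii")

def decode_pose(msg):
    """Parse a pose message back into floats on the viewer side."""
    x, y, theta = (float(field) for field in msg.decode("ascii").split())
    return x, y, theta
```

A newline-terminated text message keeps the two processes loosely coupled, at the cost of the slight update delay noted above.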

All my code can be found at