Saturday, May 5, 2007

Final Notes

It has been a long semester, with many trials and tribulations for Hamsterbot. This post wraps up the ideas and goals I had for Hamsterbot, and the choices made along the way.

Approach
The general approach to building Hamsterbot came in a few steps.

The first step was getting mouse movement to control the movement of Hamsterbot. The approach used was to take the distance the mouse moved in short periods of time and translate that into motion. The location of the mouse could be found through Python functions, and the cursor could be reset to the center of the screen through available Windows interfaces. One choice made at this stage was to translate change in y into translational velocity and change in x into angular velocity. It is not clear that this is the optimal way to compute movement (in fact it seems unlikely), but it was a straightforward approximation which behaves normally for most types of movement.
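For concreteness, here is a minimal sketch of that mapping. The helper names get_cursor_pos(), recenter_cursor(), and set_velocities() are hypothetical stand-ins for the Tkinter/ctypes and pyRoomba calls actually used, and the gains and center coordinates are made-up values.

```python
CENTER = (400, 300)      # assumed screen-center coordinates
LINEAR_GAIN = 2.0        # translational speed per pixel of vertical movement (illustrative)
ANGULAR_GAIN = 0.01      # angular speed per pixel of horizontal movement (illustrative)

def sample_mouse_motion():
    x, y = get_cursor_pos()
    dx = x - CENTER[0]               # change in x -> angular velocity
    dy = CENTER[1] - y               # change in y (up is forward) -> translational velocity
    set_velocities(LINEAR_GAIN * dy, ANGULAR_GAIN * dx)
    recenter_cursor(*CENTER)         # reset to the screen center before the next sample
```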

The second step was capturing the movement of the hamster ball. This was done by placing an optical mouse directly under the ball, which would then record the movement of the ball. This was a very easy solution, and it worked perfectly, except for the issues brought up by the next step.

A large issue with this was the amount of friction it caused on the hamster ball. In fact, the friction was so great that there was no hope of a hamster actually being able to rotate the ball, when situated on Hamsterbot. An excellent future solution to this problem (given sufficient supplies) would be mounting the hamster ball on a ring of ball bearings, with the mouse below this setup. Then the mouse could still easily be used to track the ball, while the ball would be able to freely rotate thanks to the bearings.

The final issue was the smoothing of Hamsterbot's motion. The problems were mainly jerky starting and stopping, as well as Hamsterbot's penchant for ramming into walls at full speed. The jerking was fixed by averaging Hamsterbot's current velocity with the velocity read from the ball, resulting in less extreme acceleration in general. This also put an end to immediate stops, replacing them with gradual deceleration. The speed issue was fixed by putting upper and lower bounds on Hamsterbot's speed. This ensured that all ball movement actually moved the robot, while also preventing excessive speeding (and hamster injuries).
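A rough sketch of that smoothing and bounding is below; the speed limits are invented for the example and the exact coasting behavior near zero is an assumption, not the values or logic actually used.

```python
MIN_SPEED = 20    # mm/s, assumed lower bound so small ball movements still move the robot
MAX_SPEED = 200   # mm/s, assumed upper bound to prevent wall-ramming at full speed

def smooth_and_clamp(current_velocity, ball_velocity):
    # Average the current velocity with the velocity read from the ball; this
    # softens sudden starts and turns immediate stops into gradual deceleration.
    v = (current_velocity + ball_velocity) / 2.0
    if ball_velocity == 0 and abs(v) < MIN_SPEED:
        return 0              # let the robot coast to a stop once the ball is still
    sign = 1 if v >= 0 else -1
    return sign * min(max(abs(v), MIN_SPEED), MAX_SPEED)
```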

Final Comments
The Hamsterbot project was a wonderful learning experience, and a lot of fun. If I had another try, or if I were to continue on with the project, there are a few areas I'd focus my effort on.

I would have loved to install some ball bearings and actually get a hamster controlling Hamsterbot. This was one shortcoming which was disappointing, if only because the solution is just a few appropriate parts away. Additionally, there remain some "interesting" behaviors that emerge in Hamsterbot that are not present in the simulation, which I'd be interested to determine the source of.

Overall, working on a robot like this was an amazing learning experience. The chance to integrate many of the ideas we discussed in class into a single, unified system was a great challenge, and proved very rewarding. It truly has been a pleasure.

Demo Day!

Here are links to a few new videos for the Hamsterbot. They show the uncompromising courage with which Hamsterbot plows into walls in the interest of scientific progress.

Walls
More Walls
Hamster ball malfunctions

You will notice that the ball part of Hamsterbot is a little picky about when it wants to work correctly.

Tuesday, April 10, 2007

Future Ideas

Here's a short list of ideas for the further development of the HamsterBot. Currently, the robot can react to the mouse correctly, and has fairly good MCL mapping abilities, courtesy of the directional bump sensor. Unfortunately it is a very jerky platform, and it doesn't use all the sensors available to the robot in its MCL model. Goals for the remaining part of the semester include:

- Smoothing acceleration
- Setting minimum and maximum allowable velocity
- Including cliff sensor data in the MCL model
- Replacing the hamsterball (the old one had to be returned)

For now, these seem to be a good start. If these are accomplished quickly, there's always the idea of adding an offboard camera to track the roomba's progress, and perhaps even automatically generate the map used for MCL.

More media will be available shortly once the hamsterball is replaced and I can try more test runs.

Monday, April 9, 2007

Solving Set

Our latest project was to create an image processing program which could play the game Set (rules here). The hardest part of this assignment was recognizing the cards and their four attributes: color, number of figures, texture and shape. We did this in the following way.

Color
Since the images of the cards were all taken under imperfect lighting, we had to determine color very carefully. We ended up using the RGB value of each pixel in the image: if one of the three channels was significantly high and the rest were low, we colored that pixel with the dominant color. Once this was done, whichever color was most common across the whole image was taken to be the color of that Set card, and this gave accurate results.
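A rough sketch of the per-pixel test and the vote is shown below. The thresholds and the assumption that the card image is available as a list of (r, g, b) tuples are illustrative, not the actual values or data layout we used.

```python
def dominant_channel(r, g, b, high=150, low=100):
    """Return 'R', 'G', or 'B' if exactly one channel is high and the other two are low."""
    channels = {'R': r, 'G': g, 'B': b}
    high_ones = [name for name, value in channels.items() if value > high]
    low_ones = [name for name, value in channels.items() if value < low]
    if len(high_ones) == 1 and len(low_ones) == 2:
        return high_ones[0]
    return None

def card_color(pixels):
    """Majority vote over every classified pixel in the card image."""
    counts = {}
    for r, g, b in pixels:
        label = dominant_channel(r, g, b)
        if label is not None:
            counts[label] = counts.get(label, 0) + 1
    return max(counts, key=counts.get) if counts else None
```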

Number of Figures
To determine the number of figures on a card, we found the leftmost pixel which was not part of the background and the rightmost pixel which was not background. We then looked at the horizontal distance between these two points: if they were close together, there was one figure; if they were very far apart, there were three figures; and if the distance was in between, there were two. It seems simple, but the approach worked very well and was quite easy to code!
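As a sketch, the test can be expressed with two width cutoffs; the pixel-coordinate input and the threshold values here are made up for illustration.

```python
def count_figures(figure_pixels, one_max=60, two_max=120):
    """figure_pixels: (x, y) coordinates of all non-background pixels on the card."""
    xs = [x for x, _ in figure_pixels]
    extent = max(xs) - min(xs)       # leftmost-to-rightmost distance
    if extent <= one_max:
        return 1
    elif extent <= two_max:
        return 2
    else:
        return 3
```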

Texture
Texture was a bit of a strange problem. We identified solid figures first. To do this, we just calculated the average number of colored pixels per figure. For solid figures, this number was much higher than for non-solid figures, so we could easily identify solid shapes. To identify the shaded figures, we discovered that the shaded sections of the images had distinctly lower values for all three colors than unshaded sections. So we colored these yellow in our processed image to set them apart.

We counted up all the yellow pixels, and if this number was sufficiently high, we declared that figure to be shaded. Finally, if a figure was neither solid nor shaded, we declared it hollow.
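Put together, the texture decision for a figure looks roughly like the sketch below; both thresholds are invented for the example, not the values actually used.

```python
def classify_texture(colored_pixels_in_figure, shaded_pixels_in_figure,
                     solid_threshold=2000, shaded_threshold=300):
    if colored_pixels_in_figure > solid_threshold:   # solid figures are mostly filled in
        return 'solid'
    if shaded_pixels_in_figure > shaded_threshold:   # enough dim ("yellow-marked") pixels
        return 'shaded'
    return 'hollow'
```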

Shape
The hardest problem was determining shape. We eventually decided to measure the horizontal distance across a figure at a fixed height. We found this height by locating the highest colored pixel in the image and then measuring the width of the figure 30 pixels below it. This kept the measurement consistent even when the images were centered differently. The diamonds had the smallest width, then came the squiggles, and the capsules were the widest. We had to allow some special cases to measure three figures correctly, as well as to measure the different shapes accurately, but this method turned out to be extremely accurate, with a very small average difference in width between images of the same shape, even when they had different numbers of figures or different textures.
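The core of the test, minus the special cases, is sketched below; the width cutoffs are invented, since the post only records the ordering (diamonds narrowest, then squiggles, then capsules).

```python
def classify_shape(figure_pixels, offset=30, diamond_max=25, squiggle_max=40):
    """figure_pixels: (x, y) coordinates of the figure's colored pixels."""
    top_y = min(y for _, y in figure_pixels)
    row = [x for x, y in figure_pixels if y == top_y + offset]   # row 30 px below the top
    width = max(row) - min(row) if row else 0
    if width <= diamond_max:
        return 'diamond'
    elif width <= squiggle_max:
        return 'squiggle'
    else:
        return 'capsule'
```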

Conclusions
Our solutions ended up being significantly uglier and more hackish than we had anticipated, but we still found value in the process. In an ideal world, morphological opening and closing could have been used to determine texture, and some sort of shape evaluation could have been used for finding shape, but our methods were simpler and, at least in the case of shape, probably much more accurate than what we would have achieved otherwise.

Finally, our image processing is dependent on the figures being centered in the image, or very close. This was an unfortunate limitation we were not able to avoid, and seems like it would be a natural problem for any vision based system. The software is only as accurate as the pictures it is processing.

Friday, March 9, 2007

Monte Carlo Localization!

In robotics, localization is the task of determining one's location within a known environment, given available sensory data. Theoretically, if a robot knows its starting position and how many times its wheels have rotated, it should be able to calculate its resulting position on a map. Unfortunately, due to real-world factors like wheel slippage, there is a significant amount of uncertainty in this odometric data, so a robot needs to gather and reason about other sensory information in order to keep better track of its own position.

With HamsterBot, we implemented an algorithm called Monte Carlo Localization, or MCL for short, which lets the robot figure out where it is on a map.

You can read the original paper on MCL at this page, or a short description of it at Wikipedia. The general idea of it is to first guess where the robot is. Then when the robot moves, we take each of these guesses, called particles, and move them through the map in roughly the same way that the odometric data indicates, but with small random variations for each particle. Next we compare the current sensor data from the robot to simulated sensor data for each particle, and assign a probability to the particle based on how close the two sensor readings are. See the post on Motion and Sensing Models for more information on how positions and probabilities were assigned. Finally we resample the particles based on their assigned probabilities, so that high probability particles tend to be multiplied and low probability particles tend to die off. Then the process starts over. In this way, we can get what we hope is a good idea of where the robot is within our map.
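Here is a rough sketch (in modern Python) of one iteration of that loop. The helpers move_particle(), expected_reading(), and compare_readings() are hypothetical stand-ins for the project's motion model and sensing model code.

```python
import random

def mcl_step(particles, distance, angle, sensor_data, world_map):
    # 1. Motion update: move every particle roughly as the odometry says,
    #    with a little random variation per particle.
    moved = [move_particle(p, distance, angle) for p in particles]

    # 2. Sensor update: weight each particle by how well its simulated
    #    sensor reading matches the robot's actual reading.
    weights = [compare_readings(sensor_data, expected_reading(p, world_map))
               for p in moved]

    # 3. Resample: high-weight particles tend to be duplicated,
    #    low-weight particles tend to die off.
    return random.choices(moved, weights=weights, k=len(moved))
```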

Currently, Hamsterbot uses just its bump sensor. In the videos of our demo, you can see the robot bumping into walls, and on the screen, all the yellow particles not touching a wall disappear, because we know they aren't accurate representations, given the bump we just experienced.

Our MCL algorithm in particular has some cool features to keep the robot on track. Should none of our particles correspond to the sensor data from our robot, we create all new particles and spread them across the map, hoping one of them will be close to the actual robot. In the future, we are also going to make sure these new particles match the current sensor data from the robot...if the robot is bumping something, there's no point in creating particles far away from any walls!

This particular algorithm is something that will likely have many upgrades in the near future, but for now, it's the brains of our Hamsterbot!

Check out Video #0 to see the real HamsterBot and a good example of MCL in action. Below is a movie of a simulated Roomba running MCL. The red dot indicates the actual robot position, the blue dot is the odometric data, and the yellow dots are our MCL particles.

MCL Motion and Sensing Models

The first step in MCL involves updating particle positions with respect to our motion model:
Given the current and previous poses reported by the odometric data, we extract the distance and angle through which the robot should have moved. We then scale the distance by 1 ± p for some percent error p. The angles are randomized within a constant tolerance, c, of their values. By trial and error we found that values of 50% and 0.5 degrees, for p and c respectively, work quite well for the simulated roomba with a simulated noise factor of 2.5%.
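A sketch of that update for a single particle (x, y, theta) is below; it assumes the odometric delta has already been reduced to a distance and a turn angle, and how exactly the noise is applied is an illustrative choice.

```python
import math
import random

def move_particle(particle, distance, angle, p=0.5, c=0.5):
    x, y, theta = particle
    d = distance * (1 + random.uniform(-p, p))                # scale distance by 1 +/- p
    theta += angle + math.radians(random.uniform(-c, c))      # randomize the angle within +/- c degrees
    return (x + d * math.cos(theta), y + d * math.sin(theta), theta)
```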

The next step involves comparing robot sensory data with what each particle would sense from its pose in the given map. We used the following cases for assigning probabilities (a code sketch of this weighting follows the cases):

1. Robot bumps left and right
---> If the particle bumps left and right, it gets our maximum probability value: 0.9
---> If the particle bumps only on one side, we find the minimum angle, a, that it would need to rotate in order to register both left and right bumps, and assign a probability of (1-a/pi)*0.9.
---> If the particle is not touching any walls, we take a range reading, d, from its pose, and assign a probability of (1/(d+1))*0.9

2. Robot bumps on either left or right, but not both
---> If the particle bumps on the same side and not both, it gets probability 0.9
---> If the particle bumps on both sides, it gets 0.75*0.9
---> If the particle bumps on the opposite side, it gets 0.5*0.9
---> If the particle is not touching any walls, we take a range reading, d, from its position, at a 45 degree angle left or right, as appropriate, of its heading and assign a probability of (1/(d+1))*0.9

3. Robot is not bumping
---> If the particle is not bumping, it gets 0.9
---> If it is bumping, it gets 0.9 / N, where N is the number of particles.
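The sketch below strings the cases together. particle_bumps(), min_angle_to_double_bump(), and range_reading() are hypothetical helpers standing in for the simulated-sensor code, and the choice of which 45-degree offset goes with which side is an assumption.

```python
import math

MAX_P = 0.9

def particle_probability(robot_left, robot_right, particle, world_map, num_particles):
    left, right = particle_bumps(particle, world_map)

    if robot_left and robot_right:                       # case 1: robot bumps both sides
        if left and right:
            return MAX_P
        if left or right:
            a = min_angle_to_double_bump(particle, world_map)
            return (1 - a / math.pi) * MAX_P
        d = range_reading(particle, world_map, offset=0)
        return (1.0 / (d + 1)) * MAX_P

    if robot_left or robot_right:                        # case 2: one-sided bump
        if left != right and left == robot_left:         # same single side
            return MAX_P
        if left and right:
            return 0.75 * MAX_P
        if left or right:                                # opposite single side
            return 0.5 * MAX_P
        offset = -math.pi / 4 if robot_left else math.pi / 4
        d = range_reading(particle, world_map, offset=offset)
        return (1.0 / (d + 1)) * MAX_P

    # case 3: robot is not bumping
    return MAX_P if not (left or right) else MAX_P / num_particles
```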

Tuesday, March 6, 2007

HamsterBot Videos

See HamsterBot 1.0 being steered by mouse and hamsterball controls, and running MCL!
Video #0
Video #1
Video #2

HamsterBot

HamsterBot 1.0 was successfully demoed today. This post documents the details of our implementation...

Physical Design
Our final robot consists of a hamsterball, a roomba, a styrofoam ring, and a little duct tape. The ring is wrapped in teflon tape and mounted on two "feet" (empty teflon tape containers) which are then duct taped on either side of the cargo bay of the roomba. A motion sensor is then placed in the cargo bay (stabilized by some foam bedding) and the hamsterball is set inside the ring, such that its bottom surface rests on the sensor. The ring and sensor are sufficiently smooth to allow the ball to rotate freely in place in any direction.

Mouse Motion Sensing
Inspired by the video of iRobot's hamsterbot, we chose a wireless optical mouse as our motion sensor. The pyRoomba software used for various class assignments served as a basis for our program. The graphics code included therein utilized the Tkinter module and included functions for determining the cursor position (as long as it is over the canvas). I extended this code to include data members for the cursor position and to update these data members whenever mouse motion was detected.

Our Program
I then added a "Mouse Mode" to the pyRoomba main program where the cursor position is checked at regular intervals. If a change in position occurs, the robot's linear and angular velocities are set to be proportional to the vertical and horizontal displacements, respectively, between the starting and ending points of the movement. Thus, moving the cursor upward via the mouse causes forward motion, while moving it to the left causes the robot to turn toward the left, etc....
This motion model is basically what one would expect in order to steer the roomba with a mouse as one would normally use the mouse (that is, on the table, with the sensor facing down). Within Mouse Mode, we can also toggle "Hamster Mode" on and off. Since the mouse sits upside down, with the hamsterball "rolling" in place over the sensor, this switch simply flips the sign of the angular velocity: if it looks like the mouse is moving left under hamsterball control, the ball must actually be rolling toward the right, so we turn toward the right.

The ctypes Python module lets us call the Windows function that sets the cursor position to any point on the screen, which ensured that we would not "run out of screen" as described in the previous post, since we can just place the cursor back in the center of the screen after each motion sample.
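On Windows, the relevant calls are GetCursorPos and SetCursorPos in user32, which ctypes exposes directly. A minimal sketch of reading and re-centering the cursor this way:

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32      # Windows-only

def get_cursor_pos():
    """Current cursor position in screen coordinates."""
    pt = wintypes.POINT()
    user32.GetCursorPos(ctypes.byref(pt))
    return pt.x, pt.y

def recenter_cursor(cx, cy):
    """Warp the cursor back to (cx, cy) after each motion sample."""
    user32.SetCursorPos(cx, cy)
```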

Future Work
There are still a couple issues that could be improved upon with this system. The first is that the hardware could be nicer. The styrofoam ring is not very durable and doesn't always keep the ball centered over the mouse sensor. It also allows the ball (and thus, the cursor) to wobble back and forth slightly due to changes in velocity, which is a kind of motion that it would be nice to ignore, if not eliminate.
Secondly, the robot's motions resulting from mouse movements are somewhat jerky. Part of this is due to the ball wobbling in the ring as described above, but a human hand can create the same effect. If the mouse or ball is moved quickly forward and back, the robot's linear velocity will change direction so abruptly that it will pop a wheelie! This looks cool to an observer, but would probably not be very fun for a hamster. Thus, our motions could definitely use some smoothing. Forcing each sampled motion to run for a minimum time, or causing changes in velocity to occur more gradually, are two possible approaches to this end.

Wednesday, February 21, 2007

Problems with PyHook and Tk

Hamsterbot requires that we convert mouse motion into differential drive motion.
Both pyHook and Tkinter provide ways of monitoring cursor movements, so by modifying our pyRoomba code, we can now get an updated cursor position at each loop iteration (with the origin at the center of the map, and only while the mouse is over the map with no other windows in between). There are many different ways we could use this information to adjust our linear and angular velocities, but I think one of the following should be both simple and effective for a first try:

1. Start the cursor position at the center of the screen. Keep track of the previous cursor position. Upon detecting movement, get the new cursor position.
"Draw" a ray from the previous position to the new position and a vertical line through the previous position. If the angle between the line and the ray is within some (possibly large) tolerance of 0, go straight forward. Within tolerance of 180: go backwards. Otherwise circle forwards or backwards and left or right based on which way the ray points.

Problem with 1: The cursor position is bound within the map. We cannot make the robot go forward for an arbitrary distance using the physical mouse because the cursor effectively encounters a wall at the edge of the map. Also, if the cursor moves off the map some distance, it has to be moved back that distance before the robot will pick up the motion again. We can solve this last problem by maximizing the map, but then we are still bound within the screen. However, we might be able to scale our screen to an arbitrarily large (or small) physical area by slowing the cursor speed via the OS. In fact, increasing the cursor speed should confine hamsterbot to a smaller area, like putting up an invisible fence.

2. Start the cursor at the center of the screen. Upon detecting movement, get the new position and do as in #1, but then reset the mouse position to the center of the screen. That way we never run out of screen. (At least one of pyHook and Tk, possibly both, offers a way of checking whether a mouse move was 'injected' by a program rather than coming from the actual device, so we won't count the resetting of the position as a movement.)

Problem with 2: Neither pyHook nor Tk nor any other package I can find seems to offer a way of injecting Windows mouse events in python.

Wednesday, February 7, 2007

Programs, Python and Photographs



Today Lilia and I successfully wrote a simulator for our dear Roomba which navigates a small room and seeks out a goal. This program is written in Python and runs on my laptop, which communicates the proper movements to the Roomba via a Bluetooth radio, which is the chip seen in the picture.


Indeed, we tried to test this program out on our Roomba. Unfortunately, we discovered that Roombas tend to not work so well when you don't have them charged. As a result, the only video of our progress so far is this one. Just wait, better things will come very soon!



Until we get some cool videos up, you will have to be content with the ever-fearsome Snakes on Roombas!

Friday, February 2, 2007

Remote Control Roomba

Today we got the pyRoomba python code working on Scott's laptop. We also obtained a bluetooth radio for our roomba so that it can be controlled remotely through python scripts or the command prompt.

Our next objective is to capture some images and/or video of the roomba in action. We are also trying to think of an alternate name for our roomba, to distinguish it from other roombas...

We will also start working on the next step towards Hamsterbot: writing Python code to capture wireless optical mouse data and convert it into pyRoomba commands.

Wednesday, January 31, 2007

Up and Running

After several downloads and more than a few pages of manuals, the Roomba Create has successfully run a simple program and been connected to a computer!

Courtesy of the Create manuals page, we were able to download tools which allow us to write basic programs in C, compile them, and then transfer this compiled code over to the Roomba.

So far, deciphering the example code has proven to be tricky, but soon enough of it will be understood to create a sample program. The goals of this sample program will be testing the Roomba's mobility and sensor outputs.

Monday, January 29, 2007

How to Read this Blog

This blog serves as a website conforming to the guidelines given in HW2 for a technical report documenting a lab project for CS154.

The front page consists of a blog of all posts made on the site, making it extremely fast and easy to see if anything new has been added (although older posts may be edited here and there without notice). This includes everything from progress reports to background information to media. However, there is no guarantee that individual posts will be related to those around them. In fact, the opposite is far more likely.

For a more cohesive reading experience, try navigating through the various Sections seen in the sidebar. For now, these are either links to particular posts (like this one) or to collections of related posts.

Introduction

In the beginning, there was dirt. So iRobot created the Roomba, a robotic vacuuming system. But then the roboticists got excited and started fiddling with their Roombas, so iRobot created the Create...

This is the "Introduction" section post. It needs to be edited and made better, stronger, faster...

Friday, January 26, 2007

iRobot Create: Manuals & Downloads

Lots of helpful documents from iRobot, including the Create's Open Interface specification, can be found here.

HamsterBot


Our first mission: a hamster-controlled roomba. As the video above (from the 2007 Consumer Electronics Show in Las Vegas) demonstrates, we have the technology! Our challenge now is to assemble it and of course to attempt to answer the question, What can one do with a hamster-controlled Roomba?

Progress Report #0

Today we received our Create and various accessories including the command module, remote, and battery pack. Unfortunately, the battery was not yet charged, so further progress will have to wait.