Friday, December 31, 2010
I've been spending quite a bit of time working on SLAM with the Neato XV-11, using both the built-in laser and the Hokuyo URG-04LX-UG01. I had pretty much given up on gmapping working with the Neato -- until earlier today, when we found an issue with the scan angle increment computation in gmapping that doesn't play well with the Neato laser specifications. I probably wouldn't have found this bug had a user on the Trossen Robotics Community not pointed out some issues he was having with gmapping, as my version still had some modifications from my PML work earlier this year.
Anyways, for anyone wanting to use gmapping with the Neato robot, you can apply the following patch:
- gsp_laser_angle_increment_ = (angle_max - angle_min)/scan.ranges.size();
+ gsp_laser_angle_increment_ = scan.angle_increment;
to slam_gmapping.cpp. This uses the angle_increment published in the laser scan, rather than the one computed from angle_min and angle_max, which comes out wrong for full-rotation scans. This avoids issues with the scan being improperly inverted, and with scan matching.
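To see the size of the error, here's a standalone sketch of both formulas (the 360-beam, one-degree-per-beam geometry is an assumed example for illustration, not the exact values from the Neato driver):

```cpp
#include <cassert>
#include <cmath>

// slam_gmapping's original beam-spacing formula: divide the angular span
// by the number of beams. For a full-rotation scan where angle_min sits on
// the first beam and angle_max on the last, the span covers n-1 increments,
// not n, so this comes out slightly small -- and the error accumulates
// across the scan.
double computed_increment(double angle_min, double angle_max, int n_ranges) {
  return (angle_max - angle_min) / n_ranges;
}

// The patched behavior: just trust the increment the driver publishes.
double published_increment_for_full_rotation(int n_ranges) {
  return 2.0 * M_PI / n_ranges;  // e.g. one degree for a 360-beam scan
}
```

For that geometry the computed spacing is ~0.017405 rad against the driver's 0.017453 rad -- tiny per beam, but nearly a full degree of misplacement by the last beam, which is plenty to confuse the scan matcher.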
Here's another example map created (of the first floor of a home):
Next up on the docket of winter break projects: some updates to the ArbotiX ROS package, and a number of people perception algorithms.
Saturday, December 25, 2010
Here's an updated map using the 12-12-neato-ils bag file:
There's still some work to fix the way that the map->odom transform is handled, and allow a configurable map size and resolution (both of which will require reworking some more of the underlying library). I'm hoping to have the code released shortly.
Friday, December 24, 2010
These datasets were collected on iRobot Creates, the Neato XV-11, and the Vanadium Armadillo using the Neato Laser or the Hokuyo URG-04LX-UG01. In particular, I've recently collected a dataset I really am looking forward to working with: the 2010-12-23-double-laser.bag dataset, which consists of a long route around the ILS lab with a Neato XV-11. I mounted a second laser on the Neato for this run, a Hokuyo URG-04LX-UG01 sensor, which is aligned with the Neato laser from above:
And a picture of the Neato with second laser:
Thursday, December 23, 2010
One recently published paper that actually uses a short-range Hokuyo is CoreSLAM, a SLAM implementation in under 200 lines of code. Well, the map update/storage code is under 200 lines of C; the complete system relies on a bit more, namely a particle filter for localization.
It looked good, so I integrated it into ROS:
This map currently uses only odometry; I'm going to try to get the MCL localization aspects working tomorrow. I also need to get the map_saver to work with my ROS wrapper. I hope to have this on our SVN by the weekend.
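For anyone curious what that MCL layer involves, here's a rough sketch of one particle filter update step in C++. This is a generic illustration with made-up noise parameters, not CoreSLAM's actual code:

```cpp
#include <cassert>
#include <cmath>
#include <random>
#include <vector>

// A toy Monte Carlo localization update, sketching the particle-filter
// layer that sits on top of the tiny map code.
struct Particle { double x, y, theta, weight; };

// Motion update: apply an odometry delta with additive Gaussian noise.
// The noise sigmas here are arbitrary example values.
void predict(std::vector<Particle>& ps, double dx, double dtheta,
             std::mt19937& rng) {
  std::normal_distribution<double> trans_noise(0.0, 0.02);
  std::normal_distribution<double> rot_noise(0.0, 0.01);
  for (auto& p : ps) {
    p.theta += dtheta + rot_noise(rng);
    double d = dx + trans_noise(rng);
    p.x += d * std::cos(p.theta);
    p.y += d * std::sin(p.theta);
  }
}

// Measurement update: reweight each particle by how well its pose explains
// the scan, then normalize so the weights sum to one.
void reweight(std::vector<Particle>& ps,
              double (*likelihood)(const Particle&)) {
  double total = 0.0;
  for (auto& p : ps) { p.weight *= likelihood(p); total += p.weight; }
  for (auto& p : ps) p.weight /= total;
}
```

In the real system the likelihood function scores each particle's predicted beam endpoints against the occupancy map, and you resample whenever the effective particle count drops too low.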
Tuesday, December 14, 2010
Sunday, December 12, 2010
On the subject of ROS, I'm also a big fan of that. Which brings us to the true reason for this post: I picked up a Neato XV-11 robotic vacuum this week (Thursday) and it arrived Friday (Amazon Prime $3.99 next-day shipping FTW). Friday night I spent about 4 hours getting the laser scanner and motor basics lined up, and then drove it around for a little while to discover numerous bugs. This afternoon I worked out most of the bugs, although I still need to work on the odometry calculations a bit more. Anyways, here's a quick video:
A couple of thanks to send out -- had it not been for this blog post by Hash79 of the Trossen Robotics Community, I probably wouldn't have even bought a Neato -- but all that data! The Neato looks like it could be a very interesting competitor for the iRobot Create -- hopefully I can get the odometry/laser data to work in gmapping (so far, I've had *no* luck).
More tomorrow -- as well as a code release (after cleaning things up).
Monday, November 1, 2010
a FitPC2 brain, and a 4DOF arm. He's also sporting a Hokuyo URG-04LX-UG01 laser range finder.
The poor Armadillo was sitting around for quite a while until I upgraded his motor drivers to handle the extra weight of the platform. However, he's now fully operational. To test out whether his odometry would be good enough to work with the ROS navigation stack, he was driven around to build a series of maps.
The first map is of the first floor of my house. The Armadillo started in the living room, went down the hallway and into the kitchen, and then returned. The map came out pretty well; I plan to collect a map of the complete house later this week.
A second, much larger map was made of the CS department hallways. This one had some issues: in particular, the scan matching was creating false positives, which "shortened" the hallways. I'm still hopeful this can be made to work, though, with a bit more parameter tweaking. Below, the image on the left is the map from gmapping, and the image on the right is a raw odometry-based costmap in RViz:
I'm going to try and keep this blog a bit more up to date from now on.... we'll see if it actually happens.
Monday, August 9, 2010
I've been a bit distracted playing around with navigation and motion control under ROS. During July I built a "Poor Man's LIDAR" (or PML for short), out of a long range IR sensor and an AX-12. The image to the left shows the PML mounted on ROSalyn, an iRobot Create based ROS-powered robot I've recently assembled at the University lab. I'm using a new ArbotiX-derivative board to control an AX-12 pan and tilt and the PML.
I actually bought the sensor about 18 months ago -- originally to put on REX (may he rest in pieces) -- but hadn't gotten around to actually hooking it up until recently (partially inspired by the successes Bob Mottram had). All in all, it works fairly well: far better results than I ever had with a sonar ring, but of course nowhere near a true LIDAR. The PML results are broadcast within ROS as if they were actually produced by a laser scanner. In the RViz view you can see the scan (black dots) and a costmap_2d generation (red dots are lethal objects, blue dots are the expanded version for motion planning). The robot is at the end of our hallway; the costmap range is 3m, less than the 5 or so meters the laser can trace out, so the walls inside the distant rooms show up only as laser scan dots, with no costmap generation.
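For reference, the two conversions at the heart of the PML look roughly like this. The AX-12 numbers (1024 counts over 300 degrees, centered at 512) come from the servo documentation; the IR linearization constants are hypothetical placeholders, since every Sharp sensor really needs its own calibration:

```cpp
#include <cassert>
#include <cmath>

// Convert an AX-12 position register value (0-1023) to radians from the
// centered position, using the servo's 300-degree range over 1024 counts.
double ax12_to_radians(int position) {
  const double degrees_per_count = 300.0 / 1024.0;
  return (position - 512) * degrees_per_count * M_PI / 180.0;
}

// Rough inverse model for a long-range Sharp IR sensor's ADC reading.
// k and offset are assumed placeholder values, not calibrated numbers.
double ir_adc_to_meters(int adc) {
  const double k = 95.0;      // hypothetical
  const double offset = 7.0;  // hypothetical
  if (adc <= offset) return INFINITY;  // reading out of usable range
  return k / (adc - offset);
}
```

Sweep the servo, take a reading at each step, and you have the angle/range pairs to stuff into a LaserScan message.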
All of this ROS work is towards the goal of producing a very robust and extensive ROS package for the ArbotiX. The core of the package allows the ArbotiX to control AX-12 servos, read/write digital IO, and read analog inputs -- all within ROS. There are also extensions to control differential-drive robots, or NUKE-powered walkers, using the standard "cmd_vel" topic -- and to publish odometry so that the bots can be tied into the ROS navigation stack. Version 0.1 is now in SVN, although the ROS API is quite unstable and will be changing drastically in 0.2 (to a much nicer and more robust interface, which also sets up several features I want to implement further down the line).
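The odometry side of the differential-drive extension boils down to the standard dead-reckoning integration. Here's a sketch, with the track width passed in as a parameter (the real package takes such values from configuration; this is an illustration, not the package code):

```cpp
#include <cassert>
#include <cmath>

struct Pose { double x, y, theta; };

// Integrate one update period of differential-drive odometry, given the
// distance each wheel traveled (meters) and the distance between wheels.
// Uses the midpoint heading for slightly better accuracy on arcs.
Pose integrate(Pose p, double d_left, double d_right, double track_width) {
  double d = (d_left + d_right) / 2.0;               // forward travel
  double dtheta = (d_right - d_left) / track_width;  // rotation
  p.x += d * std::cos(p.theta + dtheta / 2.0);
  p.y += d * std::sin(p.theta + dtheta / 2.0);
  p.theta += dtheta;
  return p;
}
```

Equal wheel travel moves the pose straight ahead; opposite travel rotates in place. Encoder ticks get converted to meters upstream via a counts-per-meter calibration.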
Friday, June 25, 2010
Today, we'll start with an idea that seems to go against all safety warnings: you can blink an LED directly off an Arduino/AVR pin. Yep, no resistors. We've probably all experienced the fun of plugging an LED in between our 5V and ground rails, and watching it glow brightly for a second before exploding and smelling quite awful -- so it seems natural we'd always want to install a current-limiting resistor to stop this problem.
But here's the interesting part: your Arduino/AVR is a current-limiting device. The I/O pins can typically only source about 15-20mA of current. Thus, if we connect our LED between the pin and ground, and toggle the pin high, our LED glows nicely without exploding.
There are a few caveats to this, though: you can't do this on every pin of the AVR at the same time, or even on a large number of pins. The ATMEGA168 data sheet specifies that the absolute maximum DC current between Vcc and GND is 200.0mA. Since the AVR core and peripherals such as the UART/SPI/ADC draw some current of their own, we clearly can't control a huge number of LEDs this way. Also note that if we leave the LEDs on for a long time, you'll probably notice the AVR getting a bit warm sourcing that much current.
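A quick back-of-envelope check on that budget (the 200mA limit is from the datasheet; the per-LED and core-draw figures are rough assumptions):

```cpp
#include <cassert>

// How many LEDs can be driven directly before hitting the chip's
// absolute maximum Vcc-to-GND current? The core/peripheral draw and
// per-LED current below are assumed round numbers, not measurements.
int max_direct_leds(int chip_limit_ma, int core_draw_ma, int per_led_ma) {
  return (chip_limit_ma - core_draw_ma) / per_led_ma;
}
```

With roughly 20mA reserved for the core and peripherals and about 20mA per pin, that's only around 9 LEDs before reaching the 200mA absolute maximum -- so don't plan on a direct-drive LED cube.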
Next Up: Pull-Up Resistors (and Why You Need Them)
Tuesday, June 22, 2010
Unfortunately, no pictures. My camera is on vacation in Florida. But, here's a rendering of the final head/snout with the MS Lifecam cameras:
Sunday, June 13, 2010
Each gripper has two printed parts: the HS-55 mount and the C-bracket. The servo mount is the same part for both the left and right sides but I had to slightly tailor the C-brackets to get decent range of motion. Each HS-55 mounts to an AX-12 C-bracket.
The gripper fingers are made of 1/16" thick 5052, about 3/4" wide, mounted with two 2mm screws. Instead of a bearing, I'm using a 3/16" Chicago bolt on the non-driven end of the printed C-bracket. Preliminary tests show the grippers have no problem holding the robowaiter plate.
Thursday, June 10, 2010
One of the key features desired for Issy3 is a set of feet with tactile feedback. I've played with FSR versions in the past, and they tend not to survive that long.
Two afternoons of prototyping and printing have yielded the first of Issy's new feet. I have one leg assembled, but I'll still have to get some different hardware tomorrow to finish tweaking it.
The foot consists of three printed parts: the outer case, the foot pad, and a retainer insert inside the case. A screw connects through the foot pad and the retainer, and a spring pushes the foot downward. There will be a little spring steel connector on top that gets touched by the screw -- acting as a simple switch. A vinyl footpad gives traction.
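Reading a contact like that from the microcontroller means dealing with bounce as the spring steel makes and breaks. A common shift-register debounce, sketched in plain C++ (a generic technique, not code from Issy's controller):

```cpp
#include <cstdint>

// Debounce a mechanical contact by shifting raw samples into a byte and
// reporting "pressed" only after eight consecutive closed readings.
struct Debouncer {
  uint8_t history = 0;
  // sample: true when the switch reads closed (foot touching the ground).
  // Returns true once the contact has been stable for 8 samples.
  bool update(bool sample) {
    history = (history << 1) | (sample ? 1 : 0);
    return history == 0xFF;
  }
};
```

Sampled every millisecond or so, this filters out the mechanical chatter while adding only ~8ms of latency to the touchdown signal.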
Each foot takes about an hour of printing on the Makerbot.
Thursday, May 20, 2010
Issy did terribly in Mech Warfare. He walked like a champ, but had numerous other issues. On Friday afternoon his gun literally exploded inside. Thankfully, I found a Sports Authority around the corner and purchased an extra gun (have I mentioned how much more convenient a venue San Mateo is than the old place?). Apparently, though, the Trendnet camera decided that 6V was just too much -- it stopped working on Saturday, and Issy was done for. He's already been torn down, and retired from Mech Warfare.
He'll be rebuilt over the course of the next month or so, this time with a Fit-PC2, stereo camera head, and a tail. He's going to be running ROS, which ought to be cool. Here's a teaser shot of what he will (hopefully) look like:
On other fronts, I'll be posting some other robot goodies over the coming days. I'm currently working on some finishing software touches for our first "social" robot at Albany, Nelson. I'm also working on a robowaiter entry for next April.
Friday, April 9, 2010
It's often said that "you can't eat your cake and have it too" -- but we at Vanadium Labs don't like idioms, because if we use them, our robots will too. What could be more annoying than a robot telling you "There is no free lunch"? So, we decided that the ArbotiX had to now control RX servos -- and that's exactly what our new RX-bridge can do! This little board plugs into your ArbotiX, transforming it into the lowest cost RX controller on the market -- and you can still use your favorite apps like PyPose/NUKE and the familiar Arduino IDE.
P.S. You can see it all in action on Andrew's RX-64 based quadruped -- powered by NUKE:
Sunday, March 14, 2010
The former is built out of Home Depot aluminum, a cookie sheet with holes drilled in it, a portable grill/crockpot thingy, and a massive shop vac.
It's pretty much a monstrosity. I tried to keep it small by reducing the height of the lower box, but then found out I didn't have enough space to get the vacuum tube to turn... I then found these great adjustable legs from a commercial range; they brought up the height a bunch and made it easy to plug the tube in. They also add a bunch of weight to the bottom, making it quite a bit more stable.
Unfortunately, you'll have to wait a bit longer to see some of the results, as I didn't get around to snapping shots of Nelson's face plate.
Friday, March 12, 2010
The first one looked great, but about halfway through, the pile of ABS got all coiled up. I walked back in to find that the print head and Z-carriage had been pulled to the top of the bot, and the ABS was hanging off the side. De-tangle, reset, chop off about 10-20ft of ABS, and try again. This print took 1h38m.
I then cleaned it up, attached the dual IR sensors, and the AX-12 servo. The 5/8" hole for the sonar sensor needed quite a bit of cleanup, but everything else was nearly perfect. I've now got all the sensors except the IR photodiode mounted, and the head is on SMALdog.
Now, I've got about 4 weeks before the fire fighting competition. Hopefully that's enough time to tune his walking gait, load in the map-following code, and get everything else working....
Wednesday, February 17, 2010
We also purchased a MakerBot, which is now up and running. This little guy will be producing robot parts soon, as I have a number of projects on the backlog that I want to get going.
As for that product on the way: how about controlling some RX-64 servos with your ArbotiX? You'll be able to do just that with our new RX-Bridge, a little add-on board for the ArbotiX that converts it from an AX-12 controller, to an RX-series controller (and EX-106's too).
Lastly, we've been hard at work testing, and the V1.1 release of PyPose/NUKE is just around the corner. It should be out very shortly, with a number of improvements and bug fixes. Unfortunately, mammal-style IK will be held over until v1.2.
Saturday, January 9, 2010
The first is the MINI robocontroller. It's a smaller version of the ArbotiX based on the ATmega168 -- so it's completely Arduino compatible. Just like the ArbotiX, it has a dual motor driver, XBEE socket, low-dropout regulator, and 3-pin headers for I/O. The MINI also has a block of 4 I/O ports that can be easily configured to control servos. The only things it's missing compared to its big brother: it can't control AX-12s, and it has fewer I/O. This is the perfect board for your first rover! The MINI has had serious testing: an earlier revision was used as the scoring transponder for Mech Warfare 2009, and the new MINIs will be the scoring transponders for MW2010 and beyond. The MINI has also been the board used in my Introduction to Embedded Computing and Robotics workshop. MINIs will retail for $60 and should be available from Trossen Robotics later this week.
A completely different product is the ArbotiX Commander -- an Arduino+XBEE based handheld wireless controller. If you've been using NUKE, you've probably heard of our Commander library and protocol. Being open source, you can hack the Commander to do whatever -- there's a row of female headers along the side of the chip, and a whole slew of extra prototyping area at the top edge. It'll be a bit pricey compared to off-the-shelf controllers like the PS2, but I think the open-source design and integrated XBEE make this the perfect controller for advanced roboticists. The Commander will be available later this month!
Sunday, January 3, 2010
NUKE is written in Python, and it exports a C/C++ Arduino project that runs on the ArbotiX. NUKE can be downloaded from our Google Code site: http://code.google.com/p/arbotix/downloads/list. Documentation is also on that site.
We also have a new Google group for support (it's very new, hence the low traffic): http://groups.google.com/group/robocontroller