Vacuum cleaner robots
Because “I do software” I was never too hyped about vacuum cleaner robots. In the back of my mind I knew how hard it would be to process all the ambient space and behave more or less intelligently. But I eventually gave in – I was drunk on the idea of having less boring work to do 😃 and I was also a bit curious. Heck, there must have been some progress in the last five or so years, right? Look at the processing power of Raspberry Pi or Arduino boards. People are doing crazy stuff with these.
So I did some research and asked friends around me. Putting all the references and information together, I bought an iRobot Roomba 620. The promise was to do what I needed, do it well, and without any gadgets I didn’t need.
But it actually didn’t. The robot was so stupid. First, the walls. In 9 out of 10 cases the Roomba hit the wall hard, without slowing down. Not in corners – a regular straight wall with nothing within a meter on the left or right. Putting it into the bathroom, closing the door and letting it do its work sounded like an animal trying to escape the room. And the algorithm. I expected it to drive in a more or less structured way. But no. It was pure random. Or at least it looked like that. Take turning in front of a wall (after hitting it, of course). I was expecting it to turn left or right, drive a couple of centimeters, turn again and drive to the other end of the room – kind of zig-zagging through the room. Maybe it’s not the fastest way, but to a human being it looks smart. It’s roughly what people do when vacuum cleaning. Other issues – being unable to find the home station within half an hour when the station was two meters away and in line of sight, the noise, … – were just confirmation that I didn’t want it. Sure, it was not iRobot’s top model, but I lost hope.
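For the curious: the pattern I had in mind is what robotics folks call a boustrophedon (zig-zag) sweep. Here’s a minimal toy sketch in Python – purely my own assumption about how the two strategies differ on an empty rectangular room, not anything from iRobot’s or LG’s actual firmware – comparing how many floor cells a random-bounce robot and a zig-zag robot cover in the same number of moves.

```python
# Toy model (my assumption, not real robot firmware): compare random-bounce
# coverage with a zig-zag (boustrophedon) sweep on an empty rectangular grid.

import random

WIDTH, HEIGHT = 20, 10          # the room as a grid of floor cells
STEPS = 400                     # number of moves each strategy gets


def random_bounce(steps=STEPS):
    """Drive straight until a wall is hit, then pick a random new heading."""
    x, y = WIDTH // 2, HEIGHT // 2
    dx, dy = 1, 0
    visited = {(x, y)}
    for _ in range(steps):
        nx, ny = x + dx, y + dy
        if 0 <= nx < WIDTH and 0 <= ny < HEIGHT:
            x, y = nx, ny               # keep driving straight
        else:
            dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        visited.add((x, y))
    return len(visited)


def zig_zag(steps=STEPS):
    """Sweep row by row: run to the end of the row, shift one row, reverse."""
    x, y, dx = 0, 0, 1
    visited = {(x, y)}
    for _ in range(steps):
        if 0 <= x + dx < WIDTH:
            x += dx                     # continue along the current row
        elif y + 1 < HEIGHT:
            y += 1                      # step down one row at the wall...
            dx = -dx                    # ...and sweep back the other way
        visited.add((x, y))
    return len(visited)


if __name__ == "__main__":
    total = WIDTH * HEIGHT
    print(f"random bounce: {random_bounce()}/{total} cells covered")
    print(f"zig-zag sweep: {zig_zag()}/{total} cells covered")
```

Even this crude model shows why the random bounce felt so dumb to watch: the zig-zag sweep covers the whole room in roughly one pass, while the random walk keeps re-cleaning the same spots.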
But I was not ready to give up. I couldn’t believe all the people buying these were OK with that. So I did another round of research, this time focused more on smartness. The next day an LG Hom-Bot 62601LVM arrived, with a promise – based on internet reviews – to be way smarter (or at least to have a heck of a lot of sensors). And I have to say it is. It’s still nowhere near perfect, but it’s acceptable, I think. The algorithm really looks like an algorithm. In zig-zag mode the robot follows the path I would expect, give or take. It’s clear that it still has zero clue about the room’s shape, but at least it does its work consistently. It doesn’t hit the walls. Hooray! It even detects flower pots, edges of opened doors and most of the corners. It’s still fun to see that the robot mostly has no idea what’s in front of it – whether it should touch it lightly (e.g. a pillow or a small flower pot) or whether it can use some force to overcome it. So sometimes it pushes a pillow in front of itself before finally deciding this is not going to work. Sometimes it drives over a (horizontal) table leg using pure force instead of pure computing power, looking like an off-road vehicle in full swing. So it’s not 100% – I think the last 10% (or whatever) is always an uphill battle. It still needs to be checked on while working, but I’m keeping this one. I just wish I could tweak some parameters. Every house is different.
Looking forward to seeing where the technology goes in the coming years.