The Pixy Kickstarter crowdfunding campaign is about to end, and end very well, having raised nearly 10 times its $25,000 target. What’s Pixy? It’s a fast, color-sensitive vision sensor for mobile robots that you can “teach” to track objects. In simple terms, it’s an eye for your robot.
Revolution in speed and simplicity
Pixy is an open source hardware project by Austin, TX-based Charmed Labs and Carnegie Mellon University. It’s revolutionary because of its speed and simplicity. We’ll talk some more about the speed and how you teach it to recognize objects, but the key factor behind its appeal on Kickstarter is that you don’t have to be a robotics expert to use it.
You don’t have to build anything or write code to make it work with the rest of the robot. Amateur robot enthusiasts can buy Pixy and hook it up directly to an open-source microcontroller such as the Arduino over an SPI or UART serial interface.
Then there’s the price factor – early Pixy backers got one Pixy unit and an Arduino cable for $39. That tier is sold out, and the special Kickstarter price is now $59. It’s a good bet that once the product goes to market (available Nov 2013), its price will eventually fall further as volume picks up.
This means you can buy a part (on the cheap) that essentially solves the vision part of building a robot, and that’s a huge leap as far as bringing robotics to the masses is concerned.
Ted Macy, contributing editor for Robot Magazine, said in his review of Pixy that it is “the single most important robotics product since the Arduino.”
Pixy can be taught to recognize objects
Let’s take a look at some of the new capabilities that Pixy brings to a mobile robot. You can teach Pixy to recognize an object. Actually, you can teach it to recognize hundreds of objects. You just put the object in front of it and press a button. Pixy stores the color-combination model in its flash memory and can then recognize any object with the same color signature when it encounters it again.
It gets better, because Pixy operates at 50 frames/sec, which means it can process a 640×400 image in 1/50th of a second (or 20 milliseconds). Pixy can track moving objects, generating an update on the object’s position every 20ms.
As a practical example, consider that it can track a bouncing ball. A ball traveling at 30mph moves less than a foot in 20ms. So you can throw a ball, and a Pixy hooked up to an Arduino and the rest of a mobile robot will follow it around like a dog.
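The arithmetic behind those numbers is easy to check, using only the figures quoted above:

```python
# Position-update period at Pixy's 50 frames/sec
fps = 50
frame_period_ms = 1000 / fps  # 20 ms between position updates

# How far a 30 mph ball travels in one frame period
ft_per_s = 30 * 5280 / 3600                       # 30 mph = 44 ft/s
distance_ft = ft_per_s * frame_period_ms / 1000   # ~0.88 ft per update
```

Between consecutive position reports, the ball drifts by well under a foot, which is what makes tracking it at this rate feasible.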
For the final week push on Kickstarter, Charmed Labs created a cute cat video of Pixy playing with lasers. It makes the robot seem eerily “alive.” Granted, it’s more Wall-E than Terminator, but the principle is the same. The Austin Post’s Chris-Rachael Oseland says “Charmed Labs is bringing the eventual robot uprising one step closer with their camera sensor.”
Startups.fm had a chat with Charmed Labs founder Richard LeGrand, who scoffed at the suggestion that Pixy could be the key to a robot revolution. “Pixy isn’t technologically sophisticated to make us worry about The Uprising,” says LeGrand.
“But you never know… Someone out there might be inspired by what we’re doing. This is our hope actually,” he added.
Pixy is quite sophisticated compared with the current crop of robotic vision sensors. Until now, robots have essentially used video cameras that send humongous amounts of image data to a powerful onboard processor, which then has to drop everything to process the data and translate it into an object and a location.
Pixy has a hue-based color filtering algorithm. It converts each RGB pixel from the image sensor to hue and saturation, filters on those values, computes the size and location of the matching object, and sends only that summary to the Arduino or another microcontroller, leaving the host CPU free for other tasks.
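To make the idea concrete, here is a minimal, illustrative sketch of hue-based filtering in Python. This is not Charmed Labs’ firmware: the function names (`matches`, `find_object`) and the simple hue/saturation window standing in for a taught color signature are assumptions made for illustration.

```python
import colorsys

def matches(pixel, hue_range, min_sat=0.5):
    """True if an (r, g, b) pixel (0-255) falls inside the color signature,
    modeled here as a hue window plus a minimum saturation."""
    r, g, b = (c / 255 for c in pixel)
    h, s, _v = colorsys.rgb_to_hsv(r, g, b)
    lo, hi = hue_range
    return lo <= h <= hi and s >= min_sat

def find_object(image, hue_range, min_sat=0.5):
    """Scan a 2-D image (rows of RGB tuples) and return the bounding box
    (x_min, y_min, x_max, y_max) of matching pixels, or None if no match.
    The box gives the object's location; its width and height give the size."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, px in enumerate(row):
            if matches(px, hue_range, min_sat):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (min(xs), min(ys), max(xs), max(ys))

# A 2x3 toy frame: red object in the top-left corner, blue background
frame = [
    [(255, 0, 0), (255, 0, 0), (0, 0, 255)],
    [(255, 0, 0), (0, 0, 255), (0, 0, 255)],
]
print(find_object(frame, hue_range=(0.0, 0.05)))  # (0, 0, 1, 1)
```

Only the four-number box, not the frame itself, would need to cross the SPI/UART link, which is what keeps the host microcontroller’s CPU free.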
A DIY robot assembled in your attic
Simple translation – an ordinary tin-can DIY robot assembled in your attic can now see and act on what it sees at the same time. LeGrand says it can detect objects, navigate, avoid obstacles, decode barcodes, and do thousands of other things.
You can also hook it up to your PC or Mac using a simple USB cable. It comes with the PixyMon application, which lets you see on your screen what Pixy is looking at, with or without an Arduino.
Richard LeGrand didn’t just get a bright idea one day and come up with Pixy. Pixy is actually the CMUcam5. Charmed Labs, CMU faculty member Anthony Rowe, and former grad student Scott Robinson worked on it for two years.
You can see the previous four versions which Rowe’s group tinkered with on cmucam.org. The project was supported by NASA’s Ames Intelligent Systems Program, Parallax, Lextronic, SparkFun, and the Semiconductor Research Corporation.
LeGrand founded Charmed Labs in 2002, and has a long history of association with Carnegie Mellon on prior educational robotics projects such as the Telepresence Robotics Kit (TeRK) and GigaPan robotic camera mount (gigapan.com).
So what’s next for Pixy? Right now they’re focused on mass-producing thousands of units to ship to the 2,383 backers (as of Sept 12, 2013) on Kickstarter. They also plan to improve its capabilities, and are seriously considering giving Pixy face detection and tracking powers. Other items on the to-do list include support for Linux and the Raspberry Pi.