Cube26, making it possible for computers to understand our emotions

There is a real buzz around Human Computer Interaction (HCI) technology these days, such is its popularity among smart device users. HCI is what makes it possible for us to control our smartphones with just a glance or the wave of a hand, or to have lights and TVs switch on for us when we enter a room. At the rate things are moving, the applications of this field could go very far indeed. At the forefront of developing this technology is Cube26, a Santa Clara-based startup whose recent gesture-controlled iOS app made waves when it launched a few months ago.

A different breed of HCI startup

Founded by ex-Cornell students Aakash Jain, Abhilekh Agarwal, and Saurav Kumar, Cube26’s success belies its humble origins. When the three started off, it was just a distant dream. Brought together by the hackathons they used to attend, they began working on robust pattern recognition algorithms while still at Cornell. That is when they built their basic engine, which they deployed onto an iPad. The result was that, suddenly, they could understand user engagement and emotions in real time.

It was around then that they started the process of customer development, trying to understand the market for their new technology. As Kumar says, “Our idea was always about a great technology that looked for problems to solve.” That was 2012. They spent the entirety of the year understanding the different market segments where their innovations could potentially add value. By the time March rolled around, Ketan Banjara, an executive at Yahoo, had joined them to help with the business development process. They considered everything from smart mobiles and TVs to digital signage, toys and cars. During that time, they amped up their technology too, and realised that they were building a very different breed of HCI startup.

When you shush at your phone and it works!

Cube26’s aim is not to solve the problem of humans interacting with machines through only one medium, such as eye tracking or gesture control. Their pattern recognition engine works the way we, as humans, do. When we talk to each other, we understand each other’s gestures and emotions, the attention we pay each other, and physical characteristics like age, gender, and facial features. Cube26’s system was built from the ground up to support this kind of information. The surprising thing about their technology is that it works with existing hardware, such as the camera on your iPad or a low-resolution webcam.

So if you were to stand before any of your devices powered by their software, it would immediately be able to identify your age and gender, whether you are shushing at it or smiling, and whether you are paying attention. The focus is on detecting spontaneous and instinctual movement, unlike other gesture-controlled technologies such as the Kinect. Imagine browsing Netflix and being recommended films you actually want to watch, based only on your facial expressions. Their latest app, LookAway, was launched in April to widespread critical acclaim. A free iOS app, it lets you pause a video by merely looking away, and mute it by shushing at it. They call it “natural gesture control”, and it also comes natively integrated into the Samsung Galaxy S4.
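To give a feel for how this kind of “natural gesture control” might be wired up, here is a minimal sketch in Python. The event names, the dispatcher, and the toy player are all hypothetical, chosen only to mirror the LookAway behaviour described above; they are not Cube26’s actual API.

```python
# Hypothetical sketch: dispatching recognized gaze/gesture events to
# media-player actions, in the spirit of LookAway. Not Cube26's real API.

class DemoPlayer:
    """A toy video player with the states our events will change."""
    def __init__(self):
        self.state = "playing"
        self.muted = False

    def pause(self):
        self.state = "paused"

    def resume(self):
        self.state = "playing"

    def mute(self):
        self.muted = True

def handle_event(event, player):
    """Map a recognized event (from a vision engine) to a player action."""
    actions = {
        "look_away": player.pause,   # viewer's gaze leaves the screen
        "look_back": player.resume,  # gaze returns to the screen
        "shush": player.mute,        # finger-to-lips gesture detected
    }
    action = actions.get(event)
    if action is not None:
        action()

player = DemoPlayer()
handle_event("look_away", player)  # video pauses
handle_event("shush", player)      # audio mutes
print(player.state, player.muted)  # → paused True
```

The real engine would of course emit these events from live camera frames; the point of the sketch is only that the hard part is the recognition, while the application-side wiring stays simple.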

From PredictGaze to Cube26

Cube26 was formerly known as PredictGaze, but as the team worked on customer development, they realized that eye tracking was just one piece of the HCI puzzle. A cube, on the other hand, evokes three-dimensional geometry, and it is in three-dimensional space that the three components of HCI – humans, devices and cameras – interact. A 3x3x3 cube is made up of 27 unit cubes, 26 of which are visible on the exterior layer – hence the name.

Their technology is not confined to the iPhone and Android – they have ported it to Smart TVs and digital signage as well. Take, for example, a restaurant chain trying to change its menu dynamically based on who is looking at it. If it is summer and a child is looking, the menu will show a selection of ice creams; if it is a middle-aged man, the selection changes to light beers. Cube26 is in the process of building partnerships with OEMs, retail companies and toy makers. The exact details are being kept under wraps for now, but will be revealed around August/September, along with a couple more products.

From its humble origins in a garage, Cube26 really came into its own at Microsoft’s Mega Startup Weekend this year, where they won the event. Speaking of the experience, Kumar says, “It was superb. We are hard core coders right from our Cornell days, but what was different about Mega Startup Weekend compared to other hackathons was that it was no longer just about product hacks. There was a business side of things there. It was a lot of fun, a really great experience.”
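The dynamic-menu idea above boils down to a simple decision rule sitting on top of the recognition engine. A minimal sketch, assuming a hypothetical function that already receives the detected viewer’s age and the current season (the thresholds and menu items below are illustrative, not Cube26’s actual logic):

```python
# Hypothetical sketch of digital-signage menu selection based on the
# detected viewer. The age bands and items are illustrative only.

def pick_menu(age, season):
    """Choose a menu section for the person currently looking at the sign."""
    if age < 13 and season == "summer":
        return "ice creams"        # a child looking in summer
    if 35 <= age < 55:
        return "light beers"       # a middle-aged viewer
    return "full menu"             # default when no rule applies

print(pick_menu(8, "summer"))   # → ice creams
print(pick_menu(45, "summer"))  # → light beers
print(pick_menu(25, "winter"))  # → full menu
```

In a real deployment the age and attention signals would stream from the camera-facing engine, and the rules would live in the retailer’s configuration rather than in code.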
