Miscellaneous helper robots have been whirring, rolling, and bumping around for years. They generally take the form of something adorable, with big doe eyes and a chunky white plastic body that is friendly and completely non-threatening. Or they’ll be the opposite, like Boston Dynamics’ agile robots that move in an uncanny valley-like way but lack things like faces. A robot called Misty falls squarely in the former category; even the name is cute. Misty Robotics designed it to be extensible and flexible, with technologies like computer vision on board the robot itself, and ripe for iteration by enterprising developers. At Microsoft Build 2019, we got an extended demonstration of Misty and a long chat with the company behind it.
A man is standing in a room
Ben Edwards, Misty Robotics’ head of developer engagement, plopped Misty on a coffee table in front of me. At a couple of feet tall or so, Misty needs the extra height. To demonstrate Microsoft’s computer vision-enabled scene description, the robot rolled over to me and instructed me to say “cheese,” which I did. Misty blinked its robot doe eyes, and I heard the digitized sound of a camera shutter. Click. “A man is standing in front of a mirror,” said Misty.
That wasn’t quite right. I wasn’t standing, and there was no mirror in front of me or behind me; maybe the glass-covered painting over my shoulder was especially reflective and caused the confusion. “Some of that’s a little bit on Microsoft, and some of it’s on us,” Edwards chuckled. “We’re sending kind of a small image right now, but she generally gets that it’s a person [who is] doing something, or what they’re holding.”
We tried it again. Click. “A man is standing in a room,” said Misty. Again, technically not quite true, but getting closer.
“‘A man is standing in a room’ is kind of her go-to with Microsoft’s [scene description],” said Edwards. The Misty Robotics crew has also hooked it up to other cognitive services from Amazon and IBM. The results vary, Edwards said, but the notable piece is that Misty can connect to any similar third-party service.
But Misty leans heavily on Microsoft technologies. Its operating system is Windows IoT Core (it uses Android 8 Oreo for navigation and computer vision), and its skills are written in C#. “We’re moving towards .NET Core, which is not far off from C# anyways. But we think that’s going to give us true cross-platform capabilities on Linux, Windows, and macOS. People can just use it and get into it,” said Edwards. They’ve also integrated with Azure Cognitive Services and its text-to-speech engine, and there’s a Misty SDK extension for Visual Studio.
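To give a sense of how little code sits behind a demo like the scene description above, here is a minimal C# sketch (not Misty’s own skill code) that posts a captured frame to Azure Cognitive Services’ Computer Vision “describe” REST endpoint and prints the returned caption. The region, subscription key, and image path are placeholders.

```csharp
// Minimal sketch: send a captured frame to the Computer Vision "describe"
// endpoint and read back a one-line caption, roughly the flow behind
// Misty's "a man is standing in a room" response.
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class SceneDescriptionDemo
{
    static async Task Main()
    {
        // Placeholder region and key -- substitute your own resource values.
        const string endpoint = "https://westus.api.cognitive.microsoft.com/vision/v2.0/describe";
        const string subscriptionKey = "<your-computer-vision-key>";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", subscriptionKey);

        // Load the image the robot just captured (path is illustrative).
        byte[] image = await File.ReadAllBytesAsync("snapshot.jpg");
        using var content = new ByteArrayContent(image);
        content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

        HttpResponseMessage response = await client.PostAsync(endpoint, content);
        response.EnsureSuccessStatusCode();

        // The JSON response includes description.captions[0].text,
        // e.g. "a man standing in a room"; printed raw here for brevity.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```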
Parts and pieces
Misty is laden with sensors, cameras, and miscellaneous computer vision technologies, with Qualcomm Snapdragon 820 and 410 chips and the Qualcomm Snapdragon Neural Processing Engine. The other hardware bits are designed to be extremely extensible.
Edwards and an assistant showed me more of Misty’s abilities, including how it listens when you call it and turns to face whoever’s speaking, thanks to its three Qualcomm Fluence Pro-powered far-field mics. When it looks at you, you can see a line of sensors across its forehead. The sensors are actually all one unit, made by Occipital, that includes a 166° wide-angle camera, a 4K camera, and IR depth sensors; it employs SLAM sensor fusion and Occipital’s own Bridge Engine for spatial computing.
One of the advantages of this sensor bar is that it helps Misty navigate back to its charging station, even in the dark. In a pinch, though, Misty could turn on the flashlight that’s mounted near its right “ear.” The head has capacitive touch sensors for additional controls, and there’s a Nebula-style panel that you can swap out for a different type of camera, a laser pointer, or another tool you can think up. The (patent-pending) three degrees of freedom (3DoF) neck lets Misty emote a little, cocking its head to the side like a curious puppy.
There are speakers in the robot’s chest. All four corners of Misty’s base have time-of-flight bump sensors so it doesn’t run into obstacles or fall off of things like coffee tables. The arms don’t do anything, although again, they’re meant to be extensible, so one could dream up useful ideas for specially designed appendages. The most useful arm on this particular Misty was a cup holder.
There’s a backpack module that’s primed for the likes of a Raspberry Pi or Arduino that can add bespoke functionality to Misty. “We did a demo at CES where we had a temperature sensor on the back,” Edwards said. If Misty were somewhere that got too hot, it could be programmed to send an alert to a dashboard. “Imagine — you could get readings all around in a warehouse,” he said.
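As a rough idea of what a skill like that CES demo might look like, here is a minimal C# sketch (not Misty Robotics’ actual code) that reads temperature values arriving from a backpack sensor over a serial connection and posts an alert to a hypothetical dashboard endpoint when a threshold is crossed. The port name, baud rate, threshold, and URL are all assumptions.

```csharp
// Minimal sketch of the temperature-alert idea: poll a serial line fed by a
// backpack sensor and notify a dashboard when readings get too hot.
using System;
using System.IO.Ports;           // System.IO.Ports NuGet package on .NET Core
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class BackpackTemperatureAlert
{
    const double MaxCelsius = 40.0;                           // assumed threshold
    const string DashboardUrl = "https://example.com/alerts"; // placeholder endpoint

    static async Task Main()
    {
        using var http = new HttpClient();
        using var port = new SerialPort("COM3", 9600);        // assumed wiring
        port.Open();

        while (true)
        {
            // The backpack firmware is assumed to print one reading per line, e.g. "23.5".
            string line = port.ReadLine();
            if (double.TryParse(line, out double celsius) && celsius > MaxCelsius)
            {
                var payload = new StringContent(
                    $"{{\"sensor\":\"misty-backpack\",\"celsius\":{celsius}}}",
                    Encoding.UTF8, "application/json");
                await http.PostAsync(DashboardUrl, payload);  // push the alert to the dashboard
            }
        }
    }
}
```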
A robot’s purpose
Misty has a lot of neat gadgetry on board, but the usefulness of such a robot is not immediately clear. It’s not exactly agile, and it’s small. No one would mistake it for a robot that’s designed to stack warehouse boxes, or tow vehicles, for example.
But, Edwards said, “It exists as the physical manifestation of the platform we’re creating for developers. We wanted to introduce robots to people who didn’t typically think that they were going to be into robots.” The idea is that Misty makes it easy for someone who can, say, write JavaScript to program a robot in a sophisticated way without having to deal with learning about and building hardware. Ideally, that means developers can come up with actual uses much faster.
The company is beginning to see some of those use cases take shape for Misty, Edwards said. In health care, elder care especially, they’re finding that Misty can be helpful for people who are aging in place. In addition to simple tasks like reminding someone to take their medication, there’s a certain level of companionship one can glean, even from a friendly robot that interacts with you in basic ways, he said. More importantly, remote family members can interact with a local family member to an extent through a robot. Maybe you get a notification on a dashboard on your phone when your loved one takes their medicine, for instance. Edwards said a number of companies are working on applications in that area.
Misty can also be useful for building management. (This is an area in which Microsoft has expressed interest, Edwards said.) You can deploy a robot like Misty around a shop or warehouse to handle status reporting and similar tasks that are time-consuming, and thereby costly, for humans to perform.
It would seem that a system of IoT sensors could work just as well as, and arguably more gracefully than, a little robot buzzing around on treads. Edwards noted that there can be gaps in those systems, though. Further, the robot may ease some people’s concerns about feeling overly surveilled, as in a room that’s constantly monitored by cameras. “But a robot could roam around and only be present a certain amount of time and then take a picture [for reporting].”
There are potential security applications, too, like performing simple anomaly detection, or even looking to see if a given person is supposed to be in a given place at a given time. Edwards and a partner demonstrated this particular use case for me. Misty looked at me and hollered, “Intruder! Intruder! You better leave now. I have already recorded you.” Apparently, one Misty Robotics engineer has already used Misty’s anomaly detection to thwart coworkers who were surreptitiously borrowing his tools.
Edwards mentioned that a sound technician had the idea to use Misty to roll around and help him check sound levels in music venues, which could save him maybe half an hour of setup time. “We’re not ever going to think of those scenarios,” Edwards said. “But if we get this in someone’s hands, maybe they think up a killer app for some percentage of the population.”
Misty is currently rather pricey at $2,400. It’s a tough sell without a killer application, so to speak, and Edwards knows this. “We think that robots need to have more really valuable use cases in homes and offices before people are going to be like, ‘Oh yeah, I’ll buy a $1,000 [or] $2,000 robot [for] my house,’” he said. But the idea is that those will come once developers start dreaming up ways to use Misty.
Seth Colaner, Khareem Sudlow