Stacey on IoT | Internet of Things news and analysis

April 19, 2022 by Stacey Higginbotham

Thanks to the Amazon Astro reviews hitting the press, a recent conversation with a co-founder of iRobot, and funding rounds for startups building construction and medical robots, I’ve been thinking a lot about why we are so obsessed with robots in the smart home.

I’ve been thinking about how we are obsessed with a Jetsons-era idea of a roving robot that handles basic chores and perhaps provides security, and how we think this robot should be able to perform dozens, if not hundreds, of functions. What may come as a surprise to many is that such a robot is hiding in plain sight. But for it to work, we need good standards.

Last week, reviews of Amazon’s Astro robot, which people can apply to purchase, hit the Wall Street Journal and CNET. The robot is basically a box on wheels complete with cameras, a screen that acts as an Amazon Echo Show device, and additional periscoping cameras that give the robot a view of countertops, locks, and other things that it can’t see at the level of its primary camera.

The Astro is cute. It burbles and bops. People can place a bottle of soda or a bowl of popcorn in it and send it over to a named member of the household because it can navigate the rooms of a home and identify people who live in that home. It can also follow people around and act as a large roaming Echo device.

All of that being said, I think if you taped an Echo Show to a Roomba you might get a more useful device. After all, Roombas can at least vacuum. So what, exactly, is Amazon doing with Astro? I think it has a lot to do with context and mobility.

Amazon already has many of the tools it needs to have a perfect home robot thanks to Alexa and the coming Matter protocol. Alexa’s capabilities as both a personal digital assistant and control mechanism for an entire home full of gadgets continue to improve. And the upcoming Matter standard will enable more devices to act as extensions of Alexa’s commands by making it fairly easy to execute basic functions, such as checking to see if a door is locked or turning on a light.
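
To make that concrete, here is a minimal sketch of how a hub might translate simple spoken requests into standardized device interactions once everything speaks a common protocol. The FakeMatterController class, the cluster names, and the handle_intent helper are hypothetical placeholders for illustration; this is not a real Matter or Alexa API.

```python
# A minimal sketch (not a real Matter or Alexa SDK): the controller, cluster names,
# and method signatures below are hypothetical placeholders used for illustration.

class FakeMatterController:
    """Stand-in for whatever controller library a hub such as an Echo would use."""

    def __init__(self):
        # Pretend device state, keyed by (device_name, cluster, attribute).
        self.state = {
            ("front door", "DoorLock", "LockState"): "Unlocked",
            ("hall light", "OnOff", "OnOff"): False,
        }

    def read_attribute(self, device, cluster, attribute):
        return self.state[(device, cluster, attribute)]

    def send_command(self, device, cluster, command):
        if cluster == "OnOff":
            self.state[(device, cluster, "OnOff")] = (command == "On")
        elif cluster == "DoorLock":
            self.state[(device, cluster, "LockState")] = "Locked" if command == "Lock" else "Unlocked"


def handle_intent(controller, intent):
    """Map a couple of simple spoken intents onto standardized device interactions."""
    if intent == "is the front door locked":
        state = controller.read_attribute("front door", "DoorLock", "LockState")
        return f"The front door is {state.lower()}."
    if intent == "turn on the hall light":
        controller.send_command("hall light", "OnOff", "On")
        return "Okay, the hall light is on."
    return "Sorry, I can't do that yet."


if __name__ == "__main__":
    controller = FakeMatterController()
    print(handle_intent(controller, "is the front door locked"))
    print(handle_intent(controller, "turn on the hall light"))
```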

What Alexa doesn’t have is the ability to map out a home, see who is in the home, and understand what has changed in the home based on the location of people and things in a room. Astro can provide that.

Such information is essential for creating a home robot that performs many of the functions we envision a robot performing à la Rosie from the Jetsons — but without requiring us to build a humanoid robot that can tackle stairs and manipulate many different types of objects, and that contains a computer vision system that can also handle context and natural language processing. All of these are giant computer science problems that aren’t completely solved, and they’re certainly not solved at a price point that consumers will currently accept.

As Helen Greiner, co-founder of Roomba maker iRobot and CEO of Tertill, maker of the Tertill weeding robot, explained to me, building a robot is about compromises and about finding clever cheats to solve problems. In building Tertill, one of her team’s cheats was to use sensors to attack everything under four inches growing in a defined garden bed, rather than using computer vision to identify individual plants and then destroy them.

The result was a cheaper robot that people will likely try, a strategy similar to the one Greiner and a few fellow iRobot founders deployed for the Roomba. Robots are hard — even highly specific ones, such as the Roomba or a dedicated pizza-making robot. And in professional use cases, the environment is designed for the robots, something I’m not sure we’re willing to accept for our homes.

Professional robots are also designed to do only a few select things. Those Boston Dynamics nightmares we see running along the sidewalks are both pricier than consumers are willing to pay and engineered to keep their footing in only a limited set of environments. New skills, such as turning a doorknob or running, take weeks of specialized programming and sometimes even new hardware.

I don’t say this because I think home robots are an impossible goal. I say it to make clear that a home robot will likely look different than what we’re currently imagining. A home has so many functions and is such a highly individual environment that a general-purpose humanoid robot doesn’t make sense.

But if a robot is simply a machine designed to handle a task the same way every time, then telling Alexa to lock my door and having that command relayed to my smart lock is basically me instructing a robot. As are, essentially, the automations that check to make sure my lights are off, doors are locked, and cameras are turned on before I leave the house.
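
As a toy illustration of that point, here is a minimal sketch of the kind of “leaving home” check described above, assuming hypothetical device dictionaries and a send_command helper rather than any particular smart home platform’s real API.

```python
# A hypothetical "leaving home" routine of the kind described above; the device
# dictionaries and the send_command helper are illustrative, not a real platform API.

def leaving_home_check(devices, send_command):
    """Walk the device list and fix anything that isn't in its 'away' state."""
    report = []
    for device in devices:
        if device["type"] == "light" and device["on"]:
            send_command(device["id"], "turn_off")
            report.append(f"Turned off {device['name']}")
        elif device["type"] == "lock" and not device["locked"]:
            send_command(device["id"], "lock")
            report.append(f"Locked {device['name']}")
        elif device["type"] == "camera" and not device["armed"]:
            send_command(device["id"], "arm")
            report.append(f"Armed {device['name']}")
    return report


if __name__ == "__main__":
    demo_devices = [
        {"id": 1, "type": "light", "name": "kitchen light", "on": True},
        {"id": 2, "type": "lock", "name": "front door", "locked": False},
        {"id": 3, "type": "camera", "name": "porch camera", "armed": True},
    ]
    for line in leaving_home_check(demo_devices, lambda _id, cmd: None):
        print(line)
```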

Alexa handles these tasks in many homes. And if Alexa can gather more context about the home thanks to a roving device with a camera along with the ability to suss out where people are and the state of the home at any given time, then Alexa can become the brains of a smart home robot.

Amazon could buy iRobot or another company whose roaming robots already gather mapping data to get this context, or it could deploy Astro. I wonder if Amazon is simply looking at the Astro mapping results to see whether it needs to send a mapping robot around the home to deliver a quality home robot experience.

When it comes to the cameras and image recognition needed to figure out who is in the home, Amazon could get similar information by tracking phones or wearables on people who live in the home. But a roving camera can provide even more information. Plus, people have shown that they are willing to pay for security. So giving Astro a job patrolling the home makes sense. And it will yield more data.

I don’t think the masses will look to Astro as the definitive home robot. I think Amazon is likely using it to understand how best to make the whole home into a robot managed by Alexa. And I think that’s way smarter than spending $1,000 for a squat Echo Show on wheels.


Hmmmm…I know that as someone who is quadriparetic I have a different view than most people, but there are two different definitions of “robot” in common use, and which one you use makes a big difference in this conversation.

The original, Jetsons-type definition:

“a machine resembling a human being and able to replicate certain human movements and functions automatically.” (Merriam-Webster)

The newer one, which fits what you’ve just described:

“any automatically operated machine that replaces human effort, though it may not resemble human beings in appearance or perform functions in a humanlike manner” (Britannica)

So telling Alexa to lock your door and having the door lock fits the second definition, but not the first. So far, so good. When you can design the physical pieces of a system so that they can be operated by command, you can use the second definition.

However (and for people like me this is a HUGE issue), if you want to be able to have the system interact with the existing physical world, you need the first definition. You need a robot with hands and fingers that work like a human’s. Amazon’s new robot doesn’t have these, so it’s pretty useless for someone like me.

The obvious use case is the “bring me a beer from the fridge, open the bottle, and pour it into a cup I can drink from.” (Or water bottle, if you prefer.) My human helpers can do this. My service dog can bring the bottle, but not do the rest of it. The Amazon robot can’t do it at all.

But there are lots of other situations that have similar requirements. Fold the laundry. Load the dishwasher. Open the mail. Put away the groceries. Make the bed. Tighten a screw. Change a lightbulb.

I recently spent 3 days waiting for someone to change the batteries in a labelmaker. Again, not something my service dog can do.

So there’s this whole category of interaction with the physical world stuff that only a first definition robot could do. All the second definition stuff is great, but it’s just not enough.

(BTW, one task at a time you may be able to solve some of these without a humanoid robot. There’s an inflatable quilt that folds and unfolds itself to automate bedmaking—but doesn’t work for me because my dog’s claws would puncture it. There’s an awesome, if tiny, trash can from Townew which puts in its own trash bag when it’s empty and seals the bag when it’s full. I have one of these and it’s great. But there are so many things around the house that a general purpose robot with hands could do for me every day! I know it’s a long way off, but I’m still hoping.)
