Alexa’s ambient AI can alert you to events at home



Alexa, Amazon’s ubiquitous voice assistant, is getting an upgrade. The company announced some of these changes at a virtual event today. One of the most interesting developments is that users can now teach their Alexa-enabled devices to listen for specific sound events in their home.

Amazon also revealed new features and products, including additional accessories for Ring and Halo devices, and access to invite-only devices like the Always Home Cam and a cute robot called Astro.

Alexa’s latest features are part of the Amazon team’s work on ambient computing, a general term for an underlying artificial intelligence system that pops up when you need it and takes a back seat when you don’t. This is made possible by the connected network of Amazon devices and services that interact with each other and with users.

“Perhaps the most vivid example of our ambient AI is Alexa,” Rohit Prasad, senior vice president and chief scientist for Alexa at Amazon, told Popular Science. “Because it’s not just a spoken-language service where you make a lot of inquiries. As ambient intelligence, available on many different devices in your environment, it understands the state of your environment and may even act on your behalf.”

Alexa already has the ability to detect what Prasad calls “global” sound events. These are ambient noises like glass breaking or a smoke alarm going off, and detecting them can make your home safer while you’re away, he says. If something goes wrong, Alexa can send you a notification. It can also pick up more harmless noises, like your dog barking or your partner snoring.

[Related: How Amazon’s radar-based sleep tracking could work]

Now, Prasad and his team are taking this pre-trained global sound-event model, built from thousands of real-world sound samples, and giving users the ability to create alerts for their own custom sound events by manually adding five to 10 examples of a specific sound they want Alexa to listen for at home. “All the other data we’ve gathered can be used to create the custom sound events with fewer samples,” he says.
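The general recipe the article describes, a model pre-trained on many sounds that adapts to a new sound from a handful of examples, can be sketched with a nearest-centroid classifier over audio embeddings. This is purely illustrative, not Amazon’s implementation: `embed_clip` here is a crude spectral stand-in for whatever pre-trained audio encoder the real system uses, and the class names and threshold are invented.

```python
import numpy as np

def embed_clip(clip: np.ndarray) -> np.ndarray:
    # Stand-in for a pre-trained audio encoder: a normalized
    # magnitude spectrum of the clip.
    spectrum = np.abs(np.fft.rfft(clip))
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)

class CustomSoundEvent:
    """Few-shot sound event: 5-10 user examples define a centroid in
    embedding space; new audio matches if it lands close enough."""

    def __init__(self, name, example_clips, threshold=0.8):
        self.name = name
        embeddings = np.stack([embed_clip(c) for c in example_clips])
        self.centroid = embeddings.mean(axis=0)
        self.threshold = threshold

    def matches(self, clip) -> bool:
        e = embed_clip(clip)
        sim = float(np.dot(e, self.centroid) /
                    (np.linalg.norm(e) * np.linalg.norm(self.centroid) + 1e-9))
        return sim >= self.threshold
```

A user might register a few recordings of a whistling kettle; later clips are flagged only when their embedding is similar to those examples. Few samples suffice in this framing because the heavy lifting (the encoder) was learned beforehand from the “global” sound data.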

This could mean getting an alert when the kids leave the refrigerator door open for more than 10 minutes after school. “The two refrigerators in my house both make different noises when one of our children leaves the door open,” says Prasad. That way, Alexa could send him a notification if someone didn’t close the fridge door properly, even when he isn’t home.

You could set an alert for a whistling kettle, a running washing machine, or a ringing doorbell, or have an oven timer announced upstairs. “And if you have an elderly person at home who doesn’t hear well and is watching TV on a Fire TV, you can send a message to the TV that someone is at the door and the doorbell rang,” says Prasad.

Ring can tell you when something looks out of place

In addition to custom sound events, Alexa can also notify you of certain visual events through Ring cameras on some Ring devices. “We found that Ring cameras, especially for outdoor use, [are] great for looking at the binary states of objects of interest in your home,” says Prasad. For example, if you have a Ring camera pointed at an outdoor shed, you can teach it to check whether the door has been left open by providing a few pictures of the open and closed states, and have it send an alert when the door is open.

[Related: Amazon’s home security drone may actually be less creepy than a regular camera]


“You are now combining computer vision and few-shot learning techniques,” says Prasad. As with the audio component of the ambient AI, the team collected a large set of publicly available photos of garage and shed doors to aid pre-training. “But my shed door may look different than the one you might have, and then adaptation is still required, but now this can be done with very few samples.”
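The visual version of the same idea can be sketched as a two-class nearest-centroid decision over image embeddings: a few user photos of each state adapt a pre-trained model to one specific door. Again, this is an illustrative sketch, not Ring’s code; `embed_image` is a placeholder for the pre-trained vision encoder, and the function names are invented.

```python
import numpy as np

def embed_image(image: np.ndarray) -> np.ndarray:
    # Placeholder encoder: flatten and L2-normalize pixel intensities.
    # A real system would use features from a model pre-trained on
    # many photos of doors in open/closed states.
    v = image.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def fit_door_classifier(open_photos, closed_photos):
    """Build a binary-state classifier from a few example photos of
    each state, returning a function that labels new camera frames."""
    open_c = np.mean([embed_image(p) for p in open_photos], axis=0)
    closed_c = np.mean([embed_image(p) for p in closed_photos], axis=0)

    def classify(frame) -> str:
        e = embed_image(frame)
        d_open = np.linalg.norm(e - open_c)
        d_closed = np.linalg.norm(e - closed_c)
        return "open" if d_open < d_closed else "closed"

    return classify
```

An alerting layer would then simply notify the user whenever a frame classifies as “open” for longer than some grace period.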

Alexa will soon be able to learn your preferences

Last year, Amazon updated Alexa so that if it doesn’t recognize the concept in a customer request, it will come back to you and ask, “What do you mean by that?”

This could be a request like “put my thermostat in vacation mode,” where “vacation mode” is an unknown setting, and where your preferred vacation temperature might be 70 degrees rather than 60. This is where users can step in and customize Alexa using natural language.

“When you have these strange concepts, or unknown and ambiguous concepts, it usually requires input from human labelers [on the developer end] saying ‘vacation mode’ is a kind of setting for a smart device like a thermostat,” explains Prasad.

This type of data is difficult to collect without hands-on experience, and new terms and phrases pop up all the time. The more practical solution was for the team to build on Alexa’s ability to do generalized learning, or generalized AI. Rather than relying on supervised learning from human labelers on Amazon, Alexa can learn directly from end users, making it easier for them to adapt Alexa to their lives.

In a few months, users will be able to ask Alexa to learn their preferences by saying, “Alexa, learn my preferences.” You can then have a dialogue with Alexa covering three areas of preferences to start: food, sports teams, and weather providers like the Big Sky app.

If you say, “Alexa, I’m a vegetarian” as Alexa guides you through the dialogue, it will remember that the next time you search for restaurants nearby and prioritize ones with vegetarian options. And if you ask for dinner recipes, vegetarian options will be preferred over others.

For sports teams, if you’ve said you like the Boston Red Sox for baseball and the New England Patriots, and you then ask Alexa for sports highlights on the Echo Show, you’ll get highlights customized to your favorite teams. And if another family member likes other teams, those can be added to their preferences as well.

[Related: The ‘artificial intelligence’ in your new smart gadget may not be what you think]

“We already know that customers express these preferences many times a day in their regular interactions,” says Prasad. “Now we’re making it very easy for these preferences to work.” You can go through the preset prompts with Alexa to teach it your preferences, or teach them on the fly. For example, if you ask Alexa about restaurants and it recommends steakhouses, you can say, “Alexa, I’m a vegetarian,” and it will automatically learn that for future interactions.

“These three inventions, which make the complex simple, also illustrate more generalized learning capabilities, with more self-supervised learning, transfer learning, and few-shot learning, and also deep learning to enable this kind of interactive dialogue,” Prasad says. “This is the hallmark of generalized intelligence,” similar to how humans learn.

Alexa learns and grows

These three new features (custom sounds, custom visual events, and preferences) not only come together to improve the AI, but also improve Alexa’s self-learning, self-service, and awareness of your surroundings. “Alexa is just more aware of your surroundings to help you when you need it,” says Prasad. Along with features like Routines and Blueprints, these new additions allow Alexa to provide more customized responses without requiring any programming knowledge.

Alexa automatically learns how to improve as you use it more. In fact, Prasad says Alexa is now able to automatically correct more than 20 percent of its errors without human supervision. “If it did something wrong and you barge in and say, ‘No, Alexa, this is what I meant,’” it will remember that the next time you ask for something similar, he says.

In his case, when he asks Alexa to play the BBC, what he says is sometimes registered as “BPC.” “It’s just difficult for an AI. Occasionally it recognizes ‘Play BPC.’ But it recognizes usage patterns,” says Prasad. That way, it can automatically fix the request without asking, “Did you mean BBC?”
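One simple way to picture this kind of self-correction, offered here as an illustrative sketch rather than Amazon’s actual method, is a correction memory: each time the user barges in with a fix, the misheard phrase is mapped to the intended one, and the rewrite is applied automatically once the same correction has recurred. All names below are invented.

```python
class CorrectionMemory:
    """Remembers user corrections of misrecognized requests and
    rewrites future requests once a correction has recurred."""

    def __init__(self, min_count=2):
        self.counts = {}          # (heard_lowercase, meant) -> times seen
        self.min_count = min_count

    def record_correction(self, heard: str, meant: str):
        key = (heard.lower(), meant)
        self.counts[key] = self.counts.get(key, 0) + 1

    def rewrite(self, heard: str) -> str:
        # Only apply a fix after it has been confirmed min_count times,
        # so a one-off slip doesn't become a permanent rewrite.
        candidates = [(meant, n) for (h, meant), n in self.counts.items()
                      if h == heard.lower() and n >= self.min_count]
        if candidates:
            return max(candidates, key=lambda c: c[1])[0]
        return heard
```

The recurrence threshold reflects the “usage patterns” point in the quote: a single correction is weak evidence, while a repeated one is worth acting on without asking again.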

[Related: If you’re worried about Amazon Sidewalk, here’s how to opt out]

“This is the kind of automatic learning, based on context in both your personalized usage and cohort usage, that Alexa can use to be much smarter around the world and to assess errors and correct them automatically without human intervention,” says Prasad. “If you look at the old days of supervised learning, even with active learning, Alexa will say, ‘This is the part I’m having trouble with, let’s get some human feedback.’ Now this human input comes directly from the end user.”


