Amazon Astro is called a "privacy nightmare" and "terrible" — here's why

Amazon's Astro robot is designed to "give peace of mind" to its owners by patrolling their homes and alerting them to any abnormalities. While this sounds like a groundbreaking leap forward in home surveillance technology, it has also been described as a "privacy nightmare" and "terrifying" by people reportedly involved in the project.

Internal documents obtained by Vice outline how Amazon's Astro robots are designed to monitor activity in the home and report suspicious activity or unwelcome strangers to residents. After purchasing the $999 robot, the owner "registers" his or her face and voice, as well as the faces and voices of those who live in or regularly visit the house, to allow the robot to roam freely.

From there, Astro regularly patrols the owner's home and identifies anyone it encounters. If the person is registered in its memory, it leaves them be. If not, Astro follows the unrecognized person and begins recording them, just in case. The robot can also detect "unusual" sounds such as glass breaking or fire alarms.

Astro can be put into "away mode" for when the owner is not home. In this mode, Astro continues to patrol the house and the owner can remotely access a live stream of what Astro is seeing. For example, if they need to contact the house sitter, they can initiate a video call through the robot. This is definitely one way to gain peace of mind while on vacation or on a business trip.

On paper, the Astro robot sounds like a useful piece of surveillance technology, but former Amazon employees are less complimentary about its capabilities. One anonymous source who worked on the project told Vice, "Astro is terrible and will almost certainly throw itself down the stairs given the opportunity. It is unreliable at best at detecting people, and it makes home security proposals look like a laughingstock."

This source also stated that the Astro feels fragile for such an expensive device, and that Amazon's claim that it is an accessibility device is "at best, irrational nonsense and marketing, and at worst, potentially dangerous to those who actually rely on it for accessibility purposes."

It is not just the robot's ability to perform its primary function that former employees question. "The manner in which privacy is traded for convenience in devices like Astro is an indictment of our society and a privacy nightmare. In my personal opinion, this device is a disaster that is too early to release," the source added.

Astro's questionable facial recognition capabilities are rather worrisome given that the robot's primary function is to patrol homes and identify suspicious people or strangers. That is before touching on the understandable privacy concerns that arise from allowing Amazon access to cameras that constantly patrol one's home.

In defense of its latest product, Amazon published a blog post detailing how the visual ID feature works, and representatives from the retailer responded to Vice's request for comment.

Christy Schmidt, Amazon's senior PR manager, said, "In addition to consulting with several Amazon Scholars who specialize in computer vision, we also consulted with an outside expert on algorithmic bias, Ayanna Howard, dean of the College of Engineering at Ohio State University, and reviewed the steps taken to improve the fairness of this function."

Schmidt also provided comments from Dr. Howard, who explained that Amazon was extremely thorough in the design and testing of Astro's visual ID feature. She added that the company "has worked diligently to ensure that this feature not only works well statistically for all customers, but that it will continue to improve over time for those customers."

In this light, there is certainly reason to be excited about Astro. However, the concerns of those involved in the project may deter potential buyers.
