Embodied Shutdown Bricks Moxie AI Robots, Families Suffer

Embodied’s abrupt shutdown has rendered its $799 Moxie AI robots, marketed as emotional-support companions for autistic children, completely nonfunctional, with refunds denied to most owners. Families face emotional distress and financial loss, underscoring the vulnerability of proprietary, cloud-reliant AI devices whose operation depends on their maker's solvency.[AI generated]

Why's our monitor labelling this an incident or hazard?

The Moxie robot is explicitly described as relying on cloud-based AI for its conversational and interactive capabilities. The shutdown leaves the robot unable to function, causing emotional harm to children who had bonded with it. Emotional harm to individuals, especially children, from the loss of an AI companion qualifies as harm to persons. This event therefore constitutes an AI Incident because the AI system's cessation directly leads to harm (emotional distress) to users.[AI generated]
AI principles
Accountability; Robustness & digital security; Safety; Transparency & explainability; Human wellbeing

Industries
Consumer products; Consumer services; Healthcare, drugs, and biotechnology; Robots, sensors, and IT hardware; Education and training

Affected stakeholders
Consumers; Children

Harm types
Economic/Property; Psychological

Severity
AI incident

Business function
Monitoring and quality control; Citizen/customer service

AI system task
Interaction support/chatbots; Content generation; Recognition/object detection


Articles about this incident or hazard

Maker of AI robots for kids abruptly shutters

2024-12-10
Axios
Why's our monitor labelling this an incident or hazard?
The Moxie robot is explicitly described as relying on cloud-based AI for its conversational and interactive capabilities. The shutdown leaves the robot unable to function, causing emotional harm to children who had bonded with it. Emotional harm to individuals, especially children, from the loss of an AI companion qualifies as harm to persons. This event therefore constitutes an AI Incident because the AI system's cessation directly leads to harm (emotional distress) to users.

They bought an $800 AI robot for their kids. Now the company is shutting down -- and children are having to say goodbye.

2024-12-11
Business Insider
Why's our monitor labelling this an incident or hazard?
The AI system (Moxie robot) is explicitly mentioned and was in use, but the harm described is emotional distress from the robot's shutdown, not from malfunction or misuse of the AI system itself. The shutdown stems from business failure (lack of funding), not AI system failure or misuse. There is no direct or indirect harm caused by the AI system's operation or outputs. The event illustrates the consequences of the AI product lifecycle and of user attachment, which is relevant context but not an incident or hazard. Hence, it fits the definition of Complementary Information.

The death of a robot designed for autistic children proves Apple's on-device AI is the rig...

2024-12-10
AppleInsider Forums
Why's our monitor labelling this an incident or hazard?
The Moxie robot is an AI system that uses cloud-based large language models to provide social and educational support to autistic children. The company's decision to shut down the cloud services directly causes the robot to cease functioning, leading to emotional harm to children who considered the robot a friend and a lifeline. This constitutes harm to a group of people (autistic children) and their communities. Because the harm is realized and directly linked to the AI system's use and its operational dependency on cloud AI services, this event qualifies as an AI Incident.

'I love you... goodbye:' What will happen when this companion robot suddenly dies?

2024-12-10
Popular Science
Why's our monitor labelling this an incident or hazard?
The event involves an AI system (Moxie) that uses a large language model and other AI capabilities to interact with children. The shutdown of the AI service causes the robot to lose its core functionality, directly impacting users emotionally. The harm is realized and documented through users' reactions, including children experiencing distress and grief. This emotional harm to individuals and communities fits within the definition of harm caused by AI systems. Although the harm is non-physical, it is significant and clearly articulated, meeting the criteria for an AI Incident rather than a hazard or complementary information.

Children Forced To Bid Farewell To Beloved AI Robot Friend After Company Goes Bankrupt: 'Like A Sad Pixar Movie'

2024-12-10
International Business Times
Why's our monitor labelling this an incident or hazard?
The AI system (Moxie robot) is explicitly mentioned, and its shutdown is directly caused by the company's financial failure. While there is no physical injury or direct violation of rights, the event involves harm to users in the form of emotional distress and the loss of a companion AI system. This harm is emotional and social rather than a violation of fundamental rights or physical harm. The event does not describe a malfunction or misuse causing harm, but rather a cessation of service due to business failure. This does not meet the threshold for an AI Incident or AI Hazard, as no direct or indirect harm as defined (physical harm, rights violation, critical infrastructure disruption, or significant community harm) is occurring or plausibly expected. The event primarily provides context and updates on the AI system's status and the company's situation, fitting the definition of Complementary Information.

AI Company That Made Robots For Children Went Bust And Now The Robots Are Dying. Warning: sad. 😢

2024-12-09
Democratic Underground
Why's our monitor labelling this an incident or hazard?
The AI system involved is the Moxie robot, an AI-powered companion for children. The company's closure causes the robots to cease functioning, harming users (children) through emotional distress and the loss of a companion. This harm is directly linked to the AI system's discontinuation due to the company's shutdown. Therefore, this qualifies as an AI Incident because the AI system's use and discontinuation have directly led to harm (emotional harm to children and families).

AI Robot Maker Pulls The Plug Forcing Children To Deal With Loss And Death

2024-12-11
HotHardware
Why's our monitor labelling this an incident or hazard?
Moxie is explicitly described as using large language models and cloud-based AI services to interact with children, qualifying it as an AI system. The shutdown of these services due to financial failure causes the loss of the robot's functionality, directly affecting the children who relied on it for emotional and educational support. This loss can cause emotional harm and distress, especially for children with special needs who may have formed attachments to the robot. Therefore, the event involves the use and shutdown of an AI system leading to harm to persons, fitting the definition of an AI Incident.

A $1,200 AI-powered robot for autistic kids is going to die because its maker is going under

2024-12-11
Crikey
Why's our monitor labelling this an incident or hazard?
The Moxie robot is an AI system designed to interact socially with children, especially those with autism, to support emotional regulation and communication. The company's closure and the consequent shutdown of cloud AI services directly lead to harm by abruptly removing a trusted AI companion, causing emotional distress to children and their families. This constitutes harm to health and well-being (a form of injury or harm to persons) and harm to communities (emotional and social harm). Therefore, this qualifies as an AI Incident due to the realized harm caused by the AI system's discontinuation and malfunction resulting from the company's financial failure.

They bought an $800 AI robot for their kids. Now the company is shutting down -- and children are having to say goodbye.

2024-12-11
Business Insider Nederland
Why's our monitor labelling this an incident or hazard?
The Moxie robot is an AI system designed to interact with children and support social skill development. The company's decision to shut down the service and servers leads to the AI system ceasing to function, which indirectly causes emotional distress to children attached to the robot. However, this emotional distress does not constitute injury or harm to health in a legal or medical sense, nor does it involve violations of rights, disruption of critical infrastructure, or harm to property or communities as defined. Therefore, this event does not meet the criteria for an AI Incident or AI Hazard. It is best classified as Complementary Information because it provides context about the lifecycle and impact of an AI system on users but does not describe a harm or plausible future harm as defined.

Families face heartbreak as Moxie robots stop working after company shutdown

2024-12-11
Greenbot
Why's our monitor labelling this an incident or hazard?
The Moxie robot is an AI system designed to support children's emotional and social development. The company's shutdown and the resulting loss of AI service have directly caused emotional harm to children, as evidenced by parental reports and expert commentary on the negative emotional impact. This constitutes injury or harm to the health of a group of people (children with autism and others), fitting the definition of an AI Incident. The harm is realized and directly linked to the AI system's use and its sudden discontinuation.

Discontinuation of $800 Emotional Support Robot for Children to Render Devices Inoperable, Without Refund Options

2024-12-12
TechBizWeb
Why's our monitor labelling this an incident or hazard?
The Moxie robot is an AI system providing emotional support and interactive functions. The event involves the company's closure leading to the AI system becoming inoperable, causing financial and emotional harm to users. This fits the definition of an AI Incident because the AI system's use (reliance on ongoing software support) has indirectly led to harm (financial loss, emotional distress). The harm is clearly articulated and pivotal to the event. Although the harm is not physical injury, it includes emotional harm to children and financial harm to consumers, which are recognized harms under the framework. The event also discusses broader governance and consumer protection issues, but the primary focus is the realized harm from the AI system's discontinuation, making it an AI Incident rather than Complementary Information or AI Hazard.

AI Company That Made Robots For Kids Goes Under, Robots Die

2024-12-09
aftermath.site
Why's our monitor labelling this an incident or hazard?
The event involves an AI system (Moxie) that uses large language models to provide social interaction for autistic children. The company's closure causes the AI system to stop functioning, directly harming users by depriving them of the service and causing emotional distress. This fits the definition of an AI Incident because the AI system's use and sudden malfunction (due to company shutdown) have directly led to harm to a group of people (children and families). The harm includes emotional and social harm, which is a form of harm to people. The financial loss and lack of support further compound the harm. Therefore, this event qualifies as an AI Incident.

AI Company That Made Robots For Children Went Bust And Now The Robots Are Dying

2024-12-11
Windows Copilot News
Why's our monitor labelling this an incident or hazard?
The AI system (Moxie, an AI-powered social robot) is explicitly mentioned and was used to assist autistic children, a vulnerable group. The company's shutdown and the robots' resulting loss of function directly lead to harm to these users by depriving them of a supportive tool, which can be considered harm to groups of people. Therefore, this qualifies as an AI Incident due to the direct impact on users' well-being resulting from the AI system's unavailability.