Life360 Sells Sensitive User Location Data, Raising Privacy Concerns

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Life360, a widely used family safety app, has reportedly been selling the precise location data of millions of users, including children, to multiple data brokers. This data collection and sale has led to significant privacy violations, exposing users to potential misuse of their sensitive information. The concern is heightened by Life360's acquisition of Tile.[AI generated]

Why's our monitor labelling this an incident or hazard?

Life360 is an AI-enabled system that collects and processes location data to provide safety features. The reported sale of raw, precise location data without adequate privacy protections, especially involving children, constitutes a breach of privacy rights and potentially other legal obligations protecting personal data. The involvement of AI in processing and analyzing location data is implicit in the app's functionalities. The direct consequence is a violation of human rights (privacy), fulfilling the criteria for an AI Incident. The harm is realized, not just potential, as the data is actively sold and can be linked back to individuals, including children, which is a serious rights violation.[AI generated]
AI principles
Privacy & data governance, Respect of human rights, Transparency & explainability, Accountability

Industries
Consumer services

Affected stakeholders
Consumers, Children

Harm types
Human or fundamental rights

Severity
AI incident


Articles about this incident or hazard

Popular safety app Life360 is reportedly selling the precise location data of millions of kids and their families

2021-12-07
Business Insider
Life360 app is selling data from millions of families, report says

2021-12-07
CNET
Why's our monitor labelling this an incident or hazard?
Life360's app uses AI-based location tracking systems to collect and process user location data. The failure to sufficiently anonymize this data leads to direct privacy harms and potential violations of fundamental rights, especially since the data includes children. The AI system's use and data handling practices have directly led to these harms, fulfilling the criteria for an AI Incident under violations of human rights and legal obligations. The involvement of AI in location tracking and data processing is explicit and central to the harm described.
Kid-tracking app that parents love sells precise location data

2021-12-07
Mashable
Why's our monitor labelling this an incident or hazard?
Life360 is an AI system that collects and processes real-time location data to provide tracking services. The article reveals that the system's use has directly led to harms, including privacy violations and potential misuse of sensitive location data by third parties. The sale of precise location data without adequate anonymization or user awareness constitutes a violation of rights and harm to individuals and communities. The involvement of the AI system in collecting, processing, and distributing this data is central to these harms. Hence, this event meets the criteria for an AI Incident as the AI system's use has directly led to significant harm.
Life360 Reportedly Sells Location Data of Families and Kids

2021-12-07
Gizmodo
Why's our monitor labelling this an incident or hazard?
Life360 uses AI systems to collect and process location data from millions of users, including children. The company's sale of this data to third-party brokers, with insufficient privacy protections and potential for re-identification, directly leads to violations of users' privacy rights and fundamental rights. The harm is realized as users' sensitive location data is exposed and potentially misused. The AI system's role in gathering and managing this data is pivotal to the incident. Hence, this event qualifies as an AI Incident under the framework.
Tile's Owner 'Life360' Is Selling User Location Data

2021-12-07
Fossbytes
Why's our monitor labelling this an incident or hazard?
Life360 qualifies as an AI system because its software continuously tracks and infers user locations. The app's development and use have directly harmed users by violating their privacy and potentially breaching legal protections for personal data. The sale of precise location data without sufficient anonymization or aggregation increases the risk of re-identification, which is a violation of fundamental rights. Therefore, this event qualifies as an AI Incident due to realized harm linked to the AI system's use and data handling practices.
Does Tile's new owner, Life360, sell users' location data?

2021-12-07
Pocket-lint
Why's our monitor labelling this an incident or hazard?
Life360 uses AI-based location tracking systems to collect and process user location data. The reported sale of this data to third parties without clear user consent directly implicates violations of privacy rights, a form of human rights violation. The harm is realized as users' sensitive location data is commodified and shared, potentially exposing them to risks. Although the CEO claims that identifiable data is not sold and that Tile data is not sold, the core issue remains the sale of location data from Life360's platform, which is an AI system. Hence, this event meets the criteria for an AI Incident due to direct harm to users' rights through the use of AI systems.
Family safety app Life360 is selling location data on millions of users

2021-12-07
The Next Web
Why's our monitor labelling this an incident or hazard?
The article explicitly details the use of an AI system (Life360 app and its data processing infrastructure) that collects and sells location data, leading to direct harm in the form of privacy violations and potential misuse of sensitive personal information. The harms are realized, not merely potential, as the data has been sold to numerous brokers and used in ways that could harm individuals and families. The AI system's role is pivotal in enabling the large-scale collection, processing, and distribution of location data. The event meets the criteria for an AI Incident because it involves the use of AI systems leading directly to violations of human rights (privacy) and harm to individuals and communities. It is not merely a hazard or complementary information, as the harm is ongoing and documented.
Tile buyer Life360 reportedly sells precise user location data to nearly anyone

2021-12-06
AppleInsider Forums
Why's our monitor labelling this an incident or hazard?
Life360's system uses AI-based location tracking to collect and process precise user location data, which is then sold to third parties. The sale and sharing of this data, especially involving children, without adequate privacy measures, directly leads to violations of privacy rights and potentially legal breaches. The involvement of AI in processing and inferring location data and the resulting harm to user privacy meet the criteria for an AI Incident. The harm is realized, not just potential, as the data has been sold and shared, impacting user privacy and rights.
Tile's future owner is currently selling the location data of millions of its users

2021-12-07
MobileSyrup
Why's our monitor labelling this an incident or hazard?
Life360's platform uses AI or algorithmic systems to track and process location data of users, which is then sold to third parties. This activity directly leads to violations of privacy rights and potentially breaches legal protections, constituting harm to individuals' rights. The involvement of AI in processing and distributing this data is clear, and the harm is realized as the data sale is ongoing. Hence, this event meets the criteria for an AI Incident under violations of human rights or breach of applicable law.
Life360, the Company Buying Tile, Is Purportedly Selling the Location Data of Millions of Families and Kids

2021-12-07
Gizmodo AU
Why's our monitor labelling this an incident or hazard?
Life360 uses AI-based location tracking systems to collect and process user location data. The company's practice of selling this data to third-party brokers, who may further distribute it without adequate privacy protections, directly leads to violations of users' privacy rights and potentially breaches legal obligations. The harm is realized as the data of millions, including children, is exposed and could be re-identified, constituting a violation of human rights and privacy. The involvement of AI in the data collection and processing, combined with the direct harm caused by the data selling practices, classifies this event as an AI Incident.
Tile's New Owner Selling Precise Location Data on Millions of Kids

2021-12-07
iDrop News
Why's our monitor labelling this an incident or hazard?
The article details how Life360, an app that uses AI-enabled location tracking systems, sells precise location data of millions of users, including children, to data brokers who then distribute it widely. This practice leads to direct harm in the form of privacy violations and breaches of fundamental rights. The AI system's role in collecting, processing, and distributing this data is pivotal to the harm. The lack of transparency, oversight, and effective anonymization exacerbates the risk and actual harm. Hence, the event meets the criteria for an AI Incident due to realized harm linked to AI system use.
You Need To Uninstall Life360 Right Away! (Alexander Maxham/AndroidHeadlines.com)

2021-12-07
Tech Investor News
Why's our monitor labelling this an incident or hazard?
Life360 is an AI system that collects and processes location data from millions of users. The selling of this data constitutes a violation of user privacy, a human right, and thus a breach of obligations under applicable law. The article indicates that this data selling is happening, not just a potential risk, so harm is realized. The acquisition of Tile increases the potential scale of this harm but does not negate the existing harm. Hence, this is classified as an AI Incident due to direct involvement of AI systems in causing harm through privacy violations.
Life360 sells millions of users' location data

2021-12-08
Android Community
Why's our monitor labelling this an incident or hazard?
The event involves an AI system because location data processing and targeted advertising generally rely on AI technologies. The event concerns the use of AI-derived data and its sale without adequate safeguards, which could plausibly lead to violations of privacy and human rights (harm category c) if the data were misused or leaked. However, since no actual harm or misuse has been reported yet, and the article focuses on the potential risks and lack of safeguards, this situation fits the definition of an AI Hazard rather than an AI Incident. It is not merely complementary information, because the core issue is the plausible risk of harm from AI-related data practices, not just an update or response to a past incident.
Life360 allegedly sells the precise location data of millions of users

2021-12-07
PhonAndroid
Why's our monitor labelling this an incident or hazard?
Life360's service relies on AI or algorithmic systems to collect, process, and share precise location data. The sale of this data to third parties, including sensitive data about children, constitutes a violation of privacy rights and fundamental rights protections. The harm is realized, as users' location data is being sold without adequate safeguards, leading to potential misuse and privacy breaches. The AI system's role in processing and enabling this data sharing is pivotal, making this an AI Incident under the framework's criteria for violations of human rights and breaches of legal obligations.
Buyer of Tile trackers caught red-handed

2021-12-08
L'essentiel
Why's our monitor labelling this an incident or hazard?
An AI system can reasonably be inferred here, since the tracking and location-sharing app uses AI-based data processing to collect, analyze, and share user location data. The misuse or inadequate protection of this data has led to violations of privacy rights, a form of human rights violation under applicable law. Since the event involves actual data collection and resale practices causing harm to users' privacy and rights, it qualifies as an AI Incident under the category of violations of human rights or breach of legal protections.