AI Against Humanity
Privacy 📅 April 2, 2026

Perplexity's "Incognito Mode" is a "sham," lawsuit says

A lawsuit against Perplexity, Google, and Meta reveals serious privacy violations linked to AI chat systems. Users' sensitive data is allegedly shared without consent.

A lawsuit filed against Perplexity, Google, and Meta alleges that Perplexity's "Incognito Mode" misleads users about the privacy protection it provides. According to the complaint, sensitive information from both subscribed and non-subscribed users, including personal financial and health discussions, is shared with Google and Meta without consent. The suit describes the ad trackers employed by these companies as akin to "browser-based wiretap technology" that violates state and federal privacy laws.

The plaintiff, identified as Doe, asserts that he was unaware of this data transmission, which could enable targeted advertising based on sensitive information. The complaint criticizes Perplexity for inadequate disclosure in its privacy policy and highlights the ethical stakes of AI systems that fail to safeguard user privacy.

The case raises urgent questions about transparency and accountability in AI technologies, particularly as they become more integrated into daily life and handle sensitive personal data. It underscores the need for companies to genuinely protect user privacy and could result in substantial fines and damages for the alleged violations of legal standards and the companies' own privacy policies.

Why This Matters

This lawsuit highlights serious privacy violations associated with AI technologies that many users may encounter unknowingly. Exposure of sensitive data has far-reaching consequences for individual privacy and for trust in digital platforms. Understanding these issues is crucial for advocating stronger regulations and ethical standards in AI deployment.

Original Source


Read the original source at arstechnica.com ↗
