AI Against Humanity
Privacy 📅 March 11, 2026

Grammarly's AI Feature Sparks Legal Controversy

Grammarly faces a class action lawsuit over an AI feature that attributed its editing suggestions to real authors and academics without their consent, raising serious questions about consent and intellectual property in AI-assisted writing.

Grammarly, a writing assistance tool owned by Superhuman, is facing a class action lawsuit over an AI feature known as 'Expert Review.' The feature presented users with editing suggestions that were falsely attributed to established authors and academics who had never consented to the use of their names.

The lawsuit highlights significant ethical concerns around the use of AI in content creation, particularly consent and intellectual property rights. By misrepresenting the source of its suggestions, Grammarly risks not only legal repercussions but also the trust of its user base and the reputations of the authors involved. The company has since shut down the feature, but the incident raises broader questions about AI in creative fields and the potential for misuse that harms individuals and communities. As AI systems become more deeply embedded in everyday applications, clear ethical guidelines and accountability are increasingly urgent to prevent similar incidents.

Why This Matters

The case underscores the ethical risks of AI technologies, particularly their capacity to misrepresent and exploit individuals' work without consent. The consequences extend beyond legal liability: misattribution can damage reputations and erode trust in AI systems. Understanding these risks is crucial as AI continues to spread across sectors, reshaping how content is created and consumed, and addressing them is essential to fostering responsible AI development and respecting creators' rights.

Original Source

Grammarly Is Facing a Class Action Lawsuit Over Its AI ‘Expert Review’ Feature

Read the original source at wired.com ↗
