Former public radio anchor David Greene, best known for his years on Morning Edition and his current role on Left, Right & Center, has filed a lawsuit against Google claiming that the company’s NotebookLM artificial-intelligence product replicates his distinctive voice without his permission or compensation. Greene alleges that the AI-generated male podcast voice in NotebookLM’s Audio Overviews feature so closely mimics his cadence, intonation and habitual speech patterns that friends and colleagues repeatedly asked whether he had licensed it to Google. He described hearing the synthetic voice as “completely freaky” and “eerie,” and argues that the technology effectively appropriates his professional identity. The complaint, filed in California, points to these similarities and contends that Google violated his rights by building a product on a vocal style he developed over decades of broadcasting. Google has pushed back, dismissing the allegations as “baseless” and asserting that the NotebookLM voice was produced using a paid professional actor. The case highlights broader tensions between traditional media professionals and AI developers over where imitation ends and infringement begins, and could have significant implications for how voice-based AI services are regulated and compensated in the future. (Associated reporting from Washington Post; similar coverage by AndroidAuthority and Times of India)
Sources
https://www.washingtonpost.com/technology/2026/02/15/david-greene-google-ai-podcast/
https://www.androidauthority.com/google-notebooklm-david-greene-voice-lawsuit-3641264/
https://timesofindia.indiatimes.com/technology/tech-news/popular-radio-show-host-david-greene-claims-google-stole-his-voice-google-responds/articleshow/128414462.cms
Key Takeaways
• David Greene, former Morning Edition host, is suing Google, alleging NotebookLM’s AI voice copies his unique broadcasting style without his consent or compensation.
• Google denies the claim, saying the AI voice in question was created by a paid professional actor, but Greene and his supporters argue the similarities are unmistakable and personal.
• The lawsuit illustrates rising legal and ethical conflicts between established media figures and AI developers over voice cloning and intellectual property rights.
In-Depth
David Greene’s legal confrontation with Google underscores a growing and complex debate at the intersection of artificial intelligence, intellectual property and personal rights. Greene — a respected voice in public radio for years, including a tenure as co-host of Morning Edition and later work on the political podcast Left, Right & Center — claims that Google’s NotebookLM AI tool has taken his distinctive voice and broadcast style without his authorization. According to multiple reports, Greene became aware of the issue after friends and colleagues began reaching out, asking him if he had agreed to let Google use his voice. After listening to the NotebookLM audio himself, Greene was alarmed by how closely the synthetic voice matched not only his vocal tone but also his patterns of speech, including cadence, phrasing and even filler words that had become familiar over decades of broadcasting. This startling resemblance, Greene says, was more than a generic similarity — it felt like an imitation of his personal identity, honed through years of professional practice.
In response to these concerns, Greene took legal action in Santa Clara County, California, alleging that Google’s use of this AI voice violated his rights, deprived him of proper compensation and potentially misled listeners into thinking he was involved with a product he had never endorsed or recorded. The lawsuit argues that the NotebookLM voice mirrors Greene’s own mannerisms to such an extent that even people who knew him well assumed he had licensed the voice to Google, which he emphatically did not. This claim places a spotlight on broader questions about emerging AI technologies: how companies collect and use voice data, the boundaries between imitation and protected identity, and the rights individuals should retain over their own vocal likenesses.
Google’s initial response has been to label the allegations “baseless,” saying the male voice used in NotebookLM’s Audio Overviews feature was created by a paid professional actor rather than being derived from Greene’s voice. That assertion sets up a legal dispute over both the evidence and the broader legal frameworks that govern voice imitation in AI products. As this case unfolds, it could pave the way for future litigation and regulatory action concerning how AI companies source, replicate and monetize human-like voices. Beyond Greene’s personal stake, the lawsuit reflects anxieties among media professionals and creators about the risks AI poses to their livelihoods and identities when technology blurs the lines between inspiration and appropriation. In an era where synthetic voices can be generated with increasing realism, courts and lawmakers may be pushed to define new boundaries for consent, compensation and the preservation of individual rights in the age of artificial intelligence.