A federal judge ruled that training artificial intelligence on books is not a copyright violation, delivering a partial win to AI startup Anthropic while allowing a lawsuit against the company to proceed.
Judge William Alsup determined that Anthropic’s use of copyrighted books to train its Claude AI model was “exceedingly transformative” and therefore qualified as fair use under U.S. copyright law. The ruling marks a significant development in the ongoing debate over how large language models can be trained on existing creative works.
The case was brought by three authors — Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson — who accused the firm of copying their books without permission to build a powerful AI product. Anthropic, backed by tech giants Amazon and Alphabet, could face penalties of up to $150,000 per title if found liable.
While Judge Alsup upheld Anthropic’s fair use defense, he denied the company’s attempt to dismiss the case entirely. The court found that the company may have broken the law by storing over seven million pirated books in what was described as a “central library.”
“Like any reader aspiring to be a writer, Anthropic’s LLMs trained upon works not to replicate but to create something different,” Alsup wrote. He noted that the authors had not shown the AI reproduced their books directly: “If they had, this would be a different case.”
Anthropic said it was encouraged by the court’s recognition of its training methods as legal but disagreed with the decision to proceed to trial over the acquisition and storage of the books.
Legal disputes over AI training are growing. Just this month, Disney and Universal sued AI image platform Midjourney, and the BBC is considering similar action. In response, some AI firms have begun licensing content directly from rights holders.
Judge Alsup’s decision is one of the first to clarify how courts may treat AI’s use of copyrighted material, setting a possible precedent as similar cases move through the legal system.