A judge ruled Anthropic’s AI training using copyrighted books qualified as fair use, but storing 7 million pirated copies violated the law.
A federal judge in San Francisco has reportedly ruled that Amazon.com-backed (AMZN) AI startup Anthropic’s use of books without permission to train its Claude AI model constituted ‘fair use’ under U.S. copyright law.
A Reuters report said that despite this win, U.S. District Judge William Alsup found that Anthropic unlawfully copied and stored over 7 million pirated books in a central repository.
Anthropic, backed by Amazon and Alphabet Inc. (GOOGL/GOOG), was sued by authors Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson for using their works to train its AI system without permission or payment, the report said.
Judge Alsup sided with Anthropic on transformative use, stating the training helped Claude develop new capabilities rather than reproduce the books verbatim.
However, Alsup found that Anthropic’s mass storage of pirated books fell outside fair use protections, and he ordered a December trial to determine statutory damages, which can reach $150,000 per infringed work under U.S. law.
The ruling marks a significant victory for artificial intelligence firms amid ongoing litigation concerning the use of copyrighted materials in training large language models.
Earlier this month, OpenAI filed an appeal against a court order that required it to preserve user output data from ChatGPT indefinitely. CEO Sam Altman said the request was “inappropriate” and violated user privacy.
While AI firms argue that incorporating copyrighted texts into model training drives innovation, copyright holders warn that unauthorized use threatens their livelihoods and intellectual property rights.
The outcome of the December trial may shape how future AI systems are trained and regulated.
Founded in 2021 by former OpenAI employees, Anthropic secured $3.5 billion in its latest Series E financing in March, boosting its valuation to $61.5 billion.
For updates and corrections, email newsroom[at]stocktwits[dot]com.