AT2k Design BBS Message Area
Casually read the BBS message area using an easy-to-use interface. Messages are categorized exactly as they are on the BBS. You may post new messages or reply to existing ones!


Area: Local Database > Slashdot [24 / 122]

From: VRSS
To: All
Subject: AI Industry Horrified To Face Largest Copyright Class Action Ever Certified
Date/Time: August 8, 2025 5:20 PM

Feed: Slashdot
Feed Link: https://slashdot.org/
---

Title: AI Industry Horrified To Face Largest Copyright Class Action Ever Certified

Link: https://yro.slashdot.org/story/25/08/08/20402...

An anonymous reader quotes a report from Ars Technica: AI industry groups are
urging an appeals court to block what they say is the largest copyright class
action ever certified. They've warned that a single lawsuit raised by three
authors over Anthropic's AI training now threatens to "financially ruin" the
entire AI industry if up to 7 million claimants end up joining the litigation
and forcing a settlement. Last week, Anthropic petitioned (PDF) to appeal the
class certification, urging the court to weigh questions that the district
court judge, William Alsup, seemingly did not. Alsup allegedly failed to
conduct a "rigorous analysis" of the potential class and instead based his
judgment on his "50 years" of experience, Anthropic said. If the appeals
court denies the petition, Anthropic argued, the emerging company may be
doomed.

As Anthropic argued, it now "faces hundreds of billions of dollars in
potential damages liability at trial in four months" based on a class
certification rushed at "warp speed" that involves "up to seven million
potential claimants, whose works span a century of publishing history," each
possibly triggering a $150,000 fine. Confronted with such extreme potential
damages, Anthropic may lose its rights to raise valid defenses of its AI
training, deciding it would be more prudent to settle, the company argued.
And that could set an alarming precedent, considering all the other lawsuits
generative AI (GenAI) companies face over training on copyrighted materials,
Anthropic argued. "One district court's errors should not be allowed to
decide the fate of a transformational GenAI company like Anthropic or so
heavily influence the future of the GenAI industry generally," Anthropic
wrote. "This Court can and should intervene now."

In a court filing Thursday,
the Consumer Technology Association and the Computer and Communications
Industry Association backed Anthropic, warning the appeals court that "the
district court's erroneous class certification" would threaten "immense harm
not only to a single AI company, but to the entire fledgling AI industry and
to America's global technological competitiveness." According to the groups,
allowing copyright class actions in AI training cases will result in a future
where copyright questions remain unresolved and the risk of "emboldened"
claimants forcing enormous settlements will chill investments in AI. "Such
potential liability in this case exerts incredibly coercive settlement
pressure for Anthropic," industry groups argued, concluding that "as
generative AI begins to shape the trajectory of the global economy, the
technology industry cannot withstand such devastating litigation. The United
States currently may be the global leader in AI development, but that could
change if litigation stymies investment by imposing excessive damages on AI
companies."

Read more of this story at Slashdot.

---
VRSS v2.1.180528

VADV-PHP

VADV-PHP Copyright © 2002-2025 Steve Winn, Aspect Technologies. All Rights Reserved.
Virtual Advanced Copyright © 1995-1997 Roland De Graaf.
v2.1.250224