It’s very likely—and it's already happening. Here's where things stand and where they could go:
Current Legal Landscape
1. Existing lawsuits have already been filed:
- Getty Images vs. Stability AI (UK & US) – For alleged misuse of copyrighted images to train Stable Diffusion.
- Sarah Andersen et al. vs. Stability AI, Midjourney, and DeviantArt – A class action filed by artists claiming infringement via AI training on their artwork without consent.
2. Claims often include:
- Copyright infringement – Training on copyrighted art without permission.
- DMCA violations – Removal of attribution or misuse of watermarked content.
- Unfair competition – Undercutting artists by using their work to generate derivatives.
Legal Hurdles for Artists
Despite the emotional and ethical strength of their arguments, these cases face uphill battles:
1. Copyright law isn’t settled here.
- Courts are still deciding if training on copyrighted material constitutes infringement.
- There’s precedent that reading or analyzing a work (like a human artist studying a painting) doesn’t equal infringement, but AI training might stretch that analogy.
2. Derivative work claims are tricky.
- If an AI image doesn’t closely resemble the original, it might be deemed “transformative” or otherwise non-infringing under U.S. fair use.
3. Class certification is hard.
- Artists must prove they’ve suffered similar, concrete harm.
- They also need to demonstrate that AI outputs are traceable to their specific works—which is hard unless the model overfits and replicates them directly.
Likelihood Going Forward: HIGH
- More lawsuits are coming. As more artists realize their work was in training sets, legal pressure will mount.
- Class actions are especially appealing because the harm is dispersed across thousands of artists; banding together is often the only practical way to litigate.
- Support and funding are growing from groups like the Authors Guild, legal advocacy organizations, and even sympathetic tech insiders.
Possible Outcomes
- Massive settlements (think: Getty scenario) where AI companies pay licensing fees retroactively.
- Consent-based datasets become the norm, with opt-in or licensing marketplaces (already happening with Shutterstock, Adobe, etc.).
- Legislation may eventually impose clear opt-in rules or compensation mechanisms.