Imagine spending 30 years mastering brushwork, only to be digitally outperformed by an algorithm that thinks your style is just a remixable option. Welcome to 2025, where your life’s work might be nothing more than a dataset.

Generative AI has sparked a global brawl over intellectual property that’s less “friendly chat” and more “gloves off, see you in court.” The central gripe? AI models, particularly the brainy ones like large language models (LLMs) and GANs, stand accused of hoovering up humanity’s creativity without so much as a tip of the hat (or a royalty cheque).

Consent? Compensation? Credit? AI says: “Nah.”

Enter the “Triple-C” complaint. Creators are angry they weren’t asked (Consent), weren’t paid (Compensation), and aren’t acknowledged (Credit). These models don’t “learn” in the human sense; they ingest, process, and spit out “new” content based on the real work of real humans.

It’s like feeding Picasso and Tolkien into a blender and calling the smoothie “original.”

Fair Use or Foul Play?

Courts across the UK and US are knee-deep in lawsuits, trying to decide whether training an AI on copyrighted content is transformative genius or a high-tech smash-and-grab.

Spoiler alert: it depends who you ask. Some judges say AI is like a student taking notes. Others say it’s more like a photocopier with a corporate bank account.

In one particularly juicy case, Bartz v. Anthropic, the court ruled that pirated books used in AI training definitely aren’t fair game. Meanwhile, the Authors Guild (backed by literary titans like George R.R. Martin) has brought class actions against AI developers, arguing that authors’ entire styles and livelihoods are being cloned.

Artists vs. Algorithms: Who’s Winning?

If you thought the law was slow, artists aren’t waiting around: they’re speeding things up with tech of their own. Tools like Glaze and Nightshade add invisible “poison” to images, messing with AI training in a delightfully spiteful way.

But it’s an arms race. AI researchers have already launched LightShed, a counter-tool that can detect and scrub out these poisons with 99.98% accuracy. Yes, you read that right. We’ve reached the stage of cybersecurity warfare for watercolours.
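For the technically curious, here’s a toy sketch of the general idea behind these cloaking tools: nudge pixel values by a small, structured amount that a human barely notices but that skews what a model learns from the image. To be clear, this is not the actual Glaze or Nightshade algorithm (those optimise perturbations against a surrogate model’s feature space); it’s a minimal illustration in Python, and the file names are hypothetical.

```python
# Toy illustration only: a small, structured pixel perturbation.
# Real cloaking tools optimise their perturbations against a surrogate
# model's feature space; this sketch just shows the basic idea of a
# change that is hard to see but easy for a model to pick up on.
import numpy as np
from PIL import Image

def add_toy_cloak(in_path: str, out_path: str, strength: float = 3.0) -> None:
    img = np.asarray(Image.open(in_path).convert("RGB"), dtype=np.float32)

    # Build a low-amplitude sinusoidal pattern the same size as the image.
    h, w, _ = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    pattern = np.sin(xx / 3.0) * np.cos(yy / 5.0)    # values in [-1, 1]
    perturbation = strength * pattern[..., None]     # broadcast across RGB

    # Apply and clip back to the valid pixel range; a shift of a few
    # intensity levels is usually invisible to the naked eye.
    cloaked = np.clip(img + perturbation, 0, 255).astype(np.uint8)
    Image.fromarray(cloaked).save(out_path)

# Example usage (hypothetical file names):
# add_toy_cloak("original.png", "cloaked.png")
```

Counter-tools like LightShed, broadly speaking, hunt for the statistical footprint such perturbations leave behind and strip it out, which is exactly why this has turned into an arms race.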

The Ethical vs. Scraped Model Divide

As companies scramble to avoid litigation, the market’s split into two camps:

  • “Ethical” AI, like Adobe Firefly, trained only on licensed or public-domain content. It’s the professional’s safe bet, with indemnification thrown in for good measure.
  • “Scraped” AI, like Stable Diffusion and some versions of Midjourney, trained on a patchwork of internet data, copyrighted or not.

Big business is flocking to the safe stuff. Creatives, on the other hand, are torn. Do they protect their work or ride the AI wave for exposure?

There are some middle-ground players too. OpenAI claims ChatGPT and DALL·E are trained on licensed and publicly available data, but without full transparency, we’re left guessing what exactly is in the mix.

The Existential Bit: Is Creativity Dying?

AI isn’t just a copyright issue; it’s a creative identity crisis. When your unique voice can be cloned in seconds, what does originality even mean?

Studies suggest AI may be increasing idea diversity, but at the cost of human confidence. Why toil for 10 hours on a song when AI can make five variations in 10 seconds? The result? A world drowning in content, and starving for soul.

But Haven’t We Always Borrowed?

Here’s the twist: artists have always looked to others for inspiration. We call it influence. In design, we built mood boards. In music, we borrowed chord progressions. In writing, we read voraciously to find our voice. Creativity has always been a remix of the past—but what made it human was the interpretation, the lived experience, the personal flair.

The difference now? AI doesn’t interpret. It imitates. It does it at scale, and it doesn’t ask permission.

So yes, we’ve always borrowed. But we didn’t industrialise it into a business model that sidelines the original creators.

So, What Next?

There’s hope. The “AI is theft” movement has catalysed:

  • Policy shifts, like opt-in datasets and contracts for training.
  • Stricter interpretations of fair use, especially when market impact is clear.
  • A bifurcated tech market, with legal-safe models attracting corporates and open source remaining the Wild West.

In the end, it’s about balance. AI shouldn’t replace human creativity. It should elevate it. Like a great paintbrush, not a factory of forged Rembrandts.

As 2026 approaches, one thing’s clear: if we don’t get this right, the value of art, music, and storytelling could vanish into the noise of algorithms.

I am completely against the theft of work and believe people should be recognised and compensated for their contributions—whether that’s through properly licensed material for training, or royalties when their influence is used in an output. If you ask an AI to write a song in the style of Ed Sheeran, Ed should see a slice of that success.

The problem is, it’s already happening. Businesses are using generative AI to gain a strategic edge, and often it’s impossible to tell whether something was created by a human, a machine, or a bit of both. So where does that leave us?

We can choose to work with AI as ethically as possible: licensing what we use, crediting creators, being transparent about when it’s used, and setting boundaries. Or we can try to block it completely. But the risk is that doing so just widens the gap between those moving forward successfully and those who turn their back on evolution.

It’s not about fear of change. It’s about shaping that change in a way that doesn’t erase the humans who made it all possible in the first place.
