HarperCollins wants authors to sign away AI training rights
If you’d like an essay-formatted version of this post to read or share, here’s a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2024/11/18/rights-without-power/#careful-what-you-wish-for
Rights don’t give you power. People with power can claim rights. Giving a “right” to someone powerless just transfers it to someone more powerful than them. Nowhere is this more visible than in copyright fights, where creative workers are given new rights that are immediately hoovered up by their bosses.
It’s not clear whether copyright gives anyone the right to control whether their work is used to train an AI model. It’s very common for people (including high ranking officials in entertainment companies, and practicing lawyers who don’t practice IP law) to overestimate their understanding of copyright in general, and their knowledge of fair use in particular.
Here’s a hint: any time someone says “X can never be fair use,” they are wrong and don’t know what they’re talking about (same goes for “X is always fair use”). Likewise, if someone tells you, “Fair use is assessed solely by considering the ‘four factors,’” that is your iron-clad sign that the speaker does not understand fair use:
https://pluralistic.net/2024/06/27/nuke-first/#ask-questions-never
But let’s say for the sake of argument that training a model on someone’s work is a copyright violation, and so training is a licensable activity, and AI companies must get permission from rightsholders before they use their copyrighted works to train a model.
Even if that’s not how copyright works today, it’s how things could work. No one came down off a mountain with two stone tablets bearing the text of 17 USC chiseled in very, very tiny writing. We totally overhauled copyright in 1976, and again in 1998. There’ve been several smaller alterations since.
We could easily write a new law that requires licensing for AI training, and it’s not hard to imagine that happening, given the current confluence of interests among creative workers (who are worried about AI pitchmen’s proclaimed intention to destroy their livelihoods) and entertainment companies (who are suing many AI companies).
Creative workers are an essential element of that coalition. Without those workers as moral standard-bearers, it’s hard to imagine the cause getting much traction. No one seriously believes that entertainment execs like Warner CEO David Zaslav actually care about creative works – this is a guy who happily deleted every copy of an unreleased major film with superb early notices because it would be worth infinitesimally more as a tax break than as a work of art:
https://collider.com/coyote-vs-acme-david-zaslav-never-seen/
The activists in this coalition commonly call it “anti AI.” But is it? Does David Zaslav – or any of the entertainment execs who are suing AI companies – want to prevent gen AI models from being used in the production of their products? No way – these guys love AI. Zaslav and his fellow movie execs held out against screenwriters demanding control over AI in the writers’ room for 148 days, and locked out their actors for another 118 days over the use of AI to replace actors. Studio execs forfeited at least $5 billion in a bid to insist on their right to use AI against workers.
Entertainment businesses love the idea of replacing their workers with AI. Now, that doesn’t mean that AI can replace workers: just because your boss can be sold an AI to do your job, it doesn’t mean that the AI he buys can actually do your job:
https://pluralistic.net/2024/07/25/accountability-sinks/#work-harder-not-smarter
So if we get the right to refuse to allow our work to be used to train a model, the “anti AI” coalition will fracture. Workers will (broadly) want to exercise that right to prevent AI models from being trained at all, while our bosses will want to exercise that right to be sure that they’re paid for AI training, and that they can steer production of the resulting model to maximize the number of workers they can fire after it’s done.
Hypothetically, creative workers could simply say to our bosses, “We will not sell you this right to authorize or refuse AI training that Congress just gave us.” But our bosses will then say, “Fine, you’re fired. We won’t hire you for this movie, or record your album, or publish your book.”
Given that there are only five major publishers, four major studios, three major labels, two ad-tech companies and one company that controls the whole ebook and audiobook market, a refusal to deal on the part of a small handful of firms effectively dooms you to obscurity.