AI Copying Is Not The Same As Human Copying | Commentary

AI copyright litigation that will define media and entertainment industry economics for decades to come is winding its way through the courts.

In all these infringement cases, essentially no one disputes that AI copies entire libraries of copyrighted works word-for-word (or image-by-image, note-for-note) when it trains on them without consent. Rather, those who try to defend it simply point out that the end result of AI’s heist frequently shows no direct relation to the original.

These defenders say AI's copying is fundamentally no different from what human artists have done since the beginning of time: building their creative works on the building blocks of those who came before them. They dismiss arguments to the contrary as a form of blind sentimentality that slows society's quest for continuous progress.

But humans generally don’t copy the creative works of others in their entirety, because when they do it’s usually called copyright infringement. And artists, even when they do copy, aren’t creating entirely new systems designed to enable billions of users to generate endless works that ultimately flood the marketplace, thereby competing directly with creators and squeezing them out.

Big Tech AI training, on the other hand, does precisely that. Sure, AI's outputs may show little resemblance to any individual creative work swallowed up by generative AI's insatiable black box, simply because of the sheer numbers involved. But AI's wholesale copying — at such grand scale and for such broad purposes — makes it all the more diabolical, not less.

No-holds-barred generative AI absolutists — a movement known as “Effective Accelerationism” — want us to forget that the question of copyright infringement, together with related claims like misappropriation, has two parts: the input and output sides of the equation. On the input side, wholesale copying combined with market substitution is enough to reject fair use and find infringement. The separate question of infringement on the output side doesn’t even need to come up. Don’t take that from me. Take it directly from the Supreme Court.

In its recent Andy Warhol Foundation v. Goldsmith decision (commonly referred to as “Warhol-Prince,” since a photograph of the musician Prince was central to it), the Supreme Court didn’t even reach the separate issue of infringement on the output side, which had already been conceded. Instead, the Court focused on the use and alleged infringement of the photograph itself.

In rejecting fair use as a defense, the 7-2 majority led by Justice Sonia Sotomayor wrote that market substitution is “copyright’s bête noire.” In her words, “The use of an original work to achieve a purpose that is the same as, or highly similar to, that of the original work is more likely to substitute for, or supplant, the work” which, in turn, “undermines the goal of copyright.”

The Court also pushed back on the notion that just because the alleged infringer’s outputs are “transformative” – i.e., they “have a different character” from the original – that is enough to find fair use. That’s what Big Tech wants us to believe. But the Court rejected that argument, noting that copyright’s protection is even stronger “where the copyrighted material serves an artistic rather than utilitarian function.” In its words, “To hold otherwise would potentially authorize a range of commercial copying of photographs, to be used for purposes that are substantially the same as those of the originals.”

And to Stability AI’s reliance on a fair use defense in the Getty Images lawsuit, I say: “Take that, Stability AI!”

Big Tech and the “Google Books” Case

But what about the famous Authors Guild v. Google case (commonly referred to as “Google Books”)? That is the case Big Tech always trots out to argue that no consent from — or compensation to — the creators on whose works its AI trains is required. The court found fair use in that case, the argument goes, so the same rationale applies to generative AI.

Wrong.

First, “Google Books” came out of the Second Circuit Court of Appeals — not the U.S. Supreme Court — so it isn’t the law of the land. Even if it were, the court in “Google Books” gave its stamp of approval to Google for fundamentally different reasons, none of which involved market substitution. Sure, there was wholesale copying there too — Google copied entire libraries of books without consent.

But Google did so to make them searchable and only displayed snippets of copied books in its search results. That, in turn, drove more discovery, sales and consumption of them — not less — which of course meant more dollars for the authors themselves. Google, in other words, promoted authors. It didn’t seek to replace them.

It’s the exact opposite when AI relentlessly scrapes creative works in their entirety. Big Tech developed generative AI systems precisely to create commercial substitutes for wholesale sectors of the media and entertainment industry. Global news analysis and features? Who needs The New York Times when you have Microsoft and OpenAI (litigation ongoing). Stock photos? Who needs Getty Images when you have Stability AI (litigation ongoing). These companies invested massively to create their works. Generative AI companies, however, believe they can simply take them – no payment needed.

Big Tech is on an endless quest to identify valuable creative content, vacuum it up to satisfy AI’s insatiable appetite, and then spit out its own artificially generated products to compete directly against copyright owners. These generative AI companies aim to be the one-stop shop for all kinds of creative works.

Only in a small number of cases does Big Tech seek to negotiate with creators and media companies (mere content repositories in its view), as it did just last week with the Financial Times. But the fact that it deigned to do so in even this small handful of cases only proves the point. If Big Tech were so confident in its position, why pay anyone at all?

Silicon Valley predictably warns of dire consequences from any roadblocks to generative AI’s unbridled arms race, all in the name of progress. The Supreme Court faced similar doom-and-gloom pronouncements in Warhol-Prince and flatly rejected them. Justice Sotomayor openly mocked claims that the Court’s decision would “snuff out the light of Western civilization, returning us to the Dark Ages ….” In her words, “It will not impoverish our world to require [the infringer] to pay [the creator] a fraction of the proceeds from its reuse of [the] copyrighted work.”

Exactly! That’s all we’re talking about here. The creative community is not trying to stop Big Tech’s development of generative AI. To the contrary, it expressly acknowledges AI’s power and potential. The Human Artistry Campaign, a coalition of major media and entertainment organizations, sets forth seven “core principles for artificial intelligence applications” in its mission. Principle number one states: “Technology has long empowered human expression, and AI will be no different.”

The creative community just expects Big Tech to pay for the foundational ingredients it needs for its AI tech to generate its value. In Justice Sotomayor’s words, new expression “does not in itself dispense with the need for licensing.”

So, don’t justify Big Tech’s relentless quest for its next trillion-dollar valuations on claims that it’s no different than what artists have done since the beginning of time.

It’s entirely different.

Reach out to Peter at peter@creativemedia.biz. For those of you interested in learning more, sign up for his “the brAIn” newsletter, visit his firm Creative Media at creativemedia.biz, and follow him on Threads @pcsathy.

The post AI Copying Is Not The Same As Human Copying | Commentary appeared first on TheWrap.