How to Navigate the AI Industry's Legal Problems

Experts are sounding the alarm that the AI industry is about to have serious legal problems. And you might too if you use generative AI tools.

First, a report in The Wall Street Journal warns:

“Companies that simply use generative AI—say, by using OpenAI’s tech as part of a service delivered to a customer—are likely to be responsible for the outputs of such systems.”

And it seems like the legality of the output of these systems is seriously in question. Another report in The New York Times reveals:

“OpenAI, Google, and Meta ignored corporate policies, altered their own rules, and discussed skirting copyright law” in order to train their models. The article then cites several ways these companies acquired potentially copyrighted data.

How should business leaders navigate the legal and ethical minefields around generative AI?

I got the answer from Marketing AI Institute founder/CEO Paul Roetzer on Episode 91 of The Artificial Intelligence Show.

You don't own anything created by generative AI

The first thing to know is this:

If you're in the United States, you don't own anything created by generative AI. At least, that's according to the latest guidance from the US Copyright Office.

That means you cannot copyright something created by generative AI. Prompting a system alone is not enough to prove human authorship. 

Now, this could change. But, as of today, it's the guidance provided by the US government.

“To my knowledge, I don’t believe they have updated any additional guidance or given any additional indications about how copyright law may evolve in the United States," says Roetzer.

So, if you need a copyright on something in the US, you can't really use generative AI to create it.

You need to consider the legal implications of using generative AI

Second, you may bear legal risk simply by using these tools.

It's possible courts will determine that OpenAI, Google, and others trained models illegally. That could then make customers who use those tools liable for the outputs.

Some big tech companies are already trying to get ahead of this. Microsoft and Google say they'll cover your legal bills if you're sued for using their tools.

“I don’t know if that’s going to make your legal department feel much better about it, though," says Roetzer.

The New York Times story makes it clear that every major company cut corners when training models. OpenAI used its Whisper speech-to-text model to transcribe YouTube videos for training data. Google trained models on YouTube content whose rights it doesn't technically own. And Meta executives openly said it would take too long to negotiate usage rights with publishers.

“So basically everyone is violating all of these rights and just sort of going full steam ahead," says Roetzer.

It helps to remember:

We still have to wait for courts to actually decide these things.

Roetzer thinks it's likely the companies pay fines and we all move on. But that's not the point, he says. The point is that you need to start planning for the legal implications of generative AI today.

So, how do you start doing that?

The first step is: Get legal involved.

“This isn’t something you’re going to solve on your own," says Roetzer. "You’ve got to have the lawyers in the room.”

Second, you need generative AI policies.

You need policies both for your team and for all your service providers. Those policies should extend to anyone you contract with to create content, and existing contracts should be reviewed.

Remember, if your contractor uses AI to create content, you do not own that content. And agencies or contractors may not even know the extent to which AI is being used in their own work.

“Our experience has been that a lot of agencies and freelancers are just trying to figure this stuff out," says Roetzer.

Roetzer says to ask yourself three big questions when creating generative AI policies:

  1. What AI technologies is someone allowed to use?
  2. How are they allowed to use these technologies?
  3. And do they have to disclose their use of these technologies?

The good news? There are plenty of generative AI policy templates out there to get you started.
