One of YouTube’s most popular podcasters has criticised the tech platform for taking content from creators to train artificial intelligence (AI) programs without paying compensation.

Steven Bartlett, host of the podcast The Diary of a CEO, warned Google that it risked “hollowing out” YouTube, which it owns, if the platform lost the trust of creators like him. Bartlett was not aware that creators’ content was being taken by YouTube until he was contacted by The Times, despite his prominence on the video-sharing site.

Bartlett hosts the second-biggest podcast on YouTube, with more than ten million subscribers, a figure surpassed only by Joe Rogan. It is the fastest-growing podcast, adding between 300,000 and 500,000 new subscribers a month.

YouTube said it had been transparent about using creators’ content for AI by publishing a blog post last September in which it said: “We use content uploaded to YouTube to improve the product experience for creators and viewers across YouTube and Google, including through machine learning and AI applications. We do this consistent with the terms that creators agree to.”

But Bartlett, who has also starred on Dragons’ Den, said: “I was really surprised … the flywheel between platforms, creators and audiences only spins when trust and value flow both ways. Right now there is a widening gap between the upside AI can unlock and the return most creators see for the data that powers those models. If that gap keeps growing, it risks hollowing out the very ecosystem that makes YouTube valuable. This is a risk not just for creators but also for YouTube.”

He called for “constructive dialogue that secures fair value for creators and media owners, while still allowing YouTube and others to innovate”.

Companies such as Google require huge amounts of data, such as video, to create their AI programs. Google’s Veo 3 tool has become an internet sensation with its hyper-realistic videos, which can be created from simple text prompts.

Material from the YouTube creator Brodie Moss had been used in a video created using Google’s Veo 3 tool, according to the tracking company Vermillio

Vermillio, a start-up that identifies where creators’ content has been used in AI output, said it had discovered the use of material posted by Brodie Moss, a prominent Australian YouTuber who catalogues his ocean adventures and fishing trips on the YBS Youngbloods channel. Vermillio used its TraceID technology to prove that Moss’s material had turned up in a Veo 3 video. Moss was approached for comment.

Dan Neely, the chief executive of Vermillio, said: “The reality is that creators’ content is being used to train AI, and while YouTube may point to blog posts as proof of disclosure, that’s not meaningful transparency. Most creators have no idea this is happening.

“YouTube needs to communicate in a way that’s impossible to miss, not just in ways that protect the company but in ways that respect the creators who built the platform. Our customers, who are creators, want what they’re already offered by other AI developers, which is the ability to opt out or to be fairly compensated for the use of their data.”

Alex Segal, the managing director of InterTalent, one of the leading creator talent agencies, said: “You set a dangerous precedent when you start using content without either asking for approval or paying for it. I think that there needs to be some sort of legal system where you have to ask for approval.

“If people are going to start doing this without asking, without disclosing and without paying, it is setting a horrific precedent because that will only go one way … south, very quickly.”

Videos created with Veo 3, which are easy to make, have become wildly popular

Segal said his agency now inserted a clause in all contracts to stop clients’ content being used for AI training without permission. However, one senior executive in the industry said his clients were afraid to act because they feared being downgraded in Google searches as a result.

He said: “I do make them aware of some of these issues … A lot of the time they’re just throwing their hands up in the air because they don’t know what they can even do.

“It feels like a lost cause. ‘I can remove myself from search, but then my revenue goes to hell. If I take a stance, they’re just going to penalise me.’ This is the fear.”

The issue came to a head in the US Senate last month when a YouTube executive was criticised for the practice.

Josh Hawley, a Republican senator for Missouri, told Suzana Carlos, head of music policy for YouTube: “The fact that YouTube is monetising these kinds of videos seems like a huge, huge problem to me … YouTube, I’m sure, is making billions of dollars off of this.

“The people who are losing are the artists and the creators and the teenagers whose lives are upended.”

Jack Malon, a YouTube spokesman, said: “We’re clear in our terms of service that we use YouTube data to make our products better, and this remains the case with AI.”

The company claimed it had paid out $70 billion to creators, artists and media companies between 2021 and 2023. It added that “only a subset of the videos on YouTube may be used to help power our products”.