https://www.geeknetic.es/Noticia/36707/Tim-Sweeney-CEO-de-Epic-Games-rechaza-las-etiquetas-de-IA-en-las-tiendas-de-juegos-mientras-la-industria-avanza-hacia-el-dominio-de-la-IA-generativa.html
Tim Sweeney has never been exactly discreet. The CEO of Epic Games has spent years giving unfiltered opinions on store commissions, monopolies, mobile platforms and, now, generative artificial intelligence. His latest clash was with Valve and, by extension, with a good part of the gaming community.
It all starts with Valve's new requirements on Steam, where developers and publishers are asked to disclose whether they have used generative AI at any point in the process. It is not a veto on AI, just a transparency label. In a recent post, Sweeney dismissed the measure as absurd.
Translated: if the use of generative AI is going to be the norm, specific notices would no longer make sense. And, in passing, he implied that at Epic (and most likely in its own projects) AI is already part of day-to-day development.
The message has another important reading: it is almost an advance confirmation that the Epic Games Store will not follow Steam's lead and will not require these kinds of disclosures from the studios that publish their games there.
The internal contradiction: Sketchfab does ask to warn about AI
The situation becomes more curious when Sketchfab enters the scene, the 3D model platform that Epic bought in 2021. Recently, Sketchfab began requiring creators to indicate whether a piece was made with the help of generative AI; that is, exactly what Sweeney considers unnecessary for a video game store.
The probable explanation is much less ideological and much more pragmatic: lawyers. For a 3D content platform that hosts models created with AI tools, a clear notice about what has been AI-generated helps cover its back on copyright and licensing issues.
In other words, where legal and intellectual property risk is concerned, Epic is interested in protecting itself; where the store and its relationship with players is concerned, Sweeney prefers not to complicate the experience with notices and labels. And that is where many have seen an inconsistency: on the one hand, the legal risk of AI is acknowledged; on the other, the need to inform the consumer is played down.
Gamers who demand to know everything: from misleading trailers to AI
The response to Sweeney's message was quick and quite forceful. Many players asked for something quite simple: to know when a game has used generative AI in its production, whether for art, voices or text. They compared it to the rules that require clarifying whether a trailer includes footage that is not representative of gameplay, or scenes captured on a different version of the game.
The reasoning is direct: if transparency is required when a trailer can be misleading, why not do the same when a technology has been used that affects the work of artists, writers or voice actors?
Some went even further and framed it as a consumer decision: if a player wants to vote with their wallet and avoid projects that lean heavily on generative AI, they need to be able to identify which ones those are. Without labels or warnings, that choice becomes very difficult.
The context: EA, Krafton, Take-Two and the stumbles with AI
Sweeney's words do not arrive in a vacuum. They come after several weeks in which other big names in the sector have made it clear that they want to integrate AI into their workflows, with results that have not always been a model of subtlety.
EA and Krafton have talked openly about using AI in creative and production processes, and some of these experiments have already generated controversy, especially when the results have been poor or have directly replaced tasks usually done by a human team. At the same time, the CEO of Take-Two has openly argued that AI can take over part of the dialogue and settings that today depend on voice actors.
Between inevitability and the right to know
What Sweeney proposes, at its core, is an almost deterministic vision: if AI is going to be everywhere, dropping the warnings amounts to accepting it as just another development tool, on the same level as a graphics engine or a sound editor.
The problem is that, for many people, generative AI is not "just another tool." It affects specific jobs, has legal and ethical implications, and sits at the center of a huge cultural debate that goes far beyond video games. That is why many do not buy the argument that labels are meaningless. On the contrary: the more present AI is, the more logical it seems to demand transparency.
For now, the only thing that is clear is that this clash between Sweeney's vision and the community's expectations will not be the last. Generative AI is not only going to change how video games are built; it will also force us to rethink how much information players deserve (and demand) about what is behind each title they install.
