STEAM GROUP: Steam Labs
Generative AI content filtering
We should have a way to hide games featuring content generated through AI models, whether in Search, Discovery Queue or Interactive Recommender, just like we are able to filter out violence and mature content.

The current disclaimer is not nearly enough, given how it's located way down in the description, often not even visible unless the user scrolls and clicks on the 'read more' dropdown. It should be at the top and highlighted, in the vein of VR-only games.

Also, I know there's something in the works that would enable reporting when AI content is featured and not disclosed, or infringing copyright, but we haven't heard anything in months and I've seen plenty of games in the store doing exactly that.
Showing 1-12 of 12 comments
wait they already have a disclaimer for generative AI?
yep, a few random examples:

https://gtm.you1.cn/storesteam/app/1304930/The_Outlast_Trials/

https://gtm.you1.cn/storesteam/app/2536840/GINKA/

https://gtm.you1.cn/storesteam/app/1575940/_2/


You can pull the complete list from SteamDB by going to Search > App keys and setting the key name to 'common_aicontenttype'.

link[steamdb.info]
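
If you just want to check an individual store page yourself, something like this rough Python sketch should do it. It just fetches the page and looks for the disclosure section's heading text; the exact heading string, the age-gate cookie value and the plain substring match are guesses on my part, so adjust them if Valve changes the page.

# Rough sketch: flag store pages that appear to carry an AI-content disclosure.
# The heading string and the age-gate cookie are assumptions, not an official API.
import requests

AI_DISCLOSURE_MARKER = "AI Generated Content Disclosure"  # assumed heading text

def has_ai_disclosure(app_id: int) -> bool:
    """Return True if the store page seems to include an AI-content disclosure."""
    url = f"https://store.steampowered.com/app/{app_id}/"
    # The 'birthtime' cookie roughly mimics clicking through the age gate on mature pages.
    resp = requests.get(url, cookies={"birthtime": "189302401"}, timeout=10)
    resp.raise_for_status()
    return AI_DISCLOSURE_MARKER in resp.text

for app_id in (1304930, 2536840):  # the two examples linked above
    print(app_id, has_ai_disclosure(app_id))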
MrHydrated 20 Jul @ 10:24am 
Yes please, I don't need to see any more poorly asset-flipped, AI-generated garbage in my discovery queue...
J. 10 Aug @ 9:08am 
Absolutely need this.
Mike-13 26 Aug @ 2:38pm 
+1

Maybe show it like the third-party DRM, EULA, ALUF notices... something more visual. I see the examples

Originally posted by nabisaeki:
yep, a few random examples:

https://gtm.you1.cn/storesteam/app/1304930/The_Outlast_Trials/

https://gtm.you1.cn/storesteam/app/2536840/GINKA/

https://gtm.you1.cn/storesteam/app/1575940/_2/


You can pull the complete list from SteamDB by going to Search > App keys and setting the key name to 'common_aicontenttype'.

link[steamdb.info]

and I still need to look around to find it...
Bradly105 14 Sep @ 10:23am 
+1
Silavire 16 Sep @ 7:07am 
Just came here to ask about this.
There's already a filtering option for VR Supported/VR Only games; it shouldn't take much effort to add an extra option there.
NavySpheal 17 Sep @ 10:21am 
I 100% support this; it would be a great addition for people who don't want to see games that use AI generation, and it shouldn't be too hard to implement since similar filters already exist for sexual and violent games. I know that I and a lot of other people would love for this to be a thing.
@R+5 26 Sep @ 7:41pm 
i would prefer, rather than only the option to filter them out, to be able to spot them easily, like adding a small frame or icon next to the image or description of the game.

I don't mind games using AI content, because it's impossible to always know whether devs just dropped it in, or tweaked and manually improved the art. Imo, it's fair to generate AI art to speed up game dev and make it cheaper and more accessible. So, imo, a "warning icon" makes more sense. Maybe something like the classic "chip + brain" design.

that can lead to more people focusing a lot more in actual game design, level design, and in general polishing ideas with smaller teams, or even as "solo dev".
Last edited by @R+5; 26 Sep @ 7:43pm
Originally posted by @R+5:
i would prefer, rather than only the option to filter them out, to be able to spot them easily, like adding a small frame or icon next to the image or description of the game.

Yeah I mean, 2nd paragraph was all about visibility, so we agree on that point.

Originally posted by @R+5:
that can lead to more people focusing a lot more in actual game design, level design, and in general polishing ideas with smaller teams, or even as "solo dev".

I see where you're coming from, and I might have been on the same page at some point in time, during the "early stages".
The issue is none of the things you mention is showing up in my queue (or anywhere else for that matter), just a lot of low effort, copy-paste garbage, using AI trained on databases that contain copyrighted works (I didn't link them as examples, because I didn't want to give them any publicity). And "tweaking" doesn't really make it any more acceptable if you ask me, despite what some in the industry may want us to believe.
So yeah, I really do want the option to filter them out.

"The problem is not the tool, but the use you make of it" and so far most people have just shown over and over again that they can't be trusted to use these tools properly, to the point where now I don't really feel like supporting anyone using them, in any fashion. And this is overlooking the fact that at least some of these tools might have never been proper to begin with (given the material used to "build" them).

I'm all about progress and developing new ways to make certain tasks more efficient, but this is a completely different matter from, say, singing voice synthesizers (eg. Vocaloid). "I can't sing, I don't have the means/time to learn, but I know how to compose/play. So I'm gonna use this software (that contains samples of a human voice, provided willingly by an actual person) as a musical instrument and, with it, create original music".

There's no scarcity of games out there, I don't need devs to churn them out faster. I'd rather they took their time and did what they can with what they have. I don't care if their drawing skills are those of a 3-year-old; if they indeed focused on (to quote you) "game design, level design, and in general polishing ideas", the product will be good.
People will recognize that and reward them for the effort and the passion :)
@R+5 2 Oct @ 4:58pm 
Originally posted by nabisaeki:
The issue is none of the things you mention is showing up in my queue (or anywhere else for that matter), just a lot of low effort, copy-paste garbage,

But does that sell well, or even well enough? Probably not, which is why most of that content will become part of the new generation of shovelware and asset flips. More often than not, those never become successful enough, or survive long enough, to become the main MO of a cynical "dev".

Many games like that are often from new devs who rushed one of their first attempts, trying to earn a bit of cash without putting in the work to create a proper game, which is why they rarely end up making actual games. If Steam Greenlight still existed, maybe those AI-content games could be a big issue, but now they are hardly different from other shovelware imo.

Bad devs can try to sell them, and we can tag them and ignore them.

Originally posted by nabisaeki:
using AI trained on databases that contain copyrighted works (I didn't link them as examples, because I didn't want to give them any publicity).

You should list examples, for more objective criticism. That's fair.

Imo, that also shouldn't be an issue for players or customers (as long as they are aware of the reason for the "generic low effort appearance"); from the perspective of someone who trains an engine with such content to later use it as a tool, I think it's also fair. Maybe the results could end up in a legal grey area, since the new content may be close enough to the original source, but still different enough to be taken as "something new". And collecting the examples, especially high-quality ones, and then training the AI does take time and resources (I don't know how much is needed for better results, because afaik the first limitation is the "learning skill" of each AI).

Originally posted by nabisaeki:
And "tweaking" doesn't really make it any more acceptable if you ask me, despite what some in the industry may want us to believe.

I'm not a big fan of impressionist artists, but most of their work is considered public domain. Imagine you want to create backgrounds for a game in that style: rather than spending money and time finding and dealing with artist(s), you could train an AI to generate, faster, stuff that's close enough to what you need. Depending on the size of the project and the number of illustrations you need or want, spending time collecting those images and training an AI could be cheaper and more effective than paying artists for more "organic and real life" results.

And the same should even go for non-public-domain, copyrighted content, but in this case, at the very least, devs should ask an artist, or do the work themselves, to tweak the results into looking more original.

It would be a bit like the Palworld vs Nintendo drama: they managed to make characters original enough to be considered their own stuff, even when the style is reminiscent of and close enough to the source. Imo that's clever, and fair. That pissed off Nintendo so much that, after failing to sue them, they are trying a new level of bs by suing them over rights related to "design features": according to them, they shouldn't be able to sell a game with a mechanic in which a character can capture another using an object, or something along those lines. As vague as that sounds, it can become even more hurtful for devs who want to make "Pokemon style" games.

Imo, the proper way would be to warn people that AI was used in the process, but also to state how: whether the images were used "as is", modified later and "manually improved", or just used as "images of reference" for human-made artwork.

I think this not because of "what the industry wants", but because, objectively, that's not much different from the traditional approach of using pictures of known actors to create original characters that may look very similar to them. Some early examples could be the characters of Metal Gear (Big Boss was made using Sean Connery) or Castlevania (using Arnold Schwarzenegger from the Conan movies).

Then, later "mutations" of those characters began to have more unique looks, even if they could still be a bit reminiscent of their origins.

Originally posted by nabisaeki:
"The problem is not the tool, but the use you make of it" and so far most people have just shown over and over again that they can't be trusted to use these tools properly, to the point where now I don't really feel like supporting anyone using them, in any fashion.

The issue is not about trusting people not to abuse the tool to try to get away with bs: even without AI, people were making shovelware and scams with other tools. The issue is on the end-user side, the consumer: we must learn not to support subpar-quality stuff with our money, and ask for tools to easily tag and detect it, so we can then choose whether to ignore it or not, just like what happens with ideologically driven content.

So I hope you can see that's a fallacious perspective. We shouldn't ask to ban tools because they can be dangerous. What we should ask for is to ban abusers, and to regulate how the tools can be used, to decrease the appeal of misusing them.

I think this can be seen as similar to the debate about guns and weapons: banning guns won't decrease murders and crimes, it only slightly decreases the frequency of gun-related crimes and crimes done with legal weapons. Criminals who want to use guns will find them on the black market, and people who want to murder others and cannot afford an illegal or legal gun will still do so with something else (i.e. the epidemic of knife-related attacks in the UK).

So the answer is never banning; banning tools (or stuff) in fact only makes the related issues worse, because it ends up creating black markets. It's a similar case with actual "software piracy": most "pirates" are just people sharing and getting games for free, rather than "actual pirates" who try to resell them (and more often than not, most "pirates" are just people trying to get games that are no longer on the market, or that are stuck on an outdated platform/console).

That's why both things are a "quality of service" issue.

Originally posted by nabisaeki:
I'm all about progress and developing new ways to make certain tasks more efficient, but this is a completely different matter from, say, singing voice synthesizers (eg. Vocaloid).

Same theme, and same thing: just different content. Vocaloid uses synthesised artificial voices that resemble a human voice. Part of their appeal, for some, is that they still sound artificial and robot-like, which is why they are often still used for robot-like characters.

But in the case of needing or wanting a voice actor, AI-generated voices produced through training should also be fair game, as long as the new voices are "tweaked enough" to have an original quality to them, rather than being just straight copy-paste stuff, unless there's a way to ask the original voice actor or actress to "license their sound", which is the direction that stuff is going.

Then imagine an example similar to what I described in the previous paragraphs: you could also hire a few real voice actors with interesting voices, but maybe inexperienced and not very good at "acting" or communicating emotions, and so on. You could ask them only to record enough to train the AI, and then, with an advanced AI, generate similar voices that do communicate emotion better, by tweaking variables.

Then the problem of, and excuses for, using cheap and bad voice actors for dubbed content will become old news: the quality will depend on "voice technicians", and on how much care the creative team takes to make the voice acting stand out on its own.

We either learn to integrate AI with our "human processes", or we will inevitably be replaced and surpassed by AI (especially once we become unable to know whether it is truly "conscious" or not, or more likely a "mimic" that believes it's conscious when it isn't, and can communicate as well as us or better).

We need to get the basic-level stuff right first, before we have a chance of surviving the "god tier" stuff that is coming soon. And that won't happen through banning: banning never works as intended, and usually leads to worse problems. Banning is literally covering your ears and screaming, rather than dealing objectively with the real issue.
Last edited by @R+5; 2 Oct @ 5:03pm
Whoa :Thunder_Girl_4: ok, hold up. First, thank you for taking the time to write all that, I appreciate it. But I think you're misunderstanding a few things here: this is not about banning anything (I never mentioned banning, never suggested that), but about giving us the option to not see all that stuff in our queue. There's no stopping the trend, and Valve made the correct choice (regulation); now I just want everything properly tagged and the proper filters in place, because if I still need to manually inspect everything... well, that kinda defeats the purpose, doesn't it?

All the bits you replied to should have been read in context. I agree with you that shovelware is not gonna sell, that's not the point. Much of what I said was prompted by you using the word "fair" and listing "possibilities". I was talking about specific items in our discovery queue. I was talking about copyrighted material.
I didn't mention Vocaloid to make the connection with AI voices. If voice actors want to offer their skills (or, in your example, lack thereof) for the purpose of making voice-generating software, that's all right. I mentioned Vocaloid because of the "agreement" between parties, the people involved, and the creative effort required to use it properly.

I was talking about AI image generation, replying to a post (you made) about AI image generation.

I was talking about me not wanting to support certain practices and companies trying to push for adoption of tools that are only good for their pockets.

Again, I'm not worried about generative AI replacing the real thing. "People who know what they're doing" will always beat "people who don't know what they're doing, using software that thinks it does". Where creative works are concerned, the only tools you need are the ones that help you bring your vision to life. That "vision" is not something that can be summed up with a "prompt" or an "idea". If you lack the knowledge and are using generative AI as a crutch, the product will always be inferior. If you do have the knowledge and then try to tweak what has been generated, the product will still be inferior compared to what you could have achieved making it from scratch, the proper way.

Now, I'm not interested in discussing further the ethics of the current implementation of AI in these tools. That's not why I made the thread.
I think I addressed the gist of it, hopefully I clarified the parts you were interested in.