ChatGPT and MMORPGs


Comments

  • Quizzical (Member Legendary, Posts: 25,483)
    After some more thought, this isn't ChatGPT, but I think AI could be useful to some MMORPGs in some new ways.

    One is to assist with moderation.  Games can track a bunch of statistics on what players are doing to get data.  Have human moderators make decisions on what is or is not bannable, and have AI figure out which players statistically look like they're cheating or spamming or whatever.

    Letting AI make the final decision on who gets banned would be disastrous, but flagging things for human moderators to verify could be very useful.  If, at any given moment, 1% of the players in a game are using some sort of cheat, and AI can flag the 1% that it thinks are cheating while being right 2/3 of the time, that would be a big win.  AI can also adapt to new cheats much more quickly than programmers adding manual detection for each new way that players find to cheat.
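
    (Illustrative sketch of that flagging idea, nothing more: the telemetry features, the 1% rate, and the use of scikit-learn's IsolationForest are assumptions for the example, not how any real game's anti-cheat works.)

        # Minimal sketch: flag statistically unusual players for a human to review.
        # The telemetry features, the 1% contamination rate, and the random data
        # are hypothetical; a real game would feed in its own statistics.
        import numpy as np
        from sklearn.ensemble import IsolationForest

        rng = np.random.default_rng(0)

        # Placeholder telemetry: [headshot_rate, actions_per_minute, gold_per_hour]
        player_stats = rng.normal(loc=[0.2, 80, 500], scale=[0.05, 15, 100],
                                  size=(10_000, 3))

        # Assume roughly 1% of players are doing something bannable at any time.
        detector = IsolationForest(contamination=0.01, random_state=0)
        detector.fit(player_stats)

        flags = detector.predict(player_stats)   # -1 means "looks anomalous"
        suspects = np.flatnonzero(flags == -1)

        # The model never bans anyone; it only queues accounts for moderators.
        print(f"{len(suspects)} accounts queued for moderator review")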

    Another possibility is with finding groups.  We all know that PUGs sometimes stick you in a group with some terrible players.  Suppose that, at the end of every PUG, the game asks you to rate how much you'd like to group again with the players you grouped with.  You probably wouldn't group again with those particular players.

    However, the game could statistically track various players, and AI could learn what sort of players you like or don't like.  Some preferences are nearly universal, such as preferring more skilled players over less skilled ones, but a lot of it comes down to personal preference.

    One problem that group finders tend to have is that there just aren't enough players in the system to form groups at all.  But if groups for a given dungeon were scheduled to form at fixed times of day, you might get enough players queued simultaneously to form several groups at once, and then be able to split them into groups that most of the players will like better than a random assignment.
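
    (A toy sketch of that rating idea: the 1-5 scale, the neutral default, and the brute-force search over a tiny queue are all invented for illustration.)

        # Toy sketch: pick the trio from the queue with the best mutual ratings.
        from itertools import combinations

        # ratings[(a, b)] = how much player a liked grouping with player b (1-5).
        ratings = {("Ann", "Bob"): 5, ("Bob", "Ann"): 4,
                   ("Ann", "Cal"): 2, ("Cal", "Ann"): 3,
                   ("Bob", "Cal"): 5, ("Cal", "Bob"): 5,
                   ("Dee", "Ann"): 1, ("Ann", "Dee"): 1,
                   ("Dee", "Bob"): 4, ("Bob", "Dee"): 4,
                   ("Dee", "Cal"): 3, ("Cal", "Dee"): 2}

        def group_score(group):
            # Average of both directions of every pairing, defaulting to neutral (3).
            pairs = list(combinations(group, 2))
            total = sum(ratings.get((a, b), 3) + ratings.get((b, a), 3) for a, b in pairs)
            return total / (2 * len(pairs))

        queue = ["Ann", "Bob", "Cal", "Dee"]
        best = max(combinations(queue, 3), key=group_score)
        print(best, round(group_score(best), 2))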
    AlBQuirky
  • eoloe (Member Rare, Posts: 864)
    edited December 2022
    IceAge said:
    It is a really great tool. Today I had it write me a game server, a login server, and a basic DB, all three connected. Basically, I could have it write me an entire game... or at least a demo, but it can do it.

    The problem comes with disconnects and it not remembering you (I don't know why they have not implemented this yet). So if something goes wrong while you are in the middle of some code, and it disconnects your chat for whatever reason, you literally have to teach it the code it gave you the first time all over again. Most of the time it will give you different code that is not really related to the old one, and that can mess things up.

    All in all, I see this as a very powerful tool in the (near) future.

    OpenAI also has an art AI. I gave it a pretty complex description of what I wanted, and this is the result:

    [AI-generated images]

    There were more, but these are my favorites.

    I am also playing with Stable Diffusion myself.

    [Stable Diffusion images]

    AlBQuirky
  • finefluff (Member Rare, Posts: 561)
    edited December 2022
    Quizzical said:
    Where AI really shines is in doing tasks that just about any human could do competently, but when you need so much of it done that it's impractical to hire humans to do all of it.  That can include simple, deterministic computations such as computers have been doing for decades.

    More recently, it has become useful for more computationally intensive things where you have humans do a task a bunch, and then train AI to mimic the decisions that humans make without necessarily understanding why.

    AI can be good for processing large amounts of data to flag a relative handful of distinctive things for further review by a human.  Search engines do this, for example, as do social media sites trying to catch people saying things that merit banning them.

    Maybe I'm just not thinking of this the right way, but I don't see a lot of potential for AI to be useful in developing games in ways that it isn't already being used.  Maybe it could generate large amounts of bad content cheaply, but that's not really a very interesting thing to do, unless you're the sort of developer who publishes a ton of shovelware games on Steam.

    Because AI is really only able to mimic the sort of decisions that were made in the training data, it's not likely to generate anything particularly innovative.  A future in which the bulk of most games is created by AI is a future in which most computer games are generic knock-offs of some particular genre, far more so than is already the case.
    I think the benefits in time-saving and lowering the barrier of entry for development are pretty significant. It's impressive how much it can already do. 

    Imagine you are working by yourself and want to create something that would take you days or weeks to figure out. The AI can help you write code, write dialogue, write lore, draw concept art. It can be a huge time saver, even in terms of just providing general information that you want to know as you are working. 

    I asked it how to play a specific guitar chord and it gave me instructions on where to place my finger and even drew a diagram for me. Instead of browsing random search results, I can just ask the AI and it will tell me about what I want to know. It can function both as an information provider and as a creative and decision-making aid. But then, there is a question of how reliable the information is. The final judgment and responsibility must always rest with the person. 

    I don't yet see it as something where you just release a game based only on what it gives you. With AI development, you can think of yourself as the director and the AI as your servant, and you can accept from it what you like and reject what you don't.

    For generating new content, I think it could work. It could be good even, or at least good enough. Many games already rely on randomly generated content, like Diablo, Path of Exile, Minecraft, etc. Imagine a developer, or even a player, providing the game with a written description of an instance/level, and the game doing its best to make it for them. At least it would give a starting point that could be modified and fleshed out by the developer. AI can already be used to generate art, music, even short video clips. Are entire game assets out of the question? It's not perfect yet (just listen to these AI songs https://openai.com/blog/jukebox/), but I think it will improve. The question is, will it become good enough?

    Would it be possible someday for Unity or Unreal to take your written game design document and generate a prototype with placeholders? Whatever you don't like about it or whatever doesn't work, you tell the AI and the AI changes it. You can keep instructing the AI and making manual edits here and there until you have something you like. That would be astonishing.

    A game that I can see using an AI like ChatGPT is Elder Scrolls 6. I have a feeling it will be like Daggerfall, which, from what I understand, was mostly procedurally generated.
    AlBQuirky
  • finefluff (Member Rare, Posts: 561)
    New Elder Scrolls lore just dropped


    [Image: Elder-Scrolls]
    AlBQuirky, Andemnon
  • Uwakionna (Member Rare, Posts: 1,139)
    The hurdle as I see it is not creative writing. You can do a good chunk of this with libraries of speech components strung together if all you're doing is applying dynamic prompts as a skin to an otherwise static structure.

    Rather than the AI writing dialogue, you need it to coherently track the actions of dozens to thousands of players and apply logical progression of consequence and new opportunity to the horde of chaos-monkeys in your game.
    AlBQuirky, Mendel
  • Uwakionna (Member Rare, Posts: 1,139)
    finefluff said:
    New Elder Scrolls lore just dropped


    [Image: Elder-Scrolls]
    I think this also highlights that, while it's good overall, vetting and controls are most definitely needed.

    Not over anything obscene, but just clearing hiccups like that continuity shift between the fourth and fifth segment. Mara going from promising to guide Torval to the afterlife, to imbuing him with health.

    While it's not bad overall, continuity hiccups and idea shifts are still relatively common with AI models. This means a human or something else would still have to be involved in some layer between the AI and the player. That takes a bit away from the functionality of using the AI in real time for dynamic content, though it can still help devs turn out more content.

    Right now a more practical function is creating tables of prebuilt blocks for the AI to pull from, producing less freeform but more reliable results. The dynamic element would be adding weighted values to the content within those tables, with the weights shifting in response to in-world activity, creating hot spots where certain scenarios are more likely.
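
    (A minimal sketch of that weighted-table idea; the scenario names, starting weights, and nudge rule are invented for illustration.)

        # Sketch: prebuilt scenario blocks whose selection weights drift with
        # in-world activity, creating "hot spots" for certain scenarios.
        import random

        scenario_weights = {"bandit_ambush": 1.0,
                            "merchant_caravan": 1.0,
                            "undead_incursion": 1.0}

        def record_activity(scenario, nudge=0.5):
            # Player activity makes the matching scenario "hotter" in this region.
            scenario_weights[scenario] = scenario_weights.get(scenario, 1.0) + nudge

        def pick_scenario():
            names = list(scenario_weights)
            return random.choices(names, weights=[scenario_weights[n] for n in names])[0]

        # Lots of necromancy nearby makes undead events more likely to be drawn.
        for _ in range(10):
            record_activity("undead_incursion")

        print(pick_scenario())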
    AlBQuirky, Quizzical
  • AlBQuirky (Member Epic, Posts: 7,432)
    AI does not have "feelings" as most humans do. It can write words just fine. It can "mimic", like Quizzical pointed out. It will be interesting to see where it goes.

    Shadiversity (on YouTube) has been playing around with art AI for his upcoming graphic novel. He has a couple of streams where he talks with actual industry artists and they have a great chat about what AI can and can not do.

    Some of the filters Shad can input are "art styles" of specific artists, like Frank Miller, Boris, Frazetta, and such. That almost rides on the edge of "plagiarism." The art AI also seemed to have a lot of trouble with hands :)

    Again, it will be interesting to see where this all goes. I almost wonder if it is "real AI", since it seems to copy, not think. Just my thoughts :)
    finefluff, Andemnon

    - Al

    Personally the only modern MMORPG trend that annoys me is the idea that MMOs need to be designed in a way to attract people who don't actually like MMOs. Which to me makes about as much sense as someone trying to figure out a way to get vegetarians to eat at their steakhouse.
    - FARGIN_WAR


  • Scot (Member Legendary, Posts: 24,271)
    edited December 2022
    You guys do realise that adbots will be using this soon and we won't have a clue that's what they are until they start doing links to 'Porn "R" Us' or whatever. :)
    eoloe, AlBQuirky
  • Sensai (Member Uncommon, Posts: 222)
    Mendel said:
    I'm not entirely certain that AI will really catch on in MMORPGs.  If it is generating content, I can't envision a developer that will not want to 'review' the content before it is released.  So, as much as I'd like it, dynamically generated content is unlikely to be a thing.  AI probably has a better chance of helping create content from scratch in an offline environment, to be reviewed by a developer.

    I'm not certain about the various public AI generators.  I get a sense of e-plagiarism from some of them.

    I've followed AI for some time now.  The more successful ones appear to focus more on dialog.  That may be great if you're writing a play.  A novel generally has dialog and narrative -- although too many writers seem to focus entirely on the dialog.  TV and movies have influenced us more than we realize, I suspect.



    I would argue that not only should it catch on, it is the only means for MMORPGs to evolve.  While games have done a good job of translating tabletop gaming in many respects, the lack of a dungeon master has always been the glaring exception.  While humans still need to be the source of the creative assets, AI opens the door for an artificial DM to guide things already in the game.

    For example, instead of skeletons walking back and forth on some insignificant patch of grass, they are now bound by a lich that organizes them and has them raid NPC towns randomly or after certain prerequisites are met. Not planned, scripted events that occur at 21:00 server time. Or orc raiding parties actually . . . raiding.

    As for dungeons, instead of mobs always being in the same locations, or dungeons always being inhabited by X melee, Y ranged, and Z healers, the makeup is more random, as is the aggro profile. Patrol pathing routes, high-security areas, and access routes all change, providing more unique encounters.
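
    (A rough sketch of that kind of randomized makeup; every number and range below is an arbitrary placeholder.)

        # Roll the composition and aggro profile of a pack instead of
        # hardcoding X melee / Y ranged / Z healers.
        import random

        def roll_encounter(total_mobs=8):
            melee = random.randint(1, total_mobs - 2)
            ranged = random.randint(1, total_mobs - melee - 1)
            healers = total_mobs - melee - ranged          # whatever is left over
            return {"melee": melee,
                    "ranged": ranged,
                    "healers": healers,
                    "aggro_radius": round(random.uniform(8.0, 20.0), 1),
                    "patrol_route": random.choice(["perimeter", "crossing", "static"])}

        print(roll_encounter())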

    While I agree the AI mentioned in this thread has limited applications in MMORPGs, true AI is the real game changer for this genre; I just hope I live to see it.
    Mendel, AlBQuirky


  • Theocritus (Member Legendary, Posts: 9,974)
    Just give us a good game... Don't need the fluff.
    eoloe, AlBQuirky
  • eoloe (Member Rare, Posts: 864)
    Sensai said:
    Mendel said:
    I'm not entirely certain that AI will really catch on in MMORPGs.  If it is generating content, I can't envision a developer that will not want to 'review' the content before it is released.  So, as much as I'd like it, dynamically generated content is unlikely to be a thing.  AI probably has a better chance of helping create content from scratch in an offline environment, to be reviewed by a developer.

    I'm not certain about the various public AI generators.  I get a sense of e-plagiarism from some of them.

    I've followed AI for some time now.  The more successful ones appear to focus more on dialog.  That may be great if you're writing a play.  A novel generally has dialog and narrative -- although too many writers seem to focus entirely on the dialog.  TV and movies have influenced us more than we realize, I suspect.



    I would argue that not only should it catch on, it is the only means for MMORPGs to evolve.  While games have done a good job of translating tabletop gaming in many respects, the lack of a dungeon master has always been the glaring exception.  While humans still need to be the source of the creative assets, AI opens the door for an artificial DM to guide things already in the game.

    For example, instead of skeletons walking back and forth on some insignificant patch of grass, they are now bound by a lich that organizes them and has them raid NPC towns randomly or after certain prerequisites are met. Not planned, scripted events that occur at 21:00 server time. Or orc raiding parties actually . . . raiding.

    As for dungeons, instead of mobs always being in the same locations, or dungeons always being inhabited by X melee, Y ranged, and Z healers, the makeup is more random, as is the aggro profile. Patrol pathing routes, high-security areas, and access routes all change, providing more unique encounters.

    While I agree the AI mentioned in this thread has limited applications in MMORPGs, true AI is the real game changer for this genre; I just hope I live to see it.

    I disagree with this.

    Especially with the first statement. AI indeed could represent an evolution for MMORPGs... sure. However, it is far from being the only one, and current MMORPGs are miles away from a well-made narrative tabletop experience (unless you play stupid Hack'n Slash on tabletop, which video games do way better anyway).

    Just create a complex living world à la "Dwarf Fortress". Remove the silly dwarves and put players in instead. Keep the level of interaction. No AI needed, just well-thought-out scripting. Et voilà! You have, IMO, the best multiplayer sandbox ever created.

    The lich and orc raider examples can be created simply with scripting. No AI needed. Sandbox games, IMO, do not need AI.

    Where I could see AI shining in MMORPGs (not ChatGPT specifically) is in generating theme park content, which is the usual weakness of this type of game... or anything procedural... I bet that at some point, when everything is ironed out, it could become the norm.

    AlBQuirky
  • eoloe (Member Rare, Posts: 864)
    AlBQuirky said:
    AI does not have "feelings" as most humans do. It can write words just fine. It can "mimic", like Quizzical pointed out. It will be interesting to see where it goes.


    It cannot have feelings, but it can create some in the minds of people!

    I showed a friend some of the pictures I made with Stable Diffusion, and he was in awe. One picture showed an empress sitting on a throne, and he said, "I am in love".

    Since the AI is trained on emotionally impactful work, it is able to produce feelings in the human psyche. Which is, IMO, amazing!
    AlBQuirky
  • Scot (Member Legendary, Posts: 24,271)
    Darn thing wants a mobile number just to create an account; that goes too far for me, shame.
    AlBQuirky, Andemnon
  • Quizzical (Member Legendary, Posts: 25,483)
    AlBQuirky said:
    AI does not have "feelings" as most humans do. It can write words just fine. It can "mimic", like Quizzical pointed out. It will be interesting to see where it goes.

    Shadiversity (on YouTube) has been playing around with art AI for his upcoming graphic novel. He has a couple of streams where he talks with actual industry artists and they have a great chat about what AI can and can not do.

    Some of the filters Shad can input are "art styles" of specific artists, like Frank Miller, Boris, Frazetta, and such. That almost rides on the edge of "plagiarism." The art AI also seemed to have a lot of trouble with hands :)

    Again, it will be interesting to see where this all goes. I almost wonder if it is "real AI", since it seems to copy, not think. Just my thoughts :)
    Where it becomes plagiarism is also an interesting question.  You could make a case that if you rely on training data that isn't owned by you, then it's plagiarism.  If you made the training data yourself, then it's not.
    Mendel, AlBQuirky
  • Quizzical (Member Legendary, Posts: 25,483)
    eoloe said:
    AlBQuirky said:
    AI does not have "feelings" as most humans do. It can write words just fine. It can "mimic", like Quizzical pointed out. It will be interesting to see where it goes.


    It cannot have feelings, but it can create some in the minds of people!

    I showed a friend some of the pictures I made with Stable Diffusion, and he was in awe. One picture showed an empress sitting on a throne, and he said, "I am in love".

    Since the AI is trained on emotionally impactful work, it is able to produce feelings in the human psyche. Which is, IMO, amazing!
    Computers have long been able to produce feelings in humans, especially anger and frustration.
    Uwakionna, eoloe, AlBQuirky
  • Quizzical (Member Legendary, Posts: 25,483)
    finefluff said:
    A game that I can see using an AI like ChatGPT is Elder Scrolls 6. I have a feeling it will be like Daggerfall, which, from what I understand, was mostly procedurally generated.
    Procedurally generated content is a very different thing from AI generated content.  That one is appropriate in a given situation doesn't necessarily mean that the other is.

    To make procedurally generated content, you pick some probability distribution for various attributes that describe what you want and use a random number generator to pick from that distribution to generate things.  The person designing it isn't limited to what has been done in the past and can have it create innovative things that are very different from what has been done before.
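
    (A minimal sketch of that recipe: pick distributions for the attributes you care about and sample from them. The attributes and numbers are arbitrary examples, not from any particular game.)

        # Procedural generation as described above: distributions plus an RNG.
        import random

        def generate_room():
            return {"width": random.randint(5, 30),
                    "height": random.randint(5, 30),
                    "has_treasure": random.random() < 0.15,        # 15% chance
                    "biome": random.choices(["cave", "crypt", "sewer"],
                                            weights=[0.5, 0.3, 0.2])[0]}

        dungeon = [generate_room() for _ in range(10)]
        print(dungeon[0])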

    The idea of (supervised) machine learning is that you give it a bunch of training data where you manually label that this particular piece of data has these particular attributes.  Then you give it a bunch of additional data that you haven't manually labeled and ask the algorithm to label it for you, trying to mimic the decisions that you made.  Unless you have some canonical way for it to create its own data and always label it correctly, that's all that it can do.
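
    (A minimal sketch of the supervised case; the synthetic features, labels, and scikit-learn's LogisticRegression are stand-ins for whatever a game would actually track and however it would train.)

        # Fit on hand-labeled examples, then have the model label new data the same way.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)

        # "Hand-labeled" training data: feature vectors plus a human 0/1 decision.
        X_train = rng.normal(size=(200, 4))
        y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)

        model = LogisticRegression().fit(X_train, y_train)

        # New, unlabeled data: the model mimics the labeling it was shown.
        X_new = rng.normal(size=(5, 4))
        print(model.predict(X_new))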

    I said "supervised" above because there is also unsupervised machine learning, in which you give the algorithm a bunch of unlabeled data and ask it to create clusters of which things are similar to each other.  That could plausibly take a bunch of work from different artists, try to pick out which sets of pieces are from the same artist, and be mostly correct, or at least a lot better than random chance.  But that's not generating new art.
    AlBQuirky
  • Quizzical (Member Legendary, Posts: 25,483)
    finefluff said:
    I asked it how to play a specific guitar chord and it gave me instructions on where to place my finger and even drew a diagram for me. Instead of browsing random search results, I can just ask the AI and it will tell me about what I want to know. It can function both as an information provider and as a creative and decision-making aid. But then, there is a question of how reliable the information is. The final judgment and responsibility must always rest with the person. 
    A search engine that finds an article written by a human is more likely to be correct.  If you're asking AI to explain to you things that you know nothing about, then you won't be able to catch it when it's wrong.  And sometimes it will be wrong.
    finefluff, AlBQuirky
  • eoloe (Member Rare, Posts: 864)
    Quizzical said:
    eoloe said:
    AlBQuirky said:
    AI does not have "feelings" as most humans do. It can write words just fine. It can "mimic", like Quizzical pointed out. It will be interesting to see where it goes.


    It cannot have feelings, but it can create some in the minds of people!

    I showed a friend some of the pictures I made with Stable Diffusion, and he was in awe. One picture showed an empress sitting on a throne, and he said, "I am in love".

    Since the AI is trained on emotionally impactful work, it is able to produce feelings in the human psyche. Which is, IMO, amazing!
    Computers have long been able to produce feelings in humans, especially anger and frustration.

    I hate printers more :s
    AlBQuirky
  • eoloe (Member Rare, Posts: 864)
    Quizzical said:
    AlBQuirky said:
    AI does not have "feelings" as most humans do. It can write words just fine. It can "mimic", like Quizzical pointed out. It will be interesting to see where it goes.

    Shadiversity (on YouTube) has been playing around with art AI for his upcoming graphic novel. He has a couple of streams where he talks with actual industry artists and they have a great chat about what AI can and can not do.

    Some of the filters Shad can input are "art styles" of specific artists, like Frank Miller, Boris, Frazetta, and such. That almost rides on the edge of "plagiarism." The art AI also seemed to have a lot of trouble with hands :)

    Again, it will be interesting to see where this all goes. I almost wonder if it is "real AI", since it seems to copy, not think. Just my thoughts :)
    Where it becomes plagiarism is also an interesting question.  You could make a case that if you rely on training data that isn't owned by you, then it's plagiarism.  If you made the training data yourself, then it's not.

    But where is the training data taken from?... Who really owns the dataset? The guy who programmed the crawler? Or does each element belong to its rightful original producer?

    So to own a training dataset, you would have to pay for each picture, sentence, and word, and ask each owner to sell it or agree to that element being used in the dataset?

    That will never work.

    And the beauty of mimicking art styles is that you can combine them!

    So you can achieve a crossover between Frazetta and Van Gogh!...

    AlBQuirky
  • finefluff (Member Rare, Posts: 561)
    edited December 2022
    finefluff said:
    ...
    Would it be possible someday for Unity or Unreal to take your written game design document and generate a prototype with placeholders? Whatever you don't like about it or whatever doesn't work, you tell the AI and the AI changes it. You can keep instructing the AI and making manual edits here and there until you have something you like. That would be astonishing.
    ...
    I guess it's closer than I thought... People are already figuring out how to use it to speed up MMO dev. It's a "proof of concept", but I definitely see the potential there.

    "On December 14, Beamable CEO Jon Radoff shared the experience of developing a massively multiplayer online game using AI. During the experiment, Radoff discovered that ChatGPT knows how to create adventures based on text and can develop role-playing fan games. In addition, the AI can work with code and simulate virtual computers, including filesystems and websites.

    Artificial intelligence also learned how to generate blueprints for Unreal Engine, which is used to create photoreal visuals and immersive experiences. So, during the experiment conducted by Beamable, the developers taught ChatGPT about Beamable’s application programming interface. The AI learned using the company’s Live services and Unreal Engine’s Blueprints. The result is impressive: Beamable created a virtual world environment without writing codes or programming a graphics system."

    https://mpost.io/chatgpt-learns-massively-multiplayer-online-game-development/

    "The future is wild: we taught ChatGPT how to create a massively multiplayer online game!

    It uses Beamable Live Services and Unreal Engine Blueprints. Imagine: without ever touching a server, writing a line of backend code, or knowing how to program a graphics system -- you bootstrap a virtual world environment with persistence, user accounts, etc.

    It's still more of a proof-of-concept (as anything in ChatGPT is, at this point) but it gives a view into what the future will be like: what was once an effort requiring hundreds of people may become a simple text prompt. Enabling technologies: 3D engines, microservices, serverless backends, visual scripting, AI assistants."

    https://www.linkedin.com/posts/jonradoff_massively-multiplayer-game-from-chatgpt-activity-7008455034652774400-sV11/?utm_source=share&utm_medium=member_desktop
    eoloe, AlBQuirky
  • finefluff (Member Rare, Posts: 561)
    edited December 2022
    Just searching a bit more, I found out that OpenAI (the company that made ChatGPT) has already been working on AI for MMOs. It's called Neural MMO. They describe how it works here: https://openai.com/blog/neural-mmo/

    Admittedly I don't really understand it. Too many big words. But I guess it's to simulate NPCs (agents)?
    AlBQuirky
  • Uwakionna (Member Rare, Posts: 1,139)
    edited December 2022
    finefluff said:
    Just searching a bit more, I found out that OpenAI (the company that made ChatGPT) has already been working on AI for MMOs. It's called Neural MMO. They describe how it works here: https://openai.com/blog/neural-mmo/

    Admittedly I don't really understand it. Too many big words. But I guess it's to simulate NPCs (agents)?
    That's not quite an MMO (nor for MMOs) in the way you might be thinking.

    They are using the moniker of MMO and players to refer to the setup of their test environment and the AI agents. The goal, it seems, was to simulate several of the common features of an MMO environment, with persistence and variable tasks to achieve while maintaining a set of personal stats (health, stamina, etc.), and to put the AI agents in an environment where they could either cooperate or compete to achieve their tasks.

    That's not an AI built for MMOs.
    finefluff, AlBQuirky
  • Scot (Member Legendary, Posts: 24,271)
    finefluff said:
    Just searching a bit more, I found out that OpenAI (the company that made ChatGPT) has already been working on AI for MMOs. It's called Neural MMO. They describe how it works here: https://openai.com/blog/neural-mmo/

    Admittedly I don't really understand it. Too many big words. But I guess it's to simulate NPCs (agents)?
    There would be a certain irony if the AI of a Massively Multiplayer Online Role Playing Game went on to take over the world :)
    AlBQuirky, finefluff
  • Quizzical (Member Legendary, Posts: 25,483)
    eoloe said:
    Quizzical said:
    Where it becomes plagiarism is also an interesting question.  You could make a case that if you rely on training data that isn't owned by you, then it's plagiarism.  If you made the training data yourself, then it's not.
    So to own a training dataset, you would have to pay for each picture, sentence, and word, and ask each owner to sell it or agree to that element being used in the dataset?

    That will never work.
    Yes, that's how it works.  If it can't be made to work, then it can't be made to work.  If the only way that I could get a billion dollars is to steal it, that doesn't make it legal for me to do so.

    Would it be legal for me to take a picture that I don't own, change the resolution and file format, and then claim that I own the modified picture and can do whatever I want with it?  Obviously not.

    What if I take two pictures that I don't own, mash them together, and combine them into one?  Does that make me into the legal owner of the combined picture?  Of course not.  That's two cases of plagiarism, not one.

    If you make it a million pictures rather than two, does that suddenly become legal?  Obviously not.

    There are some things that you can do with assets that you don't own as fair use.  But fair use generally requires that the new work that relies on someone else's is mostly your own.  You can quote what someone else said and respond, or have small clips from a movie in a review of it, or whatever.  But the new work has to fundamentally be mostly your own.

    Borrowing from other works to make a derivative work that competes with the original is never fair use.  That's what we're talking about if you're hoping to have AI generated text or artwork or code or whatever as game assets that compete with other games.

    Want to make it legal?  Get a legal license to use the data as training data.  If you want to use the entire archives of the New York Times as training data for a text generator, that's fine if you pay the New York Times enough to make it worth their while to give you a license to use their data.  It's not fine if you just take it and use it without permission.

    The big companies creating these AI programs ought to be careful that they don't get themselves sued into oblivion over copyright violations.  So long as it's still at the "look what it's possible to do" stage, they're probably safe.  But if people start making commercial products out of stuff generated by AI, everyone who owns some of the training data but didn't consent to it being used by the AI is going to have a legitimate claim of copyright infringement.
    AlBQuirky
  • eoloe (Member Rare, Posts: 864)
    Quizzical said:
    eoloe said:
    Quizzical said:
    Where it becomes plagiarism is also an interesting question.  You could make a case that if you rely on training data that isn't owned by you, then it's plagiarism.  If you made the training data yourself, then it's not.
    So to own a training dataset, you would have to pay for each picture, sentence, and word, and ask each owner to sell it or agree to that element being used in the dataset?

    That will never work.
    Yes, that's how it works.  If it can't be made to work, then it can't be made to work.  If the only way that I could get a billion dollars is to steal it, that doesn't make it legal for me to do so.

    Would it be legal for me to take a picture that I don't own, change the resolution and file format, and then claim that I own the modified picture and can do whatever I want with it?  Obviously not.

    What if I take two pictures that I don't own, mash them together, and combine them into one?  Does that make me into the legal owner of the combined picture?  Of course not.  That's two cases of plagiarism, not one.

    If you make it a million pictures rather than two, does that suddenly become legal?  Obviously not.

    There are some things that you can do with assets that you don't own as fair use.  But fair use generally requires that the new work that relies on someone else's is mostly your own.  You can quote what someone else said and respond, or have small clips from a movie in a review of it, or whatever.  But the new work has to fundamentally be mostly your own.

    Borrowing from other works to make a derivative work that competes with the original is never fair use.  That's what we're talking about if you're hoping to have AI generated text or artwork or code or whatever as game assets that compete with other games.

    Want to make it legal?  Get a legal license to use the data as training data.  If you want to use the entire archives of the New York Times as training data for a text generator, that's fine if you pay the New York Times enough to make it worth their while to give you a license to use their data.  It's not fine if you just take it and use it without permission.

    The big companies creating these AI programs ought to be careful that they don't get themselves sued into oblivion over copyright violations.  So long as it's still at the "look what it's possible to do" stage, they're probably safe.  But if people start making commercial products out of stuff generated by AI, everyone who owns some of the training data but didn't consent to it being used by the AI is going to have a legitimate claim of copyright infringement.


    "What if I take two pictures that I don't own, mash them together, and combine them into one?  Does that make me into the legal owner of the combined picture?  Of course not.  That's two cases of plagiarism, not one."

    I would not be so sure about this, even if it sounds right. Techno music is a good example, where people remix each other's work or include movie OSTs without the owners' consent and without legal trouble.

    And if I make a painting in the style of Van Gogh, can his descendants sue me? I doubt it.

    Was Blizzard attacked by the owners of WH40K for creating StarCraft (an obvious copy)? I don't know, but they still own "their" StarCraft IP.

    Sure, if I write and sell a Middle-earth story, the Tolkien family will make me sell my house for crumbs, with every member of my family included in the deal! But what can they do if I sell a ChatGPT story about a continent named "Middleland", in which elves, dwarves, humans, and littleings (lol) fight against orcs led by an evil and powerful spirit named "Ronsau", and at the very core of this battle there is an incredible necklace that allows its wearer to teleport?

    I think the problem is way more complex than it sounds.

    Part of it is that the AI does much more than combine pictures.

    It generates them, and this is exactly why it escapes any form of copyright despite the training samples belonging to various authors.

