The Indie Developers Who Are Using AI to Make Living Cities, Tactics Machines, and More

City of Gangsters is a mafia game that simulates a city full of people interacting in a fictionalized 1920s Prohibition-era setting. The characters in City of Gangsters are supposed to react to the player in ways that appear organic and period-accurate – a feeling that strictly scripted interactions could never capture.

So, like many developers before him, SomaSim founder and CTO Rob Zubek turned to AI to make a realistic city.

“We simulate a city populated by all sorts of characters, who have various relationships and histories with each other,” he explained. “And our AI system lets us find interesting relationships and act on them. For example, we can ask the system to find us people who should start a vendetta against the player. We could express something like ‘find me someone whose family member was recently killed by the player’, except we write it in logic notation rather than in English, and then do something interesting with the result.”
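To make that concrete, here's a minimal Python sketch of a conjunctive fact query in that spirit. The fact store, predicate names, and API are hypothetical illustrations, not SomaSim's actual logic notation or engine:

```python
# Hypothetical fact store: each fact is a (predicate, arg, arg) tuple.
# This is an illustrative sketch, not SomaSim's actual notation or engine.
facts = {
    ("family", "mara", "tomas"),
    ("killed_by", "tomas", "player"),
}

def query(pattern, bindings=None):
    """Yield variable bindings matching one fact pattern.
    Variables are strings starting with '?'."""
    bindings = bindings or {}
    for fact in facts:
        if len(fact) != len(pattern):
            continue
        new = dict(bindings)
        ok = True
        for p, f in zip(pattern, fact):
            if p.startswith("?"):
                if p in new and new[p] != f:
                    ok = False
                    break
                new[p] = f
            elif p != f:
                ok = False
                break
        if ok:
            yield new

def query_all(patterns):
    """Conjunctive query: keep only bindings satisfying every pattern."""
    results = [{}]
    for pat in patterns:
        results = [b2 for b in results for b2 in query(pat, b)]
    return results

# "Find me someone whose family member was recently killed by the player."
for b in query_all([("family", "?who", "?victim"),
                    ("killed_by", "?victim", "player")]):
    print(f"{b['?who']} may start a vendetta over {b['?victim']}")
```

Given the two sample facts, the query binds ?who to mara and ?victim to tomas, and the game could then spin that result into a vendetta against the player.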

Des Gayle, game director at Hidden Titan, is making a story-focused action-RPG called The Hellbrand in Unreal Engine 5. One of the ways Gayle is using AI in development is effectively as a tactics machine that pays attention to how a player is fighting and adjusts the behavior of enemies accordingly. It even allows enemy soldiers who escape an encounter with the player to report the player's behavior to other units, so that future groups of enemies come prepared. For instance, if a player is using a lot of close-combat attacks but an enemy escapes, future waves might include more long-range attackers.
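Gayle didn't share implementation details, but the general pattern is easy to sketch. Here's a minimal, hypothetical Python version of such an "informed waves" system; the class, the method names, and the melee/ranged split are illustrative assumptions, not Hidden Titan's actual code:

```python
import random
from collections import Counter

class EnemyIntel:
    """Knowledge that escaping enemies carry back to their faction.
    Assumes the game logs each player attack as "melee" or "ranged"."""
    def __init__(self):
        self.observed_attacks = Counter()

    def report_escape(self, attacks_seen):
        # Only survivors get to tell the tale; a wiped-out squad adds nothing.
        self.observed_attacks.update(attacks_seen)

    def compose_wave(self, size):
        """Counter the player's dominant style: lots of melee observed
        means more long-range spawns, and vice versa."""
        total = sum(self.observed_attacks.values())
        if total == 0:
            # No intel yet: spawn an unbiased mix.
            return [random.choice(["melee", "ranged"]) for _ in range(size)]
        melee_ratio = self.observed_attacks["melee"] / total
        # Bias spawns toward the counter of what the player favors.
        return ["ranged" if random.random() < melee_ratio else "melee"
                for _ in range(size)]

intel = EnemyIntel()
intel.report_escape(["melee", "melee", "melee", "ranged"])
print(intel.compose_wave(6))  # mostly ranged enemies, roughly 75%
```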

Zubek and Gayle number among the many developers experimenting with AI, which has been a hot topic in 2023 as media personalities, tech analysts, and communities ponder the technology’s uses, potential, and dangers. Conversations around new forms of AI, especially machine learning, have sparked a broader social conversation about how AI will impact all aspects of our world, including gaming. And as with anything else in games, these conversations aren’t only happening at AAA studios. They’re also happening among independent developers, a group that frequently has both the most to gain and the most to lose from massive industry trends and shifts.

IGN spoke to 12 independent developers working on a diverse array of games, all of whom had at some point in their work used or seriously considered using AI of some kind. They shared their experiences with the technology, ranging from more conventional uses to unique, experimental ones. And they all expressed different degrees of hope for how AI might make indie development faster, easier, cheaper, and more accessible – as well as fear for the ways in which it may ultimately harm their work, their livelihood, and the industry at large.

The new AI conversation

Fundamentally, “artificial intelligence” can refer to any instance of intelligence appearing to be demonstrated by machines. It encompasses a vast field of technology, and in games, it’s historically been used in situations like the aforementioned player/enemy AI, procedural generation, or as a development tool for model skinning, rigging, or other “fill tasks.”

But the current surge in conversation around AI is about a new kind of AI: machine learning and, more specifically, generative AI. Machine learning is a type of AI that uses data and algorithms to “train” systems, mimicking the way that human beings learn, but with far more efficiency and capacity. Generative AI is a form of machine learning that uses that training to create new content such as audio, images, and text. You’ve likely seen it discussed in regard to programs like ChatGPT or the image generator DALL-E, but it’s finding its way into video games as well.


Yifat Shaik, an assistant professor of Computational Art at York University in Canada, recently spoke to me alongside her research assistant and PhD student SK Sabada. Shaik has been making games in programs that aren’t game engines, and is currently in the early stages of a game made in the online whiteboard tool Miro. She wants her game to be about “places that only exist in dreams,” and is contemplating using an AI program such as DALL-E to generate those images for her.

“So the idea was to use murals to do a collaborative map about games, images, places that only exist in dreams,” she explained. “And I've been planning this for a while and the idea was to use AI images as these dream spaces. It made sense to me that these places don't exist. The plan was a year and a half ago when I started planning the game…definitely before a lot of the conversation around AI was, [so now] I need to frame it in a way that is like, this is me not wanting to pay an artist, but rather conceptually it seems fitting with places that don't exist to have the machine try to figure out the place. I have these places that I've been in my dreams over and over again. I think a lot of people have those.”

Lamar Deys, a developer working on a game he describes as “a mix of The Sims and State of Decay,” told me he was planning on building his own generative AI to create origin stories for his game’s characters and their traits, backgrounds, and skills. He’s also considering using it as a worldbuilding tool, to generate “rules/lore for the world” such as the cause of an in-game apocalypse, how many survivors are left, what resources are plentiful and what are scarce, and so forth in a given playthrough of the game.

What he’s struggling with at the moment, though, is deciding what to use as “training data” – the information he feeds to the AI so that it can generate content. He considered creating a dataset himself, but that would require thousands of data points or more – effectively defeating the purpose of using the AI in the first place.


“My second thought was to scrape information from the web and websites that focus on characters, specifically I was looking at Fandom or Goodreads,” Deys explained. “The plan would be to scrape the data, get rid of information I don't need and use that as training data. I was looking at Fandom first since it has information on characters from modern day games, movies, etc. but I know some people will consider that stealing which is why I moved towards Goodreads where I could specifically use characters from books in the public domain. To some people this will still be stealing but I don't really know what I could do beyond that. Once I have the training data I'll use that to build the model which should generate the information I want.”
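As a rough sketch of the "get rid of information I don't need and use that as training data" step Deys describes, here's what a minimal cleaning pass might look like in Python. The record fields, filtering rules, and output format are assumptions for illustration, not Deys' actual pipeline:

```python
import json
import re

# Hypothetical scraped character records; in practice these would come
# from whatever source the developer has the rights to use.
scraped = [
    {"name": "Elizabeth Bennet",
     "bio": "Elizabeth Bennet is the protagonist... [edit] [source]"},
]

def clean(record):
    """Strip wiki-style markup residue and normalize whitespace."""
    text = record["bio"]
    text = re.sub(r"\[(edit|source)\]", "", text)  # drop leftover wiki links
    text = re.sub(r"\s+", " ", text).strip()       # collapse whitespace
    return text

# Emit prompt/completion pairs, one JSON object per line (JSONL), a
# common input format for fine-tuning text-generation models.
with open("training_data.jsonl", "w") as f:
    for record in scraped:
        pair = {"prompt": f"Write an origin story for {record['name']}:",
                "completion": clean(record)}
        f.write(json.dumps(pair) + "\n")
```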

Deys brought up one of the biggest criticisms of generative AI: to get a dataset large enough to train an AI on, you frequently need to draw from data collections where you may not have the rights to some or even all of the data you’re using. That means that while your generative AI appears to be spitting out “new” information, it’s actually borrowing from tons of images, pieces of text, or audio files that it doesn’t have permission to use. The practical result has been that artists, writers, and even everyday citizens have noticed AI-generated content that looks remarkably similar to existing human-created works – effectively plagiarism, and something most game developers rightfully want to stay far, far away from.

Tuning the World Generation Machine

But despite the limitations of generative AI, a number of developers told me they’re using it in unique ways to support their projects – without risk of stealing content from others. At Cannibal Interactive, for instance, Damien Crawford isn’t just using procedural generation to make dungeons. This past January, they released “Night Shift at Pop’s,” a game where players use an employee handbook (in PDF form) to figure out how to put sandwiches together correctly. While the game used procedural generation to spit out “orders” for sandwiches, Crawford said artist Star St. Germain used an AI process to help her create original art much faster than she could have without it.

“The images were originally generated in niji journey (Midjourney anime fork), and then edited/vectorized/redrawn in Adobe Illustrator,” Crawford explained. “This meant she could normalize the bun color across all sandwiches with the same bun type and assemble vector versions piece by piece. Traditional illustration was needed to make this work as well, since some things not only needed to be brought in line with the rest of the aesthetic but weren't generating well. But starting from generated images meant she could get a lot done quickly, so it beat the normal alternative of hand drawing all the things.”

One of the more complex examples of AI use came from Chantal Ryan, founder and director of We Have Always Lived in the Forest. Ryan spent several years creating her own AI for her upcoming game, darkwebSTREAMER, an internet simulator of sorts about the stress of being a streamer.

“The AI system started essentially as a language technology – I was trying to figure out how to adequately simulate the interactions of hundreds or thousands of different individuals across the internet without having to individually write every line for every different character/personality myself. I came up with a way to procedurally generate speech in ways that gave the appearance of different people and personalities, and then realized the tech I’d created was capable of doing a whole lot more.”

Ryan told me she effectively ended up accidentally creating a “procedurally-generated internet” with characters that have their own webpages reflecting their personalities, interests, and knowledge of the world, as well as random events that learn from events that came before and the actions of the player.
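Ryan didn't detail how her system works under the hood, but the core idea of personality-flavored procedural speech can be sketched simply. In this toy Python example, a few stylistic parameters make the same message read as different people; the persona names and parameters are invented for illustration and have no connection to darkwebSTREAMER's actual tech:

```python
import random

# Each "person" is just a bag of stylistic parameters (hypothetical).
PERSONAS = {
    "lurker":   {"exclaim": 0.0, "lowercase": True,  "fillers": ["...", "idk"]},
    "superfan": {"exclaim": 0.9, "lowercase": False, "fillers": ["omg", "!!"]},
}

def say(persona_name, message):
    """Render one message in a persona's voice."""
    p = PERSONAS[persona_name]
    words = message.split()
    # Sprinkle in persona-specific filler words most of the time.
    if p["fillers"] and random.random() < 0.7:
        words.insert(random.randrange(len(words) + 1),
                     random.choice(p["fillers"]))
    text = " ".join(words)
    if p["lowercase"]:
        text = text.lower()
    if random.random() < p["exclaim"]:
        text += "!"
    return text

for name in PERSONAS:
    print(f"{name}: {say(name, 'I saw the stream last night')}")
```

Scaled up across hundreds of generated characters, parameter bags like these are one plausible way to give each voice in a simulated internet its own texture without hand-writing every line.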

But most people I spoke to were not quite as AI-ambitious as Ryan. A few told me they were using generative AI in inventive ways to help them structure or build their game behind the scenes, without actually including AI in the game itself. On The Hellbrand, for instance, Des Gayle uses AI not to write dialogue for him, but to assist him with writing it himself.


“Something may sound amazing in my head and still might sound good written down but until I hear it spoken back to me, I can never be sure,” he explained. “Quickly lobbing some lines in an AI tool saves me dragging a random developer away from their work and absolutely saves me hiring an actor. Actual recording is a long way away but when we do get there I will be confident that I am happy with the lines and this will hopefully lead to fewer takes and less stressful sessions.”
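Gayle doesn't name the tool he uses, but the workflow is straightforward to reproduce. Here's a minimal sketch using the open-source pyttsx3 text-to-speech library, standing in for whatever AI voice tool a developer prefers; the dialogue lines are placeholders, not actual Hellbrand script:

```python
import pyttsx3  # offline text-to-speech; one of many tools that fit this role

# Placeholder draft lines to hear aloud before committing to a recording session.
draft_lines = [
    "You should not have come back here.",
    "Every blade remembers the hand that forged it.",
]

engine = pyttsx3.init()
engine.setProperty("rate", 150)  # slow the delivery slightly for clarity
for line in draft_lines:
    print(f"Reading: {line}")
    engine.say(line)
engine.runAndWait()  # blocks until all queued lines have been spoken
```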

Gayle wasn’t the only one using AI chatbots for more conversational purposes. Izzy Kestrel, gameplay engineer at No Goblin, told me about using AI on a personal project to teach her how to do something in Unreal she didn’t already know how to do. The project was called Gexfield, which Kestrel described as “a crossover parody of Gex and Seinfeld, obv.” While working on it, she saw someone on Mastodon posting about asking ChatGPT to write C++ code for them. While she knew you shouldn’t just copy and paste generated code into a project, it made her curious how well the tool could explain something she was struggling with.

“If you need to know how to do something in Unreal and you work in a dedicated office, you can just turn your head and ask the more senior guy sitting behind you how to do it,” Kestrel explains. “If you're an indie dev and don't have that kind of access to an industry vet, that's hours of research burden you now have to factor into your timeline. If you're a hobbyist, even figuring out what to type into Google could be a non-starter.”

Kestrel hypothesized that the information on how to accomplish the tasks she wanted to in Unreal was out there somewhere, but it was so scattered that a typical Google search wouldn’t be helpful. Enter ChatGPT.

“I tried a baseline prompt first – something I could reasonably explain myself – just to see how accurate it could be. I think it was something about making a simple user interface widget. ChatGPT was able to spit out a handful of steps that were easy to read and largely accurate. Cool. Let's get weirder. I'm making a Gex game, so it's only natural he should be able to climb on walls, right? Tell me how, ChatGPT:

“The resulting steps were fine, but nothing I couldn't have come up with myself. Some steps were more specific about adding Unreal-specific components to manage physics interactions and such, but other steps (the ones I really needed) were a lot more vague, telling me to simply ‘add logic to allow the gecko character to climb the wall’.”

Kestrel isn’t the only developer who told me a story like this. Johan Peitz, CEO of Swedish studio Apskeppet, needed to write a character for his PICO-8 game Hellgineers who is annoyed with the player, but English isn’t his first language. So he told ChatGPT about the character and the situation, then directed it iteratively until he had short, concise responses.

And programming hobbyist Mira Gale explained that while they made an in-game AI to script dialogue responses from in-game datasets, they also use ChatGPT to look at their code and suggest alternative solutions to problems they're having. As someone who is both autistic and diagnosed with ADHD, they told me, they find ChatGPT a helpful resource simply because it's explicitly not a person.

“Straight-up asking people can take me a lot of mental energy, while posting on a more anonymous place like StackOverflow or a forum can take far too long when I want a solution now. ChatGPT gives me immediate answers while also having the added benefit of not being a real person, which means I'm not constantly policing myself on how I talk, whether I'm appearing normal, whether my questions are stupid or not, etc. – it takes a real social load off my plate. Again, it's not something I trust for actual information, but sometimes information-shaped sentences are all that you need to point you in a general direction. It's kind of just rubber duck debugging, but the duck talks back, I suppose?”


One other curious use of AI came from Yifat Shaik and SK Sabada, who in addition to Shaik’s dream space project, are currently running an academic ARG starring an AI that isn’t actually an AI. In reality, it’s a person writing responses, but they’re intended to sound like a real AI, and are based on real research into how AIs respond and behave – and specifically, how humans interact with them.

“One of the big questions I think that the ARG is dealing with is less the intensive focus on ‘How human is the AI?’ that we're working with versus ‘What does the AI desire in relation to what's going on with the human characters in the piece?’” Sabada explains. “Which gestures towards a different kind of narrative that I think we're all on the research team hoping that people will move towards instead of getting stuck on that idea of [how human the AI is]...It's one thing to bring up that conversation around the humanization of things that aren't people, but it's also this other more complicated interplay between what you're choosing to animate and how you respond back to what's being animated.”

So indie developers are using AI not just in conventional ways such as procedural generation and to make characters and worlds seem more real, but also as learning tools, reference material, simulated pitch meetings, and of course, content generation. Sounds great, right? What could go wrong?

‘An Excel Spreadsheet the Size of Texas’

While every developer I spoke to had a very different experience with AI, one thing that united almost all of them was at least some degree of caution around the new technology, ranging from tempered expectations and questions to outright condemnation of certain uses. Most of the concerns I heard were about generative AI specifically – no one I talked to took issue with the more conventional ways game developers have used AI in the past (procedural generation, enemy AI, and so forth).

Freelance tech artist Olivia Wetheimer explained that the key to making ethical decisions around AI isn’t to wholesale condemn or approve it, but rather to look at the ethics of each type of AI and its intended use. As an example, Wetheimer brought up Spider-Man: Into the Spider-Verse, where AI was used to speed up the process of drawing comic book lines on characters. They say this usage is probably acceptable, as the intent isn’t “to take jobs away from artists, but rather provide a feasible way for them to use a certain artstyle that otherwise wouldn't have been possible in the time frame.” In this context, AI is like a fill tool in a digital art program. But Wetheimer also notes that similar techniques could be used to take work away from human beings, not just in animation, but in games too. And that’s where things get pretty problematic.

Des Gayle brings up another concern: AI bias. Depending on how developers train their AI, it’s conceivable that generative AI making images or writing dialogue for games could result in those games inadvertently saying or showing something the developers never intended.

“AI is only as good as the information it consumes, so if the people that are feeding it aren't careful, horrible things will happen,” he says. “I can't remember the exact details but there was an AI that was let out into the wild and after [less than 24 hours] started spouting racist comments. The internet has some dark places, who knew?”

Several people I spoke to pointed out that while there are certainly many concerns around AI and its use, mainstream conversation about it tends to focus on the wrong ones, such as AI becoming sentient or turning into Skynet. Rob Zubek called ChatGPT “shockingly dumb,” because it’s effectively just “an Excel spreadsheet the size of Texas” spitting out words and phrases it’s been trained on, without any understanding or intentionality. But that doesn’t stop it from being used in harmful ways, not just by stealing the art of others, but also by producing individuals’ private information, or providing falsehoods that sound true. While all this is fairly removed from games at the moment, it’s a real danger if games start bringing in systems like ChatGPT, DALL-E, or others to create content for them – especially if that content doesn’t get human oversight.

“The public conversations about dangers of AI hide the fact that the real danger is humans using automation in evil or stupid ways, because it now became incredibly cheap to do so,” Zubek said. “That's the real problem with this tech. Calling AI ‘dangerous’ is convenient for companies that sell it, because it lets them shirk responsibility and pretend that danger is inherent or some kind of a fait accompli. But it is not. The responsibility is always with the people who use automated tools in ways that affect other people – they're responsible for what they have their machines do.”

Responsible Magic

All of the indie developers I spoke to acknowledged that there are at least some good uses for AI technology at the level, budget, and type of game development they’re currently doing – and that includes generative AI.

Most people I spoke to asserted that we will never reach a point where AI allows a developer to simply push a button and make a game on the level of current AAA titles appear out of thin air, but one person disagreed: Rafal Grochala, a solo developer working on the grand strategy game Espiocracy. He says AI technology is already good enough that GPT-4 can create miniature but complete games, and that later iterations “will set us on track to have full games written by large language models in this decade.”

Far more common among those I spoke to was the perspective that AI, when used responsibly, could help hone existing ideas or make intensive labor a human being is already doing a bit faster, easier, and more accurate – especially when there’s a large quantity of content to generate on a low budget. With careful ethical limitations, it’s another tool in developers’ hands. Damien Crawford, for instance, said that training an AI on a studio’s style can lead to rapid concept art and storyboards that humans can refine. AI can also write rough drafts of equipment descriptions and character backstories, but everything still depends (and should depend) on human oversight and on what humans feed into it.

Lamar Deys put it another way.

“The biggest example of this is Skyrim, whose entire map was initially procedurally generated and then developers went in and tweaked it to give that handcrafted feel. That's the role I see AI at large taking in game development. I see AI as being analogous to sewing machines and calculators. The invention of sewing machines didn't put tailors and accountants out of business, instead they got more efficient. In a similar way I think those who are truly professionals will know how to most efficiently use AI in tandem with their skills.”

One perhaps surprising response came from a handful of developers who made more empathetic cases for AI, in addition to practical ones. Mira Gale, whose AI use involved assembling game writing and dialogue from in-game datasets based on player behavior and other in-game events, said that AI had been personally meaningful to them because it allowed their own game to surprise them constantly.

“I don’t know what someone is going to say, or what’s going to happen to me,” they said. “That is thrilling. I’ve made something outside myself, as has the team. When the game you’ve made can surprise you, it feels like it has a beating heart. It has, in a way, individual agency. In that way it feels more like a child than a game – it’s always changing, always doing something you didn’t expect, always making you laugh or think. I’m not sure I could work on a game for years that didn’t have the capacity to keep surprising me. So this aspect was critically important.”

And Chantal Ryan, who devoted years to building her own AI that effectively creates a horror-themed internet simulation within her game darkwebSTREAMER, called AI “genuine magic.”

“And I think, like all magic, there are ways to use such powers responsibly as well as ways to use them for evil,” she said. “It’s critically important we have the conversations about what ‘evil’ looks like in this context. I support this.

“I would, in addition, also like to see more nuance in the discussion – I would like people to understand that not all ‘AI’ is the same, that it’s an incredibly large term for a host of widely different technologies and applications, and that there is a lot of potential for good – both creative and social – that can come out of these technologies. Instead of fighting what I believe is an unstoppable process of momentum – there’s no putting the genie back in the bottle – I think shaping the public conversation around what healthy AI applications do look like is a valuable and vital addition to the discourse.”

Rebekah Valentine is a news reporter for IGN. You can find her on Twitter @duckvalentine.



Source: https://www.ign.com/articles/the-indie-developers-who-are-using-ai-to-make-living-cities-tactics-machines-and-more
