Stone Soup Social video with Inworld AI’s Kylan Gibbs
In December, Patrick Lee and the gang at Fanverse invited me to take part in a Stone Soup Social event at UTA in Los Angeles. This took the form of a fireside chat with Kylan Gibbs, then CPO but now CEO of Inworld AI, a company I’ve been working with since the Virj project last year and am continuing to work with on Whenere.
The video is now up. I watched it to make sure I didn’t say anything egregiously stupid. As usual, I speak in an Ent-like cadence that may prompt viewers to up the playback speed to 2.0. I ended up saying a lot of things that have more recently appeared in written form here in Graphomane. Recommended for Stephenson completists, if there are any, or people who prefer videos to reading! Kylan is admirably patient waiting for me to wind up my answers.
Matthew Ball’s THE METAVERSE
Matthew Ball is bringing out a new edition of The Metaverse, which is billed as “revised” but is really more like a ground-up rewrite of the 2022 edition. Official pub date is 23 July 2024 but you can pre-order it now wherever you buy books.
Recently Matthew organized a 90-minute interview with me and Tim Sweeney, the founder and CEO of Epic Games. Matt will soon be making the transcript available on his socials. When he does, I’ll publish some additional thoughts and reactions here.
No sooner had I started drafting such a post than I realized that I should begin by supplying some general background about the industry, for the benefit of readers who don’t follow this stuff closely. I’ve decided to drop that here. If you know your way around the game industry, you can probably stop reading now. If you don’t, this should provide some context.
Game Engines 101
Epic Games is the maker of Unreal Engine, which is one of the biggest and most capable game engines. What is a game engine? Well, all games, no matter how diverse their play style or visual presentation, have certain functionality in common. Developers used to code all of that from scratch (and still can, if they so choose), but it makes sense to bundle all of those capabilities into one piece of software so that devs can work on the things that make their projects unique instead of continually re-inventing a thousand different kinds of wheels. This is what game engines do.
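To make the division of labor concrete, here is a toy sketch of the idea, with hypothetical names of my own invention — a real engine like Unreal is vastly more elaborate, but the shape is the same: the engine supplies the common machinery (the loop that gathers input, advances the simulation, and draws a frame), and the developer writes only the part that makes their game unique.

```python
class MiniEngine:
    """Bundles the 'wheels' all games share: a fixed-timestep loop that
    gathers input, updates the simulation, and renders a frame."""

    def __init__(self, game, timestep=1 / 60):
        self.game = game          # game-specific logic plugs in here
        self.timestep = timestep  # simulate 60 updates per second
        self.frames = 0

    def run(self, seconds):
        # Drive the loop for a fixed span of simulated time.
        steps = round(seconds / self.timestep)
        for _ in range(steps):
            events = self.game.poll_input()          # input handling
            self.game.update(self.timestep, events)  # physics, AI, etc.
            self.game.render()                       # draw the frame
            self.frames += 1


class PongLikeGame:
    """The only code the developer writes: what makes *this* game unique."""

    def __init__(self):
        self.ball_x = 0.0
        self.ball_vx = 3.0  # units per second

    def poll_input(self):
        return []  # no real input device in this sketch

    def update(self, dt, events):
        self.ball_x += self.ball_vx * dt  # move the ball

    def render(self):
        pass  # a real engine would rasterize the scene here


engine = MiniEngine(PongLikeGame())
engine.run(seconds=1.0)  # one simulated second = 60 frames
```

Swap in a different game class and the same engine runs it unchanged — which is exactly why it pays to build the engine once rather than re-invent it per game.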
There’s a feedback loop that has now been running for decades between companies like Epic that make the engines, and developers who come up with new ways to use them. Engines become more capable as companies like Epic say “hey, a lot of devs are implementing Feature X, we should just add that to the next release.”

Meanwhile, the hardware that people use to run games keeps getting more powerful, and developers think up interesting ways to take advantage of all of these improvements.
Game engines are really a general infrastructure for immersive experiences—by which I mean, audiovisual productions that you can move around in and interact with. Many of the applications built on this foundation fit cleanly into established genres such as first-person shooter games, but increasingly people use these things to make art projects, movies, and commercial/industrial applications.
Nick Whiting, a former studio head and engineering director at Epic, has co-founded a company called Kumikai that, among other things, helps developers who are using Unreal Engine to create applications that are not games. Part of his inspiration came from this brain aneurysm surgery simulator. Nick generously provided me with a list of links to other non-game projects that I can’t fully do justice to here, so I’ll just drop them in:
The “Guided Ureterovesical Anastomosis” section of this page on the Surgical Science site (though there may be a better and quicker view here).
Tesla’s use of Unreal Engine to generate synthetic data for training AI. If you’re teaching an AI to deal with conditions that arise in three-dimensional space, you can get data much more easily and cheaply by simulating it photorealistically than by going out into the real world and shooting video.
The inevitable NASA link. The label says it has been taken down, but it seems to play.
A mining construction simulator. “By changing things like lighting and tunnel sizes to give a bit more ‘breathing room’…before they blast holes in the ground, they were able to save large amounts of money and have better safety for folks that are already in incredibly hazardous environments.”
Fortnite and the Metaverse
So if there’s going to be a Metaverse, game engines are going to run it. And game developers—the people who are proficient at using those engines and the toolchains that feed assets into them—are going to build it.
In 2017 Epic released Fortnite Battle Royale, which most people just refer to as Fortnite. It is an immensely successful game. In any given month, 70 to 80 million people play it. At its peak it generated $5 to $6 billion a year in revenue.
This is relevant to the Metaverse because Fortnite is an online, multiplayer game. One hundred avatars parachute onto an island at roughly the same time and fight each other until only one player or team remains. The players can be anywhere on Earth. So in order to make this game work, Epic’s engineers had to solve a host of technical problems around synchronizing those hundred players’ perceptions and experiences of the same virtual space.
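The standard answer to that synchronization problem is a server-authoritative model: clients send inputs, a single server applies them to the one true world state, and everyone receives the same snapshots back. The toy sketch below (my own illustrative names, not Epic’s netcode, which layers on prediction, interpolation, and much more) shows the basic shape:

```python
class Server:
    """Holds the single authoritative world state."""

    def __init__(self, player_ids):
        self.tick = 0
        self.positions = {pid: 0 for pid in player_ids}

    def apply_inputs(self, inputs):
        # Clients send inputs, not positions, so no client can simply
        # claim to be somewhere it isn't (basic cheat resistance).
        for pid, move in inputs.items():
            self.positions[pid] += move
        self.tick += 1

    def snapshot(self):
        # The same snapshot goes out to every client, so all players
        # converge on an identical view of the shared space.
        return {"tick": self.tick, "positions": dict(self.positions)}


class Client:
    """Renders whatever the server last told it about the world."""

    def __init__(self, pid):
        self.pid = pid
        self.view = {}

    def receive(self, snap):
        self.view = snap["positions"]


server = Server(player_ids=["a", "b"])
clients = [Client("a"), Client("b")]

server.apply_inputs({"a": +2, "b": -1})  # one tick of player input
snap = server.snapshot()
for c in clients:
    c.receive(snap)
# Both clients now hold identical views of where everyone is.
```

The hard engineering lies in what this sketch omits: doing it a hundred players at a time, dozens of times per second, over real-world network latency.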
They’re not the first or the only engineers to have tackled such challenges. MMORPGs (Massively Multiplayer Online Role Playing Games) are a genre unto themselves and have been around since long before Fortnite. More recently Minecraft and Roblox have achieved phenomenal success enabling users to craft experiences that can be experienced by multiple players at once.
Tim Sweeney, however, has been saying openly for a long time that the goal is to develop all of this into something like the Metaverse. He has been personally working on a new programming language called Verse that is tailored to the needs of Metaverse builders. After years of development, Verse has recently broken the surface in UEFN (Unreal Editor for Fortnite), a system Epic released to make it possible for developers to extend the base Fortnite experience into games of their own design, and to make money doing so.
I think that those are the basics that non-game people need to know to get more out of the upcoming Matthew Ball interview, but there’s one more detail I’ll add. In spring of 2023, after the Metaverse hype bubble of 2022 had inevitably burst, Tim posted this tweet, which I think speaks for itself.
Hollywood has been leaning on Unreal Engine more in recent years too. The Mandalorian famously used it for environment backdrops, which the director loved because it allowed him to move individual pieces (think rocks and buildings) around in real time as a scene demanded, bypassing the costly and time-consuming re-rendering process of a traditional 3D animation house.
Source: https://www.unrealengine.com/en-US/blog/forging-new-paths-for-filmmakers-on-the-mandalorian