True-to-life motion capture, VR advancements, Unreal Engine and other highlights that caught our attention at GDC this year.
The annual Game Developers Conference (GDC) is one of our favorite weeks of the year. It’s not the show where loads of massive games are announced—that’s E3 in June—but it’s the one that pulls back the curtain on the magic, the passion, and the innovation needed to bring modern games to life. And this year’s conference certainly didn’t disappoint on that front.
Held last week in San Francisco, GDC 2018 debuted some exciting new technology in the gaming world in the areas of facial performance capture, virtual and augmented reality, and video game graphics. It also showcased compelling hardware that could help shape the future of interactive entertainment.
We followed the news all week and wanted to share some of the technology and demos that dropped our jaws and set our minds racing. Here are some of GDC 2018’s most exciting announcements.
Epic Games’ dynamic digital humans
Andy Serkis is no stranger to playing impressive digital creatures, from Gollum in The Lord of the Rings to Supreme Leader Snoke in Star Wars—but this time, the digital creation looks just like him. And wow, it really does.
At GDC, Epic Games and 3Lateral presented an Unreal Engine 4-powered tech demo of a fully digital Serkis performing a monologue from Shakespeare’s Macbeth, and it’s one of the most lifelike digital humans we’ve seen so far. It’s also running in real time in the engine, as shown via on-the-fly tweaks during the demonstration.
The process involved a variety of capture scenarios: some focused on geometry, some on appearance, and others on motion. The goal was to extract universal facial semantics that represent muscular contractions, which is what makes the performance so lifelike.
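Encoding a face as a set of muscular activations is similar in spirit to a blendshape rig, where a performance is stored as per-expression weights that can then drive any character with matching shapes. This is a purely illustrative sketch of that idea, not Epic’s or 3Lateral’s actual pipeline; the expression names and numbers here are hypothetical:

```python
import numpy as np

def blend_face(neutral, deltas, weights):
    """Combine a neutral mesh with weighted expression offsets.

    neutral: (V, 3) array of vertex positions for the resting face.
    deltas:  dict mapping expression name -> (V, 3) offset from neutral.
    weights: dict mapping expression name -> activation in [0, 1].
    """
    result = neutral.copy()
    for name, w in weights.items():
        result += w * deltas[name]
    return result

# Toy example: a two-vertex "face" with two hypothetical expressions.
neutral = np.zeros((2, 3))
deltas = {
    "jaw_open":   np.array([[0.0, -1.0, 0.0], [0.0, 0.0, 0.0]]),
    "brow_raise": np.array([[0.0, 0.0, 0.0], [0.0, 0.5, 0.0]]),
}
weights = {"jaw_open": 0.8, "brow_raise": 0.4}
face = blend_face(neutral, deltas, weights)
```

Because the weights describe the expression rather than any one mesh, the same captured performance can be replayed on a completely different set of shapes, which is what makes the demo’s transfer onto another character possible.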
But what’s even more impressive is how, a few moments later, the same performance is translated onto the face of an alien. The facial movements are a perfect, chillingly malevolent replication.
That’s not all, either. Epic Games, along with Vicon, Cubic Motion, 3Lateral, and Tencent, also unveiled Siren, an incredibly lifelike digital character driven in real time by an actress wearing a head-mounted camera. The technology has obvious implications for video games and virtual reality, letting players interact with digital characters like never before, but the ability to immediately see a capture in such vivid, polished form could help creators across many fields.
The Oculus Go impresses attendees
Quality virtual reality is still a fairly pricey proposition, as modern headsets require a PC, game console, or high-end smartphone on top of their own buy-in costs. Luckily, some impressive-looking standalone headsets are on the horizon, and Oculus looks to lead the charge with its Oculus Go headset, which was usable for the first time at GDC.
Reactions from the press and attendees have been largely positive. The Oculus Go is essentially a standalone version of Samsung’s (Oculus-powered) Gear VR, running the same selection of apps and games, but the Go has its own processor, screen, storage, and battery, so you won’t be tethered to another device. And it’s remarkably affordable, starting at $199.
That kind of price point could help democratize immersive VR like no device before it. No official release date has been announced, but it’s expected to feature at Facebook’s F8 conference in May, and could debut around then.
Real-time ray tracing comes to games
Ray tracing is the norm for rendering CGI in film and TV, but the real-time demands of video games and the comparatively modest power of consoles and gaming PCs mean that games still rely on less lifelike rasterization, handling lighting effects with shaders.
But thanks to heavy hitters like Microsoft and Nvidia, real-time ray tracing is finally coming to video games. Microsoft’s DirectX Raytracing (DXR) in the DirectX 12 API and Nvidia’s RTX technology are designed to help developers introduce ray tracing into games in the near future. Ray tracing allows for much more realistic lighting simulation, modeling how beams of light interact with the world and improving effects like shadows and transparencies.
You’ll still need a pretty powerful GPU to pull it off, and it’s likely that ray tracing will simply supplement rasterization in the near future. But over time, as hardware improves and developers get more comfortable with the technology, ray tracing could eventually become the new standard in the industry and dramatically improve the way that video games look.
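At its core, ray tracing fires rays from the camera into the scene and tests them against geometry; APIs like DXR exist to accelerate exactly this kind of intersection math on the GPU. As a purely illustrative CPU sketch (not DXR code), here is the central operation for the simplest possible shape, a sphere:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    The direction is assumed to be a unit vector.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c  # the quadratic's "a" term is 1 for a unit direction
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# A ray fired down the z-axis at a unit sphere centered 5 units away:
# it should strike the front surface at z = 4.
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
```

A renderer repeats this test millions of times per frame against every object, then spawns secondary rays for shadows and reflections, which is why real-time ray tracing has had to wait for today’s hardware.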
Magic Leap begins development push
There are still a lot of questions surrounding Magic Leap—like exactly how impressive the augmented reality headset tech must be to have attracted more than $2 billion in investments to date. But it may not be too much longer before we’re wearing the Magic Leap One headset: a 2018 release is still on the books, and the company is now courting content developers.
At GDC, Magic Leap launched its Creator Portal with an official SDK and tools, while Epic released Unreal Engine 4 support for the headset. According to Epic Games, the engine is already being used by studios like Framestore and ILMxLAB to produce Magic Leap content. With buzz building around the AR headset and developers coming onboard, we may be interacting with Magic Leap’s wondrous illusions before long.