Some of you may remember that I used to go to a meet-up for local Edinburgh game developers called GameDevEd… In fact, I went so regularly that I ended up becoming the organiser (oops). This is a fun role for somebody who doesn’t have a smartphone, seeing as it requires being able to broadcast which table we’re at on social media. To compensate, I take along my little laptop, because the full game dev laptop is too massive to cart around all day (let alone keep open next to my dinner and a drink or two).
Now that the pubs are definitely open again, I’ve finally restarted GameDevEd. Unlike the Before Times, however, I now have a demo — my precious Exon can truly be played live! But my little laptop can’t run Exon smoothly, so no, actually, Exon cannot be played live…
… Unless I can find a way?
On a traditional heavyweight desktop PC, Exon is CPU-bound. That is, it requires much more processing power to simulate the world than it does graphics card grunt to render the results (like, orders of magnitude more).
This makes sense; while Exon has a lot of physics bullshittery going on amongst its sloppy AI systems and overly complicated animation controllers, the meshes for a single unit barely add up to a thousand triangles, most textures are 256×256 and there are no normal maps or other fancy effects. Barring me getting a little bit overeager with dynamic lights and shadows, the GPU barely knows the game exists.
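(For anyone wondering how you tell which side of the pipeline is the bottleneck: a classic rule of thumb is to drop the render resolution and see what happens. Halving the pixels slashes GPU work but leaves the simulation untouched, so if the frame time barely moves, you’re CPU-bound. A back-of-envelope sketch of that check — all function names and numbers here are hypothetical, not from Exon:)

```python
def classify_bottleneck(frame_ms_full_res, frame_ms_half_res, tolerance=0.1):
    """Crude heuristic: halving the resolution slashes GPU work, so a
    CPU-bound game barely speeds up, while a GPU-bound one speeds up a lot."""
    speedup = (frame_ms_full_res - frame_ms_half_res) / frame_ms_full_res
    return "GPU-bound" if speedup > tolerance else "CPU-bound"

# Hypothetical measurements: 16 ms per frame at full res, 15.5 ms at half res
print(classify_bottleneck(16.0, 15.5))  # → CPU-bound: resolution barely mattered
```

A real profiler will give you much finer-grained answers, of course, but this quick test is often enough to know where to start digging.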
I’ve set my lower-bound performance target to be my “portable” (3.5kg, ooft) gamedev laptop Overlook, a six-year-old Acer Predator. It has a 2.3GHz quad-core and a GTX 970M, so it’s not exactly a slouch, but it is showing its age. I’m hoping that if I can get the game running nicely on a mid-range laptop from six years ago, it should fit perfectly on anything newer that an actual human being is likely to own (especially by the time I ever finish the game).
(Meanwhile my main PC, which clocks in at somewhere between 3 and 8 years old due to its untimely death and resurrection, is much more conventionally powerful with its 4GHz quad-core and full-fat desktop GTX 970. Indeed, it shielded me from a lot of performance problems for a long time because it’s so overpowered for what I’m trying to achieve with Exon.)
Having said that, as far as I can tell I’ve optimised Exon to the limit — that is, the limit beyond which I’d be starting to cut systems in ways that might compromise the gameplay and the aesthetic (don’t talk to me about draw calls).
After all of these optimisations, I ended up at one inescapable truth: Exon’s performance is tied almost directly to how many units are in a level. Units are the heavyweights because of their complex movement systems (mainly due to all the raycasting), but since they are also most of the gameplay, I’m happy that this is where all the costs should be concentrated.
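(When the cost is roughly linear in unit count like this, a back-of-envelope budget calculation tells you how many units a given machine can afford. Every number below is made up purely for illustration — the real per-unit cost would come from profiling:)

```python
def max_units(frame_budget_ms, fixed_cost_ms, per_unit_ms):
    """How many units fit in a frame budget, assuming a fixed engine
    overhead plus a roughly linear per-unit simulation cost."""
    return max(0, int((frame_budget_ms - fixed_cost_ms) / per_unit_ms))

# Hypothetical numbers: 33.3 ms budget (30fps), 8 ms overhead, 0.5 ms per unit
print(max_units(33.3, 8.0, 0.5))  # 50 units
```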
By this measure, however, the Exon Academy demo level is slightly on the large side and I need to rein myself in going forward. It does run fine on the big laptop, but it’s not got much headroom in case of a crisis. (Depending on how thoroughly you tried to escape the bounds of the Academy, you may have noticed a lot of empty spaces that will now never be filled.)
My little laptop, Evitar, is supposed to be small and light. I use it mainly for writing and I want to be able to do that anywhere, so portability is key. Even so, it is actually capable of playing classics like Unreal Tournament and Warcraft III just fine — it does have a 2.3GHz processor, albeit coupled with an integrated GPU. As a smaller device it’s probably very vulnerable to throttling for thermal control and battery life, but on paper, it actually falls right into my target range. If I truly am CPU-bound, those integrated graphics should be able to handle the rendering side just fine. Even if I have to turn off shadows.
Alas, I tried to run Exon there and… Well, the menu screen at least has a perfect frame-rate, so we know there’s no fundamental barrier. Start the game and get into the Academy level, however, and it grinds right down.
Knowing that size matters, I’ve accepted that the primary campaign and its sprawling overworld levels will never run on Evitar… but there is another way. I have flirted a number of times in the past with alternative campaigns and stand-alone missions; the world of Exon is, after all, deliberately designed to host more than a single story. I want to be making new campaigns for this engine into my twilight years. Why yes, the sunk cost fallacy and I are very close friends!
So I posed myself the question: what if I made a stand-alone dungeon, composed of tiny little rooms? Would that be enough to get over the hump?
Eager for a change of scenery, I started work on an exon barrow, an underground complex where dead exons get interred with their mechs and equipment. Protected from looters by traps, drones and puzzles, these tombs are dangerous but potentially very lucrative for the raider who manages to win through. Crucially, each chamber can be a small, self-contained unit with a single challenge.
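(The performance appeal of small, self-contained chambers is that only the units in the room the player is actually in need their expensive movement logic ticked — the rest of the barrow can sit frozen. A minimal sketch of that idea, using entirely hypothetical names; I’m not claiming this is how Exon’s code is actually structured:)

```python
class Unit:
    """A stand-in for a game unit with an expensive per-frame update."""
    def __init__(self, name):
        self.name = name
        self.ticks = 0

    def tick(self):
        # Stand-in for the raycast-heavy movement update
        self.ticks += 1

class Dungeon:
    def __init__(self, chambers):
        # chambers: dict mapping chamber id -> list of Units
        self.chambers = chambers
        self.active = None

    def enter(self, chamber_id):
        self.active = chamber_id

    def simulate_frame(self):
        # Only the active chamber pays the per-unit cost; everything
        # else stays frozen until the player arrives.
        for unit in self.chambers.get(self.active, []):
            unit.tick()

dungeon = Dungeon({"crypt": [Unit("drone"), Unit("turret")],
                   "hall": [Unit("guardian")]})
dungeon.enter("crypt")
dungeon.simulate_frame()  # only the crypt's drone and turret get ticked
```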
But did it work?
Yes! It worked! The micro-chambers allowed Exon to run smoothly on Evitar!
At full-power graphics, the frame-rate is not perfect, but it is good enough to be completely playable. Turn off some of those fancy effects like dynamic shadows, however, and it finally runs like butter. So maybe I still need to find a few more corners to shave off to get it running smoothly even with shadows, but overall, the experiment has been a great success. Turns out I do still understand my constraints… phew.
What do you mean, I’m getting distracted and I have a main campaign to work on? It means you’re very likely to get a little bit of new content a lot sooner than expected, so shush!