
Blog 745: Throwing Shade(rs)

Pretty much everything I do in Unity, I compare to what I could do in Warcraft III. One of the most surprising (to me at least) omissions was the ability to make custom materials. That is, being able to layer, blend and animate textures to create a final look for your characters.

In this strange modern world, such things require one to make shaders. That means a whole heap of extra programming work, and some pretty esoteric programming at that.

Then they invented lovely graph-based shader builders.

This story begins about two years ago now when I bought Shader Forge. Things were going well; I had made a single mega-shader that included all the special effects layers I needed for my mechs; I had made some fancy wavering effects for exhausts and flames; I had even introduced some triplanar mapping so I could dispense with troublesome UV mapping for terrain objects.

Unfortunately, I had made a teensy little mistake in purchasing Shader Forge — it had been, or was soon to be, abandoned by its creator. They had been approached by Unity to join the fold as an official tool, but chose to make games rather than tools and declined the offer. Although Shader Forge kept on trucking for a while, with the release of Unity 2018 it stopped working — along with the shaders that had been forged upon it. D’oh.

Everything except a few particles uses a custom shader, to ensure consistency of lighting and appearance across every single part of the game.

Almost too conveniently, Unity released their own graph-based shader editor with 2018.

Unfortunately, I couldn’t get Unity’s official shader editor to work, and despite the marketing push, the package manager still claimed it to be “Experimental”. It also only works with their new Scriptable Render Pipeline stuff which, surprise surprise, is also still highly experimental and unstable. I’m vaguely interested in some of the opportunities this will present in future, but I’m no early adopter — I’m looking for simple and stable answers so I can get on with building the actual game.

The alternative on everybody’s lips was Amplify Shader Editor, which is a lot like Shader Forge but as if, well, it had a year or more of continued development under its belt.

One of the problems I was starting to have with Shader Forge anyway was the growing complexity of my mega-shader. I’ve got it in my head that every status effect must be represented visually on your mech — you should be able to tell looking at the world itself what’s going on, without reference to abstract lists of icons (perhaps this is another hangover from Warcraft III, where every single spell effect was represented by a unique lightshow).

The classic EMP lightning overlay is perhaps the most important effect but is by no means the end of the line.

At the moment, Exon is extremely processor-bound, mainly because I do a lot of silly things in inefficient ways to make it easier for me to comprehend and develop content. Even so, owing to my use of low-poly models and simple lighting, the graphics card barely knows the game exists. This means I need to shift as much work as possible onto the graphics card — that is, by adding more stuff to shaders.

While piling more work into the shaders is the right thing to do, Shader Forge did not have a way to decompose shaders into sub-components. By the time I had colour, camouflage and cubemap overlays piped into wreckage overlays with glowing lamps and lightning and decay clipping, I was somewhat afraid of adding more to that tangle.

Luckily, Amplify lets you build your own custom nodes. I’ve opted to make one custom node for each effect overlay I’ll need, piping in the output of the previous layers to eventually arrive at the final look.
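To give a flavour of what one of these overlay nodes actually does, here’s a minimal sketch in plain HLSL rather than graph form; the function and parameter names are mine for illustration, not the real internals of the RDZ nodes:

    // One effect layer: take the colour produced by the layers before it,
    // apply this layer's damage, and pass the result on to the next layer.
    float3 ApplyWreckLayer(float3 baseColour, float3 charColour,
                           float burnMask, float wreckAmount)
    {
        // The mask texture says where damage shows first; the wreck amount
        // is driven from script as the mech gets beaten up.
        float burn = saturate(burnMask * wreckAmount);

        // Darken towards the charred colour where the burn takes hold.
        return lerp(baseColour, charColour, burn);
    }

Chain a handful of functions like that together (colour, camouflage, wreck, temperature) and each link in the graph stays a single node, however ugly its insides.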

At this point I can drop an RDZWreck node into any shader, and anything using it will be able to get beaten and burnt up like everything else, complete with built-in properties for the material editor and my code to twiddle. This suggests that I don’t actually need a single mega-shader anymore, and can start to make more single-purpose shaders without losing the ability to update common effects from a single place.

You do NOT want to see what’s inside the RDZTemperature node.

The most fun part of this enterprise has been terminology. Amplify Shader Editor uses slightly different words for things that existed in Shader Forge, so I’ve been having a whale of a time trying to map old concepts that I was just getting to grips with to the new world — while simultaneously upgrading where easier/fancier options are now available.

Actually, that’s a lie — the most fun part has been the things that just don’t work the same way. For example, I was using a combination of depth blend and fresnel to make a faux-volumetric giant explosion glow that’s refusing to come together quite right under the new regime. However, this effect isn’t in use anywhere and won’t be any time soon, so hopefully once I actually need it I’ll have got over the hump.
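For the record, the idea behind that glow was roughly the following shape; this is a loose HLSL sketch of the general approach with placeholder parameter names, not the original graph:

    // Faux-volumetric glow: fade out where geometry sits close behind the
    // glow mesh (depth blend) and towards its silhouette (a facing term,
    // i.e. one minus the classic fresnel). One plausible arrangement only.
    float FauxVolumetricGlow(float sceneDepth, float fragmentDepth,
                             float3 worldNormal, float3 viewDir,
                             float fadeDistance, float falloffPower)
    {
        // Soft-particle style fade: 0 where the glow touches other geometry,
        // 1 once there is at least fadeDistance of space behind it.
        float depthFade = saturate((sceneDepth - fragmentDepth) / fadeDistance);

        // Brightest where the surface faces the camera (the "thick" middle
        // of the volume), fading out towards the edges.
        float facing = saturate(dot(normalize(worldNormal), normalize(viewDir)));

        return depthFade * pow(facing, falloffPower);
    }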

I also discovered that their implementation of the classic “Overlay” blend mode isn’t quite right (I really should raise a bug report…), but I found the algorithm and rebuilt it as a custom node with a couple of extra bells and whistles that I use all the time.
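For reference, the textbook per-channel Overlay formula goes something like this in HLSL (minus my extra bells and whistles):

    // Standard Overlay blend: multiply where the base is darker than
    // mid-grey, screen where it is brighter.
    float3 OverlayBlend(float3 base, float3 blend)
    {
        float3 multiplied = 2.0 * base * blend;
        float3 screened = 1.0 - 2.0 * (1.0 - base) * (1.0 - blend);

        // Choose per channel depending on which side of 0.5 the base sits.
        return lerp(multiplied, screened, step(0.5, base));
    }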

So it’s another slightly annoying learning curve I could have done without, but overall it seems to have been the right move. Huzzah for shader editors!
