Been a bit, but that's sorta to be expected with all the holiday madness around this time of year. Hopefully your holidays have gone swimmingly thus far.
So, workblog update whatevernumberwe'reon, go!
Much like fusion, PBR is always just a little ways away. Unlike fusion, we're actually making great progress (heyooo)
We've gotten a ton of progress locked in, helped in part by our decision to fold the lighting buffers straight into the color target, doing simple additive blending when rendering the lights and probes. This is similar to how other engines do it, and while it does *technically* lose us a little flexibility in the render path, it WILDLY simplifies the overall behavior: streamlined render steps, fewer render targets to juggle, a lighter GBuffer, and less raw pixel crunching to combine everything after the fact. All of which not only simplifies the behavior, as said, but will end up making rendering more efficient.
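To illustrate the idea (names here are purely illustrative, not the actual engine API): each light or probe pass just adds its contribution on top of whatever is already in the color target, which is what additive blending (src=ONE, dst=ONE) does in hardware. A minimal CPU-side sketch:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Hypothetical sketch: folding light contributions straight into the color
// target with additive blending, instead of accumulating them in separate
// lighting buffers and combining them afterwards.
struct Color { float r, g, b; };

// Each light/probe pass adds its contribution on top of what's already in
// the color target -- equivalent to rendering with ONE/ONE blending.
Color accumulateLights(Color target, const std::vector<Color>& lightContribs)
{
    for (const Color& c : lightContribs)
    {
        target.r += c.r;
        target.g += c.g;
        target.b += c.b;
    }
    return target; // may exceed 1.0 -- that's fine, HDR handles it later
}
```

The payoff is that there is no "combine" step at all: by the time the last light is drawn, the color target already holds the final lit result.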
This also let us do a good bit of standardizing/simplifying of the general math for surface/light shader handling, so we'll be able to go in and do a solid trimdown effort on Shadergen to make that easier to deal with as well.
We can also make up the slight loss in flexibility by ensuring we can inject post effects between the GBuffer fill and the probe/light rendering. This will be important for wetness shaders, for example, which need to modify the roughness channel before we apply reflections.
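As a rough sketch of that injection point (all names hypothetical, not engine API): effects registered for the pre-light hook get a chance to rewrite GBuffer channels before any probe or light touches them. A wetness effect, for instance, just clamps roughness down so wet surfaces come out shinier:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <functional>
#include <vector>

// Hypothetical sketch of injecting post effects between the GBuffer fill and
// the probe/light passes.
struct GBufferPixel { float roughness; };

using GBufferEffect = std::function<void(GBufferPixel&)>;

// Run every registered pre-light effect over the GBuffer before lighting.
void applyPreLightEffects(std::vector<GBufferPixel>& gbuffer,
                          const std::vector<GBufferEffect>& effects)
{
    for (GBufferPixel& px : gbuffer)
        for (const GBufferEffect& fx : effects)
            fx(px);
}

// Example effect: wetness clamps roughness down toward a "wet" value,
// so the reflection pass that follows sees a glossier surface.
GBufferEffect makeWetnessEffect(float wetRoughness)
{
    return [wetRoughness](GBufferPixel& px) {
        px.roughness = std::min(px.roughness, wetRoughness);
    };
}
```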
It also let us tackle something we've been plotting for a while, which is a refactor of how glow/emissive works. In the old way, emissive and glow were handled VERY differently. Emissive was basically an early bailout for pixels with that material flag set, so we pretty much just aborted out without dealing with any lighting information.
Glow, however, was done by adding those pixels into a dedicated target, which was then blurred in a post process.
It...worked, but anyone that's used glow will know it was kinda ugly: the bloom/blur wasn't particularly consistent, was prone to jittering, etc. Not ideal. While the changes done right this second don't directly fix several of those issues, they skip the overhead of a second target and open up the option of much more natural blooms, which should integrate with the scene FAR better.
To preface how we do this, let's first cover how we do lighting in general: we fill our GBuffer, which holds our color, normals, metalness/roughness/AO, and material flags. When we go to do reflections/lights, we render those into the color target instead of completely separate ones, doing all the lighting and additive math right then and there. HDR and bloom work off this by detecting any pixels over the normal 0-1 range and going 'this is so bright we need to do HDR stuff to it'.
So we capitalize on that for the glow flag. When we write the surface pixel, we just give it a modifier, and when we go to write that pixel's color, we multiply by the modifier, which pushes it "overbright" into the HDR range automatically. Then we just let the HDR/bloom deal with it on its own instead of needing a completely separate step.
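The glow path above can be sketched in a few lines (again, illustrative names only, not the actual shader code): the glow modifier multiplies the surface color past 1.0, and the bloom pass simply treats any over-range pixel as bloom-worthy, same as any other bright light:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch of glow-as-overbright. Glow surfaces multiply their
// output color by a modifier so they land above the 0-1 range; the HDR/bloom
// pass then picks them up like any other bright pixel.
struct Rgb { float r, g, b; };

// Writing a glowing surface pixel: scale the color by the glow modifier.
Rgb writeSurfaceColor(Rgb base, float glowModifier)
{
    return { base.r * glowModifier, base.g * glowModifier, base.b * glowModifier };
}

// The bloom pass only cares whether any channel exceeds the normal 0-1 range.
bool needsBloom(const Rgb& c)
{
    return c.r > 1.0f || c.g > 1.0f || c.b > 1.0f;
}
```

So a mid-gray glow surface with a modifier of 4 ends up well into the HDR range and blooms naturally, with no dedicated glow target or separate blur pass involved.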
Once again, this standardizes, simplifies, and will ultimately save some render overhead. We're going to largely replicate the current behavior for now, so don't expect a sudden explosion of TURBOBLOOM5000 quality glow, but it will be far easier to go in later and do much more refined bloom models and have it Just Work.
So all in all, progress is going very, very well. We have some bits that are still in todo, some examples are:
Point lights are still being uppity and need to be corrected.
Probe blending needs fixing up; it doesn't properly order the probes by importance yet.
Various cleanup bits, like removing redundant/old code.
But a majority of our todo list at this point falls to cleanup: removing old/redundant bits, removing commented code, and removing confusing bits of interface that cause problems. So our list is rapidly growing short, which is excellent.
For some money shots, to see how it all comes together in action:
Also, for all of you out there that have Allegorithmic's Substance tools, we've been expanding the old T3D material converter substance graph to a) be more robust, and b) have integrated functionality for exporting terrain materials from regular PBR materials. So it should drastically help standardize the workflow with materials AND terrain.
Entity/Components and Asset Pipeline
I had a few days off for the holidays so I capitalized on it aggressively and crunched a good deal on the general pipeline.
I made solid headway on some of the component-ification of the existing game classes, starting with StaticShapeObject(TSStatic), then jumping to one of the big boys, the PlayerObject(Player). Gotta fix up some bits with the PlayerObject, mainly the PlayerAnimationComponent, which implements the action animation system (but with improvements utilizing the tags system like I was talking about with the motion matching stuff earlier), arm/look animations, etc.
It also has a default InteractComponent, with a companion InteractableComponent to be put on things that are, well, interactable.
I'll be doing various videos that show off using such things.
I also made a ton of progress refining the general Asset Browser and asset pipeline stuffs. Tons of fixes, standardizing of code, streamlining some bits, and fleshing out bits I've been planning for a good while now. It's very near its initial major release; I'm hoping to start peeling off parts of it next weekend for PR'ing.
For now, a video showing off the basic flow that you'd follow setting up a custom PlayerObject. I'll be doing a much more full start-to-playable video probably next weekend to act as a sort of demo/tutorial.
As well as this video showing the initial compound image/material importing:
Remember the talk I was doing about scenes? I also jumped on an initial implementation of that. Lots to do with it yet of course, but the basic implementation is largely there. It just needs full integration into the editor.
One of the things that motivated this was the realization that I could implement a 'Bake Level Geometry' function similar to our Export Mesh action that bakes and exports the selected objects into a new COLLADA file, only instead we'd take all the static meshes in a level and bake them down (separated by zones) into as few meshes as possible, store them off into a separate subScene, and then move the original objects into a different subscene.
The idea being that you'd bake all your static objects into compacted, efficient meshes which would drastically cut rendering overhead for stuff that never changes - buildings, decorative background stuff, etc. This subscene is loaded with the regular mission objects when you play the level, yielding lower load times and faster render times.
If you open the editor, however, the baked geometry subscene is unloaded, and the static objects "editor" subscene is loaded, allowing you to edit the individual objects as normal without manually fussing with anything. When you're happy, you rebake and let the subscenes do the logical work for you.
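The workflow above can be sketched roughly like this (all names here are hypothetical, not the actual engine classes): static meshes are grouped by zone so each group can be merged into one baked mesh, and the level just swaps which subscene is loaded depending on whether you're playing or editing:

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Hypothetical sketch of the 'Bake Level Geometry' idea.
struct StaticMesh { std::string name; int zoneId; };

// Group static meshes by zone; each group would then be merged into a
// single baked mesh and written into the baked-geometry subscene.
std::map<int, std::vector<StaticMesh>> groupByZone(const std::vector<StaticMesh>& meshes)
{
    std::map<int, std::vector<StaticMesh>> zones;
    for (const StaticMesh& m : meshes)
        zones[m.zoneId].push_back(m);
    return zones;
}

struct Level { bool bakedSubsceneLoaded = true; };

// Entering the editor swaps the baked geometry out for the editable
// originals; leaving it (after a rebake) swaps them back.
void setEditorMode(Level& level, bool editing)
{
    level.bakedSubsceneLoaded = !editing;
}
```

The key design point is that the swap is purely a subscene load/unload, so neither play mode nor edit mode requires manually fussing with the objects themselves.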
Obviously doesn't help in scenes where most stuff is dynamic or procedural, but a vast majority of levels have static geometry backdrops and it'd go a LONG way to improve loadtimes and performance without hurting the actual editing workflow.
From there, we can start looking into cool stuff like streaming levels for open world games and other crazy shenanigans.
Other totally cool stuff
These are a few especially rocking things that came up in the Discord and have been posted here, but I think they need more attention because of how cool they are:
OpenAL EAX support:
@ marauder2k9 has done a ton of work and got EAX behavior enabled in OpenAL, allowing reverb and other related effects to be applied to zones, which opens up a ton of options for sound that weren't easy to do before.
Meanwhile, @ OTHGMars got the latest of the OpenVR library working, complete with hand skeletons and tracked objects.
Overall, things are looking fairly good. A few bits may well end up getting pushed back to 4.01 or whatnot, but part of the point of the work being done right now is a solid foundation with a healthy dose of new features. For example, I don't know how much editor refactor work we'll get in by the end of the year (though I do want to get a few bits sorted for it), but part of what I've been eyeballing while working on the AssetBrowser and Scenes stuff is how we can module-ify and do iterative upgrades of the different specific tools as we go.
Post 4.0 we can definitely do a much more rapid release schedule focusing on a particular chunk to work on, so for anything not implemented alongside the big stuff like the assets work or PBR, we can quickly do follow-up releases every month, every two, etc., while still keeping an eye toward an overall arc, such as 4.2 being the planned target for all the graphics pipeline/new-GFX work.
I'll be updating the roadmap thread (I've been lagging on that a bit, admittedly) in the next few days, but while a bit behind schedule, the big stuff is trucking along fairly well.
So yeah, that's the bulk of what's been worked on recently. Quite a lot in a relatively short period of time, and as PBR is finished out and the asset stuff goes in, I imagine the general pace will accelerate. Exciting stuff!
As always, feel free to comment, question and engage because this stuff is super cool, and I enjoy talking about it/explaining it