Work Blog - JeffR

297 posts Page 12 of 30
Posts: 375
Joined: Mon Feb 09, 2015 7:48 pm
  by chriscalef » Wed Apr 19, 2017 7:28 am
Wow! Beautiful work you guys! OMG.

Jason Campbell
Posts: 271
Joined: Fri Feb 13, 2015 2:51 am
by Jason Campbell » Wed Apr 19, 2017 5:13 pm
This is exciting! Coming along well.
Posts: 25
Joined: Wed Feb 18, 2015 11:53 am
by Monkeychops » Wed Apr 19, 2017 9:01 pm
Looks really promising!
Posts: 75
Joined: Sat Feb 07, 2015 1:29 am
by HeadClot » Wed Apr 19, 2017 9:44 pm
I am very excited :)
Posts: 201
Joined: Wed Jul 01, 2015 10:33 am
by Chelaru » Thu Apr 27, 2017 9:10 am
Any news or progress on the latest topic?
Posts: 409
Joined: Tue Feb 03, 2015 9:50 pm
by Azaezel » Thu Apr 27, 2017 5:31 pm
Fork: ... R_w_probes

Sort out why point and spot lights are getting culled from reflections. - Current task

Address angle - Eyeballing potential SH Encoding there for smoother transitioning, and/or possibly riffing off of the compute4lights setup forward lit uses to light translucent objects.

Apply probe results to translucents.

Sort out why decals are playing up and tossing things into the wrong render bin.

Clean up Probe preview sphere reflection display. - currently not rendering itself, just what hits it.
Steering Committee
Posts: 840
Joined: Tue Feb 03, 2015 9:49 pm
by JeffR » Mon May 01, 2017 8:50 am
As Az said, some work to do yet, but it's getting there!

I'm not 100% satisfied with the workaround for the light-culling behavior, but it works for the moment, and we'll continue to review the render path to hammer out that kind of weirdness.

So, I rigged up a smaller-scale little testbed scene similar to the famed Cornell Box: one probe, uniquely textured walls, making it easier to see what's going on as we dial everything in. Then we can demo it off in the Sponza proper.

So, if we take the scene, with a purple spotlight and the sun and the ambient color completely removed, it looks like this:


But, if we add our magical [editor's note: not actually magic] reflection probes, having baked the scene from its position in the center of the box, it looks like this:


Suffice it to say, it's a HUGE improvement in the believability of the scene's lighting. The indirect lighting is subtle and directionally colored, unlike the flat result we get when we just have a solid ambient color:


The work to get all this playing nice has led to a few secondary improvements, and plans for further ones: namely, a common function that does a full scene render from a given transform and frustum out to any given render target. The probes use this for the baking process, but I plan to utilize it in the future for things like letting a camera component render out to a uniquely named render target, which you can do whatever you want with.

An example would be having a camera as an actual camera object that renders to a render target just by setting a regular field; a material on a TV elsewhere displays from that target, and boom, you have a CCTV system with basically no effort.
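For the curious, the idea can be sketched roughly like this in Python. Everything here is illustrative, not actual Torque3D API: `render_scene_to_target` and `render_pixel` are made-up names, and the "renderer" is faked with a deterministic fill so the plumbing is clear.

```python
# Hypothetical sketch of a "common render function": one routine that renders
# from any transform/frustum into any named render target. A probe bake and a
# CCTV camera could then share the same entry point.

render_targets = {}  # name -> 2D pixel buffer


def render_pixel(transform, frustum, x, y):
    # Stand-in for the real renderer: returns something deterministic.
    return (x + y) % 256


def render_scene_to_target(name, transform, frustum, width, height):
    """Render the scene from 'transform' through 'frustum' into target 'name'."""
    pixels = [[render_pixel(transform, frustum, x, y) for x in range(width)]
              for y in range(height)]
    render_targets[name] = pixels
    return pixels


# The CCTV case: a camera renders to a named target...
cctv = render_scene_to_target("CCTVFeed", transform=None, frustum=None,
                              width=4, height=4)
# ...and a TV material elsewhere would sample render_targets["CCTVFeed"].
```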

The other benefit is that it fixed point/spot lights not rendering properly in water reflections and the like.


Other stuff that got worked on this weekend includes getting box mode and projection MOST of the way working, so you can pick a sphere or a box for the probe influence area, and the math in box mode is a bit more accurate at accounting for parallaxing of the view. So that's nice. Also more general cleanup of the probes, and the preview sphere should be behaving properly in the next day or two. You can tell in the screenshots they're not really rendering right.
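The box-mode parallax correction mentioned above is the standard "box-projected cubemap" trick: instead of sampling the cubemap with the raw reflection direction, you intersect the reflection ray with the influence box and point from the probe's center to that hit point. Here's a rough sketch (illustrative names, axis-aligned box assumed, not engine code):

```python
# Sketch of box projection ("parallax-corrected cubemap") math for an
# axis-aligned influence box.

def box_project(pos, refl_dir, box_min, box_max, probe_pos):
    """Return the parallax-corrected cubemap lookup direction.
    Intersect the reflection ray from 'pos' with the influence box, then
    point from the probe's center to that intersection."""
    t = float("inf")
    for axis in range(3):
        d = refl_dir[axis]
        if d == 0.0:
            continue
        # Pick the box face the ray exits through on this axis.
        plane = box_max[axis] if d > 0.0 else box_min[axis]
        t = min(t, (plane - pos[axis]) / d)
    hit = [pos[i] + refl_dir[i] * t for i in range(3)]
    return [hit[i] - probe_pos[i] for i in range(3)]


# Reflecting straight up from the box center hits the top face directly
# above the probe, so the corrected direction is plain "up":
corrected = box_project(pos=[0.0, 0.0, 0.0], refl_dir=[0.0, 0.0, 1.0],
                        box_min=[-1.0, -1.0, -1.0], box_max=[1.0, 1.0, 1.0],
                        probe_pos=[0.0, 0.0, 0.0])
```

The payoff is that reflections stay anchored to the room's walls as the camera moves, instead of swimming the way an infinite-distance cubemap lookup does.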

I also started plotting out the work involved in updating the editor suite's layout and editability/expandability. I'd touched on it a little bit a while back with the dock-based windows and stuff. This is similar, but less jank, and it works with existing UI controls. The end goal should see the editors behaving the same, but way easier to edit, way easier to add custom editors to, and able to customize the layout (and eventually multiple window support for various editors' subwindows and such). So that'll be nice since, as-is, expanding or modifying the editor suite can be a pretty good pain in the butt.

I also started working on a Legacy Game Import Script for the BaseGame template.
While talking with Az, I realized that even with solid docs, people may find it a right pain to do the gruntwork of porting a project up. So, I started tests on a script where you can click a tool option in the editor, pick your project's directory, and it runs the import on it to move the scripts, art, sounds, etc. into a module in the new BaseGame template-based project.

This would go a long way in cutting down how much work is required in hopping up to the BaseGame template, which should make it far more palatable for people to do, and not put it off.
Steering Committee
Posts: 840
Joined: Tue Feb 03, 2015 9:49 pm
by JeffR » Mon May 08, 2017 6:35 am
More funness with probes. The number of bits that need touching is coming down, and once we get probes locked in, there isn't much left on the PBR to-do list, thankfully!

So, some stuff that got sorted out: fixed up a crash in the 'make lights/shadows render in reflections' change mentioned above, so that's nice. I also fixed that obnoxious haloing you can see with the probes in the screenshots, where you get a ghosting effect. That's dead.

We also got the preview sphere updated to show the cubemap directly, so it's a lot cleaner and provides more meaningful data on what it's applying to the scene, as well as fixing a stupid error that was jacking up the world normals.

They're still a little off, but MOSTLY correct.

I also took a few hours to build out my little Cornell box scene in Blender, load up the Sponza map and do some rough mapping of the materials, and then run offline renders in Cycles to get "ground truth" reference images, so we can compare the current output with offline raycast renders.

Obviously no realtime render will ever match 30-minute render times 1:1, but it gives us a good compass to dial things in and get as close as is reasonable. I'll be updating the reference scenes with proper texturing and more accurate lighting conditions later, but it's a good start.
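One way to use ground-truth images as a "compass" is a simple per-pixel error metric between the offline render and the realtime output. This is just an illustrative sketch of the idea (there's no indication this exact metric was used here); tiny hand-made "images" stand in for real screenshots:

```python
import math

def rmse(img_a, img_b):
    """Root-mean-square error between two equal-sized grayscale images,
    given as flat lists of 0-255 values. Lower = closer to ground truth."""
    assert len(img_a) == len(img_b)
    total = sum((a - b) ** 2 for a, b in zip(img_a, img_b))
    return math.sqrt(total / len(img_a))

# Toy 2x2 "images": an offline ground-truth render vs. the realtime output.
ground_truth = [10, 200, 30, 40]
realtime     = [12, 198, 33, 40]
err = rmse(ground_truth, realtime)  # small -> the renders agree closely
```

Tracking a number like this across tweaks makes "getting closer to the reference" measurable rather than eyeballed.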

You can poke through the whole shebang here but a few neat highlights in particular:

Cornell box scene, rendered in Cycles (Blender), nine-and-a-half-minute render time:

And the PBR branch, as of the time of testing:

Also did one of the Sponza. This is largely untextured, but it gives an idea of how the indirect lighting and such SHOULD behave. (Note: this sucker took 30 minutes to render!)

Now, part of what makes those scenes look so nice in the not-directly-lit areas is Global Illumination, or Indirect Lighting, or any number of a dozen other words and phrases used to describe the same phenomenon.

While I do plan to implement some GI-specific techniques (my heart is set on SVOTI, down the line), for the near term we'll be relying on the probes to provide that information to the scene, since they can bake cubemaps informing the lighting conditions around them.

However, you can't just rely on reflections, because indirect lighting can come from almost any direction, and usually does, especially when a surface isn't particularly reflective.

There are several techniques that can be used to encode this indirect lighting info, and for the initial foray, I'm working to implement 'Spherical Harmonics'. I'd suggest giving the subject a google if you're curious, it's pretty cool crap.
But the basics can be explained as: take the cubemap we baked, and then for each pixel, do a bunch of samples in different directions to encode the general ambient lighting that hits that pixel.

Then from there, we 'encode' that info into a small set of colors. In the shader, we can then, through the voodoo power of math, take the normal of a pixel and use it to decode directional color information out of that set. This yields surprisingly accurate directional info about the irradiance lighting around it. And the best part is it's very compact data-wise, so it's pretty efficient.
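For those who want to see the "set of colors" idea concretely, here's a rough single-channel sketch of order-2 spherical harmonics (9 coefficients): project directional samples into the SH basis, then decode with a surface normal. This is the textbook formulation, not the engine's actual implementation, and for brevity it skips the cosine-lobe convolution a real irradiance bake applies:

```python
import math

def sh_basis(d):
    """The 9 order-2 real SH basis functions evaluated at unit direction d."""
    x, y, z = d
    return [
        0.282095,                        # L0 (constant)
        0.488603 * y,                    # L1
        0.488603 * z,
        0.488603 * x,
        1.092548 * x * y,                # L2
        1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z,
        0.546274 * (x * x - y * y),
    ]

def sh_project(samples):
    """samples: list of (direction, value). Directions should cover the
    sphere roughly uniformly (here: the 6 cube-face axes)."""
    coeffs = [0.0] * 9
    weight = 4.0 * math.pi / len(samples)  # solid angle per sample
    for d, v in samples:
        for i, b in enumerate(sh_basis(d)):
            coeffs[i] += v * b * weight
    return coeffs

def sh_eval(coeffs, normal):
    """Decode the directional value for a given surface normal."""
    return sum(c * b for c, b in zip(coeffs, sh_basis(normal)))

# "Bake": bright sky above, dim everywhere else, sampled along cube faces.
dirs = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
samples = [(d, 1.0 if d[2] > 0 else 0.1) for d in dirs]
coeffs = sh_project(samples)

up_light = sh_eval(coeffs, (0, 0, 1))     # pixel facing the bright sky
down_light = sh_eval(coeffs, (0, 0, -1))  # pixel facing the dim floor
```

Nine floats per channel is all that gets stored, which is why this is so compact compared to keeping an extra cubemap around.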

SH is the middle-of-the-road method between accuracy and cost. The more accurate method is convolving our baked cubemap with some specific algorithms, saving the resulting 'irradiance maps' as well, and sampling those in the shader. More accurate, but the samples are usually more expensive.
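The convolution behind those irradiance maps is conceptually simple: for each output direction, sum all incoming radiance weighted by the cosine of the angle to that direction. A toy sketch (illustrative, one channel, a handful of samples standing in for a full cubemap):

```python
import math

def irradiance(samples, normal):
    """Cosine-weighted convolution of radiance samples for one normal.
    samples: list of (direction, radiance). A real bake does this for every
    texel of an output irradiance cubemap."""
    weight = 4.0 * math.pi / len(samples)  # solid angle per sample
    total = 0.0
    for d, v in samples:
        cos_term = max(0.0, sum(a * b for a, b in zip(normal, d)))
        total += v * cos_term * weight
    return total / math.pi  # standard Lambertian normalization

dirs = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
samples = [(d, 1.0 if d[2] > 0 else 0.0) for d in dirs]  # light only from above

up = irradiance(samples, (0, 0, 1))     # faces the light: fully lit
down = irradiance(samples, (0, 0, -1))  # faces away: receives nothing
```

The expense comes from doing this over the whole sphere for every texel of the output map, which is exactly what SH sidesteps by compressing the result into a few coefficients.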

Then the least expensive, but least accurate, method is to just spoof it: sample the lowest mip level of the reflection cubemap and apply that as our irradiance data. Not particularly accurate, as it's VERY directional compared to the other two, but it's very cheap because we're already using that cubemap anyway.

For giggles, I hacked in an approximation of THAT method, and it ended up looking like this. I did a bake with just the sun and the torch lights, and no other light sources. Ambient color in the shadows is pure black; any other light info is from our 'irradiance':


And the compare, without the probes:

And this one is particularly cool:

And the compare:

As you can see, the lighting on the walls, especially that teal light in the second shot, is 100% from the irradiance. The "light" on the wall is actually just an emissive material, but because it shows up super bright in the reflection bake, when used for irradiance it automagically applies lighting to the environment as well, more or less for free.

This was a very hacked-in test case though, and not general-consumption tier. The Spherical Harmonics method should provide even better presentation, and if that fails, we can look at irradiance maps or cleaning up the low-mip reflection maps; either way, we should get some pretty good results.

As an aside, that's almost 50 probes in the scene, all contributing reflection and irradiance info. As-is, it all costs ~3mspf, and I'm pretty sure it can be optimized a good bit further yet. So cost-wise, it's surprisingly low. Options like LOD'ing out probes when they're far away will further optimize larger spaces.
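That distance-based LOD idea can be sketched as a simple nearest-N selection per frame. Purely illustrative (names and the selection policy are assumptions, not the engine's actual scheme):

```python
def select_probes(probes, cam_pos, max_active):
    """Pick the closest 'max_active' probes to the camera; everything
    farther gets LOD'd out for this frame.
    probes: list of (name, position) tuples."""
    def dist2(entry):
        _, pos = entry
        return sum((a - b) ** 2 for a, b in zip(pos, cam_pos))
    return [name for name, _ in sorted(probes, key=dist2)[:max_active]]

probes = [("hall", (0, 0, 0)), ("roof", (0, 0, 50)), ("yard", (5, 0, 0))]
active = select_probes(probes, cam_pos=(1, 0, 0), max_active=2)
# Only the two nearby probes contribute; the distant roof probe is skipped.
```

A real version would also weight by each probe's influence radius, but the budget-the-nearest principle is the same.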

From there, I've already started looking into multi-baking, which will let you run the bake several times, compounding the results. That should effectively make the light 'bounce' and provide much more pronounced, smoother results.
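To see why rebaking compounds into "bounces": each pass re-lights surfaces using the previous pass's result, attenuated by surface reflectivity, so the series converges geometrically just like physical bounced light. A deliberately tiny 1D stand-in (not the actual bake code):

```python
def multi_bake(direct, albedo, passes):
    """Toy model: 'direct' lighting plus 'passes' bounces, each bounce
    scaled by the surface 'albedo' (fraction of light re-emitted)."""
    lit = direct
    for _ in range(passes):
        lit = direct + albedo * lit  # each bake adds one more bounce
    return lit

one_bounce = multi_bake(direct=1.0, albedo=0.5, passes=1)   # 1.5
many = multi_bake(direct=1.0, albedo=0.5, passes=20)        # approaches 2.0
```

The quick convergence (each extra pass contributes albedo^n less) is why a handful of bake passes is usually enough in practice.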

I've also started implementing a dynamic reflections mode. I did a test, and for a 32x32 res cubemap in debug, updating every 200ms cost about 1mspf. It should be appreciably less in release. It's not something where you'll slap dozens of dynamically updating reflection probes everywhere, but when used selectively, such as mounting a dynamically reflecting probe on the player so nearby stuff gets dynamic reflections, or for special cases like mirrors, you should be able to get dynamic reflections and make everything feel pretty sexy.
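The 200ms cadence above suggests a simple time-sliced scheduler: only rebake a probe when its interval has elapsed, and cap how many rebake per frame so several dynamic probes never pile up in one frame. A sketch under those assumptions (names and policy are mine, not the engine's):

```python
def probes_to_update(probes, now_ms, interval_ms=200, budget=1):
    """Return up to 'budget' probe names whose rebake interval has elapsed,
    oldest first, spreading the cubemap-render cost across frames.
    probes: dict of name -> last_update_ms."""
    due = [name for name, t in probes.items() if now_ms - t >= interval_ms]
    due.sort(key=lambda name: probes[name])  # most-stale probes first
    return due[:budget]

probes = {"player": 0, "mirror": 150}
to_bake = probes_to_update(probes, now_ms=220)
# Only the player probe is due (220ms stale); the mirror rebakes later.
```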

I've also started tests on reflections applying to translucent materials, so stuff like glass will actually FEEL like glass.

All in all, a lot of good stuff, and it's really started to come together!
Posts: 201
Joined: Wed Jul 01, 2015 10:33 am
by Chelaru » Tue May 09, 2017 9:04 am
"I'd suggest giving the subject a google if you're curious, it's pretty cool crap." Nice one
Posts: 37
Joined: Thu Jun 23, 2016 12:02 pm
by damik » Tue May 09, 2017 10:21 am
coooool :D