Work Blog - JeffR

456 posts Page 17 of 46
Posts: 957
Joined: Tue Feb 03, 2015 9:49 pm
by JeffR » Thu Sep 14, 2017 9:06 am
Hey guys! Updates, woo!

Unsurprisingly, a fair bit of work has been happening. Beyond the regular spread of bugfixes and improvements in PRs and the like, we've got a good bit of work continuing on the forward-looking stuff.

We think we've got the PBR math pretty well dialed in; it's just a matter of getting it rolled together from the different WIP builds so we can finalize things before finally merging that sucker into stock, so that's very exciting.

On my R&D side of things, I've got a fair number of bits I've worked on since the last update. I've been further improving/refining the new assets/importing stuff. There are now asset types for levels, post effects and scripts. I've also implemented an addition to the module definitions to auto-load certain asset types when the module is initialized.
Normally, an asset is only actually initialized/loaded when it's referenced - for example, an object that renders a shape references a shape asset; that reference informs the asset it's in use, and if it's the first reference, the init/loading stuff runs.

This is largely fine, but there are some cases where we need stuff loaded up-front. Stuff like scripts, levels, post effects and a few other special cases where we want that kind of thing in effect all the time, as opposed to only when a mission is actually loaded. This conveniently also removes a lot of the extra script fluff for setting stuff up, such as removing the need for a lot of the exec files that only existed to exec other script files, etc.
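To make the reference-driven loading concrete, here's a minimal C++ sketch of the idea: an asset sits dormant until something references it, and the first reference triggers the actual init/load work. All names here are illustrative, not the real T3D AssetManager API.

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Hypothetical sketch of reference-driven asset loading.
struct Asset {
    int refCount = 0;
    bool loaded = false;
};

class AssetRegistry {
    std::unordered_map<std::string, Asset> mAssets;
public:
    // Declared assets exist but are not loaded yet.
    void declareAsset(const std::string& id) { mAssets[id]; }

    // Called when an object references the asset; the first reference
    // triggers the actual init/load work.
    Asset& acquire(const std::string& id) {
        Asset& a = mAssets.at(id);
        if (a.refCount++ == 0)
            a.loaded = true; // stand-in for the real init/loading step
        return a;
    }
};
```

The autoload addition described above would then just be a pass that calls `acquire` on selected asset types at module init, instead of waiting for a gameplay object to do it.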

The level asset is nice because it means that as long as it exists as a valid asset for a module, the level selection screen can automagically detect it, regardless of what module it's in. This gives a lot of flexibility in how to handle modules and packages, and will make doing stuff like demo scenes a lot easier since it means everything can stay self-contained in its module if you want.

I also started - piggybacking on the asset importing work - initial work on a refactor of the TSShape stuff. While it most certainly still works, I don't know if you've looked at it, but it's a pretty spaghetti-tastic mess of code. Rendering any given shape from one of the game classes takes at least 7 function calls to make it happen, which is pretty ridiculous. I have basic mesh rendering working, and the plan is that by the end it replaces the TSShape stuff with all functionality intact (and room to add new features) while being appreciably cleaner.

I also started work on a Tileset Object Editor tool. This utilizes a bit of a rework of how new tools are done in the engine. The current way, you basically have to write an entire GUI control that replicates the general functionality of the World Editor, but then also does its own special stuff. That leads to a lot of unneeded complexity and duplication. Then on the script side you end up doing a lot of swapping back and forth of GUI controls, and we've seen it break before, with some swaps not happening so editors get 'stuck' and the like.

With this, you enable a much smaller class object derived from the 'EditorTool' class. When enabled, it pretty much just intercepts the inputs from the regular World Editor GUI control and piggybacks off its rendering. This keeps the tool's code streamlined and to-purpose, and simplifies the script side as well without losing any functionality.
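The intercept-and-piggyback pattern can be sketched like this in C++ (names like `TileStampTool` and the event struct are made up for illustration; this is not the actual engine interface): the World Editor forwards input to the active tool first, and only handles it itself if the tool declines.

```cpp
#include <cassert>

// Illustrative sketch of the lightweight editor-tool idea.
struct MouseEvent { int x, y; };

class EditorTool {
public:
    virtual ~EditorTool() = default;
    // Return true if the tool consumed the event; otherwise the
    // World Editor handles it as usual.
    virtual bool onMouseDown(const MouseEvent&) { return false; }
};

class TileStampTool : public EditorTool {
public:
    int stampsPlaced = 0;
    bool onMouseDown(const MouseEvent&) override {
        ++stampsPlaced; // place a tile at the cursor's grid cell
        return true;    // consume the click
    }
};

// The editor dispatches to the active tool first, if there is one.
bool dispatchMouseDown(EditorTool* active, const MouseEvent& ev) {
    return active && active->onMouseDown(ev);
}
```

The payoff is that a new tool is just the small derived class, rather than a full clone of the editor GUI control.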

So, the Tileset Object Editor is something I'm doing up for my project, which uses a lot of factory/industrial space stuff, so tile meshes were the logical choice. But rather than having to do a ton of work placing and grid-snapping everything manually, I figured I'd draft up a relatively simple tool to streamline that.

Here's what it currently looks like:

From here, the plan is to get tile sets working. Think of how you can have your Forest Items in the forest tool, and then make brushes of multiple items you can rapidly paint down. My idea is you could paint down, say, several unique floor tiles or wall tiles, select the set, and save that set. Then you could either place them down like a stamp, where it'll fill in as per the tileset's arrangement, paint a random selection from the tileset (which would help break stuff up to avoid obvious repetition), or use a 'fill' paint.

It's pretty project agnostic, ultimately, as long as you use tile meshes to build your stuff. So that should end up being pretty useful :)
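For a rough feel of the stamp-vs-random distinction, here's a tiny C++ sketch over a 1D strip of cells (the real tool would work on a 3D grid; the tile IDs, modes, and function are all made up for illustration):

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// Hedged sketch of the paint modes described above.
enum class PaintMode { Stamp, Random };

std::vector<int> paint(const std::vector<int>& tileset, std::size_t cells,
                       PaintMode mode, unsigned seed = 0)
{
    std::srand(seed);
    std::vector<int> out(cells);
    for (std::size_t i = 0; i < cells; ++i) {
        if (mode == PaintMode::Stamp)
            // Repeat the set in its saved arrangement.
            out[i] = tileset[i % tileset.size()];
        else
            // Random pick breaks up visible repetition.
            out[i] = tileset[std::rand() % tileset.size()];
    }
    return out;
}
```

A 'fill' paint would apply the same per-cell selection over a flood-filled region instead of a brush stroke.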

Speaking of editor stuff, I also did some modifications to the VariableInspector GUI object. I'll have to specifically demo that off, but the idea is it'll let you do arbitrary inspectors. The current Inspector control used in the editor lets you select an object, and it'll populate with the fields that object has exposed for editing purposes. Awesome and effective, but limited.
This lets you add fields yourself via script, can be pointed to an object's fields to be updated, or can point to global variables. It'll try and find matching engine-defined types, but it also lets you implement completely custom field types (if you remember the fancy-looking Material Slot field on the MeshComponent that had the preview image and stuff, it lets you do things like that).
You've actually seen this in effect without realizing it in my asset import videos, where you can see the Asset Import Config editor control having that nice list of options to edit.
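The core idea - inspector rows that aren't tied to an object's exposed fields - can be sketched like this in C++. To be clear, these names are not the real VariableInspector API; this is just an assumed shape of the concept: each row is a name, a type hint for picking the editor widget, and get/set callbacks that can target an object field, a global variable, or anything else.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <utility>
#include <vector>

// Illustrative sketch of an arbitrary-field inspector.
struct InspectorField {
    std::string name;
    std::string typeHint; // e.g. "float", or "MaterialSlot" for a custom widget
    std::function<std::string()> get;
    std::function<void(const std::string&)> set;
};

class VariableInspectorSketch {
    std::vector<InspectorField> mFields;
public:
    void addField(InspectorField f) { mFields.push_back(std::move(f)); }

    std::string value(const std::string& name) const {
        for (const auto& f : mFields)
            if (f.name == name) return f.get();
        return "";
    }
    void setValue(const std::string& name, const std::string& v) {
        for (auto& f : mFields)
            if (f.name == name) { f.set(v); return; }
    }
};
```

Because the callbacks are arbitrary, the same control can inspect an object, a global, or a config structure like the Asset Import Config mentioned above.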

That'll be getting PR'd soon, and with that, it'll let us update a lot of special-snowflake GUI controls to have a simple standard layout that's just as powerful but far, far easier to work with. The first things on the chopping block to try are the madness that is the material editor's GUI, and the editor settings.

Anywho, I think that's everything for the moment. If I missed anything I'll be sure to update this :)

Later guys!
Posts: 345
Joined: Tue Feb 03, 2015 10:30 pm
by Steve_Yorkshire » Fri Sep 15, 2017 4:22 pm
Looks like "Son Of BPS! The Revenge Of BSP! The BSPening!" ;)
Posts: 28
Joined: Sun Apr 05, 2015 6:45 pm
by Julius » Fri Sep 15, 2017 5:36 pm
Great update!

Regarding the tile editor: Right now it seems quite custom tailored to 2D floors.
But what about quickly assembling modular 3D assets such as: ... -3d-assets
Just examples...
Posts: 1636
Joined: Sun Feb 08, 2015 1:51 am
by Duion » Fri Sep 15, 2017 8:33 pm
You can just use ramp tiles, I don't see a problem there.
Posts: 444
Joined: Sat Feb 07, 2015 11:37 pm
by Johxz » Sun Sep 17, 2017 4:31 am
good work @ JeffR

would be nice to merge this as well

@ Julius you mean something like this?
Posts: 87
Joined: Sat Feb 07, 2015 1:29 am
by HeadClot » Tue Sep 26, 2017 1:32 am
Hey @ JeffR - I really like what you are doing with the Tile Editor.
From here, the plan is to get tile sets working. Think like how you can have your Forest Items in the forest tool, and then can make brushes of multiple items you can rapidly paint down. My idea is you could paint down say, several unique floor tiles, or wall tiles, select the set, and save that set. Then you could either place them down like a stamp, where it'll fill in as per the tileset's arrangement, a random selection from the tileset(which would help break stuff up to avoid obvious repetition) or a 'fill' paint.
I Would recommend looking at the Dota 2 mod tools which you can get for free on steam. They do something similar to this. :) Here is a video that may inspire you :)
Posts: 957
Joined: Tue Feb 03, 2015 9:49 pm
by JeffR » Wed Sep 27, 2017 6:01 pm
@ HeadClot

Oh hey, yeah, that's fairly similar to what I was thinking. I was also figuring on having the ability to 'stamp' several pieces down at once rather than just the paint mode, but otherwise that definitely is along the lines. Good point of reference :D
Posts: 957
Joined: Tue Feb 03, 2015 9:49 pm
by JeffR » Mon Oct 16, 2017 7:32 am
Hey guys!

Time for another update!

Predictably, lots of stuff has been worked on since the last update, it's hard to know where to begin, haha.

Well, first and foremost, let's get it out of the way: thanks to the excellent efforts of @ Bloodknight getting a nice, clean PR put together, AFX has been officially merged into devhead. This brings with it a number of things, such as datablock caching, effects improvements and support for a ton of neat little utility things. I'd recommend poking through the commit log for everything, but I can articulate a major bulletpoint list on it later.

A number of PRs got rolled in, including support for drag-and-dropping files (which will be big for the Asset Browser and drag-n-drop importing), and an update to the (as far as I can tell, unused) Variable Inspector and supporting elements. That gives you all the usefulness of the Inspector class, but with the ability to assign and integrate arbitrary fields and field types without being restricted to inspecting objects or to engine types only.

This will get WIDE usage going forward in order to standardize the editor interface.

Also merged in is the Editor Tools implementation, which will allow much lighter-weight editor tool creation without having to completely replicate the entire editor GUI class. It's cleaner, and it'll be more stable and simpler to work with going forward.

As a fun bit of trivia, we've passed 3,000 commits! Woah!

Recently PR'd but not merged in yet comes a big spread of stuff out of my RnD work I've been posting about in this workblog.

To go over some of it:

Convex Proxies. I've talked about them before, but it's easier to see how quick and easy it is to modify zones, portals and occluders with this via this video:

Several small improvements and fixes for stuff like the RotationF class, the ability to clear a NetObject's scope-always status, etc.

Also PR'd a bunch of tweaks and fixes for the Entity and Component classes I've been brewing up, as well as removing the Torque_Experimental flag, so Entities and Components will be considered a standard part of the engine once that's merged in. The main game classes are still the main implementation and aren't deprecated just yet, but it's closing in now :)

Another neat tidbit that I have a PR for is autoloaded assets. As-is, the asset system (with modules) is reference-based. When you, say, have an object that references a ShapeAsset, the asset's ref counter increments, and if it's the first time it's referenced, it does the loading of said asset.

But for some stuff, we want it loaded ASAP, and pretty much always want it loaded. GUIs, Components, etc.

So, the PR implements autoloaded assets. You add a definition to the module's file for the types it autoloads, and after it scans for assets, it will automatically reference and load the assets of those types, skipping having to manually execute stuff yourself if you need it ahead of time.
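A minimal sketch of that autoload pass, assuming made-up names (this is not the actual module/asset code, just the shape of the idea): the module lists asset types to load up-front, and after the scan, every asset of those types gets referenced and loaded immediately.

```cpp
#include <cassert>
#include <set>
#include <string>
#include <vector>

// Hypothetical sketch of the module autoload pass.
struct AssetDef { std::string id, type; bool loaded = false; };

struct Module {
    std::set<std::string> autoloadTypes; // e.g. {"ScriptAsset", "PostEffectAsset"}
    std::vector<AssetDef> assets;        // filled by the asset scan

    // Run after scanning: force-load every asset of an autoloaded type.
    int runAutoload() {
        int count = 0;
        for (auto& a : assets)
            if (autoloadTypes.count(a.type)) {
                a.loaded = true; // stand-in for referencing + loading the asset
                ++count;
            }
        return count;
    }
};
```

Everything else stays lazy, so the normal reference-counted behavior is untouched for types not in the autoload list.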

I have more things yet to get PR'd, but that's the stuff sitting in the queue right now. Feel free to grab that stuff and give it some testing so it can go in!

On the horizon is the initial implementation of the Asset Browser, several new asset types, and my Convex editor improvements.

For new RnD, I started work on "Spatial Heatmaps".
If you're not familiar with the idea of heatmaps, this would be an example of a heatmap in classic game design:


In this case, the above image is from Halo 3, and is a heatmap of kill information. Namely, it represents every place a kill has happened in that map. This sort of data is incredibly useful in adjusting and correcting for map or gameplay design flaws. So, while talking with @ Azaezel I came up with a way to do something similar. However, unlike ye olde maps, we often have to worry about 3D spaces, and we also want data beyond just 'someone died here'.

So, Spatial Heatmaps. This will let us not only record data points as the game plays for later review, but do it in 3D space, and also enter data points under any category tag so we can get extreme granularity if we want.

So to see the WIP of it in action:

As you can see, any time we fire our gun, it plots down a point in our spacial grid. If we shoot a bunch of times in the same spot, the point gets 'hotter'. The idea is we'll be able to track player movement, AI movement, weapons fired, players killed, where grenades are thrown, where sniper rifles are fired, etc, etc.

The ultimate goal is that when you're working on your maps and gameplay design, you can record this actual play data as feedback to know how it's influencing the play. It should work in multiplayer just fine as well.
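The core data structure behind this kind of thing can be sketched quite compactly: quantize a 3D position to a grid cell, and keep a hit counter per (cell, category tag) pair. The names and the cell size below are my assumptions for illustration, not the actual implementation.

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <string>
#include <tuple>
#include <utility>

// Minimal sketch of a tagged 3D spatial heatmap.
class SpatialHeatmap {
    using Cell = std::tuple<int, int, int>;
    std::map<std::pair<Cell, std::string>, int> mCounts;
    float mCellSize;

    // Quantize a world position to its grid cell.
    Cell cellFor(float x, float y, float z) const {
        return { (int)std::floor(x / mCellSize),
                 (int)std::floor(y / mCellSize),
                 (int)std::floor(z / mCellSize) };
    }
public:
    explicit SpatialHeatmap(float cellSize) : mCellSize(cellSize) {}

    // Record one data point under a category tag ("weaponFired", etc.).
    void record(float x, float y, float z, const std::string& tag) {
        ++mCounts[{ cellFor(x, y, z), tag }];
    }

    // How 'hot' a given position is for that tag.
    int heat(float x, float y, float z, const std::string& tag) const {
        auto it = mCounts.find({ cellFor(x, y, z), tag });
        return it == mCounts.end() ? 0 : it->second;
    }
};
```

Repeated events in the same cell make the point 'hotter', matching the behavior described above, and arbitrary tags give the per-category granularity (movement, kills, grenades, and so on).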

Got several other neat bits I'll be looking into in the next week or two that should be quite useful as well.

Lastly, if you haven't seen Jeff Hutch's blog post about ongoing optimization work for TorqueScript, you can check that out here

Also, I have a Patreon now.
If you're feeling weirdly charitable and want to throw some cash my way, it'll get funneled into continued work.
Posts: 345
Joined: Tue Feb 03, 2015 10:30 pm
by Steve_Yorkshire » Mon Oct 16, 2017 4:03 pm
Posts: 957
Joined: Tue Feb 03, 2015 9:49 pm
by JeffR » Mon Oct 16, 2017 4:05 pm
Oh man, I forgot another bit being worked on: Custom Shader Features.

So, you may or may not be aware of how the material/shaders system in T3D works. It's a bit fuzzy unless you go digging through the code, but don't worry, I'll explain the basics first.

In T3D, you have two types of materials: Materials and CustomMaterials. Materials are what are made via the Material class definitions, and those are created when you make a new material through the material editor. CustomMaterials are mostly used for special classes like water blocks or other specific-purpose objects.

With Materials, these are Feature-driven. By which I mean, if you look at a Material in the editor, you'll note you have a lot of fields to work with. Diffuse map, Normal map, Diffuse Color, specularity, subsurface, the list goes on. Well, because the system has to take those fields and turn them into shader code, it treats each of those fields as a sort of flag.
If you assign a Diffuse map, it knows you want to utilize a texture, so it flags a DiffuseMap feature in Shadergen. When the shader is generated through ShaderGen, it detects that feature, and executes some codeblocks for generating that particular feature. This will ensure that the UV coordinates are passed along, that the DiffuseMap is bound as a shader constant, and that the code to actually sample the texture at the correct coordinate is added to the pixel shader.
It does this for pretty much every field in the Material.
This is certainly powerful, and it means you don't have to hand-write every shader yourself, which is a pretty big time saver, especially because most materials are going to largely be the same. But it has it's limitations.

While CustomMaterials derive from Materials, the way they're handled in the backend is a fair bit different.
With CustomMaterials, you write a shader by hand, hook it through ShaderData, bind your texture inputs and the like, and it'll render it. It does a sort of weird forward-but-not pass setup that's excellent for special-case rendering like the aforementioned water, but it's not designed for most materials in the engine. It's kind of been repurposed for doing non-standard materials, and I don't think that's what it was designed for, so there are heavy limitations you can run into.

With ShaderGen and Materials, the limitation is that because everything is feature-driven, you have a fairly rigid framework to work with. As long as you implement whatever you're trying to add as a Feature, it's fairly happy days, but that work is kinda spread out across several files, and then you have to add the code to bind your textures or whatever other shader constants you need so data can be passed into the shader to actually do the work.

This is easier to do with CustomMaterials, which is why I think people have gravitated towards using that, but it's technically "wrong".

So, what to do?

Well, we make adding new ShaderGen features really easy, obviously. Enter CustomShaderFeature.

This is basically an object that you can define and add to a Material. When the Material is processed through ShaderGen to create a new shader, it'll execute the CustomShaderFeature in addition to the regular features, and generate code with it. While this sounds like the 'add a feature to ShaderGen' work I mentioned above, there are two big differences:
1) All the work is already done for you; you just have to create the CSF and add it to the material.
2) You can actually do that in script now, meaning you don't need to recompile the engine to tweak how the shaders generate, which will be a HUGE savings in iteration time.

So how's it work? Good question. Let's look at our CustomShaderFeature and Material definition for an example:
singleton CustomShaderFeatureData(TestFeature){};

singleton Material(TestMaterial)
{
   CustomShaderFeature[0] = TestFeature;
};
What, that's it?

Not quite. Then we have callbacks during shadergen for the actual guts:
function TestFeature::processPixelHLSL(%this)
{
   %this.addUniform("strudel", "float2");
   %this.addTexture("strudelTex", "Texture2D", "strudelMap");
   %this.addVariable("bobsyeruncle", "float", 15.915);
   %this.addVariable("chimmychanga", "float");
   %this.writeLine("   @ = @ * 2;", "chimmychanga", "bobsyeruncle");
   %this.writeLine("   @ *= @.x;", "bobsyeruncle", "strudel");
   %this.writeLine("   @ *= @.y;", "chimmychanga", "strudel");
   %this.addVariable("sprangle", "float4");
   %this.writeLine("   @ = @.Sample(@,@);", "sprangle", "strudelTex", "strudelMap", "strudel");
}
Well, that's not that bad...

It's really not, nope. But what does it all do, you may ask?

Yeah, what's i-
Fragout main( ConnectData IN,
              uniform float2       strudel    : register(C0),
              uniform Texture2D    strudelTex : register(T1),
              uniform SamplerState strudelMap : register(S1) )
{
   // TestFeature
   float bobsyeruncle = 15.915;
   float chimmychanga = bobsyeruncle * 2;
   bobsyeruncle *= strudel.x;
   chimmychanga *= strudel.y;
   float4 sprangle = strudelTex.Sample(strudelMap,strudel);

   return OUT;
}
As you can see, in the above example, for the shader generated with this feature, it correctly adds the input uniforms, samplers and textures, then generates the code into the shader as we'd want.

In actual practice, this would let you add anything from small bits of code to, say, change the color of a character to display team colors, or make them turn redder as they take more damage, all the way up to writing the entire shader entirely in a custom feature.

Because this goes through ShaderGen, it follows all the 'rules', so there's no weird outlier behavior. It's just a plain old material, but now you can make it do whatever you want without having to jump through the ton of hoops that CustomMaterials require.

Going forward, I plan to have an integrated CSF that takes a visual node-graph file and generates the code from that. This will make it stupid easy to create shaders via a visual, node-based interface, which will be great for artists or newer people unfamiliar with shader code. You'd put together your node graph to build the logic of the shader, and it'll automatically spit out the actual shader code, hook it to the material, and bind the textures and fields and whatever else automatically for you.

So this should be quite bloody powerful going forward. I'm pretty stoked about it :)
