
Material System for Forward and Deferred Rendering

5 comments, last by Juliean 3 years, 10 months ago

Hello,

I'm currently thinking about how I could create a material system in my game engine. I'm currently using forward rendering. My first idea was to use a JSON or XML file per material that specifies the vertex, geometry, and fragment shaders, plus the uniforms that need to be set, such as textures. However, with this design I would run into issues if I ever switched to deferred rendering, since I would then need different vertex and fragment shaders to achieve the same effect. So my question is: how do other people handle this problem?
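For reference, the kind of per-material file I have in mind might look something like this (file layout and key names are just an illustration, not a fixed format):

```json
{
  "name": "brick_wall",
  "shaders": {
    "vertex": "shaders/mesh.vert",
    "fragment": "shaders/phong.frag"
  },
  "uniforms": {
    "u_albedoTexture": "textures/brick_albedo.png",
    "u_normalTexture": "textures/brick_normal.png",
    "u_shininess": 32.0
  }
}
```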


I can give you two possibilities:

  1. Have each shader which is needed for both forward and deferred implement both paths via a macro that gets passed when compiling (so you can compile all shaders for whatever path you want), OR
  2. My personal favourite, which is a little more complicated: move away from directly referencing vertex/geometry/pixel-shaders for materials, and instead have sets of shaders that encompass a type of renderable, let's say one for “Mesh”. This implements the structures for both forward and deferred, and is used as the basis for all materials. You then write custom specialisations of that shader by supplying function-implementations for vertex and pixel-properties. For example:
// defined by the base-shader
struct PixelShaderProps
{
	float3 vAlbedo;
	float3 vNormals;
	float speculars;
};
struct PixelShaderInput
{
	float3 vMeshNormals;
	float2 vTex0;
};

// this is what a concrete material implements
PixelShaderProps evaluateSurface(PixelShaderInput input)
{
	PixelShaderProps props;
	props.vAlbedo = Sample(BaseTexture, input.vTex0) + Sample(DetailTexture, input.vTex0 * 2);
	props.vNormals = input.vMeshNormals;
	props.speculars = 1.0f;
	return props;
}

Now such a system is obviously a lot more complicated, but it can lead to much better code reuse and makes it easier to write new types of shaders. It's similar to what Unity does with Surface Shaders, or what pretty much any node-based shader graph does in the background.
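For the first option, the engine side can be as simple as prepending a pass-selection define before compiling the same source once per rendering path. A minimal sketch (function and macro names are made up):

```cpp
#include <string>

// Prepend a pass-selection define so one shader source can be
// compiled once for the forward path and once for the deferred path.
std::string buildShaderSource(const std::string& body, bool deferred)
{
    std::string header = deferred ? "#define DEFERRED_PASS 1\n"
                                  : "#define FORWARD_PASS 1\n";
    return header + body;
}
```

The shader source itself then wraps its path-specific code in `#if DEFERRED_PASS` / `#if FORWARD_PASS` blocks.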

@Juliean Thank you for your reply. So for each type of geometry (mesh, skinned mesh, sprite, debug) I would create a base vertex shader, and this base vertex shader defines an interface. A material can use this interface to inject custom behaviour? Same for the fragment shader: I define a base fragment shader which, for example, implements Phong lighting or PBR, and create an interface which client code can use to inject custom behaviour?

One thing I ask myself is how I know which shaders I need to create. For example, in forward rendering I could use that base vertex shader (with custom behaviour added) and the base fragment shader (with custom behaviour added) and create the shader program from them, which I then use for rendering. If I do deferred rendering, I obviously need to create more than one shader program: one with the base vertex shader and a fragment shader that writes to the geometry buffers, and one with a vertex shader that passes the quad data on to the base fragment shader. How do you handle this creation, and where do I store the created shader program? (Should I store it in the material?) A simple idea that comes to my mind is that on creation of the material, the material sends its custom shader code, with info about which base shaders to use, to the renderer. The renderer will then create a concrete shader program based on its own configuration (forward or deferred).
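That idea could be sketched roughly like this (purely illustrative, all names hypothetical): the renderer owns a cache of compiled programs keyed by material and path, so the material itself never stores GPU objects.

```cpp
#include <map>
#include <string>
#include <utility>

enum class RenderPath { Forward, Deferred };

using ProgramHandle = int; // stand-in for a GL/Vulkan program object

struct Renderer
{
    // Cache: one compiled program per (material name, render path).
    std::map<std::pair<std::string, RenderPath>, ProgramHandle> cache;
    ProgramHandle nextHandle = 1;

    // The material hands over its custom snippet plus which base shaders
    // it wants; the renderer composes them for its configured path.
    // (Actual shader compilation is faked here with a counter.)
    ProgramHandle getProgram(const std::string& materialName,
                             const std::string& customCode,
                             RenderPath path)
    {
        auto key = std::make_pair(materialName, path);
        auto it = cache.find(key);
        if (it != cache.end())
            return it->second;           // reuse the cached program
        (void)customCode;                // would be spliced into base shaders
        ProgramHandle h = nextHandle++;  // pretend we compiled something
        cache[key] = h;
        return h;
    }
};
```

The same material requested twice for the same path reuses the cached program; requesting it for a different path yields a different one.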

flexw said:
@Juliean Thank you for your reply. So for each type of geometry (mesh, skinned mesh, sprite, debug) I would create a base vertex shader, and this base vertex shader defines an interface. A material can use this interface to inject custom behaviour? Same for the fragment shader: I define a base fragment shader which, for example, implements Phong lighting or PBR, and create an interface which client code can use to inject custom behaviour?

Yeah, sort of like that. Things like Mesh or SkinnedMesh could be the same base-shader with different options. Let's call those base-shaders a “shader-profile" for now (that's what I call them internally, so I don't have to keep describing the concept).

A shader-profile would ideally consist of all the shaders necessary, meaning that you don't have separate vertex/fragment/geometry-shaders, but only different shader-profiles that supply all the necessary shader-stages. If you need a profile to act differently depending on the context, let's say forward or deferred rendering, you implement the profile to support both "passes", so to speak (and your render-framework should then be able to select the correct path). The difference between forward and deferred then becomes a matter of “evaluate the material-properties, calculate lighting and write to the buffer” (forward) versus “evaluate the material-properties and write them to buffers; then, after all are done, calculate lighting as a post-process effect” (deferred). So your Mesh-profile doesn't need to know anything about rendering a quad for lighting (but you are advised to have your lighting-calculation as a set of functions that you can invoke in either path).
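The split described above can be illustrated in C-like code (plain C++ here so it is runnable; the real thing would be shader code, and all names are invented). The point is that both paths call the same surface-evaluation and lighting functions, just at different times:

```cpp
struct Float3 { float x, y, z; };

struct SurfaceProps { Float3 albedo; float specular; };

// Shared by both paths: how the material looks.
SurfaceProps evaluateSurface()
{
    return { {0.8f, 0.2f, 0.2f}, 16.0f };
}

// Shared by both paths: how light interacts with a surface.
float computeLighting(const SurfaceProps& s)
{
    return s.albedo.x + s.specular; // toy stand-in for real lighting math
}

// Forward: evaluate and light in one pixel shader.
float forwardPixel()
{
    return computeLighting(evaluateSurface());
}

// Deferred, step 1: write surface properties to the G-buffer...
SurfaceProps deferredGeometryPass()
{
    return evaluateSurface(); // would be written to render targets
}

// ...step 2: light later as a full-screen pass reading the G-buffer.
float deferredLightingPass(const SurfaceProps& gbuffer)
{
    return computeLighting(gbuffer);
}
```

Because both paths share the same two functions, they produce the same result, which is exactly why the Mesh-profile never needs to know about the lighting quad.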

Now for the skinning, I would have this as an option (say a boolean) that the material can request from the profile. If it does, the profile generates code for skinning and gives the material-evaluation function(s) access to the skinning-parameters. You could probably also make a separate profile; but you really want as little duplication of shader-code as possible (otherwise introducing new features becomes a nightmare), so I'd go with the configuration-option (this is also very handy for other things like transparency-mode, shading-model, etc.).
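Option-based code generation could look roughly like this (a sketch; the snippet strings stand in for real shader code):

```cpp
#include <string>

// Profile options requested by the material.
struct ProfileOptions { bool skinned = false; };

// Generate the vertex-shader source; skinning code (and its
// parameters) is only emitted when the material asks for it.
std::string generateVertexShader(const ProfileOptions& opts)
{
    std::string src;
    if (opts.skinned)
        src += "#define SKINNED 1\n"
               "// bone matrices, blend weights, skinning function...\n";
    src += "// common mesh vertex code...\n";
    return src;
}
```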

flexw said:
A simple idea that comes to my mind is that on creation of the material, the material sends its custom shader code, with info about which base shaders to use, to the renderer. The renderer will then create a concrete shader program based on its own configuration (forward or deferred).

Yes, that sounds like what I'm doing as well. In my case the material consists of a base-profile, the visual shader-code and all the variables/textures. The renderer then supplies a “view” (this is to distinguish between 3D/2D/GUI) and selects specific options (like forward/deferred, which the profile has to support), and then you get the final shader-program up and ready.
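In code, the final program lookup might be keyed on all three pieces of information named above (again just a sketch with invented names):

```cpp
#include <string>
#include <tuple>

enum class View { World3D, Screen2D, Gui };
enum class Pass { Forward, Deferred };

// Everything that determines the final compiled shader-program:
// which material, which view the renderer supplies, which pass option.
using ProgramKey = std::tuple<std::string, View, Pass>;

ProgramKey makeProgramKey(const std::string& material, View view, Pass pass)
{
    return {material, view, pass};
}
```

Two keys that differ in any component map to different compiled programs, so one material can coexist in forward and deferred form.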

Thanks for explaining that to me, it helped me a lot. Could you please explain a little bit more what your shader profiles exactly do? Does the shader profile do the actual rendering, i.e. the renderer calls a method like shaderProfile→renderForward()? But if it works like this, then the shader profile must also have methods like renderToGeometryBuffer() and so on. So if I later created other render paradigms, I would need to add new methods to the shader profile, which would be bad. Also, do you have just one shader-profile class, or is there one class per shader profile? How do your users create new shader profiles?

flexw said:
Thanks for explaining that to me, it helped me a lot. Could you please explain a little bit more what your shader profiles exactly do? Does the shader profile do the actual rendering, i.e. the renderer calls a method like shaderProfile→renderForward()? But if it works like this, then the shader profile must also have methods like renderToGeometryBuffer() and so on. So if I later created other render paradigms, I would need to add new methods to the shader profile, which would be bad. Also, do you have just one shader-profile class, or is there one class per shader profile? How do your users create new shader profiles?

The shader-profile really is only used to generate the final material that is used to draw things. So in my renderer, "Material" is the collection of shader-profile, settings, etc., and the renderer generates a MaterialInstance, which contains the final compiled shaders. That MaterialInstance is used in combination with a Mesh to do the actual rendering. The implementation of that is totally up to the specific render-system. Even MaterialInstance is just an interface over different types of shaders, as I still support writing predefined shaders for specific stuff. My own implementation needs to be very flexible as I have lots of different use-cases, but that's not necessarily a must.
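The Material/MaterialInstance split described above might be sketched like this (hypothetical names; "compilation" is just string assembly here, where a real engine would produce a GPU program):

```cpp
#include <memory>
#include <string>

// Authoring-side data: which profile to use and the custom snippet.
struct Material
{
    std::string profileName;
    std::string surfaceCode;
};

// Runtime-side result: what the renderer actually binds when drawing.
struct MaterialInstance
{
    std::string compiledSource;
};

// The renderer turns the description into something drawable,
// choosing the variant that matches its configured path.
std::unique_ptr<MaterialInstance> instantiate(const Material& m, bool deferred)
{
    auto inst = std::make_unique<MaterialInstance>();
    inst->compiledSource =
        (deferred ? "// deferred variant of " : "// forward variant of ")
        + m.profileName + "\n" + m.surfaceCode;
    return inst;
}
```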

flexw said:
Also, do you have just one shader-profile class, or is there one class per shader profile? How do your users create new shader profiles?

A ShaderProfile implements a base-class which has different functions like GenerateVertex, GeneratePixel, etc. So if a user wants to make a new type of profile, he inherits from ShaderProfile, implements everything he wants to have, and registers the shader-profile with my MaterialRegistry.
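Put together, the base class and registry described above could be sketched as follows (class and method names modeled on the post, all details invented):

```cpp
#include <map>
#include <memory>
#include <string>
#include <utility>

// Base class every shader-profile implements.
struct ShaderProfile
{
    virtual ~ShaderProfile() = default;
    virtual std::string GenerateVertex() const = 0;
    virtual std::string GeneratePixel() const = 0;
};

// A user-defined profile for static meshes.
struct MeshProfile : ShaderProfile
{
    std::string GenerateVertex() const override { return "// mesh vertex code"; }
    std::string GeneratePixel() const override { return "// mesh pixel code"; }
};

// Registry the renderer consults when building materials.
struct MaterialRegistry
{
    std::map<std::string, std::unique_ptr<ShaderProfile>> profiles;

    void registerProfile(const std::string& name,
                         std::unique_ptr<ShaderProfile> p)
    {
        profiles[name] = std::move(p);
    }

    const ShaderProfile* find(const std::string& name) const
    {
        auto it = profiles.find(name);
        return it == profiles.end() ? nullptr : it->second.get();
    }
};
```

A user adds a new renderable type by subclassing ShaderProfile and registering it once; materials then refer to it by name.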

