This article is a continuation from my original ‘AGAL2 is Here’ blog post. For anyone who may have missed my introduction to the new features of AGAL2, you can head over to the original article at:
Continuing on from the last article, I have 4 more features that I would like to cover and then I also want to talk about profiles and the fact that some of these features do not come under the new ‘standard’ profile and could be considered to be AGAL1 features. More about that later, for now, the remaining topics are:
- Floating point textures (RGBA_HALF_FLOAT)
- Partial derivatives (ddx, ddy)
- Fragment depth (od)
- Texture anti-aliasing
Floating point textures (RGBA_HALF_FLOAT)
When we create a texture, we can render our scene directly to it; this is called a render texture or offscreen buffer. Previously we had only a single texture option, a texture with 8 bits per channel, which we instantiated with Context3DTextureFormat.RGBA. We now have access to a new texture format, Context3DTextureFormat.RGBA_HALF_FLOAT, which provides 16 bits per channel. Making use of it is very simple; to create a rectangular render texture, we could use the following:
var rtt:RectangleTexture = context.createRectangleTexture(800, 600, Context3DTextureFormat.RGBA_HALF_FLOAT, true);
Having a wider bit range is particularly useful when working with deferred HDR rendering. Remembering that data in a shader is computed component-wise, 8 bits per channel is very limiting and cannot simulate realistic lighting over a large range of intensities. The range of values a texel can represent is known as its dynamic range, and the new RGBA_HALF_FLOAT format gives us an increased range, allowing for more realistic looking lighting conditions. The results must later be compressed back into the standard RGB color space, a technique called tone mapping.
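As a sketch of that final step, the fragment program below applies the simple Reinhard tone-mapping operator, c / (1 + c). It assumes (my names, not from the article) that the HDR scene has been rendered into the texture bound to fragment sampler 0 and that fragment constant 0 has been set to (1, 1, 1, 1):

```
tex ft0, v0, fs0 <2d,linear,nomip>   // sample the HDR render texture
add ft1, ft0, fc0                    // ft1 = c + 1
div ft0, ft0, ft1                    // ft0 = c / (1 + c), compressed into [0, 1)
mov oc, ft0                          // write the tone-mapped color
```

In AS3 the constant would be uploaded beforehand with context.setProgramConstantsFromVector(Context3DProgramType.FRAGMENT, 0, Vector.&lt;Number&gt;([1, 1, 1, 1])).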
The picture below shows two scenes from Half-Life 2. Notice the difference in detail and contrast provided by the high dynamic range (HDR) version on the right.
Partial derivatives (ddx, ddy)
The new ddx and ddy operations allow us to compute, for each fragment, the rate of change of a value between neighbouring fragments along the x or y axis. This is useful for a few reasons, notably when we need to perform anti-aliasing or blend between values in height maps.
Below is the result of a very simple AGAL2 test that I wrote. The original image contained black and white chequered triangles on 4 colored backgrounds. The ddy result was then written straight to the output color register.
The following fragment program shows my usage of the ddy operation to generate this output:
tex ft0, v0, fs0 <2d,linear,anisotropic16x,nomip>
ddy ft0, ft0
mov oc, ft0
The operations here are:
- Load the texture from fragment sampler 0 into the temp fragment register 0.
- Perform the ddy operation on temp fragment register 0 and store the result back into the temp fragment register 0.
- Move the contents of temp fragment register 0 into the output color register to be displayed.
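To run a program like this, it has to be assembled and uploaded to the GPU. Here is a minimal sketch using the AGALMiniAssembler utility; the pass-through vertex program and the assumption that va0/va1 hold positions and UVs are mine, and the important detail is the version argument of 2, which enables AGAL2 opcodes such as ddy:

```
// Assumed imports: com.adobe.utils.AGALMiniAssembler, flash.display3D.*
var vertexSource:String =
    "m44 op, va0, vc0 \n" +   // transform the vertex position
    "mov v0, va1";            // pass the UV coordinates to the fragment shader

var fragmentSource:String =
    "tex ft0, v0, fs0 <2d,linear,anisotropic16x,nomip> \n" +
    "ddy ft0, ft0 \n" +
    "mov oc, ft0";

var assembler:AGALMiniAssembler = new AGALMiniAssembler();
var program:Program3D = context.createProgram();

// The third argument selects the AGAL version; 2 is required for AGAL2.
program.upload(
    assembler.assemble(Context3DProgramType.VERTEX, vertexSource, 2),
    assembler.assemble(Context3DProgramType.FRAGMENT, fragmentSource, 2));
context.setProgram(program);
```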
Fragment depth (od)
The ability to write to the depth buffer can be used to influence depth passes, and the new ‘od’ register makes that possible. In the same way that we use ‘oc’ to output a color to the backbuffer, ‘od’ enables us to output a depth value to the depth buffer. The written values can then be tested with the usual depth tests, such as Context3DCompareMode.LESS, configured via context.setDepthTest().
We are now able to influence the way in which the depth buffer is written, which enables effects that were not possible when the buffer was affected only by geometry. An example might be a shader where geometry is created in the fragment shader, such as a fur effect: the fur's depth information can be written to the depth buffer so that it doesn't get overwritten by other geometry.
In a similar use-case, Ariel Nehmad of Flare3D points out that this is also useful for volumetric particles.
“Another cool use of output depth is volumetric particles, basically writing particles depth based on a grey scale image.”
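A minimal fragment-program sketch of that idea, assuming (my naming, not Flare3D's) that fs0 holds a grey-scale particle texture and that the interpolated v0.z carries the particle's base depth:

```
tex ft0, v0, fs0 <2d,linear,nomip>   // sample the grey-scale particle texture
add ft1, v0.zzzz, ft0.xxxx           // offset the base depth by the texel intensity
mov oc, ft0                          // output the particle color as normal
mov od, ft1                          // write the adjusted value to the depth buffer
```

Each fragment of a flat billboard then lands at a different depth, so particles intersect the scene as if they had volume.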
Texture anti-aliasing
Texture anti-aliasing is the ability to reduce artefacts and pixelation of textures by combining data from surrounding fragments. Again, enabling it is as simple as passing an argument to setRenderToTexture(), whose signature is:
context.setRenderToTexture(texture:TextureBase, enableDepthAndStencil:Boolean, antiAlias:int=0, surfaceSelector:int=0, colorOutputIndex:int=0):void
Stage3D makes use of multisample anti-aliasing (MSAA). The third parameter, antiAlias, accepts an integer in the closed range [0, 4]; 4 provides the highest quality anti-aliasing, but with that quality comes a performance hit, so it may be useful to allow your users to configure this setting in accordance with their device’s capabilities. That way they can get the best quality gaming experience.
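For example, a 4x multisampled render pass might be set up like this; rtt and the user-chosen quality value are assumed names:

```
// Clamp the user's setting to the supported range of 0-4.
var antiAlias:int = Math.max(0, Math.min(4, quality));

context.setRenderToTexture(rtt, true, antiAlias);
context.clear();
// ... draw the scene ...
context.setRenderToBackBuffer();
```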
The image above (credit to wiki.mozilla.org) illustrates the use of MSAA to reduce harsh edges and artefacts. This improves the overall visual quality of the final render by interpolating the values around the edge of the model, causing the model to blend into its surroundings.
Earlier I mentioned that I would comment on profiles. It’s worth noting that with the release of Flash Player 14 we get a new Context3DProfile.STANDARD profile, which we can specify when we make our context request.
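A request for the standard profile might look like the following sketch; handling of the Event.CONTEXT3D_CREATE callback (and any fallback to baseline if the profile is unavailable) is left out:

```
// Assumed imports: flash.display3D.Context3DProfile, Context3DRenderMode
stage.stage3Ds[0].requestContext3D(
    Context3DRenderMode.AUTO,
    Context3DProfile.STANDARD);
```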
Some of the features that I have discussed in these two articles are capable of working in previous versions of the player. Cheng Liao from Adobe, who has been working on these features, has kindly stated, in his comments on my previous blog post, which features will work outside of the standard profile.
This is a very exciting time to be working with Stage3D, the additional functionality will provide the basis for some great new features. Zest3D will incorporate these into the general pipeline allowing us to start testing a deferred rendering pipeline with a view to creating a physically based rendering (PBR) pipeline.
In my next blog post I will post the code that I am currently using to test multiple render textures (MRT) and I’ll show you how to address and write to multiple textures from within a single shader pass. Until next time, please follow me on Twitter for updates 🙂