Road to Anti-Aliasing in BRE: Aliasing and Anti-Aliasing

I want to implement anti-aliasing in BRE, but first, I want to explore what it is, how it is caused, and what are the techniques to mitigate this effect.

In the previous post, we talked about Rasterization. Now, we will look at aliasing and anti-aliasing in depth. In the next post, I am going to describe which anti-aliasing technique I chose for BRE and how I implemented it.

What is aliasing?

When we rasterize a segment or a polygon, we take a real domain and break it into discrete parts, the pixels. In the following picture, we may agree that the pixels we use to represent the segment are the closest approximation we can find. But if you had no previous knowledge that those pixels are supposed to be a segment, you could not identify them as one, because they just look like a series of adjacent pixels.

line

We can conclude that there are infinitely many segments that would be rasterized to the very same set of pixels. This ambiguity is what we call aliasing. Visually, this effect makes segments and the borders of polygons look jagged. Aliasing is intrinsic to the discretization, but there are techniques to make it less perceivable.

In the previous article, we saw that partial fragments are either ignored or promoted to complete fragments. This assumption reduces per-fragment computations to a single point sample, because a single fragment wins a pixel and determines its color. This would be correct if we treated pixels as pure point samples with no area, but this is not the case. Each pixel represents a rectangular region on the screen with a nonzero area, and more than one fragment may be visible inside a pixel’s rectangular region.

multiple_fragments_in_pixel

If we use the point-sampling method discussed in the previous article, we would select the color of a single fragment to represent the entire area of the pixel. The problem with this is that the pixel center point sample may not represent the color of the pixel as a whole. In the following figure, we can see that most of the area of the pixel is black, with only a very small triangle in the center being bright white. This does not represent the color of the pixel rectangle as a whole.

pixel_point_sampling.png

How can aliasing be solved? Area-Based Pixel Colors

As we mentioned, aliasing is intrinsic to the discretization, but there are techniques to make it less perceivable. The minimum size we can handle with a raster device is one pixel. To represent a line, we compute a set of pixels for it, and we can adjust the colors of the surrounding pixels to reduce the aliasing effect. In the following picture, if we consider the segment as a polygon with a width of 1 pixel and a length equal to the segment’s length, we see that such a polygon partially covers some pixels.

fig_5.16

If we shade each pixel intersected by this polygon with a color whose intensity is scaled down by the percentage of the area of the pixel covered by the polygon, the aliasing effect will be diminished like in the following picture.

fig_5.16b.png

This technique is called the area averaging technique, or unweighted area sampling. It is based on the notion that if a line covers only half of a pixel, that pixel should receive only half of the line’s color intensity; at a proper viewing distance, this produces a convincing result. If more polygons cover the same pixel, we can use blending, for example via the alpha channel.
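As a toy illustration of the idea (a sketch only; the coverage fraction is assumed to be given, not computed from real geometry):

```python
def shade_by_coverage(line_color, background, coverage):
    """Blend a line's color with the background according to the
    fraction of the pixel's area covered by the line (0.0 to 1.0)."""
    return tuple(coverage * lc + (1.0 - coverage) * bg
                 for lc, bg in zip(line_color, background))

# A pixel half covered by a white line over a black background
# ends up mid-gray.
print(shade_by_coverage((1.0, 1.0, 1.0), (0.0, 0.0, 0.0), 0.5))
# → (0.5, 0.5, 0.5)
```

Computing the actual coverage fraction is the hard part; the blend itself is trivial.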

The implementation of the area averaging technique can be expensive because it requires several floating-point operations: computing the intersection between a polygon and a square (the pixel) in the fragment shader.

Another approach, different from the previous one but also based on pixel area, is to use the relative area of each fragment within the pixel’s rectangle to weight the color of the pixel. The results are much better. In the following picture, we can see that the white fragment covers approximately 10 percent of the area of the pixel, leaving the other 90 percent dark gray.

area_based_weight.jpg

If we weight the color by the relative areas, we get a pixel color of

 Carea = 0.1 * (1.0, 1.0, 1.0) + 0.9 * (0.25, 0.25, 0.25) = (0.325, 0.325, 0.325)

Note that this computation is independent of where the white fragment falls within the pixel; only the size and color of the fragment matter. This area-based method avoids point-sampling errors, and it extends to any number of differently colored fragments within a given pixel.
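The weighted average above can be written as a small helper (a sketch; the per-fragment area fractions are assumed to be given):

```python
def area_weighted_color(fragments):
    """fragments: list of (area_fraction, (r, g, b)) pairs whose
    area fractions sum to 1.0. Returns the area-weighted pixel color."""
    r = sum(a * c[0] for a, c in fragments)
    g = sum(a * c[1] for a, c in fragments)
    b = sum(a * c[2] for a, c in fragments)
    return (r, g, b)

# The example from the text: 10% white fragment, 90% dark gray,
# which works out to (0.325, 0.325, 0.325).
print(area_weighted_color([(0.1, (1.0, 1.0, 1.0)),
                           (0.9, (0.25, 0.25, 0.25))]))
```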

The problem with this method is that it requires pixel-coverage values computed per fragment. Computing them for triangles can be complicated and expensive. In practice, the pure area-based methods do not lead directly to simple, fast hardware antialiasing implementations.

How can aliasing be solved? Screen-Based Methods

In this section, we are going to discuss several algorithms with different implementations but one thing in common: all of them are screen-based antialiasing methods. This means that they operate only on the output samples of the pipeline and do not need any knowledge about the objects being rendered (a line, a triangle, a polygon, etc.).

As we saw in previous sections, one problem is the low sampling rate. We will see another example of this. In the following picture, we can see a red triangle where a single sample is taken at the center of each pixel’s grid cell. The only knowledge we have about the cell is whether or not its center is covered by the triangle.

triangle_point_sample

If we use more samples per screen grid cell and blend them, we will get a better pixel color, as you can see in the following picture.

triangle_multi_sample

Given this, we need to use a sampling pattern for the screen and then weight and sum the samples to produce a pixel color:

p(x, y) = ∑(i = 1 to n) Wi * C(i, x, y)

where

  • n is the number of samples taken for a pixel
  • C(i, x, y) is a sample color function
  • Wi is a weight in the range [0.0, 1.0] that the sample will contribute to the overall pixel color

Each sample is taken at a different location within the screen grid cell, and optionally the sampling pattern can vary from pixel to pixel. Point samples are commonly used in real-time rendering systems. Given this, the function C can be thought of as two functions:

  • A function f(i, n) that retrieves the floating-point (x, y) location on the screen where a sample is needed. This location on the screen is then sampled, giving the color at that point.
  • The sampling scheme is chosen and the rendering pipeline is configured to compute the samples at particular subpixel locations, typically based on a per-frame or per-application setting.

The weights Wi of all samples sum to one. If we use constant weights, each sample has a weight equal to 1 / n. Usually, the default mode for graphics hardware is a single sample at the center of the pixel.
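The formula above can be sketched directly (sample colors are assumed to be already computed; the constant-weight case reduces to a plain average):

```python
def resolve_pixel(sample_colors, weights=None):
    """Implements p = sum_i W_i * C_i. If no weights are given,
    use the constant weight 1/n for each of the n samples."""
    n = len(sample_colors)
    if weights is None:
        weights = [1.0 / n] * n
    assert abs(sum(weights) - 1.0) < 1e-9  # weights must sum to one
    return tuple(sum(w * c[k] for w, c in zip(weights, sample_colors))
                 for k in range(3))

# Four samples: three inside a red triangle, one on the black background.
print(resolve_pixel([(1, 0, 0), (1, 0, 0), (1, 0, 0), (0, 0, 0)]))
# → (0.75, 0.0, 0.0)
```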

Supersampling Antialiasing (SSAA)

In oversampling, supersampling, or supersampled antialiasing (SSAA), area-based sampling is approximated by point sampling the scene at more than one point per pixel. Fragments are generated not at the per-pixel level but at the per-sample level. The supersamples are then combined into a single pixel color via a weighted or unweighted average, as you can see in the following image.

ssaa_explained

The conceptually simplest variant is called full-scene antialiasing (FSAA). It renders the entire scene to a larger (higher-resolution) framebuffer and then filters blocks of pixels in the higher-resolution framebuffer down to the resolution of the final framebuffer. For example, suppose we want an image of 1000 x 800 pixels. If we render an image of 2000 x 1600 offscreen and then average each 2 x 2 area, the desired image is generated with 4 samples per pixel. You can see this grid sampling in the following picture.

2x2
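The 2 x 2 box-filter downsample can be sketched like this (grayscale values for brevity; a real implementation would work per color channel, typically on the GPU):

```python
def downsample_2x2(image):
    """Average each 2x2 block of a (2H x 2W) grayscale image
    down to a single pixel of the (H x W) result."""
    h, w = len(image) // 2, len(image[0]) // 2
    return [[(image[2*y][2*x] + image[2*y][2*x+1] +
              image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4.0
             for x in range(w)]
            for y in range(h)]

# A 2x2 supersampled edge pixel: three covered samples, one uncovered.
print(downsample_2x2([[1.0, 1.0],
                      [1.0, 0.0]]))
# → [[0.75]]
```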

FSAA’s main advantage is simplicity. Lower-quality versions of this method can also be used, as you can see in the following picture.

lower_quality

The positions and weights used with weighted area versions of these sampling patterns differ by manufacturer. In the following picture, you can see some examples.

manufacturers_sampling.jpg

This method is costly, as all subsamples must be fully shaded and filled, with a Z-buffer depth per sample. In the following picture, you can see some scenes with and without FSAA.

FSAA

Another method is the accumulation buffer, which, instead of a large offscreen buffer, uses a buffer with the same resolution as the desired image (and usually more bits of color). To obtain a 2 x 2 sampling of a scene, four images are generated, with the view moved half a pixel in the screen x- or y-direction as needed. Each image is generated for a different sample position within the grid cell, and these images are summed in the accumulation buffer. After rendering, the image is averaged and sent to the display. In the following picture, you can see an antialiasing technique using the accumulation buffer in OpenGL.

accumulation buffer.png

The additional cost of having to re-render the scene a few times per frame and copy the result to the screen makes this algorithm costly for real-time rendering systems.
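The accumulation-buffer scheme can be sketched as follows (`render_scene` is a hypothetical callback standing in for a full render pass with a jittered view; the offsets are the standard 2 x 2 subpixel positions):

```python
def accumulate(render_scene, width, height, offsets):
    """Render the scene once per subpixel offset, summing the results
    into an accumulation buffer, then average to get the final image."""
    accum = [[0.0] * width for _ in range(height)]
    for dx, dy in offsets:
        frame = render_scene(dx, dy)  # one full render of the whole scene
        for y in range(height):
            for x in range(width):
                accum[y][x] += frame[y][x]
    n = len(offsets)
    return [[v / n for v in row] for row in accum]

# 2x2 grid sampling: four renders, half a pixel apart in x and y.
offsets_2x2 = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]
```

Note that the cost of the repeated full renders, not the averaging, is what makes this approach expensive.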

Multisampling AntiAliasing (MSAA)

Supersampling-based techniques work by generating samples that are fully specified, with individually computed shades, depths, and locations. As we mentioned, the cost is high (each sample has to run a fragment shader) and the overall gains are relatively low. For example, N-sample SSAA generates N times as many fragments per pixel (usually between 2 and 16). Each of these new fragments (or subfragments) has its own color, computed by evaluating per-vertex attributes, texture values, and the fragment shader itself as many as N times more frequently per frame than normal rendering. This is very expensive because it invokes the entire rasterization pipeline per sample, increasing rasterization overhead by 2 to 16 times.

Multisampling Antialiasing (MSAA) exploits the observation that the most likely causes of aliasing in 3D rendering are partial fragments at the edges of objects (where a pixel can contain multiple partial fragments from different objects with different colors). You can see this situation in the following image.

triangle_aliasing.jpg

MSAA generates fragments at the final pixel size and evaluates the fragment shader only once per fragment, so the number of fragment shader invocations is reduced significantly compared to SSAA. MSAA stores per-sample fragment coverage. When a fragment is rendered, its color is evaluated once, and that same color is stored for each visible sample that the fragment covers. The existing color at a sample (from an earlier fragment) may be replaced with the new fragment’s color, but this is done at a per-sample level. At the end of the frame, the final color of the pixel is computed from the multiple samples, but only a coverage value (a simple geometric operation) and possibly a depth value are computed per sample, per fragment. The expensive step of computing a fragment color is still done only once per fragment. Since MSAA is coverage-based, no antialiasing is computed on complete fragments; they are rendered as if no antialiasing were used.

msaa_example
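A toy model of MSAA’s per-pixel storage and resolve (a sketch only; real hardware stores color and depth per sample in dedicated multisample buffers):

```python
class MSAAPixel:
    """One pixel with n samples. A fragment's color is shaded once
    and written to every sample the fragment covers."""
    def __init__(self, n, clear_color=(0.0, 0.0, 0.0)):
        self.samples = [clear_color] * n

    def write_fragment(self, color, coverage_mask):
        # The color was computed once; store it at each covered sample.
        for i, covered in enumerate(coverage_mask):
            if covered:
                self.samples[i] = color

    def resolve(self):
        # Average all samples to produce the final pixel color.
        n = len(self.samples)
        return tuple(sum(s[k] for s in self.samples) / n for k in range(3))

p = MSAAPixel(4)
p.write_fragment((1.0, 0.0, 0.0), [True, True, True, False])  # partial fragment
print(p.resolve())
# → (0.75, 0.0, 0.0)
```

The depth test, which in real MSAA also runs per sample, is omitted here for brevity.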

There is an issue related to the selection of the position within the pixel at which to evaluate the shader on a partial fragment. As we already mentioned, the fragment shader is normally evaluated at the pixel center, but a partial fragment may not cover it. In that case we would be extrapolating the vertex attributes beyond their intended values, which can be noticeable with textures and can lead to glaring visual artifacts.

The solution in most 3D MSAA hardware is to select the centroid of the samples covered by the fragment. This position adjustment is called centroid sampling or centroid interpolation. It adds some complexity to the system, but because fragments are convex and the pixel center is not covered, only a very limited set of covered-sample configurations is possible, so the set of possible positions can be precalculated before the hardware is even built. In the following picture, you can see some MSAA sampling patterns for ATI and NVIDIA graphics accelerators.

msaa_manufacturers
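The idea behind centroid sampling can be sketched like this (the sample positions are illustrative, not a real vendor pattern):

```python
def interpolation_point(sample_positions, coverage_mask, center=(0.5, 0.5)):
    """Choose where to interpolate attributes for a fragment: the pixel
    center if the fragment covers every sample, otherwise the centroid
    of the covered samples (centroid sampling)."""
    covered = [p for p, c in zip(sample_positions, coverage_mask) if c]
    if len(covered) == len(sample_positions):
        return center  # complete fragment: the center is safe
    # Partial fragment: average the covered sample positions so the
    # interpolation point stays inside the fragment.
    cx = sum(p[0] for p in covered) / len(covered)
    cy = sum(p[1] for p in covered) / len(covered)
    return (cx, cy)

# Four samples in a grid; the fragment covers only the top two,
# so the interpolation point moves up to (0.5, 0.25).
print(interpolation_point([(0.25, 0.25), (0.75, 0.25),
                           (0.25, 0.75), (0.75, 0.75)],
                          [True, True, False, False]))
```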

In the following pictures, you can see some comparisons between SSAA and MSAA.

aa_examples.png

aa_examples_2.png

As we already mentioned, MSAA focuses effort on sampling the fragment’s pixel coverage at a higher rate and sharing the computed shade, but storing a separate color and z-depth for each sample is usually unnecessary. Coverage Sample Antialiasing (CSAA) takes advantage of this observation by storing just the fragment coverage at a higher sampling rate. Each subpixel stores an index to the fragment with which it is associated. A table with a limited number of entries (4 or 8) holds the color and z-depth associated with each fragment. For most data it is rare to have more than 4 fragments with radically different shades visible in a pixel, so this scheme performs well in practice.

CSAA_example.png

Other Antialiasing Techniques

There are more antialiasing techniques. We are going to list some of them:

Temporal Antialiasing (TAA)

NVIDIA’s Temporal Antialiasing (TXAA)

Fast Approximate Antialiasing (FXAA)

Morphological Antialiasing (MLAA)

Enhanced Subpixel Morphological Antialiasing (SMAA)

aa_comparisons.jpg

In the next post, we are going to talk about one of these antialiasing techniques in detail and about how we implemented it in BRE.

References

Fundamentals of Computer Graphics, 4th Edition

Essential Mathematics for Games and Interactive Applications, 3rd Edition

Introduction to Computer Graphics: A Practical Learning Approach, 1st Edition

Computer Graphics: Principles and Practices, 3rd Edition

Foundations of 3D Computer Graphics
