Here is a quick experiment I made for a paper art/cut-out look which enables extreme zooming into very low-resolution input textures. I have wanted to make a game about scale for a while (Giants touched on this theme years ago and I found it fascinating), and I can see this being the basis for a 2D game engine that allows several orders of magnitude of magnification without requiring huge data sets.


The basis is the “improved alpha tested magnification” algorithm for smooth font rendering as published by Valve, except that I use it to generate an 8-bit Signed Distance Field for each of the 24 RGB color bit planes. Separating the bits is technically the same as using a different band-pass filter (from low to high frequency) on each color channel to get a 1-bit field, and creating an 8-bit SDF from that.
Source image (1024 x 1024)
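The bit-plane separation can be sketched on the CPU like this. This is a minimal pure-Python illustration, not the actual tool: it splits an image into its 24 binary planes and computes each plane's SDF by brute force (the real pipeline would use Valve's algorithm and proper image libraries).

```python
def bit_planes(rgb):
    """Split a small RGB image (nested lists of (r, g, b) uint8 tuples)
    into 24 binary planes: bits 0..7 of R, then G, then B."""
    h, w = len(rgb), len(rgb[0])
    planes = []
    for c in range(3):
        for b in range(8):
            planes.append([[(rgb[y][x][c] >> b) & 1 for x in range(w)]
                           for y in range(h)])
    return planes

def signed_distance(plane):
    """Brute-force SDF for one plane: positive inside the shape, negative
    outside. Fine for a tiny demo image; far too slow for real assets."""
    h, w = len(plane), len(plane[0])
    cells = [(y, x, plane[y][x]) for y in range(h) for x in range(w)]
    def nearest(y, x, value):
        return min(((yy - y) ** 2 + (xx - x) ** 2) ** 0.5
                   for yy, xx, v in cells if v == value)
    return [[nearest(y, x, 0) if plane[y][x] else -nearest(y, x, 1)
             for x in range(w)] for y in range(h)]
```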

Each SDF is just 64×64 pixels, because the lower the resolution, the “rounder” the edges of the shapes become. An RGBA pixel can hold 4 SDF values, so for 24 channels I combine 6 images in a texture atlas.
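The packing step could look roughly like this. A minimal sketch, assuming a 3×2 tile layout and a distance clamp of ±8 texels (both are my assumptions; the post doesn't specify the arrangement or range): each SDF is quantized to 8 bits and dropped into one channel of one tile.

```python
def pack_atlas(sdfs, tile=64, clamp=8.0):
    """Pack 24 SDFs into a 3x2 grid of RGBA tiles (hypothetical layout).
    Each distance is remapped from [-clamp, +clamp] to 0..255, so ~128
    marks the edge of the shape."""
    assert len(sdfs) == 24
    atlas = [[[0, 0, 0, 0] for _ in range(3 * tile)] for _ in range(2 * tile)]
    for i in range(6):                 # 6 RGBA images, 4 SDFs each
        ty, tx = divmod(i, 3)          # tile position in the atlas
        for ch in range(4):
            sdf = sdfs[4 * i + ch]
            for y in range(tile):
                for x in range(tile):
                    v = max(-clamp, min(clamp, sdf[y][x]))
                    atlas[ty * tile + y][tx * tile + x][ch] = \
                        int((v / (2 * clamp) + 0.5) * 255)
    return atlas
```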

The SDF texture atlas


The shader then simply performs the “alpha test” for each bit plane and recombines all 24 bits into the final output color.
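The recombination logic, transcribed to Python for clarity (the real thing is a fragment shader; the plane ordering here is my assumption):

```python
def recombine(samples, threshold=0.5):
    """Reassemble the output color from 24 interpolated SDF samples,
    ordered as R bits 0..7, G bits 0..7, B bits 0..7. Each plane gets
    the same "alpha test" as in Valve's technique."""
    rgb = [0, 0, 0]
    for c in range(3):
        for b in range(8):
            if samples[c * 8 + b] >= threshold:   # inside this plane's shape?
                rgb[c] |= 1 << b                  # set the bit in the channel
    return tuple(rgb)
```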

Zoom Level 1

Even Closer

In the regions where different planes touch, they overlap a bit due to precision errors, but I actually like the look: it reminds me of overlapping water color strokes or collages of different-colored cardboard.

Here is another test with Paper Mario. This probably has too many gradients in it, but remember: you are looking at a 64×64 pixel texture!

Paper Mario


Final Render

Recently I’ve done some work on non-photorealistic rendering for a pet project of mine. For starters, I wanted to get an “artistic” look as a real-time post-processing effect, something that looks like water color or a pencil sketch.

The basis should be a scene with rather simple texturing or a toon-shaded look, since the effect will not preserve fine detail. If the resolution is too high, the artistic look is lost and the picture just looks grainy. Also, the effect needs good edge detection, so the more distinct the colors, the better.

Here’s what I came up with:

After Edge Detection Pass

Step 1: Run a (color/luminance-based) edge detection shader on the frame buffer. The target texture receives black texels at discontinuities, white otherwise.
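As a CPU stand-in for that shader, a central-difference gradient on the luminance already does the job (the actual shader and its threshold are not shown in the post, so the details here are assumptions):

```python
def edge_mask(rgb, threshold=0.2):
    """Luminance-based edge detection: 0.0 at discontinuities, 1.0
    elsewhere. `rgb` is a nested list of (r, g, b) floats in 0..1."""
    h, w = len(rgb), len(rgb[0])
    lum = [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
           for row in rgb]
    mask = [[1.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = lum[y][x + 1] - lum[y][x - 1]   # horizontal gradient
            gy = lum[y + 1][x] - lum[y - 1][x]   # vertical gradient
            if (gx * gx + gy * gy) ** 0.5 > threshold:
                mask[y][x] = 0.0                 # black texel at the edge
    return mask
```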

After Distance Transform Pass. Brightness turned up to improve visibility

Step 2: Do a Distance Transform (see below) on the edge detection result to get a texture with the distance to the nearest edge in every texel. This is similar to a Signed Distance Field one might pre-compute for font or decal rendering, but computed in real time on the GPU. Apart from the effect presented here, this is an excellent basis for all kinds of other things like outlines, glows, variable transparency, etc.

Step 3: Render a screen-aligned grid of quads with a brush stroke texture. For each stroke, the vertex shader fetches a few samples from the distance map and calculates a tangent. This is used to rotate the quads so that the brush direction is somewhat parallel to the nearest edge, which is a basic technique in real-life painting. Also, the actual distance value is used to scale the quad: The further away from an edge, the bolder the stroke can be to fill the area. The closer to an edge, the finer the strokes are to define the edge. Finally, the color from the frame buffer is used to tint the texture of the brush stroke.
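The per-stroke part of the vertex shader can be sketched like this (a CPU approximation; the function and parameter names are made up, and the scale mapping is my guess):

```python
import math

def stroke_transform(dist, x, y, max_scale=3.0):
    """Orientation and scale for one brush-stroke quad. `dist` holds the
    distance to the nearest edge per texel."""
    # The distance field's gradient points away from the nearest edge;
    # rotating it 90 degrees gives a tangent running along that edge.
    gx = dist[y][x + 1] - dist[y][x - 1]
    gy = dist[y + 1][x] - dist[y - 1][x]
    angle = math.atan2(gx, -gy)               # quad rotation, in radians
    # Far from edges: bold strokes. Near edges: fine strokes.
    scale = min(1.0 + dist[y][x], max_scale)
    return angle, scale
```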

Distance Transform

To get the distance to the nearest edge for every texel, I am using a clever little trick (not mine) called recursive doubling, or rather, recursive halving. It brings the value of every texel to every other texel in just log2(n) passes, where n is the width/height of the texture. This can be used for all kinds of things, like summing areas or counting lengths of rows or columns in the texture.
So, taking the 256×256 texture with the edge detection result, I just bounce it between two render buffers 8 times with this shader, and the result is the distance map. The shader takes 7 samples in different directions, with the sampling distance decreasing at each pass. The sampling distance is added to the sampled value, and the smallest of all samples is written out.
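The ping-pong passes can be sketched on the CPU like this. This is a simplification of the shader described above (it samples the 8 axis-aligned and diagonal neighbours per pass rather than reproducing the exact 7-tap pattern), but it shows the recursive-halving idea: the step starts at n/2 and halves each pass.

```python
def distance_map(edges):
    """Distance to the nearest edge texel, in log2(n) ping-pong passes.
    `edges` is an n x n nested list with 0.0 at edge texels."""
    n = len(edges)
    big = float(2 * n)                       # "infinity" for non-edge texels
    d = [[0.0 if edges[y][x] == 0 else big for x in range(n)]
         for y in range(n)]
    s = n // 2                               # sampling distance, halves each pass
    while s >= 1:
        nxt = [row[:] for row in d]          # write to the other buffer
        for y in range(n):
            for x in range(n):
                for dy in (-s, 0, s):
                    for dx in (-s, 0, s):
                        ny, nx_ = y + dy, x + dx
                        if (dy or dx) and 0 <= ny < n and 0 <= nx_ < n:
                            # sampled value + sampling distance, keep minimum
                            cand = d[ny][nx_] + (dx * dx + dy * dy) ** 0.5
                            if cand < nxt[y][x]:
                                nxt[y][x] = cand
        d = nxt
        s //= 2
    return d
```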

What’s missing

For a water color effect, the paint of each brush stroke should dissolve into the paint already on the “canvas”. This will probably be another post-processing pass, or maybe it can be done in the shader that renders the brush strokes.

In a moving scene, there is a lot of jittering in the brush strokes, because the tangents change a lot, and because of the inherent error in the edge detection and distance transform pass. It may be necessary to stabilize the orientation of the brush stroke quads over a couple of frames, gradually adapting the tangents, to reduce the jitter.