Thought I'd explain what it is since it may be of interest to some of you. It's a long read but I put a lot of time into writing this, dammit. Read it!
"Shimmering" is a term used by some to describe the speckled, jittery or otherwise shimmery appearance of surfaces in a game.
Shimmering is actually a form of aliasing. No, aliasing does not only mean the stair-stepping effect of polygon edges.
Okay so what is aliasing? In the most generic sense, aliasing is a possible byproduct of any sampled data system. Aliasing happens when the source signal is undersampled by the data system. Unless you've taken an electrical engineering or digital signal processing course, this description may not help. And if you understood it you wouldn't need me to explain this to you. So I will use hopefully more useful examples to illustrate the concept. I will omit certain nuances to make the explanations easier to write without a whole bunch of caveats - those of you who understand sampling theory like the back of your hand will know it when you read it.
Let's consider an audio system example to begin with. Say your source signal is a 10kHz sine wave that you want to record with your digital sound recorder. The digital sound recorder is a sampled data system because it attempts to represent the continuous sine wave as discrete data points (samples) every x seconds. Say the digital sound recorder has a sampling rate of 8kHz (visualize the data points as a dot on the sine wave every 1/8000th of a second). It was proven by smart folks many years ago that in order to record the 10kHz sine wave completely and be able to reconstruct it exactly as before, a minimum sample rate of 20kHz is required (twice the signal frequency). Since your digital sound recorder only operates at a sample rate of 8kHz, you do not meet this requirement, and this is called undersampling. The converse (a digital recorder with a sample rate > 20kHz) is oversampling the signal; oversampling is generally good if you can afford it in a system. So why is undersampling bad? Going back to our audio example, sampling a 10kHz sine wave at an 8kHz sampling rate results in data points that appear to represent a lower frequency 2kHz sine wave (the 10kHz signal "folds" down to |10kHz - 8kHz| = 2kHz)! Clearly this result is wrong.
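If you want to see this folding for yourself, here's a quick Python sketch using the exact numbers from the example above. Once sampled at 8kHz, the 10kHz sine produces data points that are numerically identical to those of a 2kHz sine - the recorder literally cannot tell them apart.

```python
import math

F_SIGNAL = 10_000  # Hz, the source sine wave
F_SAMPLE = 8_000   # Hz, the recorder's sampling rate
F_ALIAS = 2_000    # Hz, where |10kHz - 8kHz| folds down to

def sample(freq_hz, sample_rate_hz, n_samples):
    """Sample a sine wave of the given frequency at discrete time points."""
    return [math.sin(2 * math.pi * freq_hz * n / sample_rate_hz)
            for n in range(n_samples)]

original = sample(F_SIGNAL, F_SAMPLE, 64)
alias = sample(F_ALIAS, F_SAMPLE, 64)

# The two sample sequences are identical: after undersampling, the
# 10kHz wave is indistinguishable from a 2kHz wave.
assert max(abs(a - b) for a, b in zip(original, alias)) < 1e-9
```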
Now we can extrapolate the previous example to video systems, specifically in our case the graphics rendering of games. Let's consider a graphics rendering resolution of 1920x1200 and the resulting LCD display as the sampled data system in this case. In our audio recording example before, the source signal was a simple 10kHz sine wave. What is our source signal in this case? Why, it's the game graphics that need to be displayed on your LCD! What is the source frequency in this case? Potentially infinite. Now you see the problem. Any diagonal polygon edge, for example, always faces aliasing (which is why edges are so connected to the term aliasing in games). Polygon edges are described by straight lines which are defined mathematically; a perfectly sharp edge is effectively an infinite frequency signal. To represent straight lines on your LCD screen, you use pixels (the analog of the digital recorder's data points in the audio case). The rendering resolution, which is at most the LCD's native resolution, samples the infinite frequency polygon edge. Thus your sampled data system is always undersampling the source signal, and the result is an apparently low frequency version of it that appears as a jagged line on your LCD screen (recall that in the audio case this would be the 2kHz sine wave).
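To make the jagged-line point concrete, here's a toy rasterizer in Python (a sketch of the idea, not how a real GPU does it): it point-samples the mathematically exact line y = 0.5x once per pixel column on a deliberately coarse grid, and the stair-stepping falls out immediately.

```python
WIDTH, HEIGHT = 16, 8  # a deliberately coarse "screen"
grid = [['.'] * WIDTH for _ in range(HEIGHT)]

# Point-sample the ideal line y = 0.5*x once per pixel column,
# snapping to the nearest pixel row -- i.e. undersampling the edge.
ys = [int(0.5 * x + 0.5) for x in range(WIDTH)]
for x, y in enumerate(ys):
    if y < HEIGHT:
        grid[y][x] = '#'

# Print with y increasing upward; the '#' pixels form a staircase,
# not the smooth line we started from.
for row in reversed(grid):
    print(''.join(row))
```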
We're getting there now. As mentioned before, polygon edge spatial aliasing is not the only form of aliasing in graphics. Any texture, for example, if distant enough from the player POV, will start to be of higher frequency than the sampling rate of your sampled data system (your 1920x1200 resolution and LCD screen). Taking a page out of wikipedia, here's what undersampling does to a texture (in this case the texture is a photo of a building):
Now imagine that waving, speckly pattern moving around as you move your character in the game. This is what is known as shimmering.
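You can reproduce this effect numerically without any images. The Python sketch below point-samples a fine checkerboard "texture" with far too few pixels; with the numbers chosen here, the 64-squares-per-side pattern collapses into a flat single color, which is just as wrong as the aliased sine wave in the audio example.

```python
def checker(u, v, freq):
    """A checkerboard 'texture': freq x freq alternating squares over [0,1)^2."""
    return (int(u * freq) + int(v * freq)) % 2

def point_sample(size, freq):
    """Point-sample the texture with size x size pixels, no filtering."""
    return [[checker((x + 0.5) / size, (y + 0.5) / size, freq)
             for x in range(size)] for y in range(size)]

# 8x8 pixels cannot capture a 64x64 checkerboard: every sample happens
# to land on a square of the same color, so the pattern aliases to a
# solid block. With 256x256 pixels (4 per square) the pattern survives.
undersampled = point_sample(8, 64)
oversampled = point_sample(256, 64)
```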
Early on in 3D graphics rendering, mip-maps were devised to circumvent texture aliasing. Remember how I said that if textures are placed distant enough from your POV they would start to cause visual aliasing? Well, the mip-map solution is to generate increasingly low resolution versions of that texture and use them at progressively farther distances. Of course, if one messes around with Level-of-Detail (LOD) settings for textures, one can cause textures to appear blurrier (biasing toward the low resolution versions) or start to show shimmering (biasing toward the high resolution versions).
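Generating a mip chain is conceptually simple; here's a minimal Python sketch (real drivers use better downsampling filters than a plain 2x2 box average, so treat this as illustrative only). Each level halves the resolution by averaging, which pre-filters the high frequencies out before distant sampling ever happens.

```python
def next_mip_level(texture):
    """Halve a square texture's resolution by averaging each 2x2 block --
    a minimal box-filter sketch of mip-map generation."""
    n = len(texture) // 2
    return [[(texture[2*y][2*x] + texture[2*y][2*x+1] +
              texture[2*y+1][2*x] + texture[2*y+1][2*x+1]) / 4.0
             for x in range(n)] for y in range(n)]

# A 4x4 checkerboard of 0.0/1.0 texels averages down to uniform gray:
# the high-frequency pattern is filtered away instead of aliasing.
base = [[float((x + y) % 2) for x in range(4)] for y in range(4)]
mip1 = next_mip_level(base)   # 2x2, all 0.5
mip2 = next_mip_level(mip1)   # 1x1, still 0.5
```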
With the advent of anisotropic filtering, we welcomed much sharper textures than afforded by old techniques like bilinear filtering. This is actually a byproduct of improved accuracy in representing distant textures. But the downside is that if anisotropic filtering is applied too aggressively without any control, the result is again shimmering, because you are trying to push source frequencies beyond the capabilities of the sampled data system of the graphics resolution and LCD screen.
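The LOD knob mentioned above can be sketched in a few lines of Python (the function name and the exact selection formula are my own illustration, not any real driver's code): the mip level is normally picked from how many texels fall under one pixel, and a negative bias forces a sharper level than the sampling rate can support - which is exactly when shimmering appears.

```python
import math

def select_mip_level(texels_per_pixel, lod_bias=0.0, max_level=10):
    """Pick a mip level from the texture-to-screen scale factor.
    Level 0 is full resolution; each level halves the resolution.
    A negative lod_bias pushes selection toward sharper levels,
    undersampling the texture and risking shimmering."""
    level = math.log2(max(texels_per_pixel, 1.0)) + lod_bias
    return min(max(level, 0.0), float(max_level))

# A distant surface where 8 texels map to one pixel normally uses
# level 3; a -2 bias forces level 1, undersampling the texture by 4x.
assert select_mip_level(8.0) == 3.0
assert select_mip_level(8.0, lod_bias=-2.0) == 1.0
```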
So is shimmering bad? In the strictest sense yes, it is undesirable because it falsely represents the source signal. Part of the rendering goal for a graphics card is to render the scene on the hairy edge of aliasing (ie on the verge of shimmering), because that is when you get the greatest perceptual sharpness. Step even a little above that limit and you start to get artifacts like shimmering; stop too short of it and the image appears blurrier than it has to be. Depending on the algorithm used to produce the result, there may be performance lost or gained (ie lower or higher framerate).