In which you allow a fragment to pass light from surfaces behind it to the viewer's eye.

Up till now, you've been rendering opaque surfaces. What makes them opaque is that their colors completely overwrite whatever colors were written to the framebuffer previously. A transparent surface, on the other hand, lets color from the surfaces behind it through. The color that reaches your eye is a mix of many surfaces' colors.

You can render transparent surfaces by enabling blending with this statement:

gl.enable(gl.BLEND);
Enabling this flag isn't enough. You must also draw an object with an opacity less than 1, which means altering the assignment to fragmentColor in the fragment shader. This assignment, for example, renders fragments in a magenta that is 60% opaque:

fragmentColor = vec4(1.0, 0.0, 1.0, 0.6);
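
In context, this assignment lives in a fragment shader that looks something like this sketch. The name fragmentColor comes from the text; the rest is minimal WebGL2 shader boilerplate:

```glsl
#version 300 es
precision mediump float;

out vec4 fragmentColor;

void main() {
  // Magenta at 60% opacity; the alpha component feeds the blend weights.
  fragmentColor = vec4(1.0, 0.0, 1.0, 0.6);
}
```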

Browsers automatically blend the non-opaque pixels of the framebuffer with whatever page content is behind the canvas. If your page has a background color, it will mix into your 3D scene, which is likely not what you want. One way to disable this mixing is to eliminate the alpha channel from your framebuffer in the getContext call:

gl = canvas.getContext('webgl2', {alpha: false});

When blending is disabled, the GPU updates the framebuffer with a statement like this:

framebuffer[r, c] = fragmentColor

With blending enabled, the assigned color is a weighted sum of the fragment color and the color previously written to the framebuffer:

framebuffer[r, c] = newWeight * fragmentColor.rgb +
                    oldWeight * framebuffer[r, c]

You specify the weights using gl.blendFunc. There are many possible weighting schemes, though one is more physically intuitive than others. If a fragment has an opacity of 75%, then that means you will see 75% of its color and 25% of whatever color is behind it. That scheme is applied with this statement:

gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
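
The enabling call and this weight choice go together, so a small helper can set up both. This is only a sketch; the function name is mine:

```javascript
// Enable blending and select the standard "over" weighting, in which
// the fragment's alpha weights the new color and its complement weights
// the color already in the framebuffer. `gl` is a WebGL2 context.
function enableStandardBlending(gl) {
  gl.enable(gl.BLEND);
  gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
}
```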

With gl.SRC_ALPHA as the first parameter, the new weight is the fragment's opacity. With gl.ONE_MINUS_SRC_ALPHA as the second parameter, the old weight is the complement of the fragment's opacity. The framebuffer is then updated with a statement like this:

framebuffer[r, c] = fragmentColor.a * fragmentColor.rgb +
                    (1 - fragmentColor.a) * framebuffer[r, c]
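
As a CPU-side sanity check, this update rule can be written as a small function. The name blendOver is mine; colors here are plain [r, g, b] arrays:

```javascript
// Blend a fragment color (with opacity `alpha`) over the color already
// in the framebuffer, using the SRC_ALPHA / ONE_MINUS_SRC_ALPHA weights.
function blendOver(fragmentRGB, alpha, framebufferRGB) {
  return fragmentRGB.map(
    (channel, i) => alpha * channel + (1 - alpha) * framebufferRGB[i]
  );
}
```

Magenta at 60% opacity over a white background, for example, yields [1.0, 0.4, 1.0]; over black it yields [0.6, 0.0, 0.6].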

Explore how these two triangles blend together using this scheme:

Blending surfaces seems pretty neat, but it suffers from one major drawback. For a transparent surface to reveal the surfaces behind it, those farther surfaces must already have been recorded in the framebuffer. This requires sorting the scene's geometry and rendering it from farthest to nearest.
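
A minimal sketch of that back-to-front ordering might look like this, assuming each object carries a `center` position and the viewer's eye sits at `eye` (both [x, y, z] arrays; the names are mine):

```javascript
// Squared distance between two 3D points; squaring avoids a needless
// square root since only the ordering matters.
function distanceSquared(a, b) {
  const dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
  return dx * dx + dy * dy + dz * dz;
}

// Return a copy of the objects sorted farthest-from-the-eye first, the
// order in which transparent surfaces must be drawn.
function sortBackToFront(objects, eye) {
  return [...objects].sort(
    (p, q) => distanceSquared(q.center, eye) - distanceSquared(p.center, eye)
  );
}
```

Sorting by object center is an approximation; surfaces that interpenetrate or span a large depth range can still blend in the wrong order.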