An introduction to WebGL shaders
When OpenGL or WebGL interacts with a GPU, it passes in data to tell the GPU about the geometry and the textures it needs to render. At this point, the GPU needs to know how to render that geometry and its associated textures into a single 2D image that will be displayed on your computer monitor. The OpenGL Shading Language (GLSL) is a language used with both OpenGL and WebGL to instruct the GPU on how to render that 2D image.
The WebGL rendering pipeline requires us to write two types of shaders to render an image to the screen: the vertex shader, which processes the geometry on a per-vertex basis, and the fragment shader, which renders pixel candidates known as fragments. GLSL looks a lot like the C language, so the code will look somewhat familiar if you work in C or C++.
This introduction to GLSL shaders will not go into a lot of detail; I will discuss WebGL shaders more extensively in the later chapter on 2D lighting. Right now, I only want to introduce the concept and show you a very simple 2D WebGL shader. Here is an example of a simple vertex shader that is used to render quads for a 2D WebGL rendering engine:
precision mediump float;
attribute vec4 a_position;
attribute vec2 a_texcoord;
uniform vec4 u_translate;
varying vec2 v_texcoord;
void main() {
    gl_Position = u_translate + a_position;
    v_texcoord = a_texcoord;
}
This very simple shader takes in the position of a vertex and moves it based on a positional uniform value that's passed into the shader through WebGL. This shader will run on every single vertex in our geometry. In a 2D game, all the geometry is typically rendered as quads (that is, rectangles). Using WebGL in this way allows us to make better use of the computer's GPU. Let me briefly discuss what is going on in the code of this vertex shader.
The first line of this shader sets the floating-point precision:
precision mediump float;
All floating-point operations on a computer are approximations of real fractions. We can approximate 1/3 with low precision using 0.333, and with higher precision using 0.33333333. The precision line of the code indicates the precision of the floating-point values on the GPU. We can use one of three possible precisions: highp, mediump, or lowp. The higher the floating-point precision, the slower the GPU will execute the code, but the higher the accuracy of all its computed values. In general, I have kept this value at mediump, and that has worked well for me. If you have an application that demands performance over precision, you can change this to lowp. If you require high precision, be sure that you know the capabilities of the target GPUs, because not all GPUs support highp.
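As a rough CPU-side illustration of this trade-off, the following JavaScript sketch (an analogy, not GPU code) approximates 1/3 at different precisions; Math.fround shows a related effect by rounding a 64-bit JavaScript number to 32-bit float precision:

```javascript
// Approximating 1/3 at different precisions, as a CPU-side analogy for
// the lowp/mediump/highp trade-off described above.
const oneThird = 1 / 3;

const lowPrecision = Number(oneThird.toFixed(3));   // 0.333
const highPrecision = Number(oneThird.toFixed(8));  // 0.33333333

// Math.fround rounds a 64-bit double to the nearest 32-bit float, so
// some accuracy is lost, just as it is with the lower GPU precisions.
const singlePrecision = Math.fround(oneThird);

console.log(lowPrecision, highPrecision, singlePrecision !== oneThird);
```

Lower precision costs you accuracy in exactly this way, but on a GPU it buys you speed.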
The attribute variables are values that are passed into the pipeline along with the vertex arrays. In our code, these values are the position of each vertex and the texture coordinates associated with it:
attribute vec4 a_position;
attribute vec2 a_texcoord;
A uniform variable is a variable that remains constant across all vertices and fragments. In this vertex shader, we are passing in one uniform vector, u_translate. Typically, you would not want to translate all of your vertices by the same amount unless it is for a camera, but because we are only writing a WebGL program to draw a single sprite, using a uniform variable for the translation will work fine:
uniform vec4 u_translate;
The varying variables (sometimes known as interpolators) are values that are passed from the vertex shader into the fragment shader, with each fragment in the fragment shader getting an interpolated version of that value. In this code, the only varying variable is the texture coordinate for the vertex:
varying vec2 v_texcoord;
In mathematics, an interpolated value is a calculated intermediate value. For example, if we interpolate the halfway point between 0.2 and 1.2, we get a value of 0.7: the starting value of 0.2, plus half of the difference, (1.2 - 0.2) / 2 = 0.5. So, 0.2 + 0.5 = 0.7. Values passed from the vertex shader to the fragment shader using the varying keyword will be interpolated based on the position of each fragment relative to the vertices.
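The calculation above is linear interpolation (often shortened to lerp), and it can be sketched in a few lines of JavaScript; the GPU performs this same kind of calculation, per component, for every varying value it hands to a fragment:

```javascript
// Linear interpolation: returns the value a fraction t of the way from
// start to end. t = 0 yields start, t = 1 yields end, t = 0.5 the midpoint.
function lerp(start, end, t) {
  return start + (end - start) * t;
}

// The halfway point between 0.2 and 1.2, as in the example above:
console.log(lerp(0.2, 1.2, 0.5)); // 0.7
```

A fragment one quarter of the way between two vertices would receive each varying value with t = 0.25 instead.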
Finally, the code executed in the vertex shader is inside the main function. This code adds the translation vector to the position of the vertex to get the world coordinates of the vertex, and places the result in gl_Position. It then copies the texture coordinate attribute that was passed into the vertex shader directly into the varying variable so that it can be passed on to the fragment shader:
void main() {
    gl_Position = u_translate + a_position;
    v_texcoord = a_texcoord;
}
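To make the per-vertex behavior concrete, here is a hedged CPU-side model in JavaScript of what this vertex shader computes (the names mirror the GLSL, but this is only an illustration of the math, not how the GPU actually executes shaders): the same uniform translation is added, component by component, to every vertex position in the quad.

```javascript
// A CPU-side model of the vertex shader above. Each position is a
// vec4 [x, y, z, w]; the uniform translation is added component-wise,
// which is what gl_Position = u_translate + a_position does in GLSL.
function vertexShader(a_position, u_translate) {
  return a_position.map((component, i) => component + u_translate[i]);
}

// The same uniform value applies to every vertex of the quad.
const u_translate = [0.5, -0.25, 0, 0];
const quadPositions = [
  [-1, -1, 0, 1],
  [ 1, -1, 0, 1],
  [-1,  1, 0, 1],
  [ 1,  1, 0, 1],
];

const transformed = quadPositions.map(p => vertexShader(p, u_translate));
console.log(transformed[0]); // [-0.5, -1.25, 0, 1]
```

Notice that the whole quad shifts rigidly: every vertex moves by the same amount, which is exactly why a single uniform suffices here.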
After the vertex shader has run, the pipeline rasterizes the resulting primitives into fragments, and each of those fragments is run through the fragment shader, which receives an interpolated version of every varying variable.
Here is a simple example of a fragment shader:
precision mediump float;
varying vec2 v_texcoord;
uniform sampler2D u_texture;
void main() {
    gl_FragColor = texture2D(u_texture, v_texcoord);
}
Just like in our vertex shader, we start by setting our floating-point precision to mediump. The fragment shader has a uniform sampler2D variable that defines the texture map that's used to generate the 2D sprites in our game:
uniform sampler2D u_texture;
A uniform is a little like a global variable that is passed into the pipeline and applies to every vertex or every fragment in the shader that uses it. The code that's executed in the main function is also straightforward. It takes the interpolated texture coordinate from the v_texcoord varying variable, retrieves the color value from our sampled texture, and then uses that value to set the color of the fragment through gl_FragColor:
void main() {
    gl_FragColor = texture2D(u_texture, v_texcoord);
}
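As a rough illustration of what texture2D does, here is a hedged JavaScript sketch of nearest-neighbor sampling: normalized coordinates in the 0.0 to 1.0 range are mapped to a texel in a tiny texture and that texel's color is returned. (Real GPU sampling also supports filtering, wrapping modes, and mipmaps, none of which are modeled here, and the sampleTexture name is my own invention for this sketch.)

```javascript
// A simplified model of texture2D with nearest-neighbor sampling.
// texture is { width, height, texels }, where texels is a flat array of
// [r, g, b, a] colors in row-major order, each channel in 0.0 - 1.0.
function sampleTexture(texture, u, v) {
  // Clamp the coordinates to [0, 1], then map them to texel indices.
  const clamp01 = t => Math.min(Math.max(t, 0), 1);
  const x = Math.min(Math.floor(clamp01(u) * texture.width), texture.width - 1);
  const y = Math.min(Math.floor(clamp01(v) * texture.height), texture.height - 1);
  return texture.texels[y * texture.width + x];
}

// A 2x2 texture: red, green on the first row; blue, white on the second.
const checker = {
  width: 2,
  height: 2,
  texels: [
    [1, 0, 0, 1], [0, 1, 0, 1],
    [0, 0, 1, 1], [1, 1, 1, 1],
  ],
};

console.log(sampleTexture(checker, 0.1, 0.1)); // [1, 0, 0, 1] (red)
console.log(sampleTexture(checker, 0.9, 0.9)); // [1, 1, 1, 1] (white)
```

Each fragment performs one such lookup with its own interpolated v_texcoord, which is how a single texture ends up stretched across the whole quad.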
Drawing a simple 2D image to the screen using WebGL directly inside of JavaScript requires a lot more code. In the next section, we will write the simplest version of a 2D sprite-rendering WebGL app I can think of, which happens to be a new version of the 2D canvas app we wrote in the previous chapter. I think it is worthwhile to see the differences between the two methods of rendering 2D images to the HTML canvas. Knowing more about WebGL will also help us understand what is going on behind the scenes when we eventually use the SDL API in WebAssembly. I am going to try to keep the demonstration and code as simple as I possibly can while creating the WebGL JavaScript app.
In the next section, we will learn how to draw to the canvas with WebGL.