Shaders can look up values (usually colors) in images. This is how we implement traditional texture mapping.
First, we need to get the image into a uniform variable. More precisely, we need not just the image but all the machinery for reading values from it; for example, we might use a mipmap. The GLSL data type for this is called a `sampler2D`. Setting up a `sampler2D` from the host side is complicated. Fortunately, THREE does it for us: all we need to do is create a `Texture` and assign it to a uniform variable.
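As a rough sketch of what that host-side setup looks like, here is how a `Texture` might be loaded and assigned to a uniform on a `ShaderMaterial`. The file name, uniform name, and the `vertexSource`/`fragmentSource` strings are my own placeholders, not taken from the example files.

```javascript
// Sketch of host-side setup, assuming three.js and a ShaderMaterial.
// 'textures/world.png' and the names below are hypothetical.
const loader = new THREE.TextureLoader();
const worldTexture = loader.load('textures/world.png');

const material = new THREE.ShaderMaterial({
  uniforms: {
    // THREE wraps the image and its sampling machinery in a Texture;
    // assigning it here makes it visible in GLSL as a sampler2D uniform
    worldMap: { value: worldTexture },
  },
  vertexShader: vertexSource,     // GLSL source strings, assumed to exist
  fragmentShader: fragmentSource,
});
```

THREE handles uploading the image to the GPU and binding the texture unit behind the scenes; from our point of view, the `Texture` object simply becomes the value of the uniform.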
Then, in our shader program, we refer to this `sampler2D` uniform using the `texture2D` function, which looks up the value at a given position in the texture.
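A minimal fragment shader illustrating the lookup might look like this. This is my own sketch (GLSL ES 1.0, the WebGL1-style dialect THREE uses by default); the uniform and varying names are assumptions, not taken from the example files.

```glsl
// Sketch of a fragment shader that samples a texture.
uniform sampler2D worldMap;  // the Texture assigned from the host side
varying vec2 vUv;            // texture coordinates passed from the vertex shader

void main() {
  // texture2D takes the sampler and a (u, v) position in [0,1]^2
  // and returns the (filtered) color stored there
  gl_FragColor = texture2D(worldMap, vUv);
}
```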
Here's a simple example.
You can look at `7-1-texture.js` and the shaders (`text.fs`) to see how it works. Notice how the uniform is set up in the JS file and then accessed in the fragment shader.
In this example, I'll make a planet. I'll have a texture with blue (ocean) and green (land) and put it on a sphere.
Then I'll add mountains: I'll use the green brightness as a height and use it to displace the vertices of the sphere. I'll have to do this in the vertex shader.
The interesting part of this is the vertex shader, `world.vs`. Note how I look up values in the texture and use the amount of green to move the vertices in the normal direction.
This is the opposite of a normal map: rather than faking detail through lighting, I am really moving the geometry. It is a displacement map.
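The displacement idea can be sketched as a vertex shader like the following. This is my own reconstruction in the spirit of `world.vs`, not the actual file; `heightScale` is a hypothetical uniform. THREE automatically supplies `position`, `normal`, `uv`, and the matrices to `ShaderMaterial` vertex shaders.

```glsl
// Sketch of a displacement vertex shader (assumed names, not world.vs itself).
uniform sampler2D worldMap;
uniform float heightScale;   // hypothetical: controls mountain height
varying vec2 vUv;

void main() {
  vUv = uv;  // pass the texture coordinates on to the fragment shader
  // read the green channel as a height value
  float h = texture2D(worldMap, uv).g;
  // push the vertex outward along its normal by that height
  vec3 displaced = position + normal * (h * heightScale);
  gl_Position = projectionMatrix * modelViewMatrix * vec4(displaced, 1.0);
}
```

Note that this requires looking up the texture in the vertex shader, which WebGL supports as long as the hardware provides vertex texture units (virtually all modern GPUs do).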
You can also look at `world.fs`, but it is pretty much the same as the prior examples.
Now that we've seen how to use textures in shaders, we have all the main pieces. It's time to make some shaders on the next page.