OpenGL 4 Tutorial 3: Vertex Buffers
Anton Gerdelan
April 16, 2012
1 VBOs and VAOs
We are using a modern hardware pipeline. This means that almost all operations now happen on the GPU. The
process is:
1. copy points, normals, and texture coordinates into a buffer called a Vertex Buffer Object (VBO)
2. describe how many variables are in the VBO and how big each one is in memory with a Vertex Array Object
(VAO)
3. copy VBO and VAO to graphics hardware
The VBO is just the big collection of unassembled pieces (like IKEA furniture), and the VAO is the instructions.
We no longer need to deal with the VBO directly - it is just sitting there in graphics memory. When we
draw, we first bind the VAO’s index (“use these instructions”), and OpenGL knows how to pull the variables
out of the VBO and give them to the vertex shader.
2 Using a Maths Library
Later we will load our vertices from a file, but for now we can create them in our main programme. If you have
used an earlier version of OpenGL, you will notice that there are no longer any maths data types or functions in
OpenGL 4. We regularly use these data types:
• 2D vector
• 3D vector
• 4D vector
• 3D matrix
• 4D matrix
It would be okay to use float varA[2], float varB[3], and so on, but we will often use some vector and
matrix maths functions as well:
• dot product of 2 vectors
• cross product of 2 vectors
• multiply vector and matrix (3D and 4D)
• multiply matrices (3D and 4D)
• normalise vector
• get vector length
• create identity matrix
It would also be convenient if we still had some of the matrix manipulation functions that existed in the older
OpenGL as well:
• lookAt() to create a view matrix for a virtual camera
• perspective() to create a perspective projection matrix, which distorts vertices from an orthographic view into a perspective one
• inverse and transpose matrix functions
• create axis-angle rotation matrices
• create yaw,pitch,roll rotation matrices
• quaternion (Hamiltonian) rotation
If you built your own maths library supporting most of these functions and data types in the previous lab,
then you can use that. You might also consider using the GLM library, which is a small maths library that is
designed to look like the data types from GLSL (vec3, mat4, etc.), as well as replicating the older OpenGL maths
functionality.
http://glm.g-truc.net/
To add GLM to your project, unzip it into your project folder. You then just need to add the “glm/src” folder to
your C++ include directories in Project Properties. The whole library is built into header files, so you don’t need
to add anything to the linker paths.
To include glm support for most data types:
#include <glm/glm.hpp>
These are still in a “glm” name-space, so if you like you can add
using namespace glm;
but I prefer to specify the name-space every time, out of a mostly paranoid fear that another library will use a
data-type called “vec3”.
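For example, here is a minimal sketch of the operations listed above using GLM. Note that lookAt() and perspective() live in a separate header, and whether perspective() expects degrees or radians depends on your GLM version, so check your own copy:
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp> // glm::lookAt, glm::perspective
glm::vec3 a(1.0f, 0.0f, 0.0f);
glm::vec3 b(0.0f, 1.0f, 0.0f);
float d = glm::dot(a, b); // dot product of 2 vectors
glm::vec3 c = glm::cross(a, b); // cross product of 2 vectors
glm::vec3 n = glm::normalize(glm::vec3(3.0f, 4.0f, 0.0f)); // normalise vector
float len = glm::length(n); // get vector length
glm::mat4 identity = glm::mat4(1.0f); // create identity matrix
// view matrix for a virtual camera at (0,0,2), looking at the origin, with +y up
glm::mat4 view = glm::lookAt(glm::vec3(0.0f, 0.0f, 2.0f),
glm::vec3(0.0f, 0.0f, 0.0f), glm::vec3(0.0f, 1.0f, 0.0f));
// perspective projection: field of view, aspect ratio, near and far planes
// (degrees in older GLM versions, radians in newer ones)
glm::mat4 projection = glm::perspective(67.0f, 800.0f / 600.0f, 0.1f, 100.0f);
glm::vec4 transformed = projection * view * glm::vec4(n, 1.0f); // multiply vector and matrix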
3 Create Some Points
So, let’s make a temporary buffer of 3D points that we want to draw with.
Create 3 points that will form a triangle. Give them x, y, and z values between −1 and 1. Make sure that you add
them in counter clock-wise order (remember we defined “front” to mean counter clock-wise, and enabled back-face
culling). You might find a C++ STL vector useful for dynamic storage.
http://www.cplusplus.com/reference/stl/vector/.
int numberOfPoints = 0;
std::vector<glm::vec3> points; // requires #include <vector>
glm::vec3 pointA;
pointA = glm::vec3(-0.5f, 0.5f, 0.0f);
points.push_back(pointA);
/*
... and so on ...
*/
numberOfPoints = points.size();
Make the number of points a global variable, because we will need this in one of our callbacks later.
Get out a pen and paper and draw the shape that you think will appear on the screen. I always keep
paper at hand for drawing; it helps with understanding geometric problems in 3D.
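If you want to check your drawing afterwards, here is one possible set of three points, given in counter clock-wise order when viewed from the front:
// one possible triangle, counter clock-wise when viewed from the front
points.push_back(glm::vec3( 0.0f,  0.5f, 0.0f)); // top
points.push_back(glm::vec3(-0.5f, -0.5f, 0.0f)); // bottom-left
points.push_back(glm::vec3( 0.5f, -0.5f, 0.0f)); // bottom-right
numberOfPoints = points.size();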
4 Create a VBO
Ask OpenGL to create 1 new VBO.
GLuint vboHandle;
glGenBuffers(1, &vboHandle);
If we had normals and texture coordinates as well then the easiest solution is to create 2 additional VBOs. It is
also possible to pack them all into one VBO and define the stride in the VAO. The stride describes how
much memory lies between each variable in the VBO.
We can now bind (activate) the VBO and put data into it. Only 1 buffer can be bound at a time in GL, so we
need to do this whenever we use it:
glBindBuffer(GL_ARRAY_BUFFER, vboHandle);
glBufferData(GL_ARRAY_BUFFER, numberOfPoints * sizeof(glm::vec3),
&points[0], GL_STATIC_DRAW);
Here I have told GL the type of buffer (array), how many bytes of memory to copy into the buffer, the address in
memory to start the copy from, and the usage type.
http://www.opengl.org/sdk/docs/man/xhtml/glBufferData.xml.
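As a sketch of the packed alternative mentioned above, assuming a hypothetical normals array that matches points one-to-one, interleaving positions and normals in a single VBO could look like this. The stride and offsets are then given to the VAO in the next section:
// hypothetical interleaved layout: x,y,z, nx,ny,nz for each vertex
std::vector<glm::vec3> packed;
for (int i = 0; i < numberOfPoints; i++) {
packed.push_back(points[i]); // position x, y, z
packed.push_back(normals[i]); // normal nx, ny, nz (hypothetical array)
}
glBindBuffer(GL_ARRAY_BUFFER, vboHandle);
glBufferData(GL_ARRAY_BUFFER, packed.size() * sizeof(glm::vec3),
&packed[0], GL_STATIC_DRAW);
// in the VAO, both variables then use a stride of 2 * sizeof(glm::vec3) bytes,
// and the normal variable starts sizeof(glm::vec3) bytes into each vertex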
5 Create the VAO
When we render our VBO it goes to the shader pipeline as one huge block of data. The VAO tells the shader
programmes where to extract the data for each variable from this block. We bind one, and only one, VAO
before each shader programme is used. If we are using several VBOs together (e.g. one for normals, one for
points, one for texture coordinates) then we get our VAO to point to all of these.
GLuint vaoHandle;
glGenVertexArrays(1, &vaoHandle);
Once we bind our new VAO we can enable a new variable, and then describe how to extract it from the VBO.
First, bind our new VAO:
glBindVertexArray(vaoHandle);
Now we tell the VAO about our position variable. We tell it to create a new variable at location 0, because each
variable in a shader has a layout number.
int variableCount = 0;
// "we have a variable, make it variable location 0"
glEnableVertexAttribArray(variableCount);
// "the variable is in this VBO"
glBindBuffer(GL_ARRAY_BUFFER, vboHandle);
// "the variable starts at address 0 in the VBO, and each one has 3 floats"
glVertexAttribPointer(variableCount, 3, GL_FLOAT, GL_FALSE, 0, (GLubyte *)NULL);
variableCount++;
Go to the documentation and look at what each function’s parameters expect.
Later we will add new variables to our VAO. When we do so, we just need to repeat this last block of code, but
the layout number given to glEnableVertexAttribArray should change, the VBO handle used will change, and
the size and type of variable in glVertexAttribPointer will change.
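For example, a second variable for a hypothetical normals VBO (created and filled in the same way as in Section 4) might look like this:
// "we have another variable, at the next free location"
glEnableVertexAttribArray(variableCount);
// "this variable is in the normals VBO"
glBindBuffer(GL_ARRAY_BUFFER, vboNormalsHandle); // hypothetical second VBO
// "it also starts at address 0 in its VBO, and each one has 3 floats"
glVertexAttribPointer(variableCount, 3, GL_FLOAT, GL_FALSE, 0, (GLubyte *)NULL);
variableCount++;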
6 Rendering
Now we have successfully sent our data to sit on the graphics card. OpenGL gives us a GLuint (just a regular
unsigned int) as a way of referring to the VAO later - for now let’s make it a global variable. The rendering process
works like this:
1. glUseProgram() the shader programme used to render our object
2. glBindVertexArray() to bind our object’s VAO
3. glDrawArrays() to draw the currently bound VAO and specify rendering style
Note that we don’t need to do anything with the VBO because the VAO already has all of that information.
If we assume that we already have a compiled shader programme, it would be convenient to make that a global
variable as well. We can then write the following code inside our rendering callback:
// activate our shader
myShaderProgramme.use();
// bind our VAO
glBindVertexArray(vaoHandle);
// draw using the shader and vao
glDrawArrays(GL_TRIANGLES, 0, numberOfPoints);
The last function tells GL to draw triangles using every 3 points, starting at point index 0, and continuing
until numberOfPoints. Have a look here to see what other drawing modes are available:
http://www.opengl.org/sdk/docs/man/xhtml/glDrawArrays.xml
Don’t forget to call the function from your shader programme class that updates your uniform variable! This can
go anywhere - perhaps you can set it once after the shader is loaded, then change the colour when a certain key
is pressed?
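As a sketch, assuming the ShaderProgramme class from the previous lab has a setUniform() function taking a uniform name and a vec4 (your actual interface may well differ):
// hypothetical call; the exact signature depends on your own class
myShaderProgramme.use();
myShaderProgramme.setUniform("inputColour", glm::vec4(1.0f, 0.0f, 0.0f, 1.0f));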
That’s it! Does the render window show a shape resembling the one that you drew on paper?
7 Problem?
1. I don’t see my shape at all! Did you draw your shape on paper? Do the points go in clock-wise or counter
clock-wise order?
2. I can see my shape, but it is white! Are you calling your ShaderProgramme’s setUniform() function?
Are you giving it sensible colour values - between 0 and 1?
3. My shaders are being compiled but I still only see a white shape - Did you forget to link your
shader programme? Perhaps add a “wasLinked” boolean to do some validation at the top of functions.
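A minimal sketch of that idea, assuming you are free to add members to your own ShaderProgramme class:
// inside a hypothetical ShaderProgramme class
bool wasLinked; // only set to true after glLinkProgram() succeeds
void ShaderProgramme::use() {
assert(wasLinked && "shader programme was never linked!"); // #include <cassert>
glUseProgram(programmeHandle); // hypothetical member holding the GL handle
}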
8 Continuing On
We now have enough of a framework to get started with OpenGL shader programming. I suggest following David
Wolff’s OpenGL 4.0 Shader Programming Cookbook, which you can find through the library’s Safari Books
Online portal:
http://miman.bib.bth.se/login?url=http://proquest.safaribooksonline.com/?uicode=bth
To solve the assignments, you will now need to learn GLSL. The best way to do that is to go over
examples from the book. Most problems involve transforming vectors between spaces with matrices, and getting
angles between vectors using the dot() function.
Always solve your problems on paper first (with a diagram) before you start programming.
9 Most Common 3D Programming Bugs
• Truncating a vec4 to a vec3. Use the .xyz “swizzle” operator to get a vec3 from a vec4.
• Using a vec3 when a vec4 is required. Use = vec4(myVector, 1) or = vec4(myVector, 0).
• Incorrect layout size in the VAO. Check the byte size and component count.
• Missing vertex attribute pointer in the VAO. Did you add a new VBO variable and forget to add a VAO pointer?
• Mixing vectors from two different spaces. Always suffix variable names with their space.
• Direction vec4 with the w component set to 1.
• Position vec4 with the w component set to 0.
To find bugs in GL try running with the -gldebug command-line debug argument (see freeglut documentation).
The easiest way to find bugs in GLSL is to check each variable’s value, step by step, by returning it as a fragment
shader colour. Fragment colours are vec4 with component values between 0 and 1, so you will need to do some
adjustment, but you can then visualise your problem.
10 Debugging Shaders
When writing shader programmes, you can no longer use your C++ debugger to watch what is happening during
execution. You also can’t print information to a terminal. This can make programming shaders a bit of a “black
box”. Basically this means you need to change your programming style to the old-fashioned “try one line at a
time” approach, or even a “try this half first, then the other half” binary-search style of debugging.
That is a pain. The best approach is to do isolated tests:
1. set all other variables to some constant value
2. allow one variable to change
3. work out the expected result on paper
4. check the value of the variable being tested to see if it matches expected (paper) result
5. repeat process for next variable
The main problem is: how can we check the value of each variable during run-time?
10.1 Debugging Errors From Compilation
The first thing to check is any shader compilation or linking errors. What line number is the error on? What
variable or function is causing the error?
10.2 Debugging Shaders During Run-Time
Best case scenario - we are working in the fragment shader or can send our variable to the fragment shader. This
means that we can create an rgba colour value, and use the values from the tested variable as colour components.
Examples:
float variable = testVariableThatIsAFloat; // get the variable
// let’s say we know that this particular variable can be between 0 and 1024
float rangeOfValues = 1024.0;
// convert to range 0 to 1 (sensible colour values)
variable = variable / rangeOfValues;
vec4 testColour = vec4(variable, 0.0, 0.0, 1.0); // the red channel will give the value of the variable
return testColour; // use to colour all fragments
We can see that we need to do some conversion to get the value into a colour range. But what if the values can
be negative? We could take the absolute value, but then we lose the sign information, which might be useful. We
could just leave it negative, then it will show as black, but then we only have the sign information, and not the
magnitude of the negative value. What about:
float negativeVariable = variable * -1.0;
vec4 testColour = vec4(variable, negativeVariable, 0.0, 1.0);
Now our value will show as red if positive, and green if negative.
Vectors are easy to convert to colours, but we need to be careful to put a ‘1’ in the alpha component or blending
options might do weird things. Here we return an rgba colour from an xyz vec3. The built-in constructors are
usually pretty flexible like this.
vec3 variable = testFromVec3;
return vec4(variable, 1.0);
Here I just use xyz from the vec4 variable and add a 1 for the alpha manually. This .xyz thing is called the
swizzle operator and is just a short-cut.
Figure 1: The left image is surface normal vectors (used in lighting calculations) returned as colour values. If
the normal points up it is green (because ‘g’ corresponds to ‘y’), forward is blue (z), and right is red (x). You
can see some red on the edges of the front face, which means my normals are incorrect! The right image is
the result of a dot product calculation (remember the dot product returns a scalar) used for all colour channels;
vec4(value, value, value, 1), so we get a greyscale image. I used this to check how big the spot created by
my specular lighting calculation was.
vec4 variable = testFromVec4;
return vec4(variable.xyz, 1.0);
We still have the same problem for displaying negative values, and ranges of values greater than 1, so if we need
that information we might need a few more steps:
float rangeOfValues = 150.0; // we know the vector can go from 0 to 150
vec4 variable = testFromVec4 / rangeOfValues; // put in range 0 to 1
return vec4(variable.xyz, 1.0);
We don’t have any extra colour channels to show negative values, so I would run the test a second time, multiplying
by -1, if I wanted to check the values held in black (negative or zero) regions.
Generally speaking, this approach is a very quick way to visually debug any kind of directional vector, because xyz
mapped to rgb means that vector components pointing right are red, up are green, and back are blue. It is very
easy to spot small errors in normals or calculations as they vary across fragments.
11 Self-Quiz
Make sure that you have read enough theory to answer all of these questions early in the course:
1. Which shader is responsible for moving my objects?
2. Which matrix is used to change between orthographic and perspective scenes?
3. What steps and matrices do you need to rotate one object around another object?
4. What is the job of the fragment shader?
5. Which shader can I use if I want to delete vertices?
6. If I have a light at world position 0, 0, 100 and a camera at location 0, 100, 0 pointing at 0, 0, 0, what can we
expect the position of the light to be in eye space?
7. Why do 4D positions need a 1 in the w component? Hint: something to do with 4D matrices.
8. If we render a triangle covering half of the screen space, how many vertex shaders will run? How many
fragment shaders will run?
9. Consider the triangle in Figure 2. At each vertex we have a normal. Along one edge of the triangle the first
vertex has a normal of value 0, 0, 1, the second vertex has 0, 1, 0. If we output our normals from the vertex
shader to the fragment shader, what values do we expect to get in the fragment shader for fragments along
this edge of the triangle?
10. Consider the same rendered triangle. The vertices are stored in a vertex buffer in the order: top vertex,
bottom-right vertex, bottom-left vertex. We have enabled back-face culling and front faces have counter clock-wise winding. Is the triangle visible or invisible?
Figure 2: A triangle on screen is created between 3 vertices. Fragments (pixels) are shown by the grid. Normals
are stored in the vertex shader and interpolated to the fragment shaders. Fragments along one edge of the triangle
are highlighted.