Week 9 Tutorial Solutions

Question 1: Texturing

  1. What is a reason to use texture mapping rather than lots of little polygons? Are the two representations functionally equivalent? What are the differences?

    Textures can be used to give the illusion of complex geometry while keeping the actual geometric complexity low. Rather than using textures, we could just add lots of polygons to the figure to model the extra details, but this would slow down drawing, and it would be a lot of work to figure out the extra points and faces that we want to add.

    They are not functionally equivalent. Textures simulate small features that are not actually present in the geometry, so those features will not respond to lighting the way real geometry would. For example, if we shine light nearly parallel to a given face on a golf ball, one side of each dimple should be light and the other dark, but this won't happen if the dimples are just a texture. Textures also have resolution and aliasing issues.

  2. What is the difference between the following two texture filter settings?
    
        gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_NEAREST);
    
        gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
    

    These are 2 different filter settings for magnification. If we are in a situation where one texel gets mapped to many pixels (like zooming in or enlarging a photo too much) we have 2 choices of filters.

    GL.GL_NEAREST just chooses the texel closest to the texture coordinate value output by the rasteriser. This is known as point sampling and is most subject to visible aliasing.

    GL.GL_LINEAR is a filter that, for 2D textures, performs bilinear interpolation across the 4 nearest texels (2 texels in the S direction and 2 in the T direction) to create a smoother (sometimes blurrier) image. This approach is more expensive to compute.
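    Conceptually, GL_LINEAR takes a weighted average of the four surrounding texels. A minimal sketch of single-channel bilinear interpolation (plain Java, not a JOGL call; the `bilinear` method name is just for illustration):

```java
public class Bilinear {
    // t00..t11 are the four nearest texel values (one colour channel);
    // fs and ft are the fractional texture coordinates within the cell, in [0,1]
    static double bilinear(double t00, double t10, double t01, double t11,
                           double fs, double ft) {
        double bottom = t00 * (1 - fs) + t10 * fs; // interpolate along S (bottom row)
        double top    = t01 * (1 - fs) + t11 * fs; // interpolate along S (top row)
        return bottom * (1 - ft) + top * ft;       // then interpolate along T
    }

    public static void main(String[] args) {
        // exactly between four texels: a plain average of the S neighbours
        System.out.println(bilinear(0, 1, 0, 1, 0.5, 0.5)); // 0.5
    }
}
```

    GL_NEAREST, by contrast, would just return whichever one of the four texels is closest.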

  3. What are Mip Maps? Why are they used?

    They are pre-calculated sequences of textures, each a progressively lower-resolution representation of the same image. The height and width of each image, or level, in the mipmap are half those of the previous level, with each texel in the next level calculated by averaging 4 of the parent texels.

    These can help with minification aliasing problems, which arise when more than one texel maps to a single pixel (like zooming out). Without mipmapping you can use similar filters for minification as for magnification: GL_NEAREST and GL_LINEAR. With mipmapping you can use GL_NEAREST_MIPMAP_NEAREST, which returns the nearest texel on the nearest mipmap level, or GL_LINEAR_MIPMAP_LINEAR, which is trilinear filtering: bilinear filtering is applied on the 2 nearest mipmap levels and the two results are then interpolated. There are also other filter settings.
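    The number of levels in a full mipmap chain follows directly from repeated halving down to 1x1. A small sketch (plain Java; `mipLevels` is an illustrative helper, not a GL call):

```java
public class MipMaps {
    // number of levels in a full mipmap chain, halving until 1x1
    static int mipLevels(int width, int height) {
        int levels = 1;
        while (width > 1 || height > 1) {
            width = Math.max(1, width / 2);
            height = Math.max(1, height / 2);
            levels++;
        }
        return levels;
    }

    public static void main(String[] args) {
        // 256,128,64,32,16,8,4,2,1 -> 9 levels
        System.out.println(mipLevels(256, 256)); // 9
    }
}
```

    The extra levels cost roughly one third more texture memory in total, which is why mipmapping is usually considered a cheap win.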

  4. What do the following settings do?
    float fLargest[] = new float[1];
    gl.glGetFloatv(GL.GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, fLargest, 0);
    gl.glTexParameterf(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAX_ANISOTROPY_EXT, fLargest[0]);
    

    This code is turning on anisotropic filtering. Like trilinear filtering, anisotropic filtering is used to eliminate aliasing effects, but it improves on trilinear filtering by reducing blur and preserving detail at extreme viewing angles. Anisotropic filtering can take a different number of samples in the horizontal and vertical directions depending on the projected shape of the texel, whereas isotropic filtering (trilinear/mipmapping) always samples the same amount in each direction.

    Different degrees or ratios of anisotropic filtering can be applied during rendering and current hardware rendering implementations set an upper bound on this ratio. This code finds out the maximum level of filtering for the current implementation and sets the filter to this max level.

    For best results, combine anisotropic filtering with a GL_LINEAR_MIPMAP_LINEAR minification filter.

Question 2: Texture Co-ordinates

Assuming texture co-ordinates start at 0,0 in the bottom left of the image, define texture co-ordinates for the following situations.
  1. A quad with the following vertices, where you want to map the whole texture to the quad.
        gl.glVertex3d(-1, -1, -1.0);
        gl.glVertex3d(1, -1, -1.0);
        gl.glVertex3d(1, 1, -1.0);
        gl.glVertex3d(-1, 1, -1.0);
    
    
        gl.glTexCoord2d(0, 0.0); gl.glVertex3d(-1, -1, -1.0);
        gl.glTexCoord2d(1, 0.0); gl.glVertex3d(1, -1, -1.0);
        gl.glTexCoord2d(1, 1);   gl.glVertex3d(1, 1, -1.0);
        gl.glTexCoord2d(0, 1.0); gl.glVertex3d(-1, 1, -1.0);
    
    
  2. The same quad, but where you want to map the whole texture to the quad in the y-direction and have 3 repeated copies in the x-direction, assuming we are using GL2.GL_REPEAT mode.
    
        gl.glTexCoord2d(0, 0.0); gl.glVertex3d(-1, -1, -1.0);
        gl.glTexCoord2d(3, 0.0); gl.glVertex3d(1, -1, -1.0);
        gl.glTexCoord2d(3, 1);   gl.glVertex3d(1, 1, -1.0);
        gl.glTexCoord2d(0, 1.0); gl.glVertex3d(-1, 1, -1.0);
    
    
  3. The triangle
        gl.glVertex3d(-10.0, -10.0, 0.0);
        gl.glVertex3d(10.0, -10.0, 0.0);
        gl.glVertex3d(0.0, -5.0, 0.0);
    
        gl.glTexCoord2d(0, 0.0); 
        gl.glVertex3d(-10.0, -10.0, 0.0);
        gl.glTexCoord2d(1, 0.0); 
        gl.glVertex3d(10.0, -10.0, 0.0);
        gl.glTexCoord2d(0.5, 1); 
        gl.glVertex3d(0.0, -5.0, 0.0);
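    The s value of 0.5 for the apex comes from mapping x linearly from [-10, 10] onto [0, 1]. A quick sketch of that mapping (plain Java; `toS` is an illustrative name):

```java
public class TexMap {
    // map a coordinate x in [xMin, xMax] linearly to a texture coordinate in [0, 1]
    static double toS(double x, double xMin, double xMax) {
        return (x - xMin) / (xMax - xMin);
    }

    public static void main(String[] args) {
        System.out.println(toS(0.0, -10.0, 10.0)); // apex at x=0 -> s=0.5
    }
}
```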
    
  4. The sides of a cylinder, where the cylinder contains NUM_SLICES slices, as specified below
     
        gl.glBegin(GL2.GL_QUAD_STRIP);{
        for(int i=0; i<= NUM_SLICES; i++){
            double angle = i*angleIncrement;
                                
            double xPos = Math.cos(angle);
            double yPos = Math.sin(angle);
                           
            float s = i / (float)NUM_SLICES;
    
            gl.glNormal3d(xPos, yPos, 0);
            gl.glTexCoord2f(s,0);              
            gl.glVertex3d(xPos,yPos,zFront);
            gl.glTexCoord2f(s,1);  
            gl.glVertex3d(xPos,yPos,zBack);
    
        }
        }gl.glEnd();
    
  5. The same cylinder but assuming we only want the texture to be like a label covering just one third of the curved surface.

    Scale the s coordinates so they run from 0 to 3 around the cylinder, then stop the texture repeating over the remaining two thirds by clamping (or by setting a border colour). You could also do something similar in the T dimension.

     
    
        //To stop texturing repeating
        gl.glTexParameteri(GL2.GL_TEXTURE_2D, GL2.GL_TEXTURE_WRAP_S, GL2.GL_CLAMP_TO_EDGE);
        //OR
        // float color[] = { 1.0f, 0.0f, 0.0f, 0.1f };
        // gl.glTexParameterfv(GL2.GL_TEXTURE_2D, GL2.GL_TEXTURE_BORDER_COLOR, color,0);
    
        gl.glBegin(GL2.GL_QUAD_STRIP);{
        for(int i=0; i<= NUM_SLICES; i++){
            double angle = i*angleIncrement;
                                
            double xPos = Math.cos(angle);
            double yPos = Math.sin(angle);
                           
            float s = i * 3.0f / NUM_SLICES;
    
            gl.glNormal3d(xPos, yPos, 0);
            gl.glTexCoord2f(s,0);              
            gl.glVertex3d(xPos,yPos,zFront);
            gl.glTexCoord2f(s,1);  
            gl.glVertex3d(xPos,yPos,zBack);
    
        }
        }gl.glEnd();
    

Question 3: Vertex Buffer Objects and Texture Co-ordinates

Suppose you have the following code setting up a VBO for a quad.
private float vertices[] = { -10,0,0,
                              10,0,0,
                              10,10,0,
                              -10,10,0
                           };

private FloatBuffer vertexBuffer = Buffers.newDirectFloatBuffer(vertices);

public void init(GLAutoDrawable drawable) {
        GL2 gl = drawable.getGL().getGL2();
        //do init stuff
        //etc
   
        bufferIds = new int[1];
        gl.glGenBuffers(1,bufferIds,0);

        //bind the buffer before loading data into it
        gl.glBindBuffer(GL.GL_ARRAY_BUFFER,bufferIds[0]);
        gl.glBufferData(GL.GL_ARRAY_BUFFER,vertices.length*4,vertexBuffer,GL2.GL_STATIC_DRAW);
}

public void display(GLAutoDrawable drawable) {
        GL2 gl = drawable.getGL().getGL2();
        //do display stuff
        //etc

       //draw the quad
}
  1. What code would you have to add to the display function to draw the quad?
    
    public void display(GLAutoDrawable drawable) {
            GL2 gl = drawable.getGL().getGL2();
            //do display stuff
            //etc
    
           //draw the quad -setting up the buffer
           gl.glBindBuffer(GL.GL_ARRAY_BUFFER,bufferIds[0]);      
           gl.glEnableClientState(GL2.GL_VERTEX_ARRAY);
           gl.glVertexPointer(3, GL.GL_FLOAT, 0, 0); //3 float coordinates for each vertex
    
           //actually ask for the quad to be drawn
           
           gl.glDrawArrays(GL2.GL_QUADS,0,4);//use vertices starting at index 0 and use 4 of them
    
           //You may want to unset all these states if you are drawing something
           //else next
           gl.glBindBuffer(GL.GL_ARRAY_BUFFER,0);
           gl.glDisableClientState(GL2.GL_VERTEX_ARRAY);
                  
    }
    
  2. What would you have to modify/add to include texture co-ordinates in the VBO?
    private float vertices[] = { -10,0,0,
                                  10,0,0,
                                  10,10,0,
                                  -10,10,0
                               };
    
    //one for each vertex
    private float texCoords[] =
                    {       0,0, 
                            1,0,
                            1,1,
                            0,1
                    };
    
    
    private FloatBuffer vertexBuffer = Buffers.newDirectFloatBuffer(vertices);
    
    private FloatBuffer texBuffer = Buffers.newDirectFloatBuffer(texCoords);
    
    
    public void init(GLAutoDrawable drawable) {
            GL2 gl = drawable.getGL().getGL2();
            //do init stuff
            //etc
       
            bufferIds = new int[1];
            gl.glGenBuffers(1,bufferIds,0);

            //bind the buffer before loading data into it
            gl.glBindBuffer(GL.GL_ARRAY_BUFFER,bufferIds[0]);

            //allocate space for both arrays, then load each part separately
            gl.glBufferData(GL.GL_ARRAY_BUFFER,vertices.length*4+texCoords.length*4,null,GL2.GL_STATIC_DRAW);
            gl.glBufferSubData(GL.GL_ARRAY_BUFFER,0,vertices.length*4,vertexBuffer);
            gl.glBufferSubData(GL.GL_ARRAY_BUFFER,vertices.length*4,texCoords.length*4,texBuffer);
            
    }
    
    public void display(GLAutoDrawable drawable) {
            GL2 gl = drawable.getGL().getGL2();
            //do display stuff
            //etc
    
           //draw the quad -setting up the buffer
           gl.glBindBuffer(GL.GL_ARRAY_BUFFER,bufferIds[0]);      
           gl.glEnableClientState(GL2.GL_VERTEX_ARRAY);
           gl.glVertexPointer(3, GL.GL_FLOAT, 0, 0); //3 float coordinates for each vertex
    
           gl.glEnableClientState(GL2.GL_TEXTURE_COORD_ARRAY);
              //2 float coordinates for each vertex starting at an offset after the vertex data
              gl.glTexCoordPointer(2, GL.GL_FLOAT, 0,vertices.length*4);
           
           //actually ask for the quad to be drawn
           
           gl.glDrawArrays(GL2.GL_QUADS,0,4);//use vertices starting at index 0 and use 4 of them
    
           //You may want to unset all these states if you are drawing something
           //else next
           gl.glBindBuffer(GL.GL_ARRAY_BUFFER,0);
           gl.glDisableClientState(GL2.GL_VERTEX_ARRAY);
           gl.glDisableClientState(GL2.GL_TEXTURE_COORD_ARRAY);
    }
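    The offsets passed to glBufferSubData and glTexCoordPointer are in bytes: each float is 4 bytes, so the texture data starts at vertices.length*4. A sketch of the layout arithmetic (plain Java; no GL calls, helper names are illustrative):

```java
public class VboLayout {
    static final int FLOAT_BYTES = 4;

    // byte offset where the texture coordinates start in the packed buffer
    static int texCoordOffset(int vertexFloats) {
        return vertexFloats * FLOAT_BYTES;
    }

    // total size of the packed buffer in bytes
    static int totalBytes(int vertexFloats, int texFloats) {
        return (vertexFloats + texFloats) * FLOAT_BYTES;
    }

    public static void main(String[] args) {
        // 4 vertices * 3 floats each, 4 tex coords * 2 floats each, as in the quad above
        System.out.println(texCoordOffset(12)); // 48
        System.out.println(totalBytes(12, 8));  // 80
    }
}
```

    An alternative design is to interleave position and texture data per vertex and use the stride argument of glVertexPointer/glTexCoordPointer instead of sub-buffer offsets; packing the arrays one after the other, as here, is simply easier to set up.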
    
    

Question 4: Assignment

If there is any time left, discuss the assignment. Please make sure you ask any questions about the format of the JSON file, and in particular the representation of the terrain, as that is the most important thing in the early stages of the assignment.