OpenGL ES Tutorial VI: Textures

http://www.educity.cn/wenda/92368.html


  OpenGL ES Tutorial for Android – Part VI – Textures

  December 30th, 2010 by Per-Erik Bergman — Android, Embedded, Java

  In the last tutorial we worked a bit more on meshes, and we also talked about adding colors to our mesh. The most common way of adding color to your mesh is to add a texture. There are a couple of different steps involved in adding a texture to a mesh; I will try to go through them all and explain the basics.


  Loading bitmaps

  The first step would be to get a bitmap to generate a texture from. You can get hold of a bitmap in many different ways: downloading it, generating it, or simply loading one from the resources. I'm going with the simplest one for this example, which is loading from the resources.


  Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(),
          R.drawable.icon);

  One other thing about textures is that some hardware requires the height and width to be powers of 2 (1, 2, 4, 8, 16, 32, 64...). If you use a texture with a size of 30x30 pixels on hardware that doesn't support it, you will just get a white square (unless you change the default color).

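A quick way to guard against this is to check the bitmap's dimensions before uploading, and scale the bitmap up to the next power of two if needed. A minimal sketch (the helper names here are my own, not from the original code):

```java
public class TextureSize {
    /** Returns true if n is a power of two (1, 2, 4, 8, ...). */
    public static boolean isPowerOfTwo(int n) {
        return n > 0 && (n & (n - 1)) == 0;
    }

    /** Returns the smallest power of two that is >= n. */
    public static int nextPowerOfTwo(int n) {
        int p = 1;
        while (p < n) {
            p <<= 1;
        }
        return p;
    }
}
```

On Android you could then call something like Bitmap.createScaledBitmap(bitmap, nextPowerOfTwo(w), nextPowerOfTwo(h), true) before generating the texture, so a 30x30 bitmap becomes 32x32.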

  Generating a texture

  After we have loaded the bitmap we need to tell OpenGL to actually create the texture.


  The first thing we need to do is to let OpenGL generate some texture ids that we will use as handles to the textures later on. In this example we will only have one texture.


  // Create an int array with the number of textures we want,

  // in this case 1.

  int[] textures = new int[1];

  // Tell OpenGL to generate textures.

  gl.glGenTextures(1, textures, 0);

  With the same parameters you can delete the textures:

  // Delete a texture.

  gl.glDeleteTextures(1, textures, 0);

  Now that the texture ids are generated we need, just like with everything else, to tell OpenGL what to work with. With textures we use the command glBindTexture:


  gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);

  From this point on, all texture commands we call will be applied to the texture with the generated id.


  glTexParameter

  There are a couple of parameters we need to set on the texture. The first one tells OpenGL what to do if the texture needs to be shrunk or magnified to match the rendered image.

If the texture is smaller it needs to be magnified; that is done with the magnification filter:


  // Scale up if the texture is smaller.

  gl.glTexParameterf(GL10.GL_TEXTURE_2D,

  GL10.GL_TEXTURE_MAG_FILTER,

  GL10.GL_LINEAR);

  And this is how to scale if the texture needs to be scaled down, using the minification filter:


  // Scale linearly when the image is smaller than the texture.

  gl.glTexParameterf(GL10.GL_TEXTURE_2D,

  GL10.GL_TEXTURE_MIN_FILTER,

  GL10.GL_LINEAR);

  You need to pass an argument to these functions. I'm only going to show you two of them; the rest you can investigate yourself.


  If you want a crisp and clean rendering like this image you need to use the GL10.GL_NEAREST parameter.


  

  If you instead want a blurred image you should use the GL10.GL_LINEAR parameter.


  

  UV Mapping

  We will also need to tell OpenGL how to map this image onto the mesh. This is done in two steps: first we need to assign UV coordinates.


  UV mapping is the way we map the pixels of the bitmap to the vertices of our mesh. The UV coordinates are (0,0) in the upper left and (1,1) in the bottom right, like the left image below. The right image below illustrates how our plane is built. To get the texture mapped correctly we need to map the lower left part of the texture (0,1) to the lower left vertex (0) in our plane, map the bottom right (1,1) of the texture to the bottom right vertex (1) of our plane, and... you get the idea.


  (Translator's note: classic OpenGL tutorials say the image's lower-left corner is (0,0), but here with Android's OpenGL ES the upper left is (0,0); perhaps Android's wrapper changed this slightly.)

  We put this mapping into a float array like this:


  float textureCoordinates[] = {0.0f, 1.0f,

  1.0f, 1.0f,

  0.0f, 0.0f,

  1.0f, 0.0f };

  

  If we used 0.5 instead of 1.0, like this:

  float textureCoordinates[] = {0.0f, 0.5f,

  0.5f, 0.5f,

  0.0f, 0.0f,

  0.5f, 0.0f };

  
the texture will be mapped so that the plane shows only the upper left part of it.


  

  Back to glTexParameterf: if we go the other way and use values higher than 1.0, like this:


  float textureCoordinates[] = {0.0f, 2.0f,

  2.0f, 2.0f,

  0.0f, 0.0f,

  2.0f, 0.0f };

  We are actually telling OpenGL to use a part of the texture that does not exist, so we need to tell OpenGL what to do with that missing part.


  We use the glTexParameterf function to tell OpenGL what to do with the texture. By default OpenGL uses something called GL_REPEAT.


  GL_REPEAT means that OpenGL should repeat the texture beyond 1.0.

GL_CLAMP_TO_EDGE means that OpenGL will draw the image only once and after that just repeat the last line of pixels for the rest of the surface.

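The effect of the two wrap modes on a coordinate outside [0, 1] can be sketched in plain Java. This is just the per-axis math the wrap modes conceptually apply (ignoring the half-texel clamping real hardware uses), not actual GL calls:

```java
public class WrapModes {
    /** GL_REPEAT: only the fractional part of the coordinate is used. */
    public static float repeat(float s) {
        return s - (float) Math.floor(s);
    }

    /** GL_CLAMP_TO_EDGE: the coordinate is clamped into [0, 1],
     *  so everything beyond 1.0 samples the last line of pixels. */
    public static float clampToEdge(float s) {
        return Math.min(1.0f, Math.max(0.0f, s));
    }
}
```

So with a coordinate of 1.25, GL_REPEAT samples at 0.25 (the texture tiles), while GL_CLAMP_TO_EDGE samples at 1.0 (the edge pixel smears).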

  Since we are working with a 2D texture, we need to tell OpenGL what to do in two directions: GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T.


  Below you see a chart with the 4 combinations of GL_REPEAT and GL_CLAMP_TO_EDGE.

  (Translator's note: in the original post, the image shown for the third combination is wrong.)

  

WRAP_S: GL_REPEAT
WRAP_T: GL_REPEAT
WRAP_S: GL_REPEAT
WRAP_T: GL_CLAMP_TO_EDGE
WRAP_S: GL_CLAMP_TO_EDGE
WRAP_T: GL_REPEAT
WRAP_S: GL_CLAMP_TO_EDGE
WRAP_T: GL_CLAMP_TO_EDGE

  This is how we use the glTexParameterf function:

  gl.glTexParameterf(GL10.GL_TEXTURE_2D,

  GL10.GL_TEXTURE_WRAP_S,

  GL10.GL_REPEAT);

  gl.glTexParameterf(GL10.GL_TEXTURE_2D,

  GL10.GL_TEXTURE_WRAP_T,

  GL10.GL_REPEAT);

  The last thing we need to do is to bind the bitmap we loaded to the texture id we created.

  GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);

  Using the texture

  To be able to use the texture we need, just like with everything else, to create a buffer with the UV coordinates:


  ByteBuffer byteBuf = ByteBuffer.allocateDirect(textureCoordinates.length * 4);

  byteBuf.order(ByteOrder.nativeOrder());

  textureBuffer = byteBuf.asFloatBuffer();

  textureBuffer.put(textureCoordinates);

  textureBuffer.position(0);
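Since this byte-buffer dance is repeated for vertices, colors, and UVs, it is handy to wrap it in a small helper. This convenience method is my own addition, not part of the original tutorial code:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferUtils {
    /** Wraps a float array in a direct, native-order FloatBuffer,
     *  rewound to position 0 so OpenGL reads it from the start. */
    public static FloatBuffer makeFloatBuffer(float[] values) {
        ByteBuffer bb = ByteBuffer.allocateDirect(values.length * 4);
        bb.order(ByteOrder.nativeOrder());
        FloatBuffer fb = bb.asFloatBuffer();
        fb.put(values);
        fb.position(0);
        return fb;
    }
}
```

With this, the snippet above collapses to textureBuffer = BufferUtils.makeFloatBuffer(textureCoordinates);.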

  Rendering

  // Telling OpenGL to enable textures.

  gl.glEnable(GL10.GL_TEXTURE_2D);

  // Tell OpenGL where our texture is located.

  gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);

  // Tell OpenGL to enable the use of UV coordinates.

  gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

  // Telling OpenGL where our UV coordinates are.

  gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);

  // ... here goes the rendering of the mesh ...

  // Disable the use of UV coordinates.

  gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

  // Disable the use of textures.

  gl.glDisable(GL10.GL_TEXTURE_2D);

  Putting it all together

  I'm using a modified version of the code from the previous tutorial. The difference is mostly that I renamed some variables and functions, added more comments, and put all the code under the Apache License. To make the code easier to understand I removed the previous plane and added a new, simpler one called SimplePlane.

  Updating the Mesh class

  The first thing we need to do is to update the Mesh class (se.jaywash.Mesh). We need to add the functionality to load and render a texture.

  We need to be able to set and store the UV coordinates.

  // Our UV texture buffer.

  private FloatBuffer mTextureBuffer;

  /**

   * Set the texture coordinates.

   *

   * @param textureCoords

   */

  protected void setTextureCoordinates(float[] textureCoords) {

  // A float is 4 bytes, therefore we multiply the number of
  // coordinates by 4.

  ByteBuffer byteBuf = ByteBuffer.allocateDirect(

  textureCoords.length * 4);

  byteBuf.order(ByteOrder.nativeOrder());

  mTextureBuffer = byteBuf.asFloatBuffer();

  mTextureBuffer.put(textureCoords);

  mTextureBuffer.position(0);

  }

  We also need to add functions to set the bitmap and create the texture.

  // Our texture id.

  private int mTextureId = -1;

  // The bitmap we want to load as a texture.

  private Bitmap mBitmap;

  /**

   * Set the bitmap to load into a texture.

   *

   * @param bitmap

   */

  public void loadBitmap(Bitmap bitmap) {

  this.mBitmap = bitmap;

  mShouldLoadTexture = true;

  }

  /**

   * Loads the texture.

   *

   * @param gl

   */

  private void loadGLTexture(GL10 gl) {

  // Generate one texture pointer...

  int[] textures = new int[1];

  gl.glGenTextures(1, textures, 0);

  mTextureId = textures[0];

  // ...and bind it to our array

  gl.glBindTexture(GL10.GL_TEXTURE_2D, mTextureId);

  // Create Nearest Filtered Texture

  gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER,

  GL10.GL_LINEAR);

  gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER,

  GL10.GL_LINEAR);

  // Different possible texture parameters, e.g. GL10.GL_CLAMP_TO_EDGE

  gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S,

  GL10.GL_CLAMP_TO_EDGE);

  gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T,

  GL10.GL_REPEAT);

  // Use the Android GLUtils to specify a two-dimensional texture image

  // from our bitmap

  GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, mBitmap, 0);

  }

  And finally we need to add the call to load the texture and actually tell OpenGL to render with it. I removed some code so the page would not be too long, but you will find the complete code in the attached zip file.

  // Indicates if we need to load the texture.

  private boolean mShouldLoadTexture = false;

  /**

   * Render the mesh.

   *

   * @param gl

   *      the OpenGL context to render to.

   */

  public void draw(GL10 gl) {

  ...

  // Smooth color

  if (mColorBuffer != null) {

  // Enable the color array buffer to be used during rendering.

  gl.glEnableClientState(GL10.GL_COLOR_ARRAY);

  gl.glColorPointer(4, GL10.GL_FLOAT, 0, mColorBuffer);

  }

  if (mShouldLoadTexture) {

  loadGLTexture(gl);

  mShouldLoadTexture = false;

  }

  if (mTextureId != -1 && mTextureBuffer != null) {

  gl.glEnable(GL10.GL_TEXTURE_2D);

  // Enable the texture state

  gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

  // Point to our buffers

  gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, mTextureBuffer);

  gl.glBindTexture(GL10.GL_TEXTURE_2D, mTextureId);

  }

  gl.glTranslatef(x, y, z);

  ...

  // Draw the vertices as triangles, using the index buffer.

  gl.glDrawElements(GL10.GL_TRIANGLES, mNumOfIndices,

  GL10.GL_UNSIGNED_SHORT, mIndicesBuffer);

  ...

  if (mTextureId != -1 && mTextureBuffer != null) {

  gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

  }

  ...

  }

  Creating the SimplePlane class

  We also need to create the SimplePlane.java. The code is pretty simple and self-explanatory if you have read my previous tutorials. The new element is the textureCoordinates variable.

  package se.jaywash;

  /**

   * SimplePlane is a setup class for Mesh that creates a plane mesh.

   *

   * @author Per-Erik Bergman (per-erik.)

   *

   */

  public class SimplePlane extends Mesh {

  /**

      * Create a plane with a default width and height of 1 unit.

      */

  public SimplePlane() {

  this(1, 1);

  }

  /**

      * Create a plane.

      *

      * @param width

      *      the width of the plane.

      * @param height

      *      the height of the plane.

      */

  public SimplePlane(float width, float height) {

  // Mapping coordinates for the vertices

  float textureCoordinates[] = { 0.0f, 2.0f, //

  2.0f, 2.0f, //

  0.0f, 0.0f, //

  2.0f, 0.0f, //

  };

  short[] indices = new short[] { 0, 1, 2, 1, 3, 2 };

  float[] vertices = new float[] { -0.5f, -0.5f, 0.0f,

  0.5f, -0.5f, 0.0f,

  -0.5f, 0.5f, 0.0f,

  0.5f, 0.5f, 0.0f };

  setIndices(indices);

  setVertices(vertices);

  setTextureCoordinates(textureCoordinates);

  }

  }
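Note that, as listed here, the constructor hard-codes the vertices to a 1x1 plane and never uses its width and height arguments. Scaling them in would look something like the sketch below (my own addition, not from the attached source):

```java
public class PlaneGeometry {
    /** Builds the 12 vertex coordinates (4 vertices * xyz) of a plane
     *  centered on the origin, in the same vertex order as SimplePlane. */
    public static float[] vertices(float width, float height) {
        float w = width / 2.0f;
        float h = height / 2.0f;
        return new float[] {
            -w, -h, 0.0f,   // 0: lower left
             w, -h, 0.0f,   // 1: lower right
            -w,  h, 0.0f,   // 2: upper left
             w,  h, 0.0f,   // 3: upper right
        };
    }
}
```

Calling setVertices(PlaneGeometry.vertices(width, height)) in the constructor would then make SimplePlane(2, 4) actually produce a 2x4 plane.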

  References

  The info used in this tutorial is collected from:
Android Developers
OpenGL ES 1.1 Reference Pages

  You can download the source for this tutorial here: Tutorial_Part_VI
You can also check out the code from: cod

  Previous tutorial: OpenGL ES Tutorial for Android – Part V – More on Meshes

  Per-Erik Bergman 
Consultant at Jayway
