Converting a touch on Android OpenGL into a ray/vector and checking whether it hits a plane



I'm fairly new to Android app development and OpenGL ES. My basic goal is to draw 4 simple squares in my SurfaceView. When the user taps the screen, I want to check which square (if any) was tapped; that square should then be marked and change its color. When the user taps a second (different) square, I want to draw an arrow from square 1 to square 2. I used the Android OpenGL tutorial as a starting point and tried to adapt it to my purpose.


I'm having trouble checking whether the user tapped one of the squares. I've gone through a lot of Stack Overflow questions and other guides on OpenGL on Android and on linear algebra in general. I found these the most useful:
OpenGL tutorial
Mouse picking with ray casting
Implementing Ray Picking

This is what I've gathered so far:
My squares are rendered with a model-view-projection matrix. To check whether the user tapped one of them, I have to convert the tap into a ray in world-space coordinates. After that, I have to check whether this ray intersects one of my squares, which all lie in the same plane.

This is the part I've edited the most. In onSurfaceCreated I add the four squares and move them to their positions. When the user taps the screen, checkCollision() is called with the absolute screen coordinates. I then tried to follow the instructions from these posts (a sketch of the LinePlaneIntersection helper I use is shown after the renderer code below):
Implementing Ray Picking
Intersection of a line and a plane

public class MyGLRenderer implements GLSurfaceView.Renderer {

    private static final String TAG = "MyGLRenderer";
    private HashMap<String, Square> mySquares = new HashMap<>();

    // mMVPMatrix is an abbreviation for "Model View Projection Matrix"
    private final float[] mMVPMatrix = new float[16];
    private final float[] mProjectionMatrix = new float[16];
    private final float[] mViewMatrix = new float[16];
    private final float[] mRotationMatrix = new float[16];
    private int screenWidth = 0;
    private int screenHeight = 0;
    private float mAngle;
    private int square_number = 65;
    private final float[][] colors = {
            {0.29f, 0.57f, 1.0f, 1.0f},
            {0.8f, 0.0f, 0.0f, 1.0f},
            {0.13f, 0.8f, 0.0f, 1.0f},
            {1.0f, 0.84f, 0.0f, 1.0f}};

    public void onSurfaceCreated(GL10 unused, EGLConfig config) {
        // Set the background frame color
        GLES20.glClearColor(1.0f, 1.0f, 1.0f, 1.0f);

        // Adding the 4 squares to the grid and moving them to their positions
        String square_key = "";
        square_key = addSquare();
        this.mySquares.get(square_key).moveSquare(0.5f, 0.5f);
        square_key = addSquare();
        this.mySquares.get(square_key).moveSquare(0.5f, -0.5f);
        square_key = addSquare();
        this.mySquares.get(square_key).moveSquare(-0.5f, 0.5f);
        square_key = addSquare();
        this.mySquares.get(square_key).moveSquare(-0.5f, -0.5f);
    }

    public void checkCollision(float touchX, float touchY) {
        // Step 1: normalize coordinates
        float[] touchClipMatrix = new float[]{
                2.0f * touchX / this.screenWidth - 1.0f,
                1.0f - touchY * 2 / this.screenHeight,
                0,
                1.0f
        };

        // Inverted matrices
        float[] invertedProjectionMatrix = new float[16];
        float[] invertedMViewMatrix = new float[16];
        Matrix.invertM(invertedProjectionMatrix, 0, mProjectionMatrix, 0);
        Matrix.invertM(invertedMViewMatrix, 0, mViewMatrix, 0);

        // Calculation matrices
        float[] unviewMatrix = new float[16];
        float[] mouse_worldspace = new float[4];

        // Getting mouse position in world space
        Matrix.multiplyMM(unviewMatrix, 0, invertedMViewMatrix, 0, invertedProjectionMatrix, 0);
        Matrix.multiplyMV(mouse_worldspace, 0, unviewMatrix, 0, touchClipMatrix, 0);

        Log.i(TAG, "checkCollision-touchClipMatrix: " + Arrays.toString(touchClipMatrix));
        Log.i(TAG, "checkCollision-invertedProjectionMatrix: " + Arrays.toString(invertedProjectionMatrix));
        Log.i(TAG, "checkCollision-invertedMViewMatrix: " + Arrays.toString(invertedMViewMatrix));
        Log.i(TAG, "checkCollision-mouse_worldspace: " + Arrays.toString(mouse_worldspace));

        // Getting the camera position
        float[] cameraPosition = {0, 0, -3};
        // Subtract camera position from the mouse_worldspace
        float[] ray_unnormalized = new float[4];
        for (int i = 0; i < 3; i++) {
            ray_unnormalized[i] = mouse_worldspace[i] / mouse_worldspace[3] - cameraPosition[i];
        }
        // Normalize ray_vector
        float ray_length = Matrix.length(ray_unnormalized[0], ray_unnormalized[1], ray_unnormalized[2]);
        float[] ray_vector = new float[4];
        for (int i = 0; i < 3; i++) {
            ray_vector[i] = ray_unnormalized[i] / ray_length;
        }
        Log.i(TAG, "checkCollision - ray_vector: " + Arrays.toString(ray_vector));

        LinePlaneIntersection linePlaneIntersection = new LinePlaneIntersection();
        LinePlaneIntersection.Vector3D rv = new LinePlaneIntersection.Vector3D(ray_vector[0], ray_vector[1], ray_vector[2]);
        LinePlaneIntersection.Vector3D rp = new LinePlaneIntersection.Vector3D(mouse_worldspace[0], mouse_worldspace[1], mouse_worldspace[2]);
        LinePlaneIntersection.Vector3D pn = new LinePlaneIntersection.Vector3D(0.0, 0.0, 0.0);
        LinePlaneIntersection.Vector3D pp = new LinePlaneIntersection.Vector3D(0.0, 0.0, 1.0);
        LinePlaneIntersection.Vector3D ip = linePlaneIntersection.intersectPoint(rv, rp, pn, pp);
        Log.i(TAG, "checkCollision-intersection point: " + ip);
    }

    public String addSquare() {
        String keyName = String.valueOf((char) this.square_number);
        this.mySquares.put(keyName, new Square(keyName, colors[this.square_number - 65]));
        this.square_number += 1;
        return keyName;
    }

    public void logMatrices() {
        Log.i(TAG, "mMVPMatrix: " + Arrays.toString(this.mMVPMatrix));
        Log.i(TAG, "mProjectionMatrix: " + Arrays.toString(this.mProjectionMatrix));
        Log.i(TAG, "mViewMatrix: " + Arrays.toString(this.mViewMatrix));
    }

    @Override
    public void onDrawFrame(GL10 unused) {
        float[] scratch = new float[16];
        // Draw background color
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

        // Set the camera position (View matrix)
        //mySquare.moveSquare(0.25f, 0.25f);
        Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -3, 0f, 0f, 0.0f, 0f, 1.0f, 0.0f);
//        Matrix.scaleM(mViewMatrix, 0, 0.5f, 0.5f, 0);
//        Matrix.translateM(mViewMatrix, 0, 2f, 1f, 0);

        // Calculate the projection and view transformation
        Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);

        // Create a rotation for the square
        Matrix.setRotateM(mRotationMatrix, 0, mAngle, 0, 0.0f, 1.0f);
        // Combine the rotation matrix with the projection and camera view
        // Note that the mMVPMatrix factor *must be first* in order
        // for the matrix multiplication product to be correct.
        Matrix.multiplyMM(scratch, 0, mMVPMatrix, 0, mRotationMatrix, 0);

        // Draw squares
        for (Map.Entry<String, Square> s : this.mySquares.entrySet()) {
            s.getValue().draw(scratch);
        }
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        this.screenWidth = width;
        this.screenHeight = height;
        // Adjust the viewport based on geometry changes,
        // such as screen rotation
        GLES20.glViewport(0, 0, width, height);
        float ratio = (float) width / height;
        // This projection matrix is applied to object coordinates
        // in the onDrawFrame() method
        Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
    }

    public static int loadShader(int type, String shaderCode) {
        // Create a vertex shader type (GLES20.GL_VERTEX_SHADER)
        // or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
        int shader = GLES20.glCreateShader(type);
        // Add the source code to the shader and compile it
        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);
        return shader;
    }

    public static void checkGlError(String glOperation) {
        int error;
        while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
            Log.e(TAG, glOperation + ": glError " + error);
            throw new RuntimeException(glOperation + ": glError " + error);
        }
    }
}
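
The LinePlaneIntersection class used in checkCollision() is not shown above; it is basically the point-normal approach from the "Intersection of a line and a plane" post. A minimal sketch of such a helper (not my exact code) looks like this:

public class LinePlaneIntersection {

    public static class Vector3D {
        public final double x, y, z;

        public Vector3D(double x, double y, double z) {
            this.x = x;
            this.y = y;
            this.z = z;
        }

        Vector3D minus(Vector3D v) { return new Vector3D(x - v.x, y - v.y, z - v.z); }
        Vector3D scale(double s)   { return new Vector3D(x * s, y * s, z * s); }
        double dot(Vector3D v)     { return x * v.x + y * v.y + z * v.z; }

        @Override
        public String toString() {
            return "(" + x + ", " + y + ", " + z + ")";
        }
    }

    // rayVector = ray direction, rayPoint = point on the ray,
    // planeNormal = plane normal, planePoint = point on the plane
    public Vector3D intersectPoint(Vector3D rayVector, Vector3D rayPoint,
                                   Vector3D planeNormal, Vector3D planePoint) {
        double prod1 = rayPoint.minus(planePoint).dot(planeNormal);
        double prod2 = rayVector.dot(planeNormal); // 0 if the ray is parallel to the plane
        double prod3 = prod1 / prod2;
        return rayPoint.minus(rayVector.scale(prod3));
    }
}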

I added a moveSquare method because all squares have the same coordinates when they are initialized. I'm not sure whether this is the right approach; if it is wrong or messes up the other calculations, please tell me.

public class Square {

    private String squareID;
    private final String vertexShaderCode =
            // This matrix member variable provides a hook to manipulate
            // the coordinates of the objects that use this vertex shader
            "uniform mat4 uMVPMatrix;" +
            "attribute vec4 squarePosition;" +
            "void main() {" +
            // The matrix must be included as a modifier of gl_Position.
            // Note that the uMVPMatrix factor *must be first* in order
            // for the matrix multiplication product to be correct.
            "  gl_Position = uMVPMatrix * squarePosition;" +
            "}";
    private final String fragmentShaderCode =
            "precision mediump float;" +
            "uniform vec4 squareColor;" +
            "void main() {" +
            "  gl_FragColor = squareColor;" +
            "}";
    private FloatBuffer vertexBuffer;
    private ShortBuffer drawListBuffer;
    private int mProgram;
    private int mPositionHandle;
    private int mColorHandle;
    private int mMVPMatrixHandle;
    private static final String TAG = "Square";
    // number of coordinates per vertex in this array
    static final int COORDS_PER_VERTEX = 3;
    private float squareCoords[] = {
            -0.1f, 0.1f, 0.0f,   // top left
            -0.1f, -0.1f, 0.0f,  // bottom left
            0.1f, -0.1f, 0.0f,   // bottom right
            0.1f, 0.1f, 0.0f};   // top right
    private final short drawOrder[] = {0, 1, 2, 0, 2, 3}; // order to draw vertices
    private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex
    // Fallback color
    private float color[] = {0.2f, 0.709803922f, 0.898039216f, 1.0f};

    /**
     * Sets up the drawing object data for use in an OpenGL ES context.
     */
    public Square(String id, float[] color) {
        this.squareID = id;
        if (color.length == 4) {
            this.color = color;
        }
        // Buffers need to be updated with the new square coordinates
        updateBuffers();
        // Shaders (should) only be prepared once when initializing a square
        prepareShadersAndOpenGL();
    }

    private void prepareShadersAndOpenGL() {
        // prepare shaders and OpenGL program
        int vertexShader = MyGLRenderer.loadShader(
                GLES20.GL_VERTEX_SHADER,
                vertexShaderCode);
        int fragmentShader = MyGLRenderer.loadShader(
                GLES20.GL_FRAGMENT_SHADER,
                fragmentShaderCode);
        mProgram = GLES20.glCreateProgram();             // create empty OpenGL Program
        GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
        GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
        GLES20.glLinkProgram(mProgram);                  // create OpenGL program executables
    }

    public void updateBuffers() {
        // initialize vertex byte buffer for shape coordinates
        ByteBuffer bb = ByteBuffer.allocateDirect(
                // (# of coordinate values * 4 bytes per float)
                squareCoords.length * 4);
        bb.order(ByteOrder.nativeOrder());
        vertexBuffer = bb.asFloatBuffer();
        vertexBuffer.put(squareCoords);
        vertexBuffer.position(0);
        // initialize byte buffer for the draw list
        ByteBuffer dlb = ByteBuffer.allocateDirect(
                // (# of coordinate values * 2 bytes per short)
                drawOrder.length * 2);
        dlb.order(ByteOrder.nativeOrder());
        drawListBuffer = dlb.asShortBuffer();
        drawListBuffer.put(drawOrder);
        drawListBuffer.position(0);
    }

    // Updating the square coordinates and writing them to the buffers
    public void moveSquare(float deltaX, float deltaY) {
        this.squareCoords[0] += deltaX;
        this.squareCoords[3] += deltaX;
        this.squareCoords[6] += deltaX;
        this.squareCoords[9] += deltaX;
        this.squareCoords[1] += deltaY;
        this.squareCoords[4] += deltaY;
        this.squareCoords[7] += deltaY;
        this.squareCoords[10] += deltaY;
        updateBuffers();
    }

    /**
     * Encapsulates the OpenGL ES instructions for drawing this shape.
     *
     * @param mvpMatrix - The Model View Projection matrix in which to draw
     *                  this shape.
     */
    public void draw(float[] mvpMatrix) {
        // Add program to OpenGL environment
//        Log.i(TAG, "Square ("+squareID+") mProgram: "+mProgram);
        GLES20.glUseProgram(mProgram);

        // get handle to vertex shader's vPosition member
        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "squarePosition");
        // Enable a handle to the triangle vertices
        GLES20.glEnableVertexAttribArray(mPositionHandle);
        // Prepare the triangle coordinate data
        GLES20.glVertexAttribPointer(
                mPositionHandle, COORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false,
                vertexStride, vertexBuffer);
        // get handle to fragment shader's vColor member
        mColorHandle = GLES20.glGetUniformLocation(mProgram, "squareColor");
        // Set color for drawing the triangle
        GLES20.glUniform4fv(mColorHandle, 1, color, 0);
        // get handle to shape's transformation matrix
        mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
//        MyGLRenderer.checkGlError("glGetUniformLocation");
        // Apply the projection and view transformation
        GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
//        MyGLRenderer.checkGlError("glUniformMatrix4fv");
        // Draw the square
        GLES20.glDrawElements(
                GLES20.GL_TRIANGLES, drawOrder.length,
                GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
        // Disable vertex array
        GLES20.glDisableVertexAttribArray(mPositionHandle);
    }
}
public class MyGLSurfaceView extends GLSurfaceView {

    private final MyGLRenderer mRenderer;
    private static final String TAG = "MyGLSurfaceView";
    private final float TOUCH_SCALE_FACTOR = 180.0f / 320;

    public MyGLSurfaceView(Context context) {
        super(context);
        // Create an OpenGL ES 2.0 context.
        setEGLContextClientVersion(2);
        // Set the Renderer for drawing on the GLSurfaceView
        mRenderer = new MyGLRenderer();
        setRenderer(mRenderer);
        // Render the view only when there is a change in the drawing data
        setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }

    @Override
    public boolean onTouchEvent(MotionEvent e) {
        // MotionEvent reports input details from the touch screen
        // and other input controls. In this case, you are only
        // interested in events where the touch position changed.
        float x = e.getX();
        float y = e.getY();
        switch (e.getAction()) {
            case MotionEvent.ACTION_DOWN:
                mRenderer.logMatrices();
                mRenderer.checkCollision(x, y);
                // mRenderer.setAngle(mRenderer.getAngle() + 45f);
                requestRender();
        }
        return true;
    }
}

I know this is a lot to read through, so I'll try to state my main questions:

  1. Is my general idea correct, or am I using the wrong transformations/steps?
  2. Does the squareCoords array in the Square class represent my model matrix? At which point do these coordinates get transformed into world coordinates?
  3. Why is the matrix I pass to the draw() method of the Square class called mvpMatrix? To me that implies it contains all three matrices (model, view, projection), but at the point where I call draw() I have only multiplied the projection matrix with the view matrix. Where is the model part supposed to come from? Am I missing something, or am I mixing up the terms (see the sketch after this list)?
  4. I'm still struggling to understand what the projection matrix actually does. I understand that it basically defines the area that gets rendered, and everything outside this area is not shown on the screen. Is this area always relative to the camera (view) position?
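
(Regarding question 3: my current understanding is that a per-square model matrix would normally be folded into the product as in the sketch below. This is exactly the part I'm unsure about, so the sketch shows what I think it should look like, not what my code currently does.)

// Hypothetical sketch: how I think a per-square model matrix is usually combined.
// The square's translation would live in modelMatrix instead of being baked into
// the vertex coordinates by moveSquare().
float[] modelMatrix = new float[16];
float[] vpMatrix = new float[16];
float[] mvpMatrix = new float[16];

Matrix.setIdentityM(modelMatrix, 0);
Matrix.translateM(modelMatrix, 0, 0.5f, 0.5f, 0f);                    // this square's position
Matrix.multiplyMM(vpMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0); // VP = P * V
Matrix.multiplyMM(mvpMatrix, 0, vpMatrix, 0, modelMatrix, 0);         // MVP = P * V * M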

I hope I explained my problem clearly; maybe there is even an easier solution for it. Thanks in advance to everyone who has read this far, and I hope someone can help me out.

P.S.: This is my first question on Stack Overflow and my spelling may not be perfect, so sorry for that. If any information is missing that you need to understand the problem or to answer my questions, I'll add it as soon as possible.


Here is some debug output from one tap:

  • Registered touch at position x=940.94604 | y=407.9297
  • mMVPMatrix: [-4.4, 0.0, 0.0, 0.0, 0.0, 0.0, 0, 0.0, 0.5, 1.0, 0.0, 3.0]
  • mProjectionMatrix: [4.4, 0.0, 0.0, 0.0, 3.0, 0.0, 0.0, 0.0, -2.5, -1.0, 0.0, 0.0, -10.5, 0.0]
  • mViewMatrix: [-1.0, 0.0, -0.0, 0.0, 0.0, 1.0, -0.0, 0.0, 0.0, -0.0, -1.0, 0.0, 0.0, 0.0, -3.0, 1.0]
  • checkCollision-touchClipMatrix: [0.7424927, 0.48493725, -3.0, 1.0]
  • checkCollision-invertedProjectionMatrix: [0.22727272, -0.0, -0.0
  • checkCollision-invertedMViewMatrix: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.0, 0.0, 0, 0.0, 0.0, 0.0, 3.0]
  • checkCollision-unviewMatrix: [0.22727272, 0.0, 0.0, 0.0, 0.3333333, 0.0, 0.0, 0.0, 0.2857143, -0.0952381, 0.0, -0.0, 0.2857433, 0.23809522]
  • checkCollision-mouse_worldspace: [-0.16874833, 0.16164574, -0.5714286, 0.52380955]
  • checkCollision-ray_unnormalized: [-0.3221559, 0.3085964, 1.9090909, 0.0]
  • checkCollision-ray_length: 1.9605213
  • checkCollision-ray_vector: [0.16432154, 0.15740527, 0.9737669, 0.0]
  • checkCollision-intersection point: (NaN, NaN, NaN)

The calculation of ray_unnormalized seems to be wrong. You can't subtract homogeneous coordinates like that. Convert mouse_worldspace to a Cartesian coordinate first: the Cartesian coordinate is the quotient of the x, y and z components and the w component (see perspective divide).
The ray direction is the vector from the Cartesian camera position to the Cartesian mouse position:

//Getting the camera position
float[] cameraPosition = {0, 0, -6};
//subtract camera position from the mouse_worldspace
float[] ray_unnormalized = new float[4];
for (int i = 0; i < 3; i++) {
    ray_unnormalized[i] = mouse_worldspace[i] / mouse_worldspace[3] - cameraPosition[i];
}
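
Normalizing this vector and intersecting it with the z = 0 plane that the squares lie in could then look roughly like this (a minimal sketch; note that the plane normal handed to the intersection helper has to be a non-zero vector such as (0, 0, 1)):

// Sketch: normalize the ray and intersect it with the z = 0 plane.
float len = Matrix.length(ray_unnormalized[0], ray_unnormalized[1], ray_unnormalized[2]);
float[] rayDir = {
        ray_unnormalized[0] / len,
        ray_unnormalized[1] / len,
        ray_unnormalized[2] / len};

// Ray: P = cameraPosition + t * rayDir; the plane z = 0 is hit where P.z == 0
float t = -cameraPosition[2] / rayDir[2];      // requires rayDir[2] != 0
float hitX = cameraPosition[0] + t * rayDir[0];
float hitY = cameraPosition[1] + t * rayDir[1];
// (hitX, hitY) can now be tested against each square's bounds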
