

Unfortunately, OpenGL ES takes away many of the functions that beginner programmers rely on. It does this because a lot of those functions are plain rubbish for any high-end application. The Pandora supports OpenGL ES 1.1 as well as ES 2.0. The main difference in ES 2.0 is the removal of the fixed-function pipeline, which has been replaced by freely programmable GLSL shaders. Compared to classic OpenGL, OpenGL ES leaves you with only a handful of functions, and you have to do all of the matrix maths on your own, which can be really hard if you're not good at maths. I'll cover how to do that in a later tutorial. What you're learning here isn't just for the Pandora; you should use these methods in your Windows/Linux code too. The same changes are being made to the OpenGL core profile, so you're losing all of this stuff soon enough anyway.

The Basics

glBegin() and glEnd()


glBegin(GL_TRIANGLES);
glVertex3f(1, 0, 0);
glVertex3f(0, 1, 0);
glVertex3f(-1, 0, 0);
glEnd();

You know what I just did? Yeah, I drew a triangle! Nope, you can't draw stuff like this in OpenGL ES 1.1, so how would one draw a triangle in ES? Bear with me while you read this if you don't understand it; I'll explain it in depth.

GLfloat vertices[] = {1,0,0, 0,1,0, -1,0,0};
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertices);
glDrawArrays(GL_TRIANGLES, 0, 3);
glDisableClientState(GL_VERTEX_ARRAY);

This is called drawing with a vertex array. Eww? Yeah, you would never draw like that in a real program; your vertex array would already be stored, so you wouldn't have to define it every time you drew something. While it's messy for drawing primitives, it's really handy for drawing models, because your model's vertices are generally stored in arrays like that already. I'll show you an example from TINCS after I explain this code.

GLfloat vertices[] = {1,0,0, 0,1,0, -1,0,0};

This should be obvious: we're simply creating an array that stores the vertices of our triangle. It can be any size; you can store as many vertices as you want. Note how I space out every three numbers: a vertex is made of X, Y and Z components, so we have our X, our Y, our Z, then a space before the next three. You don't have to space it like this; it's mainly so you can see what's going on.


glEnableClientState(GL_VERTEX_ARRAY);

We need to do this before we start drawing. It tells OGL that we're going to feed it a vertex array from the client side (rather than the server side, which is the graphics card).

glVertexPointer(3, GL_FLOAT, 0, vertices);

OK, here is where we tell OGL where to get its vertices from. The first argument tells OGL how many components (floats) there are per vertex; remember how we spaced the array out every three numbers? Here we're telling OGL that we did that! The next argument says we're giving it floating-point numbers. Then we have a "0": this is called the stride. I could explain it in detail, but you don't need to understand it yet; just leave it at 0. In the last argument we give OGL our vertex array!

glDrawArrays(GL_TRIANGLES, 0, 3);

Here is where the work is done. The first argument tells OGL that we're drawing triangles. (This can also be GL_TRIANGLE_STRIP, GL_LINES, GL_POINTS and so on; note that GL_QUADS does not exist in OpenGL ES.) The second specifies where we want to start in the array we passed OGL earlier. The last one tells OpenGL how many vertices we want to draw. In this case, it's just the three.


glDisableClientState(GL_VERTEX_ARRAY);

Finally, we disable the vertex array state. We're done!

So this is one of the ways to draw stuff in OpenGL. It's the next step up from glBegin() and glEnd(), and it works fine in ES 1.1, so it works fine on the Pandora!

Drawing Textured Quads

OpenGL can be great for speeding up your 2D application. One way to do this is by drawing with OpenGL rather than SDL. 2D games are made up of lots of sprites, which are just images that move about in a 2D world, and you can use hardware acceleration to speed them up. If you're only planning on doing 3D stuff on the Pandora, read this anyway: it introduces texturing with texture coordinate arrays, and you'll probably need sprites somewhere anyway.

So, let's pick apart my method of sprite drawing in TINCS for this tutorial. I mainly use this function to draw text once it's been rendered by SDL_ttf and converted into an OpenGL texture, but never mind any of that.

static void DrawSprite(GLuint sprite, float X, float Y, float Z, float W, float H)
{
	glBindTexture(GL_TEXTURE_2D, sprite);
	GLfloat box[] = {X,Y + H,Z,  X + W,Y + H,Z,  X + W,Y,Z,  X,Y,Z};
	GLfloat tex[] = {0,0, 1,0, 1,1, 0,1};
	glEnableClientState(GL_TEXTURE_COORD_ARRAY);
	glVertexPointer(3, GL_FLOAT, 0, box);
	glTexCoordPointer(2, GL_FLOAT, 0, tex);
	glDrawArrays(GL_TRIANGLE_FAN, 0, 4);
	glDisableClientState(GL_TEXTURE_COORD_ARRAY);
}

Give it a good read and make sure you understand how all of the stuff we've been over is applied in this example! Also, I'm not sure how this will fare in the 3D realm; I only use it when I'm in ortho mode (2D, drawing my GUI).



glBindTexture(GL_TEXTURE_2D, sprite);

As you can see, here we're binding the texture that's passed into the function to GL_TEXTURE_2D. You should understand this already. (If not, hit up NeHe's tutorials!)

GLfloat tex[] = {0,0, 1,0, 1,1, 0,1};

This is new too. We've taken the texture coordinates that we would usually pass to OGL per-vertex (glTexCoord2f() style) and put them into a little array.


glEnableClientState(GL_TEXTURE_COORD_ARRAY);

Just like last time, we're enabling a state on the client (not the server). This one says we want to pass a texture coordinate array too.

glTexCoordPointer(2, GL_FLOAT, 0, tex);

Here we're passing the texture coordinates to OGL, exactly like the vertex coordinates, except that the first argument (the size) is only two, because we only need two values per texture coordinate (a U and a V).


glDisableClientState(GL_TEXTURE_COORD_ARRAY);

We disable the texture coordinate array just like we did with the vertex array.

There we go! That's not much harder than an SDL blit, is it? I guarantee it will make your app run a whole bunch faster too :D


This is all you need to know to use OpenGL on the Pandora. Note that you can't open your window the way you would in SDL/Linux/Windows; it's a little bit different.

Combining GLES 1.1 and SDL to make a window on the Pandora

There are two options for combining GLES and SDL:

1: Grab the modified version of SDL

There is a special SDL source tree for the Pandora with OpenGL ES capability available here:

Screen initialisation is then as easy as this: SDL_SetVideoMode(800, 480, 16, SDL_OPENGLES);

An SDL/GLES example is also available in the git repository.

2: Use EGL Directly (with SDL 1.2)

Sometimes it's not best to use the modified version of SDL (e.g. when working on cross-platform games). Instead you can use EGL to create the GLES graphics context for you. The following functions do just that.


  1. Make sure you call InitOpenGL() straight after creating the window (SDL_SetVideoMode()).
  2. When using EGL you should not pass the SDL_OPENGL flag to SDL_SetVideoMode(); SDL_HWSURFACE or SDL_SWSURFACE will do.
  3. You will notice '#ifdef GLES1' around much of the EGL code. Simply define GLES1 in your preprocessor (-DGLES1 on the gcc command line) to use GLES. Without the define, SDL will be set up for standard OpenGL (I left it in as it's useful for cross-platform development).
  4. Once you have finished with the window, and just before terminating it, call TerminateOpenGL() to clean up.

NOTE - If you call SDL_SetVideoMode() more than once, you will probably need to terminate and recreate the context (TerminateOpenGL(); SDL_SetVideoMode(); InitOpenGL();).

// OpenGLInit.c

#ifdef GLES1
	#include <EGL/egl.h>
	#include <GLES/gl.h>
	#include <SDL/SDL_syswm.h>
	#include <X11/Xlib.h>
#else
	#include <GL/gl.h>
#endif
#include <SDL/SDL.h>
#include <stdio.h>
#include "OpenGLInit.h"

#ifdef GLES1
	EGLDisplay g_eglDisplay = 0;
	EGLConfig g_eglConfig = 0;
	EGLContext g_eglContext = 0;
	EGLSurface g_eglSurface = 0;
	Display *g_x11Display = NULL;
#endif

// consts
#ifdef GLES1
static const EGLint g_configAttribs[] =
{
	EGL_RED_SIZE,        COLOURDEPTH_RED_SIZE,
	EGL_GREEN_SIZE,      COLOURDEPTH_GREEN_SIZE,
	EGL_BLUE_SIZE,       COLOURDEPTH_BLUE_SIZE,
	EGL_DEPTH_SIZE,      COLOURDEPTH_DEPTH_SIZE,
	EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
	EGL_RENDERABLE_TYPE, EGL_OPENGL_ES_BIT,
	EGL_NONE
};
#endif

// Initialise opengl settings. Call straight after SDL_SetVideoMode()
int InitOpenGL()
{
#ifdef GLES1
	// use EGL to initialise GLES
	g_x11Display = XOpenDisplay(NULL);
	if (!g_x11Display)
	{
		fprintf(stderr, "ERROR: unable to get display!\n");
		return 0;
	}

	g_eglDisplay = eglGetDisplay((EGLNativeDisplayType)g_x11Display);
	if (g_eglDisplay == EGL_NO_DISPLAY)
	{
		printf("Unable to initialise EGL display.\n");
		return 0;
	}

	// Initialise egl
	if (!eglInitialize(g_eglDisplay, NULL, NULL))
	{
		printf("Unable to initialise EGL display.\n");
		return 0;
	}

	// Find a matching config
	EGLint numConfigsOut = 0;
	if (eglChooseConfig(g_eglDisplay, g_configAttribs, &g_eglConfig, 1, &numConfigsOut) != EGL_TRUE || numConfigsOut == 0)
	{
		fprintf(stderr, "Unable to find appropriate EGL config.\n");
		return 0;
	}

	// Get the SDL window handle
	SDL_SysWMinfo sysInfo; // Will hold our Window information
	SDL_VERSION(&sysInfo.version); // Set SDL version
	if (SDL_GetWMInfo(&sysInfo) <= 0)
	{
		printf("Unable to get window handle\n");
		return 0;
	}

	g_eglSurface = eglCreateWindowSurface(g_eglDisplay, g_eglConfig, (EGLNativeWindowType)sysInfo.info.x11.window, 0);
	if (g_eglSurface == EGL_NO_SURFACE)
	{
		printf("Unable to create EGL surface!\n");
		return 0;
	}

	// Bind GLES and create the context
	EGLint contextParams[] = {EGL_CONTEXT_CLIENT_VERSION, 1, EGL_NONE};	// Use GLES version 1.x
	g_eglContext = eglCreateContext(g_eglDisplay, g_eglConfig, EGL_NO_CONTEXT, contextParams);
	if (g_eglContext == EGL_NO_CONTEXT)
	{
		printf("Unable to create GLES context!\n");
		return 0;
	}

	if (eglMakeCurrent(g_eglDisplay, g_eglSurface, g_eglSurface, g_eglContext) == EGL_FALSE)
	{
		printf("Unable to make GLES context current\n");
		return 0;
	}
#endif
	return 1;
}

// Kill off any opengl specific details
void TerminateOpenGL()
{
#ifdef GLES1
	eglMakeCurrent(g_eglDisplay, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
	eglDestroySurface(g_eglDisplay, g_eglSurface);
	eglDestroyContext(g_eglDisplay, g_eglContext);
	g_eglSurface = 0;
	g_eglContext = 0;
	g_eglConfig = 0;
	eglTerminate(g_eglDisplay);
	g_eglDisplay = 0;
	XCloseDisplay(g_x11Display);
	g_x11Display = NULL;
#endif
}

int SwapBuffers()
{
#ifdef GLES1
	eglSwapBuffers(g_eglDisplay, g_eglSurface);
#else
	SDL_GL_SwapBuffers();
#endif
	return 1;
}

Example on how to use GLES2 with SDL2 on the Pandora

This is just one way to do it, but it's the simplest I know of. User ptitSeb has released the PND "Code::Blocks", which contains not only the Code::Blocks IDE but also an updated gcc compiler, git and many libraries needed for development, including a special version of SDL2 for the Pandora. All of this can also be used from the command line on your Pandora. You can read about ptitSeb's SDL2 patch on the forums in this thread:

You can use the following test code to get something running quickly. Note that you should really use better shader functions that print out an info log if you make an error in your shader code. Also note that GLES2 fragment shaders require precision qualifiers (lowp, mediump, highp) for floats; I used an #ifdef so the shader code works with both OpenGL and OpenGL ES 2.

// gles2_test.cpp

#include <SDL.h>
#ifdef USE_OPENGLES
	#include <SDL_opengles2.h>
#else
	#include <SDL_opengl.h>
	#include <SDL_opengl_glext.h>
#endif
#include <stdio.h>  // for printf
#include <stdlib.h> // for exit

SDL_Window *sdl_window;
SDL_GLContext sdl_gl_context;

void initSDL() {
	if (SDL_Init(SDL_INIT_VIDEO) < 0) {
		fprintf(stderr, "Couldn't init SDL2: %s\n", SDL_GetError());
		exit(1);
	}
	int video_width = 800;
	int video_height = 480;
	Uint32 window_flags = SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN;
	sdl_window = SDL_CreateWindow("Test",
		SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
		video_width, video_height, window_flags);
	if (!sdl_window) {
		fprintf(stderr, "Failed to create OpenGL window: %s\n", SDL_GetError());
		exit(1);
	}
	sdl_gl_context = SDL_GL_CreateContext(sdl_window);
	if (!sdl_gl_context) {
		fprintf(stderr, "Failed to create OpenGL context: %s\n", SDL_GetError());
		exit(1);
	}
}

void quitSDL() {
	SDL_GL_DeleteContext(sdl_gl_context);
	SDL_DestroyWindow(sdl_window);
	SDL_Quit();
}

int main(int argc, char *argv[]) {
	initSDL();
	// create shader
	const char *shader_vert_src =
		"uniform float u_time;\n"
		"attribute vec2 va_position;\n"
		"varying vec3 v_color;\n"
		"void main() {\n"
		"	v_color = vec3(1.0 - 0.5*(va_position.x+va_position.y), va_position);\n"
		"	float c = cos(u_time), s = sin(u_time);\n"
		"	vec2 t = mat2(c, s, -s, c)*(va_position-vec2(0.33));\n"
		"	gl_Position = vec4(t.x*3.0/5.0, t.y, 0.0, 1.0);\n"
		"}\n";
	const char *shader_frag_src =
		"#ifdef GL_ES\n"
		"precision mediump float;\n"
		"#endif\n"
		"varying vec3 v_color;\n"
		"void main() {\n"
		"	gl_FragColor = vec4(v_color, 1.0);\n"
		"}\n";
	GLint is_compiled;
	GLuint program, shader_vert, shader_frag;
	program = glCreateProgram();
	shader_vert = glCreateShader(GL_VERTEX_SHADER);
	glShaderSource(shader_vert, 1, &shader_vert_src, NULL);
	glCompileShader(shader_vert);
	glGetShaderiv(shader_vert, GL_COMPILE_STATUS, &is_compiled);
	printf("vert shader compiled %d\n", is_compiled);
	glAttachShader(program, shader_vert);
	shader_frag = glCreateShader(GL_FRAGMENT_SHADER);
	glShaderSource(shader_frag, 1, &shader_frag_src, NULL);
	glCompileShader(shader_frag);
	glGetShaderiv(shader_frag, GL_COMPILE_STATUS, &is_compiled);
	printf("frag shader compiled %d\n", is_compiled);
	glAttachShader(program, shader_frag);
	glBindAttribLocation(program, 0, "va_position");
	glLinkProgram(program);
	glUseProgram(program);
	GLint u_time_loc = glGetUniformLocation(program, "u_time");
	float u_time = 0.0f;
	// create vbo
	GLuint vbo;
	glGenBuffers(1, &vbo);
	glBindBuffer(GL_ARRAY_BUFFER, vbo);
	float vertex_data[] = {0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f};
	glBufferData(GL_ARRAY_BUFFER, sizeof(vertex_data), vertex_data, GL_STATIC_DRAW);
	// setup vertex attribs
	GLuint va_position = 0;
	glEnableVertexAttribArray(va_position);
	glVertexAttribPointer(va_position, 2, GL_FLOAT, GL_FALSE, 0, (GLvoid*)0);
	glClearColor(0.4f, 0.6f, 0.8f, 1.0f);
	bool running = true;
	do {
		SDL_Event event;
		while (SDL_PollEvent(&event)) {
			running = !(event.type == SDL_QUIT
				|| (event.type == SDL_KEYDOWN && event.key.keysym.sym == SDLK_ESCAPE));
		}
		glClear(GL_COLOR_BUFFER_BIT);
		glUniform1f(u_time_loc, u_time += 1.0f/60.0f);
		glDrawArrays(GL_TRIANGLES, 0, 3);
		SDL_GL_SwapWindow(sdl_window);
	} while (running);
	glBindBuffer(GL_ARRAY_BUFFER, 0);
	glDeleteBuffers(1, &vbo);
	quitSDL();
	return 0;
}

You can compile this code with:

c++ gles2_test.cpp -o gles2_test -DUSE_OPENGLES `pkg-config --cflags --libs sdl2 glesv2`

The USE_OPENGLES define I added switches my GLES-related code on. If compilation was successful you can run this tiny program with:

export SDL_VIDEO_GLES2=1
./gles2_test

A coloured rotating triangle on a blue background should appear; you can quit with the escape key. The exported environment variables are very important: they tell the patched SDL2 library to use the GLES2 driver instead of glshim, which only translates OpenGL 1.5 calls. Otherwise your shader code will silently fail with an empty info log.
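When a shader does fail, glGetShaderInfoLog() gives you the compiler's error message. A sketch of a better compile helper, as suggested above (assuming a current GL/GLES2 context; the function name is my own):

```c
#include <stdio.h>
#ifdef USE_OPENGLES
	#include <SDL_opengles2.h>
#else
	#include <SDL_opengl.h>
#endif

/* Hypothetical helper: compile a shader and dump the info log on failure.
 * Returns the shader handle, or 0 if compilation failed. */
GLuint compileShader(GLenum type, const char *src)
{
	GLuint shader = glCreateShader(type);
	glShaderSource(shader, 1, &src, NULL);
	glCompileShader(shader);

	GLint ok = GL_FALSE;
	glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
	if (ok != GL_TRUE) {
		char log[1024];
		GLsizei len = 0;
		glGetShaderInfoLog(shader, sizeof(log), &len, log);
		fprintf(stderr, "shader compile failed:\n%.*s\n", (int)len, log);
		glDeleteShader(shader);
		return 0;
	}
	return shader;
}
```

You would call it as, e.g., compileShader(GL_VERTEX_SHADER, shader_vert_src) in place of the bare glCreateShader/glCompileShader sequence in the test program.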

When you have finished coding your program and want to publish a PND on the repo, you also have to package all of the libraries you link against from your Code::Blocks development environment. A tool for this is also contained in ptitSeb's Code::Blocks PND.

Remember, you can and should use this knowledge for regular OpenGL!

I want to add one last section about drawing models once I've released the source to my model format loader/drawer.

Reference and learning material


  • OpenGL ES 2.0 Programming Guide (NB - link) - This is a great book, very much the de facto standard text for OpenGL ES, like the Red Book is for OpenGL. I highly recommend it if you are serious about this. (£34+)



(See also Development_Tools)

First, you will need to be able to test and run your programs on your development machine. For this you will need an OpenGL ES emulator and the OpenGL ES SDK. Note that to run the emulator on your machine, you need an OpenGL 2.0 compliant graphics card; any nVidia 6XXX+ or ATi 95XX+ should work. However, you do not need any special graphics hardware if you only want to compile an application on your PC and test it on the Pandora.

  • OpenGL ES 2.0 SDK for PC[1] This works both on Linux and Windows. You will need to register with imgtec before downloading.

Running an OpenGL ES application on your PC is not as simple as just downloading the SDK, building some demos, and running them. To run the demo applications, you will need to set a few environment variables so that your system can find the necessary libraries. Or, you can add the libraries to your system path as appropriate.

Directions for the linux version of the SDK - tested on Ubuntu Gutsy:

  • LD_LIBRARY_PATH needs to be set to the SDK folder containing the OpenGL ES libraries ($SDK_PATH/Builds/OGLES2/LinuxPC/Lib should work).
  • To run the makefiles in the Training section you need to set a PLATFORM variable to either LinuxPC or LinuxGeneric. I have had success with LinuxPC.
  • See [2] for a script that you can run to set these for you.
  • If you are not afraid of the command line, try "export LD_LIBRARY_PATH=$SDK_PATH/Builds/OGLES2/LinuxPC/Lib" and "export PLATFORM=LinuxPC", where $SDK_PATH is the path of the unpacked SDK, such as "/home/username/OGLES_SDK/".

Directions for the Windows version of the SDK - tested on Windows Vista with Visual C++ Express Edition [3] :

  • The SDK works almost out of the box, with some minor modifications. The default configuration on VC++ works if you just want to compile. To actually run the code or debug it, you need to link VC++ with the PowerVR OGL ES emulation libraries. This can be done in two different ways:
    • By adding the path to the libraries in VC++. Go to (Tools > Options > Projects and solutions > VC++ Directories). Choose Library files in the drop-down menu labeled as Show directories for: and add the path to the libraries (SDKPackage\Builds\OGLES2\WindowsPC\Lib by default)
    • By copying the files in the (SDKPackage\Builds\OGLES2\WindowsPC\Lib) folder (libEGL.dll, libGLESv2.dll, etc) and pasting them into the Windows folder, which is already a default path in VC++. This method is preferred, since all the executables and projects that came with the SDK were programmed to find the DLLs in the folders that are configured by the default VC++ configuration.
  • Note that the projects included in the SDK were programmed in an older version of the VC++ IDE and need conversion if you are using the express edition. [4] This procedure is almost automatic and should not cause any problems. [5]