Joel Meuleman

Rendering the Ocean with OpenGL

Introduction

I've always enjoyed staring at the ocean. But what if I had the ocean at home on my computer? This is the question I bravely asked as I set out to implement an OpenGL program to mimic the ocean.

Why OpenGL?

OpenGL - A low-level, cross-platform graphics programming API.
Shader - A program that runs on the GPU.


Why suffer through the pain of using OpenGL instead of a higher-level graphics framework or library? As an engineer, wanting to know how things work is inherent to who I am. Programming at a low-ish level like this provides two main benefits: it forces me to think harder about designing my own abstractions, and it teaches me about the (in)efficiencies of different implementations. Both of these skills come in handy when debugging code built with higher-level tools.

As an aside, this post is not about teaching OpenGL basics, but rather about how a complex-looking graphic can be built from small, simple pieces.

Step by Step Implementation

Initial Planning

To begin, I should mention I'll be doing the rendering through rasterization (as opposed to ray tracing/marching). This means I'll be describing a scene in 3D space, then doing some algebra to project this 3D world onto a 2D plane which determines the pixel colours.

To render this scene, I'll generate a plane of vertices, then displace these vertices with a wave function. Later, I'll put this plane in a skybox when it's time to make the waves look a bit more realistic with reflections. The final executable uses two shader programs: one to render the waves, and one to render the background.

Rendering a Plane

I start by generating points for, and rendering, a simple plane: I lay out evenly spaced points on a grid and record their index ordering to pass to the EBO for rendering.
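To make that concrete, here is a rough C++ sketch of how such a grid and its index buffer might be built. The function and parameter names (`buildPlane`, `gridSize`, `spacing`) are my own illustrative choices, not the post's actual code.

```cpp
#include <cstdint>
#include <vector>

struct Vertex { float x, y, z; };

// Build a flat (gridSize x gridSize) grid of evenly spaced vertices on the
// XZ plane, plus the triangle indices that would be uploaded to the EBO.
void buildPlane(int gridSize, float spacing,
                std::vector<Vertex>& vertices,
                std::vector<std::uint32_t>& indices)
{
    // Evenly spaced points, y = 0 until the vertex shader displaces them.
    for (int row = 0; row <= gridSize; ++row)
        for (int col = 0; col <= gridSize; ++col)
            vertices.push_back({ col * spacing, 0.0f, row * spacing });

    // Two triangles per grid cell.
    const int stride = gridSize + 1;
    for (int row = 0; row < gridSize; ++row)
    {
        for (int col = 0; col < gridSize; ++col)
        {
            std::uint32_t topLeft     = row * stride + col;
            std::uint32_t topRight    = topLeft + 1;
            std::uint32_t bottomLeft  = (row + 1) * stride + col;
            std::uint32_t bottomRight = bottomLeft + 1;

            indices.insert(indices.end(),
                           { topLeft, bottomLeft, topRight,
                             topRight, bottomLeft, bottomRight });
        }
    }
}
```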

In the vertex shader, I displace the vertical position with a sine function: \[ p = (x,\ \sin(x),\ z) \] In the fragment shader, I assign a blue colour to each pixel in the mesh. Finally, I move the camera back slightly so I can view the plane.
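As a minimal sketch of that displacement, written in C++ with glm (which mirrors GLSL vector math) rather than the actual shader; the `time` term is my assumption for producing the motion mentioned below:

```cpp
#include <cmath>
#include <glm/glm.hpp>

// Per-vertex displacement: only the y coordinate changes,
// x and z pass through untouched.
glm::vec3 displaceVertex(const glm::vec3& p, float time)
{
    glm::vec3 displaced = p;
    displaced.y = std::sin(p.x + time); // sine wave along x, scrolled over time
    return displaced;
}
```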

I have a wave! I can see I have motion and depth, but it would be nice if I had a bit of lighting off the surface of the mesh, so that I can better tell what's going on.

Lighting

For this shader program, I chose to implement Blinn-Phong lighting. This is a fundamental lighting model in computer graphics, and it's extensible for adding more lighting features later. There are three components to Blinn-Phong lighting: ambient, specular, and diffuse light.

To begin implementing this lighting method, the surface normal for each vertex is required. Thankfully, since I have a simple function for the displacement of each vertex, I get a simple derivative for free that gives me the exact normal for each vertex: \[ \frac{dy}{dx} = \cos(x) \] Using the normal as calculated in the vertex shader, I calculate the lighting contribution for each pixel.
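As a sketch of that per-pixel calculation, again in C++ with glm rather than the actual fragment shader; the light colour, strengths, and shininess are placeholder values I picked, not the post's:

```cpp
#include <cmath>
#include <glm/glm.hpp>

// Minimal Blinn-Phong: ambient + diffuse + specular.
// lightDir and viewDir are the normalized surface-to-light and
// surface-to-camera directions.
glm::vec3 blinnPhong(const glm::vec3& normal,
                     const glm::vec3& lightDir,
                     const glm::vec3& viewDir,
                     const glm::vec3& baseColour)
{
    const glm::vec3 lightColour(1.0f);
    const float ambientStrength  = 0.1f;
    const float specularStrength = 0.5f;
    const float shininess        = 32.0f;

    glm::vec3 n = glm::normalize(normal);

    // Ambient: constant fill light so nothing is fully black.
    glm::vec3 ambient = ambientStrength * lightColour;

    // Diffuse: proportional to the angle between the normal and the light.
    float diff = glm::max(glm::dot(n, lightDir), 0.0f);
    glm::vec3 diffuse = diff * lightColour;

    // Specular (the "Blinn" part): use the halfway vector
    // instead of the reflection vector.
    glm::vec3 halfway = glm::normalize(lightDir + viewDir);
    float spec = std::pow(glm::max(glm::dot(n, halfway), 0.0f), shininess);
    glm::vec3 specular = specularStrength * spec * lightColour;

    return (ambient + diffuse) * baseColour + specular;
}
```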

Nice! With the basic lighting done, I can implement more complex wave patterns.

Improving the Wave Displacement Algorithm

With the basics out of the way, it's time to look at more realistic wave algorithms for displacing our plane vertices. GPU Gems describes how I can achieve more complex-looking waves using a sum-of-sines calculation. \[ y=\sum_{i = 0}^{N} f_i(x) \hspace{12pt} \text{and} \hspace{12pt} \frac{dy}{dx}=\sum_{i = 0}^{N} f_i'(x) \] Thankfully, the derivative of a sum equals the sum of the derivatives, which means I can more or less wrap the displacement and normal calculations in a loop. After adding some pseudo-randomly generated wave parameters and summing just 4 waves, I arrive at this animation.
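A sketch of what that loop might look like, reduced to one axis for brevity; the amplitude, frequency, and phase arrays stand in for the pseudo-randomly generated wave parameters:

```cpp
#include <cmath>

constexpr int kWaveCount = 4;

// Sum-of-sines displacement and its slope along x.
// Each wave contributes amplitude * sin(arg) to the height and the
// corresponding derivative to the slope used for the normal.
void sumOfSines(float x, float time,
                const float (&amplitude)[kWaveCount],
                const float (&frequency)[kWaveCount],
                const float (&phase)[kWaveCount],
                float& height, float& slope)
{
    height = 0.0f;
    slope  = 0.0f;
    for (int i = 0; i < kWaveCount; ++i)
    {
        float arg = x * frequency[i] + time * phase[i];
        height += amplitude[i] * std::sin(arg);                // y = sum of sines
        slope  += amplitude[i] * frequency[i] * std::cos(arg); // dy/dx = sum of derivatives
    }
}
```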

Then, I change the wave function to one with steeper peaks and lower troughs by updating the displacement and normal computations in the vertex shader. \[ W_i(x, z, t) = 2A_i \times \left( \frac{\sin(\mathbf{D}_i \cdot (x, z) \times w_i + t \times \varphi_i) + 1}{2} \right)^k \]

\[ \frac{\partial}{\partial x}(W_i(x, z, t)) = k \times \mathbf{D}_i.x \times w_i \times A_i \times \left( \frac{\sin(\mathbf{D}_i \cdot (x, z) \times w_i + t \times \varphi_i) + 1}{2} \right)^{k-1} \times \cos(\mathbf{D}_i \cdot (x, z) \times w_i + t \times \varphi_i). \]
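Translating those two formulas directly into code, again as a C++/glm sketch of the shader maths; the `Wave` struct and its fields are my own illustrative packaging of the per-wave parameters:

```cpp
#include <cmath>
#include <glm/glm.hpp>

// One steepened wave W_i and its partial derivative with respect to x,
// matching the formulas above. dir is the normalized wave direction D_i,
// frequency is w_i, phase is phi_i, and k controls the peak sharpness.
struct Wave { glm::vec2 dir; float amplitude, frequency, phase; };

float waveHeight(const Wave& w, const glm::vec2& xz, float t, float k)
{
    float arg  = glm::dot(w.dir, xz) * w.frequency + t * w.phase;
    float base = (std::sin(arg) + 1.0f) * 0.5f;
    return 2.0f * w.amplitude * std::pow(base, k);
}

float waveHeightDx(const Wave& w, const glm::vec2& xz, float t, float k)
{
    float arg  = glm::dot(w.dir, xz) * w.frequency + t * w.phase;
    float base = (std::sin(arg) + 1.0f) * 0.5f;
    return k * w.dir.x * w.frequency * w.amplitude
         * std::pow(base, k - 1.0f) * std::cos(arg);
}
```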

This surface is more interesting, with more visible complexity. I can spend time tweaking the values later to get exactly what I want.

Adding a Skybox for Reflections

In this stage, I add more realism to the surface by adding an atmosphere to view in the distance. To be somewhat realistic, this atmosphere must also be reflected off the surface of the water.

A skybox is pretty much exactly what the name suggests: a box surrounding the scene with images of the sky on it. I implement the skybox as a separate shader program: I define a box surrounding the wave plane and map a pretty sky picture onto it. In the fragment shader of the wave shader program, I check the camera angle, then sample the skybox texture to determine what colour to reflect off the surface pixel. After implementing these changes and fiddling with the wave parameters, I arrive at the animation below.
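The key piece of that reflection is the direction used to sample the cubemap. A small glm sketch of just that part (in the actual GLSL fragment shader, this direction would be fed into a cubemap lookup such as `texture(skybox, reflectDir)`):

```cpp
#include <glm/glm.hpp>

// Direction used to sample the skybox cubemap for the reflection:
// take the camera-to-surface direction and mirror it about the normal.
glm::vec3 skyReflectionDir(const glm::vec3& fragPos,
                           const glm::vec3& cameraPos,
                           const glm::vec3& normal)
{
    glm::vec3 incident = glm::normalize(fragPos - cameraPos); // camera -> surface
    return glm::reflect(incident, glm::normalize(normal));    // mirror about the normal
}
```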

Not too bad! We see the whites of the clouds reflected, as well as the blue of the sky. The wave function could still be more interesting and realistic, and the lighting parameters could be tuned for more natural-looking light.

Implementing additional tweaks and techniques, which I'll write about later (mostly implemented from Acerola's video), I finally produced this animation.

Conclusion

Over the course of a couple of sections, I've shown how to build a complex-looking animation out of little more than some basic calculus and linear algebra. I'll likely post a follow-up blog where I implement a more realistic wave function, but this seems like a good place to stop before going down the realism rabbit hole.

References