Water shading

Recently I did some water rendering in the Arnold renderer, which you can see here. The shading worked quite well on the water and whitewater.

After this I started to sim a wave tunnel in Houdini FLIP: a huge wave where you could actually surf. But I ran into some trouble with the water shading and the whitewater look, so I needed some reference.


I went to the beach and took my trusty Nikon V1 camera with me to shoot some reference photos. These are tiny waves (50 cm in height), but they are easy to photograph and good enough as shading reference.

A nice shot of a miniature breaking wave

A nice snapshot of the translucent effects of the wave
Here we can see droplets (whitewater) in close-up

Here we can see more droplets / whitewater with the sun at our back: a good example of the anisotropy effect on the shading.

Droplets turn white in sunlight, depending on the sun direction.

There are multiple ways to render realistic water. The old-school way is to render a polygon water surface with a volume underneath to simulate the light scattering. We did similar things back in 2008 on the Avatar movie with custom-written shaders for RenderMan. SideFX added presets for this to Houdini's ocean setups. The render times of this method are modest, but the shading can be quite difficult depending on the camera angle and lighting situation.

These days, in the age of path tracers, there are two ways: rendering it with sub-surface scattering (SSS) or with transmission depth.

Sub-Surface Scattering simulates the effect of light entering an object and scattering beneath its surface. Not all light reflects from a surface. Some of it will penetrate below the surface of an illuminated object. There it will be absorbed by the material and scattered internally. Some of this scattered light will make its way back out of the surface and become visible to the camera. This is known as ‘sub-surface scattering’ or ‘SSS’. SSS is necessary for the realistic rendering of materials such as marble, skin, leaves, wax, and milk. The SSS component in this shader is calculated using a brute-force raytracing method.
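To make that brute-force idea a bit more concrete, here is a tiny Monte Carlo sketch of my own (not Arnold's actual implementation): photons random-walk through a homogeneous medium and we count how many make it back out of the top surface. The scattering and absorption coefficients are made-up illustration values.

```python
import math
import random

def sss_random_walk(sigma_s=0.8, sigma_a=0.05, num_photons=20000):
    """Brute-force estimate of how much light re-emerges from a
    semi-infinite homogeneous medium (a toy stand-in for SSS).
    sigma_s / sigma_a are scattering / absorption coefficients."""
    sigma_t = sigma_s + sigma_a          # extinction
    albedo = sigma_s / sigma_t           # chance a collision scatters instead of absorbs
    escaped = 0
    for _ in range(num_photons):
        depth = 0.0                      # distance below the surface
        dir_z = -1.0                     # start heading straight down into the medium
        while True:
            # sample a free-flight distance ~ exponential(sigma_t)
            t = -math.log(1.0 - random.random()) / sigma_t
            depth -= dir_z * t           # dir_z < 0 means the photon goes deeper
            if depth <= 0.0:
                escaped += 1             # photon left through the top surface
                break
            if random.random() > albedo:
                break                    # absorbed inside the medium
            # isotropic scattering: pick a new vertical direction cosine
            dir_z = 2.0 * random.random() - 1.0
    return escaped / num_photons

print("fraction of light re-emerging:", sss_random_walk())
```

Push the absorption up and the re-emerging fraction drops quickly, which is exactly the darkening you see in deep water.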

While the Transmission Depth attribute controls volumetric light absorption within the object (fog), the Scatter attribute controls what percentage of the light will be scattered instead of absorbed, effectively creating the murky effect of semi-transparent materials.

Depth controls the depth into the volume at which the transmission color is realized. Increasing this value makes the volume thinner, which means less absorption and scattering. It is a scale factor, so you can set a transmission_color and then tweak the depth to be appropriate for the size of your object.

Scattering is very important if you want to shade deep materials like ocean water. For the scattering effect to work, Scatter must have a dominant percentage value and the Depth attribute must generally be much lower. Also, the Opaque attribute must be unchecked in the Arnold attributes of the object's shape node for the light to be able to pass into the mesh and illuminate the volume.
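As a rough back-of-the-envelope illustration of how Depth and Scatter play together (my own math, not code from the Arnold source): the transmission color can be turned into an extinction coefficient with Beer-Lambert, and the scatter percentage then decides how much of that extinction scatters instead of absorbs. The color, depth and scatter values below are just example numbers for deep-ocean-ish water.

```python
import math

def water_volume_coefficients(transmission_color, depth, scatter):
    """Toy Beer-Lambert interpretation of transmission depth / scatter.
    transmission_color: RGB tuple reached after traveling 'depth' units.
    depth:              distance (scene units) at which that color is realized.
    scatter:            fraction of extinction that scatters instead of absorbs."""
    coeffs = []
    for channel in transmission_color:
        # Beer-Lambert: color = exp(-sigma_t * depth)  =>  sigma_t = -ln(color) / depth
        sigma_t = -math.log(max(channel, 1e-6)) / depth
        sigma_s = scatter * sigma_t          # scattering part (murkiness)
        sigma_a = sigma_t - sigma_s          # absorption part (tinting)
        coeffs.append((sigma_t, sigma_s, sigma_a))
    return coeffs

# a teal transmission color fully realized 2 units into the water,
# with scattering dominating, as suggested above
for name, (sigma_t, sigma_s, sigma_a) in zip("RGB",
        water_volume_coefficients((0.1, 0.5, 0.6), depth=2.0, scatter=0.8)):
    print(f"{name}: extinction={sigma_t:.3f}  scatter={sigma_s:.3f}  absorb={sigma_a:.3f}")
```

Doubling the depth halves all three coefficients, which matches the "thinner volume" behaviour described above.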

Rendering with transmission depth is the more "physically correct" way, but it does not account for tiny organisms (light blockers); in this case you add textures to simulate plankton in the water.

I chose to go with the SSS route. The typical sub-surface scattering shading model has a similar volumetric light-scatter look. The look can be limited, but it works in the case of deep ocean water. The advantage: it is fully supported by the current Arnold GPU renderer (transmission depth is not supported yet), and the SSS shading model is also faster to render. In addition, I've added an extra underwater bubble simulation with particles to increase the realism.

Water rendering with Arnold GPU

I am starting to look into water-effects shading; this is the first simulation with the Houdini solvers. The foam is a little over-the-top, but I think the shading itself is starting to come together. With the new Arnold GPU updates it is getting really fast, specifically when I am using Arnold operators.

I've created the underwater bubbles in an extra simulation to make the side view nicer.

All of the above was done in Houdini with the regular Arnold HtoA plug-in.

Here I am testing the scene in Gaffer; the IPR is quite fast in here. The next thing I want to try is Solaris.

About Render Engines part 1

This is a quick overview of current render engines for Houdini, and in general, in terms of motion graphics and VFX usage.

There are different render engines out there; each one is unique and uses a different method to solve the problem. I am looking into Arnold, RenderMan, V-Ray, Octane and Redshift. For comparison reasons I added the Indigo Renderer engine.

There are different ways to render a scene, each with benefits and shortcomings. Let's start with the most common one.

image by Glare Technology

Pathtracing (PT)

To be precise, backward path tracing. In backward ray tracing, an eye ray is created at the eye; it passes through the view plane and on into the world. The first object the eye ray hits is the object that will be visible from that point of the view plane. After the ray tracer allows that ray to bounce around, it figures out the exact coloring and shading of that point in the view plane and displays it on the corresponding pixel of the screen. That is the classical way, which all of the render engines use as the standard.
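For readers who prefer code over prose, here is a deliberately tiny backward path tracer sketch of my own: one diffuse sphere lit by a constant sky, a single eye ray per sample, and the classic bounce loop that collects light once the ray escapes. It is only meant to show the recipe, not to be a usable renderer.

```python
import math, random

SPHERE_CENTER = (0.0, 0.0, -3.0)   # one diffuse sphere, 3 units in front of the eye
SPHERE_RADIUS = 1.0

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sphere_hit(orig, direction):
    """Distance along the ray to the sphere, or None if it misses."""
    oc = tuple(o - c for o, c in zip(orig, SPHERE_CENTER))
    b = dot(oc, direction)
    c = dot(oc, oc) - SPHERE_RADIUS * SPHERE_RADIUS
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def random_hemisphere(normal):
    """Uniform random direction on the hemisphere around 'normal'."""
    while True:
        d = tuple(2.0 * random.random() - 1.0 for _ in range(3))
        if 0.0 < dot(d, d) <= 1.0:
            d = normalize(d)
            return d if dot(d, normal) > 0.0 else tuple(-c for c in d)

def trace_eye_ray(orig, direction, albedo=0.7, sky=1.0, max_bounces=4):
    """Backward path tracing: start at the eye, bounce until the ray escapes to the sky."""
    throughput = 1.0
    for _ in range(max_bounces):
        t = sphere_hit(orig, direction)
        if t is None:
            return throughput * sky                    # ray escaped: collect the sky light
        hit = tuple(o + t * d for o, d in zip(orig, direction))
        normal = normalize(tuple(h - c for h, c in zip(hit, SPHERE_CENTER)))
        new_dir = random_hemisphere(normal)
        # Lambertian BRDF (albedo/pi) * cos(theta) / uniform-hemisphere pdf (1/2pi)
        throughput *= 2.0 * albedo * dot(new_dir, normal)
        orig, direction = hit, new_dir
    return 0.0                                         # gave up before finding any light

# average many eye rays for one "pixel" that looks straight at the sphere
samples = 2000
pixel = sum(trace_eye_ray((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)) for _ in range(samples)) / samples
print("pixel value:", pixel)   # converges to roughly albedo * sky for this setup
```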

Metropolis light transport (MLT)

This procedure has the advantage, relative to bidirectional path tracing, that once a path has been found from light to eye, the algorithm can then explore nearby paths; thus difficult-to-find light paths can be explored more thoroughly with the same number of simulated photons. Metropolis light transport is an unbiased method that, in some cases (but not always), converges to a solution of the rendering equation faster than other unbiased algorithms such as path tracing or bidirectional path tracing. Metropolis is often used in bidirectional mode (BDMLT).
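MLT itself is a lot of machinery, but the underlying Metropolis idea is small enough to sketch. The toy below (my own illustration, sampling a 1D "brightness" function instead of real light paths) keeps a current sample, proposes a small mutation of it, and accepts the mutation with probability proportional to how much energy it carries, so once the narrow bright peak is found it gets explored thoroughly.

```python
import math, random

def brightness(x):
    """Toy target: a narrow 'hard to find' bright peak plus a broad dim one,
    standing in for a difficult light path and easy ambient light."""
    return math.exp(-((x - 0.8) ** 2) / 0.0002) + 0.1 * math.exp(-((x - 0.3) ** 2) / 0.02)

def metropolis(num_samples=50000, mutation_size=0.05):
    x = random.random()                      # current sample (here: just a number in [0, 1])
    fx = brightness(x)
    samples = []
    for _ in range(num_samples):
        # propose a nearby mutation of the current sample
        y = (x + random.gauss(0.0, mutation_size)) % 1.0
        fy = brightness(y)
        # accept with probability min(1, fy / fx): bright samples are kept and explored
        if fx == 0.0 or random.random() < fy / fx:
            x, fx = y, fy
        samples.append(x)
    return samples

samples = metropolis()
near_peak = sum(1 for s in samples if abs(s - 0.8) < 0.05) / len(samples)
print("fraction of samples spent exploring the narrow bright peak:", near_peak)
```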

Path Guiding

A mix between path tracing and MLT: an unbiased technique for intelligent light-path construction in path-tracing algorithms. Indirect guiding improves indirect lighting by sampling from the better-lit or more important areas of the scene. The goal is to allow path-tracing algorithms to iteratively "learn" how to construct high-energy light paths.

link to latest Siggraph paper
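A hand-wavy way to see the "learning" part: keep a histogram of how much energy came from each direction so far, then draw more of the next rays from the bins that paid off. The toy below (my own 1D illustration, not any production algorithm) does exactly that.

```python
import random

BRIGHT_BINS = (12, 13, 14)   # the "small bright window" in our toy 16-bin world

def incoming_light(direction_bin):
    """Toy scene: almost all of the energy arrives from a few direction bins."""
    return 10.0 if direction_bin in BRIGHT_BINS else 0.05

def guided_sampling(num_bins=16, iterations=4, rays_per_iteration=1000):
    # start with a uniform guess of where the light comes from
    energy_histogram = [1.0] * num_bins
    for it in range(iterations):
        total = sum(energy_histogram)
        pdf = [e / total for e in energy_histogram]
        new_histogram = [1e-3] * num_bins       # small floor so no bin dies out completely
        hits = 0
        for _ in range(rays_per_iteration):
            # importance-sample a direction bin from the learned pdf
            r, b = random.random(), 0
            while b < num_bins - 1 and r > pdf[b]:
                r -= pdf[b]
                b += 1
            # divide by the pdf so the estimate stays unbiased even though sampling is skewed
            contribution = incoming_light(b) / (pdf[b] * num_bins)
            new_histogram[b] += contribution    # "learn": remember where the energy came from
            if b in BRIGHT_BINS:
                hits += 1
        energy_histogram = new_histogram
        print(f"iteration {it}: {100.0 * hits / rays_per_iteration:.1f}% of rays hit the bright window")

guided_sampling()
```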

Bidirectional Pathtracing (BDPT)

Regular backward path tracing has a hard time in indoor scenes with small light sources, because it takes lots of rays and bounces to find a tiny light in a room, just to see if an object is lit by that light.

With bidirectional path tracing, rays are fired from both the camera and the light sources. They are then joined together to create many complete light paths.

Spectral rendering

image by Silverwing

Unlike most renderers, which work with RGB colours, spectral renderers use spectral colour throughout, from the physically-based sky model to the reflective and refractive properties of materials. The material models are completely based on the laws of physics.
This makes it possible to render transparent materials like glass and water at the highest degree of realism.
Spectral renderers are also pretty good at simulating different participating media and atmospheric effects, like underwater scenes or the Earth's atmosphere.
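A small taste of what "spectral throughout" buys you: the refractive index depends on the wavelength, so a spectral renderer bends each wavelength by a slightly different amount (dispersion), which an RGB renderer can only fake. The sketch below uses an approximate Cauchy fit for water; the coefficients are rough illustration values, a real spectral renderer would use measured data.

```python
import math

def water_ior(wavelength_um, A=1.324, B=0.00317):
    """Cauchy approximation n(lambda) = A + B / lambda^2
    (A and B are approximate values for water)."""
    return A + B / (wavelength_um ** 2)

def refracted_angle_deg(incidence_deg, n):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2), with n1 = 1 (air)."""
    theta1 = math.radians(incidence_deg)
    return math.degrees(math.asin(math.sin(theta1) / n))

# the same incoming ray, traced at a few different wavelengths
for name, wavelength in (("blue", 0.45), ("green", 0.55), ("red", 0.65)):
    n = water_ior(wavelength)
    print(f"{name} ({int(wavelength * 1000)} nm): n = {n:.4f}, "
          f"refracted to {refracted_angle_deg(60.0, n):.3f} degrees")
```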

Biased Rendering

What a biased render engine actually does is pre-compute a lot of information before sending out rays from the camera. In simpler words, it uses optimization algorithms to greatly speed up the render time, but in doing so it is not strictly modeling the physics of light; it is giving an approximation.

Here is an example of what spectral rendering is able to do:

Indigo Renderer: planet-scale atmospheric simulation

Unlike other rendering systems which rely on so-called practical models based on approximations, Indigo’s sun and sky system is derived directly from physical principles. Using Rayleigh/Mie scattering and data sourced from NASA, Indigo’s atmospheric simulation is highly accurate. It’s stored with full spectral information, and allows fast rendering and real-time changes of sun position.
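To get a feel for the Rayleigh part of that: the scattering strength falls off with the fourth power of the wavelength, which is why the short blue wavelengths dominate the sky. A quick back-of-the-envelope check:

```python
def rayleigh_relative(wavelength_nm, reference_nm=550.0):
    """Rayleigh scattering strength relative to a reference wavelength,
    using only the lambda^-4 proportionality (constants cancel out)."""
    return (reference_nm / wavelength_nm) ** 4

for name, wavelength in (("violet", 400.0), ("blue", 450.0), ("green", 550.0), ("red", 700.0)):
    print(f"{name} ({int(wavelength)} nm) scatters {rayleigh_relative(wavelength):.2f}x "
          "as strongly as 550 nm green")
```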

Some examples of atmosphere simulations by Indigo forum user Yonosoy.

image by Yonosoy.
image by Yonosoy.
image by Yonosoy.
image by Yonosoy.
image by Yonosoy. Even a complete planet atmosphere simulation is possible.

Alien fractal Landscape

I am starting to dig up old projects and scenes I've finished. The idea of this project was to take my fractals and scatter them into a landscape. The goal is to create a "real" alien landscape. I was looking into "alien landscape" art, but there is not much to find on the internet; most alien landscape art is quite earth-like, with uber-epic rock formations or giant mushrooms. The closest thing I found was from games like No Man's Sky, Ratchet & Clank or Starlink.

So, here we go, one attempt at an alien landscape. The camera flight is rendered with 5000 spline primitives in the Arnold renderer.


quick and dirty work in progress!