Chronicles of our epic weekend projects.
Wednesday, December 18, 2013
SSGI in Unity
A simple SSGI (Screen Space Global Illumination) effect was a surprisingly quick feature to add, as most of the legwork had already been done by Unity's SSAO (Screen Space Ambient Occlusion) Image Effect.
The aim was to add additional contextual information to the final render, by way of indirect bounce light (or more accurately - color bleeding) from nearby surfaces.
SSGI is OFF, SSAO is OFF.
We lack substantial spatial cues.
The process I used is identical to SSAO, only instead of merely measuring contributions to the occlusion term, I allowed the occluding surfaces to contribute local surface color as well.
Altering the SSAO shader, I stuffed the AO term into the _SSGI tex's .a channel, and the GI term into the .rgb channels, then blended the color bleeding signal with the original image, using a remapped ao signal.
sampler2D _SSGI;

half4 frag( v2f i ) : COLOR
{
    half4 c = tex2D (_MainTex, i.uv[0]);
    half ao = tex2D (_SSGI, i.uv[1]).a;
    ao = pow (ao, _Params.w);
    half3 gi = tex2D (_SSGI, i.uv[1]).rgb;
    c.rgb = lerp(c.rgb, gi, 1-ao);
    return c;
}
What's nice about this hack is all the parameters for AO are separable and compatible with those for GI.
SSGI is ON, SSAO is ON.
Indirect illumination and occlusion provide greater spatial context.
Saturday, November 30, 2013
Models
I activated my 30 day Unity Pro trial license a few days ago so we're frantically trying to finish this forest sprite demo. While Lou is coding like mad, I am going to share some of our models with you. We are using Maya 2013 to build and rig (where necessary) our assets. We are using Photoshop and Mari to texture our character and Unity to shade our assets. In Unity we have actually created our own "uber shader" which almost all of our assets are currently using. For now, the modeling.
Here's our set from our main camera with all of our assets referenced into a scene and dressed according to our artwork. The green bits are maya joints for some vegetation that will have some wind animation in Unity.
This is one of the flowers that also has a skeleton bound to it.
Here is the same set from a different camera to show our hackery. The beauty of working in context is that we can shape all of our dressing to our camera.
The log is detailed where it needs to be and sparse everywhere else.
This is our forest sprite character. We are using our cache exporter and importer to bring our animation into Unity.
Wednesday, November 20, 2013
Moss Shader
We wanted some soft "texture" in our scene to make it more inviting and organic, so in envisioning the scene we decided we ought to add some cushy moss to the fallen log.
To reiterate - we're using Unity on a Mac - so we don't have access to DirectX 11's tessellation support to grow isoline fur or anything like that. The tried-and-true method in games prior to that technology has been the shell/fin method - but we wanted to take that idea a step further.
A simple 3 pass shell extrusion shader.
In order to have decent-quality fur using the method mentioned above, you need a significant number of shells, each serving as a cross section of the fur volume (which in Unity would each require a separate draw call per sub shader, since we'd prefer to do all the extrusion using the vertex program). This can bring down performance, requires a lot of redundant code, and a long compilation cycle.
I want to keep the draw call count low per model, so instead I chose to integrate the effects of multiple cross-section layers in only one layer. This essentially turns into a ray casting problem.
Rather than spend a bunch of shader instructions on the ray casting itself, I hijacked another popular shading concept: parallax mapping.
Essentially, naive parallax mapping allows for a texture to seem "deeper" inset into the surface being shaded, as a function of a height map. If you replace the height map value with a constant, you can set the entire texture a certain "depth" into the material. Do this enough times with linear varying depths, and integrate things such as opacity, surface normal, root-tip color, etc in a loop, and you get a holographic ray-marched-esque shader of the fur volume.
In pseudocode:
for each iteration
{
    shift uv by ParallaxOffset(0, depth*iteration/numIters, viewDirection);
    look up cross section tex2D(tex, shiftedUV);
    integrate tinted texture over length with root & tip color;
    integrate alpha over length with root & tip opacity;
    integrate normal from tex2D(NormalMap, shiftedUV);
}
normalize(Normal);
With 30 iterations and 1 draw call you can achieve the look of 30 shells.
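To make the integration concrete, here's a small Python model of the loop. It's purely illustrative - the function and argument names are stand-ins (not the actual shader), with a callable in place of the tex2D lookup:

```python
def parallax_fur(uv, view_dir, depth, num_iters, cross_section,
                 root_color, tip_color, root_alpha, tip_alpha):
    """Front-to-back integration of num_iters virtual shells at one fragment.

    cross_section(uv) stands in for the tex2D lookup and returns
    coverage in [0, 1]; every name here is illustrative.
    """
    out_rgb = [0.0, 0.0, 0.0]
    out_a = 0.0
    for i in range(num_iters):
        t = i / max(num_iters - 1, 1)       # 0 at the tip layer, 1 at the root
        # the ParallaxOffset-style shift: deeper layers slide along the view
        shifted = (uv[0] + view_dir[0] * depth * t,
                   uv[1] + view_dir[1] * depth * t)
        coverage = cross_section(shifted)
        # blend tip->root color and opacity over the strand length
        layer_rgb = [tc + (rc - tc) * t for tc, rc in zip(tip_color, root_color)]
        layer_a = (tip_alpha + (root_alpha - tip_alpha) * t) * coverage / num_iters
        for c in range(3):
            out_rgb[c] += layer_rgb[c] * layer_a * (1.0 - out_a)
        out_a += layer_a * (1.0 - out_a)
    return out_rgb, out_a
```

The front-to-back compositing means nearer (tip-side) layers occlude deeper ones, which is why one pass can stand in for a stack of shells.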
One shortcoming of naive parallax mapping is that you don't have true silhouette edges. This could be solved by implementing relief mapping. Instead I opted to AlphaTest anything lower than a certain fixed value, allowing for blades of moss and grass to be clipped.
Clumping was achieved by additionally offsetting the uv value at the tip by some smoothly varying vectors, encoded in a normal map generated from tiling FBM. The same approach drives the keep-alive/wind motion.
Gross directionality changes and density are attenuated via vertex colors = (tangentShift, bitangentShift, 0, density).
Resulting hologram moss.
Friday, November 1, 2013
Colored Shadows in Unity
Sometimes you just want colored shadows. It may not be physically correct, but perhaps you're making a stylistic choice. Unfortunately our options are limited. Unity's Ambient Light is a global concept, so if you want to add colored shadows on a per-material basis for fine-grained control, you probably need to implement it in your shader.
In order for our forest to feel more organic, I made some adjustments to all of our custom shaders - a very simple addendum within our lighting models.
Let's start with standard lambert irradiance.
c.rgb = s.Albedo * saturate(NdotL) * atten;
There are two things happening here: we're diminishing the color of the material by the diffuse component (saturate(NdotL)), as well as by the combined factor of the surface's distance to the light source and whatever cast shadows are occurring (atten). Both of these are 0->1 values. This is what that looks like:
With all that black we lose a lot of contextual information.
Let's add some terms to our lighting to make it more interesting.
c.rgb = (s.Albedo * saturate(NdotL) + s.Albedo * _ShadowColor.rgb * (1-saturate(NdotL))) * atten
+ s.Albedo * _ShadowColor.rgb * (1-atten);
Since both the diffuse and attenuation/shadow components map 0->1 as "in shadow"->"in light", we can remap them, multiply against a per-material _ShadowColor, and add the result back into the mix. Now colors that used to be crushed to (0,0,0) are (_ShadowColor.r,_ShadowColor.g,_ShadowColor.b), and our whites are preserved:
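The remapped term above can be sketched per-channel in plain Python (a stand-in for the shader math, not the shader itself):

```python
def shadow_lit(albedo, shadow_color, ndotl, atten):
    """Colored-shadow lambert: shadowed regions tend toward albedo * shadow_color
    instead of black; fully lit regions are untouched."""
    d = max(0.0, min(1.0, ndotl))  # saturate(NdotL)
    return tuple(
        (a * d + a * s * (1.0 - d)) * atten + a * s * (1.0 - atten)
        for a, s in zip(albedo, shadow_color)
    )
```

Note that both shadowing paths (facing away from the light, and attenuated/occluded) land on the same tinted floor, so the two terms stay visually consistent.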
Per-material colored shadows for optimal control.
Friday, October 25, 2013
Procedural Wind Effects in Unity
Rather than animating and caching a bunch of flower bending by hand, we wanted to create some wind procedurally. This wind is simply for "keep alive" - to add some subtle movement so our scene does not look completely static and CG. In order to do this, the flowers were rigged, and three scripts were written: WindSim, WindObject, and AssignIDs.
WindSim
The WindSim script is attached to any flower or object that has a rig and wants to be simmed. The object "wants" to be simmed if it is specifically tagged in the inspector, which lets us turn simulation off easily rather than removing the script entirely from the object. WindSim does the actual moving of the joints and takes in a direction from WindObject. It has attributes such as strength and animation offset to vary the wind motion from object to object. Here is the noise function we came up with:
(Mathf.Pow(Mathf.Tan(Mathf.Sin(Time.time+AnimOffset*id))/2.0f,4f)-.1f)*10F*Mathf.PerlinNoise(Time.time/2F,Time.time/2F);
Inigo Quilez's Graph Toy or the Grapher tool for Macs are really helpful tools to come up with interesting noise functions. This is the function visualized with Graph Toy:
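For poking at the curve outside the editor, here's a Python port of that expression. Mathf.PerlinNoise is stubbed with an assumed repeatable function in [0, 1], so only the gust shape matches the C# version:

```python
import math

def fake_perlin(x, y):
    # stand-in for Mathf.PerlinNoise: repeatable, smooth-ish, in [0, 1]
    return 0.5 + 0.5 * math.sin(12.9898 * x + 78.233 * y)

def wind_noise(t, anim_offset, obj_id):
    # the gust shape: mostly near -0.1 (a slight pull-back), with sharp
    # positive spikes whenever sin(...) approaches +/-1
    gust = (math.tan(math.sin(t + anim_offset * obj_id)) / 2.0) ** 4 - 0.1
    return gust * 10.0 * fake_perlin(t / 2.0, t / 2.0)
```

The tan-of-sin trick is what gives the intermittent "gusty" spikes instead of a steady sway; the per-object id and offset desynchronize neighboring flowers.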
WindObject
WindObject is attached to an empty gameObject and serves as a global way of controlling the direction of the wind as well as other parameters. The direction is just based on the gameObject’s directional vector. We have a simple Gizmo line drawing in the direction of the wind which is visible in the scene view. Like WindSim, WindObject also has strength, flexibility, and animation offset attributes that are passed to WindSim initially by default. If any of these attributes are set in WindSim, then they override WindObject’s settings.
Here's a screenshot of our WindObject selected. The yellow line tells us which direction the wind is blowing.
Object IDs
We created AssignIDs as an editor script to add some randomization for each simmable object. It looks for all simmable objects in the scene and incrementally assigns an id to each of them.
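The pass itself is simple; a rough Python stand-in for the editor script (the tag name and object representation here are assumptions, not the real scene API):

```python
def assign_ids(scene_objects, tag="Simmable"):
    """Walk the scene list and hand each tagged object a sequential id."""
    next_id = 0
    ids = {}
    for name, obj_tag in scene_objects:
        if obj_tag == tag:
            ids[name] = next_id
            next_id += 1
    return ids
```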
Here's an example of our wind in action.
The result is a globally controllable wind sim with the ability to tweak the look of each simmed object if necessary.
Thursday, October 10, 2013
Cache, Money
One requirement of our realtime effort is that we be able to achieve film-quality animation within Unity. The current release doesn't support blend shapes, and even if it did, we'd want to leave ourselves some room to use other deformer solutions, as well as allow the animation to come from any arbitrary 3rd party package one could imagine. The industry's standard answer to a problem like this is to point cache the scene data.
At this time, Unity also does not support caches. So! We have to make our own, or buy something someone else has made. We're frugal, so we made our own.
The concept is simple - store non frame-varying data once (triangles, uvs), and frame-varying data once per frame (vert positions, normals), in some text file, and construct a parser in Unity to build out the resulting mesh, predicated upon a frame in the range you wish to animate.
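As a sketch of that layout - assuming a JSON container, which is just a stand-in for whatever text format the real exporter writes:

```python
import json

def write_cache(path, triangles, uvs, frames):
    """Topology and uvs stored once; positions/normals stored once per frame.

    frames is a list of (positions, normals) pairs, one entry per frame.
    """
    data = {
        "triangles": triangles,             # flat index list, non frame-varying
        "uvs": uvs,                         # one uv pair per vertex
        "frames": [
            {"positions": p, "normals": n}  # frame-varying data
            for p, n in frames
        ],
    }
    with open(path, "w") as f:
        json.dump(data, f)

def read_frame(path, frame):
    """Parser side: rebuild the mesh data for one frame in the cached range."""
    with open(path) as f:
        data = json.load(f)
    fr = data["frames"][frame]
    return data["triangles"], data["uvs"], fr["positions"], fr["normals"]
```

The key property is that the per-frame payload is only positions and normals, so scrubbing to an arbitrary frame is one lookup, not a re-simulation.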
One problem though...
Unity uses a different coordinate system than most other 3d packages. As an example, we're using Maya for our asset creation - and it happens to use a right-handed coordinate system, whereas Unity uses a left-handed one.
In practice all our positional and directional vectors (i.e. vertex positions and normals) have to be flipped (*-1) in the x-axis (or "right" vector). But that's not enough! If you were to only apply that flip, all your normals would be inverted, so you must take care to also reverse the winding-order of your triangles upon export.
The winding-order simply defines what is considered the "front-face" or "back-face" of a polygon, and it's really just what it sounds like - the order in which the vertices are listed in the polygon's array. By reversing the winding-order, you effectively flip the face normal, and now all is right in the world.
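Both conversions can be demonstrated in a few lines of Python (illustrative helpers, not our actual exporter code):

```python
def maya_to_unity(vertices, triangles):
    """Flip x on every position and reverse each triangle's winding."""
    flipped = [(-x, y, z) for (x, y, z) in vertices]
    rewound = [(a, c, b) for (a, b, c) in triangles]
    return flipped, rewound

def face_normal(verts, tri):
    # unnormalized cross product of two triangle edges
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = (verts[i] for i in tri)
    ux, uy, uz = bx - ax, by - ay, bz - az
    vx, vy, vz = cx - ax, cy - ay, cz - az
    return (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
```

Flipping positions alone leaves the computed face normal pointing where it did before, i.e. inverted relative to the now-mirrored surface; reversing the winding as well yields the properly mirrored normal.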
What's fantastic about these two simple observations is that (as far as I can tell) this is exactly what's happening within Unity's Maya Scene Importer. Now anything statically imported from Maya through Unity's traditional pipeline will match perfectly with anything dynamically cached!
Thursday, September 26, 2013
Unity Dust Particle Shader
You can now buy our particle shader on the Unity asset store for $5!
NOTE: If you're seeing WHITE SQUARES, please try adding this line:
AlphaTest Greater .01
under the line:
Tags {"RenderType"="Transparent" "Queue" = "Transparent" "IgnoreProjector" = "True"}
See below for more info. I'll release an update to the shader soon with this in there by default. Sorry for the confusion!
We're using Lou's painting as inspiration for our first test shot. I started developing a shader for Unity's particle system to create the close-up floaties. The tricky part here was that I had no way of accessing particle ids in the shader, which meant that I couldn't randomize the particles in the shader. I wanted to randomize opacity, color and flashes. I originally tried a perlin noise texture lookup based on each particle's screen-space position to get a random value per particle, which seemed viable since our camera is static. But I didn't get very good results with it, and instead varied the opacity in the particle system by manually setting Alpha to be 0 or 255 in Color over Lifetime. The varying opacity was used to drive a super bright emission, and the color naturally looked random because of the different opacities.
o.Emission = lerp(0,_Emission, smoothstep(_FlashSpeed,1,o.Alpha));
Large Blurry Dust
For the coloration, I wanted to create a "photographic" look. I looked a lot at images like these I found on Google:
Even though the overall mote looks circular, it is actually a cutout of an imperfect circle texture I painted. This just adds a subtle randomness to all of them as they slowly spawn and rotate (I think anyway).
float dist = distance(IN.uv_Noise.xy, float2(.5,.5));

// cut out the painted shape texture
if(_DoShape)
    o.Alpha = o.Alpha*tex2D(_Shape,IN.uv_Noise.xy).r;
//otherwise make a circle
else{
    if(dist>=.5)
        o.Alpha = 0;
}

// soften toward transparency inside the blur radius
if(dist<(.5-_Blur)){
    o.Alpha = o.Alpha*max(1.0*pow(dist/(.5-_Blur),2),(1-_Transparency));
    //add a shadow color as the particle fades
    o.Albedo = lerp(_ShadowColor.rgb, o.Albedo,o.Alpha);
}

// speckle tint from slowly scrolling noise
o.Albedo = lerp(o.Albedo,_Speckles.rgb,tex2D(_Noise,(IN.uv_Noise.xy*.1+_Time*.01)).r);
As a bonus, I created a custom lighting model for these dust motes which simulates light scattering through them.
half4 LightingParticle(SurfaceOutputCustom s, half3 lightDir, half3 viewDir, half atten) {
    half4 c;
    c.rgb = s.Albedo;
    c.a = lerp(0, s.Alpha, 1 - saturate(dot(normalize(viewDir), lightDir)));
    return c;
}
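The alpha term is the whole trick; in plain Python (a numeric stand-in, not the shader):

```python
def particle_alpha(base_alpha, view_dot_light):
    """Alpha fades to zero as the view direction lines up with the light
    direction, and reaches full strength when the mote is backlit - which
    reads as light scattering through the dust."""
    s = max(0.0, min(1.0, view_dot_light))  # saturate(dot(viewDir, lightDir))
    return base_alpha * (1.0 - s)           # lerp(0, alpha, 1 - s)
```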
NOTE: the lighting model does not work with multiple lights and can result in the particles looking blown out, or white. Since the particles are alpha-mapped billboards, they will look like white squares. There are a couple of ways to fix this. One is to add "alpha" to the #pragma surface line. This basically overrides the lighting model (for whatever reason):
#pragma surface surf Particle alpha
If you want to preserve the ability to have a light source create a scattering effect, add the following line:
AlphaTest Greater .01
under the line:
Tags {"RenderType"="Transparent" "Queue" = "Transparent" "IgnoreProjector" = "True"}
I believe the first light that was created in the scene is the only light that can create the scattering effect.
This one was much simpler - basically white, with the cutout being an irregular polygon texture.
And here is my result:
Dabble #1 Realtime Forest Sprite
We work in the animation industry and we know how much it sucks to wait for renders to come back. So we're embarking on a little experiment. Inspired by Square-Enix's "Agni's Philosophy" and Unity's "Butterfly Effect", we want to see if we can create a near film quality shot/test/short in "real-time" in our free time. For this project, we are using Maya to create assets and set dress to a camera and Unity for shading, lighting, and rendering.
Our test idea is centered around a tiny forest sprite running across a fallen log. This sprite actually came from a game idea we had but shelved because of our lack of DirectX 11 (we're using Macs). Here is some initial artwork for the sprite. We are actually pretty far along on this project already (we've only spent 2 weekends so far), but we'll update the blog with pipeline scripts and pretty shaders we had to develop.