Blending transparent textures with depth

I am trying to blend textures which have transparent areas:

glEnable( GL_TEXTURE_2D );
glBindTexture( GL_TEXTURE_2D, ...);
glVertexPointer( 2, GL_FLOAT, 0, ... );
glEnable( GL_BLEND );
glDrawArrays( GL_TRIANGLE_STRIP, 0, 4 );

Unless I add glDisable(GL_DEPTH_TEST), transparent parts of the top textures overwrite everything beneath them (instead of blending with it). Is there any way to do this without disabling the depth test? I have tried various blend functions but none of them helped.

Enabling depth test doesn’t actually sort your geometry by depth—in the usual GL_LESS case, it merely prevents primitives from drawing if they aren’t closer to the viewer than what has previously been drawn. This allows you to draw opaque geometry in whatever order you want and still get the desired result, but correct rendering of blended geometry typically requires everything behind the blended object to have already been rendered.

Here’s what you should be doing to get a mix of opaque and blended geometry to look right:

  1. Separate your blended geometry from your opaque geometry.
  2. Sort your blended geometry from back to front.
  3. Draw all the opaque geometry first as usual.
  4. Draw your blended geometry in sorted order. You’ll want to leave depth testing enabled but temporarily disable depth writes with glDepthMask(GL_FALSE).

Alternatively, if your content is always either fully opaque or fully transparent, you can just enable alpha testing and disable blending, but I'm guessing that's not what you're looking for.

If you're in WebGL or OpenGL ES 2.0 (iPhone/Android) there is no alpha testing. Instead you need to avoid drawing transparent pixels altogether; that way they won't affect the depth buffer, since no fragment is written. To do that you discard transparent pixels in your fragment shader. You could hard-code it:

void main() {
   vec4 color = texture2D(u_someSampler, v_someUVs);
   if (color.a == 0.0) {
      discard;
   }
   gl_FragColor = color;
}

or you could simulate old-style alpha testing, where a uniform sets the cutoff value:

uniform float u_alphaTest;
void main() {
   vec4 color = texture2D(u_someSampler, v_someUVs);
   if (color.a < u_alphaTest) {
      discard;
   }
   gl_FragColor = color;
}

Rendering glitch with GL_DEPTH_TEST and transparent textures

That link (and sorting on the CPU) is for alpha blending. If you only need alpha testing (not blending), then you don't need to sort anything. Just enable the alpha test, keep the depth test enabled, and everything will render fine.

See here: http://www.opengl.org/wiki/Transparency_Sorting You want the "Alpha test" case, which only requires alpha testing, not the "Standard translucent" case, which requires sorting.

Solution #1:

  1. Render all non-transparent objects first in any order, depth buffer enabled. That includes all objects that use alpha testing without alpha blending.
  2. For glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) objects (smoke/glass/grass): Render transparent scene from furthest polygon to nearest polygon with depth buffer write disabled (glDepthMask(GL_FALSE)). If all transparent objects are convex and do not intersect, you can sort objects instead of polygons.
  3. For glBlendFunc(GL_SRC_ALPHA, GL_ONE) and glBlendFunc(GL_ONE, GL_ONE) objects (fire, "magic" particle systems, glow): Render the transparent scene in any order, with depth buffer writes disabled (glDepthMask(GL_FALSE)).
  4. Do not render any depth-buffer-enabled objects after step #3.

Solution #2:
Use depth peeling (google it), especially if transparent objects intersect each other. It is not suitable for particle systems and grass, which still require Solution #1.

and then manually sort them on the CPU pretty much every single frame

Insertion sort works great for already sorted or partially sorted data.

There has to be a way to delegate this to the GPU...

I think you can generate grass polygons (in the correct order) in a geometry shader, using a texture with a channel (say, alpha) that marks areas with and without grass. This requires OpenGL 3.2 or later for geometry shaders, and you will probably have to perform some kind of higher-level sorting on the polygons you feed to the shader to generate grass patches.

Individual shrubs can be rotated in the vertex shader (by ±90/180/270 degrees) to maintain correct polygon ordering, provided they're perfectly symmetrical in all directions.

And there's the merge sort algorithm, which parallelizes well and can be performed on the GPU using either a GPGPU approach or OpenCL/CUDA.

However, using something like that to render 5 shrubs of grass is roughly equivalent to trying to kill a cockroach with a grenade launcher - a fun thing to do, but not exactly efficient.

I suggest forgetting about "offloading it to the GPU" until you actually run into a performance problem. Use profilers and always measure before optimizing; otherwise you'll waste a large amount of development time on unnecessary optimizations.