Spout

Problem with alpha between Unity and FFGL

Hi

I’m working on an FFGL plugin that renders a texture coming from Unity. I’m having a problem with the alpha channel and I don’t know how to proceed.

This is an image of the render result in SpoutDemoReceiver, Resolume’s native Spout source and my own FFGL plugin. Notice the glow around the right cube in the first two images and the lack of glow in the third.

Inside my plugin, which uses the SpoutGL receiver from current master branch, I have the following code:

spoutReceiver.IsUpdated();
if( spoutReceiver.ReceiveTexture() )
{
    ffglex::ScopedShaderBinding shaderBinding( shader.GetGLID() );
    ffglex::ScopedSamplerActivation activateSampler0( 0 );
    spoutReceiver.spout.BindSharedTexture();
    shader.Set( "Tex", 0 );
    shader.Set( "maxUV", 1.0f, 1.0f );
    quad.Draw();
    spoutReceiver.spout.UnBindSharedTexture();
}

When modifying the shader to only render the alpha channel, the area around the right cube is black. This leads me to believe that the problem is not something with straight vs premultiplied, but that there is something going wrong converting the texture from DX to OpenGL. Of course this could also be me being a complete idiot and forgetting something obvious.

Either way, I’m out of ideas where to look next, so I’m grateful for any guidance anyone here can provide.

Hi Joris,

I am not exactly sure of what is sending and receiving in that image, but it certainly looks like the problem is isolated to Unity.

What is the “Unity Player”? Are you using the “KlakSpout” plugin? Perhaps some setup is needed. I see in the readme that “Camera capture mode” and “Render texture mode” both have an alpha checkbox.

There is mention of a “custom render texture”. I have no experience with this, but perhaps the texture format there could be important. Texture format compatibility is limited for the GL/DX interop (SpoutGL.cpp, line 672). The discussion in KlakSpout closed issue #66 could also be relevant.

What happens if you don’t activate your shader? Would a simple test image from Unity help track it down?

Thanks for the quick reply, and my apologies for not providing more info on the sender. It’s always hard to find a balance between legibility and info dumping :) I very much appreciate your help in this!

The three images are all receivers, displaying the same sender. As previously stated, the receivers from top to bottom are SpoutDemoReceiver, Resolume’s native Spout source and my FFGL plugin using SpoutGL.

The sender is indeed created using Klak Spout in HDRP, which works correctly and as expected. I don’t have any of the issues described in KlakSpout#66, I think Keijiro is correct in closing it. The problem occurs when using both DXGI_FORMAT_R8G8B8A8_UNORM and DXGI_FORMAT_R16G16B16A16_UNORM for the render texture. Neither of those is explicitly mentioned in SpoutGL.cpp but both work fine on non-transparent textures. Both also work in SpoutDemoReceiver and Resolume with semi transparency. It’s only my plugin that has problems with semi transparency.

The SpoutDemoReceiver and Resolume’s native Spout renderer both correctly render the semi opaque pixels around the right cube. To me, this indicates that the texture coming out from Unity has the correct information in it. Also, it would seem the texture format is supported by Spout, or at least is compatible with the GL/DX interop.

My plugin doesn’t use fixed-function OpenGL, so as far as I know it’s not possible to render without a shader. Even so, I’m not doing anything else aside from
fragColor = texture( Tex, uv );
or
vec4 color = texture( Tex, uv ); fragColor = vec4( vec3( color.a ), 1.0 );
when debugging the alpha channel information.

I’m happy to provide stripped down projects and source files to help track this down. I’m 100% sure I’m doing something silly, but from how SpoutDemoReceiver and Resolume render the texture correctly, I feel that the silliness is somewhere in how I’m using the SpoutGL functionality, not in the texture coming from Unity.

(edited to further clarify receiver and sender setup, and in which setup the problem occurs)

You mention DXGI_FORMAT_R8G8B8A8_UNORM and DXGI_FORMAT_R16G16B16A16_UNORM.

Format testing goes a long way back now, but it’s clear that I should resurrect this and document it better. It’s described in more detail in 2.006, at line 152 of SpoutSDK.cpp.

Meanwhile in SpoutGL.cpp, line 680 you will see that DXGI_FORMAT_R16G16B16A16_FLOAT, DXGI_FORMAT_R16G16B16A16_SNORM and DXGI_FORMAT_R10G10B10A2_UNORM have been tested successfully.

This means that DXGI_FORMAT_R16G16B16A16_UNORM did not work. I can say that DXGI_FORMAT_R8G8B8A8_UNORM works OK with DirectX11 but not with DirectX9. That might also mean there is some trouble with it that I have not discovered. But I see that KlakSpout uses DXGI_FORMAT_R8G8B8A8_UNORM as default and it has been OK so far.

I would like to understand the problem better and help to track this down. I don’t have a lot of time right now but will try to build your plugin if I can add it as an extra project to the Resolume FFGL repo download to see what I can do with it. Testing with Unity is another step though.

Meanwhile have a look specifically at DXGI_FORMAT_R16G16B16A16_UNORM and whether you can avoid it.

You can contact me by email for downloads etc.

Cool. I understand that this is a bee’s nest of compatibility issues between different software, so I won’t take up any more of your time. Spout is amazing, I wouldn’t have gotten this far without it, and I appreciate the insight you were able to provide. Thank you!

OK no problems.

To confirm the suspected problem with DXGI_FORMAT_R16G16B16A16_UNORM, I will re-visit format testing now that I have DirectX examples to work with and let you know. The GL/DX interop spec lacks documentation about formats.

Edit - all tested with the demo receiver

DXGI_FORMAT_R32G32B32A32_FLOAT = 2
DXGI_FORMAT_R16G16B16A16_FLOAT = 10
DXGI_FORMAT_R16G16B16A16_UNORM = 11
DXGI_FORMAT_R16G16B16A16_SNORM = 13
DXGI_FORMAT_R10G10B10A2_UNORM = 24
DXGI_FORMAT_R8G8B8A8_UNORM = 28
DXGI_FORMAT_R8G8B8A8_UNORM_SRGB = 29
DXGI_FORMAT_R8G8B8A8_SNORM = 31
DXGI_FORMAT_B8G8R8A8_UNORM = 87
DXGI_FORMAT_B8G8R8X8_UNORM = 88

DXGI_FORMAT_R16G16B16A16_UNORM works OK, but I have not tested alpha specifically.

Update -

I have made a test program to show received alpha values. Could you tell me whether your plugin works correctly if not using HDRP?

Wow. Thanks for the commitment, I really appreciate it. I can confirm that my plugin has the same problems when not using HDRP, both in Render Texture mode and Camera mode for the Spout Sender.

It’s very strange behavior. It looks like all the color and alpha information is there and correct, but when doing anything with the actual .a value of a pixel, it’s as if all semi-transparent pixels have an alpha value of 0.

OK thanks Joris. That means the format idea is a red herring. I will investigate further.

I can also confirm that Resolume’s native Spout player has the same problem. The texture looks correct, but when inspecting the alpha value of the semi transparent pixels, they are all 0. Very strange.

I think it all works as expected. But it’s not exactly straightforward.

I set up an OpenGL receiver to observe received pixel values.

  1. DirectX sender : clear the backbuffer to a test colour and try different backbuffer formats.

Clear backbuffer to :
0.2f, 0.4f, 0.5f, 0.1f

Received in OpenGL RGBA texture for backbuffer formats :
DXGI_FORMAT_R8G8B8A8_UNORM
DXGI_FORMAT_B8G8R8A8_UNORM
DXGI_FORMAT_R16G16B16A16_FLOAT
0.200000 0.400000 0.498039 0.098039

The values are not exact because they are quantized to the 0-255 range.
For example :
255*0.5 = 127.5
127/255 = 0.498039

Except for sRGB
DXGI_FORMAT_R8G8B8A8_UNORM_SRGB
(sRGB - Wikipedia)
0.486275 0.666667 0.737255 0.098039
Alpha value is not affected, but the colour space and equations are changed. I am not an expert in this area.

  2. Openframeworks OpenGL sender, rendering to an rgba fbo.
    Clear the fbo colour buffer.
    ofClear(51, 102, 204, 255); - 0.200000 0.400000 0.800000 1.000000
    ofClear(51, 102, 204, 127); - 0.200000 0.400000 0.800000 0.498039
    ofClear(51, 102, 204, 102); - 0.200000 0.400000 0.800000 0.400000
    ofClear(51, 102, 204, 51); - 0.200000 0.400000 0.800000 0.200000
    Received pixels are exactly as sent

  3. GLSL shader rendered to an OpenGL rgba fbo
    Green + Blue, Alpha
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    // vec4 c0 = vec4(0.0, 1.0, 0.1, 0.0); // 0.000000 0.000000 0.000000 1.000000
    // vec4 c0 = vec4(0.0, 1.0, 1.0, 0.1); // 0.000000 0.098039 0.098039 0.909804
    // vec4 c0 = vec4(0.0, 1.0, 1.0, 0.2); // 0.000000 0.200000 0.200000 0.839216
    // vec4 c0 = vec4(0.0, 1.0, 1.0, 0.3); // 0.000000 0.298039 0.298039 0.792157
    // vec4 c0 = vec4(0.0, 1.0, 1.0, 0.4); // 0.000000 0.400000 0.400000 0.760784
    // vec4 c0 = vec4(0.0, 1.0, 1.0, 0.5); // 0.000000 0.498039 0.498039 0.749020
    The received colour values are as expected for the alpha value.
    The received alpha value is changed however (why?).

glBlendFunc(GL_ONE, GL_ONE);
Alpha has no effect, values received including alpha are exactly the same.
Other blend equations have different results, but suffice it to say that they all affect the received pixel values.

It looks like the alpha channel is sent and received OK, but its value depends on the sender. I don’t think this is much closer to solving your problem but, as I understand from these tests, the question is what Unity does to the pixel values, including alpha. Can you send a solid colour from Unity?

Edit -
Solid colour from Arena seems to work OK. Alpha received does not exactly match with the percentage, but it changes as expected. I don’t know where the variation is.

First of all, a huuuuuuge thanks for diving down this rabbit hole with me. Your support is really appreciated and immensely helpful for tracking down the source of this weirdness.

So per your suggestion, I created a very basic solid color quad with controllable RGBA values, and lo and behold, this works completely as expected. I then added a semi transparent particle system next to it, and bam, the particle system still has the problem.

I then did some further diving, and apparently Unity lets you set a blend mode for semi-transparent effects, which is set to multiply by default. I’m assuming that because the background is transparent black, and multiplying any value by 0 results in 0, these semi-transparent pixels from the particle system end up written as straight alpha values, i.e. with correct values for RGB but 0 for alpha.

Now, Resolume expects alpha values to be premultiplied rather than straight, so when it displays the image without any processing it will look correct, but any operation using the alpha channel will result in 0, because that value is in fact 0.

Now to further complicate the issue, I’m fairly certain Resolume does an extra premultiplication pass on every FFGL Source. I’m still waiting to hear back from Resolume support on this (hi Zoli!), but this would explain the difference in output between their native Spout player and my FFGL Source.

So, long story short, it looks like I was caught between not knowing enough about Unity’s render process on the one hand and undocumented behavior in Resolume on the other, and I jumped to the conclusion that the problem was with Spout. I stand corrected. The GL/DX interop works perfectly as expected and the problem was entirely between my chair and my keyboard. I apologise for wasting your time with this.

No problem, it’s not a waste of time at all. I want to get to the bottom of this, because we have always assumed that the texture received is the same as the texture sent.

Indeed that appears to be the case but a lot can go on before the texture is ready to send. The strange result is with the GLSL shader rendered to an fbo. Why is the alpha value changed? I assume it’s something to do with the blend equations.

It appears that you can safely assume premultiplied alpha, but from these results it seems that you can’t be sure of converting back. I don’t have experience with premultiplication methods though, so I am not sure about it.

Finding the Unity blend mode is the key. What happens if you use the equivalent of ONE, ONE? But I guess that won’t help if alpha is already 0. Can you achieve what you want to do?

I know that the difference you see in sRGB occurs because gamma correction is applied on the RGB values.

In your 3rd example, I’m not entirely sure what the RGBA values of the background color you are blending with are. Your calculations would be correct if the background color is vec4( 0.0, 0.0, 0.0, 1.0 ), i.e. opaque black, and you are blending the provided color over it with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);. Either way, I’m in full agreement that the combination of the background color and the selected blending mode on the sender end (in my case Unity) is what’s causing the results I’m seeing.

Short of writing my own scriptable render pipeline, I’ve accepted that there are too many use cases in Unity where I have no control over the blending. Since on the other end Resolume expects premultiplied values instead of straight ones, I also cannot just use the texture as is. So as a workaround, I’m calculating the premultiplied alpha value by taking the max of the .r, .g, .b and .a components and assigning that to the .a component. Although not 100% correct, this results in correct alpha values for fully opaque pixels, correct alpha values for fully transparent pixels, and usable premultiplied alpha values for semi-transparent ones.

Again, I really appreciate you taking the time to dive into this. I love learning about how graphic operations work under the hood and this has taught me so much!

For something meant to be transparent, working with alpha sure gets murky sometimes…

I am familiar with gamma correction, although I understand that sRGB also has a matrix transformation. I am no expert in this though and can’t follow the math easily. But that’s separate from the alpha problem I think.

The 3rd example is what I do not understand. The background colour of the fbo is indeed black and the blending is as you say.

glClearColor(0.0f, 0.0f, 0.0f, 1.0f); // Set background color
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

But I am not sure why the alpha value would change. In any case you seem to have devised a clever solution so you can create your plugin as intended.

I have also learned a lot from this testing and it gives me confidence that the underlying methods are reliable.