This post shows a simple implementation of a Gaussian blur effect with edge preservation.

Sometimes (as in fluid rendering) we need to perform a Gaussian blur while still preserving edges. The following code shows how to achieve this.

First, we compute the Gaussian blur kernel on the CPU and send it to the GPU:

private float sigma;
private float[] kernel;
private Vector2[] offsetsHoriz;
private Vector2[] offsetsVert;
private int radius;
private float amount;

void ComputeKernel(int blurRadius, float blurAmount)
{
    radius = blurRadius;
    amount = blurAmount;
    kernel = new float[radius * 2 + 1];
    sigma = radius / amount;
    float twoSigmaSquare = 2.0f * sigma * sigma;
    float sigmaRoot = (float)Math.Sqrt(twoSigmaSquare * Math.PI);
    float total = 0.0f;
    for (int i = -radius; i <= radius; ++i)
    {
        float distance = i * i;
        int index = i + radius;
        kernel[index] = (float)Math.Exp(-distance / twoSigmaSquare) / sigmaRoot;
        total += kernel[index];
    }
    for (int i = 0; i < kernel.Length; ++i)
        kernel[i] /= total;
}

void ComputeOffsets(float textureWidth, float textureHeight)
{
    offsetsHoriz = new Vector2[radius * 2 + 1];
    offsetsVert = new Vector2[radius * 2 + 1];
    float xOffset = 1.0f / textureWidth;
    float yOffset = 1.0f / textureHeight;
    for (int i = -radius; i <= radius; ++i)
    {
        int index = i + radius;
        offsetsHoriz[index] = new Vector2(i * xOffset, 0.0f);
        offsetsVert[index] = new Vector2(0.0f, i * yOffset);
    }
}

We call both functions at initialization time with the following parameters: blurRadius — typically ranging from 2 to 15 (the kernel has 2 × blurRadius + 1 taps); blurAmount — the blur scale parameter; textureWidth and textureHeight — the dimensions of the texture that will be blurred.
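To see what ComputeKernel above actually produces, here is a small Python sketch of the same math (illustrative only, not part of the original code): the weights follow a Gaussian with sigma = radius / amount and are normalized so they sum to one.

```python
import math

def compute_kernel(blur_radius, blur_amount):
    """Mirror of the C# ComputeKernel: a normalized 1D Gaussian
    with sigma = blur_radius / blur_amount."""
    sigma = blur_radius / blur_amount
    two_sigma_sq = 2.0 * sigma * sigma
    norm = math.sqrt(two_sigma_sq * math.pi)
    kernel = [math.exp(-(i * i) / two_sigma_sq) / norm
              for i in range(-blur_radius, blur_radius + 1)]
    total = sum(kernel)
    return [k / total for k in kernel]

weights = compute_kernel(7, 2.0)
# 15 taps for radius 7; the center tap is the largest,
# and the weights sum to 1 after normalization.
```

Normalizing on the CPU means the shader does not have to renormalize the spatial weights, only the combined spatial × depth weights.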

Here is the shader that performs the Gaussian blur. The RADIUS value must be defined at compile time, and it must match the value used on the CPU side:

#define RADIUS 15
#define KERNEL_SIZE (RADIUS * 2 + 1)

float weights[KERNEL_SIZE];
float2 offsets[KERNEL_SIZE];
float2 GBufferPixelSize;   // half-pixel size of the source render target (0.5f / width, 0.5f / height)
float2 TempBufferRes;      // destination buffer size
float blurDepthFalloff;

sampler2D depthSampler : register(s0);
sampler2D ssaoSampler : register(s1);

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float2 TexCoord : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 TexCoord : TEXCOORD0;
};

VertexShaderOutput VertexShaderBlur(VertexShaderInput input)
{
    VertexShaderOutput output = (VertexShaderOutput)0;
    output.Position = input.Position;
    output.TexCoord.xy = input.TexCoord + GBufferPixelSize;
    output.TexCoord.zw = input.TexCoord + 0.5f / TempBufferRes;
    return output;
}

// Bilateral blur, three-channel (color) version
float4 PS_GaussianBlurTriple(float4 texCoord : TEXCOORD0) : COLOR0
{
    float3 color = 0;
    float depth = tex2D(depthSampler, texCoord.xy).x;
    float s = 0;
    for (int i = 0; i < KERNEL_SIZE; ++i)
    {
        float3 im = tex2D(ssaoSampler, texCoord.zw + offsets[i]);
        float d = tex2D(depthSampler, texCoord.xy + offsets[i]).x;
        float r2 = abs(depth - d) * blurDepthFalloff;
        float g = exp(-r2 * r2);
        color += im * weights[i] * g;
        s += g * weights[i];
    }
    color = color / s;
    return float4(color, 1);
}

technique GAUSSTriple
{
    pass p0
    {
        VertexShader = compile vs_3_0 VertexShaderBlur();
        PixelShader = compile ps_3_0 PS_GaussianBlurTriple();
    }
}

// Bilateral blur, single-channel (floating-point texture) version
float4 PS_GaussianBlurSingle(float4 texCoord : TEXCOORD0) : COLOR0
{
    float color = 0;
    float depth = tex2D(depthSampler, texCoord.xy).x;
    float s = 0;
    for (int i = 0; i < KERNEL_SIZE; ++i)
    {
        float im = tex2D(ssaoSampler, texCoord.zw + offsets[i]).x;
        float d = tex2D(depthSampler, texCoord.xy + offsets[i]).x;
        float r2 = abs(depth - d) * blurDepthFalloff;
        float g = exp(-r2 * r2);
        color += im * weights[i] * g;
        s += g * weights[i];
    }
    color = color / s;
    return float4(color, 0, 0, 1);
}

technique GAUSSSingle
{
    pass p0
    {
        VertexShader = compile vs_3_0 VertexShaderBlur();
        PixelShader = compile ps_3_0 PS_GaussianBlurSingle();
    }
}

I included two versions of the pixel shader: one for color textures and another for floating-point textures (used in fluid rendering, for example).

The blurDepthFalloff value is an artistic/adjustment parameter: it controls how strongly a depth difference suppresses a sample's contribution.
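The effect of blurDepthFalloff can be seen by evaluating the shader's depth weight in isolation (a Python sketch of the formula used above, for illustration only):

```python
import math

def depth_weight(center_depth, sample_depth, falloff):
    """The edge-stopping factor from the pixel shader:
    g = exp(-(|depth - d| * falloff)^2)."""
    r = abs(center_depth - sample_depth) * falloff
    return math.exp(-r * r)

# Same depth difference (0.02), increasing falloff:
# the sample's influence drops toward zero, so samples across
# a depth discontinuity stop bleeding into the result.
for falloff in (1.0, 10.0, 100.0):
    print(falloff, depth_weight(0.5, 0.52, falloff))
```

Small falloff values approach a plain Gaussian blur; large values make the filter increasingly edge-preserving.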

The DepthTexture is a texture containing the depth of the scene (in z/w form). See this article for more details about how to generate it.

To render the post effect, we draw a full-screen quad twice using the shader above and the parameters evaluated earlier:

rHelper.PushRenderTarget(RenderTarget2D);
effect.Parameters["blurDepthFalloff"].SetValue(blurDepthFalloff);
effect.Parameters["weights"].SetValue(kernel);
effect.Parameters["offsets"].SetValue(offsetsHoriz);
effect.Parameters["GBufferPixelSize"].SetValue(new Vector2(1f / ImageToProcess.Width, 1f / ImageToProcess.Height));
effect.Parameters["TempBufferRes"].SetValue(destinySize.Value);
rHelper.Textures[0] = rHelper[PrincipalConstants.DephRT];
rHelper.Textures[1] = ImageToProcess;
SamplerState s0 = rHelper.SetSamplerState(SamplerState.PointClamp, 0);
SamplerState s1 = rHelper.SetSamplerState(ImageSamplerState, 1);

// Horizontal pass into the intermediate render target
rHelper.RenderFullScreenQuadVertexPixel(effect);
rHelper.PopRenderTarget();

// Vertical pass, reading the result of the horizontal pass
effect.Parameters["offsets"].SetValue(offsetsVert);
rHelper.Textures[1] = RenderTarget2D;
rHelper.RenderFullScreenQuadVertexPixel(effect);

// Restore the previous sampler states
rHelper.SetSamplerState(s0, 0);
rHelper.SetSamplerState(s1, 1);
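The two-pass ping-pong structure above (horizontal into a temporary target, then vertical reading it back) can be sketched in plain Python; this is an illustrative model with a plain Gaussian and no depth term, just to show the data flow:

```python
def blur_pass(image, weights, horizontal):
    """One pass of the two-pass scheme: convolve each pixel with
    the 1D kernel along one axis, clamping at the borders
    (the equivalent of clamp texture addressing)."""
    h, w = len(image), len(image[0])
    radius = len(weights) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for i, wt in enumerate(weights):
                o = i - radius
                sx = min(max(x + (o if horizontal else 0), 0), w - 1)
                sy = min(max(y + (0 if horizontal else o), 0), h - 1)
                acc += image[sy][sx] * wt
            out[y][x] = acc
    return out

weights = [0.25, 0.5, 0.25]              # tiny normalized kernel
image = [[0.0] * 4 for _ in range(4)]
image[1][1] = 1.0                        # a single bright pixel
temp = blur_pass(image, weights, True)   # horizontal pass -> temp target
final = blur_pass(temp, weights, False)  # vertical pass reading temp
```

The single bright pixel spreads into a 3 × 3 patch while the total energy is preserved, which is exactly what the two render-target passes achieve on the GPU.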

Quite simple!

The Gaussian blur with edge preservation is not separable (we can't perform independent X and Y passes), but for performance we normally accept the small error introduced by the two-pass approximation.
