creating normals from alpha/heightmap inside a shader

http://www.polycount.com/forum/showthread.php?t=117185

I am making some custom terrain shaders with Strumpy's editor and I want to be able to create normals based on my blend mask. Does anyone know how I can turn grayscale data into basic normal mapping info? I have seen someone do this in Unreal but I can't remember where.

You probably want to take the difference of the values pixel by pixel using ddx/ddy, I think.

Off the top of my head... for Cg/HLSL:

float heightmap = ...; // your sampled heightmap value
float3 normal;
normal.x = ddx(heightmap);
normal.y = ddy(heightmap);
normal.z = sqrt(saturate(1.0 - normal.x*normal.x - normal.y*normal.y)); // reconstruct z so the normal has unit length

For OpenGL, I think the functions are dFdx and dFdy.

DirectX 11 has ddx_fine and ddy_fine, which I think give more accurate results.
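For what it's worth, a rough sketch of that in HLSL (the function and strength uniform names are mine, not from any of the snippets here; ddx_fine/ddy_fine need #pragma target 5.0):

float _DerivStrength; // hypothetical scale factor, tweak to taste

float3 NormalFromHeightDerivatives(float height)
{
    // ddx_fine/ddy_fine need Shader Model 5; swap in ddx/ddy for older targets
    float dx = ddx_fine(height);
    float dy = ddy_fine(height);
    // treat the derivatives as slopes and rebuild a unit-length normal from them
    return normalize(float3(-dx * _DerivStrength, -dy * _DerivStrength, 1.0));
}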

That'll give you the normal in, ehr... screen space, I think?

Otherwise you can sample the heightmap multiple times, offsetting it by a pixel each time and working out the difference from those values. That should give you the normal in whatever space the base normal is in.
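As a rough sketch of that idea (using the same property names as the full shader further down; uv is whatever coordinate you sample the heightmap with, and the signs depend on your handedness):

float2 texel = float2(1.0 / _HeightmapDimX, 1.0 / _HeightmapDimY); // one-pixel offset in UV space
float hL = tex2D(_HeightMap, uv - float2(texel.x, 0)).r;
float hR = tex2D(_HeightMap, uv + float2(texel.x, 0)).r;
float hD = tex2D(_HeightMap, uv - float2(0, texel.y)).r;
float hU = tex2D(_HeightMap, uv + float2(0, texel.y)).r;
// central differences give the slope along each axis; z is "up" in tangent space
float3 n = normalize(float3((hL - hR) * _HeightmapStrength, (hD - hU) * _HeightmapStrength, 1.0));

The snippet below from gamedev.net does the same sampling but applies the offsets along a basis built around an existing normal instead of assuming tangent space.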

Nicked from here: http://www.gamedev.net/topic/594781-...mal-algorithm/

// sample the height at this texel and at its four one-texel neighbours
float me = tex2D(heightMapSampler,IN.tex).x;
float n = tex2D(heightMapSampler,float2(IN.tex.x,IN.tex.y+1.0/heightMapSizeY)).x;
float s = tex2D(heightMapSampler,float2(IN.tex.x,IN.tex.y-1.0/heightMapSizeY)).x;
float e = tex2D(heightMapSampler,float2(IN.tex.x+1.0/heightMapSizeX,IN.tex.y)).x;
float w = tex2D(heightMapSampler,float2(IN.tex.x-1.0/heightMapSizeX,IN.tex.y)).x;

//find perpendicular vector to norm:
float3 temp = norm; //a temporary vector that is not parallel to norm
if(norm.x==1)
    temp.y+=0.5;
else
    temp.x+=0.5;

//form a basis with norm being one of the axes:
float3 perp1 = normalize(cross(norm,temp));
float3 perp2 = normalize(cross(norm,perp1));

//use the basis to move the normal in its own space by the offset
float3 normalOffset = -bumpHeightScale*(((n-me)-(s-me))*perp1 + ((e-me)-(w-me))*perp2);
norm += normalOffset;
norm = normalize(norm);

Gave it a try out of curiosity.

Here's the multi-sample method in a stripped-down surface shader.

I had to change the handedness from the code above to match Unity (flipping the sign of the offset when sampling the e and w values, + to - and vice versa).
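For the e sample, that flip looks like this (w gets the opposite treatment):

// original (gamedev.net) version:
float e = tex2D(heightMapSampler,float2(IN.tex.x+1.0/heightMapSizeX,IN.tex.y)).x;
// flipped to match Unity:
float e = tex2D(_HeightMap,float2(IN.uv_MainTex.x-1.0/_HeightmapDimX,IN.uv_MainTex.y)).x;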

Shader "Debug/Normal Map From Height" {
    Properties {
        _Color ("Main Color", Color) = (1,1,1,1)
        _MainTex ("Diffuse (RGB) Alpha (A)", 2D) = "white" {}
        _BumpMap ("Normal (Normal)", 2D) = "bump" {}
        _HeightMap ("Heightmap (R)", 2D) = "grey" {}
        _HeightmapStrength ("Heightmap Strength", Float) = 1.0
        _HeightmapDimX ("Heightmap Width", Float) = 2048
        _HeightmapDimY ("Heightmap Height", Float) = 2048
    }

    SubShader{
        Tags { "RenderType" = "Opaque" }

        CGPROGRAM

            #pragma surface surf NormalsHeight
            #pragma target 3.0

            struct Input
            {
                float2 uv_MainTex;
            };

            sampler2D _MainTex, _BumpMap, _HeightMap;
            float _HeightmapStrength, _HeightmapDimX, _HeightmapDimY;

            void surf (Input IN, inout SurfaceOutput o)
            {
                o.Albedo = fixed3(0.5, 0.5, 0.5);

                float3 normal = UnpackNormal(tex2D(_BumpMap, IN.uv_MainTex));

                // sample the height at this texel and at its four one-texel neighbours
                // (the e/w offsets are flipped relative to the original post, see above)
                float me = tex2D(_HeightMap,IN.uv_MainTex).x;
                float n = tex2D(_HeightMap,float2(IN.uv_MainTex.x,IN.uv_MainTex.y+1.0/_HeightmapDimY)).x;
                float s = tex2D(_HeightMap,float2(IN.uv_MainTex.x,IN.uv_MainTex.y-1.0/_HeightmapDimY)).x;
                float e = tex2D(_HeightMap,float2(IN.uv_MainTex.x-1.0/_HeightmapDimX,IN.uv_MainTex.y)).x;
                float w = tex2D(_HeightMap,float2(IN.uv_MainTex.x+1.0/_HeightmapDimX,IN.uv_MainTex.y)).x;

                float3 norm = normal;
                float3 temp = norm; //a temporary vector that is not parallel to norm
                if(norm.x==1)
                    temp.y+=0.5;
                else
                    temp.x+=0.5;

                //form a basis with norm being one of the axes:
                float3 perp1 = normalize(cross(norm,temp));
                float3 perp2 = normalize(cross(norm,perp1));

                //use the basis to move the normal in its own space by the offset
                float3 normalOffset = -_HeightmapStrength * ( ( (n-me) - (s-me) ) * perp1 + ( ( e - me ) - ( w - me ) ) * perp2 );
                norm += normalOffset;
                norm = normalize(norm);

                o.Normal = norm;
            }

            inline fixed4 LightingNormalsHeight (SurfaceOutput s, fixed3 lightDir, fixed3 viewDir, fixed atten)
            {
                viewDir = normalize(viewDir);
                lightDir = normalize(lightDir);
                s.Normal = normalize(s.Normal);
                float NdotL = dot(s.Normal, lightDir);

                fixed4 c;
                c.rgb = float3(0.5, 0.5, 0.5) * saturate(NdotL) * _LightColor0.rgb * atten;
                c.a = 1.0;
                return c;
            }

        ENDCG
    }
    FallBack "VertexLit"
}

Derivative method works, but it's fucking ugly 'cause it's in screen space - really noisy.

ddx_fine might give better results in DX11, but the plain ddx/ddy version looks like crap in DX9.
