# SOLVED: Getting poly normal and CreatePhongNormals()

Hi,

a bilinear interpolation is quite straightforward. If you have the quadrilateral Q with the points,

```
c---d
|   |
a---b
```

then the bilinear interpolation is just,

```
ab = lerp(a, b, t0)
cd = lerp(c, d, t0)
res = lerp(ab, cd, t1)
```

where `t0, t1` are the interpolation offsets, i.e. the texture coordinates in your case (the ordering/orientation of the quad is obviously not set in stone). I am not quite sure what you are doing when rendering normals, but when you render a colour gradient, in value noise for example, you actually want to avoid linear interpolation, because it will give you those ugly star patterns. So you might need something like a bi-quadratic, bi-cubic or bi-cosine interpolation, i.e. pre-interpolate your interpolation offsets.
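A minimal runnable sketch of the above, in plain Python; the names `lerp`, `smoothstep` and `bilerp` are mine and not part of any SDK, and `smoothstep` is just one possible way to pre-interpolate the offsets:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b at offset t in [0, 1]."""
    return a + (b - a) * t

def smoothstep(t):
    """Pre-interpolates the offset to soften the star-pattern
    artifacts of plain linear interpolation (a cubic falloff)."""
    return t * t * (3.0 - 2.0 * t)

def bilerp(a, b, c, d, t0, t1, smooth=False):
    """Bilinear interpolation over the quad
        c---d
        |   |
        a---b
    with the texture coordinates (t0, t1)."""
    if smooth:
        t0, t1 = smoothstep(t0), smoothstep(t1)
    ab = lerp(a, b, t0)   # interpolate along the bottom edge
    cd = lerp(c, d, t0)   # interpolate along the top edge
    return lerp(ab, cd, t1)

print(bilerp(0.0, 1.0, 1.0, 2.0, 0.5, 0.5))  # 1.0
```

The same scheme works per component for vectors or colours; only `lerp` has to operate on the respective type.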

If I am not overlooking something, this should also work for triangles when you treat them as quasi-quadrilaterals like Cinema does in its polygon type.

Cheers,
zipit

Hi,

Möller-Trumbore

what I forgot to mention, but already hinted at in my previous posting, is that you did not state in which space your interpolation coordinates are formulated. When you talk about uv coordinates, I (and probably everyone else) assume that you have cartesian coordinates, just like the ones Cinema's texture coordinate system is formulated in.

If you have another coordinate system, some kind of linear/affine combination as your code snippet suggests, then the interpolated normal is just the linear combination of the neighbouring normals (I think that is what you were trying to do in that code).
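As a sketch of what such an affine combination looks like for a triangle, assuming barycentric weights `(1 - u - v, u, v)` as a Möller-Trumbore style test would return them (the helper names are illustrative, not SDK functions):

```python
import math

def normalized(v):
    """Returns the unit-length version of the vector v (a 3-tuple)."""
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def interpolate_normal(n0, n1, n2, u, v):
    """Affine combination of the three vertex normals with the
    barycentric weights (1 - u - v, u, v), renormalised afterwards,
    since the linear blend of unit vectors is generally not unit length."""
    w = 1.0 - u - v
    n = tuple(w * a + u * b + v * c for a, b, c in zip(n0, n1, n2))
    return normalized(n)
```

If all three vertex normals agree, the result is that shared normal; otherwise you get the smoothly varying "Phong" normal across the face.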

If this fails, you should either check your Möller-Trumbore code for errors (Scratchapixel has a nice article on it and on how to calculate correct affine space/barycentric coordinates for it), or, more pragmatically, use Cinema's `GeRayCollider` instead, which will also give you texture coordinates.
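For cross-checking your own implementation, here is a minimal Möller-Trumbore sketch in plain Python; the function names and the `eps` tolerance are my own choices, not from any SDK:

```python
def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def intersect_triangle(orig, direction, v0, v1, v2, eps=1e-9):
    """Möller-Trumbore ray/triangle test. Returns (t, u, v) for a hit or
    None, where (u, v) are the barycentric coordinates of the hit point:
    p = (1 - u - v) * v0 + u * v1 + v * v2."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:                 # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:             # hit outside the triangle
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:         # hit outside the triangle
        return None
    t = dot(e2, q) * inv_det
    if t < eps:                        # hit behind the ray origin
        return None
    return (t, u, v)
```

The returned `(u, v)` pair is exactly what you would feed into the normal interpolation discussed above.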

I am not so well versed in the C++ SDK, maybe there is even a more low level (i.e. triangle level) version of `GeRayCollider` there, i.e. something where you do not have the overhead of casting against a whole mesh.

Cheers,
zipit

Oooh..... I feel like such a dope. I had almost everything right to begin with, except I was accessing the returned `CreatePhongNormals()` `SVector` incorrectly. I was using `operator[]` when I should have been using `ToRV()`. Here's a smoothed result:

Thanks guys, we got there

One last thing: how do I 'globalise' the normals? They look local to me (see the floating cube: it's slightly rotated but has the same shading as the walls in the background).

WP.

Hi,

all polygonal data is in object space. So if you want your normals to be in global/world space, you have to multiply them with the frame of the object they are attached to. To get the frame, you only have to zero out the offset of the object's global matrix and then normalise the axis components.
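A sketch of that with plain tuples instead of the SDK's matrix/vector types (in Cinema 4D's Python API the global matrix would come from something like `op.GetMg()`; the helper names here are mine):

```python
import math

def normalized(v):
    """Returns the unit-length version of the vector v (a 3-tuple)."""
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def frame_from_global(v1, v2, v3):
    """Takes the three axis vectors of a global matrix (the offset is
    simply ignored, i.e. 'zeroed out') and returns them normalised,
    which leaves only the rotational part of the transform."""
    return (normalized(v1), normalized(v2), normalized(v3))

def transform_normal(frame, n):
    """Multiplies the object-space normal with the frame: the world
    normal is n.x * v1 + n.y * v2 + n.z * v3, evaluated per component."""
    v1, v2, v3 = frame
    return tuple(n[0] * v1[i] + n[1] * v2[i] + n[2] * v3[i]
                 for i in range(3))
```

Note that this only holds as long as the object is not scaled non-uniformly; in that case normals would need the inverse-transpose instead.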

Cheers,
zipit

Thanks zipit,

makes sense being local. Zeroing out and multiplying by the frame matrix does it:

Thanks for your time contributors, been a big help.

WP.

hi

can we consider this thread resolved?

Cheers,
Manuel

Thanks Manuel,

I didn't realise I had to mark the thread as solved. Done.

WP.

@WickedP Uhh, ohh, what are you coding there? Is that a BRDF map shader? I am searching for something like this!

https://alastaira.wordpress.com/2013/11/26/lighting-models-and-brdf-maps/

Hi @mogh,