THE POST BELOW IS MORE THAN 5 YEARS OLD. RELATED SUPPORT INFORMATION MIGHT BE OUTDATED OR DEPRECATED
On 23/01/2003 at 18:59, xxxxxxxx wrote:
User Information:
Cinema 4D Version: 8.012
Platform: Windows; Mac; Mac OSX;
Language(s): C++;
---------
Hi all,
This is a complicated one to explain; here goes...
I'm trying to write a C++ channel shader based on the SDK 'Bitmap Distortion' shader. As with that shader, another shader is used as an input channel.
If Bhodinut 3D Noise is used as that input channel, I can't get valid data out, and the same problem exists with the unmodified SDK shader as well.
Example:
1. Create a new material, and apply it to a primitive object.
2. Add an alpha channel to the material, and select SDK Bitmap Distortion as the channel shader. Leave the parameters at their defaults (no distortion).
3. Click Edit for the channel shader, and select Bhodinut 3D Noise as the input shader.
The preview looks fine, but the rendered result is just a uniform grey value.
Now, the odd thing is that if a Fusion shader (with 0% blend) is placed in front of the Bitmap Distortion and the 3D Noise, the render works properly, even though this shouldn't have any effect (i.e. Material -> Fusion -> Bitmap Distortion -> 3D Noise).
Unfortunately, this workaround causes my second problem.
It seems that if I animate parameters of shaders nested beneath an SLA Fusion shader, the animation doesn't appear in the final render. (In some cases you can preview any individual frame and everything looks fine, but rendering out a sequence results in the shader animations not appearing.)
So, how can I get my shader to grab valid data from a 3D noise channel shader without having to hide the whole thing under a Fusion shader? In other words, what code changes does the 'SDK Bitmap Distortion' example need so that it works as it should?
Any help MUCH appreciated, as I've already wasted far too long trying to get around these problems, and I need this for a current project.
Cheers - Steve