MaterialData fetch/sample texture [SOLVED]

On 29/08/2014 at 03:24, xxxxxxxx wrote:

User Information:
Cinema 4D Version: 15
Platform: Windows
Language(s): C++

Hi plugincafe-members,

I have a MaterialData plugin with a description including a group "color". In this group I have a texture. The description looks like this:


Here is a screenshot from the material:

I want to emulate texture mapping, in the same way as the default material of C4D does. Do I have to load the texture first? Is it already there? Do I have to use the vd->uvw coordinates?

Here is the code I have so far to return the diffuse color (simply the Vector)

void MyMaterial::CalcSurface(BaseMaterial* mat, VolumeData* vd)
{
	GeData data;
	// (the GetParameter() call that fills `data` is omitted here)
	if (data.GetBool())
	{
		Vector diffuseColor = data.GetVector();
		vd->col = diffuseColor;
	}
	else
	{
		// here I would like to use UV-coordinates to fetch my texture
		vd->col = Vector(0.0f);
	}
}


Thanks a lot!

On 29/08/2014 at 14:02, xxxxxxxx wrote:


A TEXTURE is just an ordinary string (with a fancy preview image), so you have to load the actual bitmap yourself to use it. The best place to load the bitmap is the InitRender function:

    virtual INITRENDERRESULT InitRender(BaseMaterial* mat, const InitRenderStruct& irs)  
    {  
      // get filename  
      GeData data;  
      mat->GetParameter(DescLevel(MYMATERIAL_DIFFUSE_TEXTURE), data, DESCFLAGS_GET_0);  
      String name = data.GetString();  
      if (name.Content())  
      {  
          // get full path  
          Filename texture;  
          GenerateTexturePath(irs.doc->GetDocumentPath() + irs.doc->GetDocumentName(), Filename(name), Filename(), &texture);  
          // load texture into BaseBitmap  
          this->_texture = BaseBitmap::Alloc();  
          IMAGERESULT res = this->_texture->Init(texture);  
          if (res != IMAGERESULT_OK)  
          {  
              // loading failed - free the bitmap so later code can fall back to the plain color  
              BaseBitmap::Free(this->_texture);  
          }  
      }  
      return INITRENDERRESULT_OK;  
    }  

You can use the GenerateTexturePath function to generate the full file path.

Later, you can use this BaseBitmap to calculate your material:

            if (this->_texture != nullptr)  
            {  
              // get texture coordinates from uv-data  
              Int32 x = (Int32)((Float)this->_texture->GetBw() * vd->uvw.x);  
              Int32 y = (Int32)((Float)this->_texture->GetBh() * vd->uvw.y);  
              UInt16 r, g, b;  
              // "sample" texture  
              this->_texture->GetPixel(x, y, &r, &g, &b);  
              // turn into floats  
              Vector color(1.0);  
              color.x = (Float)r / 255.0;  
              color.y = (Float)g / 255.0;  
              color.z = (Float)b / 255.0;  
              // set result  
              vd->col = color;  
            }  

Don't forget to free the bitmap later.
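One detail worth noting with the index computation above: when uvw.x or uvw.y reaches exactly 1.0 (or leaves [0,1] with tiling), the computed index lands one texel outside the bitmap, and GetPixel() then leaves r, g and b untouched. A small, SDK-free sketch of a mapping that wraps tiled UVs and clamps the resulting index (the function names are illustrative, not SDK calls):

```cpp
#include <cmath>

// Wrap a (possibly tiled) UV coordinate into [0, 1).
inline double WrapUV(double t)
{
    return t - std::floor(t); // floor() also handles negative coordinates
}

// Map a UV coordinate to a pixel index in [0, size - 1].
inline int UVToPixel(double t, int size)
{
    int index = static_cast<int>(WrapUV(t) * size);
    if (index > size - 1) // guard against rounding up to `size`
        index = size - 1;
    return index;
}
```

With something like this, `x = UVToPixel(vd->uvw.x, bitmapWidth)` stays inside the bitmap for any UV value.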

Best wishes,

On 30/08/2014 at 01:28, xxxxxxxx wrote:

Thanks Sebastian!

It works when I render an image with the preview or another "render" call, but in the editor the GetPixel(...) of the bitmap does not change the r, g, b values. So when I initialize r, g, b to zero, my image is black, because they never get valid values.

Here is what I get:

It is also interesting that the InitRender() method is called when any parameter in the material is changed. So if, for example, I change the diffuseColor of my material from white to red, InitRender() is called and my bitmap texture is loaded again. This is not a huge problem, but a bit strange because it wastes disk-access time for nothing.

When I want the CalcSurface method to work in the editor too, do I have to create an OpenGL material? I'm pretty sure this cannot be required, because one can use C4D without OpenGL.

Or maybe I have to return correct flags in the GetRenderInfo() call? Right now I'm returning VOLUMEINFO_EVALUATEPROJECTION so that the uv-mapping works.

Is there something like "evaluate in editor"?

EDIT: I found a few topics in the forum, but I don't understand the answers, like this one from 2004 -.-

Thanks again!

On 30/08/2014 at 03:57, xxxxxxxx wrote:

It is also interesting that the InitRender() method is called when any parameter in the material is changed. So if, for example, I change the diffuseColor of my material from white to red, InitRender() is called and my bitmap texture is loaded again. This is not a huge problem, but a bit strange because it wastes disk-access time for nothing.

If you change a parameter on the material, the material preview will update. And updating the material preview requires a render.

I never had the need of implementing a MaterialData plugin (yet), so I am NOT speaking from
experience here. I think it would improve speed if you load the bitmap once instead of with
each InitRender() call. Of course you should reload it if the Filename changed or the file was
modified since the last time you loaded it.
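Niklas' suggestion above amounts to caching by filename and modification time. A minimal, SDK-free sketch of that bookkeeping (the struct and member names are illustrative; in a real plugin they would live next to the BaseBitmap pointer, and `mtime` stands for the file's last-modification timestamp however you obtain it):

```cpp
#include <cstdint>
#include <string>

// Tracks which texture file is currently loaded so InitRender() can
// skip redundant disk access on repeated calls.
struct TextureCache
{
    std::string loadedPath;
    std::int64_t loadedMtime = -1;

    // Returns true when the bitmap should be (re)loaded.
    bool NeedsReload(const std::string& path, std::int64_t mtime)
    {
        if (path == loadedPath && mtime == loadedMtime)
            return false; // same file, unchanged since the last load
        loadedPath = path;
        loadedMtime = mtime;
        return true;
    }
};
```

InitRender() would then only call BaseBitmap::Init() when NeedsReload() returns true.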

Maybe you want to look at the C++ SDK Simple Material example. It at least displays the
specified color in the viewport, maybe that can work for textures as well.


On 30/08/2014 at 04:14, xxxxxxxx wrote:

Hi Niklas,

thanks for your comment on the loading of the bitmaps. I might investigate such optimizations if needed in the future.

The Simple Material example behaves the same as my material does. You can change the color, which is then updated and correctly displayed in the editor. When rendered, though, the material uses some sort of noise/generative texture. So I guess that the shaders are not evaluated in the editor, just as the texture fetching/sampling is ignored.

My feeling about C4D tells me that there has to be some sort of flag that I have to set. I have read somewhere that one should call the MaterialData::Update method, but it actually has nothing to do with my problem, since the CalcSurface method is called but the GetPixel() method of the texture simply returns nothing. In my opinion it is quite a bad design that this function has no return value (enum or Bool) and does not communicate an erroneous texture or state or whatever.

It would be great if someone could "reimplement" the most important channels of the C4D default material and put it into the sdk samples (maybe the new github repo). I think this would solve a huge amount of questions in the forum. Is there a MAXON developer listening to me right now? :-)

On 30/08/2014 at 08:08, xxxxxxxx wrote:

IMHO. It seems to me that you're using the wrong kind of plugin.

MaterialData plugins are for making volume shaders, which is different from displaying textures in UV space.
Textures and their UV data usually use the ChannelData class instead of the VolumeData class. The VolumeData class is for making 3D textures.
The ChannelData class will display the 2D textures on the object in the scene editor window even if the user does not have OpenGL enabled. VolumeData will only display them at render time (or with the IRR), or if the user has OpenGL enabled.

If you just want to use bitmaps and textures. It's probably better to write a ShaderData plugin.
That plugin structure has both the ChannelData & VolumeData stuff built into it and you can use them both at the same time.


On 30/08/2014 at 17:32, xxxxxxxx wrote:

Hi ScottA,

so you mean that a ShaderData plugin, like the MandelBrot or Gradient shaders from the SDK examples, should be used. I'm pretty sure that the MaterialData is what I want because the user should use it like a material. The channels/description shown above is only a small part of the material. The complete one has multiple channels, exactly like C4D materials.

So I know that it is possible to implement simple texture mapping in a material, because.... you know... I can see it in C4D;-)

And I strongly believe that someone will know how to fetch textures for the editor as well :-)

If, for some strange and stupid reason, I have to use a ShaderData plugin inside my MaterialData plugin, to see the result in the editor, I will implement it. But in this case I would appreciate some hints on how to create/link/use such a combination from code. But as I said, I don't think that this is necessary.

On 30/08/2014 at 18:30, xxxxxxxx wrote:

A ShaderData plugin uses the existing default MaterialData layout. And you simply add new things to it as new shader options. But your new shaders can have custom gizmos in them (I think).
If you really wanted to start from absolute zero and make your own custom material from scratch. Then I guess the MaterialData is the one to use... Maybe.

The problem is that you don't get overridable methods containing ChannelData in a MaterialData plugin. It's supposedly for people wanting to create 3D Volume shaders. So you'll have to find some way to manually use the ChannelData class in your plugin. Which I don't even know if it's possible or not.
Plus, I don't even know if Maxon intended for people to use the MaterialData plugin as a blank sheet for making new materials from scratch.
The SDK says:  "A data class for creating material (volume shader) plugins"
If you read that literally then it's only for volume shaders. And doesn't mean it's an empty material canvas that's "blank" for you to add your gizmos.
On the other hand, it doesn't do a very good job of clarifying that either.
The docs are horribly vague about things like this. Maxon support would need to clarify this.

The problem with using the VolumeData class to generate 2D textures is that your 2D textures will use resources and behave like a 3D volume. One consequence is that they don't draw in the scene editor window.
Sure it will work. But what you're doing is like putting a V-8 engine on a skateboard. 😉


On 31/08/2014 at 01:34, xxxxxxxx wrote:

Ok, I see :-)

For anyone here that knows how to tell my texture and GetPixel() to return valid data in the editor I'm super thankful!

But, if it is not possible I would like to clear a view things, please correct me anywhere:

* A ShaderData plugin is some sort of 2D effect. It's like rendering a fullscreen quad in OpenGL and applying a GLSL fragment shader that uses the u and v coordinates to produce an output "image". Since a texture, in some sense, already is a 2D image, C4D treats them in the same manner. That's the reason why we can select "Texture", "Gradient" or any of the ShaderData plugins from the SDK samples in the default material. So this behavior would be implemented as a SHADERLINK?

* Now when I use a default material, which is more like a "volumetric-I-do-it-all-for-my-underlying-object-material" and I simply activate the Color channel and put a texture into the... you see... "Texture"... then I get my simple UV texture mapping. That is exactly what I want. But C4D is so confusing by mixing up names like "texture", "shader" and "material".

* So is it true, that ShaderData plugins are evaluated in the editor independently of the OpenGL/Software settings of the user? And is it true that textures are not fetch-/sample-able in the editor?

* And what do I have to do, to get the attention from MAXON support? ;-)

Thank you all for your time!

On 31/08/2014 at 11:04, xxxxxxxx wrote:


the Cinema 4D viewport uses OpenGL; the CalcSurface function is written in C++, so there is no way of calling that function while drawing the viewport. What we can do is use a bitmap texture to define what the material looks like in the viewport. Then the OpenGL viewport can use that texture on the assigned objects.

When you want to create such texture you have to set the PLUGINFLAG_MATERIAL_GLIMAGE flag while registering your MaterialData-plugin. Then you can use the InitGLImage function to define the preview texture.

    virtual Bool InitGLImage(BaseMaterial* mat, BaseDocument* doc, BaseThread* th, BaseBitmap* bmp, Int32 doccolorspace, Bool linearworkflow)  
    {  
      GeData data;  
      mat->GetParameter(DescLevel(MYMATERIAL_DIFFUSE_TEXTURE), data, DESCFLAGS_GET_0);  
      String name = data.GetString();  
      Bool textureUsed = false;  
      // if a texture is defined, bake it into the preview bitmap  
      if (name.Content())  
      {  
          // get full path  
          Filename texture;  
          GenerateTexturePath(doc->GetDocumentPath() + doc->GetDocumentName(), Filename(name), Filename(), &texture);  
          // load texture into BaseBitmap  
          BaseBitmap* loadedBitmap = BaseBitmap::Alloc();  
          IMAGERESULT res = loadedBitmap->Init(texture);  
          if (res == IMAGERESULT_OK)  
          {  
              BaseBitmap* tempBitmap = BaseBitmap::Alloc();  
              tempBitmap->Init(bmp->GetBw(), bmp->GetBh(), bmp->GetBt());  
              // scale and copy loaded bitmap into temp bitmap  
              loadedBitmap->ScaleIt(tempBitmap, 256, true, true);  
              const Int32 width = bmp->GetBw();  
              const Int32 height = bmp->GetBh();  
              UInt16 r = 0, g = 0, b = 0;  
              for (Int32 y = 0; y < height; y++)  
              {  
                  for (Int32 x = 0; x < width; x++)  
                  {  
                      tempBitmap->GetPixel(x, y, &r, &g, &b);  
                      bmp->SetPixel(x, y, r, g, b);  
                  }  
              }  
              BaseBitmap::Free(tempBitmap);  
              textureUsed = true;  
          }  
          BaseBitmap::Free(loadedBitmap);  
      }  
      if (!textureUsed)  
      {  
          // otherwise use the diffuse color (MYMATERIAL_DIFFUSE_COLOR stands for your color parameter's ID)  
          mat->GetParameter(DescLevel(MYMATERIAL_DIFFUSE_COLOR), data, DESCFLAGS_GET_0);  
          const Vector diffuseColor = data.GetVector();  
          const Int32 r = (Int32)(diffuseColor.x * 255.0);  
          const Int32 g = (Int32)(diffuseColor.y * 255.0);  
          const Int32 b = (Int32)(diffuseColor.z * 255.0);  
          const Int32 width = bmp->GetBw();  
          const Int32 height = bmp->GetBh();  
          for (Int32 y = 0; y < height; y++)  
          {  
              for (Int32 x = 0; x < width; x++)  
                  bmp->SetPixel(x, y, r, g, b);  
          }  
      }  
      return true;  
    }  
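One caveat about the float-to-8-bit conversion in the fallback branch: it assumes the color components stay within [0, 1], so out-of-range values (e.g. from HDR colors) would overflow the channel. A small, SDK-free helper that rounds and clamps (the name is illustrative):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Convert one float color channel to an 8-bit value, rounding to the
// nearest integer and clamping out-of-range input to [0, 255].
inline std::uint8_t ChannelToByte(double channel)
{
    const double scaled = std::round(channel * 255.0);
    return static_cast<std::uint8_t>(std::min(255.0, std::max(0.0, scaled)));
}
```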

In some cases Cinema 4D does use special OpenGL-shaders to re-create the look of (C++) shaders in the viewport. This is what happens when you turn on "Noises" when "Advanced OpenGL" is activated. This only works for some standard shaders like noise. When you put a noise shader into a layer shader then Cinema will again bake the texture.

I'm not quite sure what Cinema does when you don't use OpenGL but the software viewport. Maybe it will create some preview texture itself when no texture is provided by the material.

So when to use MaterialData and ShaderData? Well, a MaterialData plugin is something that appears in the Material Manager. This can be some complex material like Cheen, or just some scene entry that can be assigned to objects and is not really a "shader" or "material" (like the architectural grass, hair material or sketch material).

A ShaderData is a (fragment) shader that can be used within a material (or anywhere else where you find a SHADERLINK). Its Output function returns the color for the given fragment (defined by the ChannelData argument). So as an example, if you want to do something that modifies the transparency of an object, you can create a MaterialData plugin and use its CalcTransparency function. Or you write a ShaderData that can be used by a material (that deals with transparency). ShaderData plugins can be used in different channels of the standard material (or at completely different places), MaterialData plugins can just be used as materials.

When you add an "Image" to the "Texture" parameter in the standard material's "Color" channel you really define a BaseShader of the type Xbitmap. That shader is called by the standard material when it is calculating the color. The actual bitmap sampling happens within that shader.

best wishes,

On 31/08/2014 at 12:15, xxxxxxxx wrote:

How does using the InitGLImage() to load the image eliminate using a VolumeData class to display it?

When I use the InitGLImage() to load the image. All that does is change how I'm loading it. And it also does not execute until the material is dropped on an object.
But I still have to use the VolumeData class in the CalcSurface() method to display the 2D image. Which is a horrible waste of resources.

In ShaderData plugins it's very clear.
We use ChannelData for displaying 2D images efficiently. And VolumeData for displaying 3D images, which are much more resource-hungry.
There doesn't seem to be that option with MaterialData plugins. And using the InitGLImage() method doesn't seem to address that.

That's where the docs are not clear.
Can we, or can't we, display 2D images efficiently in MD plugins without using the VolumeData class?
If we can't, then doesn't that make using 2D images in a MD a bad practice?


On 01/09/2014 at 14:24, xxxxxxxx wrote:


Maybe there are some misunderstandings. The InitGLImage function is not about loading resources. It's about creating a texture bitmap that is used by the viewport to give you a preview of the material that is assigned to a certain object. If you have to load some external resources to create that preview bitmap, you may do that.

I'm not quite sure what you mean by "display 2D images". When the raytracer wants to know the color of a certain fragment it executes the corresponding material's CalcSurface function. This function will do whatever is needed to calculate the final color. This may or may not involve sampling some BaseShaders. Both the CalcSurface function and the BaseShaders may or may not access data loaded from bitmap files.

You can add as many SHADERLINK parameters as you want to a material. And each shader can be an Xbitmap shader. Cinema can't know which texture to display; you must define what to do with this data.

best wishes,

On 01/09/2014 at 15:17, xxxxxxxx wrote:

Oh. It's just for the little Preview window?
So it has no bearing on how the material is displayed on object in the editor window?
No wonder why I was confused.

My understanding is that the MaterialData plugin is for generating 3D volume types of textures for our meshes. And 2D images should not be used in it at all, because there's nothing in it that handles displaying 2D images efficiently in the editor window.
It strictly uses math to create the colors. Which requires the VolumeData class.
Sure, we can sample colors of images in it. And apply them to our meshes using the VolumeData class. But that's crazy because we're treating 2D data as if it was 3D data.
AFAIK. There's no 2D class available in the MaterialData plugin to display the 2D sampled image colors on our meshes. Or is there?
As of now, my thinking is that nothing 2D related at all should be used in a MaterialData plugin. Other than maybe sampling images for the Preview window. But that's it.

So what if we wanted to create our own material dialog with only a color channel in it that puts a 2D texture on our meshes. Without all of the other channels (Diffusion, Luminance, Transparency, etc.). And does not use the VolumeData class to do it?
Do we use the MaterialData plugin for that (I'm thinking no) ?
Or do we need to use a ShaderData plugin and comment out "Xbase" in the .res file to get rid of those other channels. And then somehow create the color channel ourselves in the plugin's .res file?
Or is this not even possible?

I suppose I should just try it and see what happens. 🙂


On 02/09/2014 at 08:20, xxxxxxxx wrote:

So I have implemented the InitGLImage method. I used the code Sebastian posted and I multiply the texture color by the diffuse color. Fine for now... BUT... all updates happen one step "behind": when I choose a texture and hit OK, nothing happens. I have to change the color or hit "render" to update the editor view. When OpenGL is enabled, the "rendering" does not update the editor viewport either; I have to change a parameter to update the viewport. I have an image here that should explain my problem. Keep in mind that I return full red (1,0,0) in the CalcSurface method, because I'm not interested in the raytracer for now.

It seems that when a description parameter is changed, the InitGLImage method receives the old values. So the material description is not updated at this point. But why? And how can I tell C4D to use the updated values? It is so strange and I'm starting to go nuts!

On 06/09/2014 at 07:27, xxxxxxxx wrote:

I added a call to EventAdd() in InitGLImage(), so my editor viewport updates correctly now when OPENGL IS DISABLED. This is thread-safe because the event queue is thread-safe, right? I ask this because InitGLImage is not called from the main thread!

When OPENGL IS ENABLED nothing changed at all. I even tried to use EventAdd(EVENT_FORCEREDRAW), but still no update in the editor viewport. I can set a breakpoint in the InitGLImage method and the viewport in C4D is updated BEFORE the call to InitGLImage, which makes no sense at all! At least I have to be able to initiate an additional redraw/update of the viewport somehow?!

The DrawViews() method doesn't work either. I tried the flags DRAWFLAGS_NO_THREAD and DRAWFLAGS_FORCEFULLREDRAW. Since this function should only be called from the MainThread, I hacked a bool into my plugin and tried to listen to it in the Message method. I did not get the viewport to update with this approach.

I found this thread from 2006 where someone tried to do exactly the same as myself. He has problems with the alpha channel, but the updates seem to work (at least I guess so). This post tells us to use Set/GetParameter, but why do all samples and a lot of code in the forum (dated after the recommendation!) still use the GetDataInstance access functions?!

Bool MyMaterial::Init(GeListNode* node)

instead of:

BaseContainer* data = ((BaseMaterial*)node)->GetDataInstance();


It seems to make no difference though.

I also experimented with this:
BaseMaterial* mat = static_cast<BaseMaterial*>(Get());
mat->Update(TRUE, TRUE);
Found here:

In all cases I have to change any parameter again/twice to see the changes in the editor viewport. Since it works when OpenGL is disabled, I guess that there has to be some sort of refresh/update/redraw call that I'm not aware of.

What do I have to do to get developer support? Where are the guys like "Matthias Bober" or "Mikael Sterner" which are MAXON developers, right?

On 11/09/2014 at 00:52, xxxxxxxx wrote:

Hey forum members :-)

Up there in my post are about 10 questions that are unanswered or only partially answered. Is there anyone out there who can at least give me some hints? All my images should pretty much explain exactly what I'm trying to do.

On 08/11/2014 at 02:18, xxxxxxxx wrote:

Hey there,

The solution to the update bug (when OpenGL is enabled) is to call SetDirty() of the BaseBitmap when the pixels are set.

Bool MyMaterial::InitGLImage(BaseMaterial* mat, BaseDocument* doc, BaseThread* th, BaseBitmap* bmp, Int32 doccolorspace, Bool linearworkflow)
{
	Vector diffuseColor = /* your code to compute the color of the material, e.g. fetch textures or read the color from the description */;
	C4dHelper::setBitmapColor(bmp, diffuseColor);
	bmp->SetDirty(); // this is required so that the editor is updated when OpenGL is used
	EventAdd();      // updates the editor view when using software rendering
	return true;
}

The solution was posted in this related thread:

Note that there are still open questions in the thread, but they are of a more general nature and might be answered by an upcoming example that will be released soon by the support team. To find out more about the sample, read the thread in the link above.

Good luck!