On 04/03/2017 at 05:22, xxxxxxxx wrote:
I have written an HDRI browser so I can load HDRIs faster. I used a GeDialog and that is working fine. My real problem is that when I switch HDRIs, the RAM just keeps filling up. I only have a Sky object and a material whose luminance channel I point at my HDRI path.
So I was wondering if there is a method or something to clear the RAM of unnecessary textures.
On 06/03/2017 at 03:09, xxxxxxxx wrote:
This is of course a bit hard to answer without knowing the details of your implementation.
In general there should be no need to free RAM for unnecessary textures manually. After all, you only set a path on a bitmap shader and pass ownership of this shader to C4D; from then on C4D takes care of the memory handling.
I'm thinking more of references to image data (BaseBitmap) that you hold in your own code, for example in a list or an array. In that case Python's garbage collector would not be able to free the data. Could something like this be the cause?
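To illustrate the pitfall described above, here is a minimal pure-Python sketch. FakeBitmap is just a stand-in for heavy image data such as a c4d.bitmaps.BaseBitmap; the point is that any lingering list (or dict, or attribute) reference keeps the data alive:

```python
import gc
import weakref

class FakeBitmap:
    """Stand-in for heavy image data (e.g. a BaseBitmap); hypothetical."""
    def __init__(self):
        self.pixels = bytearray(1024)

cache = []                 # a lingering container reference, as suspected above

bmp = FakeBitmap()
cache.append(bmp)          # the list now keeps the bitmap alive
ref = weakref.ref(bmp)     # weak reference lets us observe collection

del bmp
gc.collect()
print(ref() is not None)   # True: the cache still holds the data

cache.clear()              # dropping the last strong reference...
gc.collect()
print(ref() is None)       # True: ...lets Python free it
```

So if the dialog keeps old thumbnails or HDR previews in such a container, clearing (or not populating) it is what releases the memory.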
On 06/03/2017 at 04:00, xxxxxxxx wrote:
Thanks for the reply.
I don't have any arrays where I store my images.
I load the images from PNG preview files using
bitA = c4d.bitmaps.BaseBitmap()
bitB = c4d.bitmaps.BaseBitmap()
bitB.Init(200*self.sizeFac, 100*self.sizeFac)
bitA.InitWith(self.path + "/" + file)
bitA.ScaleIt(bitB, 256, False, True)
bitA.FlushAll()
self.BUTTON_ID.SetImage(bitB, True)
this piece of code. I use a CUSTOMGUI_BITMAPBUTTON which I generate dynamically. When the HDRI folder changes, I flush all groups (every HDRI image has its own group) and run the same code as above. Could there be any problem with doing things this way?
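For reference, the flush-and-rebuild approach described here could look roughly like the following GeDialog sketch. All IDs, names, and the group layout are made up for illustration; only LayoutFlushGroup, AddCustomGui, and LayoutChanged are the relevant SDK calls:

```python
import c4d

GRP_THUMBS = 1000      # hypothetical group ID
ID_THUMB_BASE = 2000   # hypothetical base ID for the buttons

class HDRIBrowserDialog(c4d.gui.GeDialog):
    """Sketch: rebuild the thumbnail buttons when the folder changes."""

    def rebuild_thumbs(self, preview_paths):
        self.LayoutFlushGroup(GRP_THUMBS)        # removes the old BitmapButtons
        bc = c4d.BaseContainer()
        bc.SetBool(c4d.BITMAPBUTTON_BUTTON, True)
        for i, path in enumerate(preview_paths):
            button = self.AddCustomGui(ID_THUMB_BASE + i,
                                       c4d.CUSTOMGUI_BITMAPBUTTON,
                                       "", c4d.BFH_LEFT, 0, 0, bc)
            bmp = c4d.bitmaps.BaseBitmap()
            result, _ = bmp.InitWith(path)       # returns (result, isMovie)
            if result == c4d.IMAGERESULT_OK:
                button.SetImage(bmp, True)       # True: the GUI keeps its own copy
        self.LayoutChanged(GRP_THUMBS)           # relayout after flushing
```

With SetImage(bmp, True) the custom GUI owns a copy of the bitmap, so the local bmp can go out of scope without keeping extra memory alive.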
On 08/03/2017 at 08:37, xxxxxxxx wrote:
How do you measure the increase in memory consumption? I have implemented a GeDialog which toggles the image of a BitmapButton between a scaled and a non-scaled version (basically running your code on every button press), and I don't see any increase in memory consumption. Neither with the built-in memory statistics (there are plugins showing these among the C++ and Python examples), nor with the Windows Task Manager.
On 08/03/2017 at 12:16, xxxxxxxx wrote:
Thanks for your reply.
As far as I have tested, the plugin itself does not have memory leaks. My problem is that when I change the path of the channel (with my plugin), Cinema 4D loads the new HDR file into the material and therefore into RAM, but also keeps the old one in RAM. So after changing the HDR file just a few times, Cinema 4D occupied around 2 GB of RAM.
I hope everything is clear.
On 09/03/2017 at 02:34, xxxxxxxx wrote:
Sorry, I forgot about the material context. I have extended my test code accordingly: a timer now changes the texture of the luminance channel of the material assigned to a Sky object every two seconds (the BitmapButton still changes in parallel). I also cranked up the resolution of the involved images and increased the material preview size to make any leak more visible. It has been running for a few hours now, and there is no relevant increase in peak memory usage here. During the first few minutes there is a slight increase (a few megabytes), but this can be explained by the various caching mechanisms and the asynchronous memory allocations of the various sub-systems; it takes a while until they all meet and create the final peak. From then on it has been absolutely stable for hours, so I still can't reproduce the issue.
How do you change the texture? By changing the path of the bitmap shader only? Or by creating a new bitmap shader every time? In the latter case, could it be that you forgot to remove the old bitmap shader from the material?
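The two variants could be sketched like this. This is only an illustrative sketch against the Python SDK of that era, not the poster's actual code; the function name set_hdri is made up:

```python
import c4d

def set_hdri(mat, path):
    """Point the luminance channel of 'mat' at a new HDR file (sketch)."""
    shader = mat[c4d.MATERIAL_LUMINANCE_SHADER]
    if shader is not None and shader.GetType() == c4d.Xbitmap:
        # Variant 1 (cheapest): just change the filename on the existing shader.
        shader[c4d.BITMAPSHADER_FILENAME] = path
    else:
        # Variant 2: create a fresh bitmap shader and hand ownership to C4D.
        new_shader = c4d.BaseShader(c4d.Xbitmap)
        new_shader[c4d.BITMAPSHADER_FILENAME] = path
        mat.InsertShader(new_shader)
        mat[c4d.MATERIAL_LUMINANCE_SHADER] = new_shader
        # Crucial step if variant 2 is used repeatedly: remove the replaced
        # shader, otherwise it (and its texture) stays alive in the material.
        if shader is not None:
            shader.Remove()
    mat.Message(c4d.MSG_UPDATE)
    c4d.EventAdd()
```

Forgetting the shader.Remove() step in variant 2 would match the symptom described: every path change leaves one more orphaned bitmap shader behind, each keeping its HDR texture in RAM.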