External dependencies: The right way to do it

  • On 11/03/2013 at 14:17, xxxxxxxx wrote:

    Update 2016-01-22: Read this post instead. The information in this thread is old!

    Hello fellow developers. Today I want to talk about third-party modules in
    a Python plugin. It is often necessary to use modules in your plugin that are
    not delivered with the CPython distribution because of the functionality
    they provide. There are two common ways to import external libraries in your
    plugin, and they both have their advantages and disadvantages.

    What we, as developers, might like most is to just import the module and
    not care about the rest. For this, we have to put the module in a place where
    the Python interpreter in Cinema can find it.

    1. Insert the module in:

    {Cinema Preferences Folder}/library/python/packages/{OS Name}

    The Python interpreter embedded in Cinema will search for modules
           there in addition to the standard paths. However, the module
           has to be copied twice on Windows since there is a win32 and a win64
           folder, and the Python distribution will search the respective folder
           depending on the Cinema build you run.

    2. Modify the PYTHONPATH environment variable: We can add a path to this
           environment variable where we can put all external modules we ever
           want to use. The advantage of this method is that *any* Python
           installation and not only the Cinema 4D installation will check the
           paths defined in this environment variable. This way, we only need one
           copy of the external module on our whole computer.
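    A quick way to verify that a folder is actually visible to the interpreter is to inspect sys.path, the list of directories the import machinery searches; entries from the PYTHONPATH environment variable are merged into it at interpreter start-up. A minimal sketch (the ~/python-modules folder is a hypothetical example, not a Cinema 4D convention):

```python
import os
import sys

# Hypothetical folder where we told the user to put external modules.
module_dir = os.path.expanduser('~/python-modules')

# Check whether that folder is among the directories Python searches.
found = any(os.path.normpath(p) == os.path.normpath(module_dir)
            for p in sys.path)
print('module folder on sys.path:', found)
```

    Running this from the Script Manager tells you immediately whether the installation step worked.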

    Either way, the user has to perform this step, too. This isn't actually a big
    deal and can be done quickly, but keep in mind that a normal user might not be
    familiar with folder structures, environment variables and all that stuff. No
    kidding, this is not a rare case.
    The user will also be responsible for updating the external modules when a
    new version of your plugin requires a newer version, etc. All in all, we end
    up giving support to the users of our plugins on how to correctly fulfill all
    these requirements.

    How can we make it better?

    We can distribute the external dependencies with our plugin. The user will
    just unpack the plugin archive and it will run. Sounds good, eh? But it's
    tricky! First of all, how do we deliver an external module with our plugin?

    We will put it into a folder and import it from there, like:

    lib/
        module.py
    res/
        <resource files>

    And the code to load the module:

    import os
    import sys
    dirname = os.path.dirname(__file__)
    lib_path = os.path.join(dirname, 'lib')
    sys.path.insert(0, lib_path)
    try:
        import module
    finally:
        # Remove the path we've just inserted.
        sys.path.remove(lib_path)

    Notice the try/finally clause. Even when an error occurs while loading one of the
    modules within the try block, the finally block guarantees that the path we have
    added will be removed again!
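    The insert/remove dance around sys.path can also be wrapped in a small context manager so every import site stays tidy. This is a sketch of my own, not part of the Cinema 4D API:

```python
import sys
from contextlib import contextmanager

@contextmanager
def temp_syspath(path):
    # Temporarily prepend *path* to sys.path; the finally block removes
    # it again even if an import inside the with-block fails.
    sys.path.insert(0, path)
    try:
        yield
    finally:
        sys.path.remove(path)
```

    Used as `with temp_syspath(lib_path): import module`, the path is guaranteed to be cleaned up in every case.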

    So where's the tricky part?

    Let's make this fancier and say that we have two plugins relying on the same
    module; for the sake of example we will call this module "module". Therefore, we
    have two plugins, one dependency, but both distribute the module with them.

    # plugin1.pyp
    import module
    # plugin2.pyp
    import module

    So far, there is no problem. But it becomes a problem when the plugins rely on
    different versions of the module. E.g., plugin2 relies on version 2 of the module
    while plugin1 is already satisfied with version 1. Version 2 added new functions
    to the module.

    # plugin1.pyp
    import module            # works with version 1
    # plugin2.pyp
    import module
    module.new_func()        # new_func() was added in version 2

    The call to module.new_func() can easily fail when plugin1 is loaded before
    plugin2. After plugin1 has imported the module in version 1, the module resides
    in the sys.modules dictionary, and this is the first place where the import
    mechanism of Python looks before searching for the module on disk.

    Once again: when plugin1 is loaded before plugin2, import module will load
    "module.py" from plugin1 into plugin2 and not "module.py" from plugin2!
    The module loaded by plugin1 might even be a completely different one than
    the one plugin2 expects, just with the same name. In that case, one of the
    plugins is condemned to fail!
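    This caching behaviour can be demonstrated without Cinema 4D at all. Here is a small sketch that fakes "version 1" of a module (the module name "module" and the VERSION attribute are made up for the demonstration):

```python
import sys
import types

# Simulate plugin1 having already loaded version 1 of "module".
mod_v1 = types.ModuleType('module')
mod_v1.VERSION = 1
sys.modules['module'] = mod_v1

# plugin2 now executes "import module": Python checks sys.modules first,
# finds the cached version 1 and never searches sys.path at all.
import module
print(module.VERSION)  # -> 1, no matter what is on disk
```

    plugin2 silently gets plugin1's copy, which is exactly the failure mode described above.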

    How to fix?

    You should always remove modules imported from your plugin-local folder from
    sys.modules (not to mention removing the path added to sys.path to actually
    import the module). Best practice is to restore the old module configuration
    after importing local libraries.

    # Store the old module configuration.
    old_modules = sys.modules.copy()
    # Import your stuff, for example:
    lib_path = os.path.join(os.path.dirname(__file__), 'lib')
    sys.path.insert(0, lib_path)
    try:
        import module
    finally:
        sys.path.remove(lib_path)
    # Restore the previous module configuration, making sure not to
    # remove modules that were not loaded from the local libraries folder.
    for k, v in list(sys.modules.items()):
        if k not in old_modules:
            if not v or (hasattr(v, '__file__') and v.__file__.startswith(lib_path)):
                del sys.modules[k]

    Edit: I have hit strange errors when removing modules, loaded by a module from
    the local plugin distribution, that come from the Python standard library or are
    at least not from the plugin's lib dir. With hasattr(v, '__file__') we check
    whether the module is a built-in module. If it isn't, we check if the filename
    of the module starts with the lib_path of our plugin. If so, we can safely
    remove it!
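    These rules can be folded into a small helper; a minimal sketch under the same assumptions (restore_modules is a name of my own choosing):

```python
import sys
import types

def restore_modules(old_modules, lib_path):
    # Drop every module that appeared after the snapshot *old_modules*
    # was taken, but only if it is a None placeholder or was loaded from
    # our plugin-local *lib_path* folder. Modules without a __file__
    # attribute are treated as built-ins and left alone.
    for name, mod in list(sys.modules.items()):
        if name in old_modules:
            continue
        if mod is None or getattr(mod, '__file__', '').startswith(lib_path):
            del sys.modules[name]
```

    Call it with the sys.modules.copy() snapshot taken before your local imports and the lib_path you inserted.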

    But note that you should remove built-in modules when they are distributed with
    your plugin (native modules, referring to C/C++ modules with a *.pyd suffix)!
    Imagine you are delivering the _ssl module for Windows with your plugin; then do
    it like this:

    # .. add lib path to sys.path and store old module configuration
    import _ssl
    del sys.modules['_ssl']
    # ... remove modules like above!

    The code above is too complex to put into a one-line expression, which is
    why I have struck through the following block:
    ~~If you're feeling nasty, you can also put the last five lines into a one-liner
    (just split up for readability):
    [sys.modules.__setitem__(k, v) if k in old_modules else sys.modules.pop(k)
     for k, v in sys.modules.items()]~~

    Conclusion: Always ensure that you do not influence other plugins when
    importing external modules.

    edit: 2013/03/13
    Reloading modules at runtime

    Cinema 4D allows us to reload Python plugins at runtime so we do not have to
    restart the application after making changes to a plugin. However, loaded modules
    will not be reloaded by this command. There's a reload() function built into
    Python, so we can use it in PluginMessage() and react to the
    C4DPL_RELOADPYTHONPLUGINS message.

    import c4d

    def PluginMessage(msg, data):
        if msg == c4d.C4DPL_RELOADPYTHONPLUGINS:
            reload(module)
            return True
        return False

    But it's more tricky than that. The reload() function only works when the module
    can be found in sys.modules (which we have removed it from) and when the module
    can be found via sys.path (we removed the path the module is located at from this
    list). So, we again have to find a workaround. For this workaround, we have to
    dive into some black magic. We could reload a single module simply by

    import sys
    import imp
    import types
    def load_module(module=None, filename=None, name=None):
        if module:
            filename = module.__file__
            name = module.__name__
        elif not (filename and name):
            raise TypeError('expected a module or both filename and name.')
        # Obtain the module that was previously stored.
        prev_mod = sys.modules.get(name, None)
        # Load the module.
        new_mod = imp.load_source(name, filename)
        # The new module has been inserted into sys.modules, we have to
        # fix that again.
        if prev_mod:
            sys.modules[name] = prev_mod
        return new_mod
    def reload_recursive(module) :
        mod = load_module(module)
        for name, submodule in vars(module).iteritems() :
            if isinstance(submodule, types.ModuleType) :
                new_mod = reload_recursive(submodule)
                setattr(module, name, new_mod)
        return mod
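    A side note for readers on newer Python versions: the imp module used by load_module() above has since been deprecated. Under the assumption that importlib.util is available, roughly the same load-without-registering trick can be sketched like this (load_module_py3 is my own name, not part of any API):

```python
import sys
import importlib.util

def load_module_py3(name, filename):
    # Load *filename* as module *name* via importlib. Unlike the old
    # imp.load_source(), exec_module() does not register the module in
    # sys.modules by itself, so the previous module configuration is
    # left untouched and nothing needs to be restored afterwards.
    spec = importlib.util.spec_from_file_location(name, filename)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module
```

    Because nothing is written into sys.modules, the prev_mod bookkeeping from the Python 2 version becomes unnecessary.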

    But then again, issues arise when the module imports modules from your other
    dependencies, because those modules are not on sys.path. As hacky as it sounds,
    we have to insert the required paths into sys.path again temporarily. So here's
    the solution:

    import os
    import sys
    import imp
    def get_root_module(modname, suffixes='pyc pyo py'.split()):
        """Returns the root file or folder of a module filename. The return
        value is a tuple of ``(root_path, is_file)``."""
        dirname, basename = os.path.split(modname)
        # Check if the module filename is part of a Python package.
        in_package = False
        for sufx in suffixes:
            init_mod = os.path.join(dirname, '__init__.%s' % sufx)
            if os.path.exists(init_mod):
                in_package = True
                break
        # Go on recursively if the module is in a package, or return the
        # module path and whether it is a file.
        if in_package:
            return get_root_module(dirname)
        else:
            return os.path.normpath(modname), os.path.isfile(modname)
    def reload_modules(*modules):
        """Reload the passed module objects. Restores the previous module
        configuration after reloading."""
        # Find all root directories of the passed modules.
        paths = set()
        for mod in modules:
            mod_name = mod.__name__
            if imp.is_builtin(mod_name) != 0 or not hasattr(mod, '__file__'):
                raise RuntimeError('cannot reload built-in module %s.' % mod_name)
            mod_root, is_file = get_root_module(mod.__file__)
            dirname = os.path.dirname(mod_root)
            paths.add(os.path.normpath(dirname))
        # Change the system path and store a copy of the current module
        # configuration.
        old_path = sys.path
        old_mods = sys.modules.copy()
        sys.path = list(paths) + sys.path
        try:
            # Reload the modules.
            new_mods = []
            for mod in modules:
                if mod.__name__ in sys.modules:
                    new_mod = reload(mod)
                else:
                    new_mod = __import__(mod.__name__)
                new_mods.append(new_mod)
        finally:
            # Restore the import search path and module configuration.
            sys.path = old_path
            for k in list(sys.modules.keys()):
                if k not in old_mods:
                    del sys.modules[k]
        return new_mods
    def PluginMessage(msg, data) :
        if msg == c4d.C4DPL_RELOADPYTHONPLUGINS:
            global module
            module, = reload_modules(module)

    Welcome back after reading through this monster for such a "simple task". The
    reload_modules() function will use the reload() function if the module is
    available in sys.modules, or use the standard import mechanism to import it if
    the reload() function cannot be used.

    Holy crap, why should I stick to these rules?

    Seriously, just do your fellow developers a favor. And also yourself. And the
    users of your plugins. I hit this problem just two days ago, and that is why I
    wanted to share the solution with you. Yes, I've hit it myself, with two of my
    own plugins!

    All of the things I've mentioned above to not mess with other plugins when
    importing dependencies would not be necessary if the users installed the
    dependencies somewhere all plugins can find them (and updated them according to
    the new requirements of other plugins, etc.). But as already mentioned, you will
    either end up giving a lot of user support or lose users because of this
    "inconvenience" (it actually is just a 5-minute task, if even).

    I admit that I usually tell the users to install the dependency as described
    above for free plugins. There are also cases where you are actually forced to
    deliver the dependencies with your plugin. For example, both of my current
    clients explicitly stated that they want the dependencies to be distributed with
    the plugins. Ouch, I've spent a lot of time figuring the above out, but since
    they pay me to do what they want, I'll have to do it. 😉

    If you have any questions regarding this topic, please don't hesitate to ask here.

    All the best,

  • On 12/03/2013 at 08:08, xxxxxxxx wrote:

    hey, good tutorial. something like this should be in the documentation. as this
    is targeted at beginners you might want to add a part about reloading modules,
    so that you can debug your modules without having to restart c4d for each change.

  • On 12/03/2013 at 09:28, xxxxxxxx wrote:

    Hi Ferdinand,

    thanks for the feedback. Good idea, I'll add this asap.


  • On 13/03/2013 at 06:19, xxxxxxxx wrote:

    I've added "Reloading modules at runtime" and a final word to the entry.


  • On 14/03/2013 at 07:18, xxxxxxxx wrote:

    I have updated the article on built-in modules. I've hit strange errors when a built-in module
    was removed from sys.modules.

    I have hit strange problems when removing non-local modules (e.g. modules
    imported by the locally distributed modules from the Python standard library)
    from sys.modules. Not removing those fixes the problems and is also more
    resource efficient!

    The post above is updated respectively.
