I was wondering why there's such a fundamental difference between emojis on macOS vs. Windows inside of Cinema 4D. How is C4D handling these emojis, and is there a place to change the way they are treated?
My emoji list looks like the following (VSCode/Windows):
same list on Windows:
same list on macOS:
Some further insights would be very much appreciated.
I am not quite sure what answer you would consider enlightening. Unicode is primarily an encoding standard: it defines how a character should be encoded in bytes, not how it should look (technically, Unicode has also been trying to standardize representation for a few years now, but with very little success). You could also say Unicode is a structural standard, not a content (metadata) standard. As a result, many characters are either not renderable at all on a given system or look very different from one system to another. This happens often in social media apps like Slack; the Debian version of its client, for example, handles emoji characters very poorly.
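To make that concrete, here is a small Python sketch (using only the standard library) showing what Unicode actually pins down: the codepoint and its byte encodings. Those are identical on macOS and Windows; only the glyph a platform's fonts draw for them differs.

```python
# Unicode defines the codepoint and its byte encodings, not the glyph.
# The same emoji is the same bytes everywhere; only the font differs.
emoji = "\U0001F642"  # U+1F642 SLIGHTLY SMILING FACE

print(f"codepoint: U+{ord(emoji):04X}")
print("utf-8:    ", emoji.encode("utf-8").hex(" "))      # f0 9f 99 82
print("utf-16-le:", emoji.encode("utf-16-le").hex(" "))  # 3d d8 42 de (surrogate pair)
```

Note that in UTF-16 the codepoint is above the Basic Multilingual Plane, so it becomes a surrogate pair, which is one reason naive per-"character" string handling breaks on emojis.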
And I can't resist quoting XKCD here
Hi @zipit ,
thanks for your answer. I guess I would at least expect something along these lines:
https://emojipedia.org/apple/ vs. https://emojipedia.org/microsoft/
Just wondering how and why C4D is handling these things.
Why do you think C4D has any special handling for emojis? Font rendering is OS-dependent, so any kind of text, including emoji codepoints, is drawn by the underlying operating system routines. Anything else would mean a ridiculous effort by Maxon to replicate font behavior.
(I do not know how Windows handles emojis internally. I doubt that every available font contains all these characters, so most likely certain codepoints are mapped to common glyphs regardless of the selected font... but that is not a C4D question anyway.)
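A small illustration of why glyph mapping is harder than it looks: many emojis are not even single codepoints but multi-codepoint sequences joined with ZERO WIDTH JOINER. Whether such a sequence renders as one combined glyph, several separate glyphs, or placeholder boxes is entirely up to the OS font-fallback machinery, not the application. A sketch using Python's standard `unicodedata` module:

```python
import unicodedata

# A "family" emoji is really three emojis glued together with U+200D.
# How this renders (one glyph vs. three vs. boxes) depends on the
# platform's font stack, not on the app displaying the string.
family = "\U0001F468\u200D\U0001F469\u200D\U0001F467"  # man + ZWJ + woman + ZWJ + girl

for ch in family:
    print(f"U+{ord(ch):04X}  {unicodedata.name(ch)}")
# U+1F468  MAN
# U+200D   ZERO WIDTH JOINER
# U+1F469  WOMAN
# U+1F467  GIRL
```

So even if a platform "supports emoji", a given font may only know some of the base codepoints and none of the joined sequences, which is exactly the kind of difference visible between the macOS and Windows screenshots above.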