Last updated: Feb 12, 1999
Here it is: A patch for POV-Ray 3.1 to allow Unicode text.
This allows non-ASCII characters in your text objects. In fact, it should let you get at every glyph in your fonts.
Why do you need this? Why isn't the existing DBCS patch good enough?
Well, in a nutshell, those other patches just don't quite do the job in a portable way. If you wanted to render Japanese, you'd need an OS that supports it. I for one don't have one. With Unicode support and this patch, anyone in the world using the same POV file and the same font file will end up with the same results. That's good.
Here are the source patches needed for 3.1. Download these and apply them to a clean copy of the official sources.
Download unipatch.zip
These are the steps to apply the changes:

- Patch optout.h with the optout.h.patch file.
- Patch parse.c with the parse.c.patch file.
- Patch parse.h with the parse.h.patch file.
- Patch tokenize.c with the tokenize.c.patch file.
- Patch truetype.c with the truetype.c.patch file.
- Patch truetype.h with the truetype.h.patch file.
- Add the new files uniutils.c and uniutils.h to your project.
Now compile and render away!
Warning: This patch is incompatible with the CJK & DBCS patches out there. It is intended for a clean source only. However, it will work with just the TTC patches.
(I also have a Windows .exe built for those who might need that done for them. See the download section.)
Once you have a binary with the changes, the rest is quite simple: all text objects will now be processed as UTF-8 characters. Since UTF-8 is a superset of standard US-ASCII, most .pov files will work with no changes. However, if you are using (or want to use) characters outside 32-127 (0x20-0x7F), you'll need to be sure your POV source file gets converted to UTF-8.
Added Feb 06, 1999
If you need to use something other than UTF-8, there is now a global setting that controls it: set string_encoding to a supported value.
global_settings {
  string_encoding "UTF8"   // default
  // string_encoding "ISO8859_1"
  // string_encoding "Cp1252"
  // string_encoding "MacRoman"
}
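For example, a scene that keeps its strings in Latin-1 might look like the fragment below. This is a hypothetical sketch: the font name and string are made up, and only the string_encoding setting comes from this patch.

```
global_settings { string_encoding "ISO8859_1" }

text {
  // "Café": the "é" is a single 0xE9 byte in an ISO 8859-1 file
  ttf "timrom.ttf" "Café" 0.25, 0
  pigment { color rgb 1 }
}
```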
Well, just save your files as UTF-8 :-)
Seriously, if you need a utility to do this, I know of a sledgehammer that will. If you download one of the Java JDKs from Sun, it comes with a utility for character-set conversions: native2ascii. Just convert from whatever character set to escaped ASCII, and then convert from escaped ASCII back to UTF-8 using the "-reverse" and "-encoding UTF8" options.
Here are a few things that I might add:
.pov files

For any comments, suggestions, support help, etc., send me mail at joncruz@geocities.com
Download unipatch.zip. This is the preferred method.
You might want to read the instructions for applying the patch.
Warning: These are purely experimental, and should be used as such. The main intent is to test the creation of glyphs outside the 0-127 range. There might be minor differences in scene colors, subtleties, etc., but the characters should be the correct shapes. I'm mainly presenting these as a convenience for those who might have problems compiling for themselves, but still want to check it out.
Although I try to take reasonable steps to check these, please verify things at your end. And as with all things you download, run your own virus checks on them.