Re: GetCurrentDirectory returns a value with non-Unicode
"Mihai N." <nmihai_year_2000@yahoo.com> wrote in message
news:Xns9981EDAAFECE6MihaiN@207.46.248.16...
Yes, but if you have an ANSI DLL that for some unfathomable reason must
have an 8-bit interface, you could pass strings in UTF-8, convert them
to UTF-16 in the DLL, and use a wide-character API to open the file (not
CFile).
Well, you can.
But if the DLL can handle UTF-16 internally, there is little reason to
have a UTF-8 interface. OK, it can be the "unfathomable reason" :-)
You can even use UTF-7, if you so wish :-)
But to do anything useful with the thing (calling some Windows API)
you will have to convert back to UTF-16.
Most often (and in this thread) the argument was that the DLL uses
ANSI because it is too difficult to move to Unicode.
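That "convert back to UTF-16" step is mechanical. On Windows it is normally one MultiByteToWideChar(CP_UTF8, ...) call; as a portable illustration of what that call does, here is a minimal hand-rolled sketch (the function name is invented, and validation such as rejecting overlong sequences is omitted):

```cpp
#include <cstdint>
#include <stdexcept>
#include <string>

// Sketch of the UTF-8 -> UTF-16 step a UTF-8-interface DLL must perform
// before it can call a wide Windows API. On Windows you would call
// MultiByteToWideChar(CP_UTF8, ...) instead of rolling this by hand.
std::u16string Utf8ToUtf16(const std::string& in) {
    std::u16string out;
    size_t i = 0;
    while (i < in.size()) {
        unsigned char b = in[i];
        uint32_t cp;
        size_t extra;
        if (b < 0x80)                { cp = b;        extra = 0; } // ASCII
        else if ((b & 0xE0) == 0xC0) { cp = b & 0x1F; extra = 1; }
        else if ((b & 0xF0) == 0xE0) { cp = b & 0x0F; extra = 2; }
        else if ((b & 0xF8) == 0xF0) { cp = b & 0x07; extra = 3; }
        else throw std::runtime_error("bad UTF-8 lead byte");
        if (i + extra >= in.size()) throw std::runtime_error("truncated UTF-8");
        for (size_t k = 1; k <= extra; ++k) {
            unsigned char c = in[i + k];
            if ((c & 0xC0) != 0x80) throw std::runtime_error("bad continuation byte");
            cp = (cp << 6) | (c & 0x3F);
        }
        i += extra + 1;
        if (cp < 0x10000) {
            out.push_back(static_cast<char16_t>(cp));
        } else {
            // Code points above the BMP become a UTF-16 surrogate pair.
            cp -= 0x10000;
            out.push_back(static_cast<char16_t>(0xD800 | (cp >> 10)));
            out.push_back(static_cast<char16_t>(0xDC00 | (cp & 0x3FF)));
        }
    }
    return out;
}
```

The resulting std::u16string (or a wchar_t buffer on Windows) can then be handed straight to a ...W API such as CreateFileW.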
The real problem is that MFC doesn't have something called CFileW. If it
did, then just the parts of the DLL that use that class would deal with
LPWSTR, the filename would be in Unicode, and all would be well. The rest
of the DLL would still be Ansi, safely protecting whatever depends on
Ansi.
What I have done is to selectively compile only certain .cpp modules with
UNICODE/_UNICODE defined and leave the rest with _MBCS defined. So the DLL
is "mixed" because it has both Ansi and UNICODE modules. It works OK.
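Concretely, the mixed build just means different preprocessor defines per translation unit. A sketch with MSVC command lines (the module names here are invented):

```text
rem Wide modules: compiled with UNICODE/_UNICODE, so TCHAR is wchar_t there
cl /c /DUNICODE /D_UNICODE file_io_w.cpp
rem The rest of the DLL stays Ansi/MBCS
cl /c /D_MBCS legacy_core.cpp
link /DLL /OUT:mydll.dll file_io_w.obj legacy_core.obj
```

The only care needed is at the boundary between the two kinds of modules: any string crossing it must have an explicit type (LPSTR or LPWSTR), not LPTSTR, since TCHAR means different things on each side.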
Sometimes it is not straightforward to move the entire DLL to UNICODE,
because LPTSTR is often used where LPSTR is meant, so flipping to UNICODE
would break code that still needs LPSTR. I completely understand the
reluctance to do this late in the cycle.
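The LPTSTR-where-LPSTR-was-meant trap fits in a few lines. The typedefs below mimic the ones from <windows.h>/<tchar.h> so the sketch is self-contained, and CountBytes is an invented example of code that really wants 8-bit data:

```cpp
#include <cstring>

// Mimic the <windows.h> typedefs so this compiles outside Windows.
// In an MBCS build TCHAR is char; defining UNICODE makes it wchar_t.
#ifdef UNICODE
typedef wchar_t TCHAR;
#else
typedef char TCHAR;
#endif
typedef TCHAR* LPTSTR;
typedef char*  LPSTR;

// This routine genuinely operates on 8-bit data and should have been
// declared with LPSTR, but was written with LPTSTR. It compiles only while
// TCHAR happens to be char; flip the project to UNICODE and strlen() no
// longer accepts the wchar_t* it is handed, so every such spot breaks.
size_t CountBytes(LPTSTR s) {
    return strlen(s);
}
```

Hunting down every such declaration is exactly the work that makes a late-cycle UNICODE conversion risky, and why converting module by module is attractive.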
-- David