[Development] Why can't QString use UTF-8 internally?

Julien Blanc julien.blanc at nmc-company.com
Wed Feb 11 11:22:59 CET 2015


On 11/02/2015 10:32, Bo Thorsen wrote:
> 2) length() returns the number of chars I see on the screen, not a
> random implementation detail of the chosen encoding.

How is that supposed to work with combining characters, which are part of 
Unicode?

> 3) at(int) and [] gives the unicode char, not a random encoding char.

Same problem with combining characters. What do you expect here:

QString s = QString::fromWCharArray(L"n\u0303"); // "n" + U+0303 COMBINING TILDE
s.length(); // 1 or 2?
s[0];       // 'n' or 'ñ'?
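
For the record, here is a minimal sketch of what Qt actually answers in 
this case (assuming a UTF-16 based QString, which is what Qt 4 and Qt 5 
use internally):

#include <QString>
#include <QDebug>

int main()
{
    // "n" followed by U+0303 COMBINING TILDE: one grapheme on screen,
    // but two Unicode code points
    QString s = QString::fromWCharArray(L"n\u0303");
    qDebug() << s.length(); // 2 -- length() counts UTF-16 code units
    qDebug() << s.at(0);    // 'n' -- the base character, not a composed "ñ"
    return 0;
}

So "the number of chars I see on the screen" is not what length() gives 
you, even with QString.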

> std::string fails at those completely basic requirement, which is why
> you will never see me use it, unless some customer API demands it or I'm
> in one of those exceptional cases where there is sure to be ascii only
> in the strings.

QString (at least in Qt 4; I have not tested with Qt 5) fails at those 
requirements too, just less often.
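
One example of where it still fails (a minimal sketch; assuming Qt 5.3 or 
later for the char32_t overload of fromUcs4): any code point outside the 
Basic Multilingual Plane is stored as a surrogate pair, so length() and 
indexing stop matching what you see on screen:

#include <QString>
#include <QDebug>

int main()
{
    // U+1F600 (a single emoji) lies outside the BMP, so QString stores
    // it as a surrogate pair of two UTF-16 code units
    QString s = QString::fromUcs4(U"\U0001F600", 1);
    qDebug() << s.length();                // 2, though one char is visible
    qDebug() << s.at(0).isHighSurrogate(); // true -- s.at(0) is half a character
    return 0;
}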


> Another note: Latin1 is the worst idea for i18n ever invented, and it's
> by now useless, irrelevant and only a source for bugs once you start to
> truly support i18n outside of USA and Western Europe. I would be one
> step closer to total happiness if C++17 and Qt7 makes this "encoding"
> completely unsupported.

Could not agree more with that part.

Regards,

Julien Blanc


