[Development] Why can't QString use UTF-8 internally?
Konstantin Ritt
ritt.ks at gmail.com
Tue Feb 10 23:19:59 CET 2015
Can QChar represent a 32-bit codepoint, then?
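
(A minimal sketch of the status quo, assuming nothing beyond the
documented Qt 5 API of the day: QChar holds a single UTF-16 code unit,
so a codepoint outside the BMP already occupies two of them.)

    #include <QChar>
    #include <QDebug>
    #include <QString>

    int main()
    {
        const uint cp = 0x1F600;               // U+1F600, outside the BMP
        QString s = QString::fromUcs4(&cp, 1);

        qDebug() << QChar::requiresSurrogates(cp); // true
        qDebug() << s.size();                      // 2 QChars, not 1
        qDebug() << s.at(0).isHighSurrogate();     // true
        qDebug() << s.at(1).isLowSurrogate();      // true
        return 0;
    }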
Regards,
Konstantin
2015-02-11 2:11 GMT+04:00 Thiago Macieira <thiago.macieira at intel.com>:
> On Wednesday 11 February 2015 01:52:34 Konstantin Ritt wrote:
> > 2015-02-11 1:26 GMT+04:00 Thiago Macieira <thiago.macieira at intel.com>:
> > > On Wednesday 11 February 2015 00:37:41 Konstantin Ritt wrote:
> > > > Yes, that would be an ideal solution. Unfortunately, that would also
> > > > break a LOT of existing code.
> > > > Back in Qt 4 times, I did some experiments with adaptive storage for
> > > > QString (similar to what NSString does behind the scenes).
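
(Roughly, adaptive storage means the data block records its current
encoding and only widens when it must. A hypothetical sketch of the
idea, not the actual experiment; AdaptiveStringData is a made-up name.)

    #include <QtGlobal>

    // Hypothetical adaptive string payload: keep cheap 8-bit data
    // as-is and transcode to UTF-16 only on first demand.
    struct AdaptiveStringData
    {
        enum Encoding { Latin1, Utf16 };
        Encoding encoding;   // what the buffer currently holds
        int size;            // length in characters
        union {
            uchar  *latin1;  // 1 byte per character while possible
            ushort *utf16;   // widened lazily for non-Latin-1 text
        };
    };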
> > >
> > > I've thought of this too.
> > >
> > > This stumbles on QString's implicit sharing. If you do this:
> > >     QString foo = "some UTF-8 text";
> > >     QString copy = foo;
> > >     qDebug() << foo.constData()[0];
> >
> > In my experiments (a QString with adaptive storage), data()/constData()
> > returns uchar*;
>
> No, it doesn't. It returns QChar because it has returned QChar since Qt 2.
> You cannot change this.
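
(To spell out where that contract bites, a minimal sketch using only
documented behaviour: constData() never detaches, so it hands out the
shared buffer itself, and the API promises that buffer is UTF-16.)

    #include <QDebug>
    #include <QString>

    int main()
    {
        QString foo = QString::fromUtf8("some UTF-8 text");
        QString copy = foo;  // implicit sharing: no deep copy yet

        // The Qt 2-era contract: constData() exposes the internal
        // buffer directly as UTF-16 code units, with no conversion.
        const QChar *p = foo.constData();
        qDebug() << p[0];    // 's' as a QChar

        // constData() does not detach, so both strings still see the
        // very same block; an adaptive store would have to transcode
        // it in place here, behind copy's back.
        Q_ASSERT(foo.constData() == copy.constData());
        return 0;
    }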
>
> --
> Thiago Macieira - thiago.macieira (AT) intel.com
> Software Architect - Intel Open Source Technology Center
>
> _______________________________________________
> Development mailing list
> Development at qt-project.org
> http://lists.qt-project.org/mailman/listinfo/development
>