[Qt-interest] QString and UNICODE: size determination
Constantin Makshin
dinosaur-rus at users.sourceforge.net
Thu Jul 9 14:50:16 CEST 2009
QString *always* stores Unicode (UTF-16) characters, so the amount of memory
used by QString's contents is "str.size() * 2" or, better still (forward
compatibility, etc.), "str.size() * sizeof(QChar)".
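
As an illustration only (not part of the original exchange), here is a minimal,
untested sketch of that calculation, reusing the label_str/path_str names from
the snippet quoted below and assuming an already-open HKEY; RegSetValueExW is
called explicitly so the result does not depend on whether UNICODE is defined:

    #include <QString>
    #include <windows.h>

    // Hypothetical helper, not from the original post.
    void setRegistryString(HKEY hkey, const QString &label_str, const QString &path_str)
    {
        // QString data is always UTF-16, so the payload size in bytes is
        // simply size() * sizeof(QChar) -- no #ifdef UNICODE needed.
        DWORD len = static_cast<DWORD>(path_str.size() * sizeof(QChar));

        // Note: the Win32 documentation asks for REG_SZ sizes to include the
        // terminating null, i.e. (path_str.size() + 1) * sizeof(QChar);
        // utf16() returns null-terminated data, so that extra QChar exists.
        RegSetValueExW(hkey,
                       reinterpret_cast<LPCWSTR>(label_str.utf16()),
                       0,
                       REG_SZ,
                       reinterpret_cast<const BYTE *>(path_str.utf16()),
                       len);
    }

On Windows both wchar_t and QChar are 16 bits wide, which is what makes the
reinterpret_cast of utf16() safe here.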
On Wed, 08 Jul 2009 20:16:15 +0400, Bob Hood <bhood2 at comcast.net> wrote:
> I want to verify that this is the best way to do this. If there's a
> better way, please feel free to correct me.
>
> I'm using a QString to store data that I'm going to use with direct
> Windows API calls in my Qt application. I'm setting/clearing a Windows
> Registry key. These functions require a wchar_t* and a size. QString
> provides a size() method, but it only returns the number of characters
> in the string, not the actual number of bytes consumed by the string.
>
> To determine this, I'm assuming the size() value can be multiplied by a
> factor of 2 to get the correct length. For example:
>
> [...]
> int len = path_str.size();
> #ifdef UNICODE
> len *= 2;
> #endif
> RegSetValueEx(hkey,label_str.utf16(),0,REG_SZ,(const LPBYTE)path_str.utf16(),len);
> [...]
>
> Is this the best way of determining the actual memory footprint of the
> data in a QString?
--
Constantin "Dinosaur" Makshin