[Qt-interest] Bug with QVector on really large data sets
Colin S. Miller
no-spam-thank-you at csmiller.demon.co.uk
Sun Feb 14 23:11:34 CET 2010
David Boosalis wrote:
> I am using qt-everywhere-opensource-src-4.6.1/ on a 64-bit OS. I get a
> core dump from a QVector<QByteArray> object I have.
>
> After about 826 appends of a QByteArray of roughly 52 bytes each, it core
> dumps with the trace given below. For large vectors, should I try to
> reserve space in advance? Doing this presents a problem in that I never
> know how many records will be coming in. It could be 1, or, as in this
> case, close to 1000. Reserving space for the worst-case scenario would
> waste a lot of resources for me.
>
> Any advice on how to get around or fix this issue?
> Thank you
> David
David,
assuming you are running on Linux, I'd install libc6-dbg;
it will let you see what malloc is doing wrong.
malloc() shouldn't crash; either there is a bug in libc6 or Qt has trashed
the heap. However, both are well-respected, major pieces of code, and I'd be
surprised if either has a bug in its memory handling, so the more likely
culprit is heap corruption somewhere else in your application.
Running under valgrind might help diagnose any problems before they cause the crash.
You could try growing the QVector's capacity in chunks of 10 (or doubling it
each time), and then shrinking it once the final size is known.
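Something along these lines, as a minimal sketch (readNextRecord() is a
hypothetical stand-in for wherever your ~52-byte records come from):

#include <QVector>
#include <QByteArray>
#include <QtGlobal>   // qMax

// Hypothetical source of records; assumed to return an empty
// QByteArray once there is no more data.
QByteArray readNextRecord();

void collectRecords(QVector<QByteArray> &records)
{
    while (true) {
        const QByteArray record = readNextRecord();
        if (record.isEmpty())
            break;

        // Grow the reserved capacity geometrically so appends don't
        // reallocate for every new record.
        if (records.size() == records.capacity())
            records.reserve(qMax(10, records.capacity() * 2));

        records.append(record);
    }

    // Give back the unused reserved memory once the final size is known.
    records.squeeze();
}

(QVector already grows its allocation as you append, so the explicit
reserve() mainly cuts down the number of reallocations; squeeze() is what
trims the allocation back afterwards.)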
BTW, is there any reason why you are not using std::vector?
HTH,
Colin S. Miller