[Interest] memory fragmentation?

Till Oliver Knoll till.oliver.knoll at gmail.com
Wed Aug 22 08:30:29 CEST 2012

On 22.08.12 05:45, Graeme Gill wrote:
> Till Oliver Knoll wrote:
>> Folks, I gave up checking for NULL pointers (C, malloc) or bad_alloc
>> exceptions (new, C++) a long time ago. I remember a discussion several
>> years ago (here on Qt interest?) about desktop memory managers actually
>> never returning a NULL pointer (or throwing an exception) when they
>> cannot allocate memory.
> This is simply not true when it comes to malloc. Malloc can and does
> return NULL on MSWin, OS X and Linux. I have some code that
> uses as much RAM as possible for computation caching, and the simplest
> portable way of sizing the virtual memory space is to malloc memory
> until it returns NULL,

Now I got curious and just tried to allocate a Terabyte (if my math 
serves me right ;))

   const size_t NofGigabyte = 1024;
   const size_t Gigabyte = 1024 * 1024 * 1024;
   char *ptr;

   ptr = (char *)malloc(NofGigabyte * Gigabyte);

This worked with malloc (as above) and likewise with new[]! I got a 
valid pointer.

The next thing I did was to actually try to use that memory:

   if (ptr) {
     for (size_t i = 0; i < NofGigabyte * Gigabyte; ++i) {
       ptr[i] = i % 256;
     }
   }

With some debug statements in between, the process would happily start 
writing data into that allocated memory!

This is on Mac OS X 10.6.8 (Snow Leopard) with 4 GByte of physical RAM, 
and according to Activity Monitor with about 268 GB of Virtual Memory 
(which couldn't be used entirely, because my harddisk has much less free 
space left).

According to Activity Monitor my "Memory Buster" process was allocated 
up to about 2 GByte of physical memory; after that the harddisk started 
spinning like crazy, as expected, and the swap file grew. After 5 minutes 
I killed the process with Ctrl + C, but up to that point it was still 
happily filling my swap file with funny numbers. It was *not* killed by 
the OS.

However, when I first ignored the compiler warning that a constant was 
overflowing and wrote

   const size_t RidiculousSize = 1024 * 1024 * 1024 * 2;

(size_t is not the problem here; the constant expression on the right of 
the = is evaluated in int arithmetic and overflows!)

only then did I get an exception, saying that 823....4 (lots of digits - 
a REALLY ridiculously large size) could not be allocated. So yes, there 
is a limit somewhere, but it must be FAR beyond my harddisk size ;)

Cheers, Oliver
