
Why does declaring a 2D array of sufficient size cause a segfault on Linux but not macOS?

Problem

I’m trying to declare a large 2D array (a.k.a. matrix) in C/C++, but it crashes with a segfault, and only on Linux. The Linux system has much more RAM installed than the macOS laptop, yet the crash happens only on Linux.

My question is: Why does this crash only on Linux, but not macOS?

Here is a small program to reproduce the issue:


// C++ program that segfaults on Linux
#include <iostream>
#include <cstdlib>

using namespace std;

int main()
{
    cout << "Let's crash for no raisin! 🧠" << endl;

    cout << "Int size: " << sizeof(int) << endl;
    for (int n=1012; n < 2000; n++) {
      cout << "Declaring Matrix2D of size: " << n << "x" << n << " = " << n*n << endl;
      cout << "Bytes: " << n*n*sizeof(int) << endl;

      // Nonstandard variable-length array (GCC/Clang extension), allocated
      // on the stack; segfaults on my machine at 1448x1448 = 8386816 bytes
      int Matrix2D[n][n];
      // Even with these two heap-allocation lines commented out, the
      // program still reaches the segfault:
      // int* pM2D = (int*)malloc(n*n*sizeof(int));
      // free(pM2D);
    }
    return 0;
}

Compile with: g++ -Wall -g -o segfault segfault.cpp

Output

Linux

Linux system has 64 GiB RAM installed!

$ ./segfault ; free --bytes
Let's crash for no raisin! 🧠
Int size: 4

[...SNIP...]

Declaring Matrix2D of size: 1446x1446 = 2090916
Bytes: 8363664
Declaring Matrix2D of size: 1447x1447 = 2093809
Bytes: 8375236
Declaring Matrix2D of size: 1448x1448 = 2096704
Bytes: 8386816
Segmentation fault (core dumped)


              total        used        free      shared  buff/cache   available
Mem:    67400994816 11200716800  4125982720   412532736 52074295296 55054041088
Swap:    1023406080   824442880   198963200

$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 256763
max locked memory       (kbytes, -l) 65536
max memory size         (kbytes, -m) unlimited
open files                      (-n) 65535
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 256763
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
macOS

macOS system has only 16 GB RAM installed! 😲

$ ./segfault ; sysctl -a | grep mem ;
Let's crash for no raisin! 🧠
Int size: 4

[...SNIP...]

Declaring Matrix2D of size: 1997x1997 = 3988009
Bytes: 15952036
Declaring Matrix2D of size: 1998x1998 = 3992004
Bytes: 15968016
Declaring Matrix2D of size: 1999x1999 = 3996001
Bytes: 15984004


kern.dtrace.buffer_memory_maxsize: 5726623061
kern.dtrace.buffer_memory_inuse: 0
kern.memorystatus_sysprocs_idle_delay_time: 10
kern.memorystatus_apps_idle_delay_time: 10
kern.memorystatus_purge_on_warning: 2
kern.memorystatus_purge_on_urgent: 5
kern.memorystatus_purge_on_critical: 8
vm.memory_pressure: 0
hw.memsize: 17179869184
machdep.memmap.Conventional: 17077571584
machdep.memmap.RuntimeServices: 524288
machdep.memmap.ACPIReclaim: 188416
machdep.memmap.ACPINVS: 294912
machdep.memmap.PalCode: 0
machdep.memmap.Reserved: 84250624
machdep.memmap.Unusable: 0
machdep.memmap.Other: 0

Solution

Although ISO C++ does not support variable-length arrays, you appear to be using a compiler that supports them as an extension (both GCC and Clang do).

In the line

int Matrix2D[n][n];

n can reach 1999, so the array can hold nearly 2000*2000 = 4 million elements. Each element takes sizeof(int) = 4 bytes on Linux, so the array grows toward 16 MB, and it is allocated on the stack. That exceeds the stack limit, causing a stack overflow. Your ulimit output shows a stack size of 8192 KiB (8 MiB = 8,388,608 bytes), and the crash point fits: at n = 1448 the array alone is 8,386,816 bytes, leaving less than 2 KiB of stack for everything else in the program.

The reason it does not crash on macOS could be that the stack there is configured with a higher limit, or that the compiler implements variable-length arrays differently, so the program never touches the pages beyond the limit, or touches them only in a way that happens not to fault. These are implementation details of the compiler and operating system.

The amount of RAM actually installed in the computer is irrelevant here. What counts is the maximum stack size configured by the operating system.

If you need more memory than the stack permits, use the heap instead: allocate with std::make_unique, operator new, or std::malloc. Most standard containers, such as std::vector, store their elements on the heap even when the container object itself lives on the stack. Beware that std::array does not: it stores its elements in place, so a large std::array on the stack overflows just like a plain array.
