Maximum size of a char array in C


What is the maximum size of an array in C?

C99 5.2.4.1 "Translation limits" minimum size

The implementation shall be able to translate and execute at least one program that contains at least one instance of every one of the following limits: 13)

  • 65535 bytes in an object (in a hosted environment only)

13) Implementations should avoid imposing fixed translation limits whenever possible.

This suggests that a conforming implementation could refuse to compile an object (which includes arrays) larger than 65535 bytes.
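As a sketch, the only object size a strictly conforming hosted program can therefore count on is quite small:

/* 65535 bytes is the only portable floor; a conforming
 * implementation may reject anything larger at translation time. */
static unsigned char buf[65535];

int main(void) {
    return buf[0]; /* static storage is zero-initialized, so 0 */
}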

PTRDIFF_MAX seems to be a practical limit for static array objects

The C99 standard 6.5.6 Additive operators says:

9 When two pointers are subtracted, both shall point to elements of the same array object, or one past the last element of the array object; the result is the difference of the subscripts of the two array elements. The size of the result is implementation-defined, and its type (a signed integer type) is ptrdiff_t defined in the <stddef.h> header. If the result is not representable in an object of that type, the behavior is undefined.

Which implies to me that arrays larger than PTRDIFF_MAX bytes are allowed in theory, but then you cannot portably take the difference of pointers to their elements.
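Here is a hypothetical sketch of that failure mode, assuming the allocator were to grant an object one byte larger than PTRDIFF_MAX (in practice, allocators such as glibc's typically refuse such requests):

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    /* One byte more than PTRDIFF_MAX. */
    size_t n = (size_t)PTRDIFF_MAX + 1;
    char *p = malloc(n);
    if (p == NULL) {
        /* Common outcome: the allocator refuses, which keeps
         * pointer subtraction representable in ptrdiff_t. */
        puts("allocation refused");
        return 0;
    }
    /* If the allocation did succeed, this difference is not
     * representable in ptrdiff_t, so the behavior is undefined. */
    ptrdiff_t d = (p + n) - p;
    printf("%td\n", d);
    free(p);
    return 0;
}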

So perhaps for this reason, GCC just seems to limit you to PTRDIFF_MAX bytes. This is also mentioned at: Why is the maximum size of an array "too large"?

I have empirically verified this with main.c:

#include <stdint.h>
/* X is supplied on the command line via -DX=... */
uint8_t a[(X)];

int main(void) {
    return 0;
}

and then on Ubuntu 17.10:

$ arm-linux-gnueabi-gcc --version
arm-linux-gnueabi-gcc (Ubuntu/Linaro 7.2.0-6ubuntu1) 7.2.0
Copyright (C) 2017 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

$ printf '
> #include <stdint.h>
> PTRDIFF_MAX
> SIZE_MAX
> ' | arm-linux-gnueabi-cpp | tail -n2
(2147483647)
(4294967295U)
$ # PTRDIFF_MAX == 2147483647 == 2^31 - 1
$
$ # 2lu << 30 == 2^31 == PTRDIFF_MAX + 1
$ arm-linux-gnueabi-gcc -std=c99 -DX='(2lu << 30)' main.c
main.c:3:9: error: size of array ‘a’ is too large
 uint8_t a[(X)];
         ^
$
$ # PTRDIFF_MAX
$ arm-linux-gnueabi-gcc -std=c99 -DX='(2lu << 30) - 1lu' main.c
$


I understand that hardware will limit the amount of memory allocated during program execution. However, my question is without regard to hardware. Assuming that there was no limit to the amount of memory, would there be no limit to the array?


A 64-bit machine could theoretically address a maximum of 2^64 bytes of memory.


I was looking for a way to determine the maximum size of an array. This question seems to ask the same, so I want to share my findings.

First of all, C does not provide any way to determine, at compile time, the maximum number of elements that can be allocated in an array, because that depends on the memory available on the machine where the program runs.

On the other hand, I have found that the memory allocation functions (calloc() and malloc()) let you allocate larger arrays. Moreover, these functions let you handle runtime memory allocation errors.
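As a sketch of that error handling (the element count here is an arbitrary example):

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    size_t n = 100000000; /* arbitrary large element count */

    /* calloc also checks that n * sizeof *a does not overflow. */
    double *a = calloc(n, sizeof *a);
    if (a == NULL) {
        /* Unlike a failed static array, a failed runtime
         * allocation can be detected and handled. */
        fprintf(stderr, "could not allocate %zu elements\n", n);
        return EXIT_FAILURE;
    }
    a[n - 1] = 1.0; /* the whole array is usable */
    free(a);
    return EXIT_SUCCESS;
}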

Hope that helps.


The size of the pointer will limit the memory you are able to access. Even if the hardware offered support for unlimited memory, if the largest pointer type you can use is 64 bits, you would only be able to address 2^64 bytes of memory.


Without regard for memory, the maximum size of an array is limited by the type of integer used to index the array.
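A small program, as a sketch, makes both of these implementation limits visible:

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* SIZE_MAX bounds what sizeof can report and what size_t can
     * index; PTRDIFF_MAX bounds portable pointer subtraction. */
    printf("PTRDIFF_MAX = %td\n", (ptrdiff_t)PTRDIFF_MAX);
    printf("SIZE_MAX    = %zu\n", (size_t)SIZE_MAX);
    return 0;
}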




