Any way to reserve but not commit memory in linux?

Windows has VirtualAlloc, which allows you to reserve a contiguous region of address space, but not actually use any physical memory. Later when you want to use it (or part of it) you call VirtualAllo…
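A rough Linux-side sketch, assuming default overcommit settings: an anonymous private mapping reserves the address range, and physical pages are only committed when first touched, which approximates VirtualAlloc's reserve/commit split (the stricter C equivalent is mmap with PROT_NONE followed by mprotect, which Python's mmap module does not expose).

    import mmap

    # Reserve 1 GiB of address space with an anonymous private mapping.
    # No physical pages back this range until they are actually written.
    size = 1 << 30
    buf = mmap.mmap(-1, size, flags=mmap.MAP_PRIVATE | mmap.MAP_ANONYMOUS)

    # "Committing" happens implicitly on first touch:
    buf[0:4096] = b"\x00" * 4096   # one physical page now backs this range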


python - Repeat NumPy array without replicating data?

I'd like to create a 1D NumPy array that would consist of 1000 back-to-back repetitions of another 1D array, without replicating the data 1000 times. Is it possible? If it helps, I intend to treat bot…
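One commonly suggested approach, sketched here with a small placeholder array: numpy.broadcast_to builds a strided view whose repeat axis has stride 0, so the 1000 "copies" all alias a single buffer. A true 1D result still requires a copy, but reductions and most elementwise work can run on the 2D view directly.

    import numpy as np

    base = np.arange(5)

    # A read-only (1000, 5) view that behaves like 1000 back-to-back copies;
    # stride 0 along the first axis means no data is duplicated.
    tiled = np.broadcast_to(base, (1000, base.size))

    print(tiled.strides)     # first stride is 0: all rows share one buffer
    total = tiled.sum()      # reductions work directly on the view
    flat = tiled.reshape(-1) # flattening to a real 1D array forces a copy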


c - mmap with /dev/zero

Say I allocate a big chunk of memory (40MB) with mmap using /dev/zero as follows: fd=open("/dev/zero", O_RDWR); a=mmap(0, 4096e4, PROT_READ | PROT_WRITE, MAP_PRIVATE | MAP_FILE, fd, 0); What I understand…
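For comparison, the same demand-paged zero memory can be obtained without opening /dev/zero at all; a small Python sketch that wraps an anonymous mapping in a NumPy array without copying:

    import mmap
    import numpy as np

    # ~40 MB of zero-initialised, demand-paged memory; MAP_ANONYMOUS is the
    # modern stand-in for a private mapping of /dev/zero.
    n_bytes = 40 * 1024 * 1024
    buf = mmap.mmap(-1, n_bytes, flags=mmap.MAP_PRIVATE | mmap.MAP_ANONYMOUS)

    # View it through NumPy without copying; pages fault in on first write.
    arr = np.frombuffer(buf, dtype=np.uint8)
    arr[:4096] = 1          # touches only the first page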


NumPy vs. multiprocessing and mmap

I am using Python's multiprocessing module to process large numpy arrays in parallel. The arrays are memory-mapped using numpy.load(mmap_mode='r') in the master process. After that, multiprocessing.P…
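One pattern often recommended for this setup, sketched below with a hypothetical file name big_array.npy and an invented chunk_sum helper: pass only index ranges to the pool and let every worker re-open the memmap itself, so no large array travels through pickling.

    import numpy as np
    from multiprocessing import Pool

    PATH = "big_array.npy"   # hypothetical file produced elsewhere with np.save

    def chunk_sum(bounds):
        start, stop = bounds
        # Each worker maps the file itself instead of receiving the parent's
        # array through pickling, so only the touched pages are read.
        arr = np.load(PATH, mmap_mode="r")
        return float(arr[start:stop].sum())

    if __name__ == "__main__":
        n = np.load(PATH, mmap_mode="r").shape[0]
        bounds = [(i, min(i + 100_000, n)) for i in range(0, n, 100_000)]
        with Pool() as pool:
            print(sum(pool.map(chunk_sum, bounds)))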


c - Faster way to move memory page than mremap()?

I've been experimenting with mremap(). I'd like to be able to move virtual memory pages around at high speeds. At least higher speeds than copying them. I have some ideas for algorithms which could ma…


python - Techniques for working with large Numpy arrays?

There are times when you have to perform many intermediate operations on one, or more, large Numpy arrays. This can quickly result in MemoryErrors. In my research so far, I have found that Pickling (P…
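Alongside pickling to disk, one low-effort technique worth noting (a small sketch with placeholder arrays, not specific to the question's data): ufuncs with an explicit out= argument avoid the full-size temporaries that ordinary expressions create.

    import numpy as np

    a = np.random.rand(10_000, 1_000)
    b = np.random.rand(10_000, 1_000)

    # a = 3 * a + b would build two array-sized temporaries;
    # in-place ufuncs reuse the existing buffers instead.
    np.multiply(a, 3, out=a)
    np.add(a, b, out=a)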


Working with big data in python and numpy, not enough ram, how to save partial results on disc?

I am trying to implement algorithms for 1000-dimensional data with 200k+ datapoints in Python. I want to use numpy, scipy, sklearn, networkx and other useful libraries. I want to perform operations s…
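A minimal out-of-core sketch along those lines, assuming a 200k x 1000 float32 matrix, a hypothetical file name features.dat, and random data standing in for the real per-block computation: numpy.memmap keeps the full array on disk while only one block sits in RAM at a time.

    import numpy as np

    shape = (200_000, 1_000)                      # ~800 MB as float32
    out = np.memmap("features.dat", dtype=np.float32, mode="w+", shape=shape)

    block = 10_000
    for start in range(0, shape[0], block):
        stop = min(start + block, shape[0])
        # placeholder for the real per-block computation
        out[start:stop] = np.random.rand(stop - start, shape[1]).astype(np.float32)
    out.flush()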



How to use contiguous memory in linux kernel?

I found that physical memory is split into ranks as follows (Memory Interleaving): rank0: [0-512KB][2048KB-2560KB][4096KB-4608KB]... rank1: [512KB-1024KB][2560KB-3072KB][4608KB-5120KB]...…


python - Resizing numpy.memmap arrays

I'm working with a bunch of large numpy arrays, and as these started to chew up too much memory lately, I wanted to replace them with numpy.memmap instances. The problem is, now and then I have to res…
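numpy.memmap has no resize method, but the file behind it can be extended and re-mapped; a sketch of that idea, where grow_memmap and data.dat are invented names for illustration:

    import numpy as np

    def grow_memmap(path, old_shape, extra_rows, dtype=np.float64):
        # A memmap cannot grow in place, so extend the file on disk with
        # zero bytes and map it again under the larger shape.
        new_shape = (old_shape[0] + extra_rows,) + tuple(old_shape[1:])
        new_bytes = int(np.prod(new_shape)) * np.dtype(dtype).itemsize
        with open(path, "r+b") as f:
            f.truncate(new_bytes)
        return np.memmap(path, dtype=dtype, mode="r+", shape=new_shape)

    # usage
    arr = np.memmap("data.dat", dtype=np.float64, mode="w+", shape=(100, 8))
    arr[:] = 1.0
    arr.flush()
    arr = grow_memmap("data.dat", (100, 8), extra_rows=50)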


python - Efficient dot products of large memory-mapped arrays

I'm working with some rather large, dense numpy float arrays that currently reside on disk in PyTables CArrays. I need to be able to perform efficient dot products using these arrays, for example C=…
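A common fallback when the operands will not fit in RAM, sketched here as a hypothetical blocked_dot helper: stream block-sized slices into memory so each partial product still goes through the fast in-memory BLAS path, while the operands and the result stay on disk.

    import numpy as np

    def blocked_dot(A, B, out, block=2048):
        # A is (m, k), B is (k, n); A, B and out may be np.memmap arrays or
        # PyTables CArrays. Only block-sized slices are read into RAM at once.
        m, n = A.shape[0], B.shape[1]
        for i in range(0, m, block):
            a = np.asarray(A[i:i + block])            # slice of A in memory
            for j in range(0, n, block):
                b = np.asarray(B[:, j:j + block])     # slice of B in memory
                out[i:i + block, j:j + block] = a @ b
        return out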



