Is there a way to prevent std::vector from de-allocating memory dynamically when a resize occurs?

I would like to use std::vector in place of raw C arrays, in order to avoid cluttering up my code.

However, one drawback of std::vector’s dynamic resizing is that the underlying data cannot be relied on to stay at the same location in memory. For example:

Say I have three vectors of chars stored in 9 bytes of memory:

A[0] A[1] A[2] | B[0] B[1] B[2] | C[0] C[1] C[2]

Given that B sits snugly between A and C, it’s reasonable to assume that if B grows, its entire contents will move to a different location in memory, rather than std::vector storing the data non-sequentially. I’m happy that std::vector doesn’t use non-sequential storage, because it means the underlying data structure remains a plain array, which can be maximally efficient.

The problem with how std::vector moves B in the above scenario is what it does after the move: it frees the memory where B’s three items were previously located. This caused a bug in my code, as I held pointers into the vector elsewhere. Those pointers became invalid after a resize, once the memory was freed, and a segmentation fault followed.
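A minimal sketch of the hazard described above (the function name is hypothetical): a pointer taken before the vector grows past its capacity dangles afterwards, because the growth reallocates the storage.

```cpp
#include <cstddef>
#include <vector>

// Growing a std::vector past its capacity reallocates its storage,
// so any pointer taken before the growth is left dangling.
bool grew_past_capacity(std::vector<char>& v) {
    const char* before = v.data();       // pointer into the current storage
    std::size_t old_cap = v.capacity();

    v.resize(old_cap + 1);               // new size > capacity: forces a reallocation

    // 'before' now points at freed memory; dereferencing it is
    // undefined behavior (the segfault described above).
    (void)before;
    return v.capacity() > old_cap;
}
```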

My question: is there a way to prevent std::vector from freeing memory when it resizes? Even if the memory overhead increases as a result, that would be fine for my use-case.

Alternatively, are there any other C++ data-structures you would recommend that do this?

Solution:

Aside from calling .reserve with a value at least as large as the vector will ever grow, std::vector won’t do this for you, so if there is no known maximum possible size, std::vector is not an option.
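To illustrate the .reserve escape hatch (the function name and bound are hypothetical): reserving an upper bound up front performs one allocation, and the standard guarantees no reallocation until the size exceeds that capacity, so pointers into the vector stay valid.

```cpp
#include <cstddef>
#include <vector>

// With capacity reserved up front, growth up to that bound never
// reallocates, so data() and element addresses stay stable.
bool stable_after_reserve(std::size_t max_elems) {
    std::vector<char> v;
    v.reserve(max_elems);                // single allocation, never exceeded below

    v.push_back('a');
    const char* first = &v[0];           // pointer we want to keep valid

    for (std::size_t i = 1; i < max_elems; ++i)
        v.push_back('x');                // size never exceeds capacity: no move

    return first == &v[0];               // still points at the same 'a'
}
```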

It sounds like you want std::deque here. It has all the big-O performance guarantees of std::vector (except that appends and pops at the end are true O(1) rather than amortized O(1)), and as long as it only grows and never shrinks, a given element is never moved to new memory: under the hood it allocates additional non-adjacent blocks of memory rather than reallocating a single contiguous block.

It pays for this with higher baseline memory overhead (though nowhere near what std::list pays as it grows) and higher fixed overhead for all operations. For example, indexing a std::vector only requires chasing a single pointer, while indexing a std::deque might take three such pointer-chasing operations, but it is a fixed three, unlike std::list, where the cost scales with collection size. It’s usually fast enough, and it would give you the memory-location stability guarantees you’re looking for.
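The stability property above can be sketched directly (the function name is hypothetical). Note that std::deque guarantees that pointers and references to existing elements survive insertion at either end, even though iterators may be invalidated.

```cpp
#include <deque>

// Appending to a std::deque may allocate new blocks, but it never
// relocates elements already stored, so this pointer stays valid.
bool deque_element_stable(int n_appends) {
    std::deque<char> d;
    d.push_back('a');
    const char* p = &d.front();          // pointer into the first block

    for (int i = 0; i < n_appends; ++i)
        d.push_back('x');                // grows by adding blocks, not moving

    return p == &d.front();              // same address, same 'a'
}
```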
