OpenStudy (anonymous):

If, in C++, I were to create a large number of very large arrays and never free them with delete[] (but make sure there are no other memory leaks), would there be any negative consequences? That is, would I possibly run out of heap memory and get segfaults if I kept making huge arrays? Thanks!
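A minimal sketch of the scenario being asked about (the array count and sizes here are made up for illustration): each new[] allocation comes from the heap, not the stack, and nothing is ever released.

#include <cstddef>
#include <vector>

int main() {
    std::vector<double*> arrays;              // keep the pointers, so nothing is technically "lost"
    const std::size_t elems = 100000000;      // ~800 MB per array of doubles (arbitrary size)

    for (int i = 0; i < 100; ++i) {
        arrays.push_back(new double[elems]);  // throws std::bad_alloc once the heap/address space is exhausted
    }
    // no delete[] anywhere: the memory stays allocated until the process exits
    return 0;
}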

OpenStudy (osanseviero):

I think you would...and it would become really slow...but that depends on how big the arrays are

OpenStudy (anonymous):

Don't think as much about the entire machine as about the processor. When a class, its data, and the function currently operating on it all fit within the CPU cache, you get an automatic boost in speed. When you allocate large numbers of large arrays, most of your memory accesses go out to main memory instead of being served from the CPU's instruction and data caches. That may be unavoidable if you are multithreaded anyway, but if your goal is speed you want small, usable chunks rather than big allocations.

One other issue is that when you reach the 2 GB limit of a 32-bit program (or the 4 GB limit of a 32-bit program run under a 64-bit OS), you risk crashing the software (segfault). Even before that limit, you always risk being paged out to disk so other software on the machine can run. Smaller allocations mean less paging to disk (virtual memory) and faster operation.
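To make the failure mode and the "small chunks" point concrete, here is a hedged sketch (not from the original post, sizes are assumptions): allocating with std::nothrow lets you detect exhaustion instead of crashing, and touching the data in roughly cache-sized blocks keeps the working set close to the CPU.

#include <cstddef>
#include <iostream>
#include <new>

int main() {
    const std::size_t big = 500000000;                  // assumed size: ~2 GB of ints
    int* p = new (std::nothrow) int[big];               // returns nullptr on failure instead of throwing
    if (!p) {
        std::cerr << "allocation failed: heap or address space exhausted\n";
        return 1;
    }

    const std::size_t block = 64 * 1024 / sizeof(int);  // process in ~64 KB chunks (roughly L1-cache sized)
    for (std::size_t start = 0; start < big; start += block) {
        const std::size_t end = (start + block < big) ? start + block : big;
        for (std::size_t i = start; i < end; ++i) {
            p[i] = 0;                                    // work through one cache-friendly block before moving on
        }
    }

    delete[] p;                                          // release the allocation when done
    return 0;
}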

OpenStudy (stormfire1):

Nice answer...me likey.
