This topic has been moved to Using SLAMM (http://warrenpinnacle.com/SLAMMFORUM/index.php?board=3).
Several years ago we moved SLAMM data management from the hard drive into memory. The limiting factor then becomes the 2 GB per-application memory limit of 32-bit Windows. Furthermore, SLAMM currently requires contiguous memory blocks for its large arrays. We have also performed some cell optimization so that less information is tracked for open-water and high-elevation cells. In practice, we have found that roughly 50 million cells is the most we can model without causing a crash.
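As a rough back-of-the-envelope check, the 2 GB ceiling is consistent with a cell count on that order. The bytes-per-cell figure below is an assumed illustration, not SLAMM's actual per-cell record size:

```python
# Rough estimate of how many cells fit under a 2 GB per-process limit.
# The 40 bytes/cell figure is a hypothetical average footprint,
# not SLAMM's actual record size.
LIMIT_BYTES = 2 * 1024**3      # 32-bit Windows per-application limit
BYTES_PER_CELL = 40            # assumed average footprint per cell

max_cells = LIMIT_BYTES // BYTES_PER_CELL
print(f"about {max_cells / 1e6:.0f} million cells")
```

Under that assumption the ceiling works out to tens of millions of cells, which matches what we see in practice.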
However, we keep running into this memory limitation as input data becomes available at higher resolutions. For this reason, I'm currently writing a memory manager for 64-bit operating systems that far surpasses the 2 GB limitation. The problem is that the development platform in which this software is written (Delphi 2009) has no 64-bit implementation. So we will either port the whole application to 64 bit or write a 64-bit memory-management DLL to work around the problem.
You may also run the model using the hard drive for memory management, in which case this limitation is effectively removed. However, we have found that this increases run time by roughly an order of magnitude. One of my hard drives also crashed shortly after a 36-hour run, which makes me think this option puts considerable stress on the drive... Perhaps a flash drive of 10 GB or so would be an option, but we have not tried this yet.
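To illustrate the general idea of disk-backed storage (this is a generic sketch, not SLAMM's actual Delphi implementation; the file name and record layout are made up), a memory-mapped file lets an array live on disk while the operating system pages pieces into RAM on demand:

```python
import mmap
import os
import struct
import tempfile

# Sketch of a disk-backed cell array: values live in a file, and the OS
# pages them in and out of RAM on demand. Names and layout are illustrative.
CELLS = 1_000_000            # small demo size; a real run would be far larger
RECORD = struct.Struct("d")  # one 8-byte float per cell

path = os.path.join(tempfile.gettempdir(), "disk_backed_demo.bin")
with open(path, "wb") as f:
    f.truncate(CELLS * RECORD.size)   # reserve space without filling RAM

with open(path, "r+b") as f:
    buf = mmap.mmap(f.fileno(), 0)              # map the whole file
    RECORD.pack_into(buf, 42 * RECORD.size, 3.14)   # write cell 42
    value, = RECORD.unpack_from(buf, 42 * RECORD.size)
    buf.close()

print(value)
os.remove(path)
```

Every access that misses the page cache becomes disk I/O, which is why this approach is so much slower than keeping the arrays in memory, and why it generates heavy read/write traffic on the drive.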