SLAMM 6.0.1 memory error

Started by caroline, June 02, 2010, 01:00:08 PM


caroline

Hi Jonathan,

I am happy to report that I can run my 21,177,910-cell NC map (1.24 GB) with the updated version (it would crash in SLAMM 6, although it ran in SLAMM 5). I unchecked the soil saturation option and I no longer have streaks, which was a major problem before. :) Thanks.

I can't get it to write to Word, however (same error as others), even using the workaround mentioned in this forum. It also tells me I am out of memory when I attempt to set the map attributes, so that part doesn't work. Is this step necessary? My output looks reasonable. Looking forward to the 64-bit compatible version so it doesn't take so long to run.

Caroline

Jonathan S. Clough

Great!  I'm glad to hear that the new version is useful to you.

The set-attributes step is usually important for QA, setting sub-site variables, etc. I'm surprised that it does not work, and I'm not sure why it would fail when running the model does not. If you want to upload your data files to FTP, I can try to debug at this end...

With regard to writing to Word, please tell me exactly what error you get. You can only copy maps of a certain size into Word; that is a clipboard/Word limitation. Try running SLAMM with 12.5% "initial zoom" and you may find that it will write the maps to Word for you. You will only get one pixel shown per 64 pixels of your map, but you can view 1-to-1 results in GIS. Or, if you can get into "set map attributes", you can produce "output sites" in which sub-sections of the map are output at 100% zoom.
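To illustrate the arithmetic behind that zoom suggestion, here is a rough sketch (plain Python, not SLAMM code; the 21,177,910-cell count is taken from Caroline's map above):

    # A 12.5% linear zoom keeps 1 pixel in 8 along each axis, i.e. 1 in 64 overall.
    def exported_pixels(n_cells, zoom_fraction):
        """Approximate pixel count of the exported map image at a given linear zoom."""
        return int(n_cells * zoom_fraction ** 2)

    print(exported_pixels(21_177_910, 0.125))   # ~331,000 pixels -- small enough to paste into Word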

By the way, the 64-bit version is not currently in production: the software was nearly completely ported to 64-bit but then crashed frequently while providing no information about the causes of the crashes. The platform I was working with (Lazarus & Free Pascal) did not yet have a strong 64-bit debugger, and the problem may have been with the (in-progress and open-source) development platform rather than with the SLAMM code itself... The upshot is, I burned too much time trying to get it to work for a particular project and it hasn't yet borne fruit. If someone wants to fund further work on this, I can either see whether the Free Pascal compiler has improved in the interim or try to port to a different 64-bit compiler.

That being said, the (eventual) 64-bit version will not be appreciably faster but will allow for nearly unlimited memory use.

caroline

Hi Jonathan,

I've attempted to recreate my errors using the same datasets, and sometimes I get an out-of-memory error when setting attributes and/or executing the program, and sometimes I don't! I noticed that when I set up the count first and then set the attributes, it worked. I think it may have to do with the order in which I select the choices on the opening form?

In Word I get the following error: 'Runtime error copying map to Microsoft Word for year 0'. I tried to paste a screenshot into the Word document (per http://warrenpinnacle.com/SLAMMFORUM/index.php?topic=87.0), but when I say yes to re-paste it, it creates a new Word document, so that doesn't work. So I tried your suggestion to run SLAMM with the initial zoom set to 12.5% and it worked! :)

I imported SLAMM 5 data and got these errors. NWI: "does not include 'description' identifier". DEM, dike, slp, and site files: "tide range has been collapsed in v6, reading tide inland as primary tide input". I am assuming these are not a problem, but could they be an issue?

Looking more closely at my output data, although the long streaks in my non-tidal swamp are gone (when I uncheck the soil saturation selection), I now have some shorter streaks for undeveloped land, estuarine beach, and tidal flats that weren't there before.

The first time I ran this set of data (in 6.0.1) it took most of the day, but the second and third times it only took an hour or two. Odd. Thanks for the 64-bit update; 32-bit works fine for now, since I can run SLAMM in the background while I do other things. This is the western section of my study area. I ran the entire area in SLAMM 5 and it took a week or so, but SLAMM 6 crashed, and so does SLAMM 6.0.1. When setting map attributes I get an "Error creating map in memory, write map to disk instead?" prompt; I answered yes, it runs for a while, and then I get a 'not enough storage is available to process this command' error (90,046,282 cells, 5.28 GB memory utilization). When I try to run it I get a 'range check' error. I am sure it must be way too big, but it is interesting that it ran in SLAMM 5.

I can send you the input ASCII files as a zipped file (135,557 KB) if you'd like, as I can't figure out how to get an FTP link to work.

thanks,

Caroline

caroline

I've been running SLAMM 6.0.1 on my second data set (27,100,961 cells, 1.59 GB memory) and again find some interesting SLAMM behavior. Again I couldn't set the attributes (memory error), but I could run SLAMM and the output looked okay. However, when I went to run it a second time (I modified my NWI data, but nothing else was different; the ASCII file is the same size), I kept getting memory errors. Then I changed the tracking option from blank to 'do not track high elevations and open water' and it ran (and set attributes also worked)! The cell count and memory were then reduced to 9,396,996 cells and 0.55 GB.

Caroline

Jonathan S. Clough

Are you running a 32-bit or 64-bit OS? To use more than 2 GB at one time you really need a 64-bit OS, preferably with more than 4 GB of memory as well...

It also depends on what other software is resident in memory, in my experience.

But we have had much more success on a 64-bit OS, even though the software is 32-bit, because it makes much more memory available to an individual application (up to 4 GB).

Good luck -- Jonathan

caroline

I'm running a 32-bit XP OS. Thanks for the info.

Justin Saarinen

Hello,

We are having trouble with out-of-memory errors in version 6.0.1 on Windows XP. The pilot datasets are quite a bit smaller than the successful examples described in this thread, but still large (~1:48,000 scale at 1-meter resolution, with elevations over 50 m masked out and limited wetland class coverage). Our first attempts are to:

1. Reduce the spatial extent of the analysis (see the clipping sketch after this list)
2. Increase the virtual memory on the D: drive (non-OS) to 5-10 GB.
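For step 1, a minimal clipping sketch of what we are trying (plain NumPy, not part of SLAMM; it assumes standard ESRI ASCII grid headers with xllcorner/yllcorner, and the row/column window in the usage line is just a placeholder):

    import numpy as np

    def clip_ascii_grid(src, dst, row0, row1, col0, col1):
        """Write rows [row0:row1) and columns [col0:col1) of an ESRI ASCII grid to dst."""
        with open(src) as f:
            header = {}
            for _ in range(6):                 # ncols, nrows, xllcorner, yllcorner, cellsize, NODATA_value
                key, value = f.readline().split()
                header[key.lower()] = float(value)
            data = np.loadtxt(f)               # the remaining lines are the grid, top row first
        sub = data[row0:row1, col0:col1]
        cell = header["cellsize"]
        with open(dst, "w") as f:
            f.write(f"ncols {sub.shape[1]}\n")
            f.write(f"nrows {sub.shape[0]}\n")
            f.write(f"xllcorner {header['xllcorner'] + col0 * cell}\n")
            # rows are stored north to south, so dropping rows below row1 raises the lower-left corner
            f.write(f"yllcorner {header['yllcorner'] + (header['nrows'] - row1) * cell}\n")
            f.write(f"cellsize {cell}\n")
            f.write(f"NODATA_value {int(header['nodata_value'])}\n")
            np.savetxt(f, sub, fmt="%.3f")

    # e.g. clip_ascii_grid("dem.asc", "dem_west.asc", 0, 5000, 0, 5000)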

I do not yet understand how the "Do Not Track Blank Cells" option makes processing more efficient. Also, can you gain efficiency by populating the "Sites" parameter?

Any tips for increasing efficiency of processing are appreciated.  :)

Justin A. Saarinen, GISP
Geographer
USGS-Western Fisheries Research Center
at Hatfield Marine Science Center
2111 SE Marine Science Drive
Newport, OR 97365
Phone: 541-867-5022
FAX: 541-867-4049
EMAIL: jsaarinen@usgs.gov

Jonathan S. Clough

OK, given that we have had several reports of memory errors, and the fact that we're not having significant problems running fairly large sites here, I compared the latest release of the open-source code against our in-house code to ensure that we hadn't fixed some subtle memory problem since the last release. However, there have been no changes pertaining to memory management in our code. (Our code is evolving because we are adding an integrated uncertainty component.)

So my observations are as follows. If you run the model on a 64-bit OS you should have 4 GB of RAM to work with. It's only 32-bit software, sadly, so 4 GB is the absolute maximum you'll have. You can use the file-setup window to estimate the memory requirements of your project. Not all memory requirements of the software are listed there, but the big map is. So if your footprint is above 3.5-3.8 GB, it's not going to run in memory.
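As a rough cross-check (this is not the file-setup window's exact formula), the cell counts and GB figures reported earlier in this thread work out to roughly 63 bytes per tracked cell, so a back-of-the-envelope estimate looks like:

    BYTES_PER_CELL = 63          # approximation derived from the cell counts and GB figures quoted in this thread
    LIMIT_GB = 3.5               # conservative ceiling for a 32-bit process on a 64-bit OS

    def estimated_gb(n_cells):
        return n_cells * BYTES_PER_CELL / 2**30

    for cells in (21_177_910, 27_100_961, 90_046_282):
        gb = estimated_gb(cells)
        verdict = "should fit" if gb <= LIMIT_GB else "too big to run in memory"
        print(f"{cells:>11,d} cells -> ~{gb:.2f} GB ({verdict})")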

Not tracking blank cells simply compresses the map in memory so that the model does not track all cell attributes for irrelevant (blank) cells. It takes a bit more housekeeping, however, so for smaller sites we don't usually use it. The same goes for not tracking high-elevation and open-water cells.
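Conceptually, the saving comes from keeping full attribute records only for the tracked cells plus a small index into them. A sketch (not SLAMM's internal data structure; the category codes are placeholders, not SLAMM's actual values):

    import numpy as np

    BLANK, OPEN_WATER = -9999, 19            # placeholder category codes

    def tracked_cell_index(categories, skip=(BLANK, OPEN_WATER)):
        """Flat indices of the cells that need full attribute records."""
        return np.flatnonzero(~np.isin(categories, skip))

    cats = np.random.choice([BLANK, OPEN_WATER, 5, 7, 8], size=(2000, 2500))
    idx = tracked_cell_index(cats)
    print(f"tracking {idx.size:,} of {cats.size:,} cells ({100 * idx.size / cats.size:.0f}%)")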

I don't think populating the sites parameter has any effect on the required memory footprint of the software.

Sorry that the port to 64-bit has hit so many glitches, resulting in these continuing problems for our users. The Delphi development platform is not 64-bit yet (if ever), and trying to port the software to another platform resulted in random crashes (and that development platform did not have an integrated 64-bit debugger to diagnose them). So we're stuck with 32-bit for the moment, until we have the resources to port the model to a more modern development platform.

Good luck -- Jonathan


Justin Saarinen

The steps above resulted in completed simulations. I gather that raising the virtual memory past 4 GB will not do anything in the near future. By footprint, do you mean the "Memory Utilization in GB"? I am new to this and don't quite understand all of the jargon yet, so thanks for your patience.

Thanks for addressing my questions and for the insight. Regarding irrelevant (blank) cells: I would expect some landscape with no wetland designation to become wetland with SLR. By not tracking blank raster cells, is this possibility removed, unless the cell has a SLAMM code >= 2?

Hope the 32-bit to 64-bit port smooths out soon.


-Justin

Justin A. Saarinen, GISP
Geographer
USGS-Western Fisheries Research Center
at Hatfield Marine Science Center
2111 SE Marine Science Drive
Newport, OR 97365
Phone: 541-867-5022
FAX: 541-867-4049
EMAIL: jsaarinen@usgs.gov

Jonathan S. Clough

Blank cells are completely ignored in all cases. Non-wetlands must be designated as dry land to be subject to conversion.