Messages - Jonathan S. Clough

#301
Using SLAMM / Re: time step
September 28, 2009, 06:51:14 AM
Hi Robert:

I think that the discrepancy in "time-zero" results gives you a partial sense of the uncertainty of the model, even in the current condition.  There will almost always be some discrepancy due to horizontal off-sets between the land-cover and elevation data-sets, general data uncertainty, or other local conditions that make a portion of your site not conform perfectly to the "conceptual model." 

Five percent seems pretty good, in my opinion.

-- Jonathan
#302
Using SLAMM / Re: time step
September 22, 2009, 12:20:03 PM
Robert wrote:
Quote: I believe you had mentioned in our June phone call that you hadn't found much difference between a 25-year and a 5- or 10-year time step, so I had been modeling in 25 years for convenience.  However, I just tested 5 years and found significant differences.  Areas that were forecast as SLAMM cat 7 or 8 at a 25-year time step were most often tidal flat or open water at a 5-year step.

Would this relate to the fact that the model only converts one class per time step?  I have a fairly small tide range (2ft) so cells would not stay in a section of the inundation model for long.

Robert, you are right on.  When I mentioned that I hadn't found much difference in my last time-step analysis this was in a system in which there were essentially no changes at time-zero. 

Making sure the model behaves appropriately at time-zero is a very important step.  In some cases, where there is a horizontal DEM-to-NWI offset, you will have open-water or tidal-flat elevations in a location that is coded as dry land.  In this case, the land type will move from dry land to transitional marsh to salt marsh to tidal flat to open water in successive time steps, because in the SLAMM decision tree each transition occurs only once per time step.

In a case where the time-zero results look pretty good, and SLR is not so fast as to cause multiple transitions per time-step, I did not see much sensitivity to the time-step choice last time I checked.
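The one-transition-per-step behavior above can be sketched as follows; this is a simplified illustration (the class chain and a 75-year horizon are assumptions for the example, not actual SLAMM code):

```python
# Simplified sketch of one-transition-per-time-step behavior; the class
# chain below is illustrative, not actual SLAMM code.
CHAIN = ["dry land", "transitional marsh", "salt marsh", "tidal flat", "open water"]

def final_class(start_class, n_steps):
    """Class reached after n_steps, applying at most one transition per step."""
    i = min(CHAIN.index(start_class) + n_steps, len(CHAIN) - 1)
    return CHAIN[i]

# Over a 75-year horizon, a cell already below the open-water threshold:
print(final_class("dry land", 75 // 25))  # 3 steps at 25 yr -> "tidal flat"
print(final_class("dry land", 75 // 5))   # 15 steps at 5 yr -> "open water"
```

This reproduces the behavior Robert observed: a coarser time step allows fewer transitions, so cells get "stuck" earlier in the conversion chain.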

I hope this helps to some degree.
#303
Using SLAMM / Re: protection and nwi photo data options
September 10, 2009, 03:36:41 PM
OK, thanks for pointing out that bug.  I'll fix it for the next version which I'm working on now.

All spreadsheet output is indeed in hectares.

Dry land, both developed and undeveloped, is predicted to be subject to soil saturation from the rise in the fresh water table.  This is not something that building a dike or seawall is predicted to help.  I suppose someone could bring in fill or build drainage ditches to protect that area, but that is not part of the model assumptions at this time.

By the way, if I'm a bit slow in responding, feel free to email me directly!  -- Jonathan
#304
Using SLAMM / Re: protection and nwi photo data options
September 02, 2009, 09:38:19 AM
If both boxes are checked, the software is supposed to first run "protect" and then run "don't protect," saving results and maps for both.  However, there may be a bug there, because we almost always run only one at a time.  I'll look into it.

-- Jonathan
#305
Using SLAMM / Re: Maximum number of cells
September 02, 2009, 09:36:38 AM
Here's the latest on conversion to 64 bit...

Quote: 64-bit DLLs cannot be called from 32-bit applications.

This means we would be forced to port the entire piece of software to a 64-bit development platform to go forward in that manner.

http://msdn.microsoft.com/en-us/magazine/cc300794.aspx

While running a fully 64-bit Windows system sounds great, the reality is that you'll very likely need to run Win32 code for a while. Towards that end, x64 versions of Windows include the WOW64 subsystem that lets Win32 and Win64 processes run side-by-side on the same system. However, loading your 32-bit DLL into a 64-bit process, or vice versa, isn't supported. (It's a good thing, trust me.) And you can finally kiss goodbye to 16-bit legacy code!

In x64 versions of Windows, a process that starts from a 64-bit executable such as Explorer.exe can only load Win64 DLLs, while a process started from a 32-bit executable can only load Win32 DLLs. When a Win32 process makes a call to kernel mode (to read a file, for instance), the WOW64 code intercepts the call silently and invokes the correct x64 equivalent code in its place.

Of course, processes of different lineages (32-bit versus 64-bit) need to be able to communicate with each other. Luckily, all the usual interprocess communication mechanisms that you know and love in Win32 also work in Win64, including shared memory, named pipes, and named synchronization objects.

However, I'm currently experimenting with RAM disks in the 64-bit implementation to see if that might solve our problem with a minimum of consternation.  I doubt that accessing data on a RAM drive will be as fast as accessing it directly in memory, but stay tuned...
#306
That's not behavior I'm familiar with.  I use these options all the time with no problems, but that's not to say there isn't a problem with your version.  I'll check it out.  What is the version number and date you're working with?  -- J
#307
Using SLAMM / Re: Maximum number of cells
August 31, 2009, 11:41:23 AM
Several years back we moved SLAMM data management into the computer's memory as opposed to the hard drive.  The memory limitation then becomes the 32-bit Windows limit of 2GB per application.  Furthermore, SLAMM currently requires contiguous memory for processing large arrays.  We also performed some additional cell optimization in which we don't track as much information about open-water and high-elevation cells.  In practice, we have found that the number of cells we can model without causing a crash is roughly 50 million.

However, we're banging our heads against this memory limitation over and over again as input data become available at higher resolution.  For this reason, I'm currently writing a memory manager for 64-bit operating systems that far surpasses the 2GB limitation.  The problem is that the development platform in which this software is written (Delphi 2009) does not have a 64-bit implementation.  So we're either going to port the whole application to 64-bit or try to write a 64-bit memory-management DLL to solve this problem.

You may also run the model using the hard drive for memory management, in which case this limitation is effectively removed.  However, we have found this increases run time by about an order of magnitude.  One of my hard drives also crashed shortly after a 36-hour run, making me think this option puts considerable stress on the drive...  Perhaps a flash drive of 10GB or so would be an option, but we have not tried this yet.
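As a rough illustration of where the ~50 million figure could come from, here is a back-of-the-envelope sketch; the 40-byte per-cell record size is an assumption for illustration, not SLAMM's actual figure:

```python
# Back-of-the-envelope sketch of the ~50 million cell ceiling; the 40-byte
# per-cell record size is an assumption, not SLAMM's actual figure.
ADDRESS_SPACE = 2 * 1024**3   # 32-bit Windows per-application limit, in bytes
BYTES_PER_CELL = 40           # assumed size of one cell's in-memory record

max_cells = ADDRESS_SPACE // BYTES_PER_CELL
print(max_cells)              # 53687091, i.e. roughly 50 million cells
```

In practice the ceiling is lower still, since the contiguous-allocation requirement mentioned above means fragmentation of the 2GB address space can cause a crash well before it is full.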

-- Jonathan
#310
Using SLAMM / Re: salinity model and map attributes
August 26, 2009, 08:31:08 AM
The salinity module is currently not supported.  The reasons are:

1. It is undocumented: I can't conceive of anybody using it without extensive hand-holding from me, and I just don't have time to do that.
2. It only applies to very specific estuary geometry and vegetation succession.  In most areas I've worked with it hasn't been applicable.
3. It is in the process of being rewritten, refined, and documented.
4. The interface is raw and not bullet-proofed, as you found out already.

Sorry!  -- Jonathan

Here's some more background:

Salinity Module: 

Quote: The salinity sub-model was written as part of the STAR grant, as the original model was not working well when there was significant freshwater influence.  (This is required because marsh type is more highly correlated with salinity than with elevation when fresh-water flow is significant; Higinbotham et al., 2004.)
We never originally had funding or scope to write the salinity portion of the model under STAR but managed to squeeze it in under the original scope.  The resulting model is therefore simple.  There are several nuances to applying this model (e.g., the manner of defining estuaries and flow vectors) that have never been adequately documented and are not likely to be.  The salinity model is essentially unsupported by us at the current time (for users other than ourselves).
The salinity model does currently have funding to be significantly updated and refined under a TNC contract.  That work is expected to be completed within the next six months.  This "original" version of the salinity model is therefore not expected to be fully documented or supported at any time, as we are instead putting our efforts towards the refinement, documentation, and support of the new model.


#311
Using SLAMM / Re: impervious data and dryland class
August 24, 2009, 03:18:04 PM
The NWI input file actually needs category "1" or "2" to determine where the upland (dry-land)/wetland boundary exists.  If you know the developed and undeveloped boundaries, the model can be input with spatially variable "1"s and "2"s depending on what is developed or undeveloped.  Otherwise you can input all 1s or all 2s and use an _imp.txt file to produce the developed/undeveloped boundaries.

The file name may matter (at one point I made it accept either form, but I don't know whether that's part of the latest distributed executable).  Try filename_imp.txt first.  You should see a map with bright red for undeveloped land and dark maroon for developed land.

Good luck and let me know if you have further questions.  -- J
#312
Using SLAMM / Re: model input files
August 21, 2009, 12:39:27 PM
Input files must be ASCII rasters.  Input files cannot be polygons or shapefiles at this time.  Feel free to email me your input files, zipped up, if you'd like some feedback.
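For reference, ASCII raster inputs typically follow the ESRI ASCII grid layout: a six-line header followed by rows of cell values. Below is a minimal made-up example with a tiny parser sketch (the coordinates and values are invented for illustration):

```python
# A made-up minimal example of the ESRI-style ASCII raster layout:
# six header lines, then rows of values.
sample = """\
ncols 4
nrows 2
xllcorner 500000.0
yllcorner 4600000.0
cellsize 30.0
NODATA_value -9999
1 1 2 2
2 2 -9999 1
"""

# Tiny parser sketch: read the header, then the grid.
lines = sample.strip().splitlines()
header = {key.lower(): float(val) for key, val in (ln.split() for ln in lines[:6])}
grid = [[float(v) for v in ln.split()] for ln in lines[6:]]

print(int(header["ncols"]), int(header["nrows"]))  # 4 2
print(grid[1][2])                                  # -9999.0
```

Most GIS packages (ArcMap's "Raster to ASCII" tool, for instance) can export grids in this format.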

Good luck!  Jonathan  (jclough (at) warrenpinnacle.com)

#313
Using SLAMM / Re: NWI to SLAMM land cover classes
August 19, 2009, 10:53:46 AM
Yes, generally we use the 2001 NLCD percent-impervious layer unless a better dataset is available.

In fact, you can input a percent impervious integer raster into SLAMM and it will differentiate dry land and developed dry land on that basis (I think using 25% developed as the cutoff).
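That cutoff logic can be sketched as follows; both the 25% threshold (hedged above) and the return codes (1 = developed dry land, 2 = undeveloped dry land) are assumptions for illustration:

```python
# Sketch of the developed/undeveloped split; the 25% cutoff and the
# return codes (1 = developed, 2 = undeveloped dry land) are assumptions
# for illustration, not confirmed SLAMM internals.
def dryland_category(pct_impervious):
    """Classify a percent-impervious cell value into a dry-land code."""
    return 1 if pct_impervious >= 25 else 2

row = [0, 10, 30, 80]
print([dryland_category(v) for v in row])  # [2, 2, 1, 1]
```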

Thanks for your kind words about the Forum.  I think it will grow as the new version of SLAMM becomes available this fall and more communication is warranted.

-- Jonathan
#314
Using SLAMM / Re: NWI to SLAMM land cover classes
August 19, 2009, 07:16:49 AM
Thanks for this question.  The tech-doc table is now a bit out of date and is not the best way to map these categories; there is an Excel spreadsheet for that.  The latest spreadsheet can be found here:

http://warrenpinnacle.com/prof/SLAMM6/SLAMM6_nwi_codes_2009.xls

Bill Wilen, the head of the National Wetlands Inventory, has carefully vetted and approved this version of the spreadsheet.  If you come across any codes that are not in the spreadsheet, feel free to ask for guidance here.

On your second note, the modifier "K" was originally included under "tidal" because of an error in some of the (now obsolete) NWI documentation, which listed it under the tidal classification.  That has been fixed in this spreadsheet.

Finally, Excel's VLOOKUP is what we usually use; if you come up with a script in ArcMap, I hope you are willing to share it.
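The VLOOKUP step amounts to a simple table lookup, which is easy to script; the two entries below are illustrative placeholders, not values from the official spreadsheet:

```python
# The VLOOKUP step is just a table lookup; the two entries below are
# illustrative placeholders, not values from the official spreadsheet.
NWI_TO_SLAMM = {
    "E2EM1P": 8,  # hypothetical mapping for illustration
    "U": 1,       # hypothetical mapping for illustration
}

def slamm_class(nwi_code):
    """Return the SLAMM class for an NWI code, or None if it is unmapped."""
    return NWI_TO_SLAMM.get(nwi_code)

print(slamm_class("E2EM1P"))  # 8
print(slamm_class("XYZ"))     # None -> a code to ask about on the forum
```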

Thanks again -- Jonathan
#315
Model Formulation & Parameters / Accretion
April 24, 2009, 09:29:23 AM
Within SLAMM 5, the accretion model has been quite simple:  accretion rates for marshes may be spatially distributed at whatever resolution is deemed appropriate, but they do not change over time.

Currently, under contract to TNC, the accretion model is being modified to allow for a more sophisticated approach.  The new approach modifies a maximum theoretical accretion rate based on cell elevation, distance to channel, and salinity (if relevant).  In this manner, feedbacks to SLR can be included in the accretion formulation.  More information about this model, as it continues to be derived and tested, will be posted here.
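A sketch of the kind of feedback formulation described above; the linear scaling factors, parameter names, and numbers are assumptions for illustration, not the actual model under development:

```python
# Sketch of an accretion feedback of the kind described above; the linear
# scaling factors and parameter names are assumptions for illustration,
# not the actual model under development.
def accretion_rate(max_rate, elevation, max_elevation, dist_to_channel, max_dist):
    """Scale a maximum accretion rate down as a cell sits higher and
    farther from its channel (linear factors chosen for simplicity)."""
    elev_factor = max(0.0, 1.0 - elevation / max_elevation)
    dist_factor = max(0.0, 1.0 - dist_to_channel / max_dist)
    return max_rate * elev_factor * dist_factor

# A low cell near the channel accretes fastest:
print(accretion_rate(10.0, 0.2, 1.0, 5.0, 100.0))  # about 7.6 (mm/yr)
```

The feedback to SLR arises because rising water lowers a cell's relative elevation, which in this sketch raises its accretion rate at the next time step.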