YT Meteo

No ordinary weather app – MeteoEarth offers interactive global 3D weather forecasts, charting near real-time wind speed & direction, precipitation, temperature and pressure. It uses WebGL and is computationally intensive, so may not play nicely on older machines and browsers.

Shown here is current wind and precipitation. Click through for the full globe experience and customization options.

Rebooting YukonGIS.ca

www.yukongis.ca has been rather <cough>dead<cough> …stale… for some time, which means it's well past time for a reboot! So let's do that. 🙂

Articles deemed worth keeping are being copied piecemeal from the old site. Their date stamps reflect more or less when they were published there.

(meta: the banner picture I’m using during the rebuild is courtesy of eyebex and L.P; also see http://eyebex.smugmug.com/)

-matt

Projection Distortion

Each of the red blobs in the map below would, on a globe or in the real world, occupy the same amount of space (area) and be the same shape (a circle).

Tissot error ellipses on an unprojected lat-long world map

tissot.zip: a Tissot shapefile in geographic decimal degrees you can load into your GIS to demonstrate distortion in whatever projection you like.

A really useful enhancement would be to change generate_tissot.py to generate more, smaller Tissot circles for smaller localised projections like UTM. [hint hint]
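
Something along these lines would do it. A minimal sketch only, not Matt Perry's actual script (which isn't reproduced here); it assumes shapely and a recent fiona are installed, and the function name and defaults are mine:

from shapely.geometry import Point, mapping
import fiona

def make_tissot(outfile, spacing_deg=5.0, radius_deg=1.0,
                bbox=(-180, -90, 180, 90)):
    # Write circles (buffered points) on a regular lat/long grid.
    # Shrink spacing_deg and radius_deg for UTM-scale maps.
    schema = {"geometry": "Polygon", "properties": {"id": "int"}}
    minx, miny, maxx, maxy = bbox
    with fiona.open(outfile, "w", driver="ESRI Shapefile",
                    crs="EPSG:4326", schema=schema) as dst:
        fid = 0
        lon = minx + spacing_deg / 2
        while lon < maxx:
            lat = miny + spacing_deg / 2
            while lat < maxy:
                circle = Point(lon, lat).buffer(radius_deg)
                dst.write({"geometry": mapping(circle),
                           "properties": {"id": fid}})
                fid += 1
                lat += spacing_deg
            lon += spacing_deg

# e.g. a denser set for Yukon-scale UTM maps (extents are mine):
# make_tissot("tissot_small.shp", spacing_deg=1.0, radius_deg=0.1,
#             bbox=(-141, 60, -128, 70))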

The script and graphic are courtesy of Matt Perry in Examining the distortion of map projections, which has more examples and explanation.

 

National Hydrographic Network

NHN with waterbodies and stream network directions showing

The authoritative NHN homepage is on geobase.ca.

As of spring 2010 it is distributed in shapefile, KML, GML, and file geodatabase formats. The direct download address is ftp://ftp2.cits.rncan.gc.ca/pub/geob…icial/nhn_rhn/, which is throttled to a maximum of 2 connections per IP address.

HTTP to the same address is not throttled; however, date stamps are not retained that way (and it's a little harder to bulk download). Attached is a Filezilla download queue for zones 08, 09 & 10 (BC, Yukon, western NWT) in file geodatabase format, which adds up to about 12 GB.

NHN_file-gdb_download_queue.xml

NHN mxd

They supply an ArcMap .mxd to use with the GDBs, though the links are broken. It is faster to use ArcCatalog "select > r-click > set data sources" instead of opening the .mxd and repairing from there as the docs suggest. Unfortunately this doesn't fix the Event layers, which must be repaired one by one. Also unfortunate is that the networks in the GDBs prevent merging the geodatabases together. It looks like I might be better off sticking to downloading and merging the GML files with ogr2ogr, rebuilding the network later in ArcGIS. Time will tell; I'm not eager to download yet another 12 GB until I've worked the process out. 🙂
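
For reference, the ogr2ogr merge idea boils down to creating the target from the first GML file and appending the rest. A rough sketch, assuming ogr2ogr is on the PATH; the glob pattern and target name are placeholders, not from the NHN docs:

import glob
import subprocess

gml_files = sorted(glob.glob("nhn_*/*.gml"))  # hypothetical file layout
target = "nhn_merged.shp"

for i, gml in enumerate(gml_files):
    cmd = ["ogr2ogr"]
    if i > 0:
        cmd += ["-update", "-append"]  # append into the existing target
    subprocess.check_call(cmd + [target, gml])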

NHN broken event tables

Hmmm. There's more work yet to fix the composition. Flipping to the [source] tab reveals that the Event layers are linked to English-name tables (NHN prefix), but the tables saved in the .mxd are en français (RHN prefix). I really, really wish ESRI used XML for their project storage format instead of whatever binary system they've cooked up. At least then we'd be able to fix things like this using tried and true regex with a decent text editor. Arghh.

Let's try Arcmap MXD Redirect Data Sources.

Win7 install: run ArcMap as administrator when registering the .dll, then exit and run normally.

Hrmm. Interactive repair did nothing. Search and replace does better than ArcCatalog's equivalent, in so far as it actually replaced all references; however, neither tool fixed the broken event layers. I also tried renaming the English tables to match the expected French format, no change. It's curious that the RHN tables are shown in the [source] tab, yet the event layer properties reference the English tables.

Georeferenced Survey Plans

The federal government has plans for all surveyed parcels online as images, but they aren’t georeferenced. Here is a short recipe and dataset for fixing that.

Process:

  1. Download the plan as a regular tiff from the Canada Centre for Cadastral Management (CCM) – http://www.lsd.nrcan.gc.ca/english/srisdocs_e.asp?RG=YT&PLN=93521, where RG=YT is Region = Yukon and PLN=93521 is Plan# 93521. For bulk download see scripts (a sketch follows this list):
        fetch-CLSplans 84574 85779 86458 etc...
  2. Load into ArcMap along with the Surveyed_Parcels layer from the Geomatics Yukon Corporate Spatial Warehouse (CSW), using the Yukon Albers projection. Note: this same cadastre data is also available from ftp://ftpyukoncccm.nrcan.gc.ca/.
  3. Using the Georeferencing toolbar, georeference the image (with Update Georeferencing rather than Rectify).
  4. Move the result from the noref to the georef folder.
  5. Add to the unmanaged raster catalog Survey_Plans in the file geodatabase.
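
The fetch-CLSplans script isn't attached here, but a bulk fetcher along those lines might look like this sketch. It leans on the CCM URL pattern from step 1; whether that URL serves the tiff directly or an HTML page linking to it is an assumption, so treat it as an outline only:

import os
import sys
import urllib.request

BASE = "http://www.lsd.nrcan.gc.ca/english/srisdocs_e.asp?RG=YT&PLN={plan}"

def fetch_plan(plan):
    # Save whatever the CCM returns for this plan number into noref/
    # (output naming is mine, matching the folders in steps 4-5).
    os.makedirs("noref", exist_ok=True)
    with urllib.request.urlopen(BASE.format(plan=plan)) as resp:
        data = resp.read()
    with open(os.path.join("noref", "plan_%s.tiff" % plan), "wb") as f:
        f.write(data)

if __name__ == "__main__":
    for plan in sys.argv[1:]:  # fetch-CLSplans 84574 85779 86458 ...
        fetch_plan(plan)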

Feel free to extend the set of georeferenced images and share the results.

Temporary home for this project is http://sydney.freeearthfoundation.com/mattwilkie/Survey_Plans/

CanVec

There is some rough code for downloading and merging data for the Yukon at http://code.google.com/p/maphew/; eventually the resultant processed and merged data will be placed online. No idea when I'll get around to that though. If you get to it first, please let me know, I have a place to host it. – MattWilkie – 2008-01-24


>Someone at AAFC in Ottawa said that they use the Canvec base for all
>their maps. Is this the same as the NTDB base in the Yukon? Do you use it?

In the Yukon CanVec is derived from the 50k NTDB, and thus similar. The main difference at this time lies in the naming of the layers and their attribute structure. The geometry is nearly identical, provided you are using the free NTDB from http://ftp2.cits.rncan.gc.ca/pub/bndt/ and not the 50k NTDB from Yukon Government (including Dept of Environment). The 50k NTDB we are using, edition 2.x (might be ed3.0), is behind the current release, edition 3.1.

The main change from ed2 to ed3.1 is horizontal correction: ed3.1 will align more closely to GPS and Landsat7 data. The attributes are unchanged.

So the bottom line is: if you are working with GPS and remote sensing data, use CanVec. If you are working with Yukon Government, use our 50k NTDB. If you have to do both, flip a coin and explain to the project manager what is lost by choosing one or the other.

I apologise for a complicated answer to a straightforward question, and for the state of ambiguity it places everyone who works with us in.

– Matt Wilkie


> the feature catalogue does not have code "LX_2030019" listed in the appendix, yet 105p15
> has this layer in both the GML and SHP distribution archives. Is the specification in error or is the datafile misnamed?

The CanVec codification will be modified in the next release, in January 2008.

Camp: Modify LX_2030019 to LX_2030009
Parabolic Antenna: Modify BS_2000059 to BS_2000009

Yukon Place Names

The Canadian geographic placenames board now publishes all their data for free via a variety of formats and services on the Canadian Geographical Names Service (CGNS) (yay!). I decided to try to build a script which could be run once a year, or on an as-needed basis, to update a Yukon Gazetteer. The automation part was a failure, but the data part is okay. What follows are my notes to myself, so I don't know how much you'll be able to get out of it. – Matt Wilkie

There are 3,937 placenames in the database. Some are withdrawn or rescinded though. You'll need to consult the user guide and the specifications for what that means (GNSS_Users_Guide.pdf, http://cgns-dev.nrcan.gc.ca/cgns_web/standards_spec.html).

Yukon_Placenames.shp is the result of my efforts. It's pretty much ready to use. Some work remains to be done to substitute the special characters for labeling. (See update from 15-nov-2007 at end of page.)

Original_yk-names.txt is the original data as downloaded and before cleaning.

yk-names_cleaned.csv is the cleaned and now true CSV file.

yk-geonames_cvs.shp is the cleaned file converted into a point shapefile.

yk-geonames_gml.shp is the output from the Web Feature Server. The CSV and the GML files have the same records but different, and useful, attributes, so ideally they should be merged together. That's a whole 'nother project though.

Core_fields.txt has all the nitty gritty details on the attribute schemas and values.

The download archive is yk_placenames_distrib.zip, about 2 MB.

If you don’t care what trials and tribulations created this dataset stop reading now. 🙂

Update

15 November 2007

We can use the Gentium, Charis & Doulos fonts for accurate rendering of the native placenames, especially with this helpful character picker as a selection tool: http://people.w3.org/rishida/scripts/pickers/latin/. Soooo much easier than any other method I've seen for finding the characters one needs! Characters are shown in order of visual similarity. No more constant jumping back and forth from one section to another trying to find that special X! (Use the special ones at the bottom too, and copy/paste the results!)

Next task: a script to convert geonames {32} codes to the appropriate stacked diacriticals: ǭ̈
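
A placeholder sketch for that task. The {NN} escape scheme isn't spelled out on this page, so the mapping table below is hypothetical; the replace loop is the only real content:

# Hypothetical: the real code -> character table would come from the CGNS docs.
CODES = {
    "{32}": "\u01ed\u0308",  # assumed: o-with-ogonek-and-macron plus diaeresis
}

def decode_placename(text):
    # Replace {NN} escape codes with their stacked-diacritical characters.
    for code, char in CODES.items():
        text = text.replace(code, char)
    return text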

 


Ugly Details

To download the entire Yukon in CSV format, use this URL: http://gnss.nrcan.gc.ca/gnss-srt/api?bbox=-142.0,59.0:-123.0,72&regionCode=60&output=csv (be nice to their server; we don't need to be getting it more than once or twice a year. Also be patient: it takes about three minutes for the entire file to be sent). Saved as original_yk-names.csv.

Huh. The data is there but not in CSV format: there are pipe symbols (|) as field delimiters and HTML line breaks (<br>) as record delimiters. A fairly simple job for regular expression search and replace if you have a decent text editor. Fixed version: yukon-placenames.csv. I submitted a bug report in July and one of the developers responded. I gave some more detail and haven't heard back. When I checked again this morning CSV output was still broken. Oh, there are data problems too. Things like 105O typed as 105 zero.
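
Here's the same cleanup in Python rather than a text editor, for repeatability: pipes become field separators, <br> becomes record separators. File names match the ones above; the latin-1 input encoding is a guess:

import csv

with open("original_yk-names.csv", encoding="latin-1") as f:  # encoding assumed
    raw = f.read()

# <br> separates records, | separates fields
records = [rec.split("|") for rec in raw.split("<br>") if rec.strip()]

with open("yukon-placenames.csv", "w", newline="") as f:
    csv.writer(f).writerows(records)  # csv module handles any needed quoting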

Went looking for a script to easily convert lat/long to UTM. Haven't found an ArcGIS one yet, but this python library is very easy to use: http://pygps.org/. Now I need to figure out how to tell it to pick the UTM zone by itself. There's this one too: http://starship.python.net/crew/jhauser/Gproj.html

What about GDAL/OGR? Asked the fwtools mailing list. Answer from Frank Warmerdam:

The OGR Projections Tutorial might be helpful for you, though it mostly
addresses stuff from the C++ point of view. http://www.gdal.org/ogr/osr_tutorial.html

The Python script http://www.gdal.org/srctree/pymod/samples/tolatlong.py
should show a bit of how to use projections stuff in Python. In your
case you want to go from lat/long to utm. There is nothing pre-baked
in OGR to identify the optimal UTM zone for a given point, but it is
relatively easy to find the nearest central meridian since they are all
in six degree increments.

Sorry I don't have something a bit more specific!
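
Following Frank's hint, picking the zone from the longitude is a one-liner, and OGR's Python bindings do the projecting. A sketch, assuming GDAL 3+ and NAD83 (EPSG:4269) input:

from osgeo import osr

def latlong_to_utm(lon, lat):
    # Central meridians fall on six-degree increments, so:
    zone = int((lon + 180) // 6) + 1
    geog = osr.SpatialReference()
    geog.ImportFromEPSG(4269)
    # GDAL 3+: keep x=longitude, y=latitude ordering
    geog.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)
    utm = osr.SpatialReference()
    utm.SetUTM(zone, int(lat >= 0))  # 1 = northern hemisphere
    ct = osr.CoordinateTransformation(geog, utm)
    x, y, _ = ct.TransformPoint(lon, lat)
    return zone, x, y

print(latlong_to_utm(-135.05, 60.72))  # Whitehorse-ish, lands in zone 8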

Bah humbug. If ArcCatalog starts crashing every time you start it, before it even finishes drawing the GUI, try deleting/renaming %appdata%/ESRI/ArcCatalog/ArcCatalog.gx. Ahhh, there's a better fix: just rename/move the last opened directory, or add a new data file to it. (Bug logged, incident #75755.)

Code to calc UTMX/Y for a point shapefile loaded in ArcMap.

Procedure: set the data frame coordinate system to the desired UTM zone > select only those points in the zone (requires point-on-poly overlay with a utm_zones poly) > Open Attributes > select the UTM_X column (which is Longitude) > Calc Values > Advanced > paste the code block from below > set Output to equal X or Y depending on which column you are doing. Lather, rinse, repeat until done. (Courtesy of http://forums.esri.com/Thread.asp?c=93&f=982&t=54791#135972)

 

Dim pMxDoc As IMxDocument
Set pMxDoc = ThisDocument
Dim pMap As IMap
Set pMap = pMxDoc.FocusMap
Dim pGeometry As IGeometry
Set pGeometry = [Shape]
' reproject the feature geometry into the data frame's coordinate system
pGeometry.Project pMap.SpatialReference
Dim pPoint As IPoint
Set pPoint = pGeometry
X = pPoint.X
Y = pPoint.Y

Code to grab Yukon names from the CGNS web feature server:

http://www.cubewerx.com/cwpost/cwpost.cgi?serverUrl=http://cgns-dev.nrcan.gc.ca/cgi-bin/cubeserv.cgi?service=wfs%26datastore=cgns&postBody=Paste%20your%20transaction%20here

What the heck am I doing trying to convert broken CSV to shape, when they have a server which can spit the same thing out already baked into a spatial format? This will chop Excel/OpenOffice Calc out of the loop, and then we won't have to fix the broken NTS names (those fine programs like to change 105e15 into 1.05e+15).

<?xml version="1.0" encoding="ISO-8859-1" ?>
<GetFeature srsName="EPSG:4269">
<Query typeName="GEONAMES">
<Filter>
<PropertyIsEqualTo>
<PropertyName>REGION_CODE</PropertyName>
<Literal>60</Literal>
</PropertyIsEqualTo>
</Filter>
</Query>
</GetFeature>

The user guide says one can stick output="SHAPE" in the GetFeature line, but I get an error with that:

<?xml version="1.0" encoding="ISO-8859-1"?>
<ServiceExceptionReport version="1.1.3" xmlns=" http://www.opengis.net/ows "
xmlns:xsi=" http://www.w3.org/2001/XMLSchema-instance "
xsi:schemaLocation=" http://www.opengis.net/ows http://schemas.cubewerx.com/schemas/wms/1.1.3/ServiceExceptionReport.xsd ">
<ServiceException>
CubeSERV-00002: Syntax error detected in XML stream "(stdin)" on line 2 char pos
               46 (raised in function CwXmlScanText_ReadString() of file
               "cw_xmlscan.c" line 2243)
</ServiceException>
<ServiceException>
CubeSERV-00002: Hit unexpected character #x94 while scanning XML token (raised
               in function CwXmlScanText_ReadString() of file "cw_xmlscan.c"
               line 2200)
</ServiceException>
</ServiceExceptionReport>

Oh, that's why not to use WFS: it's broken (or I'm not using it properly). Okay, spending too much time on that. Go back to kludge-ville and regex search & replace the NTS names: open yk_names_25jul2006.dbf in Excel, copy the NTS column to Vim (we really should be doing this in python to make it easily repeatable), then:

# match 115A08 and delete the last trailing two digits
:%s/\(\d\d\d\a\)\d\d/\1/g
# strip MCR130
:%s/mcr130,//g
# fix false exponents (delete periods and trailing +0##)
:%s/\.//g
:%s/+\d\d\d//g
# change incorrect 105zerozerozero... to 105o
:%s/00\+/O/g

Next problem is to merge dupes (105a,105a,105a,105b -> 105a,105b). Hmmm. I think I've gone beyond what's easy in vim, and now there's no choice but to learn the python way.
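
The dupe merging itself is an easy first step into the python way. A sketch, using the example above:

def merge_dupes(nts_field):
    # "105a,105a,105a,105b" -> "105a,105b", preserving order
    seen, out = set(), []
    for code in nts_field.split(","):
        code = code.strip()
        if code and code not in seen:
            seen.add(code)
            out.append(code)
    return ",".join(out)

assert merge_dupes("105a,105a,105a,105b") == "105a,105b"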

Going back and looking at some of the intermediate WFS request outputs, I see that there is inconsistency in the attributes. This needs a more studied look, but the one of immediate relevance is the Relevance At Scale (r_value) field, which in the API CSV file is filled with many blanks while the GML output for that field is fully populated. That's enough to tell me it is foolish to rely on the CSV as an authoritative source, so I'm backing up and going to start from the GML.

Try #2 at downloading from the Geonames WFS server

1. Download with wget, using the example command line from section 4.3 of the GNSS User Guide. It failed before because of line-length limitations in CMD. The workaround is to save the request URL into a text file and run with:

wget -O output_file.gml -i http_command.txt

2. We use ogr2ogr to convert GML to shape, but shape has an attribute-name length limit. To get the proper attribute names in Arc we need to dance around a little: use ogrinfo output_file.gml to generate the attribute schema (output_file.gfs), edit the .gfs and strip the leading "GEONAMES." from each <Name>, then convert to shape using ogr2ogr, which will generate a shapefile with the correct headings. Open that .dbf in Excel and copy the first row of column headings. Undo the edits to the .gfs (or delete it altogether) and convert again to shape. Open the second .dbf in Excel, paste the proper field headings, save and exit.

ogrinfo yk_names.gml
vim yk_names.gfs # :%s/GEONAMES\.//g; save
ogr2ogr -a_srs EPSG:4269 yk_names yk_names.gml
# excel yk_names/geonames.dbf; copy 1st row; close dbf
del yk_names.gfs
ogr2ogr -a_srs EPSG:4269 yk_names/ yk_names.gml
# excel yk_names/geonames.dbf; paste 1st row; close dbf
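
The .gfs edit is the only hand step left in that dance, and it scripts easily too (per the earlier note that this should be repeatable in python):

from pathlib import Path

# same effect as the vim command :%s/GEONAMES\.//g
gfs = Path("yk_names.gfs")
gfs.write_text(gfs.read_text().replace("GEONAMES.", ""))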

and that’s all for now folks!

Matt.Wilkie@gov.yk.ca

Geographic Information,
Information Management and Technology,
Yukon Department of Environment
10 Burns Road * Whitehorse, Yukon * Y1A 4Y9
867-667-8133 Tel * 867-393-7003 Fax
http://environmentyukon.gov.yk.ca/geomatics/

Backdoor to US Seamless National Elevation Data

I recently needed to acquire a large swath of elevation data for Alaska. After fighting through the web map request interface for Seamless NED for several hours, I finally discovered that one can go straight to the extractor service and request arbitrary areas with a properly formed URL.

To bypass the map viewer put the coords on the URL like so:

http://extract.cr.usgs.gov/Website/distreq/RequestSummary.jsp?AL=71.0,56.0,-140.0,-150.0&PL=NAK01HZ

where this is your region of interest, north, south and east, west in decimal degrees.

AL=71.0,56.0,-140.0,-150.0

and this is the data set to choose from, in this case "National Elevation Dataset Alaska (NED) 2 Arc Second"

PL=NAK01HZ

This example will return with "You have requested an estimated 1,950 MB of data. The maximum currently allowed is 1,500 MB." So modify the extents until within limits. This will generate an SDDS Request Summary page with about 10 different [download] buttons. We don't care about that; we just want to know when our area of interest is small enough to work. Keep this page open for reference, we may need some info from it later. Use the Firefox Web Developer extension to convert the POST form to GET, slap one of the [download] buttons, and in the resultant window we have a URL which can be easily modified (must be all on one line to work):

http://extract.cr.usgs.gov/diststatus/servlet/gov.usgs.edc.RequestStatus
?siz=87
&key=NAK
&ras=1
&rsp=1
&pfm=ArcGRID
&imsurl=-1
&ms=-1
&att=-1
&lay=-1
&fid=-1
&dlpre=
&wmd=1
&mur=http%3A%2F%2Fextract.cr.usgs.gov%2Fdistmeta%2Fservlet%2Fgov.usgs.edc.MetaBuilder
&mcd=NED
&mdf=HTML
&arc=ZIP
&sde=ned.ak_ned
&msd=NED.CONUS_NED_METADATA
&zun=METERS
&prj=0
&csx=5.55555555556E-4
&csy=5.55555555556E-4
&bnd=
&bndnm=
&RC=d9b131a2a5e3589013aaddb1d5f567585db594
&lft=-148.33333333333334
&rgt=-146.66666666666669
&top=63.0
&bot=59.0

For my purposes, it's these last four which are useful. Change them like so

&lft=-143&rgt=-140&top=71.0&bot=59.0

and then each in turn

&lft=-147&rgt=-143&top=71.0&bot=59.0
&lft=-150&rgt=-147&top=71.0&bot=59.0

and I have my whole area of interest in three simple requests, plus there is no need to go through the bother of mosaicking a bunch of little tiles together. It does take the server about 15 or 20 minutes to process each request though. I didn't attempt to find the maximum area, as my main goal is to get the data, not find the limits of their server and crash it.
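
Generating those strip requests is easy to script: slice the longitude range into chunks and rewrite the four extent parameters. In this sketch base_url stands in for the long RequestStatus URL above, including the session-specific bits like RC (copy those from your own browser session); the four-degree step is arbitrary:

def strip_urls(base_url, west, east, top, bot, step=4):
    # One request URL per longitude strip of at most `step` degrees.
    urls = []
    lft = west
    while lft < east:
        rgt = min(lft + step, east)
        urls.append("%s&lft=%s&rgt=%s&top=%s&bot=%s"
                    % (base_url, lft, rgt, top, bot))
        lft = rgt
    return urls

for url in strip_urls("http://extract.cr.usgs.gov/diststatus/servlet/gov.usgs.edc.RequestStatus?siz=87",
                      -150, -140, 71.0, 59.0):
    print(url)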

If you use this technique, please be gentle. It’s not in our interests to force them to take protective measures and close this avenue.

Further exploration: I suspect many of the parameters can be omitted; siz, for example, has no apparent effect (my downloads were 350 MB each), and I bet the -1 values mean "use the default". I'm curious about RC, which looks like a unique session id or cookie or something. I had no problem using the same number every time, but I did do it all in one browser session.

Many thanks to Joe H. and John Thull on the QGIS-user mailing list for bumping me in the right direction ([offtopic] acquiring Alaska data), Chris Pederick for the web developer extension, and of course the USGS for having data interesting enough to go through all this work in the first place!

UPDATE:

With regard to ordering the regional CDs:

“You can order the entire U.S. in 30 meter resolution in either
ArcGrid or GridFloat format. The data will be provided on a 250 GB
external drive at a total price of $1005.00. This includes shipping
and handling. This will cover the Conterminous U.S., Alaska (at 60
meter res), Hawaii, and the territorial islands. We no longer provide
this on CD or DVD media."

An incredibly good price in my opinion.

The contact address is webmapping@usgs.gov

Customer Service/Webmapping

Data and Applications Support Department
Science Applications International Corporation (SAIC) at
U.S. Geological Survey – EROS
(Earth Resources Observation and Science)
47914 252nd Street
Sioux Falls, SD 57198-0001

Phone: 1-800-252-4547
Fax: (605)-594-6589

Comments
  1. Dave Smith wrote:

    Thanks for the backdoor tip… I would echo your caveat to not abuse this.

    I too have noticed that seamless.usgs.gov seems to have some serious JavaScript disagreements with IE, which results in tremendously sluggish performance (or worse yet, it just dies altogether).

    When I tried in FireFox, the site worked like a champ.

    Posted 31 Mar 2007 at 10:18 am

     

  2. matt wilkie wrote:

    hi Dave, thanks for confirming my IE problems with their site are not local only 🙂

    Posted 03 Apr 2007 at 8:48 am
