This function determines which instances of a given model are available for download.
CrawlModels(abbrev = NULL, model.url = NULL, depth = NULL, verbose = TRUE)
A list of web page addresses, each of which corresponds to a model instance.
The model abbreviation; see NOMADSRealTimeList. Defaults to NULL.
A URL to use instead of one of the abbreviations in NOMADSRealTimeList. Defaults to NULL.
How many model instances to return. This avoids having to download the entire model list (sometimes several hundred entries) if only the first few instances are required. Defaults to NULL, which returns everything.
Print out each link as it is discovered. Defaults to TRUE.
Daniel C. Bowman danny.c.bowman@gmail.com
This function calls WebCrawler, a recursive algorithm that discovers each link available from the URL provided. It then searches each link in turn, and follows those links until it reaches a dead end. At that point, it returns the URL. For the model pages on the NOMADS web site, each dead end is a model instance that can be examined using ParseModelPage or have data retrieved from it using GribGrab.
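The workflow described above can be sketched as follows. This is a minimal, hedged example: it assumes the rNOMADS package is installed and an internet connection is available, and it uses the ParseModelPage output structure as documented elsewhere in the package.

```r
# A minimal sketch, assuming the rNOMADS package is installed
# and the NOMADS server is reachable.
library(rNOMADS)

# Crawl only the two most recent instances of the GFS 0.5 degree model;
# the depth argument stops the crawl early rather than fetching every instance
urls.out <- CrawlModels(abbrev = "gfs_0p50", depth = 2, verbose = FALSE)

# Each returned URL is a "dead end" corresponding to one model instance;
# inspect the most recent one to see what it offers
model.parameters <- ParseModelPage(urls.out[1])
head(model.parameters$pred)
```

From here, the prediction, level, and variable names in model.parameters can be passed to GribGrab to retrieve the actual data.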
WebCrawler, ParseModelPage, NOMADSRealTimeList, GribGrab
#Get the latest 5 instances
#for the Global Forecast System 0.5 degree model
urls.out <- CrawlModels(abbrev = "gfs_0p50", depth = 5)