Question

Read URL Automatically


Badge +5

Is there any way for the KMZ and Shapefile readers to get the link automatically when the file or link changes on the website?

 

For example:

https://www.nhc.noaa.gov/gis/forecast/archive/al012022_5day_latest.zip

That's today's link; tomorrow the file name may change to something like lk015856.zip or anything else.

Is there any parameter I need to use?


8 replies

Userlevel 3
Badge +16

Yes, something like this:

HTTPCaller to read the NHC website https://www.nhc.noaa.gov/gis/

StringSearcher on the html response, using regex to find the file link you are interested in. I used:

forecast/archive/[a-z]{2}[0-9]{6}_5day_latest.zip - this assumes that the part of the filename that changes is 2 letters followed by 6 numbers.

FilenamePartExtractor because it's handy.

HTTPCaller to download the zip file, and a FeatureReader to read every shapefile in that zip file.
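
For reference, here is a minimal Python sketch of the same chain outside FME; the URL and regex are just the ones above, and the requests library is assumed to be available:

```python
# Minimal sketch of the HTTPCaller -> StringSearcher -> HTTPCaller chain, outside FME.
import re
import requests

BASE_URL = "https://www.nhc.noaa.gov/gis/"

# Step 1 (HTTPCaller): read the NHC GIS page as HTML text.
html = requests.get(BASE_URL, timeout=30).text

# Step 2 (StringSearcher): find the latest 5-day forecast zip link with the regex above.
match = re.search(r"forecast/archive/[a-z]{2}[0-9]{6}_5day_latest\.zip", html)
if match is None:
    raise SystemExit("No active storm link found on the page.")

zip_path = match.group(0)               # e.g. forecast/archive/al012022_5day_latest.zip
zip_name = zip_path.rsplit("/", 1)[-1]  # FilenamePartExtractor equivalent

# Step 3 (HTTPCaller): download the zip so a reader can open the shapefiles inside.
response = requests.get(BASE_URL + zip_path, timeout=60)
with open(zip_name, "wb") as f:
    f.write(response.content)
print("Downloaded", zip_name)
```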

 

This doesn't tell you when the file changes or when the workspace needs to be run; maybe the website can notify you, otherwise you may have to schedule FME to run daily.

You can add a Tester after the FilenamePartExtractor to check whether the filename is different from the previous time the workspace ran, and stop processing if it is the same.
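
A rough sketch of that check in Python, assuming the previous filename is kept in a small text file next to the workspace (the state file name is made up for illustration):

```python
# Sketch of the "has the file changed since last run?" check.
from pathlib import Path

STATE_FILE = Path("last_filename.txt")   # hypothetical state file, not mentioned in the thread

def is_new_file(zip_name: str) -> bool:
    """Return True if zip_name differs from the name stored by the previous run."""
    previous = STATE_FILE.read_text().strip() if STATE_FILE.exists() else ""
    if zip_name == previous:
        return False                      # same file as last run: nothing new to process
    STATE_FILE.write_text(zip_name)       # remember it for the next scheduled run
    return True

# Example: skip processing when the matched filename hasn't changed.
if not is_new_file("al012022_5day_latest.zip"):
    raise SystemExit("File unchanged since last run, stopping.")
```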

 

 

Badge +5


Actually, this is the website:

https://www.nhc.noaa.gov/gis/

There are many products, and I need each product.

When a storm approaches the USA they upload the data, so what I need is: when any new data appears on this page, FME should run and get the data.

When the storm is over, the website no longer has the link.

I have created a workspace for each product.

Badge +5

In the second HTTPCaller, can't we add the server details directly?

Badge +5

I am selecting the feature type manually. Is there any way for FME to directly extract the files from the zip and then run, instead of me downloading the file first?

 

[screenshot: manual feature type selection]

As you can see in the image, I am selecting it manually.

Badge +5


How can I add the Tester after the FilenamePartExtractor? I'm new to this.

Userlevel 3
Badge +16

I've attached the workspace I put together; hopefully it helps.

You have to download the file in order to read it. If the shapefile weren't inside a zip, you could point the FeatureReader at the URL of the file directly, but in this case you have to download it first and then read it.

Because the shapefile name changes every time, you have to set the FeatureReader to read everything in the zip file.
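
As a rough illustration, this is what "read everything in the zip" amounts to outside FME, sketched with Python's zipfile module and a made-up local folder (inside FME, the FeatureReader pointed at the zip handles this for you):

```python
# Sketch: unpack the downloaded zip and pick up every shapefile, whatever it is named.
import zipfile
from pathlib import Path

zip_name = "al012022_5day_latest.zip"    # whatever the download step saved locally
extract_dir = Path("nhc_latest")         # hypothetical extraction folder
extract_dir.mkdir(exist_ok=True)

with zipfile.ZipFile(zip_name) as zf:
    zf.extractall(extract_dir)           # unpack everything, since the member names change each storm

# List every shapefile found this time around.
for shp in sorted(extract_dir.glob("*.shp")):
    print("Found shapefile:", shp.name)
```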

 

I don't know if there is a trigger to let you know when new data is uploaded. If there isn't one, then you need to design the workspace to check if there is new data, and do nothing if the data isn't there or hasn't changed since last time.

Badge +5

Thanks for your help

 

Badge +5

I think we don't need the if condition, because it's real-time storm data; every time a storm hits, the data is updated.

Can we do it so that if there is new data we push the data, and if it's only old data we stop? And secondly, if the source link is dead, can all the files be deleted from the PC? Is that possible?
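
Purely as a sketch of one way the dead-link cleanup could be scripted (the local folder and URL here are assumptions for illustration, not from the thread):

```python
# Sketch: if the source link is gone (storm over), remove the previously downloaded data.
import shutil
from pathlib import Path
import requests

DATA_DIR = Path("nhc_latest")            # hypothetical local folder holding the extracted data
CHECK_URL = "https://www.nhc.noaa.gov/gis/forecast/archive/al012022_5day_latest.zip"

resp = requests.head(CHECK_URL, timeout=30)
if resp.status_code == 404 and DATA_DIR.exists():
    shutil.rmtree(DATA_DIR)              # link is dead: delete the old files from the PC
    print("Source link is dead; local files deleted.")
```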
