
I'm currently trying to learn the basics of FME and have been busy making calls to open data in Sweden, primarily published as JSON files. I have managed to read them, explode a list, and write back to an Excel file. I am now trying a larger dataset of about 1 million rows, and I cannot figure out how to extract more than the 500-record limit. I have tried searching the forum and Google without understanding how to proceed. I attach the links to the dataset below.

 

I would really appreciate it if someone could show a setup that loops and writes all the data to a CSV, for example.

 

Dataset:

https://www.vgregion.se/ov/dataportal-vast/#esc_org=https%3A%2F%2Fcatalog.goteborg.se%2Fstore%2F6%2Fresource%2F1

I am trying to use this download URL: https://catalog.goteborg.se/rowstore/dataset/ce5b8afd-6721-49c7-9ebc-f1ad33fba266

If you know in advance how many requests to make, you can avoid looping completely and just make multiple HTTPCaller calls.

 

Sometimes looping is unavoidable; there is some more information in the answers to this question:

 

https://community.safe.com/s/question/0D54Q0000930eGGSAY/is-there-a-tutorial-on-how-to-page-through-the-api-call


As I understand from the API docs, you can use this link https://catalog.goteborg.se/rowstore/dataset/ce5b8afd-6721-49c7-9ebc-f1ad33fba266/info to get the rowcount attribute (1024135). Divide that by 500 and you get the number of calls you have to make.

When making the HTTP calls, make sure to set the _offset parameter; see the Swagger UI (entryscape.com).
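As a quick sanity check, the arithmetic above can be sketched in Python. The rowcount is the value the /info endpoint reported in this thread; the function name is just for illustration:

```python
import math

ROW_LIMIT = 500  # maximum rows the API returns per request

def calls_needed(rowcount: int, limit: int = ROW_LIMIT) -> int:
    """Number of requests needed to page through all rows."""
    return math.ceil(rowcount / limit)

# rowcount reported by the /info endpoint in this thread
print(calls_needed(1024135))  # 2049 calls (2048 full pages + 1 partial)
```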



Thanks for the response! How do I go about making multiple HTTP calls? Just multiple HTTPCaller transformers in FME Workbench, or?

 

 



 

 

You use the Cloner, as @ebygomm linked to.

After getting your total row count, and knowing that the limit is 500, you calculate a new attribute that increments by 500 until you reach the total row count. Each clone gets this attribute incremented by 500.

Now you should have enough clones to cover all the rows for the HTTPCaller.

You use the calculated attribute as the _offset parameter in the HTTPCaller, as the API specifies.
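A minimal Python sketch of what the Cloner plus the calculated attribute produce, assuming the 500-row limit from the thread (the function name is illustrative, not an FME API):

```python
def make_offsets(rowcount: int, limit: int = 500) -> list[int]:
    """One offset per clone: 0, 500, 1000, ... covering the last partial page."""
    return list(range(0, rowcount, limit))

offsets = make_offsets(1024135)
print(len(offsets))                 # 2049 clones, one request each
print(offsets[:3], offsets[-1])     # [0, 500, 1000] 1024000
```

Each value in that list plays the role of the _offset attribute on one cloned feature.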



 

 

I get stuck anyway, I'm afraid. Could you do a simple sketch, or maybe a workbench model, of how the different parts should be linked together? It would really mean a lot.



 

 

Here you go!

It is good to know here that the number of features entering an HTTPCaller equals the number of queries the HTTPCaller fires. That's why you use the Cloner: to create the correct number of input features for the caller. If you give these cloned features the correct attribute values, you can use them to query the whole dataset in chunks of 500 rows.
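Outside FME, the same paging pattern can be sketched in plain Python. This is an assumption-laden sketch, not the FME workspace: it assumes the rowstore API accepts an `_offset` parameter (per the docs linked above) and a `_limit` parameter, and that each response is JSON containing a `results` list of row objects. Check the Swagger UI before relying on those names.

```python
import csv
import json
import urllib.parse
import urllib.request

BASE = "https://catalog.goteborg.se/rowstore/dataset/ce5b8afd-6721-49c7-9ebc-f1ad33fba266"

def page_url(offset: int, limit: int = 500) -> str:
    """Build one paged request URL. _offset is from the API docs;
    _limit capping the page size is an assumption to verify."""
    return BASE + "?" + urllib.parse.urlencode({"_offset": offset, "_limit": limit})

def download_all(out_path: str, rowcount: int, limit: int = 500) -> None:
    """Fetch the dataset in chunks of `limit` rows and append them to one CSV.
    Assumes each JSON response has a 'results' list of row dicts."""
    writer = None
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        for offset in range(0, rowcount, limit):
            with urllib.request.urlopen(page_url(offset, limit)) as resp:
                rows = json.load(resp).get("results", [])
            if not rows:
                break  # ran out of data earlier than rowcount suggested
            if writer is None:
                # take the column names from the first page
                writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
                writer.writeheader()
            writer.writerows(rows)

# Example (needs network access; rowcount from the /info endpoint):
# download_all("dataset.csv", rowcount=1024135)
```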
