Question

Set maximum for Network Cost Calculator?



I am experimenting with creating isochrones using the NetworkCostCalculator. This works beautifully on small datasets but takes awfully long on larger networks (in my case a routable dataset created from OSM data).

Can I somehow limit the maximum cost up to which connections between nodes should be calculated?


7 replies


I've been working on pretty much exactly the same thing recently.

However, I found the processing time increases quite drastically. Generating a 5 minute drivetime contour takes 1 minute and 34 seconds, and every additional minute of drivetime almost doubles the processing time: 6 minutes takes 2:16, 8 minutes takes 7:23, and a 10 minute drivetime polygon took almost half an hour to compute (all with the same start address).

One of the things I am doing is limiting the amount of data that goes into the NetworkCostCalculator as much as possible. I download only the road data from OSM, through the Overpass Turbo API, for an area that's just big enough (x minutes of driving at the maximum speed plus 10 km/h extra). Even then, it's obvious the NetworkCostCalculator is the transformer taking the most time in my entire workspace.
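For reference, here's a minimal sketch of how such a download extent can be derived, outside FME. The 130 km/h default speed and the exact Overpass query template are assumptions for illustration, not part of my actual workspace:

```python
import math

def drive_time_bbox(lat, lon, minutes, max_speed_kmh=130.0, margin_kmh=10.0):
    """Bounding box around (lat, lon) big enough to contain anything
    reachable within `minutes` at the maximum speed plus a margin."""
    radius_km = (max_speed_kmh + margin_kmh) * minutes / 60.0
    # Rough degrees-per-kilometre conversion; good enough for a download extent.
    dlat = radius_km / 111.0
    dlon = radius_km / (111.0 * math.cos(math.radians(lat)))
    return (lat - dlat, lon - dlon, lat + dlat, lon + dlon)  # south, west, north, east

# Overpass QL: roads only, bbox in (south, west, north, east) order.
south, west, north, east = drive_time_bbox(52.37, 4.90, minutes=10)
query = f"""
[out:xml][timeout:120];
way["highway"]({south},{west},{north},{east});
(._;>;);
out body;
"""
print(query)
```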

A possible improvement that I haven't researched fully yet is splitting the data into 4 chunks (northeast, northwest, southwest and southeast from the starting point, with a bit of overlap) so I can use parallel processing on the NetworkCostCalculator.
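The quadrant split itself is simple enough to sketch in plain Python; the overlap of 0.01 degrees (roughly 1 km) below is an arbitrary example value:

```python
def quadrant_bboxes(south, west, north, east, start_lat, start_lon, overlap=0.01):
    """Split a (south, west, north, east) bounding box into four overlapping
    quadrants around the start point, one per parallel NetworkCostCalculator run."""
    return {
        "northeast": (start_lat - overlap, start_lon - overlap, north, east),
        "northwest": (start_lat - overlap, west, north, start_lon + overlap),
        "southwest": (south, west, start_lat + overlap, start_lon + overlap),
        "southeast": (south, start_lon - overlap, start_lat + overlap, east),
    }
```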

Hope this helps, no answer I'm afraid...


Thanks! I just wanted to make sure that I wasn't missing anything too obvious.

I've found it quite helpful to generalize the network before sending it to the NetworkCostCalculator. Even a generalization tolerance of just 5 to 10 m drastically simplifies OSM data and speeds up the process, while still yielding comparable results.
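For anyone who wants to prototype that generalization step outside FME, here is a minimal sketch using Shapely's Douglas-Peucker simplification (the 10 m tolerance is just an example, and the data must be in a projected, metric coordinate system for the tolerance to mean metres):

```python
from shapely.geometry import LineString

def generalize(lines, tolerance_m=10.0):
    """Douglas-Peucker simplification of road geometries, comparable to a
    generalization step before the NetworkCostCalculator.
    `lines` is an iterable of LineString in a metric (projected) CRS."""
    return [line.simplify(tolerance_m, preserve_topology=False) for line in lines]

road = LineString([(0, 0), (4, 0.5), (8, -0.3), (12, 0)])
print(generalize([road]))  # the near-straight segment collapses to its endpoints
```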

I guess another way to go about it would be to use ArcGIS Network Analyst through arcpy and then pass the results back into my workflow.


Another option might be this service that I just found out about yesterday. They've got limited coverage for now, though the Netherlands is supported so I could give it a try; they have an API. However, I did a few quick tests and I'm not very impressed with the quality of the polygons.

Hey, it would be interesting to get your feedback on why you weren't impressed with the quality of the polygons. I work for iGeolise (the makers of the app) and want to make sure we pass on feedback to continue to improve it.



I see we already have an enhancement request in to our developers for this. It's PR#71973. Currently it's targeted for FME2018, though I can't personally guarantee that.

I'll add a link to this thread and increase the priority a notch. But please do register your interest via our support team if you want updates on this, or post this as an idea on this site to see if anyone else is interested.

Otherwise I think Han's idea of parallel processing is a good one, dividing the data into four chunks centred on the start point.


Just a little update: I've run some tests with parallel processing, and on Desktop this makes a very big difference. Processing time for one particular run went from 28 minutes down to 6. I used moderate parallel processing, i.e. the same number of processes as there are cores available; in my case that's 4, which also happens to be the number of "chunks" I process the data in.
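Purely as an illustration of the same divide-and-conquer idea outside FME (in the workspace itself I use FME's parallel processing option on a custom transformer), here is a Python sketch where `run_cost_calculation` is a hypothetical placeholder for the actual cost calculation:

```python
from multiprocessing import Pool, cpu_count

def run_cost_calculation(chunk):
    """Hypothetical placeholder: compute the reachable network for one
    quadrant and return it, to be merged with the other quadrants later."""
    name, edges = chunk
    return name, edges  # the real shortest-path work would happen here

if __name__ == "__main__":
    chunks = [("northeast", []), ("northwest", []), ("southwest", []), ("southeast", [])]
    # Moderate parallelism: one process per core, matching the four chunks.
    with Pool(processes=min(cpu_count(), len(chunks))) as pool:
        results = dict(pool.map(run_cost_calculation, chunks))
    print(sorted(results))
```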

On FME Cloud (starter instance) this unfortunately does not work out.

Another 'chunk' approach:

OSM has a road classification, where higher classes mean higher possible speeds. If you start with only the higher-class roads in the NetworkCostCalculator, the mathematical problem is much smaller.

The remaining links form "islands" in between the higher-class network (provided the coding is done correctly and the higher class forms a closed network).

Then you can feed the NetworkCostCalculator again with one (or more) islands' links together with the higher-class set. Finally, merge the parts, taking the duplication of the higher-class roads into account.

I remember from my previous life at HERE Maps that the highest class holds approximately 10% of the complete network, meaning the math problem goes down roughly a hundredfold. At the moment I'm working on a similar problem with 400,000 links, but no classification, so I face other challenges.
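Here is a minimal sketch of that class-based approach with networkx, outside FME; the set of "higher class" highway tags is an assumption, and it presumes the start node lies on the higher-class network:

```python
import networkx as nx

# Assumed "higher class" OSM highway tags; adjust to the network at hand.
MAJOR = {"motorway", "trunk", "primary"}

def class_based_costs(edges, source, cutoff):
    """edges: iterable of (u, v, cost, highway_class) tuples.
    Solve the small higher-class problem first, then each lower-class
    "island" together with the higher-class network, and merge the results."""
    major, minor = nx.Graph(), nx.Graph()
    for u, v, cost, cls in edges:
        (major if cls in MAJOR else minor).add_edge(u, v, weight=cost)

    # Pass 1: higher-class network only (roughly 10% of the links).
    costs = nx.single_source_dijkstra_path_length(major, source, cutoff=cutoff)

    # Pass 2: one run per lower-class "island", combined with the higher class.
    for island in nx.connected_components(minor):
        combined = nx.compose(major, minor.subgraph(island))
        part = nx.single_source_dijkstra_path_length(combined, source, cutoff=cutoff)
        # Merge, keeping the lower cost where higher-class roads are duplicated.
        for node, cost in part.items():
            costs[node] = min(cost, costs.get(node, float("inf")))
    return costs
```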

Reply