Solved

Known limitations when reading large WFS datasets?

  • 13 February 2024
  • 1 reply
  • 40 views


I posted this Thursday, but apparently it hasn’t been ported to the new system, so here goes again:

 

I have a job that harvests a number of data layers via WFS. As of late, the job has been throwing an error when reading a particularly large dataset.

I've tried to retrieve the same dataset as an XML (GML) file using HttpCaller, and the resulting file is 452 MB.

The error says something like "message: comment or processing instruction expected", but it points to the end of the very last line of the data file, and the data file syntax seems OK.

So I'm wondering whether I've hit a limitation in the WFS reader? E.g. a limitation on downloaded file size?

For now, I've split the original FeatureReader, which fetched 5 layers at once, into 5 FeatureReaders each fetching a single layer. I'll see if that solves anything tomorrow.
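Outside of FME, the same per-layer strategy can be sketched as plain WFS GetFeature requests, one per layer. The endpoint URL and layer names below are placeholders, not from the original post:

```python
from urllib.parse import urlencode

# Hypothetical WFS endpoint and layer names -- substitute your own.
WFS_URL = "https://example.com/wfs"
LAYERS = ["layer_a", "layer_b", "layer_c", "layer_d", "layer_e"]

def getfeature_url(base_url, layer):
    """Build a WFS 2.0 GetFeature request URL for a single layer."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": layer,
    }
    return base_url + "?" + urlencode(params)

# One request per layer instead of one large combined request,
# keeping each response (and each parsed GML document) smaller.
urls = [getfeature_url(WFS_URL, layer) for layer in LAYERS]
```

Each URL could then be fed to an HttpCaller (or fetched with any HTTP client), so no single GML response grows to hundreds of megabytes.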

Using Desktop 2022.2.6 and Server 2022.2.6

Cheers.

PS:

It would have been nice to have a checkbox "Read layers individually" and keep the complete layer list, instead of having to rework the workspace. Maybe an idea for a quick enhancement?


Best answer by lifalin2016 13 February 2024, 14:11


1 reply


It actually solved the problem when I split the single large request into 5 smaller requests, one for each layer.

So it does seem that FME has a problem with large WFS datasets.

It would still have been nice to have done this by just clicking a “Fetch layers individually” checkbox ;-)
