This is more of a hypothetical question: I don't have a problem that needs solving, but I'm pondering the possibilities and limits of FME.
What strategies are there for processing huge datasets? I would imagine there is a limit to the resource management FME can do on its own, and at some point the software needs some "help" to break the data down into manageable chunks. How is this done? Tiling? Memory management? Renting a supercomputer? Etc.?
Trying to think of an example, I came up with the following: how would one go about generalizing the contour lines for a whole country in one run without watching life go by?
Please note that I am not asking how to generalize; my question is what strategies exist to speed up huge processing operations like the example above.