
I have a process that reads from Smallworld (using the SWORLDSWAF reader) in FME 2020.1. All it does is write to a file geodatabase, but it keeps running out of memory no matter what I do. Is there anything I can do to reduce memory usage when reading from Smallworld?

- I can't use 64-bit FME because the Smallworld reader isn't compatible with it
- I have 32 GB of RAM on my Windows 10 machine
- I have the temp files writing to a separate hard drive
- I have a WHERE clause filtering out the incoming features

 

Thanks

  1. Make sure feature caching is turned off. This is a memory killer.
  2. If you are writing to a geodatabase, you can do this in batches.
    1. FME has the option to read a set number of features starting from a given feature. You could run the process several times with different start features (see the sketch after this list).
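
One way to automate that batching is to call the workspace repeatedly from a small script, advancing the start feature each run so each batch runs in its own FME process. This is only a sketch under assumptions: it assumes the workspace exposes published parameters named START_FEATURE and MAX_FEATURES that are linked to the reader's "Start Feature" and "Max Features to Read" settings, that fme.exe is on the PATH, and the paths and counts are placeholders.

```python
import subprocess

# Hypothetical values: adjust the workspace path, parameter names, batch size
# and total count to match your own setup. START_FEATURE / MAX_FEATURES are
# assumed to be published parameters linked to the reader's "Start Feature"
# and "Max Features to Read" settings.
WORKSPACE = r"C:\fme\smallworld_to_gdb.fmw"
BATCH_SIZE = 50000
TOTAL_FEATURES = 400000  # rough upper bound on the feature count

for start in range(0, TOTAL_FEATURES, BATCH_SIZE):
    print("Running batch starting at feature", start)
    # Each batch runs in its own fme.exe process, so memory is released
    # between runs instead of accumulating in one translation.
    subprocess.run(
        [
            "fme.exe",
            WORKSPACE,
            "--START_FEATURE", str(start),
            "--MAX_FEATURES", str(BATCH_SIZE),
        ],
        check=True,
    )
```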

 

Number two really shouldn't be needed, though. FME should be handling the memory itself. I hope it's just the feature caching.


If I use feature caching to run just the SW dataset and then do the writing separately, it works. But I also need this to be automated on Server, so it has to run in one shot without manual intervention.



Did you try running the process without feature caching? FME Server does not perform feature caching, so if caching is the problem on Desktop, it won't be a problem on Server.



It fails when run without caching.



Ah, that's too bad. Perhaps raise a case with Safe via support; this should be a straightforward process.

 

One other thing to try might be to remove the WHERE clause and use a Tester transformer in FME instead. I know that might seem counterintuitive, but it could be that the WHERE clause is where the memory build-up is occurring.
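
If you prefer to keep that filter scriptable rather than using a Tester, the equivalent check can also be done in a PythonCaller. This is just a sketch under assumptions: the attribute name "status" and the value "ACTIVE" are hypothetical placeholders for whatever your WHERE clause was testing.

```python
# Minimal PythonCaller sketch: pass through only the features that match
# the condition the WHERE clause used to express. The attribute name
# ("status") and value ("ACTIVE") are hypothetical; substitute your own.
import fme
import fmeobjects

class FeatureFilter(object):
    def input(self, feature):
        # Keep the feature only if the attribute matches the wanted value.
        if feature.getAttribute("status") == "ACTIVE":
            self.pyoutput(feature)

    def close(self):
        # Nothing to clean up; all filtering happens per feature in input().
        pass
```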

 

Other places where memory can build up are the fanout setting on the writer and having multiple writers in the workspace.

 

What is the error message you get about the memory?



@swach did you find a solution for this?

Thanks!



I used a query and wrote the results to a staging file geodatabase in one workbench, then did the further processing in another workbench. It's one of those things that didn't want to work until one day it started running successfully. I don't know if it was my method that fixed it, a change on the server, or something else.
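
To run the two workbenches in one shot without manual intervention, they can be chained from a small script (or, on FME Server, by an automation that triggers the second job when the first completes). This is only a sketch under assumptions: the workspace paths are hypothetical and fme.exe is assumed to be on the PATH.

```python
import subprocess

# Hypothetical workspace paths: stage 1 extracts from Smallworld into a
# staging file geodatabase, stage 2 does the further processing.
STAGE_1 = r"C:\fme\smallworld_to_staging_gdb.fmw"
STAGE_2 = r"C:\fme\staging_gdb_processing.fmw"

for workspace in (STAGE_1, STAGE_2):
    print("Running", workspace)
    # check=True stops the chain if the first stage fails, so stage 2
    # never runs against an incomplete staging geodatabase.
    subprocess.run(["fme.exe", workspace], check=True)
```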

