Hi,

I have a large amount of data in CSV files, around 4 million records in each. I always run into memory problems when reading the data; the rest of the process is fine. I'm running 64-bit FME on a powerful machine, and memory usage is only at about 60%. Any suggestions?

Henrik Rinne
Hi,

Difficult to tell without knowing your workspace in detail, but some general tips are:

  • Remove as many attributes as possible, as early as possible in your workflow (the AttributeKeeper is your friend)
  • Beware of large lists that are kept around "indefinitely" (the AttributeRemover is your friend)
  • Don't store geometries in attributes, e.g. with the GeometryExtractor
  • Beware of blocking transformers such as the FeatureMerger; they fill up memory quickly
If you rely on a lot of FeatureMergers and the like, consider staging your processing: first load everything into a proper database (SQLite or PostgreSQL are good, free alternatives), then do the joins on the database side using e.g. the SQLCreator rather than the CSV reader. With proper indexing it is usually a lot faster and much more memory efficient.
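To make the staging idea concrete outside of FME, here is a minimal Python/sqlite3 sketch of the same pattern: load the CSV data into database tables, create an index on the join key, and let the database do the join instead of holding everything in memory. The table and column names here are hypothetical stand-ins, not anything from your workspace.

```python
import sqlite3

# Hypothetical rows standing in for two large CSV files.
parcels = [("1", "Alice"), ("2", "Bob")]
zones = [("1", "R1"), ("2", "C2")]

# Use a file path instead of ":memory:" for real multi-million-row data,
# so the data lives on disk rather than in RAM.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE parcels (id TEXT, owner TEXT)")
con.execute("CREATE TABLE zones (parcel_id TEXT, zone TEXT)")
con.executemany("INSERT INTO parcels VALUES (?, ?)", parcels)
con.executemany("INSERT INTO zones VALUES (?, ?)", zones)

# The index on the join key is what makes the database-side join fast
# and memory-friendly compared to a blocking in-memory merge.
con.execute("CREATE INDEX idx_zones_parcel ON zones (parcel_id)")

rows = con.execute(
    "SELECT p.id, p.owner, z.zone "
    "FROM parcels p JOIN zones z ON z.parcel_id = p.id"
).fetchall()
print(rows)
```

The SELECT statement at the end is the kind of query you would then put into a SQLCreator, so only the joined result ever enters the FME workspace.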

David
