
I have 50+ tables I want to set up a migration job for between MSSQL and PG.

The fastest way to set this up is using the "Generate workspace" functionality.

However, using this, the table qualifier is set to dbo, the default schema for MSSQL. In PG I am not using the same table qualifier name, so I want to use the X table qualifier for all the tables written to PG, but it does not look like that is possible? Do I need to set it for each table manually?

The same goes for the table names: I want to make them lowercase and remove any special characters automatically, if possible.

I never use the Generate workspace functionality.

 

But then again... I also never use Readers and Writers.

 

Start with a Creator, then a FeatureReader. Set the FeatureReader to read the MSSQL database, with all the tables you want to read.

 

Use BulkAttributeRenamers, and split fme_feature_type if necessary. Then dynamically write all the tables with the FeatureWriter set to the PostGIS writer (if the tables contain geometry) or the PostgreSQL writer (if they don't).
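If the built-in renamers don't cover everything, you could normalize the table names in a PythonCaller before the dynamic write. A minimal sketch, assuming the qualified name arrives as a string like "dbo.My-Table" and the target schema is "x" (the function name and the remapping are illustrative, not FME API):

```python
import re

def sanitize_table_name(qualified_name, target_schema="x"):
    """Strip the source schema qualifier (e.g. 'dbo.'), lowercase the
    table name, and drop any character that is not a letter, digit,
    or underscore, then prepend the target schema."""
    # Keep only the part after the last '.', discarding 'dbo.' etc.
    table = qualified_name.split(".")[-1]
    # Lowercase and remove special characters.
    table = re.sub(r"[^a-z0-9_]", "", table.lower())
    return f"{target_schema}.{table}"
```

For example, sanitize_table_name("dbo.My-Table") gives "x.mytable", which PostgreSQL will accept without quoting.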

 



Thanks, that is a good way to move a lot of data. My added complexity is that the spatial columns originate from the super geometry type, while in PostGIS I need them split into geometry-type-specific tables. (This is solved nicely and automatically with the GeometryFilter when you use the Generate workspace functionality.)


@so_much_more I would lean towards using a dynamic workflow. It's easy to set up, and you can still add the GeometryFilter and append a geometry type to the end of the table name, or something similar.


Thanks, that is a good way to move a lot of data. My added complexity is that the spatial columns originate from the super geometry type, while in PostGIS I need them split into geometry-type-specific tables. (This is solved nicely and automatically with the GeometryFilter when you use the Generate workspace functionality.)

Use a GeometryFilter to split out the features. Then append the geometry type to the table name when you write the data.
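The suffix step could also be a small helper in a PythonCaller after the GeometryFilter. A sketch, assuming each stream carries a geometry-type string (the attribute name, the category values, and the suffix mapping are all assumptions for illustration):

```python
# Map geometry categories to a short table-name suffix.
SUFFIXES = {
    "fme_point": "_pt",
    "fme_line": "_ln",
    "fme_area": "_pl",
}

def table_name_with_suffix(base_name, geom_type):
    """Append a geometry-type suffix so each geometry type lands in
    its own PostGIS table after the GeometryFilter split; unknown
    types keep the base name unchanged."""
    return base_name + SUFFIXES.get(geom_type, "")
```

So "roads" with a line geometry would be written to "roads_ln", while point features from the same source table go to "roads_pt".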

