
Hello @angela

 

 

Some things to look for that may help:

 

 

1) Ensure the writer shows all the correct attributes under 'User Attributes'. If the list is empty, change the writer's Attribute Definition setting to Automatic; this updates the writer to write all attributes it receives (when writing to a new table).

 

2) Ensure the attributes on the transformer directly before the writer and the attributes on the writer are all connected (green).

 

3) Check the log file for any errors; some fields may not be written as a result of errors. Errors/warnings could be related to the data type assigned to each attribute.
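If those three checks pass, it can also help to confirm the source features actually carry values before they reach the writer. A minimal sketch in plain Python (features modeled here as attribute dictionaries, not the FME feature API; the names are illustrative) that flags attributes which are NULL on every feature:

```python
def all_null_attributes(features):
    """Given features as attribute dicts, return the set of
    attribute names that are None (NULL) on every feature."""
    names = set().union(*(f.keys() for f in features)) if features else set()
    return {n for n in names
            if all(f.get(n) is None for f in features)}

# Illustrative records: PARCEL_ID is populated, OWNER never is.
feats = [
    {"PARCEL_ID": 101, "OWNER": None},
    {"PARCEL_ID": 102, "OWNER": None},
]
print(all_null_attributes(feats))  # {'OWNER'}
```

If an attribute comes back all-NULL here, the problem is upstream of the writer.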

Here are some articles/tutorials that may give you some hints/tips to ensure the workflow is correct.

 

https://knowledge.safe.com/articles/29323/converting-to-postgis-create-drop-or-truncate-a-ta.html

 

https://knowledge.safe.com/articles/29324/converting-to-postgis-write-to-an-existing-table.html

If you want to attach your workspace/log file, I can take a look at it and check exactly where the NULLs are coming from.

 

 

I hope that helps.


Thanks @trentatsafe. I had checked those three items and they look fine. What's the best way to send you the workspace/log file directly?


Here you go @trentatsafe

log.zip
esrishape2geodatabase-sde3.fmw


Note that we'd like the writer to be dynamic as the landbase data is from an outside agency and changes quite frequently.


Hello @angela

Thank you for the workspace and the log file. Nothing in the log file stands out to me in terms of errors/warnings. Looking at the workspace leads me to two thoughts.

 

 

1) I noticed you had two SDE writers within the workspace: a dynamically set one (<dynamic>) and a static one (ParcelsDistrictPlus). The dynamic writer is the one that confuses me; I am not sure you need it for this workflow, as you are just writing a shapefile to the SDE database. As such, I would try disabling that writer and connecting the shapefile to the ParcelsDistrictPlus writer.

 

 

2) Secondly, are you writing to a brand new table or inserting into a pre-existing table? If pre-existing, I would verify that the table in the SDE database does, in fact, match the schema from the shapefile (all the attributes match). Alternatively, if it is a brand new table, I would try the Drop Table option on the first translation. This ensures that any pre-existing table gets dropped and the new table picks up the schema correctly. (If trying Drop Table, I would double-check that the table name is not already present in the SDE database, to prevent loss of data should a table with that name exist.)

 

 

I have also altered the Attribute Definition to Automatic (in the event of writing to a new table); this will pick up the schema automatically from the shapefile. I have attached a modified version of the workspace you attached. Let me know if that helps. modifiedworkspace.fmw
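The schema check in 2) can also be scripted outside FME. A hedged sketch that compares source attribute names against the destination table's columns, ignoring case, to surface names that differ only in case (the attribute and column names below are made up for illustration):

```python
def schema_diff(source_attrs, table_columns):
    """Report source attributes that match a table column only when
    case is ignored (case_mismatch), and attributes with no matching
    column at all (missing)."""
    table_by_lower = {c.lower(): c for c in table_columns}
    case_mismatch, missing = [], []
    for a in source_attrs:
        if a in table_columns:
            continue  # exact match, nothing to report
        if a.lower() in table_by_lower:
            case_mismatch.append((a, table_by_lower[a.lower()]))
        else:
            missing.append(a)
    return case_mismatch, missing

# Illustrative names, not taken from the actual workspace:
mismatch, missing = schema_diff(["PARCEL_ID", "ZONE"], ["parcel_id", "acres"])
print(mismatch)  # [('PARCEL_ID', 'parcel_id')]
print(missing)   # ['ZONE']
```

A case-only mismatch is exactly the kind of difference that can leave a case-sensitive writer populating NULLs while reporting no errors.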

Same result. I exported the results to shapefile so you can see. parcels.zip



Hello @angela,

 

With there being no errors in the FME log, I have a new suggestion for the dynamic workflow.

 

Would you mind trying the following:

 

1) Remove all attributes from the writer (using the little minus sign at the bottom), or simply re-add the writer and set the Feature Definition to Dynamic (it should show only a blank attribute with a length of 200 and the data type 'char').

 

2) Change Schema Sources to 'Schema From Schema Feature'.

 

3) Truncate Table: Yes.

 

 

If this doesn't work, would you mind uploading a sample of your data? This would allow us to try to reproduce the behavior you are seeing, as the only other idea that comes to mind is that the source data is simply missing the values in the first place.

 

 

If none of the above works, we can try a screenshare to determine the cause/issue if you would like.

 

 

I hope that helps.

 

 



Okay, I tried that and got "Cannot define schema for 'ParcelsDistrictPlus' as the feature does not contain schema information, and schema source is set to 'Schema from Schema Feature'. Please verify writer feature type configuration." And the final error was "Unable to find the schema definition 'ParcelsDistrictPlus' for feature with feature type name 'ParcelsDistrictPlus'. Ensure the feature type name is correct and that its schema definition exists in the schema source"

 

 


 


 

 

Hello @angela,

 

I apologize, I misunderstood 'Schema from Schema Feature'; you would want to unselect this (set the Schema Sources to the shapefile; you can keep both selected if you'd prefer, I don't believe this will harm the translation).

 

Also, would you mind setting the Drop Table option on the writer to 'Yes'? I'd like to start fresh for the translation. If that does not work, I'm afraid we will have to take a look at the data. If you'd prefer not to share your data here, you can create a case and send the data through it. I will pick up the case, take a look at the data, and try to run the workspace here.

 

 

Let me know how that goes.

 



 

I'm afraid it still didn't work. I'll open a case.

 

 



 

 

Can you point me to where to go to open a case, please?

 

 



 

 

Here you go. https://www.safe.com/support/

 

Just select 'Submit a case', fill out the form, and upload your data if you can. I will look for the case and take a look once I see it.

 

 


To give an update on the answer to this question:

 

 

The issue that was occurring was the result of a bug: the shapefile attributes were being converted to lowercase during the writing process, even though the source schema was uppercase. This has been filed as a problem report (PR#78996).

 

 

The workaround, for the time being, is to use a BulkAttributeRenamer to change the character case of the attributes before writing; this will ensure the feature's attributes are correctly populated by the writer.
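For anyone who prefers scripting it (for example in a PythonCaller), the BulkAttributeRenamer step amounts to lowercasing every attribute name before the writer. A minimal stand-in using plain Python dicts rather than the FME feature API:

```python
def lowercase_attribute_names(feature):
    """Return a copy of the feature with every attribute name
    lowercased, mirroring BulkAttributeRenamer's change-case
    (lowercase) action."""
    return {name.lower(): value for name, value in feature.items()}

# Illustrative feature, not taken from the actual dataset:
feature = {"PARCEL_ID": 101, "OWNER": "Smith"}
print(lowercase_attribute_names(feature))
# {'parcel_id': 101, 'owner': 'Smith'}
```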

I have attached this question to the PR and will update this posting as more information becomes available.


@trentatsafe I am seeing this same behavior in FME Desktop 2018, but I don't follow the workaround. Would it be possible for you to provide some more detail on the workaround?



Hello @chris_m,

 

 

The issue the user was seeing in the original posting had to do with the shapefile attributes being uppercase, while the dynamic PostGIS writer was expecting lowercase attributes. So, while the attribute names are converted to lowercase on the writer, they are not populated, as FME does not map the uppercase attributes to the lowercase destination schema. Adding a BulkAttributeRenamer and changing the attributes to lowercase allows each one to be mapped to the correct attribute on the writer, because FME is case-sensitive. Let me know if that provides some clarity.
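The failure mode can be shown in a few lines of plain Python (an illustration of case-sensitive mapping, not FME internals; the names are made up): the uppercase source attribute never matches the lowercase destination column, so the column falls back to NULL.

```python
source = {"PARCEL_ID": 101}          # Shapefile side: uppercase names
destination_columns = ["parcel_id"]  # PostGIS side: lowercase names

# Case-sensitive lookup finds no match, so the column gets NULL (None).
written = {col: source.get(col) for col in destination_columns}
print(written)  # {'parcel_id': None}

# After a BulkAttributeRenamer-style lowercase rename, the value maps through.
renamed = {name.lower(): value for name, value in source.items()}
written = {col: renamed.get(col) for col in destination_columns}
print(written)  # {'parcel_id': 101}
```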

 



 

I assume you are referring to the attribute names, correct?

 

