Hi all, what is the easiest way to change a field type in a GDB?
Esri introduced the BigInteger type, which is giving me a lot of issues. It doesn't work in all applications, and I can't even do a relate or join on it.
I need to get it into an integer. For some reason, the data we now receive on a weekly basis has all its number fields as BigInteger.
I need to find the BigInteger fields and change their type without losing data. The data also fits a normal integer.
Any thoughts would be much appreciated.
Thanks
Best answer by debbiatsafe
Hello @kelin84
You can use a schema feature to identify any fields in the input geodatabase where the data type is big integer. This information would be stored in the schema feature's attribute{}.native_data_type list attribute, and the field name would be stored in attribute{}.name.
It appears both attribute{}.native_data_type and attribute{}.fme_data_type need to be changed in order for the data to be written as long integer. Changing one but not the other does not work.
The workspace does assume all input big integer values fit the long integer type.
I have attached an example demonstrating this workflow and I hope it helps.
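To illustrate the substitution the workspace performs, here is a minimal Python sketch of what a PythonCaller placed after a Schema (Any Format) reader might do. The exact type strings ("big_integer" → "integer", "fme_int64" → "fme_int32") are assumptions; check the actual values on your own schema features before relying on them.

```python
def downcast_big_integers(schema_attrs):
    """Rewrite big-integer type entries in a schema feature's attribute{} list
    so the fields get written as long integers.

    schema_attrs: dict mapping exposed attribute names such as
    'attribute{0}.native_data_type' to their values. Returns a new dict with
    both native_data_type and fme_data_type replaced, since (as noted above)
    changing only one of the two is not enough.
    """
    # Assumed type names -- verify against your own schema features.
    replacements = {
        "native_data_type": {"big_integer": "integer"},
        "fme_data_type": {"fme_int64": "fme_int32"},
    }
    out = dict(schema_attrs)
    for name, value in schema_attrs.items():
        for suffix, mapping in replacements.items():
            if name.endswith("." + suffix) and value in mapping:
                out[name] = mapping[value]
    return out
```

In a real workspace the same logic would read and set these list attributes on the schema feature itself; this standalone version just shows the mapping.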
Hi @kelin84, you can change the Data Type of your data by using an AttributeManager and changing the Type to one of the int options or whatever you’d prefer. I hope this helps!
Sorry for not making it clear. My issue is to filter only the fields that use the BigInteger field type and then automatically change them to Integer. If I only use the AttributeManager, I need to handpick each field and make the change. I want it to loop through all of the layers in a GDB, use something (maybe a Tester) that routes all BigInteger fields one way, change the field type, and merge the failed test values back into the same layer.
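The "tester" step described above boils down to checking that every value in a BigInteger field actually fits a normal 32-bit integer before downcasting. Here is a plain-Python sketch of that check; in FME the same test could live in a Tester or a PythonCaller, and the field values shown are hypothetical.

```python
# 32-bit signed integer range, the limit for a normal "long integer" field.
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def fits_int32(values):
    """Return True if every non-null value fits a 32-bit signed integer,
    i.e. the field can be safely downcast from BigInteger."""
    return all(INT32_MIN <= v <= INT32_MAX for v in values if v is not None)
```

Fields that pass go down the downcast path; fields that fail would be merged back unchanged, as described above.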