Solved

Update Esri GDB field type


kelin84
Contributor

Hi all, what is the easiest way to change the field type in a GDB?

Esri introduced the BigInteger field type, which is giving me a lot of issues. It doesn’t work in all applications, and I can’t even do a relate or join on it.

 

I need to get it into an Integer. For some reason, the data we now get delivered on a weekly basis has all the number fields as BigInteger.

I need it to find the BigInteger fields and change their type without losing data. The values also fit a normal Integer.

Any thoughts would be much appreciated.

 

Thanks 

 

Best answer by debbiatsafe


4 replies

saraatsafe
Safer
  • May 10, 2024

Hi @kelin84, you can change the Data Type of your data by using an AttributeManager and changing the Type to one of the int options or whatever you’d prefer. I hope this helps!


kelin84
Contributor
  • Author
  • May 13, 2024

Thank you for the reply.

Sorry for not making it clear. My issue is to filter only the fields that use the BigInteger field type and then automatically change those to Integer. If I only use the AttributeManager, I have to handpick each field and make the change. I want it to loop through all the layers in a GDB, use something (maybe a Tester) that routes all fields with BigInteger one way, change the field type, and merge the failed test values back into the same layer.

 

Thanks


debbiatsafe
Safer
  • Best Answer
  • May 13, 2024

Hello @kelin84 

You can use a schema feature to identify any fields in the input geodatabase where the data type is big integer. This information would be stored in the schema feature’s  attribute{}.native_data_type list attribute and the field name would be stored in attribute{}.name.

You can follow the same principle as the Dynamic Workflows: Modifying the Schema Feature article and change the data type of any big integer fields to long integer type.

It appears both attribute{}.native_data_type and attribute{}.fme_data_type need to be changed in order for the data to be written as long integer. Changing one but not the other does not work.

The workspace does assume all input big integer values fit the long integer type.

I have attached an example demonstrating this workflow and I hope it helps.
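For readers who cannot open the attachment, here is a minimal sketch of the same idea as a PythonCaller that rewrites the schema feature’s list attributes. It assumes the fmeobjects API reads and writes complete list attributes via the attribute{} syntax, and the class name and the type-name strings ('big_integer', 'fme_int64', 'integer', 'fme_int32') are assumptions; inspect the values on your own schema features and adjust them to match.

import fmeobjects  # available inside FME's Python environment


class SchemaBigIntToLong(object):
    """Hypothetical PythonCaller class: rewrites big integer fields on a schema feature."""

    def input(self, feature):
        # Read the complete list attributes off the schema feature.
        # Both lists are parallel to the field list, one entry per field.
        native = feature.getAttribute('attribute{}.native_data_type') or []
        fme_types = feature.getAttribute('attribute{}.fme_data_type') or []

        for i, native_type in enumerate(native):
            # 'big_integer' / 'fme_int64' are assumed spellings of the big
            # integer type names; check your schema features if they differ.
            if native_type == 'big_integer' or fme_types[i] == 'fme_int64':
                native[i] = 'integer'       # geodatabase long integer
                fme_types[i] = 'fme_int32'  # matching FME generic type

        # Write both lists back; changing only one of them is not enough.
        feature.setAttribute('attribute{}.native_data_type', native)
        feature.setAttribute('attribute{}.fme_data_type', fme_types)
        self.pyoutput(feature)

    def close(self):
        pass

The writer feature type would still need to be configured for a dynamic schema (taking its schema definition from the schema feature), as described in the linked article, for the modified types to take effect.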


kelin84
Contributor
  • Author
  • May 14, 2024

This is amazing, thank you so much. This is the best answer I’ve ever had. It would have taken me forever to get to that.

🙏🙏🙏🙏

