Is there a way to automatically detect datetime fields from a reader reading many tables (at least 2k) and convert them to a different format?
@colasamk It would depend on the source format. For most datasets that have a 'table' definition - i.e. databases, Shape, MapInfo TAB, etc. - FME will detect the datetime field. For CSV, FME can be configured to recognize datetime patterns under the reader parameters - Advanced - Schema Generation.
@Mark Stoakes, thank you for your response.
This is what I'm trying to do.
I have a workspace that contains an Oracle reader reading multiple tables (about 2k for now) and a Google BigQuery writer.
It turns out that the datetime format in Oracle is not compatible with Google BigQuery. Hence, I need to convert every datetime field from the Oracle table so that they can be loaded to Google BigQuery.
I learned that I could specify the datetime fields in a DateTimeConverter, but because of the number of tables I need to deal with (eventually it will grow to tens of thousands of tables), going through all the fields and specifying them manually is not really feasible.
Is there a way to detect datetime patterns from the tables automatically and convert them to a specific format?
Any help will be appreciated.
Thank you.
@colasamk If the Oracle date/time fields are a date/time data type, then FME should recognize them and the process should be seamless. If your date/time fields are text (varchar2 or similar), then you can use the SchemaScanner to generate the list attribute Attributes{}, look for the datetime pattern in it, and then use that list to identify the date/time attributes.
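To illustrate the "scan values to find the datetime pattern" idea outside of FME, here's a minimal Python sketch of the detection step. The function name, the dict-per-row representation, and the assumption that values use FME's YYYYMMDD / YYYYMMDDhhmmss string form are mine, not part of the SchemaScanner itself:

```python
import re

# Assumption: string-typed columns hold values in FME's date/datetime
# pattern - YYYYMMDD for dates, YYYYMMDDhhmmss[.fff] for datetimes.
FME_DATETIME_RE = re.compile(r"^\d{8}(\d{6}(\.\d+)?)?$")

def detect_datetime_attributes(rows):
    """Given rows as dicts, return the set of attribute names whose
    non-empty values all match the FME date/datetime pattern."""
    candidates = {}
    for row in rows:
        for name, value in row.items():
            if value in (None, ""):
                continue  # ignore nulls; they don't rule a column in or out
            matches = bool(FME_DATETIME_RE.match(str(value)))
            candidates[name] = candidates.get(name, True) and matches
    return {name for name, ok in candidates.items() if ok}

# Example: only CREATED matches the datetime pattern across all rows.
rows = [
    {"ID": "1", "NAME": "alpha", "CREATED": "20240131120000"},
    {"ID": "2", "NAME": "beta",  "CREATED": "20240201083015"},
]
print(detect_datetime_attributes(rows))  # {'CREATED'}
```

In a workspace you would do the equivalent in a PythonCaller (iterating `feature.getAllAttributeNames()`), so the same logic runs for every table without listing fields by hand.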
@Mark Stoakes, the date fields are defined as datetime in Oracle.
Should I then use a DateTimeConverter to do it?
Do you think you could send me a sample workspace? I'm using a DateTimeConverter, but I'm not sure how to make it automatically detect datetime fields.
Thank you.
@colasamk If you generate a workspace with an Oracle reader and a Google BigQuery writer, and the Oracle table has a date or datetime attribute type, FME will detect the datetime data type and map it to the Google BigQuery datetime data type. A little more in this article.
This will also work in a dynamic workflow, although you might not see the explicit datetime data types in the reader & writer feature types.
If this is not working for you, then perhaps contact FME Support.
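If the automatic type mapping doesn't kick in (e.g. the values arrive as strings), one fallback is to reformat the values themselves in a PythonCaller before the writer. A minimal sketch of that conversion step; the function name and the assumption that values are 14-digit FME datetime strings are mine:

```python
import re
from datetime import datetime

def to_bigquery_datetime(value):
    """Convert an FME-style datetime string (YYYYMMDDhhmmss) to the
    'YYYY-MM-DD hh:mm:ss' form BigQuery accepts; pass anything else
    through unchanged so non-datetime attributes are unaffected."""
    if isinstance(value, str) and re.fullmatch(r"\d{14}", value):
        return datetime.strptime(value, "%Y%m%d%H%M%S").strftime("%Y-%m-%d %H:%M:%S")
    return value

print(to_bigquery_datetime("20240131120000"))  # 2024-01-31 12:00:00
print(to_bigquery_datetime("alpha"))           # alpha (left as-is)
```

Applied to every attribute of every feature, this avoids naming the datetime fields per table, which is what makes it workable at the 2k+ table scale.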