
I have the following issue in a dynamic workflow. When converting from GDB to Shapefile, all the date fields (fme_datetime) become strings. I need those fields to stay fme_datetime. How can I solve this issue?

Unfortunately, the Esri Shapefile format doesn't support a datetime data type, so I think it's reasonable that FME creates a string field in the destination Shapefile dataset in order to save a datetime value.


Really? But I have many Shapefiles with fields of data type Date, so the format does support it. What's going on with FME?


Date and datetime are not the same. Shapefiles only have a date data type.


Yes, the Shapefile format supports the date type, but it doesn't support datetime.


Is there any workaround? I need my dates to be formatted like YYYY-MM-DD, instead of a string in YYYYMMDD.


You can use the DateTimeConverter to convert the format of a date value.

Just be aware that FME treats 'YYYY-MM-DD' as a string value when writing into a destination dataset, and will not write it into a column of Date type.
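For reference, the reformatting the DateTimeConverter performs here can be sketched in plain Python. This is an illustrative sketch of the string transformation only, not FME code; the function name is mine.

```python
from datetime import datetime

def fme_datetime_to_iso_date(value: str) -> str:
    """Reformat an FME datetime string (YYYYMMDD or YYYYMMDDHHMMSS)
    into an ISO-style date string (YYYY-MM-DD).

    Note: the result is a plain string; as discussed above, FME will
    not write it into a Date-typed column.
    """
    # Parse only the date portion, then re-emit with dashes.
    return datetime.strptime(value[:8], "%Y%m%d").strftime("%Y-%m-%d")

print(fme_datetime_to_iso_date("20240115103000"))  # 2024-01-15
```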


It seems the DateTimeConverter is a reasonable workaround, although the data type of the target field remains a string, not a date. Thanks for the help!

If this is a dynamic workflow, how are you setting the schema? Unless the schema says that the field should be a date, it won't be written as a date.


I'm not setting up the schema manually. Instead I'm using the source database schema.


So the source database schema will specify the field as datetime, which cannot be written to Shapefile. The schema would need to be altered so that the field is date if you want it written as a date. This is probably tricky to achieve within your dynamic workflow.


I would like to know how I can alter the schema to store a Date in the .shp without putting a lot of additional effort into it.


Hi @fikusas, both egomm and takashi have made some very important points:

  1. If you are defining the output schema from the source schema and the output format doesn't support the same attribute type (Date vs. Datetime), the attribute type will not be mapped over.
  2. Manually manipulating the date format to look the way you want on the output might not achieve your goal.
    • If the writer attribute type is a date type, it will only accept the standard FME date format (YYYYMMDD).
    • You can set the writer attribute type to a string type, and it will accept the formatted date (YYYY-MM-DD), but then the attribute type may not meet your requirement.
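The two options in point 2 above can be sketched side by side in plain Python. This is illustrative only (function names are mine, not FME transformers): a Date-typed writer attribute wants the bare YYYYMMDD string, so the datetime value just needs truncating.

```python
def fme_datetime_to_fme_date(value: str) -> str:
    """Truncate an FME datetime (YYYYMMDDHHMMSS) to the standard FME
    date format (YYYYMMDD), which a Date-typed writer attribute accepts."""
    return value[:8]

def fme_datetime_to_display_string(value: str) -> str:
    """Produce the dashed display form (YYYY-MM-DD); this only works
    if the writer attribute is a string type."""
    d = value[:8]
    return f"{d[0:4]}-{d[4:6]}-{d[6:8]}"

print(fme_datetime_to_fme_date("20240115103000"))       # 20240115
print(fme_datetime_to_display_string("20240115103000"))  # 2024-01-15
```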

So, I think there are two potential approaches to solving this problem, depending on whether you truly need the workflow to be dynamic.

  • If you are working with datasets that have the same schema, the workspace doesn't need to be dynamic. You can set the writer first to Automatic, which will copy over all incoming attributes, then switch to Manual to edit the attribute type to meet your requirement. When the writer attribute is set to "Date", the value read from the source "Datetime" type should write out correctly. There is no need to format it using the DateTimeConverter.
  • If you are working with multiple datasets with different schemas, then you will need a dynamic workflow. This will be more complicated. The method I can think of is to use the help of Schema features. Here is a good article to get you familiar with the method.
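To make the schema-feature idea concrete, the core of it is remapping any datetime attribute type to date before the schema reaches the dynamic writer. The sketch below shows that remapping logic in plain Python over a simple (name, type) list; it is an assumption-laden illustration, not actual FME schema-feature API calls, though fme_datetime and fme_date are real FME generic types.

```python
# Hypothetical sketch: remap datetime attribute types to date so the
# dynamic writer creates Date fields. The schema representation here
# (a list of (name, type) pairs) is illustrative only.
TYPE_REMAP = {"fme_datetime": "fme_date"}

def remap_schema_types(schema):
    """Return a copy of the schema with datetime types swapped for date."""
    return [(name, TYPE_REMAP.get(attr_type, attr_type))
            for name, attr_type in schema]

source_schema = [("ID", "fme_int32"), ("CREATED", "fme_datetime")]
print(remap_schema_types(source_schema))
# [('ID', 'fme_int32'), ('CREATED', 'fme_date')]
```

In an actual workspace this remapping would typically be done on the schema feature's attribute-type attributes (e.g. with an AttributeManager or a PythonCaller) before the feature reaches the writer.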

Hope this helps a bit. As you can see, if you don't need a dynamic workflow, setting up the workspace requires very little effort, whereas if you do need a dynamic workflow, the setup will be more complex, but you gain the flexibility of processing multiple schemas with one workspace, which reduces the effort of further manual processing.

If you need more help, please let us know.

