I am attempting to write to a CSV file where the filename includes a date/timestamp. My situation is the same as the question here: https://knowledge.safe.com/questions/35824/write-csv-file-where-filename-includes-datetimesta.html. I've tried adding the timestamp to the name, but it produces a CSV for each row of the original table. I tried a parallel Creator > DateTimeStamper branch, but it generates two CSVs. I tried using a FeatureMerger, but I'm confused about what to join on; my options are to join on _timestamp and a column from the source data, which don't match. Should I ask a new question for this particular instance of the problem, or try posting on the aforementioned thread?
Join unconditionally on 1 = 1, so all features will get the timestamp created after the Creator transformer.
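If it helps to see the idea outside of Workbench, here is a rough pandas sketch of what the 1 = 1 join does (plain Python, not FME itself; the column names are just placeholders): both sides get the same constant key, so every source row picks up the single timestamp feature.

```python
# Rough pandas analogy of the FeatureMerger "1 = 1" join (not FME syntax):
# give both sides the same constant key so every source row picks up
# the single timestamp row produced by the Creator/DateTimeStamper branch.
import pandas as pd

source = pd.DataFrame({"id": [1, 2, 3], "value": ["a", "b", "c"]})
stamp = pd.DataFrame({"_timestamp": [pd.Timestamp.now()]})

source["_join"] = 1   # the "1" on the Requestor side
stamp["_join"] = 1    # the "1" on the Supplier side

merged = source.merge(stamp, on="_join").drop(columns="_join")
print(merged)  # every row now carries the same _timestamp
```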
I tried joining _timestamp to every column from the source data but that did not work. How do you join unconditionally on 1=1?
Enter the values 1 and 1 in the join fields, like this:
I did that and then added the _timestamp to the writer name and it worked.
In theory it shouldn't overwrite the existing file though, right? The time-stamp is the same each time I run it. Is there a way to have it change based on when I run the job?
What does your timestamp look like? It should not overwrite if it is a date-time combination; if it includes only the date, it will overwrite. You can use a DateTimeConverter transformer after the DateTimeStamper to set the right format for the _timestamp attribute.
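For what it's worth, here is a tiny plain-Python illustration (not FME syntax) of why the format matters: a date-only stamp is identical on every run that day, so the same filename keeps getting reused, while a date-plus-time stamp changes per run.

```python
# Plain-Python illustration (not FME) of why the timestamp format matters:
# a date-only stamp repeats within the same day, so the writer keeps
# reusing (and overwriting) the same output file.
from datetime import datetime

now = datetime.now()
date_only = now.strftime("%Y%m%d")          # e.g. 20240315 -> same all day
date_time = now.strftime("%Y%m%d_%H%M%S")   # e.g. 20240315_142501 -> unique per run

print(f"output_{date_only}.csv")   # overwritten on every run that day
print(f"output_{date_time}.csv")   # a new file for each run
```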
Hope this helps.
Did a quick test and this is my result (running twice in consecutive minutes):
This is my result, still overwriting.

This is my flow:
These are the parameters for DateTimeConverter:
I notice you have caching on. Are you running from the start, or is your process using the cache? In that case it will re-use the same timestamp every time and therefore overwrite the file.
Yep that was it. Thank you.
Hi @anthonystokes, you could just set the CSV file name using the date/time functions DateTimeNow and maybe DateTimeFormat, e.g. @DateTimeFormat(@DateTimeNow(),%Y%m%d_%H%M). However, the file name will be the time the first feature gets sent to the writer, not when the translation started.
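If you ever need the same logic in script form (e.g. inside a PythonCaller or a scripted parameter), a rough plain-Python sketch would look like the following; the output folder, file name, and field names are just placeholders, not anything from your workspace.

```python
# Rough plain-Python sketch of writing a CSV with a timestamped filename
# (not the FME writer itself; paths and field names are placeholders and
# the output folder is assumed to exist).
import csv
from datetime import datetime

rows = [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]
stamp = datetime.now().strftime("%Y%m%d_%H%M")   # same pattern as %Y%m%d_%H%M above
path = f"C:/temp/output_{stamp}.csv"             # placeholder output folder

with open(path, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "value"])
    writer.writeheader()
    writer.writerows(rows)
```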