I'm having trouble visualizing it. It would be useful to post a sample of what your data looks like and what you would like it to look like. They can be made-up values; it's the schemas that are important.
If I understood the requirement correctly, this approach might help you.
- Create 15-minute interval records for the entire duration using a Creator, AttributeCreator, etc.
- Merge the signal data onto the interval records with a FeatureMerger, using the time value as the join key.
- Write the features from both the Merged and UnmergedRequestor ports into the destination table (see the sketch after this list).
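Outside FME, the same idea looks roughly like this in pandas (a sketch only; the column names interval_start, pulse_count and logger_id are made up and would need to match your schema):

```python
import pandas as pd

def build_profile(pulses: pd.DataFrame, start: str, end: str,
                  logger_id: str) -> pd.DataFrame:
    # Step 1: one record per 15-minute interval for the entire duration
    # (the Creator/AttributeCreator step).
    intervals = pd.DataFrame({
        "interval_start": pd.date_range(start=start, end=end, freq="15min")
    })

    # Step 2: merge the signal data onto the intervals using the time value
    # as the join key (the FeatureMerger step).
    merged = intervals.merge(pulses, on="interval_start", how="left")

    # Step 3: intervals with no matching pulses (the UnmergedRequestor
    # features) are kept and written with a count of 0.
    merged["pulse_count"] = merged["pulse_count"].fillna(0).astype(int)
    merged["logger_id"] = logger_id
    return merged
```

The left join plays the role of the FeatureMerger: matched intervals keep their counts, unmatched ones come through like the UnmergedRequestor features and end up with 0.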
Hi both,
Thanks for your replies. To clarify I've added some files. These are the originals:
38499 PulseA 40103764 11C2399 10052019132213.txt
38500 PulseA 44233990 23C3037 10052019130729.txt
And the file structure:
filesttucture.txt
These files show data over a date range. The header of each file shows the ID of the item, with a start and end time; the data that follows shows a pulse at a particular date and time.
I'm using a TestFilter to bucket the times into 15-minute blocks (such as pulses between 00:00 and 00:15) and a StatisticsCalculator to count them for each 15-minute period.
There are 96 15-minute intervals in a day, but the file might only show data for 4 of them.
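In rough pandas terms (just to illustrate what that part of the workspace is doing; the "pulse_time" column name is made up), that step is:

```python
import pandas as pd

def count_pulses_per_block(pulses: pd.DataFrame) -> pd.DataFrame:
    # Snap each pulse down to the start of its 15-minute block,
    # e.g. 13:22:13 -> 13:15:00 (the TestFilter step).
    pulses = pulses.copy()
    pulses["interval_start"] = pulses["pulse_time"].dt.floor("15min")

    # Count pulses per block (the StatisticsCalculator step).
    return (pulses.groupby("interval_start")
                  .size()
                  .reset_index(name="pulse_count"))
```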
I need to join these to a master time dataset:
timemaster.xlsx
Which I can do, but I need to copy the logger ID / serial ID into the blank time master rows to create a 96-interval profile, with 0s for the missing data, like so:
finaldata.xlsx
At the moment all I'm achieving is this:
myoutput.xlsx
Without filling in the blanks I can't create a smooth 96-interval profile in my BI tool.
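What I'm after is roughly this (again only a pandas sketch to show the intent; the column names are illustrative and the join key would have to match in both tables):

```python
import pandas as pd

def fill_profile(time_master: pd.DataFrame, counts: pd.DataFrame,
                 logger_id: str, serial_id: str) -> pd.DataFrame:
    # Left-join the observed counts onto the full 96-slot time master,
    # so every 15-minute slot is present even when the file had no pulses.
    profile = time_master.merge(counts, on="interval_start", how="left")

    # Copy the logger/serial IDs into the blank master rows and use 0 for
    # the missing counts, giving a smooth 96-interval profile for the BI tool.
    profile["logger_id"] = logger_id
    profile["serial_id"] = serial_id
    profile["pulse_count"] = profile["pulse_count"].fillna(0).astype(int)
    return profile
```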
I've added my workspace:
.pittesting4.fmw
Thank you for any help!
I think this workspace example describes a way to accomplish the requirement.
textline2none.fmwt
Takashi,
Thank you so much. Your workflow is simple, elegant and beautiful. Just by studying it you've taught me so many new techniques and methods I can use to create my workflows. I really appreciate the time you've taken to look at this for me.
Warmest regards,
Neil
Takashi,
Sorry - the only other thing I need to do is add the date field to the output as well (DD/MM/YYYY). The Startdate and EndDate might be different if the data runs over into the next day. Is this possible?
Kind Regards
Neil
Hi Takashi,
Your solution is fantastic but I also need to add the date to the current timestamp. This will complicate things as the start date and end date might span days. Can this be added into the current workflow? I have added a file:
40150 PulseA 45942640 24C2727 15052019091558.txt
Kind Regards
Neil
Some date/time operations might help you. Example: textline2none-v2.fmwt
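The gist, sketched in pandas rather than the workspace itself (names are illustrative): generate the 15-minute slots across the full start-to-end range, so data running over midnight simply continues into the next day, then split the date out as DD/MM/YYYY.

```python
import pandas as pd

def intervals_with_dates(start: str, end: str) -> pd.DataFrame:
    # 15-minute slots across the whole start-to-end range; a range that
    # crosses midnight carries on into the following day automatically.
    slots = pd.date_range(start=start, end=end, freq="15min")
    out = pd.DataFrame({"interval_start": slots})

    # Split out the date (DD/MM/YYYY) and time parts for the output table.
    out["date"] = out["interval_start"].dt.strftime("%d/%m/%Y")
    out["time"] = out["interval_start"].dt.strftime("%H:%M")
    return out

# Example: a range running over midnight gets 16/05/2019 after 00:00.
print(intervals_with_dates("2019-05-15 09:15", "2019-05-16 01:00").tail())
```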
Thanks Takashi. Is it possible you could downgrade the workspace you have added to FME 2018.1? I can't open it fully.
Kind Regards
Neil
This is it. textline2none-v2-2018.fmwt
I hope everyone who asks a question mentions the required FME version up front if it isn't the latest one.
Thank you - and sorry, yes, I should have mentioned I'm not on 2019. My company IT dept hasn't got round to upgrading us yet.
Hi Takashi,
I'm using your workspace successfully, but I'm now having an issue where, if a file comes in with new data, the process is adding more records to the database rather than overwriting them. I've had a look at previous replies on here and people suggested a Joiner, but that is deprecated in 2019. Do you know what I might use to check whether a record already exists so it's updated rather than added again? My data has no ID column.
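For what it's worth, the behaviour I'm after is an update-or-insert along these lines (an sqlite3 sketch only, to show the intent; I'm assuming logger ID plus interval timestamp as a composite key, since there is no single ID column):

```python
import sqlite3

def upsert_counts(conn: sqlite3.Connection, rows):
    # rows: iterable of (logger_id, interval_start, pulse_count) tuples.
    conn.execute("""
        CREATE TABLE IF NOT EXISTS profile (
            logger_id      TEXT,
            interval_start TEXT,
            pulse_count    INTEGER,
            PRIMARY KEY (logger_id, interval_start)
        )
    """)
    # ON CONFLICT updates the existing record instead of adding a duplicate
    # when a new file arrives for an interval that is already loaded
    # (needs SQLite 3.24 or newer).
    conn.executemany("""
        INSERT INTO profile (logger_id, interval_start, pulse_count)
        VALUES (?, ?, ?)
        ON CONFLICT (logger_id, interval_start)
        DO UPDATE SET pulse_count = excluded.pulse_count
    """, rows)
    conn.commit()
```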
Thank you