
I'm working on a workflow that will run on my FME Cloud instance and is started by a REST command. The idea is to develop a database-driven PDF creator that generates x number of PDF pages and bundles them into a zip.

The setup of my workflow is as follows (or should at least work like this):

Step 1: A REST command is sent to the cloud with a TaskID and starts workflow 1.

Step 2: Workflow 1 fetches the information in our database that is connected to the TaskID. The fetched table contains x number of PageIDs (depending on how many pages are needed), and each PageID contains the information needed to set the paper size, geometry layers, WMS settings, graphs and tables, the name of the PDF, and other required information.

Step 3: For each PageID, one line is submitted to the FMEServerJobSubmitter (in sequence, waiting for each job to complete) with the information needed to start workflow 2, which builds the PDF out of the table generated in step 2. For each PageID a single PDF is generated (this workflow is tested and works offline and online).

Step 4: The finished PDF from workflow 2 should be returned to workflow 1 and bundled with the other PDFs (if there is more than one PageID) into a zip at the end of the workflow, then sent to an email address provided in the REST command.
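Step 1's REST call could look roughly like the sketch below, assuming the FME Server REST API v3 `transformations/submit` endpoint. The repository, workspace, token, and published parameter names (`TaskID`, `Email`) are all illustrative placeholders, not the actual setup:

```python
import json
from urllib import request

# Hypothetical sketch of step 1: build (not send) a POST request to the
# FME Server REST API v3 submit endpoint. Repository, workspace, and
# parameter names are made-up placeholders for this example.
def build_submit_request(host, repo, workspace, task_id, email, token):
    url = f"https://{host}/fmerest/v3/transformations/submit/{repo}/{workspace}"
    payload = {
        "publishedParameters": [
            {"name": "TaskID", "value": task_id},
            {"name": "Email", "value": email},
        ]
    }
    return request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"fmetoken token={token}",
        },
        method="POST",
    )

req = build_submit_request("example.fmecloud.com", "PDFCreator",
                           "workflow1.fmw", "1234", "user@example.com", "abc")
```

Sending `req` with `urllib.request.urlopen` would then start workflow 1 with the TaskID and email available as published parameters.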

For this flow I still have one problem:

(How) can you use the data (PDFs) that is generated by an FMEServerJobSubmitter and combine the created files into a zip file?

The Data Download Service actually bundles output in a zip file and can email it to the user, but you won't be able to call that using the FMEServerJobSubmitter.

If you're not able to rework your workspace to use the Data Download Service, you can write the PDFs to a temporary location using the FeatureWriter. Then use a ZipArchiver (custom transformer, available on the FME Hub) to create the zip file and an Emailer transformer to email it to the recipient.
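To show what the zip step amounts to, here is a minimal plain-Python sketch of it: write PDFs to a temporary folder, then bundle every `*.pdf` there into one archive (roughly what the ZipArchiver does inside the workspace). The file names and dummy contents are made up:

```python
import tempfile
import zipfile
from pathlib import Path

# Illustrative stand-ins for the PDFs written by the FeatureWriter.
tmp = Path(tempfile.mkdtemp())
for name in ("page_1.pdf", "page_2.pdf"):
    (tmp / name).write_bytes(b"%PDF-1.4 dummy content")

# Bundle every PDF in the temporary folder into a single zip file.
zip_path = tmp / "bundle.zip"
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    for pdf in sorted(tmp.glob("*.pdf")):
        zf.write(pdf, arcname=pdf.name)  # store without the temp-dir prefix

print(zipfile.ZipFile(zip_path).namelist())  # → ['page_1.pdf', 'page_2.pdf']
```

The `arcname` argument keeps the temporary folder path out of the archive, so the recipient just sees the PDF names.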


It took me a while to find the time to rework everything.

Current status:

The main flow runs a WorkspaceRunner (flow A). Flow A creates the PDFs per page and stores them on the drive.

ZipArchiver (adds all the PDF files listed in my Succeeded table coming out of the WorkspaceRunner into a zip) -> FeatureHolder (waits until everything is ready) -> StatisticsCalculator (some summary data, and also to have one output object) -> Emailer (sends mail with the zip).

This works, but the problem is that it does not work correctly with the Data Download Service.

Is it possible to read the PDF files that were created by the WorkspaceRunner and stored on disk back into the workspace, use them as output, and have them stored and zipped by the Data Download Service afterwards?

I tried the FeatureReader, but it cannot read PDFs back in, so WorkspaceRunner -> FeatureReader does not work.

Could a FeatureReader with the Directory and File Pathnames format work? Any other ideas?

TLDR:

I have created x number of PDFs on the disk of my FME Server with a workspace run by the WorkspaceRunner, and I would like to get them into the workspace so I can use the Data Download function (which zips them) and download the zip / mail the link to myself.


I tried the Emailer, but it has a file-size limitation of x MB for most users (most email providers do not allow large attachments). In theory the total file size of the PDFs could be in the GBs, so this road looks like a dead end.

(The Emailer was attached to the StatisticsCalculator, so only one feature was kept.) The WorkspaceRunner contains the FeatureWriter that saves the files to a folder named after the PrintAssignmentID; the ZipArchiver creates the zip in the temp location and adds these files to it.

I cannot get it working after reworking it. It creates the zip, but afterwards I cannot read, fetch, or drag it into the workflow so that I can use the Data Download Service to mail this zip.

Is there a way to use the FeatureReader to read the PDF files as data and then use the Data Download Service?

Really frustrating to see those files there but somehow not be able to get it to work the way I want. 😛

Emailing the link to the zip directly, so that the recipient could download it like a normal link, could potentially lead to a data breach. The folders do not get a randomised URL like the Data Download Service provides, so I cannot use this tactic (although it looked like it worked).


Hi @JeroenR, I think you can do that with this procedure.

  1. In the first workspace (Data Download Service), create a temporary folder path with the TempPathNameCreator.
  2. Run the second workspace (Job Submitter Service) via an FMEServerJobSubmitter (Wait for Jobs to Complete: Yes). Here, pass the temporary folder path to a published parameter of the second workspace as the destination folder into which all the resulting files will be saved.
  3. Connect a FeatureReader (Format: Directory and File Pathnames - PATH) to the Succeeded port of the FMEServerJobSubmitter to read every file path within the temporary folder.
  4. Finally, add a File Copy writer in order to MOVE all the files within the temporary folder to a destination zip dataset.
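The four steps above can be sketched in plain Python to make the data flow concrete. This is only an illustration of what the transformers do, not FME code; the folder and file names are made up:

```python
import shutil
import tempfile
from pathlib import Path

# Step 1: create a temporary folder (TempPathNameCreator's role).
temp_dir = Path(tempfile.mkdtemp())
# Common destination folder the files get moved to (File Copy writer's target).
dest_dir = Path(tempfile.mkdtemp()) / "out"
dest_dir.mkdir()

# Step 2 stand-in: pretend the second workspace has written its PDFs
# into the temporary folder passed to it as a published parameter.
for name in ("map_1.pdf", "map_2.pdf"):
    (temp_dir / name).write_bytes(b"%PDF-1.4 dummy")

# Step 3: read every file path within the temporary folder
# (FeatureReader in Directory and File Pathnames mode).
paths = sorted(p for p in temp_dir.iterdir() if p.is_file())

# Step 4: MOVE each file to the destination (File Copy writer).
for p in paths:
    shutil.move(str(p), str(dest_dir / p.name))

print(sorted(f.name for f in dest_dir.iterdir()))  # → ['map_1.pdf', 'map_2.pdf']
```

After the move, the temporary folder is empty and the destination folder holds everything the Data Download Service needs to bundle.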



Since the first workspace will be run as a Data Download Service, it's not necessary to make a zip folder with the File Copy writer. Just move all the files created by the second workspace run with the FMEServerJobSubmitter into a common destination folder; FME Server will archive them into a single zip file automatically.

 

