
I have two separate flows for a process, as shown below (upper and lower). The two flows are separate, but the lower workflow must be executed only after the upper workflow has finished. To handle this, I have created a separate workbench for each flow, and I call the second workbench after the upper workbench has run. Is there any way I can merge these two workflows into a single workbench and control which workflow executes first?

 

I would do something like this:

  1. Creator
  2. FeatureReader
  3. ...process 1...
  4. FeatureWriter #1
  5. SQLExecutor, connected to the Summary port of FeatureWriter #1.
  6. ...process 2...
  7. FeatureWriter #2
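The key point in the chain above is the sequencing: the SQLExecutor only fires once the FeatureWriter emits its Summary feature, so process 2 cannot start before process 1 has finished writing. A rough Python/sqlite3 sketch of the equivalent logic (the table and column names are made up for illustration, this is not FME code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE features (id INTEGER PRIMARY KEY, processed INTEGER DEFAULT 0)"
)

# Process 1 (FeatureWriter equivalent): write all features first
conn.executemany(
    "INSERT INTO features (id) VALUES (?)", [(i,) for i in range(1, 101)]
)
conn.commit()  # the write is finished; the Summary feature would be emitted here

# Process 2 (SQLExecutor equivalent): guaranteed to run only after the write
conn.execute("UPDATE features SET processed = 1")
conn.commit()

count = conn.execute(
    "SELECT COUNT(*) FROM features WHERE processed = 1"
).fetchone()[0]
print(count)  # 100
```

The point of the sketch is only the ordering guarantee: the follow-up SQL runs strictly after the write has committed.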

 


 

Hi @david_r, my workbenches use SDE Geodatabase writers, which do not have output ports. I tried using the FeatureWriter to control how my records are processed, but after a lot of struggle I couldn't make it work (see my question here: https://knowledge.safe.com/questions/91497/no-features-seen-in-feature-writer.html).


You do indeed need to use the FeatureWriter for this to work. As you say, the regular writers have no output port.

I'm not sure why the FeatureWriter doesn't work in your case, I often use it with the SDE Geodatabase writer and it works very well.

If you cannot get the FeatureWriter with the SDE Geodatabase to work, consider contacting your local FME reseller or Safe support.



@david_r The FeatureWriter does work for me; however, I do not get a feature-by-feature status of whether each feature was written successfully. Also, in my case the second workflow starts with a SQLCreator, which doesn't have an input port :(


It's not clear from your other question: do you get a feature out of the Summary port of the FeatureWriter? That should be all you need to trigger an SQLExecutor, which can take the place of your SQLCreator.



I get only one record out of the Summary port of the FeatureWriter, even if I process 100 records. There is no attribute coming out of the Summary port that can trigger the SQLExecutor, because the query in the SQLCreator is not dynamic. Only if I get each processed feature out of the FeatureWriter can I use the SQLExecutor by passing it the ID of the processed feature.



That's normal, you only get one single feature from the Summary port.

You need to activate and use the "One per feature type" port, found at the very bottom of the FeatureWriter parameters.



I did that as well, but it didn't work out. Does it work only for inserts? Because when I checked with "one per feature type", the writer did a couple of updates, and the summary showed that it wrote 2 features.


An SQLCreator gets triggered once, so you should only need to trigger the SQLExecutor once too. If your workflows are currently independent, you don't use any attributes from the top workflow in your SQLCreator, so you don't need any of them in the SQLExecutor either.



My second workflow must execute only after my first workflow has finished. To give you an idea: my first workflow processes features from a feature class, and my second workflow sets the processing flags for the features that the first workflow processed successfully. What I am looking for is some way to find out what happened to each feature, so that I can update each feature's flag based on whether its processing succeeded or failed.
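For what it's worth, once a per-feature result (ID plus write outcome) is available, the flag update itself is straightforward. A rough Python/sqlite3 sketch, where the `flags` table and the `written` list are invented for illustration and stand in for whatever the writer reports:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flags (feature_id INTEGER PRIMARY KEY, status TEXT)")

# Hypothetical per-feature results from the writer: (feature ID, write succeeded?)
written = [(101, True), (102, True), (103, False)]

# Set each feature's processing flag according to its individual outcome,
# which is what the asker wants to do via an SQLExecutor per written feature
for fid, ok in written:
    conn.execute(
        "INSERT INTO flags (feature_id, status) VALUES (?, ?)",
        (fid, "processed" if ok else "failed"),
    )
conn.commit()

rows = conn.execute(
    "SELECT feature_id, status FROM flags ORDER BY feature_id"
).fetchall()
print(rows)  # [(101, 'processed'), (102, 'processed'), (103, 'failed')]
```

The hard part in FME is obtaining those per-feature outcomes, which is what the rest of this thread is about.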



And the workflow described will do this: only once a feature appears from the Summary port will the SQLExecutor be triggered.



Based on some complex conditions, I had to set up multiple writers for the same feature class, as shown below. The upper one is dedicated to inserts and the lower ones to updates; all three writers point to the same feature class. Hence I need the processing info at the feature level so that I can update the flags for each feature.

 



If you simply need to recover the input features after the FeatureWriter has terminated, you can do something like this:

