Archived

Allow embedded loops with potentially blocking transformers

Related products: Transformers
  • February 20, 2017
danilo_fme

jdh
Contributor

Currently, looping with blocking transformers is only allowed in linked custom transformers (see http://knowledge.safe.com/articles/1273/looping-with-blocking-transformers.html).

However, some transformers that force a custom transformer to be linked rather than embedded are only blocking when they have a Group By or another parameter setting that causes them to block.

It would be nice if these transformers could be used in an embedded loop when they are not in "blocking mode".
For example: Bufferer (no Group By), AreaAmalgamator (Self-Amalgamation mode), PythonCaller (Function).
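
For illustration, here is a rough sketch of the two PythonCaller modes, based on its standard function/class interface (the class name and the attribute written here are placeholders, not anything from the original post). The function form handles each feature as it arrives and releases it immediately, so nothing forces the transformer to block; the class form can hold features in input() and only release them in close(), which is the group-based behaviour that makes it blocking.

```python
import fmeobjects  # FME Python API; only available inside FME's Python environment

# "Function" mode: each feature is processed and output automatically as soon
# as the function returns, so no features are held back (non-blocking).
def processFeature(feature):
    feature.setAttribute('processed', 'yes')  # illustrative attribute only


# "Class" mode: input() is called once per feature and close() only runs after
# the last feature, so holding features until close() makes it blocking.
class FeatureHolder(object):
    def __init__(self):
        self.features = []

    def input(self, feature):
        self.features.append(feature)   # hold every incoming feature

    def close(self):
        for feature in self.features:   # release them only after all input is seen
            self.pyoutput(feature)      # pyoutput is supplied by PythonCaller at runtime
```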


1 reply

lifalin2016
Contributor
  • February 21, 2017

It would be nicer if the concept of forcing embedded transformers to be linked were abandoned altogether.

I think it might be implemented by having FME automagically create a hidden temporary linked transformer on the fly from the embedded template whenever it is needed in the translation.

