
I wondered: maybe Safe could add an option to the Sampler to drop unsampled data, instead of sending it out of the unsampled port.

Why?

Because when I run a workspace with caching turned on, it’s using up resources caching unsampled data that I don’t need. 

This is especially true since I had set the Sampler up to take only the first feature. Having obtained that single feature, I really don’t need the Sampler to carry on storing other data. In fact, I don’t really need it to carry on processing at all!
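As an aside, a PythonCaller can approximate this behaviour today by emitting only the first feature and silently dropping the rest, so nothing unsampled ever reaches a cache. This is just a minimal sketch, assuming the standard fmeobjects PythonCaller interface (the class name is mine):

```python
import fmeobjects  # standard FME Python API, available inside a PythonCaller

class FirstFeatureOnly(object):
    """Emit the first feature received and drop everything after it."""

    def __init__(self):
        self.emitted = False

    def input(self, feature):
        # Called once per incoming feature. Only the first one is
        # passed on via pyoutput(); the rest are simply discarded,
        # so they are never cached or sent downstream.
        if not self.emitted:
            self.pyoutput(feature)
            self.emitted = True

    def close(self):
        # Nothing to flush; all unwanted features were already dropped.
        pass
```

Note this still doesn’t stop upstream transformers from running; it only avoids caching the unsampled features.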

This is a good idea; I also frequently use the Sampler in the same way. Perhaps a feature caching update could include both this idea and the Bookmark Feature Cache Switch idea.


I'm sure I posted a similar idea on the previous website, but for a switch on all output ports of all transformers. Then you could choose any output ports where dropping features is OK/expected whilst developing a workspace.


I’d prefer an option to stop upstream processing once the Sampler has passed the required number of features. At the moment I can wrap everything in a bookmark and collapse it to avoid caching, but there’s often still a lot of upstream processing time spent waiting for all features to reach the Sampler.
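Worth noting: as I understand it, FME pushes features downstream as they arrive, so everything upstream runs to completion regardless of what the Sampler still needs. True early termination would require pull-based (lazy) evaluation. A plain-Python sketch of the behaviour being asked for, with all names mine and nothing FME-specific:

```python
from itertools import islice

def upstream(source):
    """Stand-in for an expensive chain of upstream transformers."""
    for feature in source:
        # Imagine heavy per-feature processing here.
        yield feature

# Pull-based evaluation: requesting just one result stops upstream
# after a single feature instead of processing all 1,000,000.
first = list(islice(upstream(range(1_000_000)), 1))
print(first)  # -> [0]
```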


A workaround for now could be to put the Sampler in a collapsed bookmark. Then only the output is cached, and since the unsampled port isn't connected to anything, its features will be dropped.


I was thinking about whether the Tester:Failed port could get the same behaviour, but as a couple of people mentioned, why not a switch for all output ports? That would be really useful! Of course, I always try to consider whether a smaller request is more likely to get done than a larger one. The bigger the request, the more we have to justify it.

