Hi,

The SQLExecutor transformer fetches records in sets of 10,000 (the default the PostGIS reader uses). Is there any way to increase that?

There is an option to add a reader as a resource, which gives you access to the specific PostGIS reader parameters, but the SQLExecutor doesn't seem to pick those up. Other than that I don't really see a way of doing it.


Hi @tono

What are you using the SQLExecutor for?

Could you use the FeatureReader or the SQLCreator instead of the SQLExecutor?

Both allow using a WHERE clause and can apply a limit.


Hi @tono

I would suggest using a LIMIT.

Just do a count of the number of objects you will fetch, define a maximum number to be fetched at a time, and use a Cloner + ExpressionEvaluator to update the OFFSET value that you will use in the SQLExecutor.

For example, say you will fetch 100,000 objects but want to limit it to 10,000 in one go:

nbRequests = @ceil(count / limit)

Cloner (nbRequests copies) - also creates the attribute _copynum

offset = _copynum * limit
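
As a minimal sketch, the statement in the SQLExecutor could then look like this (my_table and the id sort column are made up; @Value() is FME's syntax for substituting a feature attribute into the statement):

-- One incoming feature per batch; offset was set to _copynum * limit above.
-- A stable ORDER BY is needed, otherwise pages may overlap or skip rows.
SELECT *
FROM my_table
ORDER BY id
LIMIT 10000
OFFSET @Value(offset)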

Good luck!


@tono

Is your question answered?



Actually I am fetching 10K at a time and would prefer to fetch 100K.



I am trying to extract data from a table, but I'm also running some PostGIS generalization and sorting in the SQL statement, so I need a bit more than a WHERE clause :(
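
For illustration, the kind of statement I mean might look like this (table, columns, and the simplify tolerance are made up; the ST_Simplify and ORDER BY parts are what a plain WHERE clause can't express):

-- Generalize and sort in the database before FME reads the features.
SELECT id,
       ST_Simplify(geom, 10) AS geom  -- assumed tolerance of 10 map units
FROM my_table
ORDER BY id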



I would also advise using the SQLCreator/FeatureReader, but first: why would you want to increase the number? It doesn't automatically mean the data will be read faster.


Yes, redgeographics did. The answer is 'no' :)


Isn't it? I thought that fetching more rows when reading from PostGIS would be faster, and that this could be a bottleneck in my workspace.


To figure that out, you could first try ONLY reading the data by disabling the transformers or connections right after the reader(s); after that, try reading + transformation and evaluate where the time goes.


How about creating a view in the PostGIS database with the outcome of the SQL query (including the generalization) and using that view in a PostGIS reader (with the option to limit the number of records)?
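
As a rough sketch of that idea (view, table, and column names are made up; the generalization mirrors the example above):

-- Run once in the database, or via an SQLExecutor/SQLCreator:
CREATE OR REPLACE VIEW my_table_generalized AS
SELECT id,
       ST_Simplify(geom, 10) AS geom  -- assumed tolerance
FROM my_table;

-- The PostGIS reader can then read my_table_generalized like a table,
-- applying its own WHERE clause and max-features limit.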

