Hi,
The SQLExecutor transformer fetches records in sets of 10 000 (the PostGIS reader's default). Is there any way to increase this?
There is an option to add a reader as a resource, which gives you access to the specific PostGIS reader parameters, but the SQLCreator doesn't seem to pick those up. Other than that, I don't really see a way of doing it.
Hi @tono
What are you using the SQLExecutor for?
Could you use the FeatureReader or the SQLCreator instead of the SQLExecutor?
Both allow a WHERE clause and can apply a limit.
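To illustrate, here is a minimal sketch of the kind of statement you could put in a SQLCreator or FeatureReader (the table and column names are hypothetical):

```sql
-- Hypothetical table and columns, just to show the shape of the query:
-- filter with WHERE and cap the result set with LIMIT in one statement.
SELECT id, name, geom
FROM public.roads
WHERE municipality = 'Oslo'
LIMIT 10000;
```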
Hi @tono
I would suggest using a LIMIT.
First run a count of the objects you will fetch, define the maximum number to fetch in one go, and use a Cloner + ExpressionEvaluator to update the OFFSET value that you will use in the SQLExecutor.
For example, suppose you will fetch 100 000 objects but want to limit it to 10 000 in one go:
nbRequests = @ceil(count / limit)
Cloner (nbRequests) - also creates the attribute _copynum
offset = _copynum * limit
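As a sketch, each clone would then run something like this in the SQLExecutor, with the limit and offset attributes substituted via @Value() (the table and column names are placeholders):

```sql
-- Executed once per clone; @Value() substitutes the FME attributes
-- computed above (limit is fixed, offset = _copynum * limit).
-- An ORDER BY on a stable key is needed so the OFFSET pages don't overlap.
SELECT *
FROM my_table
ORDER BY id
LIMIT @Value(limit) OFFSET @Value(offset);
```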
Good luck!
Is your question answered?
I would also advise using the SQLCreator/FeatureReader, but first: why would you want to increase the number? A larger fetch size doesn't automatically mean the data will be read faster.
Isn't it? I thought that fetching more rows per request from PostGIS would be faster, and that this could be a bottleneck in my workspace.
How about creating a view in the PostGIS database with the outcome of the SQL query (including the generalization) and using that view in a PostGIS reader (with the option to limit the number of records)?
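As a rough sketch (the generalization function and all names here are assumptions, since I don't know your actual query):

```sql
-- Hypothetical: wrap the query, including the generalization, in a view,
-- then point the PostGIS reader at generalized_features and limit the
-- number of records in the reader parameters.
CREATE VIEW generalized_features AS
SELECT id,
       ST_SimplifyPreserveTopology(geom, 10.0) AS geom
FROM source_table;
```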