OutOfMemoryError in org.apache.kyuubi.engine.spark #2284

Answered by pan3793
stgztsw asked this question in Q&A
Hi @stgztsw, we provide a solution for exactly this scenario: https://kyuubi.apache.org/docs/latest/deployment/spark/incremental_collection.html

Also consider increasing `kyuubi.session.engine.request.timeout` if you encounter socket timeout issues after enabling incremental collection mode.
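Putting the two suggestions together, a minimal sketch of what the configuration might look like, assuming the `kyuubi.operation.incremental.collect` key described in the linked documentation and an illustrative timeout value (not taken from this thread):

```properties
# kyuubi-defaults.conf (sketch, key names per the linked Kyuubi docs)

# Stream results from the Spark engine incrementally instead of
# collecting the full result set into driver memory at once,
# which is what triggers the OutOfMemoryError on large queries.
kyuubi.operation.incremental.collect=true

# Incremental collection keeps the operation open longer, so raise
# the request timeout to avoid socket timeouts (example value).
kyuubi.session.engine.request.timeout=PT5M
```

The same keys can alternatively be passed per-session, e.g. appended to the JDBC connection URL as `#kyuubi.operation.incremental.collect=true`, rather than set globally.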

Replies: 2 comments 2 replies

Answer selected by stgztsw