Fix the bug I introduced previously when adapting `build.sbt` for JDK 11 by rjurney · Pull Request #476 · graphframes/graphframes · GitHub

Fix the bug I introduced previously when adapting build.sbt for JDK 11#476

Merged
rjurney merged 4 commits into master from rjurney/jdk-8-fix
Jan 17, 2025

Conversation

@rjurney (Collaborator) commented Jan 11, 2025

JDK 8 support is adequate for now; I believe it is still the recommended version for Spark.

@codecov-commenter

@bjornjorgensen (Contributor)

For Spark 4.0 we need Java 17 and 21.

@rjurney (Collaborator, Author) commented Jan 13, 2025

> For Spark 4.0 we need Java 17 and 21.

Okay, I will test again, but this seemed to break Java 8, so I thought of pulling it. I will check again.

I do not have any idea what is required for Spark 4. If you have a better grasp, I would mucho appreciate a Spark 4 support ticket :)

@rjurney (Collaborator, Author) commented Jan 13, 2025

@bjornjorgensen yeah this breaks Java 8, which I think most Spark users run. I guess we should conditionally add it based on the JVM version?

```
Unrecognized option: --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
```
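One way to gate the option on the JVM version is a conditional in `build.sbt`. This is a hypothetical sketch, not the PR's actual diff: the `isJava8` check, the exact option list, and the `Test / javaOptions` scope are all assumptions.

```scala
// Hypothetical build.sbt sketch: JDK 8 has no module system, so it rejects
// --add-opens outright. Only pass the flag when running on JDK 9 or later.
val isJava8 = System.getProperty("java.specification.version") == "1.8"

Test / javaOptions ++= {
  if (isJava8) Seq.empty
  else Seq("--add-opens=java.base/sun.nio.ch=ALL-UNNAMED")
}
```

`java.specification.version` is `"1.8"` on JDK 8 and a plain major number (`"11"`, `"17"`, ...) afterwards, which is why the equality check suffices here.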

@Kimahriman (Contributor)

Spark also includes `-XX:+IgnoreUnrecognizedVMOptions`, so it works for both Java versions.
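That flag would let a single option list work on both JDKs without any version check. A hypothetical `build.sbt` sketch of the idea (the `Test / javaOptions` scope and option list are assumptions, not the merged change):

```scala
// Hypothetical sketch: -XX:+IgnoreUnrecognizedVMOptions tells the JVM to
// silently skip options it does not understand. A JDK 8 JVM then ignores
// --add-opens instead of failing to start, while JDK 9+ still applies it.
// The ordering matters: the -XX flag must come before the unknown option.
Test / javaOptions ++= Seq(
  "-XX:+IgnoreUnrecognizedVMOptions",
  "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED"
)
```

The trade-off is that genuine typos in JVM options are also silently ignored, which is why some projects prefer the explicit version check instead.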


@rjurney rjurney merged commit db8da51 into master Jan 17, 2025
@rjurney rjurney deleted the rjurney/jdk-8-fix branch April 15, 2025 00:33


4 participants