chore: Fix jdk documentation for spark #979
Conversation
Thanks @adi-kmt
### Supported Apache Spark Versions
For more details, refer to the documentation: [Apache Spark supported by Comet](overview.md#supported-apache-spark-versions)

- **Apache Spark 3.3, 3.4, or 3.5**
We would like to remove duplication in the docs for supported Apache Spark versions. @adi-kmt, is there any specific reason for having this duplication?
The main motive was to indicate the different JDK versions with respect to the Apache Spark versions. Do you think it could be written in a better way?
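For illustration only (a sketch, not wording from this PR), a consolidated section that keeps the JDK-per-Spark-version mapping in one place could look something like this:

```markdown
### Supported Spark Versions

| Apache Spark    | Supported JDK   | Notes                                                          |
| --------------- | --------------- | -------------------------------------------------------------- |
| 3.3 / 3.4 / 3.5 | JDK 8 and above |                                                                 |
| 4.0             | JDK 17/21       | Experimental: development/testing use only, not for production |
```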
- **Apache Spark 3.3, 3.4, or 3.5**
- **Supported Java Versions**: JDK 8 and above
- **GLIBC**: 2.17 (CentOS 7) and above
Our latest release is built on Ubuntu 20.04, which has glibc 2.31. Could we update that here?
For the pre-built binary release, we can mention what glibc we use, although I think that doesn't mean older glibc is unsupported: users can still build Comet themselves and use an older glibc.
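As a sketch only (the thread does not settle on final wording), the GLIBC bullet could distinguish the pre-built binaries from source builds along these lines:

```markdown
- **GLIBC**: 2.17 (CentOS 7) and above when building Comet from source; the pre-built
  binaries are built on Ubuntu 20.04, which ships glibc 2.31
```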
@adi-kmt please resolve the conflict

@@ -30,17 +30,22 @@ Make sure the following requirements are met and software installed on your machine

### Supported Spark Versions
- **Scala**: 2.12/2.13
- **Apache Spark 4.0** *(Experimental support: intended for development/testing use only and should not be used in production yet.)*
- **Supported Java Versions**: JDK 17/21
- **GLIBC**: 2.17 (CentOS 7) and above
For GLIBC, I'm wondering whether it's a good idea to specify a Linux distribution... I would leave just a version and put a command to check GLIBC, like `ldd --version`.
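A minimal sketch of what that bullet could look like with the distribution name dropped and a check command added (exact wording was not settled in this thread):

```markdown
- **GLIBC**: 2.17 and above (run `ldd --version` to see which glibc your system provides)
```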
Which issue does this PR close?
Closes #742
Rationale for this change
What changes are included in this PR?
Changed the `installation.md`.
How are these changes tested?
Just a documentation change.