Incompatibility in Spark 2.1.6 #19
Comments
Had the same exception; I had to update the Spark dependency in the library's build.gradle to make it work. You'll need to clone the repo and update it with the same version you're using on your cluster, or you can grab my fork directly if that's easier.
Use org.apache.spark.internal.Logging instead of org.apache.spark.Logging.
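A note on that swap: in some Spark 2.x releases org.apache.spark.internal.Logging is declared private[spark], so it can only be extended by code that lives under the org.apache.spark package. A common fallback, sketched here purely as an illustration (the trait and class below are not part of spark_dbscan), is to define a small SLF4J-backed replacement and mix it in wherever the old trait was used:

```scala
import org.slf4j.{Logger, LoggerFactory}

// Drop-in stand-in for the removed org.apache.spark.Logging trait,
// backed by SLF4J, which Spark already ships with.
trait Logging {
  @transient private lazy val log: Logger = LoggerFactory.getLogger(getClass.getName)

  protected def logInfo(msg: => String): Unit    = if (log.isInfoEnabled) log.info(msg)
  protected def logWarning(msg: => String): Unit = if (log.isWarnEnabled) log.warn(msg)
  protected def logError(msg: => String): Unit   = log.error(msg)
  protected def logError(msg: => String, e: Throwable): Unit = log.error(msg, e)
}

// Hypothetical usage: mix the trait into classes that previously
// extended org.apache.spark.Logging (class name is made up here).
class DbscanDriver extends Logging {
  def run(): Unit = logInfo("starting DBSCAN")
}
```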
@csbenz your fork works only with Spark 1.6.2, and I can't find build.gradle (there's only build.sbt). Am I missing something?
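For reference, the version bump in the library's build.sbt would look roughly like this; the Scala and Spark versions below are placeholders to match to your cluster, not values taken from the repo:

```scala
// build.sbt (excerpt) -- versions are examples only
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // "provided" because the cluster supplies Spark at runtime
  "org.apache.spark" %% "spark-core"  % "2.3.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "2.3.0" % "provided"
)
```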
Can anyone elaborate on their solution to this problem? I'm running Spark 2.3.0.
You can solve this problem with the following steps.
Hello!
I ran into an error when running spark_dbscan on Spark 2.1.6.
The error is:
java.lang.NoClassDefFoundError: org/apache/spark/Logging
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
Spark 2.0 removed org.apache.spark.Logging, so spark_dbscan cannot run.
Is there any solution to this problem?
I have tried changing the other dependencies, but it still fails.
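As a quick way to confirm the root cause (not part of the original report), you can check from spark-shell that the old class is simply absent on a 2.x cluster while the relocated trait resolves:

```scala
// Throws ClassNotFoundException on Spark 2.x, matching the stack trace above
Class.forName("org.apache.spark.Logging")

// Resolves on Spark 2.x: the trait was moved under the internal package
Class.forName("org.apache.spark.internal.Logging")
```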