
AbstractMethodError in Spark 2.3.0 #38

Open
Majid-Soheili opened this issue Oct 15, 2018 · 3 comments
@Majid-Soheili

Majid-Soheili commented Oct 15, 2018

I would like to discretize the Epsilon dataset, which has 2k features and 400k records. To do this, I used Spark 2.3.0. When I execute the code, I get the error below. Notably, this error did not occur when I used Spark 2.2.0, but Spark 2.2.0 does not seem suitable for a high-dimensional dataset.
If possible, please fix this problem.

#########################################################
Exception in thread "main" java.lang.AbstractMethodError
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
at org.apache.spark.mllib.feature.MDLPDiscretizer.initializeLogIfNecessary(MDLPDiscretizer.scala:48)
at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
at org.apache.spark.mllib.feature.MDLPDiscretizer.log(MDLPDiscretizer.scala:48)
at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
at org.apache.spark.mllib.feature.MDLPDiscretizer.logInfo(MDLPDiscretizer.scala:48)
at org.apache.spark.mllib.feature.MDLPDiscretizer.runAll(MDLPDiscretizer.scala:106)
at org.apache.spark.mllib.feature.MDLPDiscretizer$.train(MDLPDiscretizer.scala:335)
at org.apache.spark.ml.feature.MDLPDiscretizer.fit(MDLPDiscretizer.scala:149)
#########################################################

@jfgosselin

+1 I ran into the same issue today. Are there any plans to upgrade to Spark 2.3.0?

@jfgosselin

I've recompiled spark-MDLP-discretization with Spark 2.3.2 and Scala 2.11.2 to solve the issue.
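For context, the `AbstractMethodError` in the stack trace appears to be a binary-incompatibility problem: Spark 2.3 changed the internal `org.apache.spark.internal.Logging` trait (the `initializeLogIfNecessary` method), so a jar compiled against Spark 2.2 breaks at runtime on 2.3. Recompiling against the newer Spark fixes it. A minimal sketch of the dependency bump in the project's `build.sbt` might look like the following (version strings are taken from this thread; the repo's actual build file may differ):

```scala
// build.sbt -- illustrative sketch, versions as reported in this thread
scalaVersion := "2.11.2"  // a Scala 2.11.x release, as Spark 2.3 requires

libraryDependencies ++= Seq(
  // "provided" so the cluster's own Spark jars are used at runtime
  "org.apache.spark" %% "spark-core"  % "2.3.2" % "provided",
  "org.apache.spark" %% "spark-mllib" % "2.3.2" % "provided"
)
```

After changing the versions, rebuilding the jar (e.g. with `sbt package` or `sbt assembly`) produces an artifact compiled against the 2.3-era `Logging` trait.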

@Majid-Soheili (Author)

> I've recompiled spark-MDLP-discretization with Spark 2.3.2 and Scala 2.11.2 to solve the issue.

Hi Jean, if it is possible for you, please share your compiled library. Also, there is another issue in the latest version of the library that, in my opinion, is quite important. The issue and its solution are reported below:

#36

If it is possible for you, please fix that issue as well.
Thanks a lot.
