InitError: JavaCall.JavaCallError("Class Not Found org/apache/log4j/Level") #118
Comments
I also encountered this. macOS 13.5. I tried it with Julia versions 1.8 and 1.9.
Could you please re-build Spark.jl and post the log here?

```julia
] build Spark
```
Thanks for looking! Here is the log.
I don't see anything strange in your log and can't reproduce it, so let's start with a couple of tests.

Check if JavaCall works fine:

```julia
# Julia REPL
using JavaCall
JHashMap = @jimport java.util.HashMap
jmap = JHashMap(())
listmethods(jmap, "put")
jcall(jmap, "put", JObject, (JObject, JObject), "foo", "text value")
```

Check the versions and env vars:

```bash
java -version
mvn -version
echo $SPARK_HOME
```

Check that Spark.jl generated all the needed artifacts:

```bash
# you may need to install the tree utility (or anything else that prints
# the contents of a directory)
tree ~/.julia/packages/Spark
# alternatively
ls ~/.julia/packages/Spark/*/jvm/sparkjl/target
```
Looks like something with JavaCall... I'll see what I can do in tracking down the issue from that direction.

versions:
I have a bit more testing to do, but at least part of the issue appears to relate back to this warning in the JavaCall docs: https://juliainterop.github.io/JavaCall.jl/

Now I'm hitting:

Which seems unrelated to the original issue. So I still have some things to figure out, but perhaps the original issue can be deemed resolved by setting JULIA_COPY_STACKS=yes before startup.
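For anyone landing here later: per the JavaCall docs linked above, the variable has to be set in the environment *before* the Julia process starts; setting `ENV["JULIA_COPY_STACKS"]` from inside a running session is too late. A minimal sketch (exact value `yes` vs `1` is an assumption, both are accepted in my experience):

```shell
# Set JULIA_COPY_STACKS before launching Julia, e.g. in your shell profile
# or on the command line. Setting it inside the Julia REPL has no effect.
export JULIA_COPY_STACKS=yes
echo "$JULIA_COPY_STACKS"   # verify it is set in the current shell
```

A `julia` process launched from the same shell will then inherit the setting.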
Regarding the segmentation fault, can you try a different combination of Julia and JDK? IIRC, we had a similar issue with Julia 1.2-1.5, and also a few issues with OpenJDK. Currently, Julia 1.9.2 and OpenJDK 11.0.19 work for me on Ubuntu 20.04, but I don't have macOS to test this setup there.
I'll dig into that further a little later this week; thank you for the pointers.
Apologies for the slow response. I believe everything is working for me now. I'm not sure which exact combination of fixes got everything working; I hope to start over later and come up with a reproducible guide. But for reference, the major changes were:

At this point I have:

Then I was able to do:
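The poster's exact commands were not captured in the thread, but as a hypothetical minimal check: simply loading Spark triggers `Spark.init()`, which is where the reported error is thrown (see the stacktrace later in the thread), so a clean `using Spark` is itself evidence the fix worked:

```julia
# Hypothetical smoke test -- not the poster's exact commands.
# Loading Spark runs Spark.init() during module initialization,
# which is the call that failed with the log4j ClassNotFound error.
using JavaCall
using Spark
```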
The error message suggests that the JavaCall package is unable to find the org/apache/log4j/Level class. This error is usually caused by a missing dependency or a version mismatch between the packages. Here are some steps you can try to resolve this issue:

1. Check if you have installed the required dependencies for the JavaCall and Spark packages. You can do this by running the following command in your Julia REPL. This will show you the list of installed packages and their versions. Make sure that all the required dependencies are installed and up-to-date.
2. Try updating the JavaCall package to the latest version by running the following command in your Julia REPL.
3. If updating the package doesn't work, try uninstalling and reinstalling the JavaCall package by running the following commands in your Julia REPL.
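The Pkg commands referenced in the steps above were lost in this copy of the thread; assuming standard Pkg REPL usage, they would look something like this (these are the usual commands, not the originals):

```julia
# In the Julia REPL, press ] to enter Pkg mode.
pkg> status           # step 1: list installed packages and their versions
pkg> update JavaCall  # step 2: update JavaCall to the latest compatible version
pkg> rm JavaCall      # step 3: uninstall JavaCall...
pkg> add JavaCall     # ...and reinstall it
```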
I am facing this issue:

```julia
julia> if Sys.isunix()
           ENV["JULIA_COPY_STACKS"] = 1
       end
1

julia> using JavaCall

julia> using Spark
ERROR: InitError: JavaCall.JavaCallError("Class Not Found org/apache/log4j/Level")
Stacktrace:
  [1] _metaclass(class::Symbol)
    @ JavaCall ~/.julia/packages/JavaCall/MlduK/src/core.jl:383
  [2] metaclass(class::Symbol)
    @ JavaCall ~/.julia/packages/JavaCall/MlduK/src/core.jl:389
  [3] jfield(typ::Type{JavaObject{Symbol("org.apache.log4j.Level")}}, field::String, fieldType::Type)
    @ JavaCall ~/.julia/packages/JavaCall/MlduK/src/core.jl:263
  [4] set_log_level(log_level::String)
    @ Spark ~/.julia/packages/Spark/89BUd/src/init.jl:8
  [5] init(; log_level::String)
    @ Spark ~/.julia/packages/Spark/89BUd/src/init.jl:60
  [6] init
    @ ~/.julia/packages/Spark/89BUd/src/init.jl:16 [inlined]
  [7] init()
    @ Spark ~/.julia/packages/Spark/89BUd/src/core.jl:30
  [8] _include_from_serialized(pkg::Base.PkgId, path::String, depmods::Vector{Any})
    @ Base ./loading.jl:831
  [9] _require_search_from_serialized(pkg::Base.PkgId, sourcepath::String, build_id::UInt64)
    @ Base ./loading.jl:1039
 [10] _require(pkg::Base.PkgId)
    @ Base ./loading.jl:1315
 [11] _require_prelocked(uuidkey::Base.PkgId)
    @ Base ./loading.jl:1200
 [12] macro expansion
    @ ./loading.jl:1180 [inlined]
 [13] macro expansion
    @ ./lock.jl:223 [inlined]
 [14] require(into::Module, mod::Symbol)
    @ Base ./loading.jl:1144
during initialization of module Spark
```

Can someone please help me?