Problems using spark-2.3.0-bin-without-hadoop on Windows 10
/* from: https://blog.csdn.net/ryanzhongj/article/details/80677281 */
1. Starting spark-shell fails with the error:
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
This happens because the "without-hadoop" build does not bundle Hadoop's jars (which is where slf4j comes from), so Spark has to pick them up from the local Hadoop installation. Create a spark-env.cmd file in the %SPARK_HOME%\conf directory and add the following content:
rem When FOR is used inside a batch file, the loop variable needs a doubled %%
FOR /F %%i IN ('hadoop classpath') DO @set SPARK_DIST_CLASSPATH=%%i
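For reference, a fuller spark-env.cmd might look like the sketch below (the HADOOP_HOME path is a placeholder, not from the original post; point it at your own Hadoop installation). The FOR /F loop captures the output of hadoop classpath into SPARK_DIST_CLASSPATH so the "without-hadoop" build can locate Hadoop's jars at runtime:
@echo off
rem Hypothetical Hadoop location -- adjust to your own unpacked Hadoop
set HADOOP_HOME=C:\hadoop-2.7.7
set PATH=%HADOOP_HOME%\bin;%PATH%
rem "delims=" keeps the whole line even if the classpath contains spaces
FOR /F "delims=" %%i IN ('hadoop classpath') DO @set SPARK_DIST_CLASSPATH=%%i
After saving the file, running %SPARK_HOME%\bin\spark-shell again should no longer report the slf4j error.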
2. Starting spark-shell fails with the error:
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.NoClassDefFoundError: Could not initialize class scala.tools.fusesource_embedded.jansi.internal.Kernel32
at scala.tools.fusesource_embedded.jansi.internal.WindowsSupport.getConsoleMode(WindowsSupport.java:50)
at scala.tools.jline_embedded.WindowsTerminal.getConsoleMode(WindowsTerminal.java:204)
at scala.tools.jline_embedded.WindowsTerminal.init(WindowsTerminal.java:82)
at scala.tools.jline_embedded.TerminalFactory.create(TerminalFactory.java:101)
at scala.tools.jline_embedded.TerminalFactory.get(TerminalFactory.java:158)
at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:229)
at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:221)
at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:209)
at scala.tools.nsc.interpreter.jline_embedded.JLineConsoleReader.<init>(JLineReader.scala:62)
at scala.tools.nsc.interpreter.jline_embedded.InteractiveReader.<init>(JLineReader.scala:34)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiater$1$1.apply(ILoop.scala:858)
at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiater$1$1.apply(ILoop.scala:855)
at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$mkReader$1(ILoop.scala:862)
at scala.tools.nsc.interpreter.ILoop$$anonfun$21$$anonfun$apply$9.apply(ILoop.scala:873)
at scala.tools.nsc.interpreter.ILoop$$anonfun$21$$anonfun$apply$9.apply(ILoop.scala:873)
at scala.util.Try$.apply(Try.scala:192)
at scala.tools.nsc.interpreter.ILoop$$anonfun$21.apply(ILoop.scala:873)
at scala.tools.nsc.interpreter.ILoop$$anonfun$21.apply(ILoop.scala:873)
at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1233)
at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1223)
at scala.collection.immutable.Stream.collect(Stream.scala:435)
at scala.tools.nsc.interpreter.ILoop.chooseReader(ILoop.scala:875)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$2.apply(ILoop.scala:914)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:914)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
at org.apache.spark.repl.Main$.doMain(Main.scala:76)
at org.apache.spark.repl.Main$.main(Main.scala:56)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Fix: place the jline-2.12.1.jar package in the %SPARK_HOME%\jars directory.
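The original post does not say where to obtain the jar; one option is to download it from Maven Central (coordinates jline:jline:2.12.1) and copy it in, for example (C:\tmp is a hypothetical download location):
rem Copy the downloaded jar into Spark's jars directory
copy C:\tmp\jline-2.12.1.jar "%SPARK_HOME%\jars\"
After restarting spark-shell, the terminal initialization error should be gone.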
Original post: https://www.cnblogs.com/jerryzh/p/11131994.html