java.lang.RuntimeException

There are no available Samebug tips for this exception. Do you have an idea how to solve this issue? A short tip would help users who saw this issue last week.

  • Can't install/run, via GitHub by mondras
  • GitHub comment 106#229506840, via GitHub by joaquin386
  • GitHub comment 274#258340552, via GitHub by dsblr
  • Trying to do a basic hit test against a DataStax Enterprise 5.0 setup to see if C* can be successfully spark-queried remotely on a Windows host, and I ran into the following issue when executing this command against the Spark shell:

    ```
    PS C:\Spark\spark-1.6.2-bin-hadoop2.6> .\bin\spark-shell.cmd --master spark://10.0.69.2:7077 --conf spark.cassandra.connection=10.0.69.2 --packages datastax:spark-cassandra-connector:1.6.0-s_2.10
    ```

    This is using the 1.6.0 Datastax-Spark connector hosted at https://spark-packages.org/package/datastax/spark-cassandra-connector I believe. Upon executing this I ran into the following error:

    {noformat}
    PS C:\Spark\spark-1.6.2-bin-hadoop2.6> .\bin\spark-shell.cmd --master spark://10.0.69.2:7077 --conf spark.cassandra.connection=10.0.69.2 --packages datastax:spark-cassandra-connector:1.6.0-s_2.10
    Ivy Default Cache set to: C:\Users\aaron\.ivy2\cache
    The jars for the packages stored in: C:\Users\aaron\.ivy2\jars
    :: loading settings :: url = jar:file:/C:/Spark/spark-1.6.2-bin-hadoop2.6/lib/spark-assembly-1.6.2-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
    datastax#spark-cassandra-connector added as a dependency
    :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found datastax#spark-cassandra-connector;1.6.0-s_2.10 in spark-packages
        found org.apache.cassandra#cassandra-clientutil;3.0.2 in central
        found com.datastax.cassandra#cassandra-driver-core;3.0.0 in central
        found io.netty#netty-handler;4.0.33.Final in central
        found io.netty#netty-buffer;4.0.33.Final in central
        found io.netty#netty-common;4.0.33.Final in central
        found io.netty#netty-transport;4.0.33.Final in central
        found io.netty#netty-codec;4.0.33.Final in central
        found io.dropwizard.metrics#metrics-core;3.1.2 in local-m2-cache
        found org.slf4j#slf4j-api;1.7.7 in local-m2-cache
        found org.apache.commons#commons-lang3;3.3.2 in local-m2-cache
        found com.google.guava#guava;16.0.1 in central
        found org.joda#joda-convert;1.2 in central
        found joda-time#joda-time;2.3 in central
        found com.twitter#jsr166e;1.1.0 in central
        found org.scala-lang#scala-reflect;2.10.5 in local-m2-cache
    :: resolution report :: resolve 868ms :: artifacts dl 35ms
        :: modules in use:
        com.datastax.cassandra#cassandra-driver-core;3.0.0 from central in [default]
        com.google.guava#guava;16.0.1 from central in [default]
        com.twitter#jsr166e;1.1.0 from central in [default]
        datastax#spark-cassandra-connector;1.6.0-s_2.10 from spark-packages in [default]
        io.dropwizard.metrics#metrics-core;3.1.2 from local-m2-cache in [default]
        io.netty#netty-buffer;4.0.33.Final from central in [default]
        io.netty#netty-codec;4.0.33.Final from central in [default]
        io.netty#netty-common;4.0.33.Final from central in [default]
        io.netty#netty-handler;4.0.33.Final from central in [default]
        io.netty#netty-transport;4.0.33.Final from central in [default]
        joda-time#joda-time;2.3 from central in [default]
        org.apache.cassandra#cassandra-clientutil;3.0.2 from central in [default]
        org.apache.commons#commons-lang3;3.3.2 from local-m2-cache in [default]
        org.joda#joda-convert;1.2 from central in [default]
        org.scala-lang#scala-reflect;2.10.5 from local-m2-cache in [default]
        org.slf4j#slf4j-api;1.7.7 from local-m2-cache in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   16  |   0   |   0   |   0   ||   16  |   0   |
        ---------------------------------------------------------------------
    :: problems summary ::
    :::: WARNINGS
        [NOT FOUND ] org.scala-lang#scala-reflect;2.10.5!scala-reflect.jar (3ms)
        ==== local-m2-cache: tried
          file:/C:/Users/aaron/.m2/repository/org/scala-lang/scala-reflect/2.10.5/scala-reflect-2.10.5.jar
        [NOT FOUND ] org.slf4j#slf4j-api;1.7.7!slf4j-api.jar (0ms)
        ==== local-m2-cache: tried
          file:/C:/Users/aaron/.m2/repository/org/slf4j/slf4j-api/1.7.7/slf4j-api-1.7.7.jar
        ::::::::::::::::::::::::::::::::::::::::::::::
        ::              FAILED DOWNLOADS            ::
        :: ^ see resolution messages for details  ^ ::
        ::::::::::::::::::::::::::::::::::::::::::::::
        :: org.slf4j#slf4j-api;1.7.7!slf4j-api.jar
        :: org.scala-lang#scala-reflect;2.10.5!scala-reflect.jar
        ::::::::::::::::::::::::::::::::::::::::::::::
    :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
    Exception in thread "main" java.lang.RuntimeException: [download failed: org.slf4j#slf4j-api;1.7.7!slf4j-api.jar, download failed: org.scala-lang#scala-reflect;2.10.5!scala-reflect.jar]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1068)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:287)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:154)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    {noformat}

    Any idea what I can do to fix this?
    by Aaron Stannard (a possible workaround is sketched after this list)
  • spark-csv_2.11-s_2.11 not found, via GitHub by Lewuathe
    • java.lang.RuntimeException: [download failed: commons-codec#commons-codec;1.6!commons-codec.jar, download failed: com.fasterxml.jackson.core#jackson-databind;2.5.3!jackson-databind.jar(bundle), download failed: com.fasterxml.jackson.core#jackson-annotations;2.5.0!jackson-annotations.jar(bundle), download failed: com.fasterxml.jackson.core#jackson-core;2.5.3!jackson-core.jar(bundle)]
          at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1068)
          at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:287)
          at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:154)
          at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
          at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
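
In both traces above the failure pattern is the same: Ivy resolves the module descriptor from the local Maven repository (local-m2-cache) but the matching jar is missing or incomplete there, so the artifact download fails instead of falling back to Maven Central. A commonly reported workaround is to delete the stale cache entries and retry, optionally adding Maven Central explicitly with the --repositories option of spark-submit/spark-shell. The commands below are a hypothetical sketch only, reusing the paths and coordinates from the Cassandra-connector report above (C:\Users\aaron\.m2 and C:\Users\aaron\.ivy2); they are not a confirmed fix and must be adjusted to your own environment.

```
# Hypothetical workaround, not a verified fix: remove the stale/incomplete
# entries from the local Maven and Ivy caches so the jars are fetched again.
# Paths are taken from the log above; substitute your own user directory.
Remove-Item -Recurse -Force C:\Users\aaron\.m2\repository\org\slf4j\slf4j-api\1.7.7
Remove-Item -Recurse -Force C:\Users\aaron\.m2\repository\org\scala-lang\scala-reflect\2.10.5
Remove-Item -Recurse -Force C:\Users\aaron\.ivy2\cache\org.slf4j
Remove-Item -Recurse -Force C:\Users\aaron\.ivy2\cache\org.scala-lang

# Re-run the original command, pointing --repositories at Maven Central so the
# missing artifacts can be downloaded remotely.
.\bin\spark-shell.cmd --master spark://10.0.69.2:7077 `
  --conf spark.cassandra.connection=10.0.69.2 `
  --repositories https://repo1.maven.org/maven2 `
  --packages datastax:spark-cassandra-connector:1.6.0-s_2.10
```

If clearing the caches alone is enough, the --repositories flag can be dropped; it is a standard spark-submit option that adds extra remote repositories for --packages resolution. The same approach applies to the jackson/commons-codec failures in the spark-csv report, with the paths changed accordingly.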
