java.lang.RuntimeException: [unresolved dependency:;1.2.1: not found]

Stack Overflow | Avinash A | 3 months ago
  1. Include aws jdbc driver while running spark application
     Stack Overflow | 3 months ago | Avinash A
     java.lang.RuntimeException: [unresolved dependency:;1.2.1: not found]

  2. GitHub comment 263#254332626
     GitHub | 7 months ago | trinker
     java.lang.RuntimeException: [unresolved dependency: "com.databricks#spark-csv_2.11;1.3.0": not found]

  3. GitHub comment 59#150682132
     GitHub | 2 years ago | jougNY
     java.lang.RuntimeException: [unresolved dependency: com.databricks#spark-csv_2.11;1.2.0: not found]

  4. GitHub comment 274#258340552
     GitHub | 7 months ago | dsblr
     java.lang.RuntimeException: [unresolved dependency: com.databricks#spark-csv_2.11;1.3.0: not found]

  5. unresolved dependency: com.lucidworks.solr#spark-solr;2.0.1: not found
     GitHub | 12 months ago | lefromage
     java.lang.RuntimeException: [unresolved dependency: com.lucidworks.solr#spark-solr;2.0.1: not found]
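    All of the crashes above come from spark-submit failing to resolve a package listed in `--packages`. A minimal sketch of the usual invocation (the package and script name are examples, not taken from any of the reports above):

    ```shell
    # Each --packages coordinate uses colons: groupId:artifactId:version.
    # Ivy reports failures in its own notation, group#artifact;version,
    # which is why the error messages above contain '#' and ';'.
    spark-submit \
      --packages com.databricks:spark-csv_2.11:1.3.0 \
      app.py
    ```

    If the artifact lives outside Maven Central, the repository URL can be supplied with `--repositories`.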

    Root Cause Analysis

    1. java.lang.RuntimeException

      [unresolved dependency:;1.2.1: not found]

      at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates()
    2. Spark
      1. org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1066)
      2. org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:294)
      3. org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:158)
      4. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
      5. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      5 frames
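    The empty segment between `dependency:` and `;1.2.1` in the root-cause message suggests the `--packages` value resolved to a coordinate with no groupId or artifactId (for example `::1.2.1`). spark-submit expects each coordinate as `groupId:artifactId:version`. A minimal illustration of that shape, using a hypothetical helper rather than Spark's actual `resolveMavenCoordinates` code:

    ```python
    def split_coordinate(coord: str):
        """Split a Maven coordinate of the form groupId:artifactId:version,
        rejecting coordinates with a wrong segment count or empty segments."""
        parts = coord.split(":")
        if len(parts) != 3 or not all(parts):
            raise ValueError(f"malformed coordinate: {coord!r}")
        group, artifact, version = parts
        return group, artifact, version

    # A well-formed coordinate parses cleanly:
    print(split_coordinate("com.databricks:spark-csv_2.11:1.3.0"))
    # → ('com.databricks', 'spark-csv_2.11', '1.3.0')

    # A coordinate with empty group/artifact segments is the kind of input
    # that produces "[unresolved dependency:;1.2.1: not found]":
    try:
        split_coordinate("::1.2.1")
    except ValueError as e:
        print(e)
    ```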