java.lang.RuntimeException: [unresolved dependency: com.amazon.redshift#redshift-jdbc41;1.2.1: not found]

Stack Overflow | Avinash A | 6 days ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. Include aws jdbc driver while running spark application

     Stack Overflow | 6 days ago | Avinash A
     java.lang.RuntimeException: [unresolved dependency: com.amazon.redshift#redshift-jdbc41;1.2.1: not found]
     (See the spark-submit sketch after this list.)

  2. GitHub comment 263#254332626

     GitHub | 4 months ago | trinker
     java.lang.RuntimeException: [unresolved dependency: "com.databricks#spark-csv_2.11;1.3.0": not found]

  3. GitHub comment 59#150682132

     GitHub | 1 year ago | jougNY
     java.lang.RuntimeException: [unresolved dependency: com.databricks#spark-csv_2.11;1.2.0: not found]

  4. GitHub comment 274#258340552

     GitHub | 4 months ago | dsblr
     java.lang.RuntimeException: [unresolved dependency: com.databricks#spark-csv_2.11;1.3.0: not found]

  5. unresolved dependency: com.lucidworks.solr#spark-solr;2.0.1: not found

     GitHub | 9 months ago | lefromage
     java.lang.RuntimeException: [unresolved dependency: com.lucidworks.solr#spark-solr;2.0.1: not found]
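
All of the errors above fail at the same step: spark-submit tries to resolve the coordinate given with --packages against Maven Central (plus any repositories added with --repositories) and cannot find it. The Redshift JDBC driver in particular is not published to Maven Central, so either point spark-submit at the AWS-hosted Redshift repository or download the driver jar and pass it with --jars. A minimal sketch, assuming the AWS repository URL and the application jar/class names, which are not taken from the original question (the version string must also match one actually published in the repository):

    # Option A: resolve the driver from the AWS-hosted Redshift repository
    # (repository URL is an assumption; check the current Redshift JDBC docs)
    spark-submit \
      --repositories https://s3.amazonaws.com/redshift-maven-repository/release \
      --packages com.amazon.redshift:redshift-jdbc41:1.2.1 \
      --class com.example.MyApp my-app.jar

    # Option B: skip Maven resolution and ship a locally downloaded driver jar
    spark-submit \
      --jars /path/to/RedshiftJDBC41.jar \
      --class com.example.MyApp my-app.jar

Note that --packages expects groupId:artifactId:version with colons; the group#artifact;version form shown in the error messages is Ivy's internal notation, not what should be typed on the command line.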

    Root Cause Analysis

    java.lang.RuntimeException: [unresolved dependency: com.amazon.redshift#redshift-jdbc41;1.2.1: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1066)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:294)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:158)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
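
The trace shows the failure inside SparkSubmitUtils.resolveMavenCoordinates, called from SparkSubmit.prepareSubmitEnvironment, i.e. while spark-submit is still preparing the job and before any application code runs. The fix therefore belongs on the spark-submit command line (or in spark-defaults.conf via spark.jars.packages / spark.jars.repositories), not inside the application. For coordinates that do exist on Maven Central, such as the spark-csv cases above, the same error can also come from a typo in the coordinate or a stale Ivy cache; two quick checks, as a sketch that assumes the standard Maven repository layout and spark-submit's default Ivy home:

    # Does the artifact exist where spark-submit will look for it?
    # A 404 here usually means the group/artifact/version string is wrong.
    curl -sI https://repo1.maven.org/maven2/com/databricks/spark-csv_2.11/1.3.0/spark-csv_2.11-1.3.0.jar | head -n 1

    # Force spark-submit to re-resolve by clearing the cached entry
    # (~/.ivy2 is the default Ivy home used by spark-submit)
    rm -rf ~/.ivy2/cache/com.databricks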