
Recommended solutions based on your search

Samebug tips

  1. Expert tip

    A few things can cause this exception:
    1) Check that you have all the required jars and that they are on the correct path.
    2) Your classpath might be broken; you can define it on the command line with java -cp yourClassPath, or in your IDE if you're using one.

  2.

    You can change your Scala version to 2.11.11 so that it matches the Scala version your Spark dependencies were built against (see the sketch after this list).
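
A minimal build sketch combining both tips, assuming an sbt project and the DataStax spark-cassandra-connector (the data source named in the stack trace below); the artifact coordinates and version numbers here are illustrative and must be matched to your own Spark release and Scala version:

    // build.sbt -- pin the Scala version so it matches the one your Spark
    // distribution and connector artifacts were built against (tip 2)
    scalaVersion := "2.11.11"

    // Put the required jars on the classpath (tip 1): Spark SQL is "provided"
    // because the cluster already ships it; the Cassandra connector supplies
    // org.apache.spark.sql.cassandra.DefaultSource. Pick the connector version
    // that matches your Spark release (1.6.x for Spark 1.6, 2.0.x for Spark 2.0/2.1).
    libraryDependencies ++= Seq(
      "org.apache.spark"   %% "spark-sql"                  % "2.1.1" % "provided",
      "com.datastax.spark" %% "spark-cassandra-connector"  % "2.0.5"
    )

If you launch with spark-submit rather than bundling an assembly jar, the same effect can be had with --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.5, which downloads the connector and puts it on the driver and executor classpaths.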

Solutions on the web

via GitHub by rajagopalanseeth, 1 year ago
Failed to find data source: org.apache.spark.sql.cassandra. Please find packages at http://spark-packages.org

via GitHub by archerbj, 1 year ago
Failed to find data source: /Users/ArcherMacPro/coding/bigData/resources/books.xml. Please find packages at http://spark-packages.org

via GitHub by mukundrv, 2 years ago
Failed to load class for data source: com.databricks.spark.redshift.

via GitHub by yogeshdarji99, 1 year ago
Failed to find data source: org.elasticsearch.spark.sql. Please find packages at http://spark-packages.org

via GitHub by schmee, 1 year ago
Failed to load class for data source: com.databricks.spark.csv.

via Stack Overflow by harpreet kaur, 4 months ago
Failed to find data source: com.vertica.spark.datasource.DefaultSource. Please find packages at http://spark-packages.org
java.lang.ClassNotFoundException: org.apache.spark.sql.cassandra.DefaultSource
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
    at scala.util.Try$.apply(Try.scala:161)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
    at scala.util.Try.orElse(Try.scala:82)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:62)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:102)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.api.csharp.CSharpBackendHandler.handleMethodCall(CSharpBackendHandler.scala:156)
    at org.apache.spark.api.csharp.CSharpBackendHandler.handleBackendRequest(CSharpBackendHandler.scala:103)
    at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:30)
    at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:27)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)
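
The frames above show the lookup failing inside DataFrameReader.load / ResolvedDataSource.lookupDataSource (invoked here through the CSharpBackendHandler). A minimal Scala sketch of the kind of read that triggers this lookup, and that succeeds once the connector jar is on the classpath; the host, keyspace and table names are placeholders:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object CassandraReadExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("cassandra-read-example")
          // Placeholder: point this at your Cassandra cluster.
          .set("spark.cassandra.connection.host", "127.0.0.1")
        val sc = new SparkContext(conf)
        val sqlContext = new SQLContext(sc)

        // The format(...) string is what lookupDataSource resolves in the trace
        // above; if the connector jar is missing, it ends in a
        // ClassNotFoundException for org.apache.spark.sql.cassandra.DefaultSource.
        val df = sqlContext.read
          .format("org.apache.spark.sql.cassandra")
          .options(Map("keyspace" -> "my_keyspace", "table" -> "my_table"))
          .load()

        df.show()
      }
    }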