java.lang.RuntimeException: java.sql.SQLException: No suitable driver found for jdbc:phoenix:localhost:2181:/hbase-unsecure;

Solutions on the web

via hadoop-hdfs-user by Divya Gehlot, 1 year ago
java.sql.SQLException: No suitable driver found for jdbc:phoenix:localhost:2181:/hbase-unsecure;
via Stack Overflow by user4342532, 6 months ago
java.sql.SQLException: No suitable driver found for jdbc:phoenix:localhost:2181:/hbase-unsecure;
via xluat.com by Unknown author, 2 years ago
java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
via Stack Overflow by Jeroen Vlek, 1 year ago
java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
via apache.org by Unknown author, 2 years ago
java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
via dluat.com by Unknown author, 2 years ago
java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
java.lang.RuntimeException: java.sql.SQLException: No suitable driver found for jdbc:phoenix:localhost:2181:/hbase-unsecure;
at java.sql.DriverManager.getConnection(DriverManager.java:596)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:99)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getOutputConnection(ConnectionUtil.java:82)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getOutputConnection(ConnectionUtil.java:70)
at org.apache.phoenix.mapreduce.PhoenixRecordWriter.<init>(PhoenixRecordWriter.java:49)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1030)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1014)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:88)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
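
The "No suitable driver found" message from DriverManager typically means the Phoenix JDBC driver class was never registered in the JVM that opens the connection; in a Spark job like the one above, that usually means the phoenix-client jar is not on the executor classpath. Below is a minimal, hypothetical sketch (the class name PhoenixConnectionCheck is made up; the driver class and connection URL match the trace above) that loads the driver explicitly before connecting, so a missing jar fails fast with a ClassNotFoundException instead of the later SQLException:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class PhoenixConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Explicitly register the Phoenix driver. If the phoenix-client jar is
        // missing from the classpath, this fails immediately with
        // ClassNotFoundException instead of "No suitable driver found" later.
        Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");

        // Same connection URL as in the stack trace above.
        String url = "jdbc:phoenix:localhost:2181:/hbase-unsecure";
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected, autocommit=" + conn.getAutoCommit());
        } catch (SQLException e) {
            // Reaching this branch with "No suitable driver found" means the
            // driver jar is still not visible to this JVM; for Spark jobs the
            // jar must also be shipped to the executors (e.g. via --jars).
            e.printStackTrace();
        }
    }
}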
