java.sql.SQLException: Cannot create PoolableConnectionFactory (Unknown database 'connect_db_0')

There are no available Samebug tips for this exception. Do you have an idea how to solve this issue? A short tip would help users who saw this issue last week.

  • MemSQL cannot find database (via Stack Overflow by Rasputin Jones)
  • sqlite dbcp (via GitHub by xuyang198711)
  • Spring AOP Exception Handling (via Stack Overflow by user2683906)
  • Derby not starting up in 1527 in AWS host (Coordinator) (by Udayakumar Pandurangan)
java.sql.SQLException: Cannot create PoolableConnectionFactory (Unknown database 'connect_db_0')
    at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2294) [commons-dbcp2-2.1.1.jar:2.1.1]
    at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2039) [commons-dbcp2-2.1.1.jar:2.1.1]
    at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1533) [commons-dbcp2-2.1.1.jar:2.1.1]
    at com.memsql.spark.connector.MemSQLConnectionPool$.connect(MemSQLConnectionPool.scala:34) [memsql-connector_2.10-1.3.3.jar:1.3.3]
    at com.memsql.spark.connector.rdd.MemSQLRDD$$anon$1.<init>(MemSQLRDD.scala:241) [memsql-connector_2.10-1.3.3.jar:1.3.3]
    at com.memsql.spark.connector.rdd.MemSQLRDD.compute(MemSQLRDD.scala:231) [memsql-connector_2.10-1.3.3.jar:1.3.3]
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.scheduler.Task.run(Task.scala:88) [spark-core_2.10-1.5.2.jar:1.5.2]
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) [spark-core_2.10-1.5.2.jar:1.5.2]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_92]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_92]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_92]
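A short tip for this trace: MySQL-protocol servers (including MemSQL) report "Unknown database" when the database named in the JDBC URL does not exist on the host the connection pool reaches, so the usual fixes are to create the database up front or to correct the URL/host the connector is given. One quick sanity check is to extract the database name from the URL and compare it against what actually exists on the server. A minimal sketch of that extraction (the URL and helper class below are hypothetical illustrations, not taken from the original report):

```java
public class JdbcUrlDatabase {

    // Extracts the database name from a jdbc:mysql-style URL,
    // i.e. the path segment after the last '/', minus any query string.
    static String databaseName(String url) {
        String tail = url.substring(url.lastIndexOf('/') + 1);
        int query = tail.indexOf('?');
        return query >= 0 ? tail.substring(0, query) : tail;
    }

    public static void main(String[] args) {
        // Hypothetical URL of the shape the connector's pool would use.
        String url = "jdbc:mysql://127.0.0.1:3306/connect_db_0?useSSL=false";
        System.out.println(databaseName(url)); // prints "connect_db_0"
    }
}
```

If the name is what you expect, running `CREATE DATABASE IF NOT EXISTS connect_db_0;` against the server (on the master aggregator, for a MemSQL cluster) typically resolves the error; if it is not, the connection string handed to the pool is the thing to fix.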
