java.lang.IllegalStateException: open

JIRA | Luis Rodríguez | 2 years ago
  1.

    I am trying to integrate MongoDB with Apache Spark to process data. When I execute my program with this command:

        ../spark-1.3.0-bin-hadoop2.4/bin/spark-submit --master spark://luis-VirtualBox:7077 --jars $(echo /home/luis/mongo-spark/lib/*.jar | tr ' ' ',') --class JavaWordCount target/scala-2.10/mongo-spark_2.10-1.0.jar mydb.testCollection mydb.outputTest7

    I get the following exception:

        15/03/23 17:05:34 WARN TaskSetManager: Lost task 0.1 in stage 0.0 (TID 4, 10.0.2.15): java.lang.IllegalStateException: open
            at org.bson.util.Assertions.isTrue(Assertions.java:36)
            at com.mongodb.DBTCPConnector.getPrimaryPort(DBTCPConnector.java:406)
            at com.mongodb.DBCollectionImpl.insert(DBCollectionImpl.java:184)
            at com.mongodb.DBCollectionImpl.insert(DBCollectionImpl.java:167)
            at com.mongodb.DBCollection.insert(DBCollection.java:161)
            at com.mongodb.DBCollection.insert(DBCollection.java:107)
            at com.mongodb.DBCollection.save(DBCollection.java:1049)
            at com.mongodb.DBCollection.save(DBCollection.java:1014)
            at com.mongodb.hadoop.output.MongoRecordWriter.write(MongoRecordWriter.java:105)
            at org.apache.spark.rdd.PairRDDFunctions$$anonfun$12.apply(PairRDDFunctions.scala:1000)
            at org.apache.spark.rdd.PairRDDFunctions$$anonfun$12.apply(PairRDDFunctions.scala:979)
            at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
            at org.apache.spark.scheduler.Task.run(Task.scala:64)
            at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
            at java.lang.Thread.run(Thread.java:745)

    I have read in some places that this is caused by a closed connection, but I don't close the connection anywhere in my code. Thank you in advance.

    JIRA | 2 years ago | Luis Rodríguez
    java.lang.IllegalStateException: open
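
    The original JavaWordCount source is not shown, but the frames in the trace (MongoRecordWriter.write called from PairRDDFunctions) match the usual mongo-hadoop pattern of writing a pair RDD back to MongoDB with MongoOutputFormat. The sketch below is a hypothetical reconstruction under that assumption; the word-count transformation is elided and the localhost URIs are placeholders, not taken from the report.

        import org.apache.hadoop.conf.Configuration;
        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaPairRDD;
        import org.apache.spark.api.java.JavaSparkContext;
        import org.bson.BSONObject;
        import com.mongodb.hadoop.MongoInputFormat;
        import com.mongodb.hadoop.MongoOutputFormat;

        public class JavaWordCount {
            public static void main(String[] args) {
                // args[0] = input collection (e.g. mydb.testCollection)
                // args[1] = output collection (e.g. mydb.outputTest7)
                JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("JavaWordCount"));

                Configuration inputConfig = new Configuration();
                inputConfig.set("mongo.input.uri", "mongodb://localhost:27017/" + args[0]);

                Configuration outputConfig = new Configuration();
                outputConfig.set("mongo.output.uri", "mongodb://localhost:27017/" + args[1]);

                // Read the source collection as (ObjectId, BSONObject) pairs via mongo-hadoop.
                JavaPairRDD<Object, BSONObject> docs = sc.newAPIHadoopRDD(
                        inputConfig, MongoInputFormat.class, Object.class, BSONObject.class);

                // Word-count transformation elided; this sketch just round-trips the documents.
                // Each task opens its own MongoRecordWriter here, and its write() call is the
                // frame where "IllegalStateException: open" surfaces when the writer's
                // connection is already closed at save time.
                docs.saveAsNewAPIHadoopFile(
                        "file:///tmp/unused",            // path is ignored; MongoOutputFormat writes to mongo.output.uri
                        Object.class, BSONObject.class,
                        MongoOutputFormat.class, outputConfig);

                sc.stop();
            }
        }

    With this layout nothing in user code ever calls close(); the connection lifecycle is owned by MongoRecordWriter on each executor, which is consistent with the reporter's statement that the connection is never closed explicitly.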
  2.

    RDD is only partially written to mongo

    Stack Overflow | 2 years ago
    java.lang.IllegalStateException: The pool is closed


    Root Cause Analysis

    1. java.lang.IllegalStateException

      open

      at org.bson.util.Assertions.isTrue()
    2. MongoDB Java Driver
      DBCollection.save
      1. org.bson.util.Assertions.isTrue(Assertions.java:36)
      2. com.mongodb.DBTCPConnector.getPrimaryPort(DBTCPConnector.java:406)
      3. com.mongodb.DBCollectionImpl.insert(DBCollectionImpl.java:184)
      4. com.mongodb.DBCollectionImpl.insert(DBCollectionImpl.java:167)
      5. com.mongodb.DBCollection.insert(DBCollection.java:161)
      6. com.mongodb.DBCollection.insert(DBCollection.java:107)
      7. com.mongodb.DBCollection.save(DBCollection.java:1049)
      8. com.mongodb.DBCollection.save(DBCollection.java:1014)
      8 frames
    3. com.mongodb.hadoop
      MongoRecordWriter.write
      1. com.mongodb.hadoop.output.MongoRecordWriter.write(MongoRecordWriter.java:105)
      1 frame
    4. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.rdd.PairRDDFunctions$$anonfun$12.apply(PairRDDFunctions.scala:1000)
      2. org.apache.spark.rdd.PairRDDFunctions$$anonfun$12.apply(PairRDDFunctions.scala:979)
      3. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
      4. org.apache.spark.scheduler.Task.run(Task.scala:64)
      5. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
      5 frames
    5. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
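
    The assertion at the bottom of this chain (Assertions.isTrue in DBTCPConnector.getPrimaryPort) fires whenever a legacy-driver DBCollection is used after its connection pool has been shut down. A minimal standalone reproduction, assuming the 2.x mongo-java-driver used by mongo-hadoop at the time (host, port, and collection names are placeholders):

        import com.mongodb.BasicDBObject;
        import com.mongodb.DBCollection;
        import com.mongodb.MongoClient;

        public class ClosedConnectorRepro {
            public static void main(String[] args) throws Exception {
                MongoClient client = new MongoClient("localhost", 27017);
                DBCollection coll = client.getDB("mydb").getCollection("testCollection");

                // Shut down the client; any DBCollection obtained from it now
                // holds a closed DBTCPConnector.
                client.close();

                // getPrimaryPort() asserts the connector is still open and throws
                // java.lang.IllegalStateException: open -- the same failure seen in
                // MongoRecordWriter.write when its connection is closed too early.
                coll.save(new BasicDBObject("word", "hello").append("count", 1));
            }
        }

    In a Spark job the close typically does not come from user code; one common way this happens is that tasks share a client that another component (or another task's writer) closes first, which matches the reporter's observation that nothing in the job calls close() directly.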