org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:120)
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1640)

hive-user | Xuefu Zhang | 12 months ago
  1. Re: Hive on Spark - Error: Child process exited before connecting back

     hive-user | 12 months ago | Xuefu Zhang
     org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
       at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
       at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
       at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:120)
       at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97)
       at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
       at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
       at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1640)

  2. Re: Hive on Spark - Error: Child process exited before connecting back

     hive-user | 12 months ago | Ophir Etzion
     org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
       at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
       at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
       at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:120)
       at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97)
       at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
       at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
       at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1640)

  3. [Hive-user] Hive on Spark - Grokbase

     grokbase.com | 1 year ago
     org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
       at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
       at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
       at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
       at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)

  4. [Hive-user] Building spark 1.3 from source code to work with Hive 1.2.1 - Grokbase

     grokbase.com | 11 months ago
     org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
       at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
       at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
       at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:112)
       at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:101)

  5. Re: error: Failed to create spark client. for hive on spark

     apache.org | 12 months ago
     org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
       at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
       at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
       at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:114)
       at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)


    Root Cause Analysis

    1. org.apache.hadoop.hive.ql.metadata.HiveException

      Failed to create spark client.
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
        at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:120)
        at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1640)

      at org.apache.hadoop.hive.ql.Driver.execute()
    2. Hive Query Language
      Driver.execute
      1. org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1399)
      1 frame
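
    In every report above, the failure happens in `SparkSessionImpl.open` before any query runs, i.e. Hive cannot launch the remote Spark driver at all. A common first step is to verify that the session is actually configured for the Spark engine and points at a reachable Spark installation. The sketch below is illustrative only (the master URL and memory values are assumptions, not taken from the threads above):

    ```sql
    -- Run in the Hive CLI / Beeline before the failing query.
    -- "Failed to create spark client" is typically raised when the remote
    -- Spark driver cannot start: a Hive/Spark version mismatch, Spark jars
    -- missing from Hive's classpath, or an unreachable master.
    set hive.execution.engine=spark;
    set spark.master=yarn-cluster;    -- assumption: adjust to your cluster
    set spark.executor.memory=2g;     -- illustrative sizing
    set spark.eventLog.enabled=true;
    ```

    It is also worth checking that `spark.home` (or the `SPARK_HOME` environment variable) points at a Spark build compatible with your Hive version; the grokbase thread in entry 4 is about exactly that kind of build mismatch.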