org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
    at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:120)
    at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1640)

hive-user | Xuefu Zhang | 1 year ago
  1.

    Re: Hive on Spark - Error: Child process exited before connecting back

    hive-user | 1 year ago | Xuefu Zhang
  2.

    Re: Hive on Spark - Error: Child process exited before connecting back

    hive-user | 1 year ago | Ophir Etzion

    Root Cause Analysis

    1. org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
        at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:120)
        at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1640)

    2. Hive Query Language: Driver.execute (1 frame)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1399)
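
    The root-cause entries above show where the exception was thrown, not why. Per the linked thread titles, the failure here is that the spark-submit child process Hive launches exited before connecting back to the Hive driver. A minimal sketch of the session settings commonly checked for this symptom (the property names are real Hive/Spark configuration keys; the values are illustrative assumptions, not taken from this thread):

    ```sql
    -- Make sure Hive is actually targeting Spark as its execution engine
    SET hive.execution.engine=spark;
    -- Illustrative cluster manager choice; depends on the deployment
    SET spark.master=yarn;
    -- If the child process is slow to start (e.g. busy YARN queue),
    -- raising the remote-client connect timeouts is a common first step
    SET hive.spark.client.connect.timeout=30000ms;
    SET hive.spark.client.server.connect.timeout=300000ms;
    ```

    If the timeouts are not the issue, the thread participants typically inspect the spark-submit stderr in the Hive logs, since the child process usually prints the real failure (missing Spark assembly, bad SPARK_HOME, classpath conflicts) before exiting.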