org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:120)
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1640)
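For readers hitting the same failure: this trace means Hive could not launch the remote Spark driver (the child process that must connect back to Hive, per the thread title below). A minimal configuration sketch, assuming Spark on YARN and standard Hive-on-Spark property names; the values are illustrative assumptions, not taken from this thread:

```sql
-- Run in Beeline / the Hive CLI before the failing query.
SET hive.execution.engine=spark;
SET spark.master=yarn;
SET spark.executor.memory=2g;
-- "Failed to create spark client" often means the child process
-- exited or timed out before connecting back; raising the client
-- timeouts can help distinguish slow startup from a launch failure.
SET hive.spark.client.connect.timeout=30000ms;
SET hive.spark.client.server.connect.timeout=300000ms;
```

If the error persists, the HiveServer2 log (hive.log) usually carries the stderr of the failed Spark child process, which is more informative than this wrapper exception.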

hive-user | Xuefu Zhang | 1 year ago
  1. Re: Hive on Spark - Error: Child process exited before connecting back

     hive-user | 1 year ago | Xuefu Zhang

  2. Re: Hive on Spark - Error: Child process exited before connecting back

     hive-user | 1 year ago | Ophir Etzion

     Both replies quote the same "Failed to create spark client" trace shown above.

    Root Cause Analysis

    1. org.apache.hadoop.hive.ql.metadata.HiveException

      Failed to create spark client.

      at org.apache.hadoop.hive.ql.Driver.execute()
    2. Hive Query Language
      Driver.execute
      1. org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1399)
      1 frame