Recommended solutions based on your search

Solutions on the web

via GitHub by anshbansal, 1 year ago
Unable to resolve impressionid.1 given [impressionid.1];

via Stack Overflow by Mohit Bansal, 1 year ago
Unable to resolve Sepal.Length given [Sepal.Length, Sepal.Width, Petal.Length, Petal.Width, Species];

via GitHub by kevinushey, 1 year ago
Unable to resolve Sepal.Length given [Sepal.Length, Sepal.Width, Petal.Length, Petal.Width, Species];
org.apache.spark.sql.AnalysisException: Unable to resolve impressionid.1 given [impressionid.1];
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolve$1$$anonfun$apply$5.apply(LogicalPlan.scala:134)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolve$1$$anonfun$apply$5.apply(LogicalPlan.scala:134)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolve$1.apply(LogicalPlan.scala:133)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolve$1.apply(LogicalPlan.scala:129)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at org.apache.spark.sql.types.StructType.foreach(StructType.scala:95)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at org.apache.spark.sql.types.StructType.map(StructType.scala:95)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolve(LogicalPlan.scala:129)
    at org.apache.spark.sql.execution.datasources.FileSourceStrategy$.apply(FileSourceStrategy.scala:87)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:60)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:60)
    at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
    at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:61)
    at org.apache.spark.sql.execution.SparkPlanner.plan(SparkPlanner.scala:47)
    at org.apache.spark.sql.execution.SparkPlanner$$anonfun$plan$1$$anonfun$apply$1.applyOrElse(SparkPlanner.scala:51)
    at org.apache.spark.sql.execution.SparkPlanner$$anonfun$plan$1$$anonfun$apply$1.applyOrElse(SparkPlanner.scala:48)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:301)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:301)
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:300)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:298)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:298)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5.apply(TreeNode.scala:321)
    at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:179)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildren(TreeNode.scala:319)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:298)
    at org.apache.spark.sql.execution.SparkPlanner$$anonfun$plan$1.apply(SparkPlanner.scala:48)
    at org.apache.spark.sql.execution.SparkPlanner$$anonfun$plan$1.apply(SparkPlanner.scala:48)
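
All of the reported messages involve column names that contain a dot (impressionid.1, Sepal.Length), which Spark SQL can read as nested struct-field access while resolving the plan. The following is a minimal Scala sketch of how this kind of AnalysisException typically arises and two common workarounds (backtick-escaping the name, or renaming the column); the DataFrame, column names, and object name are made up for illustration and are not the original poster's code.

import org.apache.spark.sql.SparkSession

object DotColumnSketch {
  def main(args: Array[String]): Unit = {
    // A minimal sketch, assuming Spark 2.x running on a local master.
    val spark = SparkSession.builder()
      .appName("dot-column-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // A DataFrame whose column name contains a dot, like "impressionid.1".
    val df = Seq((1, "a"), (2, "b")).toDF("impressionid.1", "value")

    // Referencing the column unescaped lets Spark parse the dot as
    // struct-field access, which can fail with an AnalysisException such as
    // "Unable to resolve impressionid.1 given [impressionid.1];"
    // df.select("impressionid.1").show()

    // Workaround 1: escape the full column name with backticks.
    df.select("`impressionid.1`").show()

    // Workaround 2: rename the column so downstream code avoids the dot.
    val renamed = df.withColumnRenamed("impressionid.1", "impressionid_1")
    renamed.select("impressionid_1").show()

    spark.stop()
  }
}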