Recommended solutions based on your search

Samebug tips

  1. Check if the field you are trying to read actually exists in the database. If it is optional, use com.mongodb.casbah.commons.MongoDBObject#getAs, as in the sketch below.
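
A minimal sketch of that approach with Casbah; the document contents and field names here are hypothetical:

    import com.mongodb.casbah.commons.MongoDBObject

    val doc = MongoDBObject("name" -> "Ada")

    // getAs returns an Option, so a missing optional field yields None
    // instead of throwing when the key is absent.
    val name: Option[String]     = doc.getAs[String]("name")     // Some("Ada")
    val nickname: Option[String] = doc.getAs[String]("nickname") // None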

Solutions on the web

via DataStax JIRA by Todd, 1 year ago
key not found: UserDefinedType(relation,Vector(UDTFieldDef(type,VarCharType), UDTFieldDef(object_type,VarCharType), UDTFieldDef(related_to,VarCharType), UDTFieldDef(obj_id,VarCharType)))
via DataStax JIRA by Jaroslaw Grabowski, 1 year ago
key not found: TupleType(Vector(TupleFieldDef(0,VarCharType), TupleFieldDef(1,IntType)))
via Stack Overflow by Desanth pv, 1 year ago
key not found: TupleType(Vector(TupleFieldDef(0,VarCharType), TupleFieldDef(1,IntType)))
java.util.NoSuchElementException: key not found: UserDefinedType(relation,Vector(UDTFieldDef(type,VarCharType), UDTFieldDef(object_type,VarCharType), UDTFieldDef(related_to,VarCharType), UDTFieldDef(obj_id,VarCharType)))
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at org.apache.spark.sql.cassandra.DataTypeConverter$.catalystDataType(DataTypeConverter.scala:44)
    at org.apache.spark.sql.cassandra.DataTypeConverter$.toStructField(DataTypeConverter.scala:57)
    at org.apache.spark.sql.cassandra.CassandraSourceRelation$$anonfun$schema$1$$anonfun$apply$1.apply(CassandraSourceRelation.scala:57)
    at org.apache.spark.sql.cassandra.CassandraSourceRelation$$anonfun$schema$1$$anonfun$apply$1.apply(CassandraSourceRelation.scala:57)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at org.apache.spark.sql.cassandra.CassandraSourceRelation$$anonfun$schema$1.apply(CassandraSourceRelation.scala:57)
    at org.apache.spark.sql.cassandra.CassandraSourceRelation$$anonfun$schema$1.apply(CassandraSourceRelation.scala:57)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.sql.cassandra.CassandraSourceRelation.schema(CassandraSourceRelation.scala:57)
    at org.apache.spark.sql.sources.LogicalRelation.<init>(LogicalRelation.scala:30)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:120)
    at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1242)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:26)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
    at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:43)
    at $iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
    at $iwC$$iwC$$iwC.<init>(<console>:47)
    at $iwC$$iwC.<init>(<console>:49)
    at $iwC.<init>(<console>:51)
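
The frames above show the failure occurring while the Spark Cassandra connector builds the DataFrame schema: DataTypeConverter.catalystDataType has no Catalyst mapping for the table's UserDefinedType column (the TupleType messages listed above are the same lookup failing for a tuple column). A minimal sketch of the kind of Spark shell call that reaches this code path, using the Spark 1.3/1.4-era SQLContext.load seen in the trace; the keyspace and table names are hypothetical:

    import org.apache.spark.sql.SQLContext

    // In the Spark shell, `sc` is the SparkContext provided by the shell.
    val sqlContext = new SQLContext(sc)

    // Loading a Cassandra table whose schema includes a UDT column: the connector
    // converts each Cassandra column type to a Catalyst type while computing the
    // schema, and an unmapped UserDefinedType raises the NoSuchElementException above.
    val df = sqlContext.load(
      "org.apache.spark.sql.cassandra",
      Map("keyspace" -> "my_keyspace", "table" -> "my_table"))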