Searched on Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message.

Recommended solutions based on your search

Solutions on the web

via Stack Overflow by Shivansh Srivastava, 1 year ago
    Key length of 105500 is longer than maximum of 65535

via DataStax JIRA by vinayc, 1 year ago

via Stack Overflow by semsorock, 1 year ago
    SERIAL is not supported as conditional update commit consistency. Use ANY if you mean "make sure it is accepted but I don't care how many replicas commit it for non-SERIAL reads"

via Google Groups by Emīls Šolmanis, 2 years ago
    The sum of all clustering columns is too long (72650 > 65535)
com.datastax.driver.core.exceptions.InvalidQueryException: Key length of 105500 is longer than maximum of 65535
    at com.datastax.driver.core.Responses$Error.asException(
    at com.datastax.driver.core.DefaultResultSetFuture.onSet(
    at com.datastax.driver.core.RequestHandler.setFinalResult(
    at com.datastax.driver.core.RequestHandler.access$2500(
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.setFinalResult(
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.onSet(
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(
    at io.netty.handler.timeout.IdleStateHandler.channelRead(
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(
    at $EpollStreamUnsafe.epollInReady(
    at io.netty.util.concurrent.SingleThreadEventExecutor$
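The root error above means the serialized partition key (105500 bytes) exceeds Cassandra's 65535-byte key limit, which matches the maximum value of the unsigned 16-bit length field used to encode keys. A minimal sketch of a client-side guard, assuming a text key measured in UTF-8 bytes (plain Java, no driver dependency; `fitsKeyLimit` is a hypothetical helper name, not a driver API):

```java
import java.nio.charset.StandardCharsets;

public class KeyLengthCheck {
    // Cassandra encodes the partition key length as an unsigned 16-bit value,
    // so the serialized key may be at most 65535 bytes.
    static final int MAX_KEY_BYTES = 65535;

    // Hypothetical guard: check the UTF-8 byte length (not the char count,
    // which can differ for non-ASCII text) before issuing the insert.
    static boolean fitsKeyLimit(String key) {
        return key.getBytes(StandardCharsets.UTF_8).length <= MAX_KEY_BYTES;
    }

    public static void main(String[] args) {
        String shortKey = "user:42";
        // A 105500-byte key, like the one in the trace above, is rejected.
        String hugeKey = "x".repeat(105500);

        System.out.println(fitsKeyLimit(shortKey)); // true
        System.out.println(fitsKeyLimit(hugeKey));  // false
    }
}
```

Keys that cannot be shortened are usually derived from a hash of the oversized value, with the large data itself moved into a regular column so only a compact identifier remains in the partition key.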