java.lang.IllegalStateException: Unable to create a side-input view from input

  1. Writing to BigQuery from Cloud Dataflow: Unable to create a side-input view from input

     Stack Overflow | 8 months ago | Adam Brocklehurst
     java.lang.IllegalStateException: Unable to create a side-input view from input
  2. Unable to run a job with OutputTags

     Stack Overflow | 6 months ago | Haden Hooyeon Lee
     java.lang.IllegalStateException: Unable to return a default Coder for FilterByKeyword.out1 [PCollection]. Correct one of the following root causes: No Coder has been manually specified; you may do so using .setCoder(). Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for type variable V (declared by class com.google.cloud.dataflow.sdk.values.TupleTag) because the actual type is unknown due to erasure. If this error occurs for a side output of the producing ParDo, verify that the TupleTag for this output is constructed with proper type information (see TupleTag Javadoc) or explicitly set the Coder to use if this is not possible. Using the default output Coder from the producing PTransform failed: Cannot provide a coder for type variable V (declared by class com.google.cloud.dataflow.sdk.values.TupleTag) because the actual type is unknown due to erasure.
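The Coder error in the second result comes from Java type erasure on TupleTag: a plain `new TupleTag<String>()` loses its type argument at runtime, so no Coder can be inferred. The TupleTag Javadoc's recommended fix is to construct the tag as an anonymous subclass, which captures the type. A minimal sketch against the Dataflow 1.x SDK (the tag names are illustrative, not from the original question):

```java
import com.google.cloud.dataflow.sdk.coders.StringUtf8Coder;
import com.google.cloud.dataflow.sdk.values.TupleTag;

// Erased: V is unknown at runtime, so Coder inference fails.
// TupleTag<String> badTag = new TupleTag<String>();

// Captured: the trailing {} creates an anonymous subclass that
// preserves the String type argument for Coder inference.
TupleTag<String> mainTag = new TupleTag<String>() {};
TupleTag<String> sideTag = new TupleTag<String>() {};

// If capturing the type is not possible, set the Coder explicitly
// on the side output's PCollection instead:
// results.get(sideTag).setCoder(StringUtf8Coder.of());
```

Either route satisfies the error message's suggestion: give the SDK enough type information to infer a Coder, or name one with `.setCoder()`.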

    Root Cause Analysis

java.lang.IllegalStateException: GroupByKey cannot be applied to non-bounded PCollection in the GlobalWindow without a trigger. Use a Window.into or Window.triggering transform prior to GroupByKey.
	at com.google.cloud.dataflow.sdk.transforms.GroupByKey.applicableTo(GroupByKey.java:192)
	at com.google.cloud.dataflow.sdk.transforms.View$AsIterable.validate(View.java:275)
	at com.google.cloud.dataflow.sdk.transforms.View$AsIterable.validate(View.java:268)
	at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:366)
	at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:274)
	at com.google.cloud.dataflow.sdk.values.PCollection.apply(PCollection.java:161)
	at com.google.cloud.dataflow.sdk.io.Write$Bound.createWrite(Write.java:214)
	at com.google.cloud.dataflow.sdk.io.Write$Bound.apply(Write.java:79)
	at com.google.cloud.dataflow.sdk.io.Write$Bound.apply(Write.java:68)
	at com.google.cloud.dataflow.sdk.runners.PipelineRunner.apply(PipelineRunner.java:74)
	at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.apply(DirectPipelineRunner.java:247)
	at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:367)
	at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:290)
	at com.google.cloud.dataflow.sdk.values.PCollection.apply(PCollection.java:174)
	at com.google.cloud.dataflow.sdk.io.BigQueryIO$Write$Bound.apply(BigQueryIO.java:1738)
	at com.google.cloud.dataflow.sdk.io.BigQueryIO$Write$Bound.apply(BigQueryIO.java:1440)
	at com.google.cloud.dataflow.sdk.runners.PipelineRunner.apply(PipelineRunner.java:74)
	at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.apply(DirectPipelineRunner.java:247)
	at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:367)
	at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:274)
	at com.google.cloud.dataflow.sdk.values.PCollection.apply(PCollection.java:161)
	at co.uk.bubblestudent.dataflow.StarterPipeline.main(StarterPipeline.java:116)
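The root cause message states the fix directly: the BigQueryIO write path builds a side-input view, which internally requires a GroupByKey, and GroupByKey is not legal on an unbounded PCollection in the GlobalWindow without a trigger. Applying a Window.into transform before the write resolves it. A hedged sketch against the Dataflow 1.x SDK used in this trace (the table spec, schema, and window size are placeholders, not taken from StarterPipeline):

```java
import com.google.api.services.bigquery.model.TableRow;
import com.google.cloud.dataflow.sdk.io.BigQueryIO;
import com.google.cloud.dataflow.sdk.transforms.windowing.FixedWindows;
import com.google.cloud.dataflow.sdk.transforms.windowing.Window;
import com.google.cloud.dataflow.sdk.values.PCollection;
import org.joda.time.Duration;

// 'rows' is an unbounded PCollection<TableRow>, e.g. read from Pub/Sub.
// Window it first so the implicit GroupByKey inside the write is legal:
PCollection<TableRow> windowed = rows.apply(
    Window.<TableRow>into(FixedWindows.of(Duration.standardMinutes(1))));

windowed.apply(BigQueryIO.Write
    .to("my-project:my_dataset.my_table")  // placeholder table spec
    .withSchema(schema));                  // 'schema' assumed defined elsewhere
```

Any windowing strategy other than the triggerless GlobalWindow works here; fixed windows are just the simplest choice for a streaming-to-BigQuery write.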