org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias sc. Backend error : Unable to recreate exception from backed error: Error: Found class org.apache.hadoop.mapreduce.TaskAttemptContext, but interface was expected

Apache's JIRA Issue Tracker | lin guo | 6 years ago
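
    The "Found class org.apache.hadoop.mapreduce.TaskAttemptContext, but interface was expected" message is the usual symptom of mixing Hadoop major versions: TaskAttemptContext is a class in Hadoop 1.x but an interface in Hadoop 2.x, so a load/store jar built against one line breaks on a cluster running the other. As a rough illustration only (paths, jar versions and the input file are assumptions, not the poster's script), a DUMP of an AvroStorage alias that registers jars matching the cluster's Hadoop line would look like this:

        -- Sketch, not the original script: register avro jars built for the cluster's Hadoop major version
        REGISTER /path/to/avro-1.7.7.jar;
        REGISTER /path/to/avro-mapred-1.7.7-hadoop2.jar;  -- the 'hadoop2' build on a Hadoop 2 cluster
        REGISTER /path/to/piggybank.jar;

        -- alias name 'sc' comes from the error above; input path and loader arguments are hypothetical
        sc = LOAD 'input/records.avro'
             USING org.apache.pig.piggybank.storage.avro.AvroStorage();
        DUMP sc;  -- DUMP calls PigServer.openIterator, which is where the backend error surfaces
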
  1. [PIG-1748] Add load/store function AvroStorage for avro data - ASF JIRA

    apache.org | 1 year ago
    org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias sc. Backend error : Unable to recreate exception from backed error: Error: Found class org.apache.hadoop.mapreduce.TaskAttemptContext, but interface was expected
  2. Creating an index for German DBpedia 2014

    GitHub | 2 years ago | RicardoUsbeck
    org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1000: Error during parsing. <file examples/indexing/names_and_entities.pig, line 43> Macro inline failed for macro 'read'. Reason: Macro contains argument or return value MIN_SURFACE_FORM_LENGTH which conflicts with a Pig parameter of the same name. Macro content:

        -- parse Wikipedia into IDs, article texts and link pairs
        DEFINE resolve pignlproc.helpers.SecondIfNotNullElseFirst();
        DEFINE dbpediaEncode pignlproc.evaluation.DBpediaUriEncode('$LANG');
        -- Parse the wikipedia dump and extract text and links data
        parsed = LOAD '$WIKIPEDIA_DUMP' USING pignlproc.storage.ParsingWikipediaLoader('$LANG') AS (title, id, pageUrl, text, redirect, links, headers, paragraphs);
        -- Normalize pageUrls to DBpedia URIs
        parsed = FOREACH parsed GENERATE title, id, dbpediaEncode(pageUrl) AS pageUrl, text, dbpediaEncode(redirect) AS redirect, links, headers, paragraphs;
        -- Separate redirects from non-redirects
        SPLIT parsed INTO parsedRedirects IF redirect IS NOT NULL, parsedNonRedirects IF redirect IS NULL;
        -- Page IDs and titles
        $ids = FOREACH parsedNonRedirects GENERATE title, id, pageUrl;
        -- Articles
        $articles = FOREACH parsedNonRedirects GENERATE pageUrl, text, links, paragraphs;
        -- Build transitive closure of redirects
        redirects = redirectTransClo(parsedRedirects);
        -- Make redirects surface form occurrences
        pairsFromRedirects = FOREACH redirects GENERATE redirectSourceTitle AS surfaceForm, redirectTarget AS uri;
        -- Get Links
        pageLinksNonEmptySf = getLinks(articles, $LANG, $MIN_SURFACE_FORM_LENGTH);
        -- Resolve redirects
        pageLinksRedirectsJoin = JOIN redirects BY redirectSource RIGHT, pageLinksNonEmptySf BY uri;
        resolvedLinks = FOREACH pageLinksRedirectsJoin GENERATE surfaceForm, FLATTEN(resolve(uri, redirectTarget)) AS uri, pageUrl;
        distinctLinks = DISTINCT resolvedLinks;
        $pairs = UNION ONSCHEMA pairsFromRedirects, distinctLinks;

    (A minimal reproduction of this macro/parameter name clash is sketched after this list.)

  3. Wonderdog fails in Pig 0.10?

    GitHub | 4 years ago | rjurney
    org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1031: Incompatable schema: left is "json_record:chararray", right is "message_id:chararray,thread_id:chararray,in_reply_to:chararray,subject:chararray,body:chararray,date:chararray,froms:bag{ARRAY_ELEM:tuple(real_name:chararray,address:chararray)},tos:bag{ARRAY_ELEM:tuple(real_name:chararray,address:chararray)},ccs:bag{ARRAY_ELEM:tuple(real_name:chararray,address:chararray)},bccs:bag{ARRAY_ELEM:tuple(real_name:chararray,address:chararray)},reply_tos:bag{ARRAY_ELEM:tuple(real_name:chararray,address:chararray)}"
  4. Bulk loading using pig

    Stack Overflow | 3 years ago | user2806611
    org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1000: Error during parsing. Encountered " "as" "AS "" at line 1, column 138. Was expecting one of: "parallel" ... ";" ...
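
    The macro failure in result 2 is a name clash, not a syntax problem: Pig refuses to inline a macro whose argument or return alias shares its name with a script parameter (here MIN_SURFACE_FORM_LENGTH). A minimal sketch of the clash and the usual rename fix, with hypothetical file, schema and argument names:

        %default MIN_SURFACE_FORM_LENGTH 2;

        -- Fails to inline: the macro argument shadows the Pig parameter declared above
        -- DEFINE read(pairs_in, MIN_SURFACE_FORM_LENGTH) RETURNS out { ... };

        -- Works: give the macro argument a name that does not collide with any parameter
        DEFINE read(pairs_in, min_len) RETURNS out {
            $out = FILTER $pairs_in BY SIZE(surfaceForm) >= $min_len;
        };

        raw   = LOAD 'pairs.tsv' AS (surfaceForm:chararray, uri:chararray);
        pairs = read(raw, $MIN_SURFACE_FORM_LENGTH);
        DUMP pairs;
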

    Root Cause Analysis

    org.apache.pig.backend.executionengine.ExecException: ERROR 2997: Unable to recreate exception from backed error: Error: Found class org.apache.hadoop.mapreduce.TaskAttemptContext, but interface was expected
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.Launcher.getErrorMessages(Launcher.java:221)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.Launcher.getStats(Launcher.java:151)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:337)
        at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.execute(HExecutionEngine.java:378)
        at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1198)
        at org.apache.pig.PigServer.storeEx(PigServer.java:874)
        at org.apache.pig.PigServer.store(PigServer.java:816)
        at org.apache.pig.PigServer.openIterator(PigServer.java:728)
        at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:612)
        at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:303)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:165)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:141)
        at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:90)
        at org.apache.pig.Main.run(Main.java:406)
        at org.apache.pig.Main.main(Main.java:107)