java.io.IOException: Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7

Pentaho BI Platform Tracking | Tim Lynch | 4 years ago
  1.

    The split on comma (https://github.com/pentaho/big-data-plugin/blob/master/src/org/pentaho/di/job/entries/hadooptransjobexecutor/JobEntryHadoopTransJobExecutor.java#L689) breaks an input path whose glob contains a comma: /input/{201210,201211} is split at the comma, leaving two paths with incomplete glob patterns, which throws an IOException:

    ERROR 27-11 23:19:54,573 - PriviledgedActionException as:tlynch (auth:SIMPLE) cause:java.io.IOException: Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7
    java.io.IOException: Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7
        at org.apache.hadoop.fs.FileSystem$GlobFilter.error(FileSystem.java:1234)
        at org.apache.hadoop.fs.FileSystem$GlobFilter.setRegex(FileSystem.java:1219)
        at org.apache.hadoop.fs.FileSystem$GlobFilter.<init>(FileSystem.java:1137)
        at org.apache.hadoop.fs.FileSystem.globStatusInternal(FileSystem.java:1057)
        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1015)
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:174)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:205)
        at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:977)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:969)
        at org.apache.hadoop.mapred.JobClient.access$500(JobClient.java:170)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:880)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
        at org.pentaho.di.job.entries.hadooptransjobexecutor.JobEntryHadoopTransJobExecutor.execute(JobEntryHadoopTransJobExecutor.java:874)
        at org.pentaho.di.job.Job.execute(Job.java:528)
        at org.pentaho.di.job.Job.execute(Job.java:667)
        at org.pentaho.di.job.Job.execute(Job.java:667)
        at org.pentaho.di.job.Job.execute(Job.java:667)
        at org.pentaho.di.job.Job.execute(Job.java:667)
        at org.pentaho.di.job.Job.execute(Job.java:393)
        at org.pentaho.di.job.Job.run(Job.java:313)
    ERROR 27-11 23:19:54,577 - Pentaho Map NO Reduce 2 - Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7
    ERROR 27-11 23:19:54,578 - Pentaho Map NO Reduce 2 - java.io.IOException: Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7 (followed by the same stack trace as above)

  3.

    FTP file name in FileInputFormat.setInputPath

    Stack Overflow | 4 years ago | RadAl
    java.io.IOException: Login failed on server - 0.0.0.0, port - 21
  5.

    How do i use a file on FTP in a stand alone spark cluster (pySpark)?

    Stack Overflow | 7 months ago | rajat
    java.io.IOException: Login failed on server - 192.168.125.124, port - 21
  6.

    java.io.IOException: Expecting Ant GLOB pattern, but saw '<IPA FILE>'. See http://ant.apache.org/manual/Types/fileset.html for syntax
    00:17:24.374 at hudson.FilePath.glob(FilePath.java:1624)
    00:17:24.374 at hudson.FilePath.access$700(FilePath.java:178)
    00:17:24.374 at hudson.FilePath$30.invoke(FilePath.java:1605)
    00:17:24.374 at hudson.FilePath$30.invoke(FilePath.java:1602)
    00:17:24.374 at hudson.FilePath.act(FilePath.java:920)
    00:17:24.374 at hudson.FilePath.act(FilePath.java:893)
    00:17:24.374 at hudson.FilePath.list(FilePath.java:1602)
    00:17:24.374 at hudson.FilePath.list(FilePath.java:1587)
    00:17:24.374 at hudson.FilePath.list(FilePath.java:1573)
    00:17:24.375 at hockeyapp.HockeyappRecorder.perform(HockeyappRecorder.java:320)
    00:17:24.375 at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
    00:17:24.375 at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:745)
    00:17:24.375 at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:709)
    00:17:24.375 at hudson.model.Build$BuildExecution.post2(Build.java:182)
    00:17:24.375 at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:658)
    00:17:24.376 at hudson.model.Run.execute(Run.java:1731)
    00:17:24.376 at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
    00:17:24.376 at hudson.model.ResourceController.execute(ResourceController.java:88)
    00:17:24.376 at hudson.model.Executor.run(Executor.java:231)

    Have had issues since v1.0.6.

    Jenkins JIRA | 3 years ago | Jonathan Crooke
    java.io.IOException: Expecting Ant GLOB pattern, but saw '<IPA FILE>'. See http://ant.apache.org/manual/Types/fileset.html for syntax
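    This Jenkins error means a literal placeholder ('<IPA FILE>') reached a field that expects an Ant-style glob such as **/*.ipa. As a rough illustration of how such a pattern matches, here is a sketch using the JDK's glob PathMatcher; its semantics are close to, but not identical with, Ant's fileset globs:

    ```java
    import java.nio.file.FileSystems;
    import java.nio.file.PathMatcher;
    import java.nio.file.Paths;

    public class AntGlobSketch {
        public static void main(String[] args) {
            // "**/*.ipa": any directory depth, then a file name ending in .ipa
            PathMatcher m = FileSystems.getDefault().getPathMatcher("glob:**/*.ipa");
            System.out.println(m.matches(Paths.get("build/output/MyApp.ipa"))); // true
            // Ant would also match a bare "MyApp.ipa" (leading **/ is optional
            // there); the JDK matcher does not - one place the semantics differ.
            System.out.println(m.matches(Paths.get("MyApp.ipa")));
            System.out.println(m.matches(Paths.get("build/MyApp.txt")));        // false
        }
    }
    ```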


    Root Cause Analysis

    1. java.io.IOException

      Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7

      at org.apache.hadoop.fs.FileSystem$GlobFilter.error()
    2. Hadoop
      FileSystem.globStatus
      1. org.apache.hadoop.fs.FileSystem$GlobFilter.error(FileSystem.java:1234)
      2. org.apache.hadoop.fs.FileSystem$GlobFilter.setRegex(FileSystem.java:1219)
      3. org.apache.hadoop.fs.FileSystem$GlobFilter.<init>(FileSystem.java:1137)
      4. org.apache.hadoop.fs.FileSystem.globStatusInternal(FileSystem.java:1057)
      5. org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1015)
      5 frames
    3. Hadoop
      JobClient$2.run
      1. org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:174)
      2. org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:205)
      3. org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:977)
      4. org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:969)
      5. org.apache.hadoop.mapred.JobClient.access$500(JobClient.java:170)
      6. org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:880)
      7. org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
      7 frames
    4. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:396)
      2 frames
    5. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
      1 frame
    6. Hadoop
      JobClient.submitJob
      1. org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
      2. org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
      2 frames
    7. org.pentaho.di
      Job.run
      1. org.pentaho.di.job.entries.hadooptransjobexecutor.JobEntryHadoopTransJobExecutor.execute(JobEntryHadoopTransJobExecutor.java:874)
      2. org.pentaho.di.job.Job.execute(Job.java:528)
      3. org.pentaho.di.job.Job.execute(Job.java:667)
      4. org.pentaho.di.job.Job.execute(Job.java:667)
      5. org.pentaho.di.job.Job.execute(Job.java:667)
      6. org.pentaho.di.job.Job.execute(Job.java:667)
      7. org.pentaho.di.job.Job.execute(Job.java:393)
      8. org.pentaho.di.job.Job.run(Job.java:313)
      8 frames