java.io.IOException: Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7

Pentaho BI Platform Tracking | Tim Lynch | 4 years ago
Here are the best solutions we found on the Internet.
  1.

    The split on the comma at https://github.com/pentaho/big-data-plugin/blob/master/src/org/pentaho/di/job/entries/hadooptransjobexecutor/JobEntryHadoopTransJobExecutor.java#L689 breaks any input path whose glob contains a comma: /input/{201210,201211} is split on that comma, producing two paths with incomplete glob patterns ({201210 and 201211}) and throwing an IOException (see the brace-aware splitting sketch after the solutions list below):

    ERROR 27-11 23:19:54,573 - PriviledgedActionException as:tlynch (auth:SIMPLE) cause:java.io.IOException: Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7
    ERROR 27-11 23:19:54,577 - Pentaho Map NO Reduce 2 - Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7
    ERROR 27-11 23:19:54,578 - Pentaho Map NO Reduce 2 - java.io.IOException: Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7

    Both errors carry the same stack trace, running from org.apache.hadoop.fs.FileSystem$GlobFilter through FileSystem.globStatus, FileInputFormat.getSplits and JobClient.submitJob into JobEntryHadoopTransJobExecutor.execute and Job.run; it is broken down frame by frame in the Root Cause Analysis section below.

    Pentaho BI Platform Tracking | 4 years ago | Tim Lynch
    java.io.IOException: Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7
  2.

    FTP file name in FileInputFormat.setInputPath

    Stack Overflow | 4 years ago | RadAl
    java.io.IOException: Login failed on server - 0.0.0.0, port - 21
  3.

    How do i use a file on FTP in a stand alone spark cluster (pySpark)?

    Stack Overflow | 11 months ago | rajat
    java.io.IOException: Login failed on server - 192.168.125.124, port - 21
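
    Solution 1 above comes down to how the comma split is done: splitting on every comma also splits inside a {...} glob group, which is what leaves the broken fragments {201210 and 201211}. Below is a minimal sketch of a brace-aware splitter; the splitInputPaths helper and the example paths are illustrative assumptions, not code from the Pentaho plugin or from Hadoop.

        import java.util.ArrayList;
        import java.util.Arrays;
        import java.util.List;

        public class GlobAwareSplit {

            // Hypothetical helper: split a comma-separated path list, but
            // ignore commas that sit inside an open {...} glob group.
            static List<String> splitInputPaths(String paths) {
                List<String> result = new ArrayList<>();
                StringBuilder current = new StringBuilder();
                int braceDepth = 0;
                for (char c : paths.toCharArray()) {
                    if (c == '{') {
                        braceDepth++;
                    } else if (c == '}') {
                        braceDepth--;
                    }
                    if (c == ',' && braceDepth == 0) {
                        result.add(current.toString());
                        current.setLength(0);
                    } else {
                        current.append(c);
                    }
                }
                result.add(current.toString());
                return result;
            }

            public static void main(String[] args) {
                String paths = "/input/{201210,201211},/other/input";

                // Naive split, as in the bug report: yields "/input/{201210",
                // "201211}" and "/other/input"; the first two are the
                // incomplete glob patterns that globStatus rejects.
                System.out.println(Arrays.asList(paths.split(",")));

                // Brace-aware split: yields "/input/{201210,201211}" and
                // "/other/input", both valid glob patterns.
                System.out.println(splitInputPaths(paths));
            }
        }

    If the splitting is made brace-aware this way (or the paths are handed over as Path objects rather than one comma-joined string), the intact glob /input/{201210,201211} reaches Hadoop and is expanded there instead of being torn apart beforehand.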

    Root Cause Analysis

    1. java.io.IOException

      Illegal file pattern: Expecting set closure character or end of range, or } for glob {201210 at 7

      at org.apache.hadoop.fs.FileSystem$GlobFilter.error()
    2. Hadoop
      FileSystem.globStatus
      1. org.apache.hadoop.fs.FileSystem$GlobFilter.error(FileSystem.java:1234)
      2. org.apache.hadoop.fs.FileSystem$GlobFilter.setRegex(FileSystem.java:1219)
      3. org.apache.hadoop.fs.FileSystem$GlobFilter.<init>(FileSystem.java:1137)
      4. org.apache.hadoop.fs.FileSystem.globStatusInternal(FileSystem.java:1057)
      5. org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1015)
      5 frames
    3. Hadoop
      JobClient$2.run
      1. org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:174)
      2. org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:205)
      3. org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:977)
      4. org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:969)
      5. org.apache.hadoop.mapred.JobClient.access$500(JobClient.java:170)
      6. org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:880)
      7. org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
      7 frames
    4. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:396)
      2 frames
    5. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
      1 frame
    6. Hadoop
      JobClient.submitJob
      1. org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
      2. org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
      2 frames
    7. org.pentaho.di
      Job.run
      1. org.pentaho.di.job.entries.hadooptransjobexecutor.JobEntryHadoopTransJobExecutor.execute(JobEntryHadoopTransJobExecutor.java:874)
      2. org.pentaho.di.job.Job.execute(Job.java:528)
      3. org.pentaho.di.job.Job.execute(Job.java:667)
      4. org.pentaho.di.job.Job.execute(Job.java:667)
      5. org.pentaho.di.job.Job.execute(Job.java:667)
      6. org.pentaho.di.job.Job.execute(Job.java:667)
      7. org.pentaho.di.job.Job.execute(Job.java:393)
      8. org.pentaho.di.job.Job.run(Job.java:313)
      8 frames
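
    The frames above show the exception raised while FileSystem.globStatus compiles the glob pattern, before any map/reduce work starts. Here is a minimal sketch that reproduces the difference between the intact glob and the fragment left after the comma split; it assumes a reachable default filesystem, and the /input paths are hypothetical.

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileStatus;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class GlobStatusCheck {
            public static void main(String[] args) throws Exception {
                // Uses whatever default filesystem the configuration on the
                // classpath points at (e.g. HDFS); the paths are examples.
                Configuration conf = new Configuration();
                FileSystem fs = FileSystem.get(conf);

                // Intact brace glob: expands to /input/201210 and /input/201211.
                FileStatus[] ok = fs.globStatus(new Path("/input/{201210,201211}"));
                System.out.println("matched: " + (ok == null ? 0 : ok.length));

                // Fragment left by splitting on the comma: the unterminated
                // brace group makes globStatus throw
                // java.io.IOException: Illegal file pattern: Expecting set
                // closure character or end of range, or } for glob {201210 at 7
                fs.globStatus(new Path("/input/{201210"));
            }
        }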