java.lang.IllegalArgumentException: Cannot process more than 5000 columns, taking into account expanded categoricals

JIRA | SriSatish Ambati | 2 years ago
  1.

    On Monday, 22 December 2014 12:49:24 UTC+5:30, Kumar wrote:

    My environment: RStudio connecting to a 15-node H2O cluster on Hadoop. I have two data sets, training (pml.training.hex) and test (pml.testing.hex). Both have around 160 columns. Training has 20,000 rows; test has 20 rows.

    I tried running this with the test data set:

        pml.testing.pca.model = h2o.prcomp(pml.testing.hex)

    It goes fine and gives a model with 70 principal components. Next I tried running it for the training data set:

        pml.training.pca.model = h2o.prcomp(pml.training.hex)

    It gives the error in the log below:

        22-Dec 05:00:08.887 10.65.252.156:54321 11357 # Session INFO WATER: Running PCA on dataset with 6127 expanded columns in Gram matrix
        22-Dec 05:00:08.887 10.65.252.156:54321 11357 # Session ERRR WATER:
        + java.lang.IllegalArgumentException: Cannot process more than 5000 columns, taking into account expanded categoricals
        + at hex.pca.PCA.init(PCA.java:102)
        + at water.Job.fork(Job.java:327)
        + at water.Job.serve(Job.java:311)
        + at water.api.Request.serveGrid(Request.java:165)
        + at water.Request2.superServeGrid(Request2.java:490)
        + at water.Request2.serveGrid(Request2.java:411)
        + at water.api.Request.serve(Request.java:142)
        + at water.api.RequestServer.serve(RequestServer.java:507)
        + at water.NanoHTTPD$HTTPSession.run(NanoHTTPD.java:425)
        + at java.lang.Thread.run(Thread.java:744)

    I changed the line to:

        pml.training.pcamodel = h2o.prcomp(pml.training.hex, tol = 0.2, cols = "", max_pc = 1000, key = "", standardize = TRUE, retx = FALSE)

    It still gives exactly the same error. It seems the max_pc value from the first job request is still being used. But why? Also, what would you suggest to resolve it other than restarting the cluster?

    ---------- Forwarded message ----------
    From: Kumar <sureshemailid@gmail.com>
    Date: Mon, Dec 22, 2014 at 12:19 AM
    Subject: [h2ostream] Re: h2o.prcomp() issue
    To: h2ostream@googlegroups.com

    OK... I re-started the cloud, and this time, after all the parsing etc., I tried to execute:

        pml.training.pcamodel = h2o.prcomp(pml.training.hex, max_pc = 1000)

    This time again I got the same error. Not sure what I am doing wrong. Need help. Attached is the log.

    thx
    -Kumar

    JIRA | 2 years ago | SriSatish Ambati
    java.lang.IllegalArgumentException: Cannot process more than 5000 columns, taking into account expanded categoricals
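
    The log in the report above shows the 160 input columns expanding to 6127 columns in the Gram matrix, which exceeds this H2O build's hard 5000-column cap; judging by the log, the check in PCA.init counts expanded input columns, so max_pc alone does not avoid it (the reporter hit the same error with max_pc = 1000 even after a restart). Below is a minimal R sketch of one possible workaround, not something posted in the thread: drop the highest-cardinality factor columns before calling h2o.prcomp so the expanded width stays under 5000. It assumes the classic H2O R API used in the report (h2o.prcomp, head/as.data.frame/colnames on H2O frames, column subsetting by index), that pml.training.hex is already parsed in the session, and an arbitrary 50-level cutoff.

        # Sketch of a possible workaround (assumptions as noted above).
        library(h2o)

        # Pull a small local sample to see which columns are factors and how many
        # levels they have; each level becomes an expanded column in the Gram matrix.
        # Sampling only the first rows may undercount levels, so treat this as a
        # rough estimate.
        sample.df   <- as.data.frame(head(pml.training.hex, 1000))
        factor.cols <- names(sample.df)[sapply(sample.df, is.factor)]
        n.levels    <- sapply(sample.df[factor.cols], nlevels)
        print(sort(n.levels, decreasing = TRUE))

        # Drop the factors with the most levels (free-text or timestamp-like columns
        # are the usual suspects) and rerun PCA on the remaining columns by index.
        heavy    <- names(n.levels)[n.levels > 50]   # arbitrary placeholder threshold
        keep.idx <- which(!(colnames(pml.training.hex) %in% heavy))
        pml.training.pca.model <- h2o.prcomp(pml.training.hex[, keep.idx],
                                             standardize = TRUE)

    If those columns are genuinely needed, reducing their cardinality upstream (for example, bucketing timestamps before the data is parsed into H2O) is another way to keep the expanded count under the limit.
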
  2.

    Universal Image Loader: IllegalArgumentException when using FileNameGenerator with extension

    Stack Overflow | 2 years ago
    java.lang.IllegalArgumentException: keys must match regex [a-z0-9_-]{1,64}: "1828294.jpg"
        at com.nostra13.universalimageloader.cache.disc.impl.ext.DiskLruCache.validateKey(DiskLruCache.java:697)
        at com.nostra13.universalimageloader.cache.disc.impl.ext.DiskLruCache.get(DiskLruCache.java:414)
        at com.nostra13.universalimageloader.cache.disc.impl.ext.LruDiscCache.get(LruDiscCache.java:133)
        at com.nostra13.universalimageloader.core.ImageLoaderEngine$1.run(ImageLoaderEngine.java:72)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
  3.

    [elasticsearch] elasticsearch couchdb-river startup issues - Grokbase

    grokbase.com | 9 months ago
    java.lang.IllegalArgumentException: URI can't be null.
        at sun.net.spi.DefaultProxySelector.select(DefaultProxySelector.java:141)
        at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:925)
        at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:849)
        at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1299)
        at org.elasticsearch.river.couchdb.CouchdbRiver$Slurper.run(CouchdbRiver.java:468)
  4.

    java.lang.IllegalArgumentException: argument type mismatch... WHEN DEPLOYING SIMPLE Service Assembly with servicemix-bean

    apache.org | 1 year ago
    java.lang.IllegalArgumentException: argument type mismatch</loc-message> <stack-trace></stack-trace> </msg-loc-info> </exception-info> </task-result-details> </component-task-result-details> </component-task-result>
        at org.apache.servicemix.common.ManagementSupport.failure(ManagementSupport.java:46)[93:servicemix-common:2011.01.0.fuse-03-01]
        at org.apache.servicemix.common.AbstractDeployer.failure(AbstractDeployer.java:43)[93:servicemix-common:2011.01.0.fuse-03-01]
        at org.apache.servicemix.common.xbean.AbstractXBeanDeployer.deploy(AbstractXBeanDeployer.java:118)[93:servicemix-common:2011.01.0.fuse-03-01]
        at org.apache.servicemix.common.BaseServiceUnitManager.doDeploy(BaseServiceUnitManager.java:88)[93:servicemix-common:2011.01.0.fuse-03-01]
        at org.apache.servicemix.common.BaseServiceUnitManager.deploy(BaseServiceUnitManager.java:69)[93:servicemix-common:2011.01.0.fuse-03-01]
        at org.apache.servicemix.jbi.deployer.artifacts.ServiceUnitImpl.deploy(ServiceUnitImpl.java:104)[96:org.apache.servicemix.jbi.deployer:1.4.0.fuse-03-01]
        at org.apache.servicemix.jbi.deployer.impl.ServiceAssemblyInstaller.deploySUs(ServiceAssemblyInstaller.java:207)[96:org.apache.servicemix.jbi.deployer:1.4.0.fuse-03-01]
        at org.apache.servicemix.jbi.deployer.impl.ServiceAssemblyInstaller.install(ServiceAssemblyInstaller.java:85)[96:org.apache.servicemix.jbi.deployer:1.4.0.fuse-03-01]
        at org.apache.servicemix.jbi.deployer.impl.Deployer.onBundleStarted(Deployer.java:334)[96:org.apache.servicemix.jbi.deployer:1.4.0.fuse-03-01]
        at org.apache.servicemix.jbi.deployer.impl.Deployer.bundleChanged(Deployer.java:264)[96:org.apache.servicemix.jbi.deployer:1.4.0.fuse-03-01]
        at org.eclipse.osgi.framework.internal.core.BundleContextImpl.dispatchEvent(BundleContextImpl.java:919)[osgi-3.6.0.v20100517.jar:]
        at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:227)[osgi-3.6.0.v20100517.jar:]
        at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:149)[osgi-3.6.0.v20100517.jar:]
        at org.eclipse.osgi.framework.internal.core.Framework.publishBundleEventPrivileged(Framework.java:1349)[osgi-3.6.0.v20100517.jar:]
        at org.eclipse.osgi.framework.internal.core.Framework.publishBundleEvent(Framework.java:1300)[osgi-3.6.0.v20100517.jar:]
        at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:380)[osgi-3.6.0.v20100517.jar:]
        at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:284)[osgi-3.6.0.v20100517.jar:]
        at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:276)[osgi-3.6.0.v20100517.jar:]
        at org.apache.karaf.shell.osgi.RestartBundle.doExecute(RestartBundle.java:32)[27:org.apache.karaf.shell.osgi:2.1.6.fuse-01-01]
        at org.apache.karaf.shell.osgi.BundlesCommand.doExecute(BundlesCommand.java:49)[27:org.apache.karaf.shell.osgi:2.1.6.fuse-01-01]
        at org.apache.karaf.shell.console.OsgiCommandSupport.execute(OsgiCommandSupport.java:38)[14:org.apache.karaf.shell.console:2.1.6.fuse-01-01]
        at org.apache.felix.gogo.commands.basic.AbstractCommand.execute(AbstractCommand.java:35)[14:org.apache.karaf.shell.console:2.1.6.fuse-01-01]
        at org.apache.felix.gogo.runtime.shell.CommandProxy.execute(CommandProxy.java:50)[14:org.apache.karaf.shell.console:2.1.6.fuse-01-01]
        at org.apache.felix.gogo.runtime.shell.Closure.execute(Closure.java:229)[14:org.apache.karaf.shell.console:2.1.6.fuse-01-01]
        at org.apache.felix.gogo.runtime.shell.Closure.executeStatement(Closure.java:162)[14:org.apache.karaf.shell.console:2.1.6.fuse-01-01]
        at org.apache.felix.gogo.runtime.shell.Pipe.run(Pipe.java:101)[14:org.apache.karaf.shell.console:2.1.6.fuse-01-01]
        at org.apache.felix.gogo.runtime.shell.Closure.execute(Closure.java:79)[14:org.apache.karaf.shell.console:2.1.6.fuse-01-01]
        at org.apache.felix.gogo.runtime.shell.CommandSessionImpl.execute(CommandSessionImpl.java:71)[14:org.apache.karaf.shell.console:2.1.6.fuse-01-01]
        at org.apache.karaf.shell.console.jline.Console.run(Console.java:170)[14:org.apache.karaf.shell.console:2.1.6.fuse-01-01]

    Root Cause Analysis

    java.lang.IllegalArgumentException: Cannot process more than 5000 columns, taking into account expanded categoricals
        at hex.pca.PCA.init(PCA.java:102)
        at water.Job.fork(Job.java:327)
        at water.Job.serve(Job.java:311)
        at water.api.Request.serveGrid(Request.java:165)
        at water.Request2.superServeGrid(Request2.java:490)
        at water.Request2.serveGrid(Request2.java:411)
        at water.api.Request.serve(Request.java:142)
        at water.api.RequestServer.serve(RequestServer.java:507)
        at water.NanoHTTPD$HTTPSession.run(NanoHTTPD.java:425)
        at java.lang.Thread.run(Thread.java:744)
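
    For context on the message itself: "expanded categoricals" refers to the indicator columns that each factor contributes once categorical data is encoded for the Gram matrix, which is how a frame with only 160 raw columns can fail a 5000-column check. The short base-R illustration below shows the same expansion mechanism; it is not H2O code, and the exact encoding H2O uses may differ slightly (for example, keeping all k levels rather than k - 1).

        # Base-R illustration: a factor with k levels expands into indicator columns
        # (k - 1 of them here, because model.matrix adds an intercept), so a handful
        # of high-cardinality factors can inflate a modest column count into thousands.
        df <- data.frame(x = rnorm(6),
                         g = factor(c("a", "b", "c", "a", "b", "c")))
        mm <- model.matrix(~ ., data = df)
        ncol(mm)      # 4 columns for 2 input columns
        colnames(mm)  # "(Intercept)" "x" "gb" "gc"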