java.lang.AssertionError: objvals from line-search and gradient tasks differ, 0.6670861437437924 != 0.6670984136694696

JIRA | Sebastian Vidrio | 2 years ago
  1.

    R Repro:

    pros.hex <- h2o.uploadFile(conn, locate("smalldata/prostate/prostate.csv.zip"))
    pros.hex[,2] <- as.factor(pros.hex[,2])
    pros.hex[,4] <- as.factor(pros.hex[,4])
    pros.hex[,5] <- as.factor(pros.hex[,5])
    pros.hex[,6] <- as.factor(pros.hex[,6])
    pros.hex[,9] <- as.factor(pros.hex[,9])
    p.sid <- h2o.runif(pros.hex)
    pros.train <- h2o.assign(pros.hex[p.sid > .2, ], "pros.train")
    pros.test <- h2o.assign(pros.hex[p.sid <= .2, ], "pros.test")
    h2o.glm(x = 3:9, y = 2, training_frame = pros.train, family = "binomial",
            solver = "L_BFGS", alpha = 0.5, lambda_search = TRUE)

    Stacktrace:

    Got exception 'class java.lang.AssertionError', with msg 'objvals from line-search and gradient tasks differ, 0.6670861437437924 != 0.6670984136694696'
    java.lang.AssertionError: objvals from line-search and gradient tasks differ, 0.6670861437437924 != 0.6670984136694696
        at hex.optimization.L_BFGS.solve(L_BFGS.java:278)
        at hex.glm.GLM$LBFGS_ProximalSolver.solve(GLM.java:1422)
        at hex.optimization.ADMM$L1Solver.solve(ADMM.java:85)
        at hex.optimization.ADMM$L1Solver.solve(ADMM.java:37)
        at hex.glm.GLM$GLMSingleLambdaTsk.solve(GLM.java:837)
        at hex.glm.GLM$GLMSingleLambdaTsk.compute2(GLM.java:1030)
        at water.H2O$H2OCountedCompleter.compute(H2O.java:682)
        at jsr166y.CountedCompleter.exec(CountedCompleter.java:429)
        at jsr166y.ForkJoinTask.doExec(ForkJoinTask.java:263)
        at jsr166y.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:914)
        at jsr166y.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:979)
        at jsr166y.ForkJoinPool.runWorker(ForkJoinPool.java:1477)
        at jsr166y.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)

    JIRA | 2 years ago | Sebastian Vidrio
    java.lang.AssertionError: objvals from line-search and gradient tasks differ, 0.6670861437437924 != 0.6670984136694696

    H2O Build git hash b097777d1ce872722cdcd48264e6be45ee462230, H2O built on 2016-12-01 14:50:11; alpha = 0; nfolds = 5; MeanImputation; seed = 4962489898628097000

    {code:java}
    [1] "dataset-71"
    [1] "/home/nidhi/auto_sklearn_csv/hypothyroid.arff.txt"
    |======================================================================| 100%
    |                                                                      |   0%
    java.lang.AssertionError: invalid gradient/direction, got positive differential 0.028143596300444426
    java.lang.AssertionError: invalid gradient/direction, got positive differential 0.028143596300444426
        at hex.optimization.OptimizationUtils$MoreThuente.evaluate(OptimizationUtils.java:334)
        at hex.glm.GLM$GLMDriver.fitIRLSM(GLM.java:677)
        at hex.glm.GLM$GLMDriver.fitModel(GLM.java:933)
        at hex.glm.GLM$GLMDriver.computeSubmodel(GLM.java:1002)
        at hex.glm.GLM$GLMDriver.computeImpl(GLM.java:1071)
        at hex.ModelBuilder$Driver.compute2(ModelBuilder.java:169)
        at hex.glm.GLM$GLMDriver.compute2(GLM.java:535)
        at water.H2O$H2OCountedCompleter.compute(H2O.java:1214)
        at jsr166y.CountedCompleter.exec(CountedCompleter.java:468)
        at jsr166y.ForkJoinTask.doExec(ForkJoinTask.java:263)
        at jsr166y.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:974)
        at jsr166y.ForkJoinPool.runWorker(ForkJoinPool.java:1477)
        at jsr166y.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)
    Error: java.lang.AssertionError: invalid gradient/direction, got positive differential 0.028143596300444426
    {code}

    {code:java}
    data = h2o.importFile("/home/nidhi/hypothyroid.arff.txt", destination_frame = "data", header = T)
    response = "binaryClass"
    response_indx = which(names(data) == response)
    data[,response_indx] = as.factor(data[,response_indx])
    data = h2o.assign(data, key = "data")
    if (length(h2o.levels(data[,response_indx])) > 2) {
      family = "multinomial"
    } else {
      family = "binomial"
    }
    id_col_idx = which(names(data) %in% c("ID","Id"))
    if (length(id_col_idx) > 0) {
      # remove id column
      data = h2o.assign(data[,-id_col_idx], key = "data")
    }
    myY = response
    myX = setdiff(names(data), response)
    griables = list(alpha = c(0, .2, .4, .6, .8, 1),
                    missing_values_handling = c("MeanImputation", "Skip"))
    gg = h2o.grid(algorithm = "glm", grid_id = "aa", x = myX, y = myY,
                  training_frame = data, hyper_params = griables,
                  family = family, nfolds = 5, lambda_search = TRUE,
                  search_criteria = list(strategy = "RandomDiscrete",
                                         stopping_metric = "AUTO",
                                         stopping_tolerance = 0.0001,
                                         stopping_rounds = 3))
    {code}

    JIRA | 2 months ago | Nidhi Mehta
    java.lang.AssertionError: invalid gradient/direction, got positive differential 0.028143596300444426
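    Background on this assertion: a line search such as Moré-Thuente requires the search direction d to be a descent direction, meaning the directional derivative grad·d must be negative at the current point; "got positive differential" indicates grad·d > 0, so a step along d would increase the objective. The sketch below is illustrative only (the class and method names are made up, not H2O's actual implementation):

    ```java
    // Illustrative sketch of a descent-direction check, the kind of
    // condition that "invalid gradient/direction, got positive
    // differential" reports. Not H2O's code.
    public class DescentCheck {
        // Dot product of two equal-length vectors.
        static double dot(double[] g, double[] d) {
            double s = 0.0;
            for (int i = 0; i < g.length; i++) s += g[i] * d[i];
            return s;
        }

        // A direction is usable for a minimizing line search only if
        // the directional derivative grad·d is strictly negative.
        static boolean isDescentDirection(double[] grad, double[] dir) {
            return dot(grad, dir) < 0.0;
        }

        public static void main(String[] args) {
            double[] grad = {1.0, -2.0};
            // grad·d = -1 - 4 = -5 < 0: a valid descent direction.
            System.out.println(isDescentDirection(grad, new double[]{-1.0, 2.0}));
            // grad·d = 1 > 0: the "positive differential" failure case.
            System.out.println(isDescentDirection(grad, new double[]{1.0, 0.0}));
        }
    }
    ```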


    Root Cause Analysis

    1. java.lang.AssertionError

      objvals from line-search and gradient tasks differ, 0.6670861437437924 != 0.6670984136694696

      at hex.optimization.L_BFGS.solve()
    2. hex.optimization
      L_BFGS.solve
      1. hex.optimization.L_BFGS.solve(L_BFGS.java:278)
      1 frame
    3. hex.glm
      GLM$LBFGS_ProximalSolver.solve
      1. hex.glm.GLM$LBFGS_ProximalSolver.solve(GLM.java:1422)
      1 frame
    4. hex.optimization
      ADMM$L1Solver.solve
      1. hex.optimization.ADMM$L1Solver.solve(ADMM.java:85)
      2. hex.optimization.ADMM$L1Solver.solve(ADMM.java:37)
      2 frames
    5. hex.glm
      GLM$GLMSingleLambdaTsk.compute2
      1. hex.glm.GLM$GLMSingleLambdaTsk.solve(GLM.java:837)
      2. hex.glm.GLM$GLMSingleLambdaTsk.compute2(GLM.java:1030)
      2 frames
    6. water
      H2O$H2OCountedCompleter.compute
      1. water.H2O$H2OCountedCompleter.compute(H2O.java:682)
      1 frame
    7. jsr166y
      ForkJoinWorkerThread.run
      1. jsr166y.CountedCompleter.exec(CountedCompleter.java:429)
      2. jsr166y.ForkJoinTask.doExec(ForkJoinTask.java:263)
      3. jsr166y.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:914)
      4. jsr166y.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:979)
      5. jsr166y.ForkJoinPool.runWorker(ForkJoinPool.java:1477)
      6. jsr166y.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)
      6 frames
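    For the "objvals from line-search and gradient tasks differ" case, the assertion fires because the objective value computed during the line search and the objective recomputed by a separate gradient task disagree, here by roughly 1.2e-5, beyond what the consistency check tolerates. The sketch below is an assumption, not H2O's actual code (class name, method, and tolerance values are invented), and only illustrates how such a comparison behaves under a tight versus a loose relative tolerance:

    ```java
    // Illustrative sketch: comparing two independently computed
    // floating-point objective values with a relative tolerance.
    // Not H2O's code; names and tolerances are made up.
    public class ObjvalCheck {
        // True when a and b agree within relative tolerance eps.
        static boolean closeEnough(double a, double b, double eps) {
            double scale = Math.max(1.0, Math.max(Math.abs(a), Math.abs(b)));
            return Math.abs(a - b) <= eps * scale;
        }

        public static void main(String[] args) {
            double lineSearchObj = 0.6670861437437924;  // from the report
            double gradientObj   = 0.6670984136694696;  // from the report
            // The values differ by ~1.227e-5, so they fail a tight
            // 1e-8 tolerance but pass a looser 1e-4 one.
            System.out.println(closeEnough(lineSearchObj, gradientObj, 1e-8)); // false
            System.out.println(closeEnough(lineSearchObj, gradientObj, 1e-4)); // true
        }
    }
    ```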