java.lang.AssertionError: invalid gradient/direction, got positive differential 0.0010490659578066802

JIRA | Tomas Nykodym | 3 months ago
  1.

    Running a multinomial GLM (lambda = 0, family = 'multinomial', all other parameters at their defaults) on the given dataset results in the error below; a minimal Python repro sketch follows this entry.

    09-19 04:36:13.387 172.17.2.154:50000 3547 #70588-12 INFO: POST /3/ModelBuilders/glm, parms: {missing_values_handling=MeanImputation, lambda=0, family=multinomial, training_frame=py_10_sid_bb70, response_column=C6}
    09-19 04:36:13.391 172.17.2.154:50000 3547 FJ-1-41 INFO: Building H2O GLM model with these parameters:
    09-19 04:36:13.391 172.17.2.154:50000 3547 FJ-1-41 INFO: {"_train":{"name":"py_10_sid_bb70","type":"Key"},"_valid":null,"_nfolds":0,"_keep_cross_validation_predictions":false,"_keep_cross_validation_fold_assignment":false,"_parallelize_cross_validation":true,"_auto_rebalance":true,"_seed":-1,"_fold_assignment":"AUTO","_categorical_encoding":"AUTO","_distribution":"AUTO","_tweedie_power":1.5,"_quantile_alpha":0.5,"_huber_alpha":0.9,"_ignored_columns":null,"_ignore_const_cols":true,"_weights_column":null,"_offset_column":null,"_fold_column":null,"_is_cv_model":false,"_score_each_iteration":false,"_max_runtime_secs":0.0,"_stopping_rounds":3,"_stopping_metric":"deviance","_stopping_tolerance":1.0E-4,"_response_column":"C6","_balance_classes":false,"_max_after_balance_size":5.0,"_class_sampling_factors":null,"_max_confusion_matrix_size":20,"_checkpoint":null,"_pretrained_autoencoder":null,"_standardize":true,"_family":"multinomial","_link":"family_default","_solver":"AUTO","_tweedie_variance_power":0.0,"_tweedie_link_power":1.0,"_alpha":null,"_lambda":[0.0],"_missing_values_handling":"MeanImputation","_prior":-1.0,"_lambda_search":false,"_nlambdas":-1,"_non_negative":false,"_exactLambdas":false,"_lambda_min_ratio":-1.0,"_use_all_factor_levels":false,"_max_iterations":-1,"_intercept":true,"_beta_epsilon":1.0E-4,"_objective_epsilon":-1.0,"_gradient_epsilon":-1.0,"_obj_reg":-1.0,"_compute_p_values":false,"_remove_collinear_columns":false,"_interactions":null,"_early_stopping":true,"_beta_constraints":null,"_max_active_predictors":-1,"_stdOverride":false}
    09-19 04:36:13.395 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=0 lmb=.0E0 obj=1.3183 imp=.1E1 bdf=.22E1] picked solver IRLSM made vecs [[95, {/172.17.2.154:50000:0:}], [95, {/172.17.2.154:50000:0:}]]
    09-19 04:36:13.396 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=0 lmb=.0E0 obj=1.3183 imp=.1E1 bdf=.22E1] Class 0 got 5 active columns out of 5 total
    09-19 04:36:13.396 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=0 lmb=.0E0 obj=1.3183 imp=.1E1 bdf=.22E1] Class 1 got 5 active columns out of 5 total
    09-19 04:36:13.396 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=0 lmb=.0E0 obj=1.3183 imp=.1E1 bdf=.22E1] Class 2 got 5 active columns out of 5 total
    09-19 04:36:13.396 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=0 lmb=.0E0 obj=1.3183 imp=.1E1 bdf=.22E1] Class 3 got 5 active columns out of 5 total
    09-19 04:36:13.399 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=0 lmb=.0E0 obj=1.3183 imp=.1E1 bdf=.22E1] computed in 2+1+0+0=3ms, step = 1.4165876969584055
    09-19 04:36:13.402 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=0 lmb=.0E0 obj=1.3183 imp=.1E1 bdf=.22E1] computed in 2+1+0+0=3ms, step = 0.733585713016148
    09-19 04:36:13.417 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=0 lmb=.0E0 obj=1.3183 imp=.1E1 bdf=.22E1] computed in 8+6+1+0=15ms, step = 1.3305058187791514
    09-19 04:36:13.429 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=0 lmb=.0E0 obj=1.3183 imp=.1E1 bdf=.22E1] computed in 2+9+0+1=12ms, step = 0.3680862155100146
    09-19 04:36:13.432 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=1 lmb=.0E0 obj=0.7632 imp=.42E0 bdf=.16E1] computed in 1+1+0+1=3ms, step = 1.5871265752052253
    09-19 04:36:13.439 172.17.2.154:50000 3547 FJ-1-41 INFO: GLM[dest=GLM_model_python_1474284809696_2955, iter=1 lmb=.0E0 obj=0.7632 imp=.42E0 bdf=.16E1] computed in 1+5+0+1=7ms, step = 1.0
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: java.lang.AssertionError: invalid gradient/direction, got positive differential 0.0010490659578066802
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at hex.optimization.OptimizationUtils$MoreThuente.evaluate(OptimizationUtils.java:334)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at hex.glm.GLM$GLMDriver.fitIRLSM_multinomial(GLM.java:606)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at hex.glm.GLM$GLMDriver.fitModel(GLM.java:907)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at hex.glm.GLM$GLMDriver.computeSubmodel(GLM.java:980)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at hex.glm.GLM$GLMDriver.computeImpl(GLM.java:1042)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at hex.ModelBuilder$Driver.compute2(ModelBuilder.java:169)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at hex.glm.GLM$GLMDriver.compute2(GLM.java:515)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at water.H2O$H2OCountedCompleter.compute(H2O.java:1198)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at jsr166y.CountedCompleter.exec(CountedCompleter.java:468)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at jsr166y.ForkJoinTask.doExec(ForkJoinTask.java:263)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at jsr166y.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:974)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at jsr166y.ForkJoinPool.runWorker(ForkJoinPool.java:1477)
    09-19 04:36:13.442 172.17.2.154:50000 3547 FJ-1-41 ERRR: at jsr166y.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)
    09-19 04:36:13.618 172.17.2.154:50000 3547 #70588-15 INFO: DELETE /4/sessions/_sid_bb70, parms: {}

    JIRA | 3 months ago | Tomas Nykodym
    java.lang.AssertionError: invalid gradient/direction, got positive differential 0.0010490659578066802
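    A minimal Python-client sketch of the reported setup, using only the parameters visible in the log above (family, lambda, missing-value handling, response column C6). The file path is a placeholder for the dataset attached to the JIRA issue (the frame py_10_sid_bb70 in the log), and older h2o releases spell the regularization keyword Lambda rather than lambda_.

    import h2o
    from h2o.estimators.glm import H2OGeneralizedLinearEstimator

    h2o.init()

    # Placeholder path: the actual data is the frame attached to the JIRA issue.
    train = h2o.import_file("multinomial_repro.csv")
    train["C6"] = train["C6"].asfactor()  # multinomial response column from the log

    # Parameters taken from the "Building H2O GLM model" log line above;
    # everything else stays at its default.
    glm = H2OGeneralizedLinearEstimator(
        family="multinomial",
        lambda_=[0.0],
        missing_values_handling="MeanImputation",
    )

    # On the affected build this call aborts with
    # "invalid gradient/direction, got positive differential ...".
    glm.train(y="C6", training_frame=train)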
  2.

    R Repro:

    pros.hex <- h2o.uploadFile(conn, locate("smalldata/prostate/prostate.csv.zip"))
    pros.hex[,2] <- as.factor(pros.hex[,2])
    pros.hex[,4] <- as.factor(pros.hex[,4])
    pros.hex[,5] <- as.factor(pros.hex[,5])
    pros.hex[,6] <- as.factor(pros.hex[,6])
    pros.hex[,9] <- as.factor(pros.hex[,9])
    p.sid <- h2o.runif(pros.hex)
    pros.train <- h2o.assign(pros.hex[p.sid > .2, ], "pros.train")
    pros.test <- h2o.assign(pros.hex[p.sid <= .2, ], "pros.test")
    h2o.glm(x = 3:9, y = 2, training_frame = pros.train, family = "binomial", solver = "L_BFGS", alpha = 0.5, lambda_search = TRUE)

    (A rough Python translation of this repro is sketched after this entry.)

    Stacktrace: got exception 'class java.lang.AssertionError', with msg 'objvals from line-search and gradient tasks differ, 0.6670861437437924 != 0.6670984136694696'

    java.lang.AssertionError: objvals from line-search and gradient tasks differ, 0.6670861437437924 != 0.6670984136694696
    at hex.optimization.L_BFGS.solve(L_BFGS.java:278)
    at hex.glm.GLM$LBFGS_ProximalSolver.solve(GLM.java:1422)
    at hex.optimization.ADMM$L1Solver.solve(ADMM.java:85)
    at hex.optimization.ADMM$L1Solver.solve(ADMM.java:37)
    at hex.glm.GLM$GLMSingleLambdaTsk.solve(GLM.java:837)
    at hex.glm.GLM$GLMSingleLambdaTsk.compute2(GLM.java:1030)
    at water.H2O$H2OCountedCompleter.compute(H2O.java:682)
    at jsr166y.CountedCompleter.exec(CountedCompleter.java:429)
    at jsr166y.ForkJoinTask.doExec(ForkJoinTask.java:263)
    at jsr166y.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:914)
    at jsr166y.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:979)
    at jsr166y.ForkJoinPool.runWorker(ForkJoinPool.java:1477)
    at jsr166y.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)

    JIRA | 2 years ago | Sebastian Vidrio
    java.lang.AssertionError: objvals from line-search and gradient tasks differ, 0.6670861437437924 != 0.6670984136694696
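    For reference, a rough Python-client equivalent of the R repro above; this is a sketch, not verified against the affected build. Column indices mirror the R code, and the zip path is assumed to be reachable from the client instead of going through conn/locate.

    import h2o
    from h2o.estimators.glm import H2OGeneralizedLinearEstimator

    h2o.init()
    pros = h2o.import_file("smalldata/prostate/prostate.csv.zip")

    # Same factor conversions as the R repro: R's 1-based columns 2,4,5,6,9
    # are 0-based 1,3,4,5,8 here.
    for idx in [1, 3, 4, 5, 8]:
        pros[idx] = pros[idx].asfactor()

    r = pros.runif()                 # analogue of h2o.runif()
    pros_train = pros[r > 0.2, :]
    pros_test = pros[r <= 0.2, :]

    glm = H2OGeneralizedLinearEstimator(
        family="binomial",
        solver="L_BFGS",
        alpha=0.5,
        lambda_search=True,
    )
    # x = 3:9 and y = 2 in the R call become 0-based indices 2..8 and 1.
    glm.train(x=list(range(2, 9)), y=1, training_frame=pros_train)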

  3.

    1. Getting Started: Compiling, Running, and Debugging - Java Cookbook, 3rd Edition [Book]

    safaribooksonline.com | 4 months ago
    java.lang.AssertionError: i is non-positive
  4.

    smali LexerTest fails always on Windows

    GitHub | 2 years ago | JesusFreke
    java.lang.AssertionError: Invalid token text at index 1. Expecting text " " [10], got " " [13, 10]


    Root Cause Analysis

    1. java.lang.AssertionError

      invalid gradient/direction, got positive differential 0.0010490659578066802

      at hex.optimization.OptimizationUtils$MoreThuente.evaluate()
    2. hex.optimization
      OptimizationUtils$MoreThuente.evaluate
      1. hex.optimization.OptimizationUtils$MoreThuente.evaluate(OptimizationUtils.java:334)
      1 frame
    3. hex.glm
      GLM$GLMDriver.computeImpl
      1. hex.glm.GLM$GLMDriver.fitIRLSM_multinomial(GLM.java:606)
      2. hex.glm.GLM$GLMDriver.fitModel(GLM.java:907)
      3. hex.glm.GLM$GLMDriver.computeSubmodel(GLM.java:980)
      4. hex.glm.GLM$GLMDriver.computeImpl(GLM.java:1042)
      4 frames
    4. hex
      ModelBuilder$Driver.compute2
      1. hex.ModelBuilder$Driver.compute2(ModelBuilder.java:169)
      1 frame
    5. hex.glm
      GLM$GLMDriver.compute2
      1. hex.glm.GLM$GLMDriver.compute2(GLM.java:515)
      1 frame
    6. water
      H2O$H2OCountedCompleter.compute
      1. water.H2O$H2OCountedCompleter.compute(H2O.java:1198)
      1 frame
    7. jsr166y
      ForkJoinWorkerThread.run
      1. jsr166y.CountedCompleter.exec(CountedCompleter.java:468)
      2. jsr166y.ForkJoinTask.doExec(ForkJoinTask.java:263)
      3. jsr166y.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:974)
      4. jsr166y.ForkJoinPool.runWorker(ForkJoinPool.java:1477)
      5. jsr166y.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)
      5 frames
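    For context on what the failing assertion at the top of this trace checks: the More-Thuente line search invoked from GLM's IRLSM multinomial path requires the proposed search direction to be a descent direction, i.e. the directional derivative (gradient dotted with the direction) must be negative at the start of the search. The sketch below is a generic illustration of that invariant, not H2O's actual code.

    # Generic sketch of the invariant behind the assertion; not H2O code.
    def directional_derivative(gradient, direction):
        # Dot product of the objective gradient with the proposed step direction.
        return sum(g * d for g, d in zip(gradient, direction))

    def assert_descent_direction(gradient, direction):
        diff = directional_derivative(gradient, direction)
        # For a minimization line search this must be negative; a positive value
        # is what the assertion above reports as a "positive differential".
        assert diff < 0, "invalid gradient/direction, got positive differential %s" % diff
        return diff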