org.pentaho.di.core.exception.KettleException: We failed to initialize at least one step. Execution can not begin!

pentaho.com | 5 months ago
  1. Generate Rows crashing? [Archive] - Pentaho Community Forums
     pentaho.com | 5 months ago
     org.pentaho.di.core.exception.KettleException: We failed to initialize at least one step. Execution can not begin!
  2. Using the new CouchDB input step [Archive] - Pentaho Community Forums
     pentaho.com | 8 months ago
     org.pentaho.di.core.exception.KettleException: We failed to initialize at least one step. Execution can not begin!
  3. Convert String to XML parameter - Spatialytics - Forum
     spatialytics.com | 1 year ago
     org.pentaho.di.core.exception.KettleException: We failed to initialize at least one step. Execution can not begin!
  4. Exception trying to close file: java.lang.NullPointerException
     GitHub | 2 years ago | churtado
     org.pentaho.di.core.exception.KettleException: We failed to initialize at least one step. Execution can not begin!
  5.
    My MapReduce input file contains one JSON string per line:

        {"appname":"Laucher","appver":"3.4.1","city":"HongKong"}
        {"appname":"Faker","appver":"1.2.1","city":"Singapore"}

    The MapReduce Input step hops to a UDJC whose main code is below. (I also tested the code in a plain Java project, where it works; I already put json-lib-2.4.jar and ezmorph-1.0.6.jar into the libext folder, changed launcher.properties to add libext, and restarted PDI.)

        import net.sf.json.JSONObject;
        import java.util.Map;

        String originalJsonField;
        String extractJsonField;

        public boolean processRow(StepMetaInterface smi, StepDataInterface sdi) throws KettleException {
          // First, get a row from the default input hop
          //
          Object[] r = getRow();

          // If the row object is null, we are done processing.
          //
          if (r == null) {
            setOutputDone();
            return false;
          }

          // Let's look up parameters only once for performance reason.
          //
          if (first) {
            originalJsonField = getParameter("ORIGINAL_JSONSTR_FIELD");
            extractJsonField = getParameter("EXTRACT_FIELD");
            first = false;
          }

          // It is always safest to call createOutputRow() to ensure that your output row's Object[]
          // is large enough to handle any new fields you are creating in this step.
          //
          Object[] outputRow = createOutputRow(r, data.outputRowMeta.size());

          String originalJsonStr = get(Fields.In, originalJsonField).getString(r);

          // Parse the JSON string and set the value in the output field.
          JSONObject jsonObject = JSONObject.fromObject(originalJsonStr);
          Map jsonMap = (Map) jsonObject;
          String appName = (String) jsonMap.get("appname");
          String appVer = (String) jsonMap.get("appver");
          String city = (String) jsonMap.get("city");
          String extractedValue = appName + "_" + appVer + "_" + city;
          get(Fields.Out, extractJsonField).setValue(outputRow, extractedValue);

          // putRow will send the row on to the default output hop.
          //
          putRow(data.outputRowMeta, outputRow);

          return true;
        }

    Later I configured the extract field as the output field of the UDJC, and I can read it in the next MapReduce Output step. But when I run the job, it always shows these errors:

        2016/11/09 10:13:54 - Pentaho MapReduce - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : [TIPFAILED] -- Task: attempt_1478591104009_0020_m_000000_3 Attempt: attempt_1478591104009_0020_m_000000_3 Event: 3
        2016/11/09 10:13:54 - Pentaho MapReduce - Error: java.io.IOException: org.pentaho.di.core.exception.KettleException: We failed to initialize at least one step. Execution can not begin!
            at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.run(PentahoMapRunnable.java:527)
            at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
            at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
            at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:415)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
            at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
        Caused by: org.pentaho.di.core.exception.KettleException: We failed to initialize at least one step. Execution can not begin!
            at org.pentaho.di.trans.Trans.prepareExecution(Trans.java:1142)
            at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.run(PentahoMapRunnable.java:410)
            ... 7 more

    Pentaho BI Platform Tracking | 3 months ago | dongliang
    org.pentaho.di.core.exception.KettleException: We failed to initialize at least one step. Execution can not begin!
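    Since the step reportedly works in a local Java project but fails to initialize inside the mapper, one plausible suspect (an assumption, not confirmed by the thread) is that json-lib-2.4.jar and ezmorph-1.0.6.jar are not on the classpath of the Hadoop task JVMs, even though they are in PDI's local libext. A dependency-free sketch of the same field extraction, using only `java.util.regex` so nothing beyond the JDK is needed on the cluster nodes, could look like this (`JsonFieldExtractor` is a hypothetical helper, not part of the poster's transformation):

    ```java
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class JsonFieldExtractor {

        // Pulls one string value out of a flat JSON object such as
        // {"appname":"Faker","appver":"1.2.1","city":"Singapore"}.
        // Only handles unescaped string values, which matches the sample input.
        static String field(String json, String key) {
            Matcher m = Pattern
                    .compile("\"" + Pattern.quote(key) + "\"\\s*:\\s*\"([^\"]*)\"")
                    .matcher(json);
            return m.find() ? m.group(1) : null;
        }

        // Builds the appname_appver_city value the UDJC step assembles.
        static String extract(String json) {
            return field(json, "appname") + "_"
                 + field(json, "appver") + "_"
                 + field(json, "city");
        }

        public static void main(String[] args) {
            String line = "{\"appname\":\"Laucher\",\"appver\":\"3.4.1\",\"city\":\"HongKong\"}";
            System.out.println(extract(line)); // prints "Laucher_3.4.1_HongKong"
        }
    }
    ```

    This would replace the `JSONObject.fromObject(...)` block inside `processRow`, leaving the rest of the UDJC unchanged; it is only a workaround sketch for flat single-level JSON, not a general JSON parser.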


    Root Cause Analysis

    1. org.pentaho.di.core.exception.KettleException

      We failed to initialize at least one step. Execution can not begin!

      at org.pentaho.di.trans.Trans.prepareExecution()
    2. org.pentaho.di
      TransPreviewProgressDialog$1.run
        at org.pentaho.di.trans.Trans.prepareExecution(Trans.java:932)
        at org.pentaho.di.ui.trans.dialog.TransPreviewProgressDialog.doPreview(TransPreviewProgressDialog.java:140)
        at org.pentaho.di.ui.trans.dialog.TransPreviewProgressDialog.access$000(TransPreviewProgressDialog.java:52)
        at org.pentaho.di.ui.trans.dialog.TransPreviewProgressDialog$1.run(TransPreviewProgressDialog.java:85)
        (4 frames)
    3. JFace
      ModalContext$ModalContextThread.run
        at org.eclipse.jface.operation.ModalContext$ModalContextThread.run(ModalContext.java:113)
        (1 frame)