java.lang.IllegalArgumentException: Unknown/unsupported param List(--executor-memory, 0.5g, --executor-cores, 2, --primary-py-file, s3://<mybucketname>/mypythonfile.py, --class, org.apache.spark.deploy.PythonRunner)
Usage: org.apache.spark.deploy.yarn.Client [options]
Options:
  --jar JAR_PATH           Path to your application's JAR file (required in yarn-cluster mode)
  . . .

Stack Overflow | Gopala | 7 months ago
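
One thing worth checking up front, offered as a hedged hint rather than the thread's accepted fix: the usage text quoted further down gives memory values as whole numbers with a unit suffix (e.g. 1000M, 2G), so the fractional 0.5g in the rejected parameter list is suspect, even though the "Unknown/unsupported param" message itself comes from the YARN client's argument parser and could have another cause. Below is a minimal sketch of a cluster-mode submission of the same S3 file with a whole-number memory value; the 512m/2 sizing is illustrative, and --master yarn --deploy-mode cluster assumes a Spark version that accepts the split master/deploy-mode form.

    # Hedged sketch, not the accepted answer from the thread.
    # The S3 path is the placeholder from the question; sizing values are illustrative.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --executor-memory 512m \
      --executor-cores 2 \
      s3://<mybucketname>/mypythonfile.py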
  1. amazon emr spark submission from S3 not working

    Stack Overflow | 7 months ago | Gopala
    java.lang.IllegalArgumentException: Unknown/unsupported param List(--executor-memory, 0.5g, --executor-cores, 2, --primary-py-file, s3://<mybucketname>/mypythonfile.py, --class, org.apache.spark.deploy.PythonRunner)
    Usage: org.apache.spark.deploy.yarn.Client [options]
    Options:
      --jar JAR_PATH           Path to your application's JAR file (required in yarn-cluster mode)
      . . .
  2. How to specify multiple dependencies using --packages for spark-submit?

    Stack Overflow | 1 year ago | davidpricedev
    java.lang.IllegalArgumentException: Given path is malformed: org.apache.hbase:hbase-common:1.0.0
    (A hedged --packages sketch follows this list.)
  3. Running Spark on m4 instead of m3 on AWS

    Stack Overflow | 5 months ago | user6742737
    java.lang.IllegalArgumentException: Unknown/unsupported param List(--executor-cores, , --files, s3://pythonpicode/PythonPi.py, --primary-py-file, PythonPi.py, --class, org.apache.spark.deploy.PythonRunner)
    Usage: org.apache.spark.deploy.yarn.Client [options]
    Options:
      --jar JAR_PATH           Path to your application's JAR file (required in yarn-cluster mode)
      --class CLASS_NAME       Name of your application's main class (required)
      --primary-py-file        A main Python file
      --arg ARG                Argument to be passed to your application's main class.
                               Multiple invocations are possible, each will be passed in order.
      --num-executors NUM      Number of executors to start (Default: 2)
      --executor-cores NUM     Number of cores per executor (Default: 1).
      --driver-memory MEM      Memory for driver (e.g. 1000M, 2G) (Default: 512 Mb)
      --driver-cores NUM       Number of cores used by the driver (Default: 1).
      --executor-memory MEM    Memory per executor (e.g. 1000M, 2G) (Default: 1G)
      --name NAME              The name of your application (Default: Spark)
      --queue QUEUE            The hadoop queue to use for allocation requests (Default: 'default')
      --addJars jars           Comma separated list of local jars that want SparkContext.addJar to work with.
      --py-files PY_FILES      Comma-separated list of .zip, .egg, or .py files to place on the PYTHONPATH for Python apps.
      --files files            Comma separated list of files to be distributed with the job.
      --archives archives      Comma separated list of archives to be distributed with the job.
  4. GitHub comment 209#98161267

    GitHub | 2 years ago | dsdinter
    java.lang.IllegalArgumentException: You must specify at least 1 executor!
    Usage: org.apache.spark.deploy.yarn.Client [options]
    Options:
      --jar JAR_PATH           Path to your application's JAR file (required in yarn-cluster mode)
      --class CLASS_NAME       Name of your application's main class (required)
      --primary-py-file        A main Python file
      --arg ARG                Argument to be passed to your application's main class.
                               Multiple invocations are possible, each will be passed in order.
      --num-executors NUM      Number of executors to start (Default: 2)
      --executor-cores NUM     Number of cores per executor (Default: 1).
      --driver-memory MEM      Memory for driver (e.g. 1000M, 2G) (Default: 512 Mb)
      --driver-cores NUM       Number of cores used by the driver (Default: 1).
      --executor-memory MEM    Memory per executor (e.g. 1000M, 2G) (Default: 1G)
      --name NAME              The name of your application (Default: Spark)
      --queue QUEUE            The hadoop queue to use for allocation requests (Default: 'default')
      --addJars jars           Comma separated list of local jars that want SparkContext.addJar to work with.
      --py-files PY_FILES      Comma-separated list of .zip, .egg, or .py files to place on the PYTHONPATH for Python apps.
      --files files            Comma separated list of files to be distributed with the job.
      --archives archives      Comma separated list of archives to be distributed with the job.
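
On the malformed-path report (How to specify multiple dependencies using --packages for spark-submit?), the error is consistent with a Maven coordinate landing where spark-submit expects a file path: --packages takes a single comma-separated list of groupId:artifactId:version coordinates, while file paths belong to --jars or to the application argument. A hedged sketch, not the thread's accepted answer; the hbase-client coordinate and my_app.py are illustrative placeholders.

    # Hedged sketch: multiple coordinates go in one comma-separated --packages value,
    # with no spaces between them. my_app.py stands in for the real application file.
    spark-submit \
      --packages org.apache.hbase:hbase-common:1.0.0,org.apache.hbase:hbase-client:1.0.0 \
      my_app.py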
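
The last two reports (Running Spark on m4 instead of m3 on AWS, and GitHub comment 209#98161267) both show org.apache.spark.deploy.yarn.Client rejecting a submission whose executor settings arrived empty or as zero: an --executor-cores flag with no value in one, "You must specify at least 1 executor!" in the other. A hedged sketch that passes explicit, non-empty values rather than relying on settings filled in elsewhere; the S3 path comes from the m4/m3 report, the sizing numbers are illustrative, and resolving an s3:// primary file assumes an EMR-style Hadoop/S3 setup.

    # Hedged sketch: explicit executor settings, so nothing reaches the YARN client
    # as an empty or zero value. Sizing values are illustrative only.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --num-executors 2 \
      --executor-cores 1 \
      --executor-memory 1g \
      s3://pythonpicode/PythonPi.py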


    Root Cause Analysis

    1. java.lang.IllegalArgumentException

      Unknown/unsupported param List(--executor-memory, 0.5g, --executor-cores, 2, --primary-py-file, s3://<mybucketname>/mypythonfile.py, --class, org.apache.spark.deploy.PythonRunner)
      Usage: org.apache.spark.deploy.yarn.Client [options]
      Options:
        --jar JAR_PATH           Path to your application's JAR file (required in yarn-cluster mode)
        . . .

      at org.apache.spark.deploy.SparkSubmit$.main()
    2. Spark
      SparkSubmit.main
      1. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
      2. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      2 frames