Interpreting the spark-shell Startup Script


The bin/spark-shell script, with the author's annotations inline:

#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

#
# Shell script for starting the Spark Shell REPL

# Determine whether we are running under Cygwin
cygwin=false
case "`uname`" in
  CYGWIN*) cygwin=true;;
esac

# Enter posix mode for bash
set -o posix

## Global script variables
# FWDIR is the Spark installation directory (the parent of bin/)
FWDIR="$(cd "`dirname "$0"`"/..; pwd)"

# Define the help function. It invokes spark-submit --help and filters out
# the spark-submit usage lines, which look like:
#   Usage: spark-submit [options] <app jar | python file> [app arguments]
#   Usage: spark-submit --kill [submission ID] --master [spark://...]
#   Usage: spark-submit --status [submission ID] --master [spark://...]
function usage() {
  echo "Usage: ./bin/spark-shell [options]"
  $FWDIR/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
  exit 0
}

if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
  usage
fi

# Source the utils.sh script. Its job is to sort the script arguments,
# check the validity of some of them, and assign the following two
# array variables:
#
# SUBMISSION_OPTS holds the spark-submit options:
#   key-value options (each must be followed by a value, so the argument
#   count has to be checked):
#     --master | --deploy-mode | --class | --name | --jars | --py-files | --files |
#     --conf | --properties-file | --driver-memory | --driver-java-options |
#     --driver-library-path | --driver-class-path | --executor-memory | --driver-cores |
#     --total-executor-cores | --executor-cores | --queue | --num-executors | --archives
#   flag (non key-value) options:
#     --verbose | -v | --supervise
#
# APPLICATION_OPTS holds every parameter that is not in SUBMISSION_OPTS
source $FWDIR/bin/utils.sh

# The shell variable that names the help function defined above
SUBMIT_USAGE_FUNCTION=usage

# Call the gatherSparkSubmitOpts function from utils.sh to sort the parameters
gatherSparkSubmitOpts "$@"

# The main function: it calls spark-submit --class org.apache.spark.repl.Main
function main() {
  if $cygwin; then
    # Workaround for issue involving JLine and Cygwin
    # (see http://sourceforge.net/p/jline/bugs/40/).
    # If you're using the Mintty terminal emulator in Cygwin, you may need to
    # set the "Backspace sends ^H" setting in the "Keys" section of the Mintty
    # options (see https://github.com/sbt/sbt/issues/562).
    stty -icanon min 1 -echo > /dev/null 2>&1
    export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
    $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main "${SUBMISSION_OPTS[@]}" spark-shell "${APPLICATION_OPTS[@]}"
    stty icanon echo > /dev/null 2>&1
  else
    export SPARK_SUBMIT_OPTS
    $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main "${SUBMISSION_OPTS[@]}" spark-shell "${APPLICATION_OPTS[@]}"
  fi
}

# Copy restore-TTY-on-exit functions from the Scala script so spark-shell exits
# properly even in a binary distribution of Spark where Scala is not installed
exit_status=127
saved_stty=""

# Restore stty settings (echo in particular)
function restoreSttySettings() {
  stty $saved_stty
  saved_stty=""
}

function onExit() {
  if [[ "$saved_stty" != "" ]]; then
    restoreSttySettings
  fi
  exit $exit_status
}

# To reenable echo if we are interrupted before completing.
trap onExit INT

# Save terminal settings
saved_stty=$(stty -g 2>/dev/null)
# Clear on error so we don't later try to restore them
if [[ ! $? ]]; then
  saved_stty=""
fi

main "$@"

# Record the exit status lest it be overwritten:
# then reenable echo and propagate the code.
exit_status=$?
onExit
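
To make the argument sorting concrete, here is a small worked example. The invocation and its flags are hypothetical (chosen for illustration, not taken from the original post); the rewriting itself follows directly from main() and gatherSparkSubmitOpts above:

# Hypothetical call:
#   ./bin/spark-shell --master local[4] --driver-memory 2g --verbose myAppArg
#
# gatherSparkSubmitOpts sorts the arguments into:
#   SUBMISSION_OPTS=(--master local[4] --driver-memory 2g --verbose)   # spark-submit options
#   APPLICATION_OPTS=(myAppArg)                                        # everything else
#
# so main() effectively executes:
./bin/spark-submit --class org.apache.spark.repl.Main \
  --master "local[4]" --driver-memory 2g --verbose \
  spark-shell myAppArg

Note that the literal token spark-shell is passed in the position where an application jar would normally go; spark-submit treats it as a special built-in primary resource rather than as a file on disk.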

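One more detail worth pulling out: the script installs an INT trap so that a Ctrl-C during the session still restores the terminal (the REPL turns echo off, and without the trap an interrupt could leave your shell silently eating keystrokes). Below is a minimal, self-contained sketch of the same save/trap/restore pattern (my own illustration, not Spark code):

#!/usr/bin/env bash
# Save the terminal state in a form stty can replay later.
saved_stty=$(stty -g 2>/dev/null)
exit_status=0

function onExit() {
  # Restore the terminal only if we managed to snapshot it.
  if [[ -n "$saved_stty" ]]; then
    stty "$saved_stty"
  fi
  exit $exit_status
}

# Ctrl-C (SIGINT) now restores echo before exiting.
trap onExit INT

stty -echo                      # simulate a program that disables echo
read -r -p "type something (you will not see it): " line
echo
echo "you typed: $line"
onExit
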
The content of the utils.sh script:

#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Gather all spark-submit options into SUBMISSION_OPTS
function gatherSparkSubmitOpts() {

  if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
    echo "Function for printing usage of $0 is not set." 1>&2
    echo "Please set usage function to shell variable 'SUBMIT_USAGE_FUNCTION' in $0" 1>&2
    exit 1
  fi

  # NOTE: If you add or remove spark-submit options,
  # modify NOT ONLY this script but also SparkSubmitArguments.scala
  SUBMISSION_OPTS=()
  APPLICATION_OPTS=()
  while (($#)); do
    case "$1" in
      --master | --deploy-mode | --class | --name | --jars | --py-files | --files | \
      --conf | --properties-file | --driver-memory | --driver-java-options | \
      --driver-library-path | --driver-class-path | --executor-memory | --driver-cores | \
      --total-executor-cores | --executor-cores | --queue | --num-executors | --archives)
        if [[ $# -lt 2 ]]; then
          "$SUBMIT_USAGE_FUNCTION"
          exit 1;
        fi
        SUBMISSION_OPTS+=("$1"); shift
        SUBMISSION_OPTS+=("$1"); shift
        ;;

      --verbose | -v | --supervise)
        SUBMISSION_OPTS+=("$1"); shift
        ;;

      *)
        APPLICATION_OPTS+=("$1"); shift
        ;;
    esac
  done

  export SUBMISSION_OPTS
  export APPLICATION_OPTS
}
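
To watch the sorting in isolation, the sketch below can be dropped into the root of a Spark 1.x distribution and run. It is a hypothetical helper (the file name test-gather.sh and the sample arguments are my own, not part of Spark):

#!/usr/bin/env bash
# test-gather.sh -- hypothetical helper, not shipped with Spark.
# Run from the root of a Spark 1.x distribution.
FWDIR="$(cd "$(dirname "$0")"; pwd)"
source "$FWDIR/bin/utils.sh"

# gatherSparkSubmitOpts refuses to run unless SUBMIT_USAGE_FUNCTION is set.
function usage() {
  echo "Usage: ./test-gather.sh [options]"
  exit 0
}
SUBMIT_USAGE_FUNCTION=usage

gatherSparkSubmitOpts --master "local[2]" --name demo --verbose appArg1 appArg2

echo "SUBMISSION_OPTS:  ${SUBMISSION_OPTS[@]}"
echo "APPLICATION_OPTS: ${APPLICATION_OPTS[@]}"
# Expected output:
#   SUBMISSION_OPTS:  --master local[2] --name demo --verbose
#   APPLICATION_OPTS: appArg1 appArg2

Incidentally, export on a bash array does not actually copy it into the environment of child processes; the two arrays work here only because spark-shell sources utils.sh and therefore reads them in the same shell.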