In the load-spark-env.sh script under Spark's bin directory, there are the following statements:
```shell
if [ -f "${user_conf_dir}/spark-env.sh" ]; then
  # Promote all variable declarations to environment (exported) variables
  set -a
  . "${user_conf_dir}/spark-env.sh"
  set +a
fi
```
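To see why Spark wraps the sourcing in set -a / set +a, here is a minimal sketch (the conf file and the DEMO_* variable names are made up for illustration, not taken from Spark): every plain assignment in the sourced file becomes an exported environment variable, visible to any process the shell launches afterwards.

```shell
#!/usr/bin/env bash
# Create a stand-in for spark-env.sh (names here are hypothetical).
conf=$(mktemp)
cat > "$conf" <<'EOF'
DEMO_MEM=2g
DEMO_CORES=4
EOF

# Sourcing the file between set -a / set +a exports every assignment,
# so child processes (e.g. the JVM that Spark launches) inherit them.
set -a
. "$conf"
set +a

bash -c 'echo "child sees DEMO_MEM=$DEMO_MEM DEMO_CORES=$DEMO_CORES"'
rm -f "$conf"
```

Without the set -a / set +a pair, the assignments in spark-env.sh would only exist in the sourcing shell and the launched JVM would never see them.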
Not being very familiar with set -a, I ran the following small test.
There are two shell scripts. test.sh is as follows:

```shell
#!/usr/bin/env bash
set -a
SPARK_HOME=aaaa
set +a
STORM_HOME=cccc
echo $SPARK_HOME
echo $STORM_HOME
./test2.sh
echo $SPARK_HOME
echo $STORM_HOME
```
test2.sh is as follows:

```shell
#!/usr/bin/env bash
echo $SPARK_HOME
echo $STORM_HOME
SPARK_HOME=ddd
STORM_HOME=eee
```
The result of running test.sh is as follows (the blank line is test2.sh echoing STORM_HOME, which it cannot see):

```
aaaa
cccc
aaaa

aaaa
cccc
```
You can see that SPARK_HOME, assigned while set -a was in effect, is visible in the second bash process. That is exactly what set -a means: every variable assigned while it is active is automatically exported, so scripts launched from this shell can read it too. As with export, the child can only read the value; test2.sh assigns SPARK_HOME=ddd, but that change never reaches the parent.
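As a quick sanity check (the FOO/BAR names are mine, chosen for illustration), set -a behaves just like prefixing each assignment with export: the child process can read the value, but its own assignment does not flow back.

```shell
#!/usr/bin/env bash
set -a
FOO=hello          # exported automatically because set -a is active
set +a
BAR=world          # ordinary shell variable, not exported

# The child bash sees FOO but not BAR; its assignment to FOO stays local.
bash -c 'echo "child: FOO=$FOO BAR=$BAR"; FOO=changed'
echo "parent: FOO=$FOO"
```

This prints `child: FOO=hello BAR=` followed by `parent: FOO=hello`, confirming both directions: BAR never reaches the child, and the child's `FOO=changed` never reaches the parent.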
In addition, even without set -a, a subshell (commands grouped in parentheses) can access all of the parent shell's variables, though its modifications are likewise lost when it exits. The drawback is that the subshell sees every variable of the parent shell, so there is no way to control the scope.
That is, if test.sh is modified to source test2.sh inside a subshell:

```shell
#!/usr/bin/env bash
set -a
SPARK_HOME=aaaa
set +a
STORM_HOME=cccc
echo $SPARK_HOME
echo $STORM_HOME
( . ./test2.sh )
echo $SPARK_HOME
echo $STORM_HOME
```
The output is as follows; this time the subshell sees STORM_HOME too, and its reassignments still do not affect the parent:

```
aaaa
cccc
aaaa
cccc
aaaa
cccc
```
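The difference between the two runs comes down entirely to how test2.sh is invoked. A compact comparison (VAR is a made-up name for illustration):

```shell
#!/usr/bin/env bash
VAR=abc                                  # ordinary variable, not exported

# A separate process only receives exported variables, so VAR is empty there.
bash -c 'echo "new process: VAR=$VAR"'

# A subshell inherits every variable of the parent, exported or not.
( echo "subshell:    VAR=$VAR" )

# But an assignment made in a subshell dies with it.
( VAR=xyz )
echo "parent after: VAR=$VAR"
```

This prints an empty VAR for the new process, `abc` for the subshell, and `abc` again for the parent after the subshell's assignment.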
  
Summary: the shell has more subtleties than it appears. Since it is usually treated as just a scripting tool, we rarely pay attention to them; when you run into something you don't understand, the best approach is to test it.
Reference: http://stackoverflow.com/questions/9772036/pass-all-variables-from-one-shellscript-to-another