Installation:
1. Download http://d3kbcqa49mib13.cloudfront.net/spark-2.0.1-bin-hadoop2.6.tgz
2. Install the master on the 192.168.8.94 machine: extract the files and run start-master.sh from the sbin directory (bash sbin/start-master.sh). After a successful start, the master web UI (port 8080 by default) can be opened in a browser.
3. Install the worker:
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://192.168.8.94:7077 -c 4 -m 2g
The -c parameter specifies the number of cores and the -m parameter specifies the amount of memory the worker makes available to the cluster.
Installation Complete
Use:
1. Run the PySpark shell. For example, to run the shell with 6 CPUs in total and 2 CPUs per executor, use the following command:
pyspark --master spark://192.168.8.94:7077 --total-executor-cores 6 --executor-cores 2
Result:
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.0.2
      /_/

Using Python version 2.6.6 (r266:84292, Jul 15:22:56)
SparkSession available as 'spark'.
>>>
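Inside the shell, Spark predefines sc (the SparkContext) alongside the spark SparkSession. A minimal sanity check that the allocated executors are actually doing work might look like this (the numbers are illustrative, not from the original run):

>>> rdd = sc.parallelize(range(1000), 6)  # 6 partitions, one per allocated core
>>> rdd.sum()
499500
>>> spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"]).show()
+---+-----+
| id|value|
+---+-----+
|  1|    a|
|  2|    b|
+---+-----+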
2. Submit a task with spark-submit. For example, to submit a task with 6 CPUs in total and 2 CPUs per executor, use the following command:
[gcadmin006@cnhbase111 ~]$ spark-submit --master spark://172.17.13.111:7077 --total-executor-cores 6 --executor-cores 2 hbase_to_cloudhbase_prodesc.py
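The contents of hbase_to_cloudhbase_prodesc.py are not shown above. As a hypothetical stand-in, a minimal PySpark script that could be submitted the same way only needs to create its own SparkSession, since the shell's predefined spark and sc are not available in a submitted program:

# minimal_job.py - a hypothetical stand-in, not the hbase_to_cloudhbase_prodesc.py above
from pyspark.sql import SparkSession

if __name__ == "__main__":
    # A submitted script must build its own SparkSession; the master URL is
    # taken from spark-submit's --master option, so it is not hard-coded here.
    spark = SparkSession.builder.appName("minimal_job").getOrCreate()

    # Trivial work so the job visibly runs on the allocated executors.
    total = spark.sparkContext.parallelize(range(1000), 6).sum()
    print("sum = %d" % total)

    spark.stop()

Such a script would be submitted with the same flags as the example above, e.g. spark-submit --master spark://172.17.13.111:7077 --total-executor-cores 6 --executor-cores 2 minimal_job.py.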