Spark Environment Setup (standalone cluster mode)


Reference articles

Spark Standalone Mode (official documentation)
Single-machine Spark on Mac: brief notes
Big data weapon: Spark standalone deployment and test notes (Spark 2.0.0)

Download the latest version

At the time of writing, the latest release is 2.0.0 (the spark-2.0.0-bin-hadoop2.7 build is used below).

Unpack the archive and change into the extracted directory.

Standalone mode uses a master-worker architecture. Start the master to simulate a cluster on a single machine:

> sbin/start-master.sh
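If the default host name or ports clash with other services, the master can be configured through conf/spark-env.sh (copy conf/spark-env.sh.template to conf/spark-env.sh first). A minimal sketch; the values below are example choices, not settings from this walkthrough:

```shell
# conf/spark-env.sh -- sourced by the start scripts
# Bind the master to a fixed host name instead of the machine's default
SPARK_MASTER_HOST=localhost
# RPC port workers connect to (default 7077)
SPARK_MASTER_PORT=7077
# Port for the master web UI (default 8080)
SPARK_MASTER_WEBUI_PORT=8080
```

Re-run sbin/start-master.sh after editing for the settings to take effect.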

Open http://localhost:8080/ in a browser. The master web UI shows the cluster status and, near the top, the master URL (spark://<hostname>:7077).

Start the slave

Start a worker process, pointing it at the master URL shown in the web UI:

> sbin/start-slave.sh spark://doctorqdemacbook-pro.local:7077
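By default the worker offers all cores and most of the machine's memory to the cluster. To cap its resources, set the worker variables in conf/spark-env.sh; a sketch with example values of our own choosing:

```shell
# conf/spark-env.sh -- worker resource limits
# Cores this worker advertises to the master
SPARK_WORKER_CORES=2
# Total memory the worker can hand out to executors
SPARK_WORKER_MEMORY=2g
```

Restart the worker (sbin/stop-slave.sh, then sbin/start-slave.sh again) to apply the limits.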

Refresh http://localhost:8080/ and the new worker now appears in the Workers table.

Run an example

Submit the bundled SparkPi example from the shell:

> ./bin/run-example SparkPi 10

Output (abridged):

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/09/01 22:54:29 INFO SparkContext: Running Spark version 2.0.0
16/09/01 22:54:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/01 22:54:29 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(doctorq); users with modify permissions: Set(doctorq)
16/09/01 22:54:30 INFO Utils: Successfully started service 'sparkDriver' on port 62953.
16/09/01 22:54:30 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
16/09/01 22:54:30 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/09/01 22:54:30 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.0.101:4040
16/09/01 22:54:30 INFO SparkContext: Added JAR .../examples/jars/spark-examples_2.11-2.0.0.jar at spark://192.168.0.101:62953/jars/spark-examples_2.11-2.0.0.jar with timestamp 1472741670649
16/09/01 22:54:30 INFO Executor: Starting executor ID driver on host localhost
16/09/01 22:54:31 INFO SparkContext: Starting job: reduce at SparkPi.scala:38
16/09/01 22:54:31 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
16/09/01 22:54:31 INFO TaskSchedulerImpl: Adding task set 0.0 with 10 tasks
16/09/01 22:54:31 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0, PROCESS_LOCAL, 5478 bytes)
...
16/09/01 22:54:32 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1032 bytes result sent to driver
16/09/01 22:54:32 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 515 ms on localhost (1/10)
... (remainder of the log truncated in the original)
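SparkPi estimates pi by Monte Carlo sampling: it throws random points into the unit square and counts how many land inside the quarter circle, so pi is roughly 4 times that fraction. A minimal plain-Python sketch of the same computation, no Spark required (the function name and sample count are our own choices):

```python
import random


def estimate_pi(num_samples: int, seed: int = 42) -> float:
    """Monte Carlo estimate of pi.

    Sample points uniformly in the unit square and count the fraction
    that falls inside the quarter circle x^2 + y^2 <= 1.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = 0
    for _ in range(num_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / num_samples


if __name__ == "__main__":
    print(estimate_pi(100_000))
```

In SparkPi the sampling loop is parallelized: the samples are split across the 10 partitions requested on the command line, each task counts its own hits, and a reduce sums them on the driver.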
