Start spark-shell locally


Spark 1.3, as a milestone release, added many features, so it is worth studying. Spark 1.3 requires Scala 2.10.x; on this system the default Scala version is 2.9, so it must be upgraded first (see "Install Scala 2.10.x on Ubuntu"). After configuring the Scala environment, download the Spark CDH build.

After the download completes, decompress the package and run ./spark-shell from the bin directory.
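The download-and-launch steps above can be sketched as a few shell commands. The archive name is assumed here; substitute the exact CDH build you downloaded:

```shell
# Assumed archive name; adjust to the file you actually downloaded.
tar -xzf spark-1.3.0-bin-cdh4.tgz
cd spark-1.3.0-bin-cdh4

# Optional: pin the bind address if your hostname resolves to a loopback
# address (spark-shell prints a WARN suggesting SPARK_LOCAL_IP in that case).
export SPARK_LOCAL_IP=172.18.147.71

./bin/spark-shell
```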

The log is as follows:

zhangchao3@hadoop01:~/spark-evn/spark-1.3.0-bin-cdh4/bin$ ./spark-shell
Spark assembly has been built with Hive, including Datanucleus jars on classpath
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/04/14 00:03:30 INFO SecurityManager: Changing view acls to: zhangchao3
15/04/14 00:03:30 INFO SecurityManager: Changing modify acls to: zhangchao3
15/04/14 00:03:30 INFO SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(zhangchao3); users with modify permissions: Set(zhangchao3)
15/04/14 00:03:30 INFO HttpServer: Starting HTTP Server
15/04/14 00:03:30 INFO Server: jetty-8.y.z-SNAPSHOT
15/04/14 00:03:30 INFO AbstractConnector: Started SocketConnector@0.0.0.0:45918
15/04/14 00:03:30 INFO Utils: Successfully started service 'HTTP class server' on port 45918.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.3.0
      /_/

Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
Type in expressions to have them evaluated.
Type :help for more information.
15/04/14 00:03:33 WARN Utils: Your hostname, hadoop01 resolves to a loopback address: 127.0.1.1; using 172.18.147.71 instead (on interface em1)
15/04/14 00:03:33 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/04/14 00:03:33 INFO SparkContext: Running Spark version 1.3.0
15/04/14 00:03:33 INFO SecurityManager: Changing view acls to: zhangchao3
15/04/14 00:03:33 INFO SecurityManager: Changing modify acls to: zhangchao3
15/04/14 00:03:33 INFO SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(zhangchao3); users with modify permissions: Set(zhangchao3)
15/04/14 00:03:33 INFO Slf4jLogger: Slf4jLogger started
15/04/14 00:03:33 INFO Remoting: Starting remoting
15/04/14 00:03:33 INFO Remoting: Remoting started; listening on addresses: [akka.tcp://sparkDriver@172.18.147.71:51629]
15/04/14 00:03:33 INFO Utils: Successfully started service 'sparkDriver' on port 51629.
15/04/14 00:03:33 INFO SparkEnv: Registering MapOutputTracker
15/04/14 00:03:33 INFO SparkEnv: Registering BlockManagerMaster
15/04/14 00:03:33 INFO DiskBlockManager: Created local directory at /tmp/spark-d398c8f3-6345-41f9-a712-36cad4a45e67/blockmgr-255070a6-19a9-49a5-a117-e4e8733c250a
15/04/14 00:03:33 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
15/04/14 00:03:33 INFO HttpFileServer: HTTP File server directory is /tmp/spark-296eb142-92fc-46e9-bea8-f6065aa8f49d/httpd-4d6e4295-dd96-48bc-84b8-c26815a9364f
15/04/14 00:03:33 INFO HttpServer: Starting HTTP Server
15/04/14 00:03:33 INFO Server: jetty-8.y.z-SNAPSHOT
15/04/14 00:03:33 INFO AbstractConnector: Started SocketConnector@0.0.0.0:56529
15/04/14 00:03:33 INFO Utils: Successfully started service 'HTTP file server' on port 56529.
15/04/14 00:03:33 INFO SparkEnv: Registering OutputCommitCoordinator
15/04/14 00:03:33 INFO Server: jetty-8.y.z-SNAPSHOT
15/04/14 00:03:33 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/04/14 00:03:33 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/04/14 00:03:33 INFO SparkUI: Started SparkUI at http://172.18.147.71:4040
15/04/14 00:03:33 INFO Executor: Starting executor ID <driver> on host localhost
15/04/14 00:03:33 INFO Executor: Using REPL class URI: http://172.18.147.71:45918
15/04/14 00:03:33 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@172.18.147.71:51629/user/HeartbeatReceiver
15/04/14 00:03:33 INFO NettyBlockTransferService: Server created on 55429
15/04/14 00:03:33 INFO BlockManagerMaster: Trying to register BlockManager
15/04/14 00:03:33 INFO BlockManagerMasterActor: Registering block manager localhost:55429 with 265.4 MB RAM, BlockManagerId(<driver>, localhost, 55429)
15/04/14 00:03:33 INFO BlockManagerMaster: Registered BlockManager
15/04/14 00:03:34 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/04/14 00:03:34 INFO SparkILoop: Created SQL context (with Hive support )..
SQL context available as sqlContext.

scala>

Open http://172.18.147.71:4040/jobs/ in a browser to see the Spark running status.
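Once the scala> prompt appears, the predefined sc context can be exercised with a small job; any job run here will also appear on the 4040 jobs page. A minimal sketch:

```scala
// Run inside spark-shell, where `sc` (a SparkContext) is predefined.
val nums = sc.parallelize(1 to 100)      // distribute a local range across partitions
val sumOfSquares = nums
  .map(n => n * n)                       // square each element in parallel
  .reduce(_ + _)                         // combine partial sums on the driver
println(sumOfSquares)                    // 338350
```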

