Testing Spark through the Spark Shell

Source: Internet
Author: User

STEP 1: Start the Spark cluster (covered in detail in the third lecture). After startup, the WebUI looks as follows:
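For a standalone deployment, the cluster is typically brought up with the startup script that ships with Spark. This is a sketch: the `SPARK_HOME` path and the hostname `master` are assumptions about the setup, and in very old Spark releases the script lived under `bin/` rather than `sbin/`.

```shell
# Start the standalone master plus all workers listed in conf/slaves
cd $SPARK_HOME            # SPARK_HOME: assumed location of the Spark installation
./sbin/start-all.sh

# The master's WebUI is served on port 8080 by default:
#   http://master:8080
```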

STEP 2: Start the Spark shell:
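A sketch of launching the shell against the standalone master; `spark://master:7077` is the default standalone master URL, with the hostname `master` assumed from this setup.

```shell
# Launch the interactive Scala shell against the standalone master
# (older Spark releases used the MASTER environment variable instead:
#   MASTER=spark://master:7077 ./bin/spark-shell)
./bin/spark-shell --master spark://master:7077
```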

You can now inspect the shell's status through the following Web console:

STEP 3: Copy "README.md" from the Spark installation directory to HDFS

Start a new command terminal on the master node and go to the Spark installation directory:

We copy the file to the root directory of HDFS:
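A sketch of the upload, assuming the `hadoop` command is on the PATH and HDFS is already running:

```shell
# Upload README.md from the Spark installation directory to the HDFS root
hadoop fs -put README.md /

# List the root directory to confirm the upload
hadoop fs -ls /
```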

Looking at the Web console, we find that the file has been uploaded to HDFS successfully:

STEP 4: Use the Spark shell to write code that operates on the "README.md" we uploaded:

First, let's look at sc, an environment variable that the shell creates for us automatically:

As you can see, sc is an instance of SparkContext, generated automatically by the system when the Spark shell starts. SparkContext is the channel through which code is submitted to a cluster or run locally; any Spark program we write, whether it runs locally or on a cluster, must have a SparkContext instance.
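Evaluating sc at the prompt prints something like the following sketch; the exact output varies by Spark version, and `<object-id>` stands in for the instance's identity hash:

```scala
scala> sc
res0: org.apache.spark.SparkContext = org.apache.spark.SparkContext@<object-id>
```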

Next, we read the file "README.md":
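A sketch of the read; the variable name `file` follows the article's description, while the namenode address `hdfs://master:9000` is an assumption about the cluster's configuration:

```scala
// Read the uploaded README.md from the HDFS root into an RDD
val file = sc.textFile("hdfs://master:9000/README.md")
```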

We assign the result of the read to file, which is in fact a MappedRDD; in Spark code, everything is based on operations over RDDs.

Next, we filter out all the lines containing the word "Spark" from the file we read:
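A sketch of the filter step, assuming the file was read into an RDD named `file` as above; the name `sparkLines` is illustrative:

```scala
// Keep only the lines that contain "Spark"; filter is a transformation,
// so nothing is computed yet -- it just builds a new (Filtered)RDD
val sparkLines = file.filter(line => line.contains("Spark"))
```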

This generates a FilteredRDD.

Next, let's count how many times "Spark" appears:
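The count itself is a sketch as well, assuming the filtered RDD was bound to an illustrative name like `sparkLines`; count is an action, so this is the point where a job is actually submitted to the cluster:

```scala
// Trigger the computation and return the number of matching lines
sparkLines.count()
```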

From the execution result, we find that the word "Spark" appears 15 times in total.

At this point, we look at the Web console of the Spark shell:

The console shows that we have submitted a job and that it completed successfully; clicking on the job shows its execution details:

So how do we verify that the Spark shell's count of 15 occurrences of "Spark" in README.md is correct? The method is very simple: we can use the wc command that ships with Ubuntu to do the counting, as follows:
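One way to do this with grep and wc, shown here on a small inline sample so the pipeline is self-contained; running the same pipeline on README.md performs the actual check. Note that this counts lines containing "Spark", matching how the filter in the shell works:

```shell
# Build a small sample file (stand-in for README.md)
printf '%s\n' "Apache Spark" "Spark shell example" "no match here" > /tmp/sample.txt

# Count lines containing "Spark" via a grep | wc pipeline, as in the article
grep "Spark" /tmp/sample.txt | wc -l    # → 2

# grep -c is an equivalent shortcut
grep -c "Spark" /tmp/sample.txt         # → 2
```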

The result here is also 15, matching the Spark shell's count.
