Step 4: Build and test the Spark development environment through the Spark IDE
Step 1: Import the spark-hadoop jar. Select "File" > "Project Structure" > "Libraries", then click "+" and choose the spark-hadoop jar to import:
Click "OK" to confirm:
Click "OK ":
Once IDEA finishes processing, we will find that the Spark jar package has been imported into our project:
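As an aside, instead of importing the jar by hand through the GUI, the same dependency could be declared in an sbt build file. Below is a minimal sketch of that alternative; the project name, Scala version, and Spark version are placeholders and should be adjusted to match the actual cluster build:

// build.sbt -- hypothetical sketch; versions here are assumptions, not from the original article
name := "FirstScalaProject"

// Use the Scala version your Spark distribution was built against
scalaVersion := "2.10.4"

// Pulls in spark-core from Maven instead of hand-importing the spark-hadoop jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"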
Step 2: Develop our first Spark program. Open the examples directory that ships with Spark:
Here we find a large number of source files; these are the examples that Spark provides for us.
Create a Scala object named SparkPi under the src directory of our first Scala project:
Open the SparkPi file under Spark's examples directory:
Copy the contents of this file directly into the SparkPi object we just created in IDEA:
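For reference, the SparkPi example shipped with Spark looks roughly like the following (a sketch based on the Spark 1.x source; the exact code varies slightly between Spark versions). It estimates pi by Monte Carlo sampling: random points are thrown into the unit square, and the fraction landing inside the unit circle approximates pi/4:

import scala.math.random
import org.apache.spark._

/** Computes an approximation of pi via Monte Carlo sampling. */
object SparkPi {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Spark Pi")
    val spark = new SparkContext(conf)
    // Number of partitions; more slices means more parallelism
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = 100000 * slices
    // For each sample, draw (x, y) uniformly in [-1, 1) x [-1, 1)
    // and count it if it falls inside the unit circle.
    val count = spark.parallelize(1 to n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}

The program parallelizes the sample indices across the cluster, maps each to 0 or 1, and reduces with addition, so the driver only receives the final count.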