Spark Application Third-Party JAR File Dependency Solutions

Source: Internet
Author: User

The First Way

Action: Package the third-party JAR files into the resulting Spark application JAR file (an uber/fat JAR)

Scenario: The third-party JAR files are relatively small and are needed in only a few places
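As a sketch of this first way, assuming a Maven project with the maven-shade-plugin already configured (sbt-assembly works similarly), packaging and submission might look like the following. The class name and JAR path are hypothetical examples:

```shell
# Build an uber JAR that bundles the third-party dependencies into the
# application JAR itself (assumes maven-shade-plugin is configured).
mvn clean package

# Submit the bundled JAR; no extra dependency flags are needed because
# the third-party classes are already inside the application JAR.
bin/spark-submit \
  --class com.example.MyApp \
  target/my-app-1.0-shaded.jar
```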

The Second Way

Action: Pass the --jars parameter to the spark-submit command

Requirements:

1. The corresponding JAR file must exist on the machine from which the spark-submit command is run

2. When services on other machines in the cluster need the JAR file, they obtain it through an HTTP interface served by the driver (for example: http://192.168.187.146:50206/jars/mysql-connector-java-5.1.27-bin.jar, shown as "Added By User" in the Spark UI)

## Configuration parameter: --jars JARS
## Example: $ bin/spark-shell --jars /opt/cdh-5.3.6/hive/lib/mysql-connector-java-5.1.27-bin.jar

Scenario: The corresponding JAR file must be available locally on the submitting machine
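Multiple JAR files can be passed to --jars as a comma-separated list. A sketch, where the second JAR path is a hypothetical placeholder:

```shell
# Each listed JAR must exist on the machine running the command;
# the driver then serves them to the executors over HTTP.
# Note: the list is comma-separated, with no spaces between entries.
bin/spark-shell \
  --jars /opt/cdh-5.3.6/hive/lib/mysql-connector-java-5.1.27-bin.jar,/opt/libs/other-dependency.jar
```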

The Third Way

Action: Pass the --packages parameter to the spark-submit command


## Configuration parameter: --packages takes the Maven coordinates of the package. Example: $ bin/spark-shell --packages mysql:mysql-connector-java:5.1.27 --repositories http://maven.aliyun.com/nexus/content/groups/public/

## --repositories gives the Maven repository address for the mysql-connector-java package; if it is not given, the package is downloaded from the default Maven source configured on the machine
## If you depend on multiple packages, repeat the coordinates above, separated by commas
## By default, downloaded packages are placed in the .ivy2/jars folder under the current user's home directory

Scenario: The package is not available locally; when services in the cluster need it, it is downloaded directly from the given Maven address
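A sketch of passing several packages at once, comma-separated. The second coordinate (joda-time) is only an illustrative addition, not from the original example:

```shell
# Multiple Maven coordinates (groupId:artifactId:version) are separated
# by commas. Downloaded packages are cached under ~/.ivy2/jars, so
# subsequent runs do not re-download them.
bin/spark-shell \
  --packages mysql:mysql-connector-java:5.1.27,joda-time:joda-time:2.9.9 \
  --repositories http://maven.aliyun.com/nexus/content/groups/public/
```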

The Fourth Way

Action: Change the Spark configuration: add the third-party JAR files to the SPARK_CLASSPATH environment variable

Note: The added third-party JAR files must exist on every machine that runs the Spark application

A. Create a folder for the external JAR files. Command: $ mkdir external_jars
B. Modify the Spark configuration. Command: $ vim conf/spark-env.sh  Content to add: SPARK_CLASSPATH=$SPARK_CLASSPATH:/opt/cdh-5.3.6/spark/external_jars/*
C. Copy the dependent JAR files into the new folder. Command: $ cp /opt/cdh-5.3.6/hive/lib/mysql-connector-java-5.1.27-bin.jar ./external_jars/

Scenario: There are many dependent JAR packages, making the command-line approach cumbersome, and many applications depend on the same packages

Note (for Spark on YARN (cluster) mode only): When a Spark on YARN (cluster) application depends on third-party JAR files, the final solution is to copy the third-party JAR files into the ${HADOOP_HOME}/share/hadoop/common/lib folder (the copy must be done on all machines in the Hadoop cluster)
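The copy to every cluster machine can be scripted, for example with scp. A minimal sketch; the hostnames and the Hadoop installation path are hypothetical and must be adjusted to your cluster:

```shell
# Distribute the JAR into Hadoop's common lib folder on every node.
# Hostnames below are placeholders; adjust the path to your HADOOP_HOME.
JAR=/opt/cdh-5.3.6/hive/lib/mysql-connector-java-5.1.27-bin.jar
for host in node1 node2 node3; do
  scp "$JAR" "$host:/opt/cdh-5.3.6/hadoop/share/hadoop/common/lib/"
done
```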

