Finding the minimal static dependency set of a JAR package


1. First, use the rundotsummary.bat command of the open source JarAnalyzer tool to analyze the JAR packages in the target directory and generate a hehe.grph dependency graph file. The tool can be downloaded here:

http://www.kirkk.com/main/zip/JarAnalyzer-1.2.zip
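As a rough sketch of the invocation: I am assuming here that the batch script takes the directory of JARs and the output file as its two arguments; the exact parameters may differ between versions, so check the script in your JarAnalyzer download. The lib path below is a placeholder:

rundotsummary.bat D:\hadoop\lib hehe.grph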

Take the WordCount example of Hadoop 1.1.2 as an example. The generated hehe.grph file looks like this:

digraph G {
    commons_beanutils_1_7_0 -> commons_collections_3_2_1;
    commons_beanutils_1_7_0 -> commons_logging_1_1_1;
    commons_configuration_1_6 -> commons_lang_2_4;
    commons_configuration_1_6 -> commons_logging_1_1_1;
    commons_configuration_1_6 -> commons_beanutils_1_7_0;
    commons_configuration_1_6 -> commons_collections_3_2_1;
    commons_configuration_1_6 -> commons_digester_1_8;
    commons_configuration_1_6 -> commons_codec_1_4;
    commons_digester_1_8 -> commons_beanutils_1_7_0;
    commons_digester_1_8 -> commons_logging_1_1_1;
    commons_httpclient_3_0_1 -> commons_logging_1_1_1;
    commons_httpclient_3_0_1 -> commons_codec_1_4;
    commons_logging_1_1_1 -> log4j_1_2_15;
    hadoop_core_1_1_2 -> jackson_core_asl_1_8_8;
    hadoop_core_1_1_2 -> commons_logging_1_1_1;
    hadoop_core_1_1_2 -> commons_io_2_1;
    hadoop_core_1_1_2 -> commons_net_3_1;
    hadoop_core_1_1_2 -> jetty_util_6_1_26;
    hadoop_core_1_1_2 -> jetty_6_1_26;
    hadoop_core_1_1_2 -> commons_daemon_1_0_1;
    hadoop_core_1_1_2 -> jasper_runtime_5_5_12;
    hadoop_core_1_1_2 -> commons_cli_1_2;
    hadoop_core_1_1_2 -> commons_codec_1_4;
    hadoop_core_1_1_2 -> log4j_1_2_15;
    hadoop_core_1_1_2 -> jackson_mapper_asl_1_8_8;
    hadoop_core_1_1_2 -> commons_httpclient_3_0_1;
    hadoop_core_1_1_2 -> commons_lang_2_4;
    hadoop_core_1_1_2 -> commons_configuration_1_6;
    hadoop_core_1_1_2 -> commons_math_2_1;
    hadoop_core_1_1_2 -> slf4j_api_1_4_3;
    jackson_mapper_asl_1_8_8 -> jackson_core_asl_1_8_8;
    jasper_runtime_5_5_12 -> commons_logging_1_1_1;
    jasper_runtime_5_5_12 -> commons_el_1_0;
    jetty_6_1_26 -> jetty_util_6_1_26;
    jetty_util_6_1_26 -> slf4j_api_1_4_3;
    slf4j_api_1_4_3 -> slf4j_log4j12_1_4_3;
    slf4j_log4j12_1_4_3 -> slf4j_api_1_4_3;
    slf4j_log4j12_1_4_3 -> log4j_1_2_15;
}
2. Next, analyze the .grph file generated by JarAnalyzer with the following program:

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.Map;

// Parses the .grph file generated by JarAnalyzer and prints the
// transitive (minimal static) dependency set of a root JAR.
public class GetMinJar {

    // Maps each JAR node to the set of JARs it depends on directly.
    private static HashMap<String, LinkedHashSet<String>> lines = new HashMap<>();

    public static void main(String[] args) {
        try {
            String line;
            String rootData = "hadoop_core_1_1_2";
            String grphPath = "hehe.grph";
            if (args.length > 1) {
                rootData = args[0];
                grphPath = args[1];
            }
            if (args.length != 2) {
                System.out.println("Usage: java GetMinJar hadoop_core_1_1_2 hehe.grph");
            }

            // Read the graph: every edge line looks like "a -> b;".
            FileReader fileReader = new FileReader(grphPath);
            BufferedReader bufferedReader = new BufferedReader(fileReader);
            while ((line = bufferedReader.readLine()) != null) {
                String[] datas = line.split("->|;");
                if (datas.length > 1) {
                    String key = datas[0].trim();
                    LinkedHashSet<String> val = lines.get(key);
                    if (val != null) {
                        val.add(datas[1].trim());
                    } else {
                        LinkedHashSet<String> set = new LinkedHashSet<String>();
                        set.add(datas[1].trim());
                        lines.put(key, set);
                    }
                }
            }
            bufferedReader.close();
            fileReader.close();

            LinkedHashMap<String, String> results = new LinkedHashMap<>();
            hehe(rootData, results);

            System.out.println(rootData.replaceAll("_", "-") + ".jar depends on "
                    + results.size() + " JAR packages ===========");
            for (Map.Entry<String, String> entry : results.entrySet()) {
                String key = entry.getKey();
                key = key.replaceAll("_", "-");
                System.out.println(key + ".jar");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // Recursively collects the transitive dependencies of data into results.
    private static void hehe(String data, LinkedHashMap<String, String> results) {
        LinkedHashSet<String> val = lines.get(data);
        if (val != null) {
            for (String string : val) {
                results.put(string, "");
                // Guard against infinite recursion on direct two-node cycles
                // (e.g. slf4j_api_1_4_3 <-> slf4j_log4j12_1_4_3).
                LinkedHashSet<String> tmp = lines.get(string);
                if (tmp != null && tmp.contains(data)) {
                    continue;
                }
                hehe(string, results);
            }
        }
    }
}
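To use it, compile the class and pass the root node name and the .grph file on the command line:

javac GetMinJar.java
java GetMinJar hadoop_core_1_1_2 hehe.grph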

Note that this only finds the minimal set of statically dependent JAR packages; dependencies that are loaded dynamically (via reflection, for example) are not known until the program actually runs. Test carefully before removing JAR packages that merely appear to be unused.
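One way to verify this at runtime is the JVM's -verbose:class flag, which prints every class as it is loaded. The classpath and main class below are placeholders for your own job:

java -verbose:class -cp "minimal-lib/*:wordcount.jar" WordCount input output

If a class from a removed JAR is actually needed, the run fails with a ClassNotFoundException or NoClassDefFoundError naming the missing class, which tells you which JAR to restore.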

For the WordCount example of Hadoop 1.1.2, the minimal dependency set contains the following JAR packages:

commons-beanutils-1.7.0.jar
commons-cli-1.2.jar
commons-codec-1.4.jar
commons-collections-3.2.1.jar
commons-configuration-1.6.jar
commons-daemon-1.0.1.jar
commons-digester-1.8.jar
commons-el-1.0.jar
commons-httpclient-3.0.1.jar
commons-io-2.1.jar
commons-lang-2.4.jar
commons-logging-1.1.1.jar
commons-math-2.1.jar
commons-net-3.1.jar
hadoop-core-1.1.2.jar
jackson-core-asl-1.8.8.jar
jackson-mapper-asl-1.8.8.jar
jasper-runtime-5.5.12.jar
jetty-6.1.26.jar
jetty-util-6.1.26.jar
log4j-1.2.15.jar
slf4j-api-1.4.3.jar
slf4j-log4j12-1.4.3.jar

