Teaching you from scratch: how to obtain the Hadoop 2.4 source code and link it in Eclipse


Original post: http://www.aboutyun.com/thread-8211-1-1.html (source: about雲開發)

 

Guiding questions:
1. How do you obtain the complete Hadoop code from the official src package?
2. What steps let you view how a particular Hadoop class or function is implemented?
3. What is Maven for?






If you want to do development, studying the source code helps enormously. Without understanding the principles you are working with a black box, and when a problem comes up you have nothing to go on. So this post teaches two things:
Part 1. How to obtain the source code
Part 2. How to link the source in Eclipse

Part 1. How to obtain the source code

1. Download the Hadoop source package

(1) From the official site
First download the source package hadoop-2.4.0-src.tar.gz (a Maven project) from the official Hadoop website.

(If you do not know how to download from the official site, see: Beginner's guide: the Hadoop website, how to download each Hadoop (2.4) release, and how to view the Hadoop API.)

(2) From a network drive
You can also download it from Baidu Pan (a command-line download is sketched below):
http://pan.baidu.com/s/1kToPuGB
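
If you prefer to fetch it from the command line, the release is also kept on the Apache archive. A minimal sketch, assuming curl is installed and that the URL below (which follows Apache's standard dist layout for old Hadoop releases) is still live:

    curl -O https://archive.apache.org/dist/hadoop/common/hadoop-2.4.0/hadoop-2.4.0-src.tar.gz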

2. Obtain the source via Maven
There are two ways to do this: from the command line, or from inside Eclipse. Here we cover the command-line approach.

Obtaining the source from the command line:
1. Unpack the archive

 


While unpacking I ran into the errors below. They can be ignored: they come from Windows' 260-character limit on the combined path and file name (MAX_PATH), and they affect only a handful of precompiled .class files under target\ folders, which we do not need. A workaround is sketched after the list.

    1: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-applicationhistoryservice\target\classes\org\apache\hadoop\yarn\server\applicationhistoryservice\ApplicationHistoryClientService$ApplicationHSClientProtocolHandler.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified. D:\hadoop2\hadoop-2.4.0-src.zip
    2: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-applicationhistoryservice\target\classes\org\apache\hadoop\yarn\server\applicationhistoryservice\timeline\LeveldbTimelineStore$LockMap$CountingReentrantLock.class: The system cannot find the path specified. D:\hadoop2\hadoop-2.4.0-src.zip
    3: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-applicationhistoryservice\target\test-classes\org\apache\hadoop\yarn\server\applicationhistoryservice\webapp\TestAHSWebApp$MockApplicationHistoryManagerImpl.class: The system cannot find the path specified. D:\hadoop2\hadoop-2.4.0-src.zip
    4: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\monitor\capacity\TestProportionalCapacityPreemptionPolicy$IsPreemptionRequestFor.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified. D:\hadoop2\hadoop-2.4.0-src.zip
    5: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\recovery\TestFSRMStateStore$TestFSRMStateStoreTester$TestFileSystemRMStore.class: The system cannot find the path specified. D:\hadoop2\hadoop-2.4.0-src.zip
    6: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\recovery\TestZKRMStateStore$TestZKRMStateStoreTester$TestZKRMStateStoreInternal.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified. D:\hadoop2\hadoop-2.4.0-src.zip
    7: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\recovery\TestZKRMStateStoreZKClientConnections$TestZKClient$TestForwardingWatcher.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified. D:\hadoop2\hadoop-2.4.0-src.zip
    8: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\recovery\TestZKRMStateStoreZKClientConnections$TestZKClient$TestZKRMStateStore.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified. D:\hadoop2\hadoop-2.4.0-src.zip
    9: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\rmapp\attempt\TestRMAppAttemptTransitions$TestApplicationAttemptEventDispatcher.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified. D:\hadoop2\hadoop-2.4.0-src.zip
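
If you would rather avoid these errors altogether, one workaround is to extract into the shortest root path you can, so even the deepest entries stay under the 260-character limit. A sketch, assuming the 7-Zip command-line tool (7z) is on the PATH; the inner .tar file name shown is what 7-Zip typically derives from the .gz name:

    :: first pass un-gzips, second pass un-tars; a short root like C:\src keeps paths short
    7z x hadoop-2.4.0-src.tar.gz -oC:\src
    7z x C:\src\hadoop-2.4.0-src.tar -oC:\src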



2. Run the Maven build

Note that Maven needs the JDK and protoc installed first; if you have not installed them, see: How to install Maven and protoc on Windows 7.
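
Before starting the build it is worth a quick sanity check that all three tools are on the PATH. These commands only print version information; note that Hadoop 2.x builds expect ProtocolBuffer 2.5.0 specifically:

    java -version
    mvn -version
    protoc --version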

(1) Enter hadoop-2.4.0-src\hadoop-maven-plugins and run mvn install:

    D:\hadoop2\hadoop-2.4.0-src\hadoop-maven-plugins>mvn install



The following information is displayed:

    [INFO] Scanning for projects...
    [WARNING]
    [WARNING] Some problems were encountered while building the effective model for org.apache.hadoop:hadoop-maven-plugins:maven-plugin:2.4.0
    [WARNING] 'build.plugins.plugin.(groupId:artifactId)' must be unique but found duplicate declaration of plugin org.apache.maven.plugins:maven-enforcer-plugin @ org.apache.hadoop:hadoop-project:2.4.0, D:\hadoop2\hadoop-2.4.0-src\hadoop-project\pom.xml, line 1015, column 15
    [WARNING]
    [WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
    [WARNING]
    [WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
    [WARNING]
    [INFO]
    [INFO] Using the builder org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder with a thread count of 1
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building Apache Hadoop Maven Plugins 2.4.0
    [INFO] ------------------------------------------------------------------------
    [INFO]
    [INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-maven-plugins ---
    [INFO] Executing tasks
    main:
    [INFO] Executed tasks
    [INFO]
    [INFO] --- maven-plugin-plugin:3.0:descriptor (default-descriptor) @ hadoop-maven-plugins ---
    [INFO] Using 'UTF-8' encoding to read mojo metadata.
    [INFO] Applying mojo extractor for language: java-annotations
    [INFO] Mojo extractor for language: java-annotations found 2 mojo descriptors.
    [INFO] Applying mojo extractor for language: java
    [INFO] Mojo extractor for language: java found 0 mojo descriptors.
    [INFO] Applying mojo extractor for language: bsh
    [INFO] Mojo extractor for language: bsh found 0 mojo descriptors.
    [INFO]
    [INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-maven-plugins ---
    [INFO] Using default encoding to copy filtered resources.
    [INFO]
    [INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-maven-plugins ---
    [INFO] Nothing to compile - all classes are up to date
    [INFO]
    [INFO] --- maven-plugin-plugin:3.0:descriptor (mojo-descriptor) @ hadoop-maven-plugins ---
    [INFO] Using 'UTF-8' encoding to read mojo metadata.
    [INFO] Applying mojo extractor for language: java-annotations
    [INFO] Mojo extractor for language: java-annotations found 2 mojo descriptors.
    [INFO] Applying mojo extractor for language: java
    [INFO] Mojo extractor for language: java found 0 mojo descriptors.
    [INFO] Applying mojo extractor for language: bsh
    [INFO] Mojo extractor for language: bsh found 0 mojo descriptors.
    [INFO]
    [INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-maven-plugins ---
    [INFO] Using default encoding to copy filtered resources.
    [INFO]
    [INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-maven-plugins ---
    [INFO] No sources to compile
    [INFO]
    [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hadoop-maven-plugins ---
    [INFO] No tests to run.
    [INFO]
    [INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-maven-plugins ---
    [INFO] Building jar: D:\hadoop2\hadoop-2.4.0-src\hadoop-maven-plugins\target\hadoop-maven-plugins-2.4.0.jar
    [INFO]
    [INFO] --- maven-plugin-plugin:3.0:addPluginArtifactMetadata (default-addPluginArtifactMetadata) @ hadoop-maven-plugins ---
    [INFO]
    [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hadoop-maven-plugins ---
    [INFO]
    [INFO] --- maven-install-plugin:2.3.1:install (default-install) @ hadoop-maven-plugins ---
    [INFO] Installing D:\hadoop2\hadoop-2.4.0-src\hadoop-maven-plugins\target\hadoop-maven-plugins-2.4.0.jar to C:\Users\hyj\.m2\repository\org\apache\hadoop\hadoop-maven-plugins\2.4.0\hadoop-maven-plugins-2.4.0.jar
    [INFO] Installing D:\hadoop2\hadoop-2.4.0-src\hadoop-maven-plugins\pom.xml to C:\Users\hyj\.m2\repository\org\apache\hadoop\hadoop-maven-plugins\2.4.0\hadoop-maven-plugins-2.4.0.pom
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 4.891 s
    [INFO] Finished at: 2014-06-23T14:47:33+08:00
    [INFO] Final Memory: 21M/347M
    [INFO] ------------------------------------------------------------------------



 


 



(2) Run:

    mvn eclipse:eclipse -DskipTests

Note that this time we run from the source root (HADOOP_HOME); in my case that is D:\hadoop2\hadoop-2.4.0-src.

Part of the output:

    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Apache Hadoop Main ................................ SUCCESS [  0.684 s]
    [INFO] Apache Hadoop Project POM ......................... SUCCESS [  0.720 s]
    [INFO] Apache Hadoop Annotations ......................... SUCCESS [  0.276 s]
    [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [  0.179 s]
    [INFO] Apache Hadoop Assemblies .......................... SUCCESS [  0.121 s]
    [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [  1.680 s]
    [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [  1.802 s]
    [INFO] Apache Hadoop Auth ................................ SUCCESS [  1.024 s]
    [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  0.160 s]
    [INFO] Apache Hadoop Common .............................. SUCCESS [  1.061 s]
    [INFO] Apache Hadoop NFS ................................. SUCCESS [  0.489 s]
    [INFO] Apache Hadoop Common Project ...................... SUCCESS [  0.056 s]
    [INFO] Apache Hadoop HDFS ................................ SUCCESS [  2.770 s]
    [INFO] Apache Hadoop HttpFS .............................. SUCCESS [  0.965 s]
    [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [  0.629 s]
    [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [  0.284 s]
    [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.061 s]
    [INFO] hadoop-yarn ....................................... SUCCESS [  0.052 s]
    [INFO] hadoop-yarn-api ................................... SUCCESS [  0.842 s]
    [INFO] hadoop-yarn-common ................................ SUCCESS [  0.322 s]
    [INFO] hadoop-yarn-server ................................ SUCCESS [  0.065 s]
    [INFO] hadoop-yarn-server-common ......................... SUCCESS [  0.972 s]
    [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [  0.580 s]
    [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [  0.379 s]
    [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [  0.281 s]
    [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [  0.378 s]
    [INFO] hadoop-yarn-server-tests .......................... SUCCESS [  0.534 s]
    [INFO] hadoop-yarn-client ................................ SUCCESS [  0.307 s]
    [INFO] hadoop-yarn-applications .......................... SUCCESS [  0.050 s]
    [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [  0.202 s]
    [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [  0.194 s]
    [INFO] hadoop-yarn-site .................................. SUCCESS [  0.057 s]
    [INFO] hadoop-yarn-project ............................... SUCCESS [  0.066 s]
    [INFO] hadoop-mapreduce-client ........................... SUCCESS [  0.091 s]
    [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [  1.321 s]
    [INFO] hadoop-mapreduce-client-common .................... SUCCESS [  0.786 s]
    [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [  0.456 s]
    [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [  0.508 s]
    [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [  0.834 s]
    [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [  0.541 s]
    [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [  0.284 s]
    [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [  0.851 s]
    [INFO] hadoop-mapreduce .................................. SUCCESS [  0.099 s]
    [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [  0.742 s]
    [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [  0.335 s]
    [INFO] Apache Hadoop Archives ............................ SUCCESS [  0.397 s]
    [INFO] Apache Hadoop Rumen ............................... SUCCESS [  0.371 s]
    [INFO] Apache Hadoop Gridmix ............................. SUCCESS [  0.230 s]
    [INFO] Apache Hadoop Data Join ........................... SUCCESS [  0.184 s]
    [INFO] Apache Hadoop Extras .............................. SUCCESS [  0.217 s]
    [INFO] Apache Hadoop Pipes ............................... SUCCESS [  0.048 s]
    [INFO] Apache Hadoop OpenStack support ................... SUCCESS [  0.244 s]
    [INFO] Apache Hadoop Client .............................. SUCCESS [  0.590 s]
    [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [  0.230 s]
    [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [  0.650 s]
    [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [  0.334 s]
    [INFO] Apache Hadoop Tools ............................... SUCCESS [  0.042 s]
    [INFO] Apache Hadoop Distribution ........................ SUCCESS [  0.144 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 31.234 s
    [INFO] Finished at: 2014-06-23T14:55:08+08:00
    [INFO] Final Memory: 84M/759M
    [INFO] ------------------------------------------------------------------------

At this point everything has been downloaded: Maven has pulled the build dependencies into the local repository and generated the Eclipse project files, and you will notice that the source directory has grown considerably in size.
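
A related tip: the same maven-eclipse-plugin can also fetch the source and javadoc jars of the third-party dependencies, which makes stepping into library code more convenient later. A sketch using the plugin's standard downloadSources/downloadJavadocs flags (a general plugin feature, nothing Hadoop-specific):

    D:\hadoop2\hadoop-2.4.0-src>mvn eclipse:eclipse -DskipTests -DdownloadSources=true -DdownloadJavadocs=true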





3. Linking the source in Eclipse

Suppose we use the example program below:

hadoop2.2mapreduce例子.rar (1.14 MB)

The archive packages two files: MaxTemperature.zip, the MapReduce example itself, and mockito-core-1.8.5.jar, the jar that the example depends on.
(Note that the example targets MapReduce 2.2, but that does not affect how the source is linked; the point is simply to show you the linking procedure.)
Unpack it and import the project into Eclipse.
(If you are not familiar with importing projects, see: A from-scratch guide to importing Eclipse projects.)



After importing, you will see many red error markers; all of them come from jars missing from the build path. Let's resolve them now.

Step 1: Add the missing jars
(1) Add mockito-core-1.8.5.jar.

(2) Add the jars from the Hadoop 2.4 binary distribution. They live under share\hadoop in the Hadoop install directory; on my machine that is D:\hadoop2\hadoop-2.4.0\share\hadoop.
Add the jars found there to the build path: the jars in each module's lib folder as well as the jars directly beside them (a command that lists them all is sketched below).
(If you are not sure how to add jars to the build path, see: A summary of Hadoop development approaches and operations guide.)
(Note that what we add here is the compiled binary package, not the source tree. Download the compiled 64-bit hadoop-2.4.0.tar.gz package:
link: http://pan.baidu.com/s/1c0vPjG0 password: xj6l)
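
As mentioned above, a quick way to enumerate every jar you may need for the build path is to list them recursively. A sketch, assuming the D:\hadoop2 layout used throughout this post:

    :: /s recurses through each module folder and its lib\ subfolder, /b prints bare paths
    dir /s /b D:\hadoop2\hadoop-2.4.0\share\hadoop\*.jar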

For more package downloads, see: Roundup of jars and installers for the Hadoop family, Storm, Spark, Linux, Flume, and more.



 


Step 2: Attach the source
1. Once the jars are added, the errors are gone.

 

2. Source not found

When we want to see how a class or method is implemented and use Open Call Hierarchy, the source file cannot be found.

 



 


3. Attach Source

Fill in the three fields marked above, in order; after selecting the compressed source archive, click OK, and the setup is complete.

Note: hadoop-2.2.0-src.zip here is the source we downloaded through Maven above and then compressed ourselves (for this 2.4.0 walkthrough, the archive would be built from the hadoop-2.4.0-src tree); remember that it must be compressed in zip format.
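
If you still need to create that zip, the JDK's jar tool can produce one with no extra software. A sketch, assuming the Maven-prepared source tree sits at D:\hadoop2\hadoop-2.4.0-src (-c creates an archive, -M omits the manifest so the result is a plain zip, -f names the output file):

    D:\hadoop2>jar -cMf hadoop-2.4.0-src.zip hadoop-2.4.0-src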


4. Verify the link by viewing the source

We repeat the earlier operation through Open Call Hierarchy and now see the following:



 


Then double-click the main class (the part in red) and we see the following:

 




A question:
The attentive reader will notice something odd here: what we see opened is a .class file, not a .java file. Could it differ from the corresponding .java file? In fact it is exactly the same; interested readers can verify this for themselves.


Next article:
How to view and read the Hadoop 2.4 source code in Eclipse
