
Compiling Hadoop 2.6.4 on Ubuntu Server 14.04 64-bit (keyboard-only mode)

Date: 2017/3/1 11:44:14   Category: About Linux

Note: everything below is done as root, so I don't have to type sudo in front of every command.

Step 1: Set up the JDK

Installing the JDK automatically
A fresh Ubuntu Server 14.04 install does not ship with a JDK:
root@master:~# javac
The program 'javac' can be found in the following packages:
 * default-jdk
 * ecj
 * gcj-4.8-jdk
 * openjdk-7-jdk
 * gcj-4.6-jdk
 * openjdk-6-jdk
Try: apt-get install 
root@master:~#

So we need to install a JDK, either from an installer package or automatically through the package manager. For the package route, see my reference document; for the automatic route, let's get started. Enter the command:

root@master:~# apt-get install openjdk-7-jdk

The installation takes quite a while, so be patient.

Configuring the JDK environment variables
Once the JDK is installed, we need to set the environment variables. To find where Java actually landed, search for jre*; that is enough, because the jre directory normally sits under the Java installation directory:
root@master:~# find / -name  'jre*'
/usr/lib/jvm/java-7-openjdk-amd64/jre
root@master:~#

Set the Java environment variables:

root@master:~# vi /etc/profile
Append these lines at the end of the file:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
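Before rebooting, the same assignments can be replayed in a throwaway shell to sanity-check the wiring (a sketch using the paths found above; nothing here modifies /etc/profile):

```shell
#!/bin/sh
# Replay the profile additions and inspect the resulting search path.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
# The JDK bin directories should now lead PATH:
echo "$PATH" | cut -d: -f1-2
```

After a reboot, login shells pick the same values up from /etc/profile automatically.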

Reboot the machine (or run source /etc/profile to apply the changes in the current shell), then check the Java version. If you see output like the following, the installation succeeded:

root@master:~# java -version
java version "1.7.0_101"
OpenJDK Runtime Environment (IcedTea 2.6.6) (7u101-2.6.6-0ubuntu0.14.04.1)
OpenJDK 64-Bit Server VM (build 24.95-b01, mixed mode)
root@master:~#
Installing the Maven build tool
Maven:
Maven is a project build and management tool. It provides facilities for managing builds, documentation, reporting, dependencies, SCMs, releases, and distribution, making it easy to compile code, manage dependencies, handle binary repositories, and so on.
Its strength lies in making the project process standardized, automated, and efficient, with powerful extensibility.
With Maven itself and its plugins you can also get code-inspection reports, unit-test coverage, continuous integration, and more.
Since this is the automatic route, it is of course just a command:
root@master:~# apt-get install maven

After a long wait, ask for the version information; if it is printed, the installation succeeded:

root@master:~# mvn --version
Apache Maven 3.0.5
Maven home: /usr/share/maven
Java version: 1.7.0_101, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-7-openjdk-amd64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.13.0-24-generic", arch: "amd64", family: "unix"
root@master:~#
Installing OpenSSH
OpenSSH:
OpenSSH is a free, open-source implementation of the SSH (Secure Shell) protocol. The SSH protocol family can be used for remote control or for transferring files between machines. The traditional tools for this, such as telnet (a terminal emulation protocol), rcp, ftp, rlogin, and rsh, are highly insecure and transmit passwords in plain text. OpenSSH provides a server daemon and client tools that encrypt the data in remote-control and file-transfer sessions, replacing those legacy services.
OpenSSH is the implementation used for encrypted communication across a network via SSH. It is the open-source alternative to the commercial version offered by SSH Communications Security, and is currently a subproject of OpenBSD.
OpenSSH is often assumed to be related to OpenSSL, but the two projects have different goals and different development teams; the similar names simply reflect a shared aim of providing open-source encrypted communication software.
Now the command:
apt-get install openssh-server openssh-client

Another long wait:

root@master:~# apt-get install openssh-server openssh-client
Reading package lists... Done
Building dependency tree       
Reading state information... Done
Suggested packages:
  ssh-askpass libpam-ssh keychain monkeysphere rssh molly-guard
The following packages will be upgraded:
  openssh-client openssh-server
2 upgraded, 0 newly installed, 0 to remove and 204 not upgraded.
Need to get 885 kB of archives.
After this operation, 4,096 B of additional disk space will be used.
Get:1 http://us.archive.ubuntu.com/ubuntu/ trusty-updates/main openssh-server amd64 1:6.6p1-2ubuntu2.7 [322 kB]
Get:2 http://us.archive.ubuntu.com/ubuntu/ trusty-updates/main openssh-client amd64 1:6.6p1-2ubuntu2.7 [564 kB]                  
Fetched 885 kB in 2min 44s (5,390 B/s)                                                                                           
Preconfiguring packages ...
(Reading database ... 65436 files and directories currently installed.)
Preparing to unpack .../openssh-server_1%3a6.6p1-2ubuntu2.7_amd64.deb ...
Unpacking openssh-server (1:6.6p1-2ubuntu2.7) over (1:6.6p1-2ubuntu1) ...
Preparing to unpack .../openssh-client_1%3a6.6p1-2ubuntu2.7_amd64.deb ...
Unpacking openssh-client (1:6.6p1-2ubuntu2.7) over (1:6.6p1-2ubuntu1) ...
Processing triggers for ureadahead (0.100.0-16) ...
ureadahead will be reprofiled on next reboot
Processing triggers for ufw (0.34~rc-0ubuntu2) ...
Processing triggers for man-db (2.6.7.1-1) ...
Setting up openssh-client (1:6.6p1-2ubuntu2.7) ...
Setting up openssh-server (1:6.6p1-2ubuntu2.7) ...
ssh stop/waiting
ssh start/running, process 4902
root@master:~#
Installing protobuf-compiler
Protobuf:
Protocol Buffers (protobuf) is a data description language developed by Google. Like XML, it can serialize structured data, and it is used for data storage, communication protocols, and more. At this stage it supports three programming languages: C++, Java, and Python.
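To make that concrete, here is a tiny hand-written protobuf 2 schema (a hypothetical example, not one of Hadoop's own .proto files); with protobuf-compiler installed, the commented protoc line would turn it into Java classes:

```shell
#!/bin/sh
# Write a minimal protobuf schema to disk (illustrative only; the message
# name and fields are made up for this sketch).
cat > demo.proto <<'EOF'
message Block {
  required int64 blockId  = 1;
  optional int64 numBytes = 2;
}
EOF
# With the compiler installed, Java code generation would look like:
#   protoc --java_out=. demo.proto
wc -l < demo.proto
```

Hadoop's own build runs protoc in exactly this way over its wire-protocol schemas, which is why the compiler must be on the PATH before building.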
root@master:~# apt-get install protobuf-compiler
Reading package lists... Done
Building dependency tree       
Reading state information... Done
The following extra packages will be installed:
  libprotobuf8 libprotoc8
The following NEW packages will be installed:
  libprotobuf8 libprotoc8 protobuf-compiler
0 upgraded, 3 newly installed, 0 to remove and 204 not upgraded.
Need to get 550 kB of archives.
After this operation, 2,133 kB of additional disk space will be used.
Do you want to continue? [Y/n] y
Get:1 http://us.archive.ubuntu.com/ubuntu/ trusty/main libprotobuf8 amd64 2.5.0-9ubuntu1 [296 kB]
Get:2 http://us.archive.ubuntu.com/ubuntu/ trusty/main libprotoc8 amd64 2.5.0-9ubuntu1 [235 kB]
Get:3 http://us.archive.ubuntu.com/ubuntu/ trusty/main protobuf-compiler amd64 2.5.0-9ubuntu1 [19.8 kB]
Fetched 550 kB in 3s (173 kB/s)         
Selecting previously unselected package libprotobuf8:amd64.
(Reading database ... 65436 files and directories currently installed.)
Preparing to unpack .../libprotobuf8_2.5.0-9ubuntu1_amd64.deb ...
Unpacking libprotobuf8:amd64 (2.5.0-9ubuntu1) ...
Selecting previously unselected package libprotoc8:amd64.
Preparing to unpack .../libprotoc8_2.5.0-9ubuntu1_amd64.deb ...
Unpacking libprotoc8:amd64 (2.5.0-9ubuntu1) ...
Selecting previously unselected package protobuf-compiler.
Preparing to unpack .../protobuf-compiler_2.5.0-9ubuntu1_amd64.deb ...
Unpacking protobuf-compiler (2.5.0-9ubuntu1) ...
Processing triggers for man-db (2.6.7.1-1) ...
Setting up libprotobuf8:amd64 (2.5.0-9ubuntu1) ...
Setting up libprotoc8:amd64 (2.5.0-9ubuntu1) ...
Setting up protobuf-compiler (2.5.0-9ubuntu1) ...
Processing triggers for libc-bin (2.19-0ubuntu6) ...
root@master:~# protoc --version
libprotoc 2.5.0
root@master:~#
Installing the dependency libraries
These libraries and packages are nearly all used during the build; missing ones will break it, and hunting for a solution only after hitting an error is a real pain, so install them up front and be done with it once and for all.
apt-get install g++ autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev
Installing FindBugs
FindBugs:
FindBugs is a static-analysis tool. It inspects classes or JAR files, comparing the bytecode against a set of bug patterns to find likely problems, which means software can be analyzed without actually running it. Rather than determining a program's intent from the form or structure of the class files, it typically uses the Visitor pattern.
In the FindBugs GUI you first select the .class files to scan (FindBugs works on compiled classes, uncovering hidden bugs that way). If you have the corresponding .java source files, select those too, so the report produced later can link straight to the offending code. You can also add the libraries the project uses, which seems to help FindBugs run some higher-order checks and surface deeper bugs.
Once everything is selected, the scan can begin. It may take several minutes, depending on the size of the project, and it produces a detailed report pointing out many latent bugs. Classic examples are null pointer dereferences and particular resources (a db connection, say) left unclosed. Bugs like these are hard, sometimes impossible, to find by manual review until they blow up at runtime; once these classic bugs are removed, the system's stability reliably moves up a level.
Enter the command (personally I doubt this is of much use, but to head off problems later, better redundant than sorry):
root@master:~# apt-get install findbugs
Starting the build
An apology first: I said this would be typed entirely by hand, but I copied this next command, because it is long and I was afraid of getting the address wrong. In a real working environment with no copy-paste available, please do type it out, and type carefully. First we need to download a Hadoop source release; I used Hadoop 2.6.4, from this address:
root@master:~# wget http://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/hadoop-2.6.4/hadoop-2.6.4-src.tar.gz
--2016-07-05 18:24:46--  http://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/hadoop-2.6.4/hadoop-2.6.4-src.tar.gz
Resolving mirrors.tuna.tsinghua.edu.cn (mirrors.tuna.tsinghua.edu.cn)... 166.111.206.63, 2402:f000:1:416:166:111:206:63
Connecting to mirrors.tuna.tsinghua.edu.cn (mirrors.tuna.tsinghua.edu.cn)|166.111.206.63|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 17282122 (16M) [application/octet-stream]
Saving to: ‘hadoop-2.6.4-src.tar.gz’

100%[========================================================================================>] 17,282,122  10.2MB/s   in 1.6s   

2016-07-05 18:24:48 (10.2 MB/s) - ‘hadoop-2.6.4-src.tar.gz’ saved [17282122/17282122]

root@master:~# ls
hadoop-2.6.4-src.tar.gz

Extract the archive into the current directory:

root@master:~# tar -zxvf hadoop-2.6.4-src.tar.gz
root@master:~# ls
hadoop-2.6.4-src  hadoop-2.6.4-src.tar.gz
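One failure cause discussed later is an incomplete download, so it's worth knowing that tar can verify an archive without extracting it: listing with -t reads the whole file and fails on truncation. A self-contained sketch with a stand-in archive (for the real one you would run tar -tzf hadoop-2.6.4-src.tar.gz > /dev/null):

```shell
#!/bin/sh
# Create a small stand-in archive, then list it as an integrity check.
mkdir -p demo-src
echo "content" > demo-src/file.txt
tar -czf demo-src.tar.gz demo-src
# -t lists members without extracting; a truncated .tar.gz fails here.
tar -tzf demo-src.tar.gz > /dev/null && echo "archive OK"
```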
Change into the hadoop-2.6.4-src directory:
root@master:~# cd /root/xyj/hadoop-2.6.4-src

The exciting moment has arrived: time to compile:

root@master:~/hadoop-2.6.4-src# mvn clean package -Pdist,native -DskipTests -Dtar

OK, now we are in for a long wait. Keep the network fast enough and the power on; normally the build finishes within one to two hours, and with a slow machine or network it can take well over two. For reference, -Pdist,native activates the profiles that build the binary distribution together with the native libraries, -DskipTests skips the unit tests, and -Dtar additionally packages the result as a .tar.gz.
How do you find a Hadoop download address?
[Start at the official site, http://hadoop.apache.org/, find Getting Started, and click Download to reach the release downloads at http://hadoop.apache.org/releases.html. You will see download links for many versions; pick whichever one you fancy. I chose Hadoop 2.6.4: click the source link under Tarball, which leads to http://www.apache.org/dyn/closer.cgi/hadoop/common/hadoop-2.6.4/hadoop-2.6.4-src.tar.gz, where you will find an HTTP address. That address is your download URL; just put it after wget!]

Troubleshooting
What programmers fear most is bugs: a bug you cannot fix is our nightmare, and an Error is every bit as scary!!! Still, problems have to be faced and worked through slowly. As the saying (almost) goes, debugging is not just today's grind; there is also tomorrow and the day after, and I refuse to believe it cannot be fixed. Sometimes you will find it suddenly just works, without ever knowing what was wrong.
Error 1:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 54:47.276s
[INFO] Finished at: Tue Jul 05 07:16:18 EDT 2016
[INFO] Final Memory: 80M/473M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hadoop-hdfs-httpfs: Could not resolve dependencies for project org.apache.hadoop:hadoop-hdfs-httpfs:war:2.6.4: Failed to collect dependencies for [junit:junit:jar:4.11 (test), org.mockito:mockito-all:jar:1.8.5 (test), org.apache.hadoop:hadoop-auth:jar:2.6.4 (compile), com.sun.jersey:jersey-core:jar:1.9 (compile), com.sun.jersey:jersey-server:jar:1.9 (compile), javax.servlet:servlet-api:jar:2.5 (provided), com.google.guava:guava:jar:11.0.2 (compile), com.googlecode.json-simple:json-simple:jar:1.1 (compile), org.mortbay.jetty:jetty:jar:6.1.26 (test), org.apache.hadoop:hadoop-common:jar:2.6.4 (compile), org.apache.hadoop:hadoop-hdfs:jar:2.6.4 (compile), org.apache.hadoop:hadoop-common:jar:tests:2.6.4 (test), org.apache.hadoop:hadoop-hdfs:jar:tests:2.6.4 (test), log4j:log4j:jar:1.2.17 (compile), org.slf4j:slf4j-api:jar:1.7.5 (compile), org.slf4j:slf4j-log4j12:jar:1.7.5 (runtime)]: Failed to read artifact descriptor for com.googlecode.json-simple:json-simple:jar:1.1: Could not transfer artifact com.googlecode.json-simple:json-simple:pom:1.1 from/to central (http://repo.maven.apache.org/maven2): Connection to http://repo.maven.apache.org refused: Connection refused -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn  -rf :hadoop-hdfs-httpfs
root@master:~/xyj/hadoop-2.6.4-src#

I went off for a meal and came back to a failed build. Alright, let's fix it. A web search for the error turned up nothing, so I read the log myself. My rough reading was that something was wrong around the Tomcat server, and since this freshly installed system had no Tomcat, I ran:

root@master:~/xyj/hadoop-2.6.4-src# apt-get install tomcat7
Recompile:
root@master:~/xyj/hadoop-2.6.4-src# mvn clean package -Pdist,native -DskipTests -Dtar

After another long wait, I assumed the problem above had not been solved, but I found that the part that failed before actually got past. Well, past is past; that counts as one way of solving a problem, better than not solving it at all. The one real shortcoming is not knowing what was truly wrong; I was going on gut feeling and happened to stumble through. Reading the log again, the error was actually a "Connection refused" while fetching a dependency from Maven Central, so the retry, rather than Tomcat, is probably what got it through.
Error 2:

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:14:21.695s
[INFO] Finished at: Tue Jul 05 09:21:06 EDT 2016
[INFO] Final Memory: 76M/439M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hadoop-yarn-server-nodemanager: Could not resolve dependencies for project org.apache.hadoop:hadoop-yarn-server-nodemanager:jar:2.6.4: Failed to collect dependencies for [org.apache.hadoop:hadoop-common:jar:2.6.4 (provided), org.apache.hadoop:hadoop-yarn-common:jar:2.6.4 (compile), org.apache.hadoop:hadoop-yarn-api:jar:2.6.4 (compile), javax.xml.bind:jaxb-api:jar:2.2.2 (compile), org.codehaus.jettison:jettison:jar:1.1 (compile), commons-lang:commons-lang:jar:2.6 (compile), javax.servlet:servlet-api:jar:2.5 (compile), commons-codec:commons-codec:jar:1.4 (compile), com.sun.jersey:jersey-core:jar:1.9 (compile), com.sun.jersey:jersey-client:jar:1.9 (compile), org.mortbay.jetty:jetty-util:jar:6.1.26 (compile), com.google.guava:guava:jar:11.0.2 (compile), commons-logging:commons-logging:jar:1.1.3 (compile), org.slf4j:slf4j-api:jar:1.7.5 (compile), org.apache.hadoop:hadoop-annotations:jar:2.6.4 (compile), org.apache.hadoop:hadoop-common:jar:tests:2.6.4 (test), com.google.inject.extensions:guice-servlet:jar:3.0 (compile), com.google.protobuf:protobuf-java:jar:2.5.0 (compile), junit:junit:jar:4.11 (test), org.mockito:mockito-all:jar:1.8.5 (test), com.google.inject:guice:jar:3.0 (compile), com.sun.jersey.jersey-test-framework:jersey-test-framework-grizzly2:jar:1.9 (test), com.sun.jersey:jersey-json:jar:1.9 (compile), com.sun.jersey.contribs:jersey-guice:jar:1.9 (compile), org.apache.hadoop:hadoop-yarn-common:jar:tests:2.6.4 (test), org.apache.hadoop:hadoop-yarn-server-common:jar:2.6.4 (compile), org.fusesource.leveldbjni:leveldbjni-all:jar:1.8 (compile)]: Failed to read artifact descriptor for org.glassfish.grizzly:grizzly-http:jar:2.1.2: Could not transfer artifact org.glassfish.grizzly:grizzly-http:pom:2.1.2 from/to apache.snapshots.https (https://repository.apache.org/content/repositories/snapshots): Read timed out -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn  -rf :hadoop-yarn-server-nodemanager

What a headache; time for a proper search for a fix. Here's a tip for getting detailed error output while debugging: add a -X flag to the original command and you can watch the build process in full detail:

root@master:~/xyj/hadoop-2.6.4-src# mvn clean package -Pdist,native -DskipTests -Dtar -X

For this one I searched many sites and still found no answer. After reading plenty of posts and questions, it boils down to my network speed and the mirror site: the connection was sometimes too slow, or could not reach the outside network at all. So I simply downloaded Lantern to get around the firewall. (It may also just have been the network: if a large download stalls at some moment, or a file arrives incomplete, the build fails when it reaches that step, so the firewall may not have been the real cause. Perhaps the network failed to find a working route or site the first time and found one the second. Honestly it feels like a roll of the dice!)
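An alternative to a VPN that targets exactly this class of failure is routing Maven's "central" repository through a geographically closer mirror in ~/.m2/settings.xml. A minimal sketch; the mirror URL is a placeholder to be replaced with a real mirror you trust, and the file is written to the current directory rather than installed:

```shell
#!/bin/sh
# Generate a minimal Maven settings file that sends requests for the
# "central" repository to a mirror. mirror.example.com is a placeholder.
cat > settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>nearby-mirror</id>
      <mirrorOf>central</mirrorOf>
      <url>http://mirror.example.com/maven2</url>
    </mirror>
  </mirrors>
</settings>
EOF
# To activate it: mkdir -p ~/.m2 && cp settings.xml ~/.m2/settings.xml
grep "<mirrorOf>" settings.xml
```

Maven consults ~/.m2/settings.xml on every run, so the mirror takes effect on the next mvn invocation.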
Next, let me show you how to install Lantern.
GitHub download address (the source repository is at the address above):

root@master:~/xyj# wget https://raw.githubusercontent.com/getlantern/lantern-binaries/master/lantern-installer-3.0.4-64-bit.deb

Install the tools:

root@master:~/xyj# apt-get install gdebi-core
root@master:~/xyj# apt-get install libappindicator3-1

Install and start it:

root@master:~/xyj# gdebi lantern-installer-3.0.4-64-bit.deb
root@master:~/xyj# lantern
root@master:~# lantern 
Running installation script...
/usr/lib/lantern/lantern-binary: OK
Jul 06 09:24:51.955 - 0m0s DEBUG flashlight: flashlight.go:49 ****************************** Package Version: 2.1.2
Jul 06 09:24:51.956 - 0m0s DEBUG flashlight.ui: ui.go:58 Creating tarfs filesystem that prefers local resources at /lantern/src/github.com/getlantern/lantern-ui/app
Jul 06 09:24:51.960 - 0m0s DEBUG flashlight: settings.go:57 Loading settings
Jul 06 09:24:51.960 - 0m0s DEBUG flashlight: settings.go:70 Could not read file open /root/.lantern/settings.yaml: no such file or directory
Jul 06 09:24:51.961 - 0m0s DEBUG flashlight.ui: service.go:134 Accepting websocket connections at: /data
Jul 06 09:24:51.962 - 0m0s DEBUG flashlight: settings.go:99 Sending Lantern settings to new client
Jul 06 09:24:51.965 - 0m0s DEBUG flashlight: settings.go:109 Reading settings messages!!
Jul 06 09:24:52.000 - 0m0s DEBUG flashlight: flashlight.go:49 ****************************** Package Version: 2.1.2
Jul 06 09:24:52.001 - 0m0s DEBUG flashlight.ui: ui.go:58 Creating tarfs filesystem that prefers local resources at /lantern/src/github.com/getlantern/lantern-ui/app
Jul 06 09:24:52.028 - 0m0s DEBUG flashlight: settings.go:57 Loading settings
Jul 06 09:24:52.028 - 0m0s DEBUG flashlight: settings.go:70 Could not read file open /root/.lantern/settings.yaml: no such file or directory
Jul 06 09:24:52.031 - 0m0s DEBUG flashlight.ui: service.go:134 Accepting websocket connections at: /data
Jul 06 09:24:52.032 - 0m0s DEBUG flashlight: settings.go:99 Sending Lantern settings to new client
Jul 06 09:24:52.037 - 0m0s DEBUG flashlight: settings.go:109 Reading settings messages!!

(lantern:18322): Gtk-WARNING **: cannot open display:

Check on it:

root@master:~# ps -aux | grep lantern
root      18331  0.0  0.0  11740   940 pts/1    S+   05:26   0:00 grep --color=auto lantern

With that installed, on we go. I suspect this was mostly for my own peace of mind, and whether the rebuild succeeds comes down to luck. Happily, the build passed. Thank you, luck!

Locating the build output
After the long wait above, the result finally appeared, as follows:
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................ SUCCESS [4.493s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [3.007s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [5.774s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.552s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [5.550s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [12.361s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [15.264s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [15.034s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [8.842s]
[INFO] Apache Hadoop Common .............................. SUCCESS [5:43.242s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [31.949s]
[INFO] Apache Hadoop KMS ................................. SUCCESS [35.223s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.500s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [9:19.175s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [1:12.408s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [25.086s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [14.865s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.188s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.167s]
[INFO] hadoop-yarn-api ................................... SUCCESS [2:13.539s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:57.809s]
[INFO] hadoop-yarn-server ................................ SUCCESS [1.107s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [38.032s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [1:03.906s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [9.610s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [15.939s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [1:06.344s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [23.420s]
[INFO] hadoop-yarn-client ................................ SUCCESS [18.195s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.291s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [5.631s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.816s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.152s]
[INFO] hadoop-yarn-registry .............................. SUCCESS [11.657s]
[INFO] hadoop-yarn-project ............................... SUCCESS [23.399s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.980s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1:26.429s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [1:00.837s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [18.942s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [28.096s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [21.970s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [35.789s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [5.011s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [11.536s]
[INFO] hadoop-mapreduce .................................. SUCCESS [17.663s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [19.450s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [59.560s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [9.721s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [17.239s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [11.690s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [11.958s]
[INFO] Apache Hadoop Ant Tasks ........................... SUCCESS [7.894s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [6.610s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [29.171s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [25.039s]
[INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [4:21.123s]
[INFO] Apache Hadoop Client .............................. SUCCESS [24.644s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.710s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [22.584s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [32.450s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [1.933s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [3:01.581s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 45:05.494s
[INFO] Finished at: Wed Jul 06 06:14:20 EDT 2016
[INFO] Final Memory: 101M/473M
[INFO] ------------------------------------------------------------------------

The listing above shows all the modules that get built. They are compiled one by one; when one fails, Maven prints which modules succeeded, which failed, and which were skipped after the failure. I forgot to capture that, but you will see it when you build yourself.
Once the build is done, we need to find the compiled package; the path is:

root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target# pwd
/root/xyj/hadoop-2.6.4-src/hadoop-dist/target
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target# ll
total 539268
drwxr-xr-x 7 root root      4096 Jul  6 06:13 ./
drwxr-xr-x 3 root root      4096 Jul  6 06:11 ../
drwxr-xr-x 2 root root      4096 Jul  6 06:11 antrun/
-rw-r--r-- 1 root root      1866 Jul  6 06:11 dist-layout-stitching.sh
-rw-r--r-- 1 root root       639 Jul  6 06:11 dist-tar-stitching.sh
drwxr-xr-x 9 root root      4096 Jul  6 06:11 hadoop-2.6.4/
-rw-r--r-- 1 root root 183757063 Jul  6 06:12 hadoop-2.6.4.tar.gz
-rw-r--r-- 1 root root      2779 Jul  6 06:11 hadoop-dist-2.6.4.jar
-rw-r--r-- 1 root root 368403396 Jul  6 06:14 hadoop-dist-2.6.4-javadoc.jar
drwxr-xr-x 2 root root      4096 Jul  6 06:13 javadoc-bundle-options/
drwxr-xr-x 2 root root      4096 Jul  6 06:11 maven-archiver/
drwxr-xr-x 2 root root      4096 Jul  6 06:11 test-dir/

The hadoop-2.6.4.tar.gz archive there is the package we want, and there is also the already-assembled hadoop-2.6.4 directory.
Hadoop's version information:

root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/bin# ./hadoop version
Hadoop 2.6.4
Subversion Unknown -r Unknown
Compiled by root on 2016-07-06T09:30Z
Compiled with protoc 2.5.0
From source with checksum 8dee2286ecdbbbc930a6c87b65cbc010
This command was run using /root/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/share/hadoop/common/hadoop-common-2.6.4.jar
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/bin# pwd
/root/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/bin

Hadoop's native dynamic link libraries:

/root/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native# file *
libhadoop.a:        current ar archive
libhadooppipes.a:   current ar archive
libhadoop.so:       symbolic link to `libhadoop.so.1.0.0' 
libhadoop.so.1.0.0: ELF 64-bit LSB  shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=2414b17dc9802b68da89538507e71ff61c8630c4, not stripped
libhadooputils.a:   current ar archive
libhdfs.a:          current ar archive
libhdfs.so:         symbolic link to `libhdfs.so.0.0.0' 
libhdfs.so.0.0.0:   ELF 64-bit LSB  shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=a5aa61121dfb8d075dca4deab83067c812acd4c4, not stripped
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native# ll
total 4768
drwxr-xr-x 2 root root    4096 Jul  6 06:11 ./
drwxr-xr-x 3 root root    4096 Jul  6 06:11 ../
-rw-r--r-- 1 root root 1278070 Jul  6 06:11 libhadoop.a
-rw-r--r-- 1 root root 1632656 Jul  6 06:11 libhadooppipes.a
lrwxrwxrwx 1 root root      18 Jul  6 06:11 libhadoop.so -> libhadoop.so.1.0.0*
-rwxr-xr-x 1 root root  750783 Jul  6 06:11 libhadoop.so.1.0.0*
-rw-r--r-- 1 root root  476210 Jul  6 06:11 libhadooputils.a
-rw-r--r-- 1 root root  441046 Jul  6 06:11 libhdfs.a
lrwxrwxrwx 1 root root      16 Jul  6 06:11 libhdfs.so -> libhdfs.so.0.0.0*
-rwxr-xr-x 1 root root  282519 Jul  6 06:11 libhdfs.so.0.0.0*
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native#

The contents of hadoop-2.6.4.tar.gz are identical to this directory once extracted. When the time comes to use it, copy the hadoop-2.6.4.tar.gz archive; people just prefer the factory-sealed one, heh.

Summary
With that, Hadoop 2.6.4 is built. My promise to you was all-keyboard operation, but I cheated a little and pasted in all those transcripts, plus one URL I was too lazy to type. Beginners, I still suggest you type everything out character by character; seasoned developers, suit yourselves.
As for the problems covered here, plus the ones I never hit, you may well run into issues of your own when building. Tackle them persistently: one day, two days, three days; if three days is not enough, step back rather than sink in more time, then start over fresh, or keep at the problem until it is gone.
The build process, roughly, is that Maven downloads everything needed to compile Hadoop to the local repository, then compiles the modules one after another until the whole build succeeds. If it fails partway, the build stops, and only once the problem is fixed can the build be rerun to completion (Maven's own error output suggests mvn -rf :<module> to resume from the failed module instead of starting from scratch).
Most problems during the build come down to the network and the mirror sites. When an outside resource cannot be downloaded, there is little to do but accept the bad luck and let the network find a better address, which means building again; and even a rebuild does not fix everything. Slowly, it comes down to experience.

Copyright © Linux教程網 All Rights Reserved