From: 丁仔
Replies: 3
Created: 2017-03-27 15:34
I downloaded hadoop-3.0.0-alpha1.tar.gz from the Apache commons download site; checking it with ldd shows the native binaries are 64-bit (from what I found, every release since 1.2 has shipped 64-bit). But my VM environment is 32-bit Ubuntu, so I downloaded the hadoop-3.0.0-alpha1-src source package instead, intending to recompile it for 32-bit.
I ran the following commands:
apt-get install -y openjdk-7-jdk libprotobuf-dev protobuf-compiler maven cmake build-essential pkg-config libssl-dev zlib1g-dev llvm-gcc automake autoconf make
mvn package -Pdist,native -DskipTests  # -Dtar
and got the following error:
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 40:37.715s
[INFO] Finished at: Thu Mar 23 03:11:14 PDT 2017
[INFO] Final Memory: 47M/113M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hadoop-common: Could not resolve dependencies for project org.apache.hadoop:hadoop-common:jar:3.0.0-alpha1: The following artifacts could not be resolved: com.google.protobuf:protobuf-java:jar:2.5.0, org.apache.curator:curator-recipes:jar:2.7.1: Could not transfer artifact com.google.protobuf:protobuf-java:jar:2.5.0 from/to central (http://repo.maven.apache.org/maven2): GET request of: com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar from central failed: Read timed out -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
I have searched around online a lot but couldn't resolve it. Can anyone tell me where the problem is?
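(Editor's note: the log itself points at the cause: the build did not fail to compile, it timed out downloading protobuf-java 2.5.0 from Maven Central. A common workaround, sketched here as an assumption rather than a confirmed fix for this thread, is to point Maven at a mirror that is reachable from your network in ~/.m2/settings.xml; the Aliyun URL below is just one example mirror.)

```xml
<!-- ~/.m2/settings.xml: route requests for Maven Central through a
     mirror. The URL below is an example; substitute any Maven mirror
     that is fast and reachable from your network. -->
<settings>
  <mirrors>
    <mirror>
      <id>central-mirror</id>
      <mirrorOf>central</mirrorOf>
      <url>http://maven.aliyun.com/nexus/content/groups/public</url>
    </mirror>
  </mirrors>
</settings>
```

After that, the build can be resumed from the failed module as the log suggests: `mvn package -Pdist,native -DskipTests -rf :hadoop-common`.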
0 likes
3 replies
Enable the CPU's VT (virtualization) option in the BIOS settings; then your machine can run 64-bit virtual machines.
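As a quick sanity check before rebooting into the BIOS, you can see whether the host CPU advertises hardware virtualization at all (a sketch; flag names are the standard Linux /proc/cpuinfo ones):

```shell
# Count CPU flags indicating hardware virtualization support
# (vmx = Intel VT-x, svm = AMD-V). A count greater than 0 means
# the CPU can host 64-bit guests once VT is enabled in the BIOS.
grep -E -c '(vmx|svm)' /proc/cpuinfo || echo "no VT flags visible"
```

Note that some hypervisors hide these flags from guests, so run this on the physical host, not inside the VM.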
Big data work demands performance these days, and only a 64-bit system can use more than 4 GB of RAM, so nobody considers 32-bit anymore. If you want to do big data, upgrade your machine.
I'd suggest moving to a 64-bit machine. Everything will require 64-bit from here on, so switch now and save yourself another migration later.
OK, I took the advice above and simply reinstalled the Linux system, then rebuilt the LAMP and Hadoop environments. It was time-consuming, but at least it avoided the problem.