Date: Thu, 1 Sep 2011 08:25:31 +0200
From: endzed@gmail.com
To: Clement Laforet <sheepkiller@cultdeadsheep.org>
Cc: freebsd-ports@FreeBSD.org
Subject: Re: [CFT] Hadoop preliminary port
Message-ID: <ED1A2057-F2A6-4093-B312-06F3F060B18C@gmail.com>
In-Reply-To: <20110808124304.GB16138@goofy.cultdeadsheep.org>
References: <20110808091432.GA16138@goofy.cultdeadsheep.org> <20110808124304.GB16138@goofy.cultdeadsheep.org>
On 8 August 2011 at 14:43, Clement Laforet wrote:

> On Mon, Aug 08, 2011 at 11:14:32AM +0200, Clement Laforet wrote:
>> Hi,
>>
>> You can find a preliminary port of hadoop 0.20.203.0 here:
>> http://people.freebsd.org/~clement/hadoop/
>
> Basic hive and pig ports are available here too.

Hello Clem,

I'm currently trying your preliminary ports for hadoop and pig.

For hadoop, I had to run

  su -m hadoop -c 'hadoop namenode -format'

after which everything runs fine. :) Maybe a "namenodeformat" rc command would help first-time users (similar to the postgresql initdb rc command).

Pig was a little more complex to get running, since I needed to set /usr/local/etc/hadoop in the HADOOP_CONF_DIR and PIG_CLASSPATH environment variables (so, if possible, making this path the default would be great). I'm still unable to run it against hadoop; only "pig -x local" works.

Here's what happens:

%pig -x local --version
Apache Pig version 0.9.0 (r1148983)
compiled Jul 20 2011, 17:49:23

%pig -x local
2011-09-01 06:16:25,649 [main] INFO  org.apache.pig.Main - Logging error messages to: /usr/home/hadoop/pig_1314857785643.log
2011-09-01 06:16:26,020 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
grunt> quit;

%pig
2011-09-01 06:16:35,671 [main] INFO  org.apache.pig.Main - Logging error messages to: /usr/home/hadoop/pig_1314857795666.log
2011-09-01 06:16:36,096 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://localhost:9000
2011-09-01 06:16:36,436 [main] ERROR org.apache.pig.Main - ERROR 2999: Unexpected internal error. Failed to create DataStorage
Details at logfile: /usr/home/hadoop/pig_1314857795666.log

%cat /usr/home/hadoop/pig_1314857795666.log
Error before Pig is launched
----------------------------
ERROR 2999: Unexpected internal error. Failed to create DataStorage

java.lang.RuntimeException: Failed to create DataStorage
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
        at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:196)
        at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:116)
        at org.apache.pig.impl.PigContext.connect(PigContext.java:184)
        at org.apache.pig.PigServer.<init>(PigServer.java:243)
        at org.apache.pig.PigServer.<init>(PigServer.java:228)
        at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:46)
        at org.apache.pig.Main.run(Main.java:484)
        at org.apache.pig.Main.main(Main.java:108)
Caused by: java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local exception: java.io.EOFException
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
        at org.apache.hadoop.ipc.Client.call(Client.java:743)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy0.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
        ... 9 more
Caused by: java.io.EOFException
        at java.io.DataInputStream.readInt(DataInputStream.java:375)
        at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
================================================================================

I understand this has something to do with hadoop and pig version compatibility, but since I'm only an end user (i.e. I don't know about ant or java stuff), I'm a little lost, as you can guess... Can you help with this?

Please note that, out of laziness, I'm working directly as the hadoop user (I had to set its home directory and shell, by the way); I haven't tried running pig under another user yet.

Thanks,
David
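For anyone reproducing the setup described in this message, the one-time HDFS format step and the Pig environment can be sketched as follows. The su command is taken from the report above; the export lines reflect the HADOOP_CONF_DIR/PIG_CLASSPATH workaround the author describes, and the exact values are assumptions based on the FreeBSD port's /usr/local/etc/hadoop layout:

```shell
# One-time HDFS initialization, run as the hadoop user
# (first-time setup; do not repeat on an existing filesystem)
su -m hadoop -c 'hadoop namenode -format'

# Point Pig at the port's Hadoop configuration directory
# (csh/tcsh users would use setenv instead of export)
export HADOOP_CONF_DIR=/usr/local/etc/hadoop
export PIG_CLASSPATH=/usr/local/etc/hadoop

# Sanity check: local mode first, then mapreduce mode against the cluster
pig -x local --version
pig
```

This is only a sketch of the reported workaround, not part of the port; as the thread notes, the ERROR 2999 / EOFException in mapreduce mode points at a Hadoop/Pig RPC version mismatch, which no amount of environment tweaking will fix.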