From: endzed@gmail.com
Date: Thu, 1 Sep 2011 08:25:31 +0200
To: Clement Laforet
Cc: freebsd-ports@FreeBSD.org
Subject: Re: [CFT] Hadoop preliminary port

On 8 August 2011, at 14:43, Clement Laforet wrote:

> On Mon, Aug 08, 2011 at 11:14:32AM +0200, Clement Laforet wrote:
>> Hi,
>>
>> You can find a preliminary port of hadoop 0.20.203.0 here:
>> http://people.freebsd.org/~clement/hadoop/
>
> Basic hive and pig ports are available here too.
Hello Clem,

I'm currently trying your preliminary ports for hadoop and pig.

For hadoop I had to run su -m hadoop -c 'hadoop namenode -format' first, after which everything runs fine :)
=> Maybe a "namenode format" rc command would help first-time users (similar to the postgresql initdb rc command); I've put a rough sketch of what I mean at the end of this mail.

Pig was a bit more complex to get running: I had to point the HADOOP_CONF_DIR and PIG_CLASSPATH environment variables at /usr/local/etc/hadoop before it would start (so making that path the default would be great, if possible; see the second sketch at the end). Even so, only pig -x local works; running it against hadoop still fails.

Here's what happens:

%pig -x local --version
Apache Pig version 0.9.0 (r1148983)
compiled Jul 20 2011, 17:49:23

%pig -x local
2011-09-01 06:16:25,649 [main] INFO org.apache.pig.Main - Logging error messages to: /usr/home/hadoop/pig_1314857785643.log
2011-09-01 06:16:26,020 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
grunt> quit;

%pig
2011-09-01 06:16:35,671 [main] INFO org.apache.pig.Main - Logging error messages to: /usr/home/hadoop/pig_1314857795666.log
2011-09-01 06:16:36,096 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://localhost:9000
2011-09-01 06:16:36,436 [main] ERROR org.apache.pig.Main - ERROR 2999: Unexpected internal error. Failed to create DataStorage
Details at logfile: /usr/home/hadoop/pig_1314857795666.log

%cat /usr/home/hadoop/pig_1314857795666.log
Error before Pig is launched
----------------------------
ERROR 2999: Unexpected internal error. Failed to create DataStorage

java.lang.RuntimeException: Failed to create DataStorage
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
        at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:196)
        at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:116)
        at org.apache.pig.impl.PigContext.connect(PigContext.java:184)
        at org.apache.pig.PigServer.<init>(PigServer.java:243)
        at org.apache.pig.PigServer.<init>(PigServer.java:228)
        at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:46)
        at org.apache.pig.Main.run(Main.java:484)
        at org.apache.pig.Main.main(Main.java:108)
Caused by: java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local exception: java.io.EOFException
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
        at org.apache.hadoop.ipc.Client.call(Client.java:743)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy0.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
        ... 9 more
Caused by: java.io.EOFException
        at java.io.DataInputStream.readInt(DataInputStream.java:375)
        at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
================================================================================

I understand this has something to do with hadoop and pig version compatibility, but since I'm only an end user (i.e. I don't know much about ant or java) I'm a bit lost, as you can guess... Can you help with this?

Please note that, out of laziness, I'm working directly as the hadoop user (I had to set its home directory and shell, by the way); I haven't tried running pig as another user yet.

Thanks,
David
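
P.S. Here's the kind of "format on first run" hook I had in mind for the rc script, in the spirit of postgresql's initdb handling. It's only a rough, untested sketch: the function name, the hadoop user and the name directory path are all guesses on my part, not what your port actually installs.

namenode_precmd()
{
	# Format HDFS only if no name directory exists yet (first run),
	# so an already-initialised namenode is never touched.
	if [ ! -d "/var/db/hadoop/dfs/name" ]; then          # guessed dfs.name.dir location
		echo "Formatting HDFS namenode for first use."
		su -m hadoop -c 'hadoop namenode -format'    # same command I ran by hand
	fi
}

start_precmd="namenode_precmd"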
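
P.P.S. For completeness, this is roughly the environment I have to set before pig will start against the installed hadoop config (sh syntax; the csh equivalent would use setenv). The values are just the ones mentioned above, not necessarily what the port should ship as defaults:

export HADOOP_CONF_DIR=/usr/local/etc/hadoop
export PIG_CLASSPATH=/usr/local/etc/hadoop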