From owner-freebsd-pkg-fallout@freebsd.org Thu Oct 29 06:34:29 2015
Return-Path:
Delivered-To: freebsd-pkg-fallout@mailman.ysv.freebsd.org
Received: from mx1.freebsd.org (mx1.freebsd.org [IPv6:2001:1900:2254:206a::19:1])
 by mailman.ysv.freebsd.org (Postfix) with ESMTP id C02E8A20FD3 for ;
 Thu, 29 Oct 2015 06:34:29 +0000 (UTC) (envelope-from pkg-fallout@FreeBSD.org)
Received: from mailman.ysv.freebsd.org (mailman.ysv.freebsd.org [IPv6:2001:1900:2254:206a::50:5])
 by mx1.freebsd.org (Postfix) with ESMTP id AA4321FE9 for ;
 Thu, 29 Oct 2015 06:34:29 +0000 (UTC) (envelope-from pkg-fallout@FreeBSD.org)
Received: by mailman.ysv.freebsd.org (Postfix) id A776BA20FD1;
 Thu, 29 Oct 2015 06:34:29 +0000 (UTC)
Delivered-To: pkg-fallout@mailman.ysv.freebsd.org
Received: from mx1.freebsd.org (mx1.freebsd.org [IPv6:2001:1900:2254:206a::19:1])
 by mailman.ysv.freebsd.org (Postfix) with ESMTP id A6096A20FD0 for ;
 Thu, 29 Oct 2015 06:34:29 +0000 (UTC) (envelope-from pkg-fallout@FreeBSD.org)
Received: from beefy5.nyi.freebsd.org (beefy5.nyi.freebsd.org [IPv6:2610:1c1:1:6080::16:e8])
 (using TLSv1.2 with cipher ECDHE-RSA-AES256-GCM-SHA384 (256/256 bits))
 (Client did not present a certificate)
 by mx1.freebsd.org (Postfix) with ESMTPS id 68F021FE7;
 Thu, 29 Oct 2015 06:34:29 +0000 (UTC) (envelope-from pkg-fallout@FreeBSD.org)
Received: from beefy5.nyi.freebsd.org (localhost [127.0.0.1])
 by beefy5.nyi.freebsd.org (8.15.2/8.15.2) with ESMTP id t9T6YSBg043347;
 Thu, 29 Oct 2015 06:34:28 GMT (envelope-from pkg-fallout@FreeBSD.org)
Received: (from root@localhost)
 by beefy5.nyi.freebsd.org (8.15.2/8.15.2/Submit) id t9T6YSJI043345;
 Thu, 29 Oct 2015 06:34:28 GMT (envelope-from pkg-fallout@FreeBSD.org)
Date: Thu, 29 Oct 2015 06:34:28 GMT
From: pkg-fallout@FreeBSD.org
Message-Id: <201510290634.t9T6YSJI043345@beefy5.nyi.freebsd.org>
To: demon@FreeBSD.org
Subject: [package - 101i386-default][devel/spark] Failed for apache-spark-1.2.1 in build
Cc: pkg-fallout@FreeBSD.org
X-BeenThere: freebsd-pkg-fallout@freebsd.org
X-Mailman-Version: 2.1.20
Precedence: list
List-Id: Fallout logs from package building
List-Unsubscribe: ,
List-Archive:
List-Post:
List-Help:
List-Subscribe: ,
X-List-Received-Date: Thu, 29 Oct 2015 06:34:29 -0000

You are receiving this mail because a port that you maintain is failing to
build on the FreeBSD package build server. Please investigate the failure
and submit a PR to fix the build.
Maintainer:     demon@FreeBSD.org
Last committer: mat@FreeBSD.org
Ident:          $FreeBSD: head/devel/spark/Makefile 386097 2015-05-11 18:34:57Z mat $
Log URL:        http://beefy5.nyi.freebsd.org/data/101i386-default/400425/logs/apache-spark-1.2.1.log
Build URL:      http://beefy5.nyi.freebsd.org/build.html?mastername=101i386-default&build=400425

Log:

====>> Building devel/spark
build started at Thu Oct 29 06:29:31 UTC 2015
port directory: /usr/ports/devel/spark
building for: FreeBSD 101i386-default-job-01 10.1-RELEASE-p23 FreeBSD 10.1-RELEASE-p23 i386
maintained by: demon@FreeBSD.org
Makefile ident: $FreeBSD: head/devel/spark/Makefile 386097 2015-05-11 18:34:57Z mat $
Poudriere version: 3.1.9
Host OSVERSION: 1100079
Jail OSVERSION: 1001000

---Begin Environment---
SHELL=/bin/csh
UNAME_p=i386
UNAME_m=i386
OSVERSION=1001000
UNAME_v=FreeBSD 10.1-RELEASE-p23
UNAME_r=10.1-RELEASE-p23
BLOCKSIZE=K
MAIL=/var/mail/root
STATUS=1
SAVED_TERM=
MASTERMNT=/usr/local/poudriere/data/.m/101i386-default/ref
PATH=/sbin:/bin:/usr/sbin:/usr/bin:/usr/games:/usr/local/sbin:/usr/local/bin:/root/bin
POUDRIERE_BUILD_TYPE=bulk
PKGNAME=apache-spark-1.2.1
OLDPWD=/
PWD=/usr/local/poudriere/data/.m/101i386-default/ref/.p/pool
MASTERNAME=101i386-default
SCRIPTPREFIX=/usr/local/share/poudriere
USER=root
HOME=/root
POUDRIERE_VERSION=3.1.9
SCRIPTPATH=/usr/local/share/poudriere/bulk.sh
LIBEXECPREFIX=/usr/local/libexec/poudriere
LOCALBASE=/usr/local
PACKAGE_BUILDING=yes
---End Environment---

---Begin OPTIONS List---
---End OPTIONS List---

--CONFIGURE_ARGS--

--End CONFIGURE_ARGS--

--CONFIGURE_ENV--
PYTHON="/usr/local/bin/python2.7" XDG_DATA_HOME=/wrkdirs/usr/ports/devel/spark/work XDG_CONFIG_HOME=/wrkdirs/usr/ports/devel/spark/work HOME=/wrkdirs/usr/ports/devel/spark/work TMPDIR="/tmp" SHELL=/bin/sh CONFIG_SHELL=/bin/sh
--End CONFIGURE_ENV--

--MAKE_ENV--
MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m" XDG_DATA_HOME=/wrkdirs/usr/ports/devel/spark/work XDG_CONFIG_HOME=/wrkdirs/usr/ports/devel/spark/work HOME=/wrkdirs/usr/ports/devel/spark/work TMPDIR="/tmp" NO_PIE=yes SHELL=/bin/sh NO_LINT=YES PREFIX=/usr/local LOCALBASE=/usr/local LIBDIR="/usr/lib" CC="cc" CFLAGS="-O2 -pipe -fstack-protector -fno-strict-aliasing" CPP="cpp" CPPFLAGS="" LDFLAGS=" -fstack-protector" LIBS="" CXX="c++" CXXFLAGS="-O2 -pipe -fstack-protector -fno-strict-aliasing " MANPREFIX="/usr/local" BSD_INSTALL_PROGRAM="install -s -m 555" BSD_INSTALL_LIB="install -s -m 444" BSD_INSTALL_SCRIPT="install -m 555" BSD_INSTALL_DATA="install -m 0644" BSD_INSTALL_MAN="install -m 444"
--End MAKE_ENV--

--PLIST_SUB--
SPARK_USER=spark SPARK_GROUP=spark VER=1.2.1 JAVASHAREDIR="share/java" JAVAJARDIR="share/java/classes" PYTHON_INCLUDEDIR=include/python2.7 PYTHON_LIBDIR=lib/python2.7 PYTHON_PLATFORM=freebsd10 PYTHON_SITELIBDIR=lib/python2.7/site-packages PYTHON_VERSION=python2.7 PYTHON_VER=2.7 OSREL=10.1 PREFIX=%D LOCALBASE=/usr/local RESETPREFIX=/usr/local PORTDOCS="" PORTEXAMPLES="" LIB32DIR=lib DOCSDIR="share/doc/spark" EXAMPLESDIR="share/examples/spark" DATADIR="share/spark" WWWDIR="www/spark" ETCDIR="etc/spark"
--End PLIST_SUB--

--SUB_LIST--
SPARK_USER=spark SPARK_GROUP=spark JAVASHAREDIR="/usr/local/share/java" JAVAJARDIR="/usr/local/share/java/classes" JAVALIBDIR="/usr/local/share/java/classes" JAVA_VERSION="1.7+" PREFIX=/usr/local LOCALBASE=/usr/local DATADIR=/usr/local/share/spark DOCSDIR=/usr/local/share/doc/spark EXAMPLESDIR=/usr/local/share/examples/spark WWWDIR=/usr/local/www/spark ETCDIR=/usr/local/etc/spark
--End SUB_LIST--

---Begin make.conf---
MACHINE=i386
MACHINE_ARCH=i386
ARCH=${MACHINE_ARCH}
USE_PACKAGE_DEPENDS=yes
BATCH=yes
WRKDIRPREFIX=/wrkdirs
PORTSDIR=/usr/ports
PACKAGES=/packages
DISTDIR=/distfiles
#### /usr/local/etc/poudriere.d/make.conf ####
DISABLE_MAKE_JOBS=poudriere
---End make.conf---
===================================================
===> License APACHE20 accepted by the user
===========================================================================
===================================================
===> apache-spark-1.2.1 depends on file: /usr/local/sbin/pkg - not found
===> Installing existing package /packages/All/pkg-1.6.1_1.txz
[101i386-default-job-01] Installing pkg-1.6.1_1...
[101i386-default-job-01] Extracting pkg-1.6.1_1: .......... done
Message from pkg-1.6.1_1:
If you are upgrading from the old package format, first run:

  # pkg2ng
===> apache-spark-1.2.1 depends on file: /usr/local/sbin/pkg - found
===> Returning to build of apache-spark-1.2.1
===========================================================================
===================================================
===========================================================================
===================================================
===> License APACHE20 accepted by the user
===> Fetching all distfiles required by apache-spark-1.2.1 for building
===========================================================================
===================================================
===> License APACHE20 accepted by the user
===> Fetching all distfiles required by apache-spark-1.2.1 for building
=> SHA256 Checksum OK for hadoop/spark-1.2.1.tgz.
=> SHA256 Checksum OK for hadoop/FreeBSD-spark-1.2.1-maven-repository.tar.gz.
===========================================================================
===================================================
===========================================================================
===================================================
===> License APACHE20 accepted by the user
===> Fetching all distfiles required by apache-spark-1.2.1 for building
===> Extracting for apache-spark-1.2.1
=> SHA256 Checksum OK for hadoop/spark-1.2.1.tgz.
=> SHA256 Checksum OK for hadoop/FreeBSD-spark-1.2.1-maven-repository.tar.gz.
===========================================================================
===================================================
===========================================================================
===================================================
===> Patching for apache-spark-1.2.1
===> Applying FreeBSD patches for apache-spark-1.2.1
===========================================================================
===================================================
===> apache-spark-1.2.1 depends on file: /usr/local/share/java/maven3/bin/mvn - not found
===> Installing existing package /packages/All/maven3-3.0.5.txz
[101i386-default-job-01] Installing maven3-3.0.5...
[101i386-default-job-01] `-- Installing maven-wrapper-1_2...
[101i386-default-job-01] `-- Extracting maven-wrapper-1_2: . done
[101i386-default-job-01] `-- Installing openjdk8-8.60.24...
[101i386-default-job-01] | `-- Installing giflib-5.0.6...
[101i386-default-job-01] | `-- Extracting giflib-5.0.6: .......... done
[101i386-default-job-01] | `-- Installing libXt-1.1.5,1...
[101i386-default-job-01] | | `-- Installing xproto-7.0.28...
[101i386-default-job-01] | | `-- Extracting xproto-7.0.28: .......... done
[101i386-default-job-01] | | `-- Installing libSM-1.2.2_3,1...
[101i386-default-job-01] | | `-- Installing libICE-1.0.9_1,1...
[101i386-default-job-01] | | `-- Extracting libICE-1.0.9_1,1: .......... done
[101i386-default-job-01] | | `-- Extracting libSM-1.2.2_3,1: .......... done
[101i386-default-job-01] | | `-- Installing libX11-1.6.3,1...
[101i386-default-job-01] | | `-- Installing kbproto-1.0.7...
[101i386-default-job-01] | | `-- Extracting kbproto-1.0.7: .......... done
[101i386-default-job-01] | | `-- Installing libXdmcp-1.1.2...
[101i386-default-job-01] | | `-- Extracting libXdmcp-1.1.2: ......... done
[101i386-default-job-01] | | `-- Installing libxcb-1.11.1...
[101i386-default-job-01] | | | `-- Installing libxml2-2.9.2_3...
[101i386-default-job-01] | | | `-- Extracting libxml2-2.9.2_3: .......... done
[101i386-default-job-01] | | | `-- Installing libpthread-stubs-0.3_6...
[101i386-default-job-01] | | | `-- Extracting libpthread-stubs-0.3_6: ..... done
[101i386-default-job-01] | | | `-- Installing libXau-1.0.8_3...
[101i386-default-job-01] | | | `-- Extracting libXau-1.0.8_3: .......... done
[101i386-default-job-01] | | `-- Extracting libxcb-1.11.1: .......... done
[101i386-default-job-01] | | `-- Extracting libX11-1.6.3,1: .......... done
[101i386-default-job-01] | `-- Extracting libXt-1.1.5,1: .......... done
[101i386-default-job-01] | `-- Installing libXtst-1.2.2_3...
[101i386-default-job-01] | | `-- Installing libXext-1.3.3_1,1...
[101i386-default-job-01] | | `-- Installing xextproto-7.3.0...
[101i386-default-job-01] | | `-- Extracting xextproto-7.3.0: .......... done
[101i386-default-job-01] | | `-- Extracting libXext-1.3.3_1,1: .......... done
[101i386-default-job-01] | | `-- Installing inputproto-2.3.1...
[101i386-default-job-01] | | `-- Extracting inputproto-2.3.1: ..... done
[101i386-default-job-01] | | `-- Installing libXi-1.7.5,1...
[101i386-default-job-01] | | `-- Installing libXfixes-5.0.1_3...
[101i386-default-job-01] | | | `-- Installing fixesproto-5.0...
[101i386-default-job-01] | | | `-- Extracting fixesproto-5.0: .... done
[101i386-default-job-01] | | `-- Extracting libXfixes-5.0.1_3: .......... done
[101i386-default-job-01] | | `-- Extracting libXi-1.7.5,1: .......... done
[101i386-default-job-01] | | `-- Installing recordproto-1.14.2...
[101i386-default-job-01] | | `-- Extracting recordproto-1.14.2: .... done
[101i386-default-job-01] | `-- Extracting libXtst-1.2.2_3: .......... done
[101i386-default-job-01] | `-- Installing java-zoneinfo-2015.f...
[101i386-default-job-01] | `-- Extracting java-zoneinfo-2015.f: .......... done
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ spark-network-shuffle_2.10 ---
[INFO] Source directory: /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/src/main/scala added.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-network-shuffle_2.10 ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-network-shuffle_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/src/main/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ spark-network-shuffle_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[INFO] Compiling 20 Java sources to /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/target/scala-2.10/classes...
[WARNING] warning: [options] bootstrap class path not set in conjunction with -source 1.6
[WARNING] 1 warning
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ spark-network-shuffle_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 20 source files to /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/target/scala-2.10/classes
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-test-source (add-scala-test-sources) @ spark-network-shuffle_2.10 ---
[INFO] Test Source directory: /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/src/test/scala added.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ spark-network-shuffle_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/src/test/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) @ spark-network-shuffle_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[INFO] Compiling 11 Java sources to /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/target/scala-2.10/test-classes...
[WARNING] warning: [options] bootstrap class path not set in conjunction with -source 1.6
[WARNING] 1 warning
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ spark-network-shuffle_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 11 source files to /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/target/scala-2.10/test-classes
[INFO]
[INFO] --- maven-dependency-plugin:2.9:build-classpath (default) @ spark-network-shuffle_2.10 ---
[INFO] Wrote classpath file '/wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/target/spark-test-classpath.txt'.
[INFO]
[INFO] --- gmavenplus-plugin:1.2:execute (default) @ spark-network-shuffle_2.10 ---
[INFO] Using Groovy 2.3.7 to perform execute.
[INFO]
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ spark-network-shuffle_2.10 ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ spark-network-shuffle_2.10 ---
[INFO] Building jar: /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/target/spark-network-shuffle_2.10-1.2.1.jar
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ spark-network-shuffle_2.10 ---
[INFO]
[INFO] --- maven-shade-plugin:2.2:shade (default) @ spark-network-shuffle_2.10 ---
[INFO] Excluding org.apache.spark:spark-network-common_2.10:jar:1.2.1 from the shaded jar.
[INFO] Excluding io.netty:netty-all:jar:4.0.23.Final from the shaded jar.
[INFO] Including org.spark-project.spark:unused:jar:1.0.0 in the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/target/spark-network-shuffle_2.10-1.2.1.jar with /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/target/spark-network-shuffle_2.10-1.2.1-shaded.jar
[INFO] Dependency-reduced POM written at: /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/dependency-reduced-pom.xml
[INFO] Dependency-reduced POM written at: /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/dependency-reduced-pom.xml
[INFO]
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (create-source-jar) @ spark-network-shuffle_2.10 ---
[INFO] Building jar: /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/target/spark-network-shuffle_2.10-1.2.1-sources.jar
[INFO]
[INFO] --- scalastyle-maven-plugin:0.4.0:check (default) @ spark-network-shuffle_2.10 ---
[WARNING] sourceDirectory is not specified or does not exist value=/wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/src/main/scala
Saving to outputFile=/wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/network/shuffle/scalastyle-output.xml
Processed 0 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 1 ms
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Core 1.2.1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-core_2.10 ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @ spark-core_2.10 ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ spark-core_2.10 ---
[INFO] Source directory: /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala added.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-core_2.10 ---
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (default) @ spark-core_2.10 ---
[WARNING] Parameter tasks is deprecated, use target instead
[INFO] Executing tasks

main:
    [unzip] Expanding: /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/python/lib/py4j-0.8.2.1-src.zip into /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/python/build
[INFO] Executed tasks
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 11 resources
[INFO] Copying 23 resources
[INFO] Copying 7 resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ spark-core_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[INFO] Compiling 403 Scala sources and 33 Java sources to /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/target/scala-2.10/classes...
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/scheduler/TaskResultGetter.scala:50: inferred existential type (org.apache.spark.scheduler.DirectTaskResult[_$1], Int) forSome { type _$1 }, which cannot be expressed by wildcards, should be enabled by making the implicit value scala.language.existentials visible.
This can be achieved by adding the import clause 'import scala.language.existentials' or by setting the compiler option -language:existentials.
See the Scala docs for value scala.language.existentials for a discussion why the feature should be explicitly enabled.
[WARNING] val (result, size) = serializer.get().deserialize[TaskResult[_]](serializedData) match {
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/SparkContext.scala:591: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[WARNING] val job = new NewHadoopJob(hadoopConfiguration)
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/SparkContext.scala:637: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[WARNING] val job = new NewHadoopJob(hadoopConfiguration)
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/SparkContext.scala:796: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[WARNING] val job = new NewHadoopJob(conf)
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:168: constructor TaskID in class TaskID is deprecated: see corresponding Javadoc for more information.
[WARNING] new TaskAttemptID(new TaskID(jID.value, true, splitID), attemptID))
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:189: method makeQualified in class Path is deprecated: see corresponding Javadoc for more information.
[WARNING] outputPath.makeQualified(fs)
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala:103: method isDir in class FileStatus is deprecated: see corresponding Javadoc for more information.
[WARNING] if (!fs.getFileStatus(path).isDir) {
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala:157: method isDir in class FileStatus is deprecated: see corresponding Javadoc for more information.
[WARNING] val logDirs = if (logStatus != null) logStatus.filter(_.isDir).toSeq else Seq[FileStatus]()
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/input/PortableDataStream.scala:49: method isDir in class FileStatus is deprecated: see corresponding Javadoc for more information.
[WARNING] if (file.isDir) 0L else file.getLen
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/input/WholeTextFileInputFormat.scala:63: method isDir in class FileStatus is deprecated: see corresponding Javadoc for more information.
[WARNING] if (file.isDir) 0L else file.getLen
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/mapred/SparkHadoopMapRedUtil.scala:56: constructor TaskAttemptID in class TaskAttemptID is deprecated: see corresponding Javadoc for more information.
[WARNING] new TaskAttemptID(jtIdentifier, jobId, isMap, taskId, attemptId)
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/rdd/CheckpointRDD.scala:110: method getDefaultReplication in class FileSystem is deprecated: see corresponding Javadoc for more information.
[WARNING] fs.create(tempOutputPath, false, bufferSize, fs.getDefaultReplication, blockSize)
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala:349: constructor TaskID in class TaskID is deprecated: see corresponding Javadoc for more information.
[WARNING] val taId = new TaskAttemptID(new TaskID(jobID, true, splitId), attemptId)
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:888: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[WARNING] val job = new NewAPIHadoopJob(hadoopConf)
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:956: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[WARNING] val job = new NewAPIHadoopJob(hadoopConf)
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala:202: method isDir in class FileStatus is deprecated: see corresponding Javadoc for more information.
[WARNING] fileStatuses.filter(!_.isDir).map(_.getPath).toSeq
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/main/scala/org/apache/spark/scheduler/InputFormatInfo.scala:106: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[WARNING] val job = new Job(conf)
[WARNING] ^
[WARNING] 17 warnings found
[WARNING] warning: [options] bootstrap class path not set in conjunction with -source 1.6
[WARNING] 1 warning
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ spark-core_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 33 source files to /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/target/scala-2.10/classes
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-test-source (add-scala-test-sources) @ spark-core_2.10 ---
[INFO] Test Source directory: /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/src/test/scala added.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ spark-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) @ spark-core_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[INFO] Compiling 125 Scala sources and 4 Java sources to /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/core/target/scala-2.10/test-classes...
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (malloc) failed to allocate 1218768 bytes for Chunk::new
# An error report file with more information is saved as:
# /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/hs_err_pid25267.log
#
# Compiler replay data is saved as:
# /wrkdirs/usr/ports/devel/spark/work/spark-1.2.1/replay_pid25267.log
*** Error code 1

Stop.
make: stopped in /usr/ports/devel/spark
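The failure itself is the JVM exhausting native memory while the scala-maven-plugin compiles the 125 spark-core test sources: the --MAKE_ENV-- block above shows MAVEN_OPTS requesting a 2 GB heap plus a 512 MB PermGen and a 512 MB reserved code cache, and those fixed reservations together with the JIT's own malloc allocations (the failing Chunk::new) do not reliably fit in a 32-bit i386 address space. A minimal sketch of one possible mitigation in the port Makefile follows; it assumes the port sets MAVEN_OPTS through MAKE_ENV as the log suggests, and the variable layout and reduced values are illustrative, not tested:

    # Hypothetical devel/spark/Makefile fragment: on i386, shrink the
    # JVM's fixed reservations so heap + PermGen + code cache leave
    # room for the compiler's native allocations.
    .if ${ARCH} == i386
    MAVEN_OPTS=	-Xmx1g -XX:MaxPermSize=256m -XX:ReservedCodeCacheSize=128m
    .else
    MAVEN_OPTS=	-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
    .endif
    MAKE_ENV+=	MAVEN_OPTS="${MAVEN_OPTS}"

If no setting is both small enough to fit the address space and large enough to finish the Scala compile, the usual fallback is to declare the port unbuildable on 32-bit targets (for example via ONLY_FOR_ARCHS or a BROKEN_i386 line).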