Date: Thu, 26 Oct 2017 20:11:51 GMT
From: pkg-fallout@FreeBSD.org
To: demon@FreeBSD.org
Cc: pkg-fallout@FreeBSD.org
Subject: [exp - 103i386-default-build-as-user][devel/spark] Failed for apache-spark-2.1.1 in build
Message-ID: <201710262011.v9QKBptW099637@package19.nyi.freebsd.org>
You are receiving this mail because a port that you maintain is failing to build on the FreeBSD package build server. Please investigate the failure and submit a PR to fix the build.

Maintainer: demon@FreeBSD.org
Last committer: demon@FreeBSD.org
Ident: $FreeBSD: head/devel/spark/Makefile 442541 2017-06-04 10:35:54Z demon $
Log URL: http://package19.nyi.freebsd.org/data/103i386-default-build-as-user/452898/logs/apache-spark-2.1.1.log
Build URL: http://package19.nyi.freebsd.org/build.html?mastername=103i386-default-build-as-user&build=452898

Log:

====>> Building devel/spark
build started at Thu Oct 26 19:37:52 UTC 2017
port directory: /usr/ports/devel/spark
building for: FreeBSD 103i386-default-build-as-user-job-03 10.3-RELEASE-p22 FreeBSD 10.3-RELEASE-p22 i386
maintained by: demon@FreeBSD.org
Makefile ident: $FreeBSD: head/devel/spark/Makefile 442541 2017-06-04 10:35:54Z demon $
Poudriere version: 3.1.21-7-g66ad3813
Host OSVERSION: 1200050
Jail OSVERSION: 1003000
Job Id: 03

---Begin Environment---
SHELL=/bin/csh
UNAME_p=i386
UNAME_m=i386
OSVERSION=1003000
UNAME_v=FreeBSD 10.3-RELEASE-p22
UNAME_r=10.3-RELEASE-p22
BLOCKSIZE=K
MAIL=/var/mail/root
STATUS=1
SAVED_TERM=
MASTERMNT=/poudriere/data/.m/103i386-default-build-as-user/ref
UID=0
PATH=/sbin:/bin:/usr/sbin:/usr/bin:/usr/games:/usr/local/sbin:/usr/local/bin:/root/bin
POUDRIERE_BUILD_TYPE=bulk
PKGNAME=apache-spark-2.1.1
OLDPWD=/
PWD=/poudriere/data/.m/103i386-default-build-as-user/ref/.p/pool
MASTERNAME=103i386-default-build-as-user
SCRIPTPREFIX=/usr/local/share/poudriere
USER=root
HOME=/root
POUDRIERE_VERSION=3.1.21-7-g66ad3813
SCRIPTPATH=/usr/local/share/poudriere/bulk.sh
GID=0
LIBEXECPREFIX=/usr/local/libexec/poudriere
LOCALBASE=/usr/local
POUDRIEREPATH=/usr/local/bin/poudriere
---End Environment---

---Begin Poudriere Port Flags/Env---
PORT_FLAGS=
PKGENV=
---End Poudriere Port Flags/Env---

---Begin OPTIONS List---
---End OPTIONS List---

--CONFIGURE_ARGS--
--End CONFIGURE_ARGS--

--CONFIGURE_ENV--
PYTHON="/usr/local/bin/python2.7" XDG_DATA_HOME=/wrkdirs/usr/ports/devel/spark/work XDG_CONFIG_HOME=/wrkdirs/usr/ports/devel/spark/work HOME=/wrkdirs/usr/ports/devel/spark/work TMPDIR="/tmp" PATH=/wrkdirs/usr/ports/devel/spark/work/.bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/games:/usr/local/sbin:/usr/local/bin:/root/bin SHELL=/bin/sh CONFIG_SHELL=/bin/sh
--End CONFIGURE_ENV--

--MAKE_ENV--
JAVA_HOME=/usr/local/openjdk8 MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m" XDG_DATA_HOME=/wrkdirs/usr/ports/devel/spark/work XDG_CONFIG_HOME=/wrkdirs/usr/ports/devel/spark/work HOME=/wrkdirs/usr/ports/devel/spark/work TMPDIR="/tmp" PATH=/wrkdirs/usr/ports/devel/spark/work/.bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/games:/usr/local/sbin:/usr/local/bin:/root/bin NO_PIE=yes WITHOUT_DEBUG_FILES=yes WITHOUT_KERNEL_SYMBOLS=yes SHELL=/bin/sh NO_LINT=YES PREFIX=/usr/local LOCALBASE=/usr/local LIBDIR="/usr/lib" CC="cc" CFLAGS="-O2 -pipe -fstack-protector -fno-strict-aliasing" CPP="cpp" CPPFLAGS="" LDFLAGS=" -fstack-protector" LIBS="" CXX="c++" CXXFLAGS="-O2 -pipe -fstack-protector -fno-strict-aliasing " MANPREFIX="/usr/local" BSD_INSTALL_PROGRAM="install -s -m 555" BSD_INSTALL_LIB="install -s -m 0644" BSD_INSTALL_SCRIPT="install -m 555" BSD_INSTALL_DATA="install -m 0644" BSD_INSTALL_MAN="install -m 444"
--End MAKE_ENV--

--PLIST_SUB--
SPARK_USER=spark SPARK_GROUP=spark VER=2.1.1 JAVASHAREDIR="share/java" JAVAJARDIR="share/java/classes" PYTHON_INCLUDEDIR=include/python2.7 PYTHON_LIBDIR=lib/python2.7 PYTHON_PLATFORM=freebsd10 PYTHON_PYOEXTENSION=pyo PYTHON_SITELIBDIR=lib/python2.7/site-packages PYTHON_SUFFIX=27 PYTHON_VER=2.7 PYTHON_VERSION=python2.7 PYTHON2="" PYTHON3="@comment " OSREL=10.3 PREFIX=%D LOCALBASE=/usr/local RESETPREFIX=/usr/local PORTDOCS="" PORTEXAMPLES="" LIB32DIR=lib DOCSDIR="share/doc/spark" EXAMPLESDIR="share/examples/spark" DATADIR="share/spark" WWWDIR="www/spark" ETCDIR="etc/spark"
--End PLIST_SUB--

--SUB_LIST--
SPARK_USER=spark SPARK_GROUP=spark JAVASHAREDIR="/usr/local/share/java" JAVAJARDIR="/usr/local/share/java/classes" JAVALIBDIR="/usr/local/share/java/classes" JAVA_VERSION="1.7+" PREFIX=/usr/local LOCALBASE=/usr/local DATADIR=/usr/local/share/spark DOCSDIR=/usr/local/share/doc/spark EXAMPLESDIR=/usr/local/share/examples/spark WWWDIR=/usr/local/www/spark ETCDIR=/usr/local/etc/spark
--End SUB_LIST--

---Begin make.conf---
USE_PACKAGE_DEPENDS=yes
BATCH=yes
WRKDIRPREFIX=/wrkdirs
PORTSDIR=/usr/ports
PACKAGES=/packages
DISTDIR=/distfiles
FORCE_PACKAGE=yes
PACKAGE_BUILDING=yes
MACHINE=i386
MACHINE_ARCH=i386
ARCH=${MACHINE_ARCH}
#### /usr/local/etc/poudriere.d/make.conf ####
# Build ALLOW_MAKE_JOBS_PACKAGES with 2 jobs
MAKE_JOBS_NUMBER=2
#### /usr/ports/Mk/Scripts/ports_env.sh ####
ARCH=i386
CONFIGURE_MAX_CMD_LEN=262144
OPSYS=FreeBSD
OSREL=10.3
OSVERSION=1003000
PYTHONBASE=/usr/local
_JAVA_OS_LIST_REGEXP=native|linux
_JAVA_VENDOR_LIST_REGEXP=openjdk|oracle|sun
_JAVA_VERSION_LIST_REGEXP=1.6|1.7|1.8|1.9|1.6\+|1.7\+|1.8\+|1.9\+
_OSRELEASE=10.3-RELEASE-p22
#### Misc Poudriere ####
DISABLE_MAKE_JOBS=poudriere
---End make.conf---

--Resource limits--
cpu time (seconds, -t) unlimited
file size (512-blocks, -f) unlimited
data seg size (kbytes, -d) 524288
stack size (kbytes, -s) 65536
core file size (512-blocks, -c) unlimited
max memory size (kbytes, -m) unlimited
locked memory (kbytes, -l) unlimited
max user processes (-u) 89999
open files (-n) 1024
virtual mem size (kbytes, -v) unlimited
swap limit (kbytes, -w) unlimited
sbsize (bytes, -b) unlimited
pseudo-terminals (-p) unlimited
--End resource limits--

=======================<phase: check-sanity >============================
===> License APACHE20 accepted by the user
===========================================================================
=======================<phase: pkg-depends >============================
===> apache-spark-2.1.1 depends on file: /usr/local/sbin/pkg - not found
===> Installing existing package /packages/All/pkg-1.10.1.txz
[103i386-default-build-as-user-job-03] Installing pkg-1.10.1...
[103i386-default-build-as-user-job-03] Extracting pkg-1.10.1: .......... done
===> apache-spark-2.1.1 depends on file: /usr/local/sbin/pkg - found
===> Returning to build of apache-spark-2.1.1
===========================================================================
=======================<phase: fetch-depends >============================
===========================================================================
=======================<phase: fetch >============================
===> License APACHE20 accepted by the user
===> Fetching all distfiles required by apache-spark-2.1.1 for building
===========================================================================
=======================<phase: checksum >============================
===> License APACHE20 accepted by the user
===> Fetching all distfiles required by apache-spark-2.1.1 for building
=> SHA256 Checksum OK for hadoop/spark-2.1.1.tgz.
=> SHA256 Checksum OK for hadoop/FreeBSD-spark-2.1.1-maven-repository.tar.gz.
=========================================================================== =======================<phase: extract-depends>============================ =========================================================================== =======================<phase: extract >============================ ===> License APACHE20 accepted by the user ===> Fetching all distfiles required by apache-spark-2.1.1 for building ===> Extracting for apache-spark-2.1.1 => SHA256 Checksum OK for hadoop/spark-2.1.1.tgz. => SHA256 Checksum OK for hadoop/FreeBSD-spark-2.1.1-maven-repository.tar.gz. =========================================================================== =======================<phase: patch-depends >============================ =========================================================================== =======================<phase: patch >============================ ===> Patching for apache-spark-2.1.1 ===> Applying FreeBSD patches for apache-spark-2.1.1 =========================================================================== =======================<phase: build-depends >============================ ===> apache-spark-2.1.1 depends on file: /usr/local/share/java/maven33/bin/mvn - not found ===> Installing existing package /packages/All/maven33-3.3.9.txz [103i386-default-build-as-user-job-03] Installing maven33-3.3.9... [103i386-default-build-as-user-job-03] `-- Installing maven-wrapper-1_2... [103i386-default-build-as-user-job-03] `-- Extracting maven-wrapper-1_2: .. done [103i386-default-build-as-user-job-03] `-- Installing openjdk8-8.144.1... [103i386-default-build-as-user-job-03] | `-- Installing alsa-lib-1.1.2... [103i386-default-build-as-user-job-03] | `-- Extracting alsa-lib-1.1.2: .......... done <snip> [INFO] --- maven-compiler-plugin:3.5.1:testCompile (default-testCompile) @ spark-tools_2.11 --- [INFO] No sources to compile [INFO] [INFO] --- maven-dependency-plugin:2.10:build-classpath (generate-test-classpath) @ spark-tools_2.11 --- [INFO] [INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ spark-tools_2.11 --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-surefire-plugin:2.19.1:test (test) @ spark-tools_2.11 --- [INFO] Tests are skipped. [INFO] [INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-tools_2.11 --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.6:test-jar (prepare-test-jar) @ spark-tools_2.11 --- [INFO] Building jar: /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/tools/target/spark-tools_2.11-2.1.1-tests.jar [INFO] [INFO] --- maven-jar-plugin:2.6:jar (default-jar) @ spark-tools_2.11 --- [INFO] Building jar: /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/tools/target/spark-tools_2.11-2.1.1.jar [INFO] [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ spark-tools_2.11 --- [INFO] [INFO] --- maven-shade-plugin:2.4.3:shade (default) @ spark-tools_2.11 --- [INFO] Excluding org.scala-lang:scala-reflect:jar:2.11.8 from the shaded jar. [INFO] Excluding org.scala-lang:scala-library:jar:2.11.8 from the shaded jar. [INFO] Excluding org.scala-lang:scala-compiler:jar:2.11.8 from the shaded jar. [INFO] Excluding org.scala-lang.modules:scala-xml_2.11:jar:1.0.4 from the shaded jar. [INFO] Excluding org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.4 from the shaded jar. [INFO] Excluding org.clapper:classutil_2.11:jar:1.0.6 from the shaded jar. [INFO] Excluding org.ow2.asm:asm:jar:5.0.2 from the shaded jar. [INFO] Excluding org.ow2.asm:asm-commons:jar:5.0.2 from the shaded jar. 
[INFO] Excluding org.ow2.asm:asm-tree:jar:5.0.2 from the shaded jar. [INFO] Excluding org.ow2.asm:asm-util:jar:5.0.2 from the shaded jar. [INFO] Excluding org.clapper:grizzled-scala_2.11:jar:1.4.0 from the shaded jar. [INFO] Excluding org.scala-lang.modules:scala-async_2.11:jar:0.9.1 from the shaded jar. [INFO] Excluding jline:jline:jar:2.12.1 from the shaded jar. [INFO] Excluding org.clapper:grizzled-slf4j_2.11:jar:1.0.2 from the shaded jar. [INFO] Excluding org.slf4j:slf4j-api:jar:1.7.16 from the shaded jar. [INFO] Including org.spark-project.spark:unused:jar:1.0.0 in the shaded jar. [INFO] Replacing original artifact with shaded artifact. [INFO] Replacing /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/tools/target/spark-tools_2.11-2.1.1.jar with /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/tools/target/spark-tools_2.11-2.1.1-shaded.jar [INFO] Dependency-reduced POM written at: /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/tools/dependency-reduced-pom.xml [INFO] Dependency-reduced POM written at: /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/tools/dependency-reduced-pom.xml [INFO] [INFO] --- maven-source-plugin:2.4:jar-no-fork (create-source-jar) @ spark-tools_2.11 --- [INFO] Building jar: /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/tools/target/spark-tools_2.11-2.1.1-sources.jar [INFO] [INFO] --- maven-source-plugin:2.4:test-jar-no-fork (create-source-jar) @ spark-tools_2.11 --- [INFO] Building jar: /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/tools/target/spark-tools_2.11-2.1.1-test-sources.jar [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Spark Project Hive 2.1.1 [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:3.0.0:clean (default-clean) @ spark-hive_2.11 --- [INFO] [INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-hive_2.11 --- [INFO] [INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-hive_2.11 --- [INFO] Add Source directory: /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala [INFO] Add Test Source directory: /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/test/scala [INFO] [INFO] --- maven-dependency-plugin:2.10:build-classpath (default-cli) @ spark-hive_2.11 --- [INFO] Dependencies classpath: /wrkdirs/usr/ports/devel/spark/work/m2/mx4j/mx4j/3.0.2/mx4j-3.0.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-annotations/2.7.2/hadoop-annotations-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/launcher/target/spark-launcher_2.11-2.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/io/dropwizard/metrics/metrics-core/3.1.2/metrics-core-3.1.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/univocity/univocity-parsers/2.2.1/univocity-parsers-2.2.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/htrace/htrace-core/3.1.0-incubating/htrace-core-3.1.0-incubating.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/net/java/dev/jets3t/jets3t/0.9.3/jets3t-0.9.3.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/fasterxml/jackson/core/jackson-databind/2.6.5/jackson-d 
atabind-2.6.5.jar:/wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/common/network-common/target/spark-network-common_2.11-2.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.7.2/hadoop-mapreduce-client-shuffle-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/io/dropwizard/metrics/metrics-jvm/3.1.2/metrics-jvm-3.1.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/codehaus/janino/janino/3.0.0/janino-3.0.0.jar:/wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/common/sketch/target/spark-sketch_2.11-2.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/googlecode/javaewah/JavaEWAH/0.3.2/JavaEWAH-0.3.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar:/wrkdirs/usr/port s/devel/spark/work/spark-2.1.1/common/networ! k-shuffle/target/spark-network-shuffle_2.11-2.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/jersey/bundles/repackaged/jersey-guava/2.22.2/jersey-guava-2.22.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/io/dropwizard/metrics/metrics-json/3.1.2/metrics-json-3.1.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/commons/commons-crypto/1.0.0/commons-crypto-1.0.0.jar:/wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/core/target/spark-sql_2.11-2.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-io/commons-io/2.4/commons-io-2.4.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-dbcp/commons-dbcp/1.4/commons-dbcp-1.4.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/commons/commons-lang3/3.5/commons-lang3-3.5.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/codehaus/jackson/jackson-jaxrs/1.9.13/jackson-jaxrs-1.9.13.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-mapreduce-client-core/2.7.2/hadoop-mapreduce-client-core-2.7.2.jar:/wrkdirs/usr/po rts/devel/spark/work/m2/org/scala-lang/scala-reflect/2.11.8/scala-reflect-2.11.8.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/json4s/json4s-jackson_2.11/3.2.11/json4s-jackson_2.11-3.2.11.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/esotericsoftware/minlog/1.3.0/minlog-1.3.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javax/mail/mail/1.4.7/mail-1.4.7.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/antlr/ST4/4.0.4/ST4-4.0.4.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javax/validation/validation-api/1.1.0.Final/validation-api-1.1.0.Final.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/bouncycastle/bcprov-jdk15on/1.51/bcprov-jdk15on-1.51.jar:/wrkdirs/usr/ports/devel/spark/work/m2/net/jpountz/lz4/lz4/1.3.0/lz4-1.3.0.jar:/wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/catalyst/target/spark-catalyst_2.11-2.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/jersey/co re/jersey-common/2.22.2/jersey-common-2.22.2! .jar:/wrk! 
dirs/usr/ports/devel/spark/work/spark-2.1.1/common/unsafe/target/spark-unsafe_2.11-2.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/joda-time/joda-time/2.9.3/joda-time-2.9.3.jar:/wrkdirs/usr/ports/devel/spark/work/m2/io/netty/netty/3.8.0.Final/netty-3.8.0.Final.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/iq80/snappy/snappy/0.2/snappy-0.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.7.2/hadoop-mapreduce-client-jobclient-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/thrift/libthrift/0.9.3/libthrift-0.9.3.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/hk2/hk2-api/2.4.0-b34/hk2-api-2.4.0-b34.jar:/wrkdirs/usr/ports/devel/spark/work/m2/net/sf/opencsv/opencsv/2.3/opencsv-2.3.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoo p/hadoop-mapreduce-client-common/2.7.2/hadoop-mapreduce-client-common-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/jersey/core/jersey-server/2.22.2/jersey-server-2.22.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/twitter/chill-java/0.8.0/chill-java-0.8.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/spark-project/hive/hive-exec/1.2.1.spark2/hive-exec-1.2.1.spark2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/tukaani/xz/1.0/xz-1.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javax/activation/activation/1.1.1/activation-1.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javolution/javolution/5.5.1/javolution-5.5.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/jolbox/bonecp/0.8.0.RELEASE/bonecp-0.8.0.RELEASE.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-client/2 .7.2/hadoop-client-2.7.2.jar:/wrkdirs/usr/po! rts/devel! 
/spark/work/m2/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/javassist/javassist/3.18.1-GA/javassist-3.18.1-GA.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/ivy/ivy/2.4.0/ivy-2.4.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-hdfs/2.7.2/hadoop-hdfs-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javax/servlet/javax.servlet-api/3.1.0/javax.servlet-api-3.1.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/hk2/osgi-resource-locator/1.0.1/osgi-resource-locator-1.0.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/common/tags/target/spark-tags_2.11-2.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javax/transaction/jta/1.1/jta-1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/ja vax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/avro/avro/1.7.7/avro-1.7.7.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/codehaus/jackson/jackson-xc/1.9.13/jackson-xc-1.9.13.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-yarn-client/2.7.2/hadoop-yarn-client-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/hk2/hk2-utils/2.4.0-b34/hk2-utils-2.4.0-b34.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/hk2/external/aopalliance-repackaged/2.4.0-b34/aopalliance-repackaged-2.4.0-b34.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/hk2/external/javax.inject/2.4.0-b34/javax.inject-2.4.0-b34.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/jersey/containers/jersey-container-servlet-core/2.22.2/jersey-container-servlet-core-2.22.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/c alcite/calcite-core/1.2.0-incubating/calcite! -core-1.2! 
.0-incubating.jar:/wrkdirs/usr/ports/devel/spark/work/m2/oro/oro/2.0.8/oro-2.0.8.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/scala-lang/scala-compiler/2.11.8/scala-compiler-2.11.8.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/curator/curator-framework/2.6.0/curator-framework-2.6.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar:/wrkdirs/usr/ports/devel/spark/work/m2/stax/stax-api/1.0.1/stax-api-1.0.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/calcite/calcite-avatica/1.2.0-incubating/calcite-avatica-1.2.0-incubating.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/roaringbitmap/RoaringBitmap/0.5.11/RoaringBitmap-0.5.11.jar:/wrkdirs/usr/ports/devel/spark/work/m2/log4j/log4j/1.2.17/log4j-1.2.17.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/twitter/parquet-hadoop-bundle/1.6.0/parquet-hadoop-bundle-1.6.0.jar:/wrkdirs/usr/ports/devel/spark/work /m2/org/scala-lang/scala-library/2.11.8/scala-library-2.11.8.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/datanucleus/datanucleus-core/3.2.10/datanucleus-core-3.2.10.jar:/wrkdirs/usr/ports/devel/spark/work/m2/io/netty/netty-all/4.0.42.Final/netty-all-4.0.42.Final.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/directory/server/apacheds-kerberos-codec/2.0.0-M15/apacheds-kerberos-codec-2.0.0-M15.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/curator/curator-recipes/2.6.0/curator-recipes-2.6.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/fasterxml/jackson/core/jackson-annotations/2.6.5/jackson-annotations-2.6.5.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/jamesmurty/utils/java-xmlbuilder/1.0/java-xmlbuilder-1.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/o rg/apache/calcite/calcite-linq4j/1.2.0-incub! ating/cal! 
cite-linq4j-1.2.0-incubating.jar:/wrkdirs/usr/ports/devel/spark/work/m2/antlr/antlr/2.7.7/antlr-2.7.7.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-pool/commons-pool/1.5.4/commons-pool-1.5.4.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/slf4j/slf4j-api/1.7.16/slf4j-api-1.7.16.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/xbean/xbean-asm5-shaded/4.4/xbean-asm5-shaded-4.4.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/json4s/json4s-core_2.11/3.2.11/json4s-core_2.11-3.2.11.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/httpcomponents/httpclient/4.5.2/httpclient-4.5.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/jodd/jodd-core/3.5.2/jodd-core-3.5.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar:/wrkdirs/usr /ports/devel/spark/work/m2/org/antlr/antlr4-runtime/4.5.3/antlr4-runtime-4.5.3.jar:/wrkdirs/usr/ports/devel/spark/work/m2/net/hydromatic/eigenbase-properties/1.1.5/eigenbase-properties-1.1.5.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/esotericsoftware/kryo-shaded/3.0.3/kryo-shaded-3.0.3.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javax/ws/rs/javax.ws.rs-api/2.0.1/javax.ws.rs-api-2.0.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/jersey/core/jersey-client/2.22.2/jersey-client-2.22.2.jar:/wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/core/target/spark-core_2.11-2.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/avro/avro-mapred/1.7.7/avro-mapred-1.7.7-hadoop2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/fasterxml/jackson/module/jackson-module-paranamer/2.6.5/jac kson-module-paranamer-2.6.5.jar:/wrkdirs/usr! /ports/de! 
vel/spark/work/m2/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/ning/compress-lzf/1.0.3/compress-lzf-1.0.3.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/spark-project/hive/hive-metastore/1.2.1.spark2/hive-metastore-1.2.1.spark2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/directory/api/api-util/1.0.0-M20/api-util-1.0.0-M20.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/fasterxml/jackson/module/jackson-module-scala_2.11/2.6.5/jackson-module-scala_2.11-2.6.5.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-net/commons-net/2.2/commons-net-2.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/derby/derby/10.12.1.1/derby-10.12.1.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/google/code/gson/gson/2.2.4/gson-2.2.4.jar:/wrkdirs/usr/ports/deve l/spark/work/m2/org/json4s/json4s-ast_2.11/3.2.11/json4s-ast_2.11-3.2.11.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/directory/api/api-asn1-api/1.0.0-M20/api-asn1-api-1.0.0-M20.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/scala-lang/scalap/2.11.8/scalap-2.11.8.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/clearspring/analytics/stream/2.7.0/stream-2.7.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/curator/curator-client/2.6.0/curator-client-2.6.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/thrift/libfb303/0.9.3/libfb303-0.9.3.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/jersey/containers/jersey-container-servlet/2.22.2/jersey-container-servlet-2.22.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/slf4j/jul-to-slf4j/1.7.16/jul -to-slf4j-1.7.16.jar:/wrkdirs/usr/ports/deve! l/spark/w! 
ork/m2/org/apache/directory/server/apacheds-i18n/2.0.0-M15/apacheds-i18n-2.0.0-M15.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-mapreduce-client-app/2.7.2/hadoop-mapreduce-client-app-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/codehaus/janino/commons-compiler/3.0.0/commons-compiler-3.0.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/scala-lang/modules/scala-xml_2.11/1.0.2/scala-xml_2.11-1.0.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/fasterxml/jackson/core/jackson-core/2.6.5/jackson-core-2.6.5.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/scala-lang/modules/scala-parser-combinators_2.11/1.0.4/scala-parser-combinators_2.11-1.0.4.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/io/dropwizard/metrics/metrics-graphite/3.1.2/metrics-graphite-3.1.2.jar:/wrkdirs/us r/ports/devel/spark/work/m2/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/net/iharder/base64/2.3.8/base64-2.3.8.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/objenesis/objenesis/2.1/objenesis-2.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/javax/annotation/javax.annotation-api/1.2/javax.annotation-api-1.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/xerial/snappy/snappy-java/1.1.2.6/snappy-java-1.1.2.6.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-yarn-server-common/2.7.2/hadoop-yarn-server-common-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/jersey/media/jersey-media-jaxb/2.22.2/jersey-media-jaxb-2.22.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7.jar:/wrkdirs/usr/ports/devel/ spark/work/m2/org/apache/hadoop/hadoop-yarn-! common/2.! 
7.2/hadoop-yarn-common-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/net/sf/py4j/py4j/0.10.4/py4j-0.10.4.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/datanucleus/datanucleus-rdbms/3.2.9/datanucleus-rdbms-3.2.9.jar:/wrkdirs/usr/ports/devel/spark/work/m2/com/twitter/chill_2.11/0.8.0/chill_2.11-0.8.0.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-common/2.7.2/hadoop-common-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-auth/2.7.2/hadoop-auth-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/glassfish/hk2/hk2-locator/2.4.0-b34/hk2-locator-2.4.0-b34.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/hadoop/hadoop-yarn-api/2.7.2/hadoop-yarn-api-2.7.2.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar:/wrkdirs/usr/ports/devel/spark/work/m2/net/razorvine/pyrolite/4 .13/pyrolite-4.13.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/datanucleus/datanucleus-api-jdo/3.2.6/datanucleus-api-jdo-3.2.6.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/slf4j/jcl-over-slf4j/1.7.16/jcl-over-slf4j-1.7.16.jar:/wrkdirs/usr/ports/devel/spark/work/m2/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/wrkdirs/usr/ports/devel/spark/work/m2/org/apache/httpcomponents/httpcore/4.4.4/httpcore-4.4.4.jar [INFO] [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-hive_2.11 --- [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-hive_2.11 --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 1 resource [INFO] Copying 3 resources [INFO] [INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-hive_2.11 --- [WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile [INFO] Using incremental compilation [INFO] Compiling 28 Scala sources and 2 Java sources to /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/target/scala-2.11/classes... [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala:89: trait Deserializer in package serde2 is deprecated: see corresponding Javadoc for more information. [WARNING] Utils.classForName(relation.tableDesc.getSerdeClassName).asInstanceOf[Class[Deserializer]], [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala:103: trait Deserializer in package serde2 is deprecated: see corresponding Javadoc for more information. [WARNING] deserializerClass: Class[_ <: Deserializer], [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala:137: trait Deserializer in package serde2 is deprecated: see corresponding Javadoc for more information. [WARNING] (part, part.getDeserializer.getClass.asInstanceOf[Class[Deserializer]])).toMap [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala:152: trait Deserializer in package serde2 is deprecated: see corresponding Javadoc for more information. 
[WARNING] partitionToDeserializer: Map[HivePartition, Class[_ <: Deserializer]], [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala:157: trait Deserializer in package serde2 is deprecated: see corresponding Javadoc for more information. [WARNING] partitionToDeserializer: Map[HivePartition, Class[_ <: Deserializer]]): [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala:158: trait Deserializer in package serde2 is deprecated: see corresponding Javadoc for more information. [WARNING] Map[HivePartition, Class[_ <: Deserializer]] = { [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala:360: trait Deserializer in package serde2 is deprecated: see corresponding Javadoc for more information. [WARNING] rawDeser: Deserializer, [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala:363: trait Deserializer in package serde2 is deprecated: see corresponding Javadoc for more information. [WARNING] tableDeser: Deserializer): Iterator[InternalRow] = { [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala:424: method initialize in class AbstractSerDe is deprecated: see corresponding Javadoc for more information. [WARNING] serde.initialize(null, properties) [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveUDFs.scala:204: method initialize in class GenericUDTF is deprecated: see corresponding Javadoc for more information. [WARNING] protected lazy val outputInspector = function.initialize(inputInspectors.toArray) [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveUDFs.scala:287: class UDAF in package exec is deprecated: see corresponding Javadoc for more information. [WARNING] new GenericUDAFBridge(funcWrapper.createFunction[UDAF]()) [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveUDFs.scala:315: trait AggregationBuffer in object GenericUDAFEvaluator is deprecated: see corresponding Javadoc for more information. [WARNING] private[this] var buffer: GenericUDAFEvaluator.AggregationBuffer = _ [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveWriterContainers.scala:158: trait Serializer in package serde2 is deprecated: see corresponding Javadoc for more information. [WARNING] def newSerializer(tableDesc: TableDesc): Serializer = { [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveWriterContainers.scala:159: trait Serializer in package serde2 is deprecated: see corresponding Javadoc for more information. [WARNING] val serializer = tableDesc.getDeserializerClass.newInstance().asInstanceOf[Serializer] [WARNING] ^ [WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveWriterContainers.scala:181: trait Serializer in package serde2 is deprecated: see corresponding Javadoc for more information. 
[WARNING] val (serializer, standardOI, fieldOIs, dataTypes, wrappers, outputData) = prepareForWrite()
[WARNING] ^
[WARNING] /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveWriterContainers.scala:251: trait Serializer in package serde2 is deprecated: see corresponding Javadoc for more information.
[WARNING] val (serializer, standardOI, fieldOIs, dataTypes, wrappers, outputData) = prepareForWrite()
[WARNING] ^
[WARNING] 16 warnings found
[WARNING] warning: [options] bootstrap class path not set in conjunction with -source 1.7
[WARNING] 1 warning
[INFO]
[INFO] --- maven-compiler-plugin:3.5.1:compile (default-compile) @ spark-hive_2.11 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/target/scala-2.11/classes
[INFO]
[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-hive_2.11 ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/target/tmp
[INFO] Executed tasks
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ spark-hive_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 9391 resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ spark-hive_2.11 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] Compiling 75 Scala sources and 14 Java sources to /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1/sql/hive/target/scala-2.11/test-classes...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 7.485 s]
[INFO] Spark Project Tags ................................. SUCCESS [ 40.663 s]
[INFO] Spark Project Sketch ............................... SUCCESS [ 19.377 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 24.080 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 16.153 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 35.336 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 19.760 s]
[INFO] Spark Project Core ................................. SUCCESS [05:35 min]
[INFO] Spark Project ML Local Library ..................... SUCCESS [01:17 min]
[INFO] Spark Project GraphX ............................... SUCCESS [01:45 min]
[INFO] Spark Project Streaming ............................ SUCCESS [03:06 min]
[INFO] Spark Project Catalyst ............................. SUCCESS [05:17 min]
[INFO] Spark Project SQL .................................. SUCCESS [06:26 min]
[INFO] Spark Project ML Library ........................... SUCCESS [04:55 min]
[INFO] Spark Project Tools ................................ SUCCESS [ 15.488 s]
[INFO] Spark Project Hive ................................. FAILURE [01:52 min]
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
[INFO] Kafka 0.10 Source for Structured Streaming ......... SKIPPED
[INFO] Spark Project Java 8 Tests ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 33:17 min
[INFO] Finished at: 2017-10-26T20:11:49+00:00
[INFO] Final Memory: 89M/1057M
[INFO] ------------------------------------------------------------------------
[ERROR] OutOfMemoryError -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
*** Error code 1

Stop.
make: stopped in /usr/ports/devel/spark
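The failure itself is Maven running out of memory while compiling the 75 Scala and 14 Java test sources of the Hive module; MAVEN_OPTS above already requests -Xmx2g, which is close to the practical ceiling for a 32-bit i386 JVM. Below is a minimal sketch of how the failing module might be re-run by hand with the switches Maven itself suggests, assuming the poudriere work tree is still present and using the pre-seeded repository under work/m2; the 1g heap value is an assumption for illustration, not a value taken from this log.

  # Hypothetical manual re-run of only the Hive module inside the work tree.
  # -e and -X are the switches Maven recommends above for a full stack trace
  # and debug logging; paths come from the log, the heap size is a guess.
  cd /wrkdirs/usr/ports/devel/spark/work/spark-2.1.1
  env JAVA_HOME=/usr/local/openjdk8 \
      MAVEN_OPTS="-Xmx1g -XX:ReservedCodeCacheSize=256m" \
      /usr/local/share/java/maven33/bin/mvn -e -X -pl sql/hive -DskipTests \
      -Dmaven.repo.local=/wrkdirs/usr/ports/devel/spark/work/m2 package

If such a run still ends in OutOfMemoryError, the heap and ReservedCodeCacheSize values passed through the port's MAVEN_OPTS are the obvious knobs to experiment with on this i386 builder.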