Date: Tue, 1 Jan 2019 01:14:42 GMT
From: pkg-fallout@FreeBSD.org
To: pkg-fallout@FreeBSD.org
Subject: [package - 120amd64-default][devel/hadoop2] Failed for hadoop2-2.7.2_2 in build
Message-ID: <201901010114.x011EgY5073765@beefy6.nyi.freebsd.org>
You are receiving this mail as a port that you maintain is failing to build on the FreeBSD package build server. Please investigate the failure and submit a PR to fix build. Maintainer: ports@FreeBSD.org Last committer: demon@FreeBSD.org Ident: $FreeBSD: head/devel/hadoop2/Makefile 487877 2018-12-20 11:13:34Z demon $ Log URL: http://beefy6.nyi.freebsd.org/data/120amd64-default/488857/logs/hadoop2-2.7.2_2.log Build URL: http://beefy6.nyi.freebsd.org/build.html?mastername=120amd64-default&build=488857 Log: =>> Building devel/hadoop2 build started at Tue Jan 1 01:13:18 UTC 2019 port directory: /usr/ports/devel/hadoop2 package name: hadoop2-2.7.2_2 building for: FreeBSD 120amd64-default-job-21 12.0-RELEASE-p1 FreeBSD 12.0-RELEASE-p1 amd64 maintained by: ports@FreeBSD.org Makefile ident: $FreeBSD: head/devel/hadoop2/Makefile 487877 2018-12-20 11:13:34Z demon $ Poudriere version: 3.2.8 Host OSVERSION: 1300002 Jail OSVERSION: 1200086 Job Id: 21 ---Begin Environment--- SHELL=/bin/csh OSVERSION=1200086 UNAME_v=FreeBSD 12.0-RELEASE-p1 UNAME_r=12.0-RELEASE-p1 BLOCKSIZE=K MAIL=/var/mail/root STATUS=1 HOME=/root PATH=/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/root/bin LOCALBASE=/usr/local USER=root LIBEXECPREFIX=/usr/local/libexec/poudriere POUDRIERE_VERSION=3.2.8 MASTERMNT=/usr/local/poudriere/data/.m/120amd64-default/ref POUDRIERE_BUILD_TYPE=bulk PACKAGE_BUILDING=yes SAVED_TERM= PWD=/usr/local/poudriere/data/.m/120amd64-default/ref/.p/pool P_PORTS_FEATURES=FLAVORS SELECTED_OPTIONS MASTERNAME=120amd64-default SCRIPTPREFIX=/usr/local/share/poudriere OLDPWD=/usr/local/poudriere/data/.m/120amd64-default/ref/.p SCRIPTPATH=/usr/local/share/poudriere/bulk.sh POUDRIEREPATH=/usr/local/bin/poudriere ---End Environment--- ---Begin Poudriere Port Flags/Env--- PORT_FLAGS= PKGENV= FLAVOR= DEPENDS_ARGS= MAKE_ARGS= ---End Poudriere Port Flags/Env--- ---Begin OPTIONS List--- ===> The following configuration options are available for hadoop2-2.7.2_2: EXAMPLES=on: Build and/or install examples ===> Use 'make config' to modify these settings ---End OPTIONS List--- --MAINTAINER-- ports@FreeBSD.org --End MAINTAINER-- --CONFIGURE_ARGS-- --End CONFIGURE_ARGS-- --CONFIGURE_ENV-- XDG_DATA_HOME=/wrkdirs/usr/ports/devel/hadoop2/work XDG_CONFIG_HOME=/wrkdirs/usr/ports/devel/hadoop2/work HOME=/wrkdirs/usr/ports/devel/hadoop2/work TMPDIR="/tmp" PATH=/wrkdirs/usr/ports/devel/hadoop2/work/.bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/root/bin SHELL=/bin/sh CONFIG_SHELL=/bin/sh --End CONFIGURE_ENV-- --MAKE_ENV-- JAVA_HOME=/usr/local/openjdk8 HADOOP_PROTOC_PATH=/usr/local/protobuf25/bin/protoc XDG_DATA_HOME=/wrkdirs/usr/ports/devel/hadoop2/work XDG_CONFIG_HOME=/wrkdirs/usr/ports/devel/hadoop2/work HOME=/wrkdirs/usr/ports/devel/hadoop2/work TMPDIR="/tmp" PATH=/wrkdirs/usr/ports/devel/hadoop2/work/.bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/root/bin NO_PIE=yes MK_DEBUG_FILES=no MK_KERNEL_SYMBOLS=no SHELL=/bin/sh NO_LINT=YES PREFIX=/usr/local LOCALBASE=/usr/local CC="cc" CFLAGS="-O2 -pipe -fstack-protector -fno-strict-aliasing " CPP="cpp" CPPFLAGS="" LDFLAGS=" -fstack-protector " LIBS="" CXX="c++" CXXFLAGS="-O2 -pipe -fstack-protector -fno-strict-aliasing " MANPREFIX="/usr/local" BSD_INSTALL_PROGRAM="install -s -m 555" BSD_INSTALL_LIB="install -s -m 0644" BSD_INSTALL_SCRIPT="install -m 555" BSD_INSTALL_DATA="install -m 0644" BSD_INSTALL_MAN="install -m 444" --End MAKE_ENV-- --PLIST_SUB-- PORTVERSION="2.7.2" HADOOP_LOGDIR="/var/log/hadoop" HADOOP_RUNDIR="/var/run/hadoop" 
HDFS_USER="hdfs" MAPRED_USER="mapred" HADOOP_GROUP="hadoop" PORTEXAMPLES="" JAVASHAREDIR="share/java" JAVAJARDIR="share/java/classes" OSREL=12.0 PREFIX=%D LOCALBASE=/usr/local RESETPREFIX=/usr/local LIB32DIR=lib DOCSDIR="share/doc/hadoop" EXAMPLESDIR="share/examples/hadoop" DATADIR="share/hadoop" WWWDIR="www/hadoop" ETCDIR="etc/hadoop" --End PLIST_SUB-- --SUB_LIST-- HDFS_USER="hdfs" MAPRED_USER="mapred" HADOOP_GROUP="hadoop" JAVA_HOME="/usr/local/openjdk8" HADOOP_LOGDIR="/var/log/hadoop" HADOOP_RUNDIR="/var/run/hadoop" JAVASHAREDIR="/usr/local/share/java" JAVAJARDIR="/usr/local/share/java/classes" JAVALIBDIR="/usr/local/share/java/classes" JAVA_VERSION="1.7+" PREFIX=/usr/local LOCALBASE=/usr/local DATADIR=/usr/local/share/hadoop DOCSDIR=/usr/local/share/doc/hadoop EXAMPLESDIR=/usr/local/share/examples/hadoop WWWDIR=/usr/local/www/hadoop ETCDIR=/usr/local/etc/hadoop --End SUB_LIST-- ---Begin make.conf--- USE_PACKAGE_DEPENDS=yes BATCH=yes WRKDIRPREFIX=/wrkdirs PORTSDIR=/usr/ports PACKAGES=/packages DISTDIR=/distfiles PACKAGE_BUILDING=yes PACKAGE_BUILDING_FLAVORS=yes #### /usr/local/etc/poudriere.d/make.conf #### # XXX: We really need this but cannot use it while 'make checksum' does not # try the next mirror on checksum failure. It currently retries the same # failed mirror and then fails rather then trying another. It *does* # try the next if the size is mismatched though. #MASTER_SITE_FREEBSD=yes # Build ALLOW_MAKE_JOBS_PACKAGES with 2 jobs MAKE_JOBS_NUMBER=2 # stable/10 includes src.conf too late but make.conf is in sys.mk .if ${.CURDIR:M/poudriere/jails/10*/usr/src/usr.bin/xlint*} # Disable build of llib now that head no longer has lint(1) LINT= true .endif #### /usr/ports/Mk/Scripts/ports_env.sh #### _CCVERSION_921dbbb2=FreeBSD clang version 6.0.1 (tags/RELEASE_601/final 335540) (based on LLVM 6.0.1) Target: x86_64-unknown-freebsd12.0 Thread model: posix InstalledDir: /usr/bin _ALTCCVERSION_921dbbb2=none _CXXINTERNAL_acaad9ca=FreeBSD clang version 6.0.1 (tags/RELEASE_601/final 335540) (based on LLVM 6.0.1) Target: x86_64-unknown-freebsd12.0 Thread model: posix InstalledDir: /usr/bin "/usr/bin/ld" "--eh-frame-hdr" "-dynamic-linker" "/libexec/ld-elf.so.1" "--hash-style=both" "--enable-new-dtags" "-o" "a.out" "/usr/lib/crt1.o" "/usr/lib/crti.o" "/usr/lib/crtbegin.o" "-L/usr/lib" "/dev/null" "-lc++" "-lm" "-lgcc" "--as-needed" "-lgcc_s" "--no-as-needed" "-lc" "-lgcc" "--as-needed" "-lgcc_s" "--no-as-needed" "/usr/lib/crtend.o" "/usr/lib/crtn.o" CC_OUTPUT_921dbbb2_58173849=yes CC_OUTPUT_921dbbb2_9bdba57c=yes CC_OUTPUT_921dbbb2_6a4fe7f5=yes CC_OUTPUT_921dbbb2_6bcac02b=yes CC_OUTPUT_921dbbb2_67d20829=yes CC_OUTPUT_921dbbb2_bfa62e83=yes CC_OUTPUT_921dbbb2_f0b4d593=yes CC_OUTPUT_921dbbb2_308abb44=yes CC_OUTPUT_921dbbb2_f00456e5=yes CC_OUTPUT_921dbbb2_65ad290d=yes CC_OUTPUT_921dbbb2_f2776b26=yes CC_OUTPUT_921dbbb2_b2657cc3=yes CC_OUTPUT_921dbbb2_380987f7=yes CC_OUTPUT_921dbbb2_160933ec=yes CC_OUTPUT_921dbbb2_fb62803b=yes _OBJC_CCVERSION_921dbbb2=FreeBSD clang version 6.0.1 (tags/RELEASE_601/final 335540) (based on LLVM 6.0.1) Target: x86_64-unknown-freebsd12.0 Thread model: posix InstalledDir: /usr/bin _OBJC_ALTCCVERSION_921dbbb2=none ARCH=amd64 OPSYS=FreeBSD _OSRELEASE=12.0-RELEASE-p1 OSREL=12.0 OSVERSION=1200086 PYTHONBASE=/usr/local HAVE_COMPAT_IA32_KERN=YES CONFIGURE_MAX_CMD_LEN=262144 HAVE_PORTS_ENV=1 #### Misc Poudriere #### GID=0 UID=0 DISABLE_MAKE_JOBS=poudriere ---End make.conf--- --Resource limits-- cpu time (seconds, -t) unlimited file size (512-blocks, -f) unlimited data seg size 
(kbytes, -d) 33554432 stack size (kbytes, -s) 524288 core file size (512-blocks, -c) unlimited max memory size (kbytes, -m) unlimited locked memory (kbytes, -l) unlimited max user processes (-u) 89999 open files (-n) 1024 virtual mem size (kbytes, -v) unlimited swap limit (kbytes, -w) unlimited socket buffer size (bytes, -b) unlimited pseudo-terminals (-p) unlimited kqueues (-k) unlimited umtx shared locks (-o) unlimited --End resource limits-- =======================<phase: check-sanity >============================ ===> NOTICE: The hadoop port currently does not have a maintainer. As a result, it is more likely to have unresolved issues, not be up-to-date, or even be removed in the future. To volunteer to maintain this port, please create an issue at: https://bugs.freebsd.org/bugzilla More information about port maintainership is available at: https://www.freebsd.org/doc/en/articles/contributing/ports-contributing.html#maintain-port ===> License APACHE20 accepted by the user =========================================================================== =======================<phase: pkg-depends >============================ ===> hadoop2-2.7.2_2 depends on file: /usr/local/sbin/pkg - not found ===> Installing existing package /packages/All/pkg-1.10.5_5.txz [120amd64-default-job-21] Installing pkg-1.10.5_5... [120amd64-default-job-21] Extracting pkg-1.10.5_5: .......... done ===> hadoop2-2.7.2_2 depends on file: /usr/local/sbin/pkg - found ===> Returning to build of hadoop2-2.7.2_2 =========================================================================== =======================<phase: fetch-depends >============================ =========================================================================== =======================<phase: fetch >============================ ===> NOTICE: The hadoop port currently does not have a maintainer. As a result, it is more likely to have unresolved issues, not be up-to-date, or even be removed in the future. To volunteer to maintain this port, please create an issue at: https://bugs.freebsd.org/bugzilla More information about port maintainership is available at: https://www.freebsd.org/doc/en/articles/contributing/ports-contributing.html#maintain-port ===> License APACHE20 accepted by the user ===> Fetching all distfiles required by hadoop2-2.7.2_2 for building =========================================================================== =======================<phase: checksum >============================ ===> NOTICE: The hadoop port currently does not have a maintainer. As a result, it is more likely to have unresolved issues, not be up-to-date, or even be removed in the future. 
To volunteer to maintain this port, please create an issue at: <snip> [INFO] [INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-common --- [INFO] Executing tasks main: [exec] -- The C compiler identification is Clang 6.0.1 [exec] -- The CXX compiler identification is Clang 6.0.1 [exec] -- Check for working C compiler: /usr/bin/cc [exec] -- Check for working C compiler: /usr/bin/cc -- works [exec] -- Detecting C compiler ABI info [exec] -- Detecting C compiler ABI info - done [exec] -- Detecting C compile features [exec] -- Detecting C compile features - done [exec] -- Check for working CXX compiler: /usr/bin/c++ [exec] -- Check for working CXX compiler: /usr/bin/c++ -- works [exec] -- Detecting CXX compiler ABI info [exec] -- Detecting CXX compiler ABI info - done [exec] -- Detecting CXX compile features [exec] -- Detecting CXX compile features - done [exec] -- Found JNI: /usr/local/openjdk8/jre/lib/amd64/libjawt.so [exec] -- Found ZLIB: /usr/lib/libz.so (found version "1.2.11") [exec] -- Looking for sync_file_range [exec] -- Looking for sync_file_range - not found [exec] -- Looking for posix_fadvise [exec] -- Looking for posix_fadvise - found [exec] -- Looking for dlopen in dl [exec] CUSTOM_OPENSSL_PREFIX = [exec] -- Looking for dlopen in dl - found [exec] -- Performing Test HAS_NEW_ENOUGH_OPENSSL [exec] CMake Warning: [exec] Manually-specified variables were not used by the project: [exec] [exec] EXTRA_LIBHADOOP_RPATH [exec] REQUIRE_BZIP2 [exec] REQUIRE_OPENSSL [exec] REQUIRE_SNAPPY [exec] [exec] [exec] -- Performing Test HAS_NEW_ENOUGH_OPENSSL - Success [exec] -- Configuring done [exec] -- Generating done [exec] -- Build files have been written to: /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native [exec] /usr/local/bin/cmake -S/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src -B/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native --check-build-system CMakeFiles/Makefile.cmake 0 [exec] /usr/local/bin/cmake -E cmake_progress_start /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/CMakeFiles /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/CMakeFiles/progress.marks [exec] make -f CMakeFiles/Makefile2 all [exec] make -f CMakeFiles/hadoop_static.dir/build.make CMakeFiles/hadoop_static.dir/depend [exec] cd /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native && /usr/local/bin/cmake -E cmake_depends "Unix Makefiles" /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/CMakeFiles/hadoop_static.dir/DependInfo.cmake --color= [exec] Dependee "/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/CMakeFiles/hadoop_static.dir/DependInfo.cmake" is newer than depender 
"/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/CMakeFiles/hadoop_static.dir/depend.internal". [exec] Dependee "/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/CMakeFiles/CMakeDirectoryInformation.cmake" is newer than depender "/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/CMakeFiles/hadoop_static.dir/depend.internal". [exec] Scanning dependencies of target hadoop_static [exec] make -f CMakeFiles/hadoop_static.dir/build.make CMakeFiles/hadoop_static.dir/build [exec] [ 1%] Building C object CMakeFiles/hadoop_static.dir/main/native/src/exception.c.o [exec] /usr/bin/cc -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/javah -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native -I/usr/local/openjdk8/include -I/usr/local/openjdk8/include/freebsd -I/usr/local/include -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util -O2 -pipe -fstack-protector -fno-strict-aliasing -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -o CMakeFiles/hadoop_static.dir/main/native/src/exception.c.o -c /wrkdirs/usr/port s/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/exception.c [exec] [ 3%] Building C object CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/lz4/Lz4Compressor.c.o [exec] /usr/bin/cc -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/javah -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native -I/usr/local/openjdk8/include -I/usr/local/openjdk8/include/freebsd -I/usr/local/include -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util -O2 -pipe -fstack-protector -fno-strict-aliasing -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -o CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/lz4/Lz 4Compressor.c.o -c /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/lz4/Lz4Compressor.c [exec] [ 5%] Building C object CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/lz4/Lz4Decompressor.c.o [exec] /usr/bin/cc -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/javah -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src 
-I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native -I/usr/local/openjdk8/include -I/usr/local/openjdk8/include/freebsd -I/usr/local/include -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util -O2 -pipe -fstack-protector -fno-strict-aliasing -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -o CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/lz4/Lz 4Decompressor.c.o -c /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/lz4/Lz4Decompressor.c [exec] [ 7%] Building C object CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/lz4/lz4.c.o [exec] /usr/bin/cc -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/javah -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native -I/usr/local/openjdk8/include -I/usr/local/openjdk8/include/freebsd -I/usr/local/include -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util -O2 -pipe -fstack-protector -fno-strict-aliasing -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -o CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/lz4/lz 4.c.o -c /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/lz4/lz4.c [exec] [ 8%] Building C object CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/lz4/lz4hc.c.o [exec] /usr/bin/cc -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/javah -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native -I/usr/local/openjdk8/include -I/usr/local/openjdk8/include/freebsd -I/usr/local/include -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util -O2 -pipe -fstack-protector -fno-strict-aliasing -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -o CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/lz4/lz 4hc.c.o -c /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/lz4/lz4hc.c [exec] [ 10%] Building C object 
CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c.o [exec] /usr/bin/cc -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/javah -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native -I/usr/local/openjdk8/include -I/usr/local/openjdk8/include/freebsd -I/usr/local/include -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util -O2 -pipe -fstack-protector -fno-strict-aliasing -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -o CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/snappy /SnappyCompressor.c.o -c /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c [exec] [ 12%] Building C object CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.c.o [exec] /usr/bin/cc -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/javah -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native -I/usr/local/openjdk8/include -I/usr/local/openjdk8/include/freebsd -I/usr/local/include -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util -O2 -pipe -fstack-protector -fno-strict-aliasing -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -o CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/snappy /SnappyDecompressor.c.o -c /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.c [exec] [ 14%] Building C object CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c.o [exec] /usr/bin/cc -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native/javah -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/src -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native -I/usr/local/openjdk8/include -I/usr/local/openjdk8/include/freebsd -I/usr/local/include -I/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util -O2 -pipe -fstack-protector 
-fno-strict-aliasing -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -o CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c.o -c /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c
[exec] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:256:14: error: incomplete definition of type 'struct evp_cipher_ctx_st'
[exec]   if (context->flags & EVP_CIPH_NO_PADDING) {
[exec]       ~~~~~~~^
[exec] /usr/include/openssl/ossl_typ.h:90:16: note: forward declaration of 'struct evp_cipher_ctx_st'
[exec] typedef struct evp_cipher_ctx_st EVP_CIPHER_CTX;
[exec]                ^
[exec] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:262:20: error: incomplete definition of type 'struct evp_cipher_ctx_st'
[exec]     int b = context->cipher->block_size;
[exec]             ~~~~~~~^
[exec] /usr/include/openssl/ossl_typ.h:90:16: note: forward declaration of 'struct evp_cipher_ctx_st'
[exec] typedef struct evp_cipher_ctx_st EVP_CIPHER_CTX;
[exec]                ^
[exec] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:263:16: error: incomplete definition of type 'struct evp_cipher_ctx_st'
[exec]     if (context->encrypt) {
[exec]         ~~~~~~~^
[exec] /usr/include/openssl/ossl_typ.h:90:16: note: forward declaration of 'struct evp_cipher_ctx_st'
[exec] typedef struct evp_cipher_ctx_st EVP_CIPHER_CTX;
[exec]                ^
[exec] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:310:14: error: incomplete definition of type 'struct evp_cipher_ctx_st'
[exec]   if (context->flags & EVP_CIPH_NO_PADDING) {
[exec]       ~~~~~~~^
[exec] /usr/include/openssl/ossl_typ.h:90:16: note: forward declaration of 'struct evp_cipher_ctx_st'
[exec] typedef struct evp_cipher_ctx_st EVP_CIPHER_CTX;
[exec]                ^
[exec] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:313:20: error: incomplete definition of type 'struct evp_cipher_ctx_st'
[exec]     int b = context->cipher->block_size;
[exec]             ~~~~~~~^
[exec] /usr/include/openssl/ossl_typ.h:90:16: note: forward declaration of 'struct evp_cipher_ctx_st'
[exec] typedef struct evp_cipher_ctx_st EVP_CIPHER_CTX;
[exec]                ^
[exec] 5 errors generated.
[exec] *** Error code 1
[exec]
[exec] Stop.
[exec] make[3]: stopped in /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native
[exec] *** Error code 1
[exec]
[exec] Stop.
[exec] make[2]: stopped in /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native
[exec] *** Error code 1
[exec]
[exec] Stop.
[exec] make[1]: stopped in /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [1.228s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [1.055s]
[INFO] Apache Hadoop Annotations .........................
SUCCESS [4.641s] [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.197s] [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.299s] [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [3.969s] [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [4.493s] [INFO] Apache Hadoop Auth ................................ SUCCESS [7.067s] [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [4.029s] [INFO] Apache Hadoop Common .............................. FAILURE [23.821s] [INFO] Apache Hadoop NFS ................................. SKIPPED [INFO] Apache Hadoop KMS ................................. SKIPPED [INFO] Apache Hadoop Common Project ...................... SKIPPED [INFO] Apache Hadoop HDFS ................................ SKIPPED [INFO] Apache Hadoop HttpFS .............................. SKIPPED [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED [INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED [INFO] Apache Hadoop HDFS Project ........................ SKIPPED [INFO] hadoop-yarn ....................................... SKIPPED [INFO] hadoop-yarn-api ................................... SKIPPED [INFO] hadoop-yarn-common ................................ SKIPPED [INFO] hadoop-yarn-server ................................ SKIPPED [INFO] hadoop-yarn-server-common ......................... SKIPPED [INFO] hadoop-yarn-server-nodemanager .................... SKIPPED [INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED [INFO] hadoop-yarn-server-applicationhistoryservice ...... SKIPPED [INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED [INFO] hadoop-yarn-server-tests .......................... SKIPPED [INFO] hadoop-yarn-client ................................ SKIPPED [INFO] hadoop-yarn-server-sharedcachemanager ............. SKIPPED [INFO] hadoop-yarn-applications .......................... SKIPPED [INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SKIPPED [INFO] hadoop-yarn-site .................................. SKIPPED [INFO] hadoop-yarn-registry .............................. SKIPPED [INFO] hadoop-yarn-project ............................... SKIPPED [INFO] hadoop-mapreduce-client ........................... SKIPPED [INFO] hadoop-mapreduce-client-core ...................... SKIPPED [INFO] hadoop-mapreduce-client-common .................... SKIPPED [INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED [INFO] hadoop-mapreduce-client-app ....................... SKIPPED [INFO] hadoop-mapreduce-client-hs ........................ SKIPPED [INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED [INFO] hadoop-mapreduce-client-hs-plugins ................ SKIPPED [INFO] Apache Hadoop MapReduce Examples .................. SKIPPED [INFO] hadoop-mapreduce .................................. SKIPPED [INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED [INFO] Apache Hadoop Distributed Copy .................... SKIPPED [INFO] Apache Hadoop Archives ............................ SKIPPED [INFO] Apache Hadoop Rumen ............................... SKIPPED [INFO] Apache Hadoop Gridmix ............................. SKIPPED [INFO] Apache Hadoop Data Join ........................... SKIPPED [INFO] Apache Hadoop Ant Tasks ........................... SKIPPED [INFO] Apache Hadoop Extras .............................. 
SKIPPED [INFO] Apache Hadoop Pipes ............................... SKIPPED [INFO] Apache Hadoop OpenStack support ................... SKIPPED [INFO] Apache Hadoop Amazon Web Services support ......... SKIPPED [INFO] Apache Hadoop Azure support ....................... SKIPPED [INFO] Apache Hadoop Client .............................. SKIPPED [INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED [INFO] Apache Hadoop Scheduler Load Simulator ............ SKIPPED [INFO] Apache Hadoop Tools Dist .......................... SKIPPED [INFO] Apache Hadoop Tools ............................... SKIPPED [INFO] Apache Hadoop Distribution ........................ SKIPPED [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE [INFO] ------------------------------------------------------------------------ [INFO] Total time: 53.835s [INFO] Finished at: Tue Jan 01 01:14:41 GMT 2019 [INFO] Final Memory: 88M/2802M [INFO] ------------------------------------------------------------------------ [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 1 [ERROR] around Ant part ...<exec failonerror="true" dir="/wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/native" executable="make">... @ 7:157 in /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml [ERROR] -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException [ERROR] [ERROR] After correcting the problems, you can resume the build with the command [ERROR] mvn <goals> -rf :hadoop-common *** Error code 1 Stop. make: stopped in /usr/ports/devel/hadoop2
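
Note for whoever picks this port up: the five compile errors above are the usual symptom of code written against the OpenSSL 1.0.x API being built against OpenSSL 1.1.x (the base-system OpenSSL in FreeBSD 12.0), where EVP_CIPHER_CTX became an opaque type whose members can no longer be dereferenced directly. Below is a minimal sketch of the direction a fix could take, using the OpenSSL 1.1 accessor functions in place of the three rejected member reads. It is an illustration against a hypothetical helper function, not the port's actual patch and not the upstream Hadoop change.

/* Sketch only: replace direct EVP_CIPHER_CTX member access with the
 * OpenSSL 1.1 accessors.  max_update_output_len() is a made-up helper
 * for illustration, not a function from OpensslCipher.c. */
#include <openssl/evp.h>

int max_update_output_len(EVP_CIPHER_CTX *context, int input_len)
{
    /* was: if (context->flags & EVP_CIPH_NO_PADDING) */
    if (EVP_CIPHER_CTX_test_flags(context, EVP_CIPH_NO_PADDING)) {
        return input_len;
    }
    /* was: int b = context->cipher->block_size; */
    int b = EVP_CIPHER_CTX_block_size(context);
    /* was: if (context->encrypt) */
    if (EVP_CIPHER_CTX_encrypting(context)) {
        /* EVP_EncryptUpdate(3): output may be up to inl + block_size - 1 */
        return input_len + b - 1;
    }
    /* EVP_DecryptUpdate(3): output may be up to inl + block_size */
    return input_len + b;
}

Whether the port should carry a patch along these lines, wrap the accessors in an OPENSSL_VERSION_NUMBER check so builds against older OpenSSL keep working, or move to a newer upstream Hadoop release is a decision for whoever adopts the port.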