[blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

Ken Moffat zarniwhoop at ntlworld.com
Fri Aug 17 13:27:29 PDT 2018


On Fri, Aug 17, 2018 at 08:57:02PM +0200, Thomas Trepl wrote:
> 
> I reran the tests here with "ninja check{,-clang{,-tooling}}", not
> with "make". I used
> 
> CC=gcc CXX=g++                              \
> cmake -DCMAKE_INSTALL_PREFIX=/usr           \
>       -DLLVM_ENABLE_FFI=ON                  \
>       -DCMAKE_BUILD_TYPE=Release            \
>       -DLLVM_BUILD_LLVM_DYLIB=ON            \
>       -DLLVM_TARGETS_TO_BUILD="host;AMDGPU" \
>       -DLLVM_BUILD_TESTS=ON                 \
>       -Wno-dev -G Ninja ..
> 
> The summary looks fine to me:
> 
> [0/3] Running the LLVM regression tests
> -- Testing: 23298 tests, 4 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> Testing Time: 106.64s
>   Expected Passes    : 15113
>   Expected Failures  : 56
>   Unsupported Tests  : 8129
> [1/3] Running lit suite /tmp/llvm/build/llvm-6.0.1.src/tools/clang/test/Tooling
> llvm-lit: /tmp/llvm/build/llvm-6.0.1.src/utils/lit/lit/llvm/config.py:334: note: using clang: /tmp/llvm/build/llvm-6.0.1.src/build/bin/clang
> -- Testing: 26 tests, 4 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> Testing Time: 3.32s
>   Expected Passes    : 26
> [2/3] Running the Clang regression tests
> llvm-lit: /tmp/llvm/build/llvm-6.0.1.src/utils/lit/lit/llvm/config.py:334: note: using clang: /tmp/llvm/build/llvm-6.0.1.src/build/bin/clang
> -- Testing: 11832 tests, 4 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> Testing Time: 135.52s
>   Expected Passes    : 11572
>   Expected Failures  : 18
>   Unsupported Tests  : 242
> ...
> 
> When running "ninja check-all", it looks like this (same build
> instructions, clean build):
> 
> ...
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> 
> 1 warning(s) in tests.
> Testing Time: 615.44s
> ********************
> Failing Tests (8):
>     LeakSanitizer-AddressSanitizer-x86_64 :: TestCases/Linux/use_tls_dynamic.cc
>     LeakSanitizer-Standalone-x86_64 :: TestCases/Linux/use_tls_dynamic.cc
>     MemorySanitizer-X86_64 :: Linux/sunrpc.cc
>     MemorySanitizer-X86_64 :: Linux/sunrpc_bytes.cc
>     MemorySanitizer-X86_64 :: Linux/sunrpc_string.cc
>     MemorySanitizer-X86_64 :: dtls_test.c
>     SanitizerCommon-lsan-x86_64-Linux :: Posix/sanitizer_set_death_callback_test.cc
>     ThreadSanitizer-x86_64 :: sunrpc.cc
> 
>   Expected Passes    : 29119
>   Expected Failures  : 103
>   Unsupported Tests  : 8914
>   Unexpected Failures: 8
> FAILED: CMakeFiles/check-all 
> ...
> 
> So, comparable to Ken's results but different anyhow - all failures
> have to do with the rpc header files. Interestingly, cmake checks for
> them, states that they are not found, but continues (see the first
> two lines of the following grep output), so I think they are not hard
> prerequisites. The tests seem not to take into account that the
> headers may be unavailable. I simply did a grep on my log file:
> 
> # grep "rpc/.*not found" llvm-check-all.log 
> -- Looking for rpc/xdr.h - not found
> -- Looking for tirpc/rpc/xdr.h - not found
> /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-rt/test/msan/Linux/sunrpc.cc:15:10: fatal error: 'rpc/xdr.h' file not found
> /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-rt/test/msan/Linux/sunrpc_bytes.cc:8:10: fatal error: 'rpc/xdr.h' file not found
> /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-rt/test/msan/Linux/sunrpc_string.cc:8:10: fatal error: 'rpc/xdr.h' file not found
> /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-rt/test/tsan/sunrpc.cc:4:10: fatal error: 'rpc/types.h' file not found
> 
> Can we assume that those unexpected failures are caused by a flaw in
> the test suite?
> 

I think so.  I was going to suggest dropping back to the targets DJ
was using, but I see that there were a lot fewer expected passes
there (11572 instead of 29119).  As for the marginally different
number of unexpected failures, that is probably down to a minor
difference in what we have installed.
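
The flaw is easy to demonstrate outside the suite, by the way.  A
minimal sketch, assuming the freshly-built clang is on the PATH and
that glibc-2.28 was built without the obsolete SunRPC pieces (the
probe filename is my own invention):

  # Probe for the header which the sunrpc tests hard-include:
  printf '#include <rpc/xdr.h>\nint main(void){return 0;}\n' > rpc-probe.c
  clang -fsyntax-only rpc-probe.c
  # => fatal error: 'rpc/xdr.h' file not found - the same error as in
  #    the lit failures, because the tests get scheduled regardless of
  #    cmake's advisory "not found" result.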
> 
> Btw, just redoing the build with 'make' to see if the hang is
> reproducible there. If not, then it may have had some other,
> transient cause...
> 

Currently rerunning the base build (i.e. no docs) with
hyperthreading enabled (which should be slower) and ninja -j4.  It is
consistently running at 400%, unlike the build on 4 real cores, which
mostly ran at only 100% or 200%.
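
For anyone wanting to repeat that observation, it is just a matter of
capping the job count and watching the load - a sketch, with the job
count from this run (the choice of top is mine):

  ninja -j4 &   # cap the build at four parallel jobs
  top           # the per-process figures here add up to ~400%, vs. the
                # 100-200% I mostly saw from make on four real cores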

So, it now looks as if ninja makes a much better job of the build
than cmake's Makefiles.  OTOH, building the docs might be a lot
messier.
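
If anyone wants to compare the two backends directly, the only
difference is the generator passed to cmake - a sketch, reusing the
configuration quoted above from a clean build directory each time:

  cmake -G Ninja [options as above] ..            # then: ninja -j4
  cmake -G "Unix Makefiles" [options as above] .. # then: make -j4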

Still thinking about what will suit _me_ best, but if your run with
'make' works, don't let me stop you updating with your preference.
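
(Aside, in case anyone replays Thomas's command: the shell's brace
expansion in "ninja check{,-clang{,-tooling}}" just names three
targets at once - the equivalent explicit invocation, in the same
build directory, is

  ninja check check-clang check-clang-tooling

i.e. the LLVM regression tests, the Clang regression tests, and the
Clang Tooling tests - the three [n/3] suites in the output quoted
above.)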

ĸen
-- 
           Entropy not found, thump keyboard to continue