| Summary: | openmpi rebuild for gcc 8.4.0 | | |
|---|---|---|---|
| Product: | Mageia | Reporter: | Chris Denice <eatdirt> |
| Component: | RPM Packages | Assignee: | QA Team <qa-bugs> |
| Status: | RESOLVED FIXED | QA Contact: | |
| Severity: | normal | | |
| Priority: | Normal | CC: | andrewsfarm, sysadmin-bugs, tarazed25, tmb |
| Version: | 7 | Keywords: | advisory, validated_update |
| Target Milestone: | --- | | |
| Hardware: | All | | |
| OS: | Linux | | |
| Whiteboard: | MGA7-64-OK | | |
| Source RPM: | | CVE: | |
| Status comment: | | | |
| Bug Depends on: | 26294 | | |
| Bug Blocks: | | | |
|
Description

Chris Denice 2020-03-05 22:10:45 CET

Thomas Backlund 2020-03-06 21:17:24 CET

CC: (none) => tmb

mga7, x86_64

Updated the packages and followed Herman's lead in using bug 2787 as a reference. Claire's link no longer works.

Copied files from /usr/share/doc/lib64openmpi-devel/examples/

```
$ mpirun --version
mpirun (Open MPI) 4.0.1
...

$ mpif90 --version
GNU Fortran (Mageia 8.4.0-1.mga7) 8.4.0
Copyright (C) 2018 Free Software Foundation, Inc.
...

$ make all
mpicc -g hello_c.c -o hello_c
mpicc -g ring_c.c -o ring_c
mpicc -g connectivity_c.c -o connectivity_c
mpicc -g spc_example.c -o spc_example
make[1]: Entering directory '/data/qa/openmpi/examples'
make[2]: Entering directory '/data/qa/openmpi/examples'
mpic++ -g hello_cxx.cc -o hello_cxx
mpic++ -g ring_cxx.cc -o ring_cxx
make[2]: Leaving directory '/data/qa/openmpi/examples'
make[2]: Entering directory '/data/qa/openmpi/examples'
mpifort -g hello_mpifh.f -o hello_mpifh
mpifort -g ring_mpifh.f -o ring_mpifh
make[2]: Leaving directory '/data/qa/openmpi/examples'
make[2]: Entering directory '/data/qa/openmpi/examples'
mpifort -g hello_usempi.f90 -o hello_usempi
mpifort -g ring_usempi.f90 -o ring_usempi
make[2]: Leaving directory '/data/qa/openmpi/examples'
make[2]: Entering directory '/data/qa/openmpi/examples'
mpifort -g hello_usempif08.f90 -o hello_usempif08
mpifort -g ring_usempif08.f90 -o ring_usempif08
make[2]: Leaving directory '/data/qa/openmpi/examples'
make[1]: Leaving directory '/data/qa/openmpi/examples'

$ ./ring_c
Process 0 sending 10 to 0, tag 201 (1 processes in ring)
Process 0 sent to 0
Process 0 decremented value: 9
Process 0 decremented value: 8
Process 0 decremented value: 7
Process 0 decremented value: 6
Process 0 decremented value: 5
Process 0 decremented value: 4
Process 0 decremented value: 3
Process 0 decremented value: 2
Process 0 decremented value: 1
Process 0 decremented value: 0
Process 0 exiting

$ ./hello_mpifh
Hello, world, I am 0 of 1: Open MPI v4.0.1, package: Open MPI iurt@rabbit.mageia.org Distribution, ident: 4.0.1, repo rev: v4.0.1, Mar 26, 2019

$ ./hello_cxx
Hello, world! I am 0 of 1(Open MPI v4.0.1, package: Open MPI iurt@rabbit.mageia.org Distribution, ident: 4.0.1, repo rev: v4.0.1, Mar 26, 2019, 117)

$ ./connectivity_c
Connectivity test on 1 processes PASSED.

$ ./spc_example
Usage: mpirun -np 2 --mca mpi_spc_attach all --mca mpi_spc_dump_enabled true ./spc_example [num_messages] [message_size]

$ mpirun -np 2 --mca mpi_spc_attach all --mca mpi_spc_dump_enabled true ./spc_example 5 32
ERROR: Couldn't find the appropriate SPC counter in the MPI_T pvars.
ERROR: Couldn't find the appropriate SPC counter in the MPI_T pvars.
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode -1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[difda:16019] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[difda:16019] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

$ mpirun -np 2 --mca mpi_spc_attach all --mca mpi_spc_dump_enabled true ./spc_example
Usage: mpirun -np 2 --mca mpi_spc_attach all --mca mpi_spc_dump_enabled true ./spc_example [num_messages] [message_size]
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
Usage: mpirun -np 2 --mca mpi_spc_attach all --mca mpi_spc_dump_enabled true ./spc_example [num_messages] [message_size]
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

  Process name: [[37684,1],0]
  Exit code: 255
--------------------------------------------------------------------------
```

Whatever! Anyway, everything seems to work OK.

CC: (none) => tarazed25

Validating.

CC: (none) => andrewsfarm, sysadmin-bugs

An update for this issue has been pushed to the Mageia Updates repository.

https://advisories.mageia.org/MGAA-2020-0071.html

Resolution: (none) => FIXED