Discussion:
[OMPI users] issue compiling openmpi 3.1.2 with pmi and slurm
Ross, Daniel B. via users
2018-10-10 13:44:15 UTC
Permalink
I have been able to configure without issue using the following options:
./configure --prefix=/usr/local/ --with-cuda --with-slurm --with-pmi=/usr/local/slurm/include/slurm --with-pmi-libdir=/usr/local/slurm/lib64

Everything compiles just fine until I get this error:

make[3]: Leaving directory `/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x'
make[2]: Leaving directory `/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x'
Making all in mca/pmix/s1
make[2]: Entering directory `/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/s1'
CC mca_pmix_s1_la-pmix_s1.lo
pmix_s1.c:29:17: fatal error: pmi.h: No such file or directory
#include <pmi.h>
^
compilation terminated.
make[2]: *** [mca_pmix_s1_la-pmix_s1.lo] Error 1
make[2]: Leaving directory `/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/s1'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/usr/local/src/openmpi/openmpi-3.1.2/opal'
make: *** [all-recursive] Error 1


Any ideas why I am getting this error?
Thanks
Ralph H Castain
2018-10-10 15:26:14 UTC
Permalink
It appears that CPPFLAGS isn’t getting set correctly, as the component didn’t find the Slurm PMI-1 header (pmi.h). It would help if we could see the config.log output, so we can see where OMPI thought the file was located.


Ross, Daniel B. via users
2018-10-10 16:38:54 UTC
Permalink
Here is the best I can do. The config.log is too large to email.
grep pmi config.log
$ ./configure --prefix=/usr/local/ --with-cuda --with-slurm --with-pmi=/usr/local/slurm/include/slurm --with-pmi-libdir=/usr/local/slurm/lib64
configure:71744: result: '--prefix=/usr/local/' '--with-cuda' '--with-slurm' '--with-pmi=/usr/local/slurm/include/slurm' '--with-pmi-libdir=/usr/local/slurm/lib64'
configure:85150: checking for pmi.h in /usr/local/slurm/include/slurm
configure:85208: checking pmi.h usability
configure:85208: checking pmi.h presence
configure:85208: checking for pmi.h
configure:85270: checking for libpmi in /usr/local/slurm/lib64
configure:85277: checking for PMI_Init in -lpmi
configure:85302: gcc -std=gnu99 -o conftest -O3 -DNDEBUG -finline-functions -fno-strict-aliasing -mcx16 -pthread -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/hwloc/hwloc1117/hwloc/include -I/usr/local/slurm/include/slurm -L/usr/local/slurm/lib64 conftest.c -lpmi -lm -lutil -lz -lpmi >&5
configure:85481: checking for pmi2.h in /usr/local/slurm/include/slurm
configure:85539: checking pmi2.h usability
configure:85539: checking pmi2.h presence
configure:85539: checking for pmi2.h
configure:85601: checking for libpmi2 in /usr/local/slurm/lib64
configure:85608: checking for PMI2_Init in -lpmi2
configure:85633: gcc -std=gnu99 -o conftest -O3 -DNDEBUG -finline-functions -fno-strict-aliasing -mcx16 -pthread -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/hwloc/hwloc1117/hwloc/include -I/usr/local/slurm/include/slurm -L/usr/local/slurm/lib64 conftest.c -lpmi2 -lm -lutil -lz -lpmi2 >&5
configure:86063: result: common, allocator, backtrace, btl, compress, crs, dl, event, hwloc, if, installdirs, memchecker, memcpy, memory, mpool, patcher, pmix, pstat, rcache, reachable, shmem, timer
configure:130096: running /bin/sh './configure' --disable-dns --disable-http --disable-rpc --disable-openssl --enable-thread-support --disable-evport '--prefix=/usr/local/' '--with-cuda' '--with-slurm' '--with-pmi=/usr/local/slurm/include/slurm' '--with-pmi-libdir=/usr/local/slurm/lib64' 'CPPFLAGS=-I/usr/local/src/openmpi/openmpi-3.1.2 -I/usr/local/src/openmpi/openmpi-3.1.2 -I/usr/local/src/openmpi/openmpi-3.1.2/opal/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/hwloc/hwloc1117/hwloc/include -Drandom=opal_random' --cache-file=/dev/null --srcdir=. --disable-option-checking
configure:5903: +++ Configuring MCA framework pmix
configure:146895: checking for no configure components in framework pmix
configure:146899: checking for m4 configure components in framework pmix
configure:146901: result: cray, ext1x, ext2x, flux, pmix2x, s1, s2
configure:5918: --- MCA component pmix:isolated (no configuration)
configure:146996: checking for MCA component pmix:isolated compile mode
configure:147038: checking if MCA component pmix:isolated can compile
configure:5918: --- MCA component pmix:cray (m4 configuration macro)
configure:147646: checking for MCA component pmix:cray compile mode
configure:147691: $PKG_CONFIG --exists --print-errors "cray-pmi"
Package cray-pmi was not found in the pkg-config search path.
Perhaps you should add the directory containing `cray-pmi.pc'
No package 'cray-pmi' found
configure:147708: $PKG_CONFIG --exists --print-errors "cray-pmi"
Package cray-pmi was not found in the pkg-config search path.
Perhaps you should add the directory containing `cray-pmi.pc'
No package 'cray-pmi' found
configure:147725: $PKG_CONFIG --exists --print-errors "cray-pmi"
Package cray-pmi was not found in the pkg-config search path.
Perhaps you should add the directory containing `cray-pmi.pc'
No package 'cray-pmi' found
No package 'cray-pmi' found
configure:149038: checking if MCA component pmix:cray can compile
configure:5918: --- MCA component pmix:ext1x (m4 configuration macro)
configure:149182: checking for MCA component pmix:ext1x compile mode
configure:149748: checking if MCA component pmix:ext1x can compile
configure:5918: --- MCA component pmix:ext2x (m4 configuration macro)
configure:149892: checking for MCA component pmix:ext2x compile mode
configure:150447: checking if MCA component pmix:ext2x can compile
configure:5918: --- MCA component pmix:flux (m4 configuration macro)
configure:150591: checking for MCA component pmix:flux compile mode
configure:150802: checking if MCA component pmix:flux can compile
configure:5918: --- MCA component pmix:pmix2x (m4 configuration macro)
configure:151409: checking for MCA component pmix:pmix2x compile mode
configure:151568: OPAL configuring in opal/mca/pmix/pmix2x/pmix
configure:151649: running /bin/sh './configure' --enable-embedded-mode --disable-debug --with-pmix-symbol-rename=OPAL_MCA_PMIX2X_ --disable-pmix-timing --without-tests-examples --disable-pmix-backward-compatibility --disable-visibility --enable-embedded-libevent --with-libevent-header=\"opal/mca/event/libevent2022/libevent2022.h\" '--prefix=/usr/local/' '--with-cuda' '--with-slurm' '--with-pmi=/usr/local/slurm/include/slurm' '--with-pmi-libdir=/usr/local/slurm/lib64' 'CFLAGS=-O3 -DNDEBUG ' 'CPPFLAGS=-I/usr/local/src/openmpi/openmpi-3.1.2 -I/usr/local/src/openmpi/openmpi-3.1.2 -I/usr/local/src/openmpi/openmpi-3.1.2/opal/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/event/libevent2022/libevent -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/event/libevent2022/libevent/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/hwloc/hwloc1117/hwloc/include ' --cache-file=/dev/null --srcdir=. --disable-option-checking
configure:151655: /bin/sh './configure' succeeded for opal/mca/pmix/pmix2x/pmix
configure:151809: checking if MCA component pmix:pmix2x can compile
configure:5918: --- MCA component pmix:s1 (m4 configuration macro)
configure:152416: checking for MCA component pmix:s1 compile mode
configure:152478: checking if MCA component pmix:s1 can compile
configure:5918: --- MCA component pmix:s2 (m4 configuration macro)
configure:153085: checking for MCA component pmix:s2 compile mode
configure:153147: checking if MCA component pmix:s2 can compile
configure:177961: result: env, hnp, pmi, singleton, tool
configure:5918: --- MCA component ess:pmi (no configuration)
configure:179336: checking for MCA component ess:pmi compile mode
configure:179378: checking if MCA component ess:pmi can compile
config.status:4112: creating opal/mca/pmix/Makefile
config.status:4112: creating opal/mca/pmix/isolated/Makefile
config.status:4112: creating opal/mca/pmix/cray/Makefile
config.status:4112: creating opal/mca/pmix/ext1x/Makefile
config.status:4112: creating opal/mca/pmix/ext2x/Makefile
config.status:4112: creating opal/mca/pmix/flux/Makefile
config.status:4112: creating opal/mca/pmix/pmix2x/Makefile
config.status:4112: creating opal/mca/pmix/s1/Makefile
config.status:4112: creating opal/mca/pmix/s2/Makefile
config.status:4112: creating orte/mca/ess/pmi/Makefile
ac_cv_header_pmi2_h=yes
ac_cv_header_pmi_h=yes
ac_cv_lib_pmi2_PMI2_Init=yes
ac_cv_lib_pmi_PMI_Init=yes
MCA_BUILD_opal_pmix_cray_DSO_FALSE='#'
MCA_BUILD_opal_pmix_cray_DSO_TRUE=''
MCA_BUILD_opal_pmix_ext1x_DSO_FALSE='#'
MCA_BUILD_opal_pmix_ext1x_DSO_TRUE=''
MCA_BUILD_opal_pmix_ext2x_DSO_FALSE='#'
MCA_BUILD_opal_pmix_ext2x_DSO_TRUE=''
MCA_BUILD_opal_pmix_flux_DSO_FALSE='#'
MCA_BUILD_opal_pmix_flux_DSO_TRUE=''
MCA_BUILD_opal_pmix_isolated_DSO_FALSE='#'
MCA_BUILD_opal_pmix_isolated_DSO_TRUE=''
MCA_BUILD_opal_pmix_pmix2x_DSO_FALSE='#'
MCA_BUILD_opal_pmix_pmix2x_DSO_TRUE=''
MCA_BUILD_opal_pmix_s1_DSO_FALSE='#'
MCA_BUILD_opal_pmix_s1_DSO_TRUE=''
MCA_BUILD_opal_pmix_s2_DSO_FALSE='#'
MCA_BUILD_opal_pmix_s2_DSO_TRUE=''
MCA_BUILD_orte_ess_pmi_DSO_FALSE='#'
MCA_BUILD_orte_ess_pmi_DSO_TRUE=''
MCA_opal_FRAMEWORKS='common allocator backtrace btl compress crs dl event hwloc if installdirs memchecker memcpy memory mpool patcher pmix pstat rcache reachable shmem timer'
MCA_opal_FRAMEWORKS_SUBDIRS='mca/common mca/allocator mca/backtrace mca/btl mca/compress mca/crs mca/dl mca/event mca/hwloc mca/if mca/installdirs mca/memchecker mca/memcpy mca/memory mca/mpool mca/patcher mca/pmix mca/pstat mca/rcache mca/reachable mca/shmem mca/timer'
MCA_opal_FRAMEWORK_COMPONENT_ALL_SUBDIRS='$(MCA_opal_common_ALL_SUBDIRS) $(MCA_opal_allocator_ALL_SUBDIRS) $(MCA_opal_backtrace_ALL_SUBDIRS) $(MCA_opal_btl_ALL_SUBDIRS) $(MCA_opal_compress_ALL_SUBDIRS) $(MCA_opal_crs_ALL_SUBDIRS) $(MCA_opal_dl_ALL_SUBDIRS) $(MCA_opal_event_ALL_SUBDIRS) $(MCA_opal_hwloc_ALL_SUBDIRS) $(MCA_opal_if_ALL_SUBDIRS) $(MCA_opal_installdirs_ALL_SUBDIRS) $(MCA_opal_memchecker_ALL_SUBDIRS) $(MCA_opal_memcpy_ALL_SUBDIRS) $(MCA_opal_memory_ALL_SUBDIRS) $(MCA_opal_mpool_ALL_SUBDIRS) $(MCA_opal_patcher_ALL_SUBDIRS) $(MCA_opal_pmix_ALL_SUBDIRS) $(MCA_opal_pstat_ALL_SUBDIRS) $(MCA_opal_rcache_ALL_SUBDIRS) $(MCA_opal_reachable_ALL_SUBDIRS) $(MCA_opal_shmem_ALL_SUBDIRS) $(MCA_opal_timer_ALL_SUBDIRS)'
MCA_opal_FRAMEWORK_COMPONENT_DSO_SUBDIRS='$(MCA_opal_common_DSO_SUBDIRS) $(MCA_opal_allocator_DSO_SUBDIRS) $(MCA_opal_backtrace_DSO_SUBDIRS) $(MCA_opal_btl_DSO_SUBDIRS) $(MCA_opal_compress_DSO_SUBDIRS) $(MCA_opal_crs_DSO_SUBDIRS) $(MCA_opal_dl_DSO_SUBDIRS) $(MCA_opal_event_DSO_SUBDIRS) $(MCA_opal_hwloc_DSO_SUBDIRS) $(MCA_opal_if_DSO_SUBDIRS) $(MCA_opal_installdirs_DSO_SUBDIRS) $(MCA_opal_memchecker_DSO_SUBDIRS) $(MCA_opal_memcpy_DSO_SUBDIRS) $(MCA_opal_memory_DSO_SUBDIRS) $(MCA_opal_mpool_DSO_SUBDIRS) $(MCA_opal_patcher_DSO_SUBDIRS) $(MCA_opal_pmix_DSO_SUBDIRS) $(MCA_opal_pstat_DSO_SUBDIRS) $(MCA_opal_rcache_DSO_SUBDIRS) $(MCA_opal_reachable_DSO_SUBDIRS) $(MCA_opal_shmem_DSO_SUBDIRS) $(MCA_opal_timer_DSO_SUBDIRS)'
MCA_opal_FRAMEWORK_COMPONENT_STATIC_SUBDIRS='$(MCA_opal_common_STATIC_SUBDIRS) $(MCA_opal_allocator_STATIC_SUBDIRS) $(MCA_opal_backtrace_STATIC_SUBDIRS) $(MCA_opal_btl_STATIC_SUBDIRS) $(MCA_opal_compress_STATIC_SUBDIRS) $(MCA_opal_crs_STATIC_SUBDIRS) $(MCA_opal_dl_STATIC_SUBDIRS) $(MCA_opal_event_STATIC_SUBDIRS) $(MCA_opal_hwloc_STATIC_SUBDIRS) $(MCA_opal_if_STATIC_SUBDIRS) $(MCA_opal_installdirs_STATIC_SUBDIRS) $(MCA_opal_memchecker_STATIC_SUBDIRS) $(MCA_opal_memcpy_STATIC_SUBDIRS) $(MCA_opal_memory_STATIC_SUBDIRS) $(MCA_opal_mpool_STATIC_SUBDIRS) $(MCA_opal_patcher_STATIC_SUBDIRS) $(MCA_opal_pmix_STATIC_SUBDIRS) $(MCA_opal_pstat_STATIC_SUBDIRS) $(MCA_opal_rcache_STATIC_SUBDIRS) $(MCA_opal_reachable_STATIC_SUBDIRS) $(MCA_opal_shmem_STATIC_SUBDIRS) $(MCA_opal_timer_STATIC_SUBDIRS)'
MCA_opal_FRAMEWORK_LIBS=' $(MCA_opal_common_STATIC_LTLIBS) mca/allocator/libmca_allocator.la $(MCA_opal_allocator_STATIC_LTLIBS) mca/backtrace/libmca_backtrace.la $(MCA_opal_backtrace_STATIC_LTLIBS) mca/btl/libmca_btl.la $(MCA_opal_btl_STATIC_LTLIBS) mca/compress/libmca_compress.la $(MCA_opal_compress_STATIC_LTLIBS) mca/crs/libmca_crs.la $(MCA_opal_crs_STATIC_LTLIBS) mca/dl/libmca_dl.la $(MCA_opal_dl_STATIC_LTLIBS) mca/event/libmca_event.la $(MCA_opal_event_STATIC_LTLIBS) mca/hwloc/libmca_hwloc.la $(MCA_opal_hwloc_STATIC_LTLIBS) mca/if/libmca_if.la $(MCA_opal_if_STATIC_LTLIBS) mca/installdirs/libmca_installdirs.la $(MCA_opal_installdirs_STATIC_LTLIBS) mca/memchecker/libmca_memchecker.la $(MCA_opal_memchecker_STATIC_LTLIBS) mca/memcpy/libmca_memcpy.la $(MCA_opal_memcpy_STATIC_LTLIBS) mca/memory/libmca_memory.la $(MCA_opal_memory_STATIC_LTLIBS) mca/mpool/libmca_mpool.la $(MCA_opal_mpool_STATIC_LTLIBS) mca/patcher/libmca_patcher.la $(MCA_opal_patcher_STATIC_LTLIBS) mca/pmix/libmca_pmix.la $(MCA_opal_pmix_STATIC_LTLIBS) mca/pstat/libmca_pstat.la $(MCA_opal_pstat_STATIC_LTLIBS) mca/rcache/libmca_rcache.la $(MCA_opal_rcache_STATIC_LTLIBS) mca/reachable/libmca_reachable.la $(MCA_opal_reachable_STATIC_LTLIBS) mca/shmem/libmca_shmem.la $(MCA_opal_shmem_STATIC_LTLIBS) mca/timer/libmca_timer.la $(MCA_opal_timer_STATIC_LTLIBS)'
MCA_opal_pmix_ALL_COMPONENTS=' isolated cray ext1x ext2x flux pmix2x s1 s2'
MCA_opal_pmix_ALL_SUBDIRS=' mca/pmix/isolated mca/pmix/cray mca/pmix/ext1x mca/pmix/ext2x mca/pmix/flux mca/pmix/pmix2x mca/pmix/s1 mca/pmix/s2'
MCA_opal_pmix_DSO_COMPONENTS=' isolated flux pmix2x s1 s2'
MCA_opal_pmix_DSO_SUBDIRS=' mca/pmix/isolated mca/pmix/flux mca/pmix/pmix2x mca/pmix/s1 mca/pmix/s2'
MCA_opal_pmix_STATIC_COMPONENTS=''
MCA_opal_pmix_STATIC_LTLIBS=''
MCA_opal_pmix_STATIC_SUBDIRS=''
MCA_orte_ess_ALL_COMPONENTS=' env hnp pmi singleton tool alps lsf slurm tm'
MCA_orte_ess_ALL_SUBDIRS=' mca/ess/env mca/ess/hnp mca/ess/pmi mca/ess/singleton mca/ess/tool mca/ess/alps mca/ess/lsf mca/ess/slurm mca/ess/tm'
MCA_orte_ess_DSO_COMPONENTS=' env hnp pmi singleton tool slurm'
MCA_orte_ess_DSO_SUBDIRS=' mca/ess/env mca/ess/hnp mca/ess/pmi mca/ess/singleton mca/ess/tool mca/ess/slurm'
OPAL_CONFIGURE_CLI=' \'\''--prefix=/usr/local/\'\'' \'\''--with-cuda\'\'' \'\''--with-slurm\'\'' \'\''--with-pmi=/usr/local/slurm/include/slurm\'\'' \'\''--with-pmi-libdir=/usr/local/slurm/lib64\'\'''
opal_pmi1_CPPFLAGS=''
opal_pmi1_LDFLAGS=''
opal_pmi1_LIBS='-lpmi'
opal_pmi1_rpath=''
opal_pmi2_CPPFLAGS=''
opal_pmi2_LDFLAGS=''
opal_pmi2_LIBS='-lpmi2'
opal_pmi2_rpath=''
opal_pmix_ext1x_CPPFLAGS=''
opal_pmix_ext1x_LDFLAGS=''
opal_pmix_ext1x_LIBS=''
opal_pmix_ext2x_CPPFLAGS=''
opal_pmix_ext2x_LDFLAGS=''
opal_pmix_ext2x_LIBS=''
opal_pmix_pmix2x_CPPFLAGS='-I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix'
opal_pmix_pmix2x_DEPENDENCIES='/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix/src/libpmix.la'
opal_pmix_pmix2x_LDFLAGS=''
opal_pmix_pmix2x_LIBS='/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix/src/libpmix.la'
pmix_alps_CPPFLAGS=''
pmix_alps_LDFLAGS=''
pmix_alps_LIBS=''
pmix_cray_CPPFLAGS=''
pmix_cray_LDFLAGS=''
pmix_cray_LIBS=''

Ralph H Castain
2018-10-10 16:51:01 UTC
Permalink
Yeah, the CPPFLAGS didn’t get set. I suspect that’s because the location being pointed to starts with "/usr/local" and therefore looks like a standard system path. What you might do is set CPPFLAGS yourself to "-I/usr/local/slurm/include/slurm" before running configure; that should fix the problem. You might also need to set LDFLAGS="-L/usr/local/slurm/lib64".
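For reference, a minimal sketch of that suggestion as a build recipe, using the paths from this thread (adjust them to your Slurm install; whether configure still needs the --with-pmi flags once the variables are exported is an assumption here, but passing both is harmless):

```shell
# Export the Slurm PMI header/library locations explicitly so the
# pmix s1/s2 component compiles can find pmi.h and libpmi, even though
# they live under a /usr/local prefix that configure treats as standard.
export CPPFLAGS="-I/usr/local/slurm/include/slurm"
export LDFLAGS="-L/usr/local/slurm/lib64"

./configure --prefix=/usr/local/ \
    --with-cuda \
    --with-slurm \
    --with-pmi=/usr/local/slurm/include/slurm \
    --with-pmi-libdir=/usr/local/slurm/lib64
make -j"$(nproc)" && make install
```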


> On Oct 10, 2018, at 9:38 AM, Ross, Daniel B. via users <***@lists.open-mpi.org> wrote:
>
> Here is the best I can do. The config.log is too large to email.
> grep pmi config.log
> $ ./configure --prefix=/usr/local/ --with-cuda --with-slurm --with-pmi=/usr/local/slurm/include/slurm --with pmi-libdir=/usr/local/slurm/lib64
> configure:71744: result: '--prefix=/usr/local/' '--with-cuda' '--with-slurm' '--with-pmi=/usr/local/slurm/include/slurm' '--with-pmi-libdir=/usr/local/slurm/lib64'
> configure:85150: checking for pmi.h in /usr/local/slurm/include/slurm
> configure:85208: checking pmi.h usability
> configure:85208: checking pmi.h presence
> configure:85208: checking for pmi.h
> configure:85270: checking for libpmi in /usr/local/slurm/lib64
> configure:85277: checking for PMI_Init in -lpmi
> configure:85302: gcc -std=gnu99 -o conftest -O3 -DNDEBUG -finline-functions -fno-strict-aliasing -mcx16 -pthread -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/hwloc/hwloc1117/hwloc/include -I/usr/local/slurm/include/slurm -L/usr/local/slurm/lib64 conftest.c -lpmi -lm -lutil -lz -lpmi >&5
> configure:85481: checking for pmi2.h in /usr/local/slurm/include/slurm
> configure:85539: checking pmi2.h usability
> configure:85539: checking pmi2.h presence
> configure:85539: checking for pmi2.h
> configure:85601: checking for libpmi2 in /usr/local/slurm/lib64
> configure:85608: checking for PMI2_Init in -lpmi2
> configure:85633: gcc -std=gnu99 -o conftest -O3 -DNDEBUG -finline-functions -fno-strict-aliasing -mcx16 -pthread -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/hwloc/hwloc1117/hwloc/include -I/usr/local/slurm/include/slurm -L/usr/local/slurm/lib64 conftest.c -lpmi2 -lm -lutil -lz -lpmi2 >&5
> configure:86063: result: common, allocator, backtrace, btl, compress, crs, dl, event, hwloc, if, installdirs, memchecker, memcpy, memory, mpool, patcher, pmix, pstat, rcache, reachable, shmem, timer
> configure:130096: running /bin/sh './configure' --disable-dns --disable-http --disable-rpc --disable-openssl --enable-thread-support --disable-evport '--prefix=/usr/local/' '--with-cuda' '--with-slurm' '--with-pmi=/usr/local/slurm/include/slurm' '--with-pmi-libdir=/usr/local/slurm/lib64' 'CPPFLAGS=-I/usr/local/src/openmpi/openmpi-3.1.2 -I/usr/local/src/openmpi/openmpi-3.1.2 -I/usr/local/src/openmpi/openmpi-3.1.2/opal/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/hwloc/hwloc1117/hwloc/include -Drandom=opal_random' --cache-file=/dev/null --srcdir=. --disable-option-checking
> configure:5903: +++ Configuring MCA framework pmix
> configure:146895: checking for no configure components in framework pmix
> configure:146899: checking for m4 configure components in framework pmix
> configure:146901: result: cray, ext1x, ext2x, flux, pmix2x, s1, s2
> configure:5918: --- MCA component pmix:isolated (no configuration)
> configure:146996: checking for MCA component pmix:isolated compile mode
> configure:147038: checking if MCA component pmix:isolated can compile
> configure:5918: --- MCA component pmix:cray (m4 configuration macro)
> configure:147646: checking for MCA component pmix:cray compile mode
> configure:147691: $PKG_CONFIG --exists --print-errors "cray-pmi"
> Package cray-pmi was not found in the pkg-config search path.
> Perhaps you should add the directory containing `cray-pmi.pc'
> No package 'cray-pmi' found
> configure:147708: $PKG_CONFIG --exists --print-errors "cray-pmi"
> Package cray-pmi was not found in the pkg-config search path.
> Perhaps you should add the directory containing `cray-pmi.pc'
> No package 'cray-pmi' found
> configure:147725: $PKG_CONFIG --exists --print-errors "cray-pmi"
> Package cray-pmi was not found in the pkg-config search path.
> Perhaps you should add the directory containing `cray-pmi.pc'
> No package 'cray-pmi' found
> No package 'cray-pmi' found
> configure:149038: checking if MCA component pmix:cray can compile
> configure:5918: --- MCA component pmix:ext1x (m4 configuration macro)
> configure:149182: checking for MCA component pmix:ext1x compile mode
> configure:149748: checking if MCA component pmix:ext1x can compile
> configure:5918: --- MCA component pmix:ext2x (m4 configuration macro)
> configure:149892: checking for MCA component pmix:ext2x compile mode
> configure:150447: checking if MCA component pmix:ext2x can compile
> configure:5918: --- MCA component pmix:flux (m4 configuration macro)
> configure:150591: checking for MCA component pmix:flux compile mode
> configure:150802: checking if MCA component pmix:flux can compile
> configure:5918: --- MCA component pmix:pmix2x (m4 configuration macro)
> configure:151409: checking for MCA component pmix:pmix2x compile mode
> configure:151568: OPAL configuring in opal/mca/pmix/pmix2x/pmix
> configure:151649: running /bin/sh './configure' --enable-embedded-mode --disable-debug --with-pmix-symbol-rename=OPAL_MCA_PMIX2X_ --disable-pmix-timing --without-tests-examples --disable-pmix-backward-compatibility --disable-visibility --enable-embedded-libevent --with-libevent-header=\"opal/mca/event/libevent2022/libevent2022.h\" '--prefix=/usr/local/' '--with-cuda' '--with-slurm' '--with-pmi=/usr/local/slurm/include/slurm' '--with-pmi-libdir=/usr/local/slurm/lib64' 'CFLAGS=-O3 -DNDEBUG ' 'CPPFLAGS=-I/usr/local/src/openmpi/openmpi-3.1.2 -I/usr/local/src/openmpi/openmpi-3.1.2 -I/usr/local/src/openmpi/openmpi-3.1.2/opal/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/event/libevent2022/libevent -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/event/libevent2022/libevent/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/hwloc/hwloc1117/hwloc/include ' --cache-file=/dev/null --srcdir=. --disable-option-checking
> configure:151655: /bin/sh './configure' succeeded for opal/mca/pmix/pmix2x/pmix
> configure:151809: checking if MCA component pmix:pmix2x can compile
> configure:5918: --- MCA component pmix:s1 (m4 configuration macro)
> configure:152416: checking for MCA component pmix:s1 compile mode
> configure:152478: checking if MCA component pmix:s1 can compile
> configure:5918: --- MCA component pmix:s2 (m4 configuration macro)
> configure:153085: checking for MCA component pmix:s2 compile mode
> configure:153147: checking if MCA component pmix:s2 can compile
> configure:177961: result: env, hnp, pmi, singleton, tool
> configure:5918: --- MCA component ess:pmi (no configuration)
> configure:179336: checking for MCA component ess:pmi compile mode
> configure:179378: checking if MCA component ess:pmi can compile
> config.status:4112: creating opal/mca/pmix/Makefile
> config.status:4112: creating opal/mca/pmix/isolated/Makefile
> config.status:4112: creating opal/mca/pmix/cray/Makefile
> config.status:4112: creating opal/mca/pmix/ext1x/Makefile
> config.status:4112: creating opal/mca/pmix/ext2x/Makefile
> config.status:4112: creating opal/mca/pmix/flux/Makefile
> config.status:4112: creating opal/mca/pmix/pmix2x/Makefile
> config.status:4112: creating opal/mca/pmix/s1/Makefile
> config.status:4112: creating opal/mca/pmix/s2/Makefile
> config.status:4112: creating orte/mca/ess/pmi/Makefile
> ac_cv_header_pmi2_h=yes
> ac_cv_header_pmi_h=yes
> ac_cv_lib_pmi2_PMI2_Init=yes
> ac_cv_lib_pmi_PMI_Init=yes
> MCA_BUILD_opal_pmix_cray_DSO_FALSE='#'
> MCA_BUILD_opal_pmix_cray_DSO_TRUE=''
> MCA_BUILD_opal_pmix_ext1x_DSO_FALSE='#'
> MCA_BUILD_opal_pmix_ext1x_DSO_TRUE=''
> MCA_BUILD_opal_pmix_ext2x_DSO_FALSE='#'
> MCA_BUILD_opal_pmix_ext2x_DSO_TRUE=''
> MCA_BUILD_opal_pmix_flux_DSO_FALSE='#'
> MCA_BUILD_opal_pmix_flux_DSO_TRUE=''
> MCA_BUILD_opal_pmix_isolated_DSO_FALSE='#'
> MCA_BUILD_opal_pmix_isolated_DSO_TRUE=''
> MCA_BUILD_opal_pmix_pmix2x_DSO_FALSE='#'
> MCA_BUILD_opal_pmix_pmix2x_DSO_TRUE=''
> MCA_BUILD_opal_pmix_s1_DSO_FALSE='#'
> MCA_BUILD_opal_pmix_s1_DSO_TRUE=''
> MCA_BUILD_opal_pmix_s2_DSO_FALSE='#'
> MCA_BUILD_opal_pmix_s2_DSO_TRUE=''
> MCA_BUILD_orte_ess_pmi_DSO_FALSE='#'
> MCA_BUILD_orte_ess_pmi_DSO_TRUE=''
> MCA_opal_FRAMEWORKS='common allocator backtrace btl compress crs dl event hwloc if installdirs memchecker memcpy memory mpool patcher pmix pstat rcache reachable shmem timer'
> MCA_opal_FRAMEWORKS_SUBDIRS='mca/common mca/allocator mca/backtrace mca/btl mca/compress mca/crs mca/dl mca/event mca/hwloc mca/if mca/installdirs mca/memchecker mca/memcpy mca/memory mca/mpool mca/patcher mca/pmix mca/pstat mca/rcache mca/reachable mca/shmem mca/timer'
> MCA_opal_FRAMEWORK_COMPONENT_ALL_SUBDIRS='$(MCA_opal_common_ALL_SUBDIRS) $(MCA_opal_allocator_ALL_SUBDIRS) $(MCA_opal_backtrace_ALL_SUBDIRS) $(MCA_opal_btl_ALL_SUBDIRS) $(MCA_opal_compress_ALL_SUBDIRS) $(MCA_opal_crs_ALL_SUBDIRS) $(MCA_opal_dl_ALL_SUBDIRS) $(MCA_opal_event_ALL_SUBDIRS) $(MCA_opal_hwloc_ALL_SUBDIRS) $(MCA_opal_if_ALL_SUBDIRS) $(MCA_opal_installdirs_ALL_SUBDIRS) $(MCA_opal_memchecker_ALL_SUBDIRS) $(MCA_opal_memcpy_ALL_SUBDIRS) $(MCA_opal_memory_ALL_SUBDIRS) $(MCA_opal_mpool_ALL_SUBDIRS) $(MCA_opal_patcher_ALL_SUBDIRS) $(MCA_opal_pmix_ALL_SUBDIRS) $(MCA_opal_pstat_ALL_SUBDIRS) $(MCA_opal_rcache_ALL_SUBDIRS) $(MCA_opal_reachable_ALL_SUBDIRS) $(MCA_opal_shmem_ALL_SUBDIRS) $(MCA_opal_timer_ALL_SUBDIRS)'
> MCA_opal_FRAMEWORK_COMPONENT_DSO_SUBDIRS='$(MCA_opal_common_DSO_SUBDIRS) $(MCA_opal_allocator_DSO_SUBDIRS) $(MCA_opal_backtrace_DSO_SUBDIRS) $(MCA_opal_btl_DSO_SUBDIRS) $(MCA_opal_compress_DSO_SUBDIRS) $(MCA_opal_crs_DSO_SUBDIRS) $(MCA_opal_dl_DSO_SUBDIRS) $(MCA_opal_event_DSO_SUBDIRS) $(MCA_opal_hwloc_DSO_SUBDIRS) $(MCA_opal_if_DSO_SUBDIRS) $(MCA_opal_installdirs_DSO_SUBDIRS) $(MCA_opal_memchecker_DSO_SUBDIRS) $(MCA_opal_memcpy_DSO_SUBDIRS) $(MCA_opal_memory_DSO_SUBDIRS) $(MCA_opal_mpool_DSO_SUBDIRS) $(MCA_opal_patcher_DSO_SUBDIRS) $(MCA_opal_pmix_DSO_SUBDIRS) $(MCA_opal_pstat_DSO_SUBDIRS) $(MCA_opal_rcache_DSO_SUBDIRS) $(MCA_opal_reachable_DSO_SUBDIRS) $(MCA_opal_shmem_DSO_SUBDIRS) $(MCA_opal_timer_DSO_SUBDIRS)'
> MCA_opal_FRAMEWORK_COMPONENT_STATIC_SUBDIRS='$(MCA_opal_common_STATIC_SUBDIRS) $(MCA_opal_allocator_STATIC_SUBDIRS) $(MCA_opal_backtrace_STATIC_SUBDIRS) $(MCA_opal_btl_STATIC_SUBDIRS) $(MCA_opal_compress_STATIC_SUBDIRS) $(MCA_opal_crs_STATIC_SUBDIRS) $(MCA_opal_dl_STATIC_SUBDIRS) $(MCA_opal_event_STATIC_SUBDIRS) $(MCA_opal_hwloc_STATIC_SUBDIRS) $(MCA_opal_if_STATIC_SUBDIRS) $(MCA_opal_installdirs_STATIC_SUBDIRS) $(MCA_opal_memchecker_STATIC_SUBDIRS) $(MCA_opal_memcpy_STATIC_SUBDIRS) $(MCA_opal_memory_STATIC_SUBDIRS) $(MCA_opal_mpool_STATIC_SUBDIRS) $(MCA_opal_patcher_STATIC_SUBDIRS) $(MCA_opal_pmix_STATIC_SUBDIRS) $(MCA_opal_pstat_STATIC_SUBDIRS) $(MCA_opal_rcache_STATIC_SUBDIRS) $(MCA_opal_reachable_STATIC_SUBDIRS) $(MCA_opal_shmem_STATIC_SUBDIRS) $(MCA_opal_timer_STATIC_SUBDIRS)'
> MCA_opal_FRAMEWORK_LIBS=' $(MCA_opal_common_STATIC_LTLIBS) mca/allocator/libmca_allocator.la <http://libmca_allocator.la/> $(MCA_opal_allocator_STATIC_LTLIBS) mca/backtrace/libmca_backtrace.la <http://libmca_backtrace.la/> $(MCA_opal_backtrace_STATIC_LTLIBS) mca/btl/libmca_btl.la <http://libmca_btl.la/> $(MCA_opal_btl_STATIC_LTLIBS) mca/compress/libmca_compress.la <http://libmca_compress.la/> $(MCA_opal_compress_STATIC_LTLIBS) mca/crs/libmca_crs.la <http://libmca_crs.la/> $(MCA_opal_crs_STATIC_LTLIBS) mca/dl/libmca_dl.la <http://libmca_dl.la/>$(MCA_opal_dl_STATIC_LTLIBS) mca/event/libmca_event.la <http://libmca_event.la/> $(MCA_opal_event_STATIC_LTLIBS) mca/hwloc/libmca_hwloc.la <http://libmca_hwloc.la/>$(MCA_opal_hwloc_STATIC_LTLIBS) mca/if/libmca_if.la <http://libmca_if.la/> $(MCA_opal_if_STATIC_LTLIBS) mca/installdirs/libmca_installdirs.la <http://libmca_installdirs.la/>$(MCA_opal_installdirs_STATIC_LTLIBS) mca/memchecker/libmca_memchecker.la <http://libmca_memchecker.la/> $(MCA_opal_memchecker_STATIC_LTLIBS) mca/memcpy/libmca_memcpy.la <http://libmca_memcpy.la/> $(MCA_opal_memcpy_STATIC_LTLIBS) mca/memory/libmca_memory.la <http://libmca_memory.la/> $(MCA_opal_memory_STATIC_LTLIBS) mca/mpool/libmca_mpool.la <http://libmca_mpool.la/> $(MCA_opal_mpool_STATIC_LTLIBS) mca/patcher/libmca_patcher.la <http://libmca_patcher.la/> $(MCA_opal_patcher_STATIC_LTLIBS) mca/pmix/libmca_pmix.la <http://libmca_pmix.la/> $(MCA_opal_pmix_STATIC_LTLIBS) mca/pstat/libmca_pstat.la <http://libmca_pstat.la/> $(MCA_opal_pstat_STATIC_LTLIBS) mca/rcache/libmca_rcache.la <http://libmca_rcache.la/>$(MCA_opal_rcache_STATIC_LTLIBS) mca/reachable/libmca_reachable.la <http://libmca_reachable.la/> $(MCA_opal_reachable_STATIC_LTLIBS) mca/shmem/libmca_shmem.la <http://libmca_shmem.la/>$(MCA_opal_shmem_STATIC_LTLIBS) mca/timer/libmca_timer.la <http://libmca_timer.la/> $(MCA_opal_timer_STATIC_LTLIBS)'
> MCA_opal_pmix_ALL_COMPONENTS=' isolated cray ext1x ext2x flux pmix2x s1 s2'
> MCA_opal_pmix_ALL_SUBDIRS=' mca/pmix/isolated mca/pmix/cray mca/pmix/ext1x mca/pmix/ext2x mca/pmix/flux mca/pm x/pmix2x mca/pmix/s1 mca/pmix/s2'
> MCA_opal_pmix_DSO_COMPONENTS=' isolated flux pmix2x s1 s2'
> MCA_opal_pmix_DSO_SUBDIRS=' mca/pmix/isolated mca/pmix/flux mca/pmix/pmix2x mca/pmix/s1 mca/pmix/s2'
> MCA_opal_pmix_STATIC_COMPONENTS=''
> MCA_opal_pmix_STATIC_LTLIBS=''
> MCA_opal_pmix_STATIC_SUBDIRS=''
> MCA_orte_ess_ALL_COMPONENTS=' env hnp pmi singleton tool alps lsf slurm tm'
> MCA_orte_ess_ALL_SUBDIRS=' mca/ess/env mca/ess/hnp mca/ess/pmi mca/ess/singleton mca/ess/tool mca/ess/alps mca/ess/lsf mca/ess/slurm mca/ess/tm'
> MCA_orte_ess_DSO_COMPONENTS=' env hnp pmi singleton tool slurm'
> MCA_orte_ess_DSO_SUBDIRS=' mca/ess/env mca/ess/hnp mca/ess/pmi mca/ess/singleton mca/ess/tool mca/ess/slurm'
> OPAL_CONFIGURE_CLI=' \'\''--prefix=/usr/local/\'\'' \'\''--with-cuda\'\'' \'\''--with-slurm\'\'' \'\''--with-pmi=/usr/local/slurm/include/slurm\'\'' \'\''--with-pmi-libdir=/usr/local/slurm/lib64\'\'''
> opal_pmi1_CPPFLAGS=''
> opal_pmi1_LDFLAGS=''
> opal_pmi1_LIBS='-lpmi'
> opal_pmi1_rpath=''
> opal_pmi2_CPPFLAGS=''
> opal_pmi2_LDFLAGS=''
> opal_pmi2_LIBS='-lpmi2'
> opal_pmi2_rpath=''
> opal_pmix_ext1x_CPPFLAGS=''
> opal_pmix_ext1x_LDFLAGS=''
> opal_pmix_ext1x_LIBS=''
> opal_pmix_ext2x_CPPFLAGS=''
> opal_pmix_ext2x_LDFLAGS=''
> opal_pmix_ext2x_LIBS=''
> opal_pmix_pmix2x_CPPFLAGS='-I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix/include -I/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix'
> opal_pmix_pmix2x_DEPENDENCIES='/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix/src/libpmix.la'
> opal_pmix_pmix2x_LDFLAGS=''
> opal_pmix_pmix2x_LIBS='/usr/local/src/openmpi/openmpi-3.1.2/opal/mca/pmix/pmix2x/pmix/src/libpmix.la'
> pmix_alps_CPPFLAGS=''
> pmix_alps_LDFLAGS=''
> pmix_alps_LIBS=''
> pmix_cray_CPPFLAGS=''
> pmix_cray_LDFLAGS=''
> pmix_cray_LIBS=''
> _______________________________________________
> users mailing list
> ***@lists.open-mpi.org
> https://lists.open-mpi.org/mailman/listinfo/users
Bennet Fauber
2018-10-10 17:14:26 UTC
Permalink
I thought the --with-pmi=/the/dir was meant to point to the top of a
traditional FHS installation of PMI; e.g., /opt/pmi with
subdirectories for bin, lib, include, man, etc. It looks like this is
pointing only to the header file, based on the name.
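
Bennet's reading can be sketched as follows (a minimal illustration of the usual FHS-style derivation; the function name and exact flag layout are assumptions, not Open MPI's actual configure internals):

```shell
#!/bin/sh
# Illustration only: a --with-pmi prefix is typically expanded into
# compiler/linker flags from its subdirectories, so the option should
# point at the top of the install, not at .../include/slurm itself.
derive_flags() {
    prefix="$1"
    echo "CPPFLAGS=-I${prefix}/include -I${prefix}/include/slurm"
    echo "LDFLAGS=-L${prefix}/lib64"
}

derive_flags /usr/local/slurm
```

With the prefix given as /usr/local/slurm/include/slurm instead, every derived -I path would gain a spurious include/slurm suffix and the header would never be found.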


Ross, Daniel B. via users
2018-10-10 18:06:32 UTC
Permalink
This is a slurm build and you point it to the slurm pmi directories.


Bennet Fauber
2018-10-10 18:19:06 UTC
Permalink
Yes, I was just wondering about the path as specified,

--with-pmi=/usr/local/slurm/include/slurm

What do you get from

$ find /usr/local/slurm -name pmi.h



Ross, Daniel B. via users
2018-10-10 20:12:42 UTC
Permalink
find /usr/local/slurm/ -name pmi.h
/usr/local/slurm/include/slurm/pmi.h


Charles A Taylor
2018-10-10 22:02:16 UTC
Permalink
In our config the "--with-pmi" points to the slurm “prefix” dir not the slurm libdir. The options below work for us with SLURM installed in “/opt/slurm”.

I’ll note that after sharing this config with regard to another issue, it was recommended to drop the “/usr” in the “--with-foo=/usr” options and simply
use “--with-foo”. That said, the configuration builds, runs, and works with both pmi2 and pmix_v2 under SLURM (17.11.5).

Hope it helps,

Charlie Taylor
UF Research Computing

CFG_OPTS=""
CFG_OPTS="$CFG_OPTS CC=icc CXX=icpc FC=ifort FFLAGS=\"-O2 -g -warn -m64\" LDFLAGS=\"\" "
CFG_OPTS="$CFG_OPTS --enable-static"
CFG_OPTS="$CFG_OPTS --enable-orterun-prefix-by-default"
CFG_OPTS="$CFG_OPTS --with-slurm=/opt/slurm"
CFG_OPTS="$CFG_OPTS --with-pmix=/opt/pmix/2.1.1"
CFG_OPTS="$CFG_OPTS --with-pmi=/opt/slurm"
CFG_OPTS="$CFG_OPTS --with-libevent=external"
CFG_OPTS="$CFG_OPTS --with-hwloc=external"
CFG_OPTS="$CFG_OPTS --with-verbs=/usr"
CFG_OPTS="$CFG_OPTS --with-libfabric=/usr"
CFG_OPTS="$CFG_OPTS --with-ucx=/usr"
CFG_OPTS="$CFG_OPTS --with-verbs-libdir=/usr/lib64"
CFG_OPTS="$CFG_OPTS --with-mxm=no"
CFG_OPTS="$CFG_OPTS --with-cuda=${HPC_CUDA_DIR}"
CFG_OPTS="$CFG_OPTS --enable-openib-udcm"
CFG_OPTS="$CFG_OPTS --enable-openib-rdmacm"
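
As a usage sketch (the eval line and the prefix path are assumptions, not part of the quoted config): a CFG_OPTS string built this way is normally handed to configure via eval, so the escaped quotes inside FFLAGS=\"...\" survive word splitting.

```shell
#!/bin/sh
# Sketch of consuming a CFG_OPTS string like the one above
# (paths here are hypothetical).
CFG_OPTS=""
CFG_OPTS="$CFG_OPTS --with-slurm=/opt/slurm"
CFG_OPTS="$CFG_OPTS --with-pmi=/opt/slurm"
# In a real build: eval ./configure --prefix=/opt/openmpi $CFG_OPTS
echo "would run: ./configure$CFG_OPTS"
```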




Gilles Gouaillardet
2018-10-11 03:00:39 UTC
Permalink
I digged a bit the configury logic and found


./configure --prefix=/usr/local/ --with-cuda --with-slurm
--with-pmi=/usr/local/slurm


should do the trick, if not


./configure --prefix=/usr/local/ --with-cuda --with-slurm
--with-pmi=/usr/local/slurm --with-pmi-libdir=/usr/local/slurm/lib64


should work.


Internally, if we pass --with-pmi=FOO and only FOO/pmi.h exists, then we do not
set slurm_pmi_found=yes, and the opal_pmi{1,2}_CPPFLAGS do not get set.


Cheers,


Gilles


On 10/11/2018 7:02 AM, Charles A Taylor wrote:
> ./configure --prefix=/usr/local/ --with-cuda --with-slurm
> --with-pmi=/usr/local/slurm/include/slurm
> --with-pmi-libdir=/usr/local/slurm/lib64
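
The check Gilles describes can be sketched like this (a simplification; the function and variable names are modeled on the config.log output above, not the actual m4 source):

```shell
#!/bin/sh
# Simplified sketch: configure expects pmi.h under the prefix at
# include/slurm/pmi.h. Passing .../include/slurm as the prefix makes
# the probe look for .../include/slurm/include/slurm/pmi.h, which fails.
check_slurm_pmi() {
    pmi_prefix="$1"
    if [ -f "$pmi_prefix/include/slurm/pmi.h" ]; then
        echo "slurm_pmi_found=yes"
    else
        echo "slurm_pmi_found=no"
    fi
}
```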
Ross, Daniel B. via users
2018-10-11 12:01:46 UTC
Permalink
$ ./configure --prefix=/usr/local/ --with-cuda --with-slurm --with-pmi=/usr/local/slurm/include/slurm --with-pmi-libdir=/usr/local/slurm/lib64 CPPFLAGS=-I/usr/local/slurm/include/slurm LDFLAGS=-L/usr/local/slurm/lib64

seemed to work just fine.
Thanks for the assistance.
