How do I build PETSc with Open MPI?

The only special configuration needed to build PETSc is to ensure that Open MPI's wrapper compilers (i.e., mpicc and mpif77) are in your $PATH before running the PETSc configure.py script. PETSc should then automatically find Open MPI's wrapper compilers and build itself correctly with Open MPI.
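For example, a minimal sketch of the shell setup involved, assuming Open MPI was installed under the hypothetical prefix /opt/openmpi (substitute your actual installation prefix):

    # Prepend Open MPI's bin directory so its wrapper compilers are found first
    export PATH=/opt/openmpi/bin:$PATH

    # Sanity check: both should resolve to the Open MPI wrappers
    which mpicc
    which mpif77

    # Run PETSc's configure.py as usual; it should detect the wrappers automatically
    ./configure.py

If which reports wrappers from a different MPI installation, adjust $PATH so that the Open MPI directory comes first.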
How do I build VASP with Open MPI?

The following was reported by an Open MPI user who was able to successfully build and run VASP with Open MPI:

I just compiled the latest VASP v4.6 using Open MPI v1.2.1, ifort v9.1, ACML v3.6.0, BLACS with patch-03, and ScaLAPACK v1.7.5 built with ACML. I configured Open MPI with the --enable-static flag. I used the VASP-supplied makefile.linux_ifc_opt and only corrected the paths to the ACML, ScaLAPACK, and BLACS dirs (I didn't lower the optimization to -O0 for mpi.f like I suggested before). The -D's are standard, except that I get a little better performance with -DscaLAPACK (I tested it without this option too):

    CPP = $(CPP_) -DMPI -DHOST="LinuxIFC" -DIF
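As a rough sketch of the Open MPI build step the reporter describes (the /opt/openmpi prefix is hypothetical; the rest of the VASP build follows the report above):

    # Configure and build Open MPI with static libraries enabled
    ./configure --prefix=/opt/openmpi --enable-static
    make all install

After installing Open MPI this way, the remaining work is editing makefile.linux_ifc_opt so its library paths point at your ACML, BLACS, and ScaLAPACK builds, then building VASP with its normal make procedure.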