MPI

From SNIC Documentation
{{software info
|description=Message Passing Interface
|license=free
|fields=Library
|resources=}}
[[Category:Parallel programming]]
{{PAGENAME}} (Message Passing Interface) is a library designed to provide information exchange between the tasks of a [[distributed memory programming|distributed memory]] parallel program. It is presently the de facto standard for implementing [[message passing]] in programs written in [[Fortran]], [[C]] or [[C++]].

It consists of a group of functions supporting different communication operations, from point-to-point messages between two tasks to collective operations involving a whole group of tasks.
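As an illustration, here is a minimal sketch in C (the variable names and message contents are made up for this example and are not taken from any SNIC material) showing one point-to-point message and one collective operation:

<pre>
/* Minimal sketch (illustrative only): every task reports itself, rank 1
 * sends one integer to rank 0 with a blocking point-to-point message,
 * and all tasks then take part in a collective sum of their rank numbers. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, value = 0, sum = 0;

    MPI_Init(&argc, &argv);                 /* start the MPI runtime         */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* id of this task (0..size-1)   */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of tasks         */

    printf("Hello from rank %d of %d\n", rank, size);

    /* Point-to-point communication: rank 1 sends an integer to rank 0. */
    if (size > 1) {
        if (rank == 1) {
            value = 42;
            MPI_Send(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        } else if (rank == 0) {
            MPI_Recv(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            printf("Rank 0 received %d from rank 1\n", value);
        }
    }

    /* Collective communication: sum the rank numbers onto rank 0. */
    MPI_Reduce(&rank, &sum, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
    if (rank == 0)
        printf("Sum of all ranks: %d\n", sum);

    MPI_Finalize();                         /* shut down the MPI runtime     */
    return 0;
}
</pre>

With a typical implementation such a program is compiled with the ''mpicc'' compiler wrapper and launched with ''mpirun'' or ''mpiexec'' (usually via the batch system of the resource); the exact commands depend on the MPI implementation and the cluster in question.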
  
 
== Experts ==
These experts have registered specific competence on this subject:

{| class="wikitable"
! Expert !! Centre !! Field !! AE FTE !! General activities
|-
| Anders Sjöström (LUNARC) || LUNARC || GPU computing, MATLAB, general programming, technical acoustics || 50 || Helps users with MATLAB, general programming, image processing, usage of clusters
|-
| Chandan Basu (NSC) || NSC || Computational science || 100 || Working on climate and weather codes, EU projects IS-ENES and PRACE
|-
| Joachim Hein (LUNARC) || LUNARC || Parallel programming, performance optimisation || 85 || Parallel programming support, performance optimisation, HPC training
|-
| Lilit Axner (PDC) || PDC || Computational fluid dynamics || 50 ||
|-
| Marcus Lundberg (UPPMAX) || UPPMAX || Computational science, parallel programming, performance tuning, sensitive data || 100 || I help users with productivity, program performance, and parallelisation. I also work with allocations and with sensitive data questions
|}
== Availability ==
{{list resources for software}}
All SNIC resources
== Resources ==

=== Tutorials and slide sets ===
* A [https://htor.inf.ethz.ch/teaching/mpi_tutorials/ppopp13/2013-02-24-ppopp-mpi-basic.pdf slide set] by Pavan Balaji and Torsten Hoefler from 2013 introducing basic MPI concepts.
* Lecture slides from the course ''Extreme scale systems'' by William Gropp at the University of Illinois. The following lectures deal with MPI: [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture22.pdf 22] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture23.pdf 23] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture24.pdf 24] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture25.pdf 25] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture26.pdf 26] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture27.pdf 27] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture28.pdf 28] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture29.pdf 29] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture30.pdf 30] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture31.pdf 31] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture32.pdf 32] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture33.pdf 33] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture34.pdf 34] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture35.pdf 35] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture36.pdf 36] [http://wgropp.cs.illinois.edu/courses/cs598-s16/lectures/lecture37.pdf 37]. This is an extremely comprehensive introduction to MPI.

=== Standard specifications ===
* [https://www.mpi-forum.org/docs/ Documentation of the MPI standard]
  
 
== License ==
{{show license}}
Both free (e.g. Open MPI) and commercial (e.g. Intel MPI) implementations are available.
