From: http://zhangyu8374.javaeye.com/blog/86305
OpenMP and MPI are two methods of parallel programming. The comparison is as follows:
- OpenMP: thread-level parallel granularity; shared memory; implicit data distribution; poor scalability.
- MPI: process-level parallel granularity; distributed memory; explicit data distribution; good scalability.
Because OpenMP relies on shared memory, it only applies to SMP and DSM machines and is not suitable for clusters. MPI, although it runs on almost any kind of machine, has a complex programming model:
- The application problem must be analyzed, partitioned, and mapped onto a set of distributed processes;
- Communication latency and load imbalance must be dealt with;
- Debugging MPI programs is troublesome;
- MPI programs have poor fault tolerance: if one process fails, the whole program fails.
The second point resonates with me the most. Whenever I hear reports from the parallel-computing groups in our department, they are invariably wrestling with communication latency and load imbalance; the quality of a parallel algorithm depends largely on how well it solves these two problems.
What advantages does MapReduce have over OpenMP and MPI?
- Automatic parallelization;
- Fault tolerance;
- A low barrier to entry.
Appendix:
- SMP (Symmetric Multi-Processing): shared bus and memory, a single operating system image. Scalable in software, but not in hardware.
- DSM (Distributed Shared Memory): an extension of SMP. Physically distributed storage; a single memory address space; non-uniform memory access; a single operating system image.
Related comments:
OpenMP is dominant in scientific computing and has great advantages for multithreading (including multi-core processors).
The combination of OpenMP and MPI has mature cases in terms of clusters.
MapReduce is a distributed programming model developed by Google for processing massive data sets on large clusters.
The above is reprinted!
Resources related to MPI:
Many of the following URLs may have expired; they are listed for reference only.
MPI function manual pages:
- http://www.mpi-forum.org/docs/mpi-11-html/node182.html#Node182

Step-by-step MPI programming exercises:
- http://www-unix.mcs.anl.gov/mpi/tutorial/mpiexmpl/contents.html

Main MPI homepages and the MPI standard:
- http://www.mpi-forum.org (MPI Forum home page)
- http://www.mcs.anl.gov/mpi (MPI home page)
- http://www.netlib.org/mpi/index.html (MPI section on Netlib)

MPI implementations:
- http://www-unix.mcs.anl.gov/mpi/mpich/ (MPICH home page, the ANL/MSU implementation)
- http://www.mcs.anl.gov/mpi/mpich (MPICH implementation)
- ftp://ftp.mcs.anl.gov/pub/mpi (MPICH implementation, FTP)
- http://www.erc.msstate.edu/misc/mpi/ (other MPI implementations)
- http://www.lsc.nd.edu/MPI2 (list of MPI implementations)
- http://www.erc.msstate.edu/mpi (MSU MPI project)
- http://www.mpi.nd.edu/lam/ (LAM/MPI)
- http://www.lam-mpi.org/ (LAM/MPI)

MPI documentation, discussion, and examples:
- http://www-unix.mcs.anl.gov/mpi/tutorial/ (MPI-related materials)
- http://www.erc.msstate.edu/mpi/mpi-faq.html (FAQ)
- http://wiki.mcs.anl.gov/mpich2/index.php/Frequently_Asked_Questions (answers to frequently asked questions)
- http://www.mpi-forum.org/docs (errata)
- http://www.erc.msstate.edu/misc/mpi/resources.html (MPI newsgroups)
- ftp://ftp.mpi-forum.org/pub/docs/ (MPI documents)
- http://www.mcs.anl.gov/mpi/usingmpi (MPI tutorial)
- http://www.mcs.anl.gov/mpi/usingmpi2 (MPI tutorial)
- ftp://ftp.mcs.anl.gov/pub/mpi/using/examples (MPI example programs)
- ftp://ftp.mcs.anl.gov/pub/mpi/using2/examples (MPI example programs)
- http://www-unix.mcs.anl.gov/mpi/tutorial/mpiexmpl/contents.html (MPI example programs)

One-step tutorial (MPI is easy to get started with):
- http://www.lam-mpi.org/tutorials/one-step/ezstart.php
- http://www-unix.mcs.anl.gov/mpi/tutorial/mpiexmpl/src/hellow/C/main.html
- http://www.huihoo.org/npact/mpiindex.htm