Parallel programming
Parallel programming (also concurrent programming) is a computer programming technique that provides for the execution of operations concurrently, either within a single computer or across a number of systems. In the latter case, the term distributed computing is used. Multiprocessor machines achieve better performance by taking advantage of this kind of programming.
In parallel programming, a single task is split into a number of subtasks that can be computed relatively independently, and the partial results are then aggregated to form a single coherent solution. Parallel programming is most effective for tasks that can easily be broken down into independent subtasks, such as purely mathematical problems, e.g. factorisation.
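The split-compute-aggregate pattern described above can be sketched in Python with the standard multiprocessing module; the task here (a sum of squares), the chunk count of four, and the function names are illustrative choices, not part of the original text.

```python
from multiprocessing import Pool

def sum_of_squares(chunk):
    # Subtask: compute the partial sum for one chunk of numbers.
    return sum(n * n for n in chunk)

if __name__ == "__main__":
    numbers = list(range(1, 1001))
    # Split the single task into four relatively independent subtasks.
    chunks = [numbers[i::4] for i in range(4)]
    # Compute the subtasks concurrently in a pool of worker processes.
    with Pool(processes=4) as pool:
        partial_sums = pool.map(sum_of_squares, chunks)
    # Aggregate the partial results into a single coherent solution.
    total = sum(partial_sums)
    print(total)  # equal to the serial sum(n * n for n in numbers)
```

Because each chunk is independent, the workers need no coordination beyond the final aggregation, which is what makes such "embarrassingly parallel" problems a good fit for this technique.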
One way to achieve parallel programming is through distributed computing, which is a method of information processing in which work is performed by separate computers linked through a communications network.
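As a minimal sketch of that idea, the toy example below sends a subtask to a "worker" over a TCP connection and reads back the result; for simplicity the worker runs as a thread on the same machine, but the same socket protocol would apply to a worker on a remote host. The port number, message format, and function names are assumptions made for illustration.

```python
import socket
import threading

ready = threading.Event()

def worker(port):
    # Plays the role of a separate computer: accepts one connection,
    # receives a comma-separated list of numbers, and replies with their sum.
    with socket.socket() as srv:
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        ready.set()  # signal that the worker is accepting connections
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(4096).decode()
            numbers = [int(x) for x in data.split(",")]
            conn.sendall(str(sum(numbers)).encode())

def submit(port, numbers):
    # The coordinating machine: ships the subtask over the network
    # and reads back the aggregated result.
    with socket.socket() as cli:
        cli.connect(("127.0.0.1", port))
        cli.sendall(",".join(map(str, numbers)).encode())
        return int(cli.recv(4096).decode())

if __name__ == "__main__":
    t = threading.Thread(target=worker, args=(50007,))
    t.start()
    ready.wait()
    print(submit(50007, [1, 2, 3, 4]))  # prints 10
    t.join()
```

A real distributed system would add framing, error handling, and worker discovery, but the core exchange, serialising a subtask, sending it over the network, and collecting the result, is the same.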
Pioneers in the field of concurrent programming include Edsger Dijkstra and C. A. R. Hoare.
See also
- Critical sections
- Mutual exclusion
- Synchronization
- Computer multitasking
- Multithreading
- Concurrency control
- Coroutines
- Parallel processor