Synchronization (computer science)


In computer science, especially parallel computing, synchronization is the coordination of concurrently executing threads or processes so that they complete a task in the correct order and avoid unexpected race conditions.
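
The following is a minimal sketch, not taken from the article, of why such coordination is needed: two POSIX threads increment a shared counter, and a mutex makes the update a critical section so the increments cannot race. The thread count and iteration count are arbitrary illustrative choices.

/* Two threads increment a shared counter; the mutex serializes the updates. */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg)
{
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&lock);   /* enter the critical section */
        counter++;                   /* shared update, now mutually exclusive */
        pthread_mutex_unlock(&lock); /* leave the critical section */
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("counter = %ld\n", counter); /* always 200000 with the mutex held */
    return 0;
}

Compiled with cc -pthread, this always prints 200000; removing the lock/unlock calls makes the final value nondeterministic, which is exactly the race condition synchronization is meant to prevent.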

There are many synchronization mechanisms, including locks (mutexes), semaphores, monitors, barriers, and non-blocking (lock-free) algorithms; a barrier is sketched below.
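
As an illustration of one of these mechanisms (again a sketch, not from the article), the following uses POSIX barriers, which are not available on every platform: each thread finishes phase 1, waits at the barrier until all threads have arrived, and only then starts phase 2. The thread count of 4 is an arbitrary choice.

/* Barrier synchronization: no thread starts phase 2 until all have finished phase 1. */
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4

static pthread_barrier_t barrier;

static void *phase_worker(void *arg)
{
    long id = (long)arg;
    printf("thread %ld: phase 1 done\n", id);
    pthread_barrier_wait(&barrier);   /* block until all NTHREADS threads arrive */
    printf("thread %ld: phase 2 starts\n", id);
    return NULL;
}

int main(void)
{
    pthread_t threads[NTHREADS];
    pthread_barrier_init(&barrier, NULL, NTHREADS);
    for (long i = 0; i < NTHREADS; i++)
        pthread_create(&threads[i], NULL, phase_worker, (void *)i);
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(threads[i], NULL);
    pthread_barrier_destroy(&barrier);
    return 0;
}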

File synchronization, on the other hand, is the process of keeping copies of files on two or more devices consistent with one another, for example between a computer and a portable device connected by a sync cable (see Mini-USB).

See also

Topics in parallel computing
General: High-performance computing
Parallelism: Data parallelism · Task parallelism
Theory: Speedup · Amdahl's law · Flynn's Taxonomy · Cost efficiency · Gustafson's Law · Karp-Flatt Metric
Elements: Process · Thread · Fiber · Parallel Random Access Machine
Coordination: Multiprocessing · Multithreading · Multitasking · Memory coherency · Cache coherency · Barrier · Synchronization · Distributed computing · Grid computing
Programming: Programming model · Implicit parallelism · Explicit parallelism
Hardware: Computer cluster · Beowulf · Symmetric multiprocessing · Non-Uniform Memory Access · Cache only memory architecture · Asymmetric multiprocessing · Simultaneous multithreading · Shared memory · Distributed memory · Massively parallel processing · Superscalar processing · Vector processing · Supercomputer
Software: Distributed shared memory · Application checkpointing
APIs: Pthreads · OpenMP · Message Passing Interface (MPI)
Problems: Embarrassingly parallel · Grand Challenge