In computer science, concurrency is the execution of multiple instruction sequences at the same time or in overlapping time periods. It is an essential characteristic of modern systems, driven by growing demand for parallel computing, real-time processing and multitasking. Concurrency improves resource sharing, computation speed and system modularity.
Examples include multiple computers on one network, multiple applications open on one computer and multiple users on one website. Concurrency can occur on individual servers, computers, applications or networks, and it is implemented through concurrent computing.
Concurrency rests on several principles that ensure threads and processes execute efficiently and do not interfere with one another:
- Interleaving — Alternating thread/process execution so that all threads and processes receive a fair share of processor time and resources.
- Synchronization — Coordination of multiple threads/processes to prevent interference between them.
- Mutual exclusion — Only one thread/process should access a shared resource at a time.
- Deadlock avoidance — Structuring resource requests (for example, acquiring resources in a fixed global order) so that deadlocks cannot occur.
- Resource allocation — Allocation of system resources such as processor time, memory and I/O devices.
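The synchronization and mutual-exclusion principles above can be sketched in a few lines. This is a minimal illustration, not taken from the text: the names `counter` and `worker` are assumptions, and Python's `threading.Lock` stands in for whatever mutual-exclusion primitive a given system provides. Four threads increment a shared counter, and the lock guarantees only one thread at a time runs the read-modify-write critical section.

```python
import threading

counter = 0
lock = threading.Lock()

def worker(increments: int) -> None:
    """Increment the shared counter under mutual exclusion."""
    global counter
    for _ in range(increments):
        with lock:        # only one thread may hold the lock at a time
            counter += 1  # critical section: read, modify, write

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 — no increment is lost
```

Without the lock, two threads could read the same value of `counter` and each write back the same incremented result, silently losing updates.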
Concurrency also brings challenges that require careful management:
- Race conditions — System output depends on the timing and order of events; depending on which event completes first, behavior can be unpredictable.
- Deadlocks — Two or more threads/processes waiting for one another to release a resource.
- Starvation — A thread/process is perpetually denied resource access.
- Priority inversion — A low-priority thread/process holds a lock on a resource that a high-priority thread/process needs, blocking the higher-priority work.
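The deadlock challenge above is commonly defused by the fixed-ordering idea mentioned under deadlock avoidance. The sketch below is illustrative, with assumed names (`lock_a`, `lock_b`, `task`): because every thread acquires `lock_a` before `lock_b`, no circular wait can form, so both threads always complete. Acquiring the locks in opposite orders in the two threads would instead risk each thread holding one lock while waiting forever for the other.

```python
import threading

lock_a = threading.Lock()
lock_b = threading.Lock()
log = []

def task(name: str) -> None:
    # Every thread acquires locks in the same global order: a, then b.
    # This ordering rules out the circular wait that causes deadlock.
    with lock_a:
        with lock_b:
            log.append(name)

t1 = threading.Thread(target=task, args=("t1",))
t2 = threading.Thread(target=task, args=("t2",))
t1.start(); t2.start()
t1.join(); t2.join()

print(sorted(log))  # ['t1', 't2'] — both threads finished, no deadlock
```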