Multithreading & Asynchrony
Process
A process is an executing instance of a program that contains its own memory space and resources, operating independently of other processes.
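The isolation of memory spaces can be seen directly. A minimal sketch in Python (using the standard `multiprocessing` module; the names `counter` and `bump` are illustrative): mutating a global in a child process leaves the parent's copy untouched.

```python
# Each process gets its own memory space: the child mutates its own
# copy of `counter`, and the parent never sees the change.
import multiprocessing

counter = 0  # lives in this process's memory space

def bump():
    global counter
    counter += 1
    print("child sees:", counter)

if __name__ == "__main__":  # guard needed so spawned children don't re-run this
    p = multiprocessing.Process(target=bump)
    p.start()
    p.join()
    print("parent sees:", counter)  # still 0: the child changed its own copy
```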
Thread
A thread is an execution unit within a process. A process can be composed of multiple threads. Each thread has its own private stack, but all threads share the same heap.
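The stack/heap split can be demonstrated in a few lines. A minimal sketch in Python's `threading` module (the names `shared` and `worker` are illustrative): each thread's locals live on its own stack, while both threads mutate the same heap object.

```python
# Two threads in one process: locals are per-thread (private stack),
# the list is one heap object visible to both.
import threading

shared = []  # heap-allocated, shared by every thread in the process

def worker(name):
    local = f"{name}'s stack frame"  # private to this thread's stack
    shared.append(name)              # mutates the shared heap object

t1 = threading.Thread(target=worker, args=("t1",))
t2 = threading.Thread(target=worker, args=("t2",))
t1.start(); t2.start()
t1.join(); t2.join()
print(sorted(shared))  # both threads' writes landed in the same list
```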
Multithreading
Multithreading is the capability of a program to execute multiple threads, allowing efficient use of the system's resources. With multiple threads, work can run on different cores simultaneously, speeding the application up. Multiple threads can run either concurrently or in parallel.
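A minimal sketch of multithreading with a thread pool, assuming I/O-bound work (the `fetch` function here is a stand-in, not a real API). One caveat specific to Python: CPython's GIL keeps pure-Python bytecode off multiple cores at once, so threads overlap waits rather than computation; true CPU parallelism there needs processes.

```python
# Four tasks, each "waiting on I/O" for 0.1 s, run on a pool of four
# threads: the waits overlap, so total time is ~0.1 s, not ~0.4 s.
from concurrent.futures import ThreadPoolExecutor
import time

def fetch(i):
    time.sleep(0.1)  # stand-in for an I/O wait (network, disk, ...)
    return i * i

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(fetch, range(4)))
elapsed = time.perf_counter() - start

print(results)
print(elapsed < 0.4)  # the four waits overlapped
```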
Concurrency vs Parallelism
Concurrency
When two or more tasks can start, run, and complete in overlapping time periods. Concurrency can happen even on a single-core processor.
Parallelism
When two or more tasks run at the same instant. It requires two or more cores in a processor, each executing instructions simultaneously.
Relationship
Parallelism is a subset of concurrency. Concurrency is a structural property: the program is organized to manage multiple tasks that make progress independently, even if only one runs at a time. Parallelism is a physical execution state: multiple tasks are literally executing at the same instant, on separate CPU cores or processors.
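Concurrency without parallelism can be made concrete. A minimal sketch in Python, assuming `asyncio` as the runtime (the names `task` and `order` are illustrative): one thread, one event loop, yet two tasks make interleaved progress.

```python
# Two tasks interleave on a single thread: concurrency with no
# parallelism at all.
import asyncio

order = []

async def task(name):
    for step in range(2):
        order.append(f"{name}{step}")
        await asyncio.sleep(0)  # yield, letting the other task progress

async def main():
    await asyncio.gather(task("a"), task("b"))

asyncio.run(main())
print(order)  # steps of "a" and "b" alternate
```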
Asynchrony
Asynchrony refers to the occurrence of events independent of the main program flow. Asynchronous actions are executed in a non-blocking way, allowing the main program flow to continue processing.
Asynchrony is a programming model – the idea that you can initiate an operation and move on without waiting for it to finish. How that's implemented varies: a thread pool, an event loop, callbacks, coroutines – these are the mechanisms underneath.
It is the opposite of synchrony, where a process runs only after some other process has completed.
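"Initiate and move on" looks like this in code. A minimal sketch with Python's `asyncio` (the name `slow_double` is illustrative): the operation is kicked off as a task, the main flow keeps going, and the result is collected only when needed.

```python
# Start an operation without blocking the main flow, then synchronize
# on its result later.
import asyncio

async def slow_double(x):
    await asyncio.sleep(0.05)  # stand-in for I/O; suspends without blocking
    return x * 2

async def main():
    pending = asyncio.create_task(slow_double(21))  # kicked off, not awaited
    print("main flow continues while slow_double runs")
    return await pending  # now we wait, only because we need the result

result = asyncio.run(main())
print(result)
```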
Threading vs Asynchrony
Threading is about workers – the execution units that do the work. Asynchrony is about waiting – it lets a worker drop a task the moment it hits a wait (I/O, network, etc.) and pick up something else instead. In a thread pool model, tasks are assigned to workers; async makes sure those workers are never stuck idle just waiting.
Asynchrony doesn’t even require multiple threads. JavaScript’s event loop is the classic example: a single thread processes tasks one at a time, but when it hits an async operation – a network call, a timer, a file read – it hands that off to the runtime and moves on to the next task. When the operation completes, a callback is queued and the thread picks it up. One worker, many tasks in flight, zero blocking. This is why you can write highly concurrent JavaScript without ever spawning a thread.
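The same single-threaded event-loop pattern exists in Python's `asyncio`, which makes it easy to sketch "one worker, many tasks in flight" (the name `io_task` and the 0.05 s sleep are illustrative stand-ins for real I/O):

```python
# One thread, three tasks in flight: every task reaches its wait point
# before any of them finishes, because the worker drops each task at
# the await instead of blocking in it.
import asyncio

log = []

async def io_task(i):
    log.append(("start", i))
    await asyncio.sleep(0.05)  # hand the wait to the runtime, free the worker
    log.append(("done", i))

async def main():
    await asyncio.gather(*(io_task(i) for i in range(3)))

asyncio.run(main())
print(log[:3])  # all three starts precede any completion
```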