Computers cannot think for themselves; they perform tasks according to the instructions provided by the user. So how does this process work?
Computing is the use of a computer to process information and complete a goal-oriented task. Since a wide variety of tasks can be assigned to a computer, depending on the required outputs, computation can take place in many different ways, often determined by the architecture of the system.
In the early days, a computer system had a single processor. Each problem to be solved had to be divided into a series of instructions and fed to the processor sequentially; only one instruction was executed at any moment, so only one task could be completed at a time. This was a slow and inefficient process.
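As a minimal illustration, here is a toy sequential computation in Python. The million-element summation is an assumed workload, chosen only so that it can be parallelized later in this article.

```python
# A toy sequential computation: a single processor walks through the
# instructions one at a time, so only one piece of work is in flight
# at any moment and the total time is the sum of every step.
numbers = list(range(1_000_000))   # an illustrative workload

total = 0
for n in numbers:                  # each addition waits for the previous one
    total += n

print(total)                       # 499999500000
```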
To improve performance, clock frequencies were increased, but higher frequencies led to higher temperatures. In a world of multitasking, a single processor could no longer do the job alone: we expect our computers to solve all of our problems and to produce error-free output as well. This demand led to new computation techniques that match our needs and expectations.
Here we will discuss two computation types: parallel computing and distributed computing.
Parallel Computing vs Distributed Computing
The difference between Parallel Computing and Distributed Computing has been concisely highlighted in the table below.
| Parallel computing | Distributed computing |
| --- | --- |
| Multiple processors perform multiple tasks simultaneously; the tasks are broken down from a single main problem. | Multiple networked computers perform tasks at the same time to achieve a single shared goal. |
| Done on a single computer. | Requires multiple computers. |
| Memory is either shared or distributed between the processors. | Each computer has its own memory. |
| Multiple processors perform the processing. | Multiple computers perform multiple operations. |
| Processors communicate with each other over a bus. | Computers communicate with each other over a network. |
| Significantly increases system performance, provides concurrency, and saves time. | Provides scalability, allows resources to be shared, and helps perform computation tasks efficiently. |
| All processors share a single master clock for synchronization. | There is no global clock; synchronization algorithms are used instead. |
| Environments are tightly coupled. | Environments may be loosely or tightly coupled. |
| Used where exceptionally high and fast processing power is required. | Used when computers are at different geographical locations and higher latency is acceptable. |
| Example: supercomputers. | Example: Facebook. |
What is Parallel Computing?
Parallel computing (also known as parallel processing) is, in simple terms, a setup in which several processors work on a computation in parallel.
Because a single processor could not do the job alone, parallel computing was introduced. Here, a single problem or process is divided into many smaller, discrete subproblems, which are further broken down into instructions.
Each set of instructions is assigned to a processor. The processors communicate with each other through a shared memory space and execute their instructions simultaneously. The results of each task are finally combined into a single output for the overall problem, as the sketch below illustrates.
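Here is a minimal sketch of this divide-execute-combine pattern, using Python's standard multiprocessing module. Strictly speaking, these worker processes exchange data by message passing rather than through shared memory, and the choice of four workers is an assumption about the machine, but the overall pattern is the same.

```python
# A minimal sketch of parallel computing: one large problem (summing a
# big list) is split into chunks, each chunk is handled by a separate
# worker process, and the partial results are combined at the end.
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker process runs this on its own chunk, in parallel.
    return sum(chunk)

if __name__ == "__main__":
    numbers = list(range(1_000_000))      # the single main problem
    n_workers = 4                         # assume a 4-core machine
    size = len(numbers) // n_workers
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]

    with Pool(processes=n_workers) as pool:
        partials = pool.map(partial_sum, chunks)   # executed simultaneously

    total = sum(partials)                 # combine into one final output
    print(total)                          # 499999500000
```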
A parallel system is typically housed in a single data centre, where several processors are installed; the problems to compute are divided into small chunks and executed simultaneously across those processors. The importance of parallel computing keeps growing with the increasing use of multicore processors.
Parallel computing is used in fields where massive processing power and complex calculations are required, and it saves a great deal of time.
The downside of parallel computing is that increasing the number of processors can be expensive. Latency can also arise when the result of an instruction executing on one processor is needed by another.
Parallel computation can be classified into bit-level, instruction-level, and superword-level parallelism, as well as data and task parallelism; the contrast between the last two is sketched below.
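Bit-level, instruction-level, and superword-level parallelism happen inside the hardware and the compiler, but data and task parallelism can be illustrated in ordinary code. Below is a rough Python contrast using the standard concurrent.futures module; the functions themselves are hypothetical stand-ins for real workloads.

```python
# Data parallelism: the SAME operation runs on different pieces of data.
# Task parallelism: DIFFERENT operations run at the same time.
from concurrent.futures import ProcessPoolExecutor

def square(x):           # one operation, applied to many data items
    return x * x

def word_count(text):    # two unrelated operations...
    return len(text.split())

def char_count(text):    # ...that can run as independent tasks
    return len(text)

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        # Data parallelism: square() applied to each element in parallel.
        squares = list(pool.map(square, [1, 2, 3, 4]))

        # Task parallelism: two different functions submitted concurrently.
        f1 = pool.submit(word_count, "parallel computing saves time")
        f2 = pool.submit(char_count, "parallel computing saves time")
        print(squares, f1.result(), f2.result())   # [1, 4, 9, 16] 4 29
```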
What is Distributed Computing?
Some problems are too complex to solve efficiently or accurately on a single processor. In distributed computing, a single problem is divided into multiple tasks, which are then distributed across many computers.
Distributed computing follows the same divide-and-conquer principle as parallel computing, but the computers communicate with each other by passing messages over the network. All of the computers work together to achieve a single goal, and once each computer finishes its share of the work, the results are collated and presented to the user.
A computer in a distributed system is called a node, and a group of nodes is called a cluster.
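The sketch below shows this message passing in miniature, using Python's standard socket and threading modules: a "coordinator" node sends part of a problem to a "worker" node, which computes its share and replies. Both nodes run on localhost here for convenience, and the host and port are arbitrary choices; in a real cluster the nodes would be separate machines.

```python
# A miniature distributed system on one machine: a worker node and a
# coordinator node pass messages over a TCP connection.
import socket
import threading

HOST, PORT = "127.0.0.1", 5000
ready = threading.Event()          # signals that the worker is listening

def worker_node():
    # The worker waits for a message, does its share of the work,
    # and sends the result back to the coordinator.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(4096).decode()        # message from coordinator
            result = sum(int(n) for n in data.split(","))
            conn.sendall(str(result).encode())     # reply with the result

def coordinator_node():
    # The coordinator distributes one chunk of the problem and
    # collects the worker's answer.
    ready.wait()
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"1,2,3,4,5")                  # message passing
        print("worker replied:", cli.recv(4096).decode())   # prints 15

if __name__ == "__main__":
    t = threading.Thread(target=worker_node)
    t.start()
    coordinator_node()
    t.join()
```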
Distributed computing has many advantages. It provides scalability and can be extended as demand grows. It is used to coordinate the use of shared resources and to supply communication services to users. However, it can be difficult to maintain a stable network connection and to design an efficient distributed system.
Author
Shriya Upasani
MIT World Peace University