Parallel and Distributed Computing


Explore the concepts of parallel and distributed computing, delving into the definitions, differences, and aspects such as scalability, resource sharing, and synchronization. Discover how parallel computing enhances speed and efficiency by breaking down tasks into sub-tasks executed simultaneously, while distributed computing involves multiple systems collaborating on tasks across different locations.

  • Computing
  • Distributed Systems
  • Parallel Computing
  • Scalability
  • Resource Sharing


Presentation Transcript


  1. Parallel and Distributed Computing

  2. Points to study: What is parallel computing? What is distributed computing? Key differences between parallel and distributed computer systems: 1) Number of systems 2) Dependency 3) Scalability 4) Resource sharing 5) Synchronization 6) Usage

  3. What is Parallel Computing? What is Distributed Computing? Parallel computing is a model that divides a task into multiple sub-tasks and executes them simultaneously to increase speed and efficiency. Here, a problem is broken down into multiple parts, and each part is then broken down into a number of instructions. These parts are allocated to different processors, which execute them simultaneously. This increases the speed of execution of the program as a whole. Distributed computing is different from parallel computing, even though the principle is the same. Distributed computing is the field that studies distributed systems. Distributed systems are systems that have multiple computers located in different locations. The computers in a distributed system work on the same program: the program is divided into different tasks and allocated to different computers, which communicate with the help of message passing. Upon completion of computing, the results are collated and presented to the user.
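  As a rough sketch of the parallel model just described, the snippet below splits one task (summing a large range of numbers) into sub-tasks that separate processor cores execute at the same time before the partial results are combined. It uses Python's standard multiprocessing module; the function name, chunk count, and data are illustrative choices, not something prescribed by the slides.

    from multiprocessing import Pool

    def partial_sum(chunk):
        # Each worker process executes one sub-task independently.
        return sum(chunk)

    if __name__ == "__main__":
        data = range(1_000_000)
        # Break the problem into four parts, one per processor (illustrative).
        chunks = [data[i::4] for i in range(4)]
        with Pool(processes=4) as pool:
            partials = pool.map(partial_sum, chunks)  # sub-tasks run simultaneously
        print(sum(partials))  # collate the partial results into the whole answer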

  4. Number of Computers in Parallel Computing and Distributed Computing. Parallel computing generally requires one computer with multiple processors. Multiple processors within the same computer system execute instructions simultaneously. All the processors work towards completing the same task, so they have to share resources and data. In distributed computing, several computer systems are involved. Here, multiple autonomous computer systems work on the divided tasks. These computer systems can be located at different geographical locations as well.
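  A rough sketch of how such autonomous systems could cooperate by message passing is shown below. Both ends run on localhost here so the example is self-contained; in a real distributed system the worker would be a separate machine at another location. The host, port, and message format are assumptions made purely for illustration.

    import socket
    import threading
    import time

    HOST, PORT = "127.0.0.1", 5050  # placeholder address standing in for a remote worker

    def worker():
        # The worker system: receives a divided task as a message,
        # computes its part, and replies with the result.
        with socket.socket() as srv:
            srv.bind((HOST, PORT))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn:
                numbers = conn.recv(1024).decode().split(",")
                conn.sendall(str(sum(int(n) for n in numbers)).encode())

    threading.Thread(target=worker, daemon=True).start()
    time.sleep(0.5)  # give the worker a moment to start listening

    # The coordinating system: sends its task over the network, collects the result.
    with socket.socket() as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"1,2,3,4,5")
        print("result from worker:", cli.recv(1024).decode())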

  5. Dependency in Parallel Computing and Distributed Computing. In parallel computing, the tasks to be solved are divided into multiple smaller parts. These smaller tasks are assigned to multiple processors. Here, the outcome of one task might be the input of another, which increases the dependency between the processors. We can also say that parallel computing environments are tightly coupled. Some distributed systems might be loosely coupled, while others might be tightly coupled.
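  As a small illustration of that dependency, the sketch below (using two made-up stages) shows the second sub-task waiting on the first sub-task's output before it can run at all.

    from concurrent.futures import ProcessPoolExecutor

    def stage_one(data):
        return [x * 2 for x in data]   # produces an intermediate result

    def stage_two(intermediate):
        return sum(intermediate)       # consumes stage one's output

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            first = pool.submit(stage_one, range(10))
            # stage_two cannot start until stage_one finishes: the processors
            # are tightly coupled through this data dependency.
            second = pool.submit(stage_two, first.result())
            print(second.result())     # 90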

  6. Scalability of Parallel Computing and Distributed Computing. In parallel computing environments, the number of processors you can add is restricted, because the bus connecting the processors and the memory can handle only a limited number of connections. This limitation makes parallel systems less scalable. Distributed computing environments are more scalable, because the computers are connected over the network and communicate by passing messages.

  7. Resource Sharing in Parallel Computing and Distributed Computing. In systems implementing parallel computing, all the processors share the same memory. They also share the same communication medium and network. The processors communicate with each other with the help of shared memory. Distributed systems, on the other hand, have their own memory and processors.
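  Below is a minimal sketch of communication through shared memory, assuming a single shared counter that four worker processes update; the lock simply keeps the simultaneous updates from overwriting each other.

    from multiprocessing import Process, Value, Lock

    def worker(counter, lock, increments):
        for _ in range(increments):
            with lock:              # coordinate access to the shared memory
                counter.value += 1

    if __name__ == "__main__":
        counter = Value("i", 0)     # one memory location visible to every worker
        lock = Lock()
        procs = [Process(target=worker, args=(counter, lock, 1000)) for _ in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(counter.value)        # 4000: all workers communicated via the same memory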

  8. Synchronization in Parallel Computing and Distributed Computing. In parallel systems, all the processes share the same master clock for synchronization. Since all the processors are hosted on the same physical system, they do not need any synchronization algorithms. In distributed systems, the individual systems do not have access to any central clock. Hence, they need to implement synchronization algorithms.
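  One widely known synchronization algorithm of this kind is Lamport's logical clock. The sketch below is a bare-bones version, assuming only two systems and a single message, to show how each system adjusts its own counter from timestamps carried on messages instead of relying on a shared physical clock.

    class LamportClock:
        """Logical clock for one system in a distributed setup."""

        def __init__(self):
            self.time = 0

        def tick(self):
            # A local event or a send: advance this system's own clock.
            self.time += 1
            return self.time

        def receive(self, message_time):
            # On receiving a message, adopt the larger clock value, then tick.
            self.time = max(self.time, message_time) + 1
            return self.time

    # Two independent systems exchanging one message:
    a, b = LamportClock(), LamportClock()
    stamp = a.tick()          # system A sends a message stamped with time 1
    b.tick()                  # system B does unrelated local work (time 1)
    print(b.receive(stamp))   # B reconciles the clocks: max(1, 1) + 1 = 2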

  9. Uses of Parallel Computing and Distributed Computing. Parallel computing is often used in places requiring higher and faster processing power, for example, supercomputers. Since there are no lags in the passing of messages, these systems have high speed and efficiency. Distributed computing is used when computers are located at different geographical locations. In these scenarios, speed is generally not a crucial matter. They are the preferred choice when scalability is required.
