Wednesday 25 February 2015

Cluster Computing VS Grid Computing

Cluster computing
It is a type of computing in which several nodes are made to work together as a single entity. The nodes in a cluster are normally connected to each other over fast local area networks. There are two main reasons for deploying a cluster instead of a single computer: performance and fault tolerance. Applications, especially real-time applications, demand high computation in terms of response time, memory and throughput. Cluster computing provides this by employing parallel programming, which is the use of many processors simultaneously on one or more problems. The other reason is fault tolerance, which is the ability of a system to keep operating gracefully even in the presence of a fault. Because a cluster is built from replicas of similar components, a fault in one component only reduces the cluster's computing power but not its availability, so users always have some components to work with even when a fault occurs.
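As a rough illustration of the parallel-programming idea above, the sketch below splits one problem (summing squares) across several worker processes using Python's standard multiprocessing module. The pool of processes simply stands in for the nodes of a cluster; the worker count and chunking are illustrative, not part of any particular cluster system.

# Minimal sketch: one problem split across several workers, the way a
# cluster splits work across its nodes. Uses only the standard library.
from multiprocessing import Pool

def sum_of_squares(chunk):
    """Work done independently by each worker on its slice of the data."""
    return sum(n * n for n in chunk)

if __name__ == "__main__":
    numbers = list(range(1_000_000))
    workers = 4  # illustrative; a real cluster would have many nodes

    # Split the problem into one chunk per worker.
    size = len(numbers) // workers
    chunks = [numbers[i * size:(i + 1) * size] for i in range(workers)]

    # Each worker computes its partial result simultaneously;
    # the partial results are then combined into the final answer.
    with Pool(workers) as pool:
        partials = pool.map(sum_of_squares, chunks)

    print(sum(partials))

If one worker process fails, only its share of the work is lost, which mirrors the point above that a faulty component reduces capacity rather than availability.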

Grid computing
It is the aggregation of resources from multiple sites to solve a problem that cannot be solved by the processing power of a single computer. It employs multiple clusters that are loosely coupled, heterogeneous and geographically dispersed. Here an individual user gets access to resources (such as processors, storage and data) on demand, with little or no knowledge of where those resources are physically located. For example, we use electricity to run air conditioners, televisions and so on through wall sockets without being concerned about where that electricity comes from or how it is generated. Grid computing is popularly described as a collection of servers that are bound together to attack a single problem. It is concerned with sharing, collecting, hosting and providing services to various consumers.
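To make the on-demand, location-transparent idea concrete, here is a small hypothetical sketch; the Site and Broker classes are illustrative stand-ins, not the API of any real grid middleware. A user submits a job to a broker, the broker picks whichever site currently has spare capacity, and the user only sees the result, never where the job actually ran.

# Hypothetical sketch of location-transparent resource access in a grid.
import random

class Site:
    """One participating site with some free processors."""
    def __init__(self, name, free_cpus):
        self.name = name
        self.free_cpus = free_cpus

    def run(self, job):
        self.free_cpus -= 1
        return f"result of {job}"

class Broker:
    """Hands each job to some site with spare capacity;
    the user never sees which one was chosen."""
    def __init__(self, sites):
        self.sites = sites

    def submit(self, job):
        candidates = [s for s in self.sites if s.free_cpus > 0]
        site = random.choice(candidates)  # user has no view of this choice
        return site.run(job)

if __name__ == "__main__":
    grid = Broker([Site("site-A", 8), Site("site-B", 2), Site("site-C", 16)])
    print(grid.submit("analyse-dataset"))  # user only sees the result

This mirrors the wall-socket analogy above: the broker plays the role of the power grid, and the sites play the role of the generating stations the consumer never thinks about.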
