What is the difference between cluster computing and grid computing?
Cluster computing and grid computing both refer to systems that use multiple computers to perform a task. The primary difference is that grid computing relies on the application being broken into discrete modules, each of which can run on a separate server. Cluster computing typically runs the entire application on each server, with redundancy between the servers.

A standard cluster is designed to provide a redundant environment so that the application keeps running if a hardware or software failure occurs on one node. This design typically requires each node in the cluster to mirror the others in both hardware and operating system.
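To make the contrast concrete, here is a minimal sketch in Python, using local processes as stand-ins for servers. The `count_words` job, the `grid_style` and `cluster_style` functions, and the node names `node-a`/`node-b` are all hypothetical illustrations, not part of any real grid or cluster product: the grid-style path splits the work into independent modules handed to separate workers, while the cluster-style path runs the whole job on whichever redundant node is currently healthy.

```python
# Toy illustration only: processes stand in for servers.
from multiprocessing import Pool


def count_words(chunk: str) -> int:
    """One discrete module of work (grid model): process a single chunk."""
    return len(chunk.split())


def grid_style(document: str, workers: int = 4) -> int:
    """Break the application into modules and farm them out to workers,
    the way a grid would spread modules across separate servers."""
    lines = document.splitlines()
    with Pool(workers) as pool:  # each worker process stands in for a grid node
        return sum(pool.map(count_words, lines))


def cluster_style(document: str, nodes: list[str]) -> int:
    """Run the entire application on one node; fail over to a mirror if
    the preferred node is down (cluster model)."""
    # Hypothetical health table: pretend node-a has failed.
    healthy = {"node-a": False, "node-b": True}
    for node in nodes:
        if healthy.get(node):
            # The whole job runs on this single redundant node.
            return len(document.split())
    raise RuntimeError("no healthy node available")


if __name__ == "__main__":
    text = "grid computing splits work\ncluster computing mirrors it\nacross redundant nodes"
    print("grid-style total:", grid_style(text))
    print("cluster-style total:", cluster_style(text, ["node-a", "node-b"]))
```

Both calls produce the same answer; the difference is in how the work is distributed. The grid version scales by adding more workers for more modules, while the cluster version gains fault tolerance by keeping identical nodes ready to take over the entire job.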