Big Data Hadoop

Category: Miscellaneous
Summary: 

HADOOP DESCRIPTION:

Hadoop is a framework for running applications on large clusters built of commodity hardware. The Hadoop framework transparently provides applications with both reliability and data motion. Hadoop implements a computational paradigm named MapReduce, in which the application is divided into many small fragments of work, each of which may be executed or re-executed on any node in the cluster.
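
As an illustration of this model, the sketch below shows the classic word-count job written against the standard org.apache.hadoop.mapreduce Java API. It is a minimal, assumed example rather than course material; the class name and the command-line input/output paths are placeholders.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: each mapper processes one input split (one "fragment of work")
  // and emits (word, 1) pairs.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: the framework groups values by key, so each reducer
  // receives one word together with all of its counts and sums them.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // Input and output HDFS paths are supplied on the command line.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

If any node running a map or reduce task fails, the framework simply reschedules that task on another node, which is what makes the "executed or re-executed on any node" guarantee possible.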

In addition, it provides a distributed file system (HDFS) that stores data on the compute cluster. Both MapReduce and the distributed file system are designed so that node failures are automatically handled by the framework. Hadoop is an open-source implementation of Google's MapReduce, built on a simple programming model (MapReduce) and a simple data model: any data will fit.
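
HDFS itself can be reached from Java through the org.apache.hadoop.fs.FileSystem API. The short sketch below writes a small file into HDFS and reads it back; the NameNode URI and file paths are placeholder assumptions, not values taken from this course description.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URI;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Placeholder NameNode address; in practice this comes from core-site.xml.
    FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);

    Path file = new Path("/user/demo/hello.txt");

    // Write: HDFS splits the file into blocks and replicates them across
    // DataNodes, so a single node failure does not lose the data.
    try (FSDataOutputStream out = fs.create(file, true)) {
      out.write("hello hadoop\n".getBytes(StandardCharsets.UTF_8));
    }

    // Read the same file back through the same API.
    try (BufferedReader in = new BufferedReader(
        new InputStreamReader(fs.open(file), StandardCharsets.UTF_8))) {
      System.out.println(in.readLine());
    }

    fs.close();
  }
}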

Hadoop is one of the tools designed to handle big data. Hadoop and other software products work to interpret or parse the results of big data searches through specific proprietary algorithms and methods. Hadoop is an open-source program under the Apache license that is maintained by a global community of users. Its main components include the MapReduce set of functions and the Hadoop Distributed File System (HDFS).

Start Date:  Saturday, 14 November 2015
Enrolment start date:  Monday, 19 October 2015
Cost: 850 (USD)
Topics: 10