Hadoop, the core MapReduce framework for big data, is built from several interlocking technologies. This first installment of our tutorial explains what Hadoop does and how the pieces fit together.
Provided you know what to do, MapReduce jobs on small datasets can be run easily, without much coding or fiddling. Here's how.
The Hadoop ecosystem relies on composability, the ability to use the output of one tool as the input to the next, to process data efficiently at scale: from simple projects, to processing streams of real-time data, to building data warehouses.
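The map-reduce pattern behind this composability can be sketched in a few lines of plain Python. This is an illustrative toy, not Hadoop's actual API: it shows how a map phase emits key-value pairs, a shuffle step groups them by key, and a reduce phase aggregates each group, with each stage's output feeding the next. The function names (`map_phase`, `shuffle`, `reduce_phase`) are made up for this example.

```python
from collections import defaultdict

def map_phase(lines):
    # Like a Hadoop mapper: emit a (word, 1) pair for each word seen.
    for line in lines:
        for word in line.split():
            yield (word, 1)

def shuffle(pairs):
    # Like Hadoop's shuffle/sort step: group all values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups.items()

def reduce_phase(grouped):
    # Like a Hadoop reducer: aggregate the values for each key.
    return {word: sum(counts) for word, counts in grouped}

# The stages compose: each one consumes the previous stage's output.
counts = reduce_phase(shuffle(map_phase(["to be or not to be"])))
print(counts)  # {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

In real Hadoop the same three stages run in parallel across a cluster, with the framework handling the shuffle and the movement of data between machines; the programmer supplies only the map and reduce logic.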