Hadoop is an open-source framework for the distributed storage and processing of large datasets across clusters of commodity hardware. It implements the MapReduce programming model: developers write map and reduce functions, and the framework automatically parallelizes them across the cluster while handling scheduling, data locality, and fault tolerance. Hadoop also includes HDFS (the Hadoop Distributed File System), which splits files into large blocks, replicates them across nodes, and provides high aggregate read and write bandwidth. The design follows Google's published MapReduce and GFS papers, and major adopters such as Yahoo, Facebook, and IBM have used Hadoop to process large volumes of user and application data.
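To make the MapReduce model concrete, here is a minimal sketch of the canonical word-count job against Hadoop's Java MapReduce API, closely following the example shipped with the Hadoop documentation. The class name `WordCount` and the two command-line arguments (input and output HDFS paths) are illustrative choices, not fixed by the framework.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Mapper: for each input line, emit the pair (word, 1) for every token.
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reducer: the framework groups intermediate pairs by key, so each
    // call receives one word plus all of its counts; sum them.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each mapper
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

When this job is submitted (e.g. with `hadoop jar wordcount.jar WordCount /input /output`), the framework launches one map task per HDFS input split, shuffles the intermediate (word, count) pairs so that all pairs for a given word reach the same reducer, and writes one output file per reducer, which is the automatic parallelization described above.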