Hadoop Distributed File System (HDFS)
HDFS is the open-source distributed storage system that underpins Hadoop. It is designed to store very large volumes of data across a cluster of commodity machines while providing high-throughput access to that data.
In spirit it resembles RAID: fault tolerance comes from redundancy. HDFS, however, achieves this in software by replicating blocks across machines rather than relying on dedicated storage hardware.
It is widely used by organizations that need to store and process big data, and it is a core component of most Hadoop deployments, providing both the storage for that data and the foundation for analytics on top of it.
HDFS complements MapReduce by serving as the storage layer where your data resides before and after MapReduce processes it.
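To make the storage-layer role concrete, here is a minimal sketch of writing and reading a file through Hadoop's `FileSystem` Java API. The NameNode address `hdfs://localhost:9000` and the path `/user/demo/hello.txt` are placeholder assumptions for illustration, not values from this note.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExample {
    public static void main(String[] args) throws Exception {
        // Point the client at the NameNode; this address is a placeholder.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:9000");

        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("/user/demo/hello.txt");

        // Write a small file; HDFS splits it into blocks and replicates
        // them across DataNodes according to dfs.replication.
        try (FSDataOutputStream out = fs.create(path, true)) {
            out.write("hello from HDFS\n".getBytes(StandardCharsets.UTF_8));
        }

        // Read it back; a MapReduce job would read its input the same way,
        // just in parallel across block locations.
        try (FSDataInputStream in = fs.open(path);
             BufferedReader reader = new BufferedReader(
                     new InputStreamReader(in, StandardCharsets.UTF_8))) {
            System.out.println(reader.readLine());
        }

        fs.close();
    }
}
```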
# History
HDFS began as the open-source implementation of the Google File System (GFS). The successors of MapReduce include Apache Spark, Apache Flink, and Apache Tez.
Origin: MapReduce
References:
Created 2024-01-07