Hadoop FS Shell df

Introduction

In this lab, we will immerse ourselves in a sci-fi space city where a diplomatic envoy from an alien planet has arrived. The city's data infrastructure is powered by Hadoop HDFS, and we will utilize the Hadoop FS Shell df command to explore the file system and manage storage resources efficiently. Our goal is to assist the alien envoy in understanding how data is stored and managed in this futuristic city.

Exploring the File System with df

In this step, we will explore the Hadoop HDFS file system using the df command, which reports free and used disk space for the filesystem as a whole (an optional human-readable variant is sketched after these steps).

Open the terminal and follow the steps below to get started.

  1. Switch to the hadoop user:

    su - hadoop
  2. Run the Hadoop FS Shell df command to display disk space usage and save the output to a file:

    hdfs dfs -df > /home/hadoop/hdfs-df-report.txt
  3. Review the output to understand the total capacity, used space, remaining space, and percentage used for the HDFS filesystem (df reports filesystem-level totals, not per-directory usage):

    cat /home/hadoop/hdfs-df-report.txt
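
The df command also accepts an -h flag, which prints sizes in human-readable units rather than raw byte counts. The sketch below is optional; it assumes the same report file used above and simply appends a human-readable view to it.

    # Show disk usage for the default filesystem in human-readable units
    hdfs dfs -df -h /

    # Optionally append the human-readable view to the report created earlier
    hdfs dfs -df -h / >> /home/hadoop/hdfs-df-report.txt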

Managing Disk Space Allocation

In this step, we will create a new directory in the HDFS and set a space quota on it to control how much disk space it can consume (an optional quota-management sketch follows these steps).

  1. Create a new directory in the HDFS:

    hdfs dfs -mkdir /user/alien_envoy
  2. Set a space quota of 10 GB for the newly created directory (HDFS space quotas count raw disk usage, including every replica):

    hdfs dfsadmin -setSpaceQuota 10g /user/alien_envoy
  3. Verify the quota has been set correctly:

    hdfs dfs -count -q -h -v /user/alien_envoy
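
If you later need to adjust or remove the limit, dfsadmin also provides a command for clearing space quotas. A minimal sketch, assuming the same /user/alien_envoy directory:

    # Remove the space quota from the directory
    hdfs dfsadmin -clrSpaceQuota /user/alien_envoy

    # Re-apply the 10 GB space quota used in this lab
    hdfs dfsadmin -setSpaceQuota 10g /user/alien_envoy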

Analyzing Disk Usage

In this step, we will use the du command to analyze disk usage within a specific directory in the HDFS (a short summary-view sketch follows these steps).

  1. Create a new file and upload it to the HDFS:

    echo "Hello, Alien Envoy" > /home/hadoop/testfile.txt && hdfs dfs -put /home/hadoop/testfile.txt /user/alien_envoy/
  2. Analyze disk usage for a specific directory and save the output to a file:

    hdfs dfs -du -v -h /user/alien_envoy > /home/hadoop/hdfs-du-report.txt
  3. Review the output to see how much space each file and subdirectory within /user/alien_envoy consumes:

    cat /home/hadoop/hdfs-du-report.txt
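
If you only need a single aggregated total rather than one line per file, du also supports an -s (summary) flag. A minimal sketch using the same directory:

    # Aggregate disk usage for the whole directory, in human-readable units
    hdfs dfs -du -s -h /user/alien_envoy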

Summary

In this lab, we created a futuristic scenario in a space city to explore Hadoop HDFS using the df command within the FS Shell. By guiding the alien envoy through disk space management tasks, we gained practical experience in utilizing Hadoop commands for file system operations. This hands-on lab aimed to provide a foundational understanding of managing storage resources in a distributed environment and leveraging Hadoop's capabilities for efficient data storage.
