Big Data Forensics: Learning Hadoop Investigations

By: Joe Sremack
Overview of this book

Big Data forensics is an important type of digital investigation that involves the identification, collection, and analysis of data from large-scale Big Data systems. Hadoop is one of the most popular Big Data solutions, and forensically investigating a Hadoop cluster requires specialized tools and techniques. With the explosion of Big Data, forensic investigators need to be prepared to analyze the petabytes of data stored in Hadoop clusters. Understanding Hadoop's operational structure and performing forensic analysis with court-accepted tools and best practices will help you conduct a successful investigation. This book shows you how to perform a complete forensic investigation of large-scale Hadoop clusters using the same tools and techniques employed by forensic experts. It begins by taking you through the process of forensic investigation and the pitfalls to avoid, then walks you through Hadoop's internals and architecture, showing what types of information Hadoop stores and how to access that data. You will learn to identify Big Data evidence using techniques such as surveying a live system and interviewing witnesses. After setting up your own Hadoop system, you will collect evidence using techniques such as forensic imaging and application-based extractions. You will then analyze Hadoop evidence using advanced tools and techniques to uncover events and statistical information. Finally, data visualization and evidence presentation techniques are covered to help you properly communicate your findings to any audience.

Managing files in Hadoop

Hadoop has its own file management concepts and provides many different mechanisms for data storage and retrieval. Hadoop is designed to manage large volumes of data distributed across many nodes built from commodity hardware. To do so, it divides, compresses, and distributes data using techniques that tolerate node failures and allow numerous processes to access the same data simultaneously. Many of Hadoop's filesystem concepts, such as directory structures, are exactly the same as in other systems; others, such as MapFiles and Hadoop Archive Files, are unique to Hadoop. This section covers the file management concepts that are unique to Hadoop.
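Hadoop exposes these file management concepts programmatically through its Java FileSystem API. What follows is a minimal sketch that lists the entries under a directory together with the metadata HDFS tracks for each one; the NameNode URI (hdfs://namenode:8020) and the directory /user/demo are hypothetical placeholders, not values from this book.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListHdfsFiles {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical NameNode URI; replace with your cluster's address.
        conf.set("fs.defaultFS", "hdfs://namenode:8020");
        FileSystem fs = FileSystem.get(conf);

        // Walk the entries under a hypothetical directory and print the
        // per-file metadata HDFS tracks: permissions, owner, group,
        // replication factor, block size, and path.
        for (FileStatus status : fs.listStatus(new Path("/user/demo"))) {
            System.out.printf("%s %s:%s rep=%d blockSize=%d %s%n",
                    status.getPermission(),
                    status.getOwner(),
                    status.getGroup(),
                    status.getReplication(),
                    status.getBlockSize(),
                    status.getPath());
        }
        fs.close();
    }
}

Because listStatus is a NameNode metadata operation, it does not read file contents, which makes it a low-impact way to survey a live system.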

File permissions

HDFS uses a standard POSIX-style permission model. The three types of permissions for files and directories are listed below, followed by a short example of inspecting and setting them:

  • Read (r): Read a file and list a directory's contents
  • Write (w): Write to a file, and create or delete files within a directory
  • Execute (x): Not applicable to files in HDFS; required to access the children of a directory
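The same FileSystem API can be used to inspect and change these permission bits. The following sketch, with a hypothetical file path and NameNode URI, reads a file's current permissions and then restricts the file to owner read/write only:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsAction;
import org.apache.hadoop.fs.permission.FsPermission;

public class HdfsPermissions {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical NameNode URI; replace with your cluster's address.
        conf.set("fs.defaultFS", "hdfs://namenode:8020");
        FileSystem fs = FileSystem.get(conf);

        // Hypothetical file path used for illustration only.
        Path file = new Path("/user/demo/evidence.txt");

        // Read the current permission bits, for example rw-r--r--.
        FsPermission current = fs.getFileStatus(file).getPermission();
        System.out.println("Current permissions: " + current);

        // Restrict the file to owner read/write only (rw-------),
        // the equivalent of: hdfs dfs -chmod 600 /user/demo/evidence.txt
        fs.setPermission(file, new FsPermission(
                FsAction.READ_WRITE, FsAction.NONE, FsAction.NONE));

        fs.close();
    }
}

From the command line, the equivalent operations are hdfs dfs -ls to view permissions and hdfs dfs -chmod to change them.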