Big Data Forensics: Learning Hadoop Investigations

By: Joe Sremack

Overview of this book

Big Data forensics is an important type of digital investigation that involves the identification, collection, and analysis of large-scale Big Data systems. Hadoop is one of the most popular Big Data solutions, and forensically investigating a Hadoop cluster requires specialized tools and techniques. With the explosion of Big Data, forensic investigators need to be prepared to analyze the petabytes of data stored in Hadoop clusters. Understanding Hadoop's operational structure and performing forensic analysis with court-accepted tools and best practices will help you conduct a successful investigation.

Discover how to perform a complete forensic investigation of large-scale Hadoop clusters using the same tools and techniques employed by forensic experts. This book begins by taking you through the process of forensic investigation and the pitfalls to avoid. It will walk you through Hadoop's internals and architecture, and you will discover what types of information Hadoop stores and how to access that data.

You will learn to identify Big Data evidence using techniques to survey a live system and interview witnesses. After setting up your own Hadoop system, you will collect evidence using techniques such as forensic imaging and application-based extractions. You will analyze Hadoop evidence using advanced tools and techniques to uncover events and statistical information. Finally, data visualization and evidence presentation techniques are covered to help you properly communicate your findings to any audience.
Table of Contents (10 chapters)

Hadoop data analysis tools

Hadoop was designed to store and analyze large volumes of data. The ecosystem of tools for Hadoop analysis is large and complex. Depending on the type of analysis, many different tools can be used. The Apache Foundation set of tools has a number of standard options such as Hive, HBase, and Pig, but other open source and commercial solutions have been developed to meet different analysis requirements using Hadoop's HDFS and MapReduce features. For example, Cloudera's Impala database runs on Hadoop, but it is not part of the Apache Foundation suite of applications.

Understanding which data analysis tools are used in a Hadoop cluster is important for identifying and properly collecting data. Some data analysis tools store data in formatted files and may offer easier methods for data collection. Other tools read data directly from files stored in HDFS; in that case, the scripts used by the tool can provide useful context when the data is later analyzed. This...
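For instance, when surveying files collected from HDFS, an investigator can often recognize which analysis tool produced a file from its leading magic bytes. The sketch below is not from this book; the helper name is illustrative, and the signatures come from the public specifications of a few common Hadoop storage formats:

```python
# Sketch: guess a Hadoop data file's format from its magic bytes.
# Signatures are taken from the published file-format specs:
#   Parquet files begin (and end) with "PAR1"; ORC files begin with "ORC";
#   Avro object container files begin with "Obj" + 0x01;
#   Hadoop SequenceFiles begin with "SEQ" + a version byte.
MAGIC_SIGNATURES = {
    b"PAR1": "Parquet",
    b"ORC": "ORC",
    b"Obj\x01": "Avro object container",
    b"SEQ": "Hadoop SequenceFile",
}

def identify_format(header: bytes) -> str:
    """Return a best-guess format name for the first bytes of a file."""
    for magic, name in MAGIC_SIGNATURES.items():
        if header.startswith(magic):
            return name
    return "unknown"
```

In practice an investigator would read the first few bytes of each collected file (for example, `identify_format(open(path, "rb").read(8))`) and use the result to decide which tool-specific extraction method applies.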
