Data Engineering with Google Cloud Platform

By: Adi Wijaya
Overview of this book

With this book, you'll understand how the highly scalable Google Cloud Platform (GCP) enables data engineers to create end-to-end data pipelines, from storing and processing data and orchestrating workflows to presenting data through visualization dashboards. Starting with a quick overview of the fundamental concepts of data engineering, you'll learn the various responsibilities of a data engineer and how GCP plays a vital role in fulfilling those responsibilities. As you progress through the chapters, you'll leverage GCP products to build a sample data warehouse using Cloud Storage and BigQuery and a data lake using Dataproc. The book gradually takes you through operations such as data ingestion, data cleansing, transformation, and integrating data with other sources. You'll learn how to design IAM for data governance, deploy ML pipelines with Vertex AI, leverage pre-built GCP models as a service, and visualize data with Google Data Studio to build compelling reports. Finally, you'll find tips on how to boost your career as a data engineer, take the Professional Data Engineer certification exam, and get ready to become an expert in data engineering with GCP. By the end of this data engineering book, you'll have developed the skills to perform core data engineering tasks and build efficient ETL data pipelines with GCP.
Table of Contents (17 chapters)

  • Section 1: Getting Started with Data Engineering with GCP
  • Section 2: Building Solutions with GCP Components
  • Section 3: Key Strategies for Architecting Top-Notch Data Pipelines

Understanding the working of Airflow

Airflow handles all three of the preceding elements using Python scripts. As data engineers, what we need to do is write Python code to handle task dependencies, schedule our jobs, and integrate with other systems. This is different from traditional extract, transform, load (ETL) tools. If you have ever heard of or used tools such as Control-M, Informatica, or Talend, Airflow has the same positioning as those tools. The difference is that Airflow is not a user interface (UI)-based drag-and-drop tool; Airflow is designed for you to write the workflow as code.

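To make this concrete, the following is a minimal sketch of a workflow written as code, assuming Airflow 2.x; the DAG ID example_etl and the extract_data and load_to_bigquery callables are illustrative placeholders, not code from this book:

# A minimal Airflow 2.x DAG sketch: two placeholder tasks with a dependency.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_data():
    # Placeholder: pull data from a source system.
    print("extracting data")


def load_to_bigquery():
    # Placeholder: load the extracted data into BigQuery.
    print("loading data")


with DAG(
    dag_id="example_etl",            # hypothetical DAG name for illustration
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",      # the schedule is declared in code
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_data)
    load = PythonOperator(task_id="load", python_callable=load_to_bigquery)

    # Task dependencies are also expressed in code.
    extract >> load

Everything about the workflow, including its schedule, its tasks, and the order in which they run, lives in this one Python file, which is exactly what makes it easy to review, version, and deploy like any other code.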
There are a couple of good reasons why managing the workflow as code is a better idea than using drag-and-drop tools:

  • Using code, you can automate much of the development and deployment process.
  • Using code, it's easier to adopt good testing practices (a test sketch follows this list).
  • All the configurations can be managed in...
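As a rough illustration of the testing point above, here is a sketch of a DAG integrity test using pytest and Airflow's DagBag; the dags/ folder path and the example_etl DAG ID are assumptions carried over from the earlier sketch, not part of the book's code:

# A minimal DAG integrity test sketch for pytest.
from airflow.models import DagBag


def test_dags_load_without_errors():
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    # Import errors (syntax errors, missing imports) fail the test.
    assert dag_bag.import_errors == {}


def test_example_etl_has_expected_tasks():
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    dag = dag_bag.get_dag("example_etl")
    assert dag is not None
    assert {"extract", "load"} <= set(dag.task_ids)

Because the workflow is plain Python, checks like these can run in a continuous integration pipeline before anything is deployed, which is much harder to achieve with drag-and-drop ETL tools.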