Data Engineering with Databricks Cookbook

By: Pulkit Chadha
Overview of this book

Written by a Senior Solutions Architect at Databricks, Data Engineering with Databricks Cookbook shows you how to use Apache Spark, Delta Lake, and Databricks effectively for data engineering, starting with a comprehensive introduction to data ingestion and loading with Apache Spark. What makes this book unique is its recipe-based approach, which will help you put your knowledge to use straight away and tackle common problems. You’ll be introduced to various data manipulation and transformation solutions, find out how to manage and optimize Delta tables, and get to grips with ingesting and processing streaming data. The book will also show you how to diagnose and fix performance problems in Apache Spark applications and Delta Lake. Advanced recipes later in the book will teach you how to use Databricks to implement DataOps and DevOps practices, as well as how to orchestrate and schedule data pipelines using Databricks Workflows. You’ll also go through the full process of setting up and configuring Unity Catalog for data governance. By the end of this book, you’ll be well-versed in building reliable and scalable data pipelines using modern data engineering technologies.
Table of Contents (16 chapters)

Part 1 – Working with Apache Spark and Delta Lake
Part 2 – Data Engineering Capabilities within Databricks

Running and managing Databricks Workflows

With Databricks Workflows, you can see the current and past runs of any job you can access, even if those runs were started by other tools. Databricks retains job runs for up to 60 days; if you need to keep them longer, Databricks recommends exporting the runs before they expire.
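
If you want to preserve run metadata and output beyond the 60-day window, you can export runs programmatically. The following is a minimal sketch using the Databricks SDK for Python (the databricks-sdk package, an assumption of this example rather than something the recipe requires); it assumes credentials are already configured, and the job ID is a hypothetical placeholder:

from databricks.sdk import WorkspaceClient

# Authenticates via environment variables or a Databricks config profile
w = WorkspaceClient()

JOB_ID = 123456789  # hypothetical job ID; replace with your own

# Page through this job's runs and archive each one before it expires
for run in w.jobs.list_runs(job_id=JOB_ID):
    print(run.run_id, run.state.life_cycle_state, run.start_time)

    # export_run returns the rendered notebook views of a run
    # (it applies to runs of notebook tasks)
    export = w.jobs.export_run(run_id=run.run_id)
    for view in export.views or []:
        with open(f"run_{run.run_id}_{view.name}.html", "w") as f:
            f.write(view.content)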

In this recipe, you will learn how to use the Databricks UI to see the jobs you can access, the past runs of a job, and the run details.
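
The same information is also reachable outside the UI. As a point of reference before the UI steps, here is a minimal sketch that lists the jobs you can access, again using the Databricks SDK for Python (an assumption of this example; the recipe itself uses the UI):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# List the jobs visible to the authenticated user
for job in w.jobs.list():
    print(job.job_id, job.settings.name)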

How to do it...

  1. Go to the workflow: Click on Workflows in the sidebar, then click on a job name in the Name column:
Figure 8.9 – Databricks Workflows list view

You will see the Runs tab with two views: matrix and list, as shown:

Figure 8.10 – Job matrix and run list view

  2. Matrix view: The matrix view shows you the previous runs of the job and its tasks (the sketch after these steps shows how to read the same run details programmatically).
    • The Run total duration row shows you how long each run took and what state it ended in:
...
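
As referenced above, the run details shown in the matrix and list views can also be fetched programmatically. The following is a minimal sketch using the Databricks SDK for Python; the run ID is a hypothetical placeholder you would copy from the run list:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

RUN_ID = 987654321  # hypothetical run ID; copy one from the run list

# Fetch one run's details, including per-task states and durations
run = w.jobs.get_run(run_id=RUN_ID)
print(f"Run {run.run_id}: {run.state.life_cycle_state}")
for task in run.tasks or []:
    duration_s = (task.end_time - task.start_time) / 1000 if task.end_time else None
    print(f"  task={task.task_key} state={task.state.result_state} duration={duration_s}s")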
