Data Engineering with Databricks Cookbook
Databricks workflows are a way to automate and orchestrate your data processing tasks on the Databricks platform. You can use workflows to build data pipelines, ETL processes, machine learning training jobs, and more.
A task is a unit of work that runs on a schedule or on demand. A Databricks workflow is a sequence of tasks linked by dependencies, and it can be defined using the Databricks UI, the Databricks CLI, or the Databricks REST API. A workflow can also accept parameters at runtime, and each execution of the workflow is recorded as a run.
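To make these concepts concrete, here is a minimal sketch of what a multi-task workflow looks like as a job specification for the Databricks Jobs REST API (`POST /api/2.1/jobs/create`). The job name, notebook paths, cluster settings, and parameter names below are hypothetical placeholders; the structure (tasks, `depends_on`, job clusters) follows the Jobs API 2.1 payload shape.

```python
import json

# Hypothetical multi-task job spec: a "transform" task that depends on
# an "ingest" task, both sharing one job cluster. All names and paths
# are placeholders for illustration.
job_spec = {
    "name": "ingest-and-transform",  # placeholder job name
    "job_clusters": [
        {
            "job_cluster_key": "shared_cluster",
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",  # example runtime
                "node_type_id": "i3.xlarge",          # example node type
                "num_workers": 2,
            },
        }
    ],
    "tasks": [
        {
            "task_key": "ingest",
            "job_cluster_key": "shared_cluster",
            "notebook_task": {
                "notebook_path": "/Workspace/recipes/ingest",  # placeholder
            },
        },
        {
            "task_key": "transform",
            # depends_on links tasks into a dependency graph:
            # "transform" runs only after "ingest" succeeds.
            "depends_on": [{"task_key": "ingest"}],
            "job_cluster_key": "shared_cluster",
            "notebook_task": {
                "notebook_path": "/Workspace/recipes/transform",  # placeholder
            },
        },
    ],
}

# To create the job, you would POST this payload to your workspace with a
# personal access token, e.g. (not executed here):
#   requests.post(f"{host}/api/2.1/jobs/create",
#                 headers={"Authorization": f"Bearer {token}"},
#                 json=job_spec)
print(json.dumps(job_spec["tasks"][1]["depends_on"]))
```

The same dependency graph can be built visually in the Workflows UI, which is the approach this recipe walks through; the JSON view above is what the UI produces behind the scenes.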
In this recipe, you will learn how to use the Databricks Workflows UI to create a multi-task workflow in Databricks.