Learning Apache Apex

By: Gundabattula, Thomas Weise, Munagala V. Ramanath, David Yan, Kenneth Knowles

Overview of this book

Apache Apex is a next-generation stream processing framework designed to operate on data at large scale, with minimum latency, maximum reliability, and strict correctness guarantees. Half of the book consists of Apex applications that show you key aspects of data processing pipelines, such as connectors for sources and sinks and common data transformations. The other half is split evenly between explaining the Apex framework and tuning, testing, and scaling Apex applications. Much of our economic world depends on growing streams of data, such as social media feeds, financial records, and data from mobile devices, sensors, and machines (the Internet of Things, IoT). The projects in the book show how to process such streams to gain valuable, timely, and actionable insights. Traditional use cases, such as ETL, which currently consume a significant chunk of data engineering resources, are also covered. The final chapter shows you future possibilities emerging in the streaming space and how Apache Apex can contribute to them.

Transformations

So far, we have looked at the operators that connect Apex pipelines to the outside world: reading data from messaging systems, files, and other sources, and writing results to various destinations. We have seen that the Apex library offers comprehensive support for integrating with external systems through feature-rich connectors.

Now it is time to look at the support available for the actual functionality of the pipeline. These building blocks are transformations: their purpose is to modify or accumulate the tuples that flow through the processing pipeline. Typical transformations include parsing, filtering, aggregation by key, and joins:

The preceding diagram categorizes transformations into those that are applied to individual tuples and those that aggregate tuples based on keys and windows. Often, per-tuple transforms are stateless and windowed transforms...
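To make the per-tuple category concrete, here is a minimal sketch of a stateless transformation written against Apex's operator API (BaseOperator, DefaultInputPort, and DefaultOutputPort). The class and port names are illustrative rather than taken from the book; the operator parses each incoming line as an integer and filters out malformed or negative values:

import com.datatorrent.api.DefaultInputPort;
import com.datatorrent.api.DefaultOutputPort;
import com.datatorrent.common.util.BaseOperator;

// Illustrative per-tuple transformation: parse a text line and filter the result.
public class ParseAndFilterOperator extends BaseOperator
{
  public final transient DefaultOutputPort<Integer> output = new DefaultOutputPort<>();

  public final transient DefaultInputPort<String> input = new DefaultInputPort<String>()
  {
    @Override
    public void process(String line)
    {
      try {
        int value = Integer.parseInt(line.trim());   // parse: String -> Integer
        if (value >= 0) {                            // filter: drop negative values
          output.emit(value);                        // emit the transformed tuple
        }
      } catch (NumberFormatException e) {
        // malformed input is simply dropped; a real operator might route it to an error port
      }
    }
  };
}

Because no state is kept between tuples, an operator like this is straightforward to partition, and there is nothing beyond its configuration that needs to be checkpointed.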

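For the windowed, key-based category, the sketch below (again with made-up class names) accumulates a per-key count in operator state and emits the aggregate from endWindow(), which Apex invokes at every streaming window boundary:

import java.util.HashMap;
import java.util.Map;

import com.datatorrent.api.DefaultInputPort;
import com.datatorrent.api.DefaultOutputPort;
import com.datatorrent.common.util.BaseOperator;

// Illustrative windowed transformation: count tuples per key within each streaming window.
public class KeyedCountOperator extends BaseOperator
{
  private Map<String, Long> counts = new HashMap<>();   // non-transient state is checkpointed

  public final transient DefaultOutputPort<Map<String, Long>> output = new DefaultOutputPort<>();

  public final transient DefaultInputPort<String> input = new DefaultInputPort<String>()
  {
    @Override
    public void process(String key)
    {
      counts.merge(key, 1L, Long::sum);   // accumulate: increment the count for this key
    }
  };

  @Override
  public void endWindow()
  {
    output.emit(new HashMap<>(counts));   // emit the aggregate at the window boundary
    counts.clear();                       // reset the accumulation for the next window
  }
}

Unlike the per-tuple example, this operator is stateful: the counts map lives across tuples within a window, and because it is a non-transient field, Apex checkpoints it as part of its fault-tolerance mechanism.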
