Codeless Time Series Analysis with KNIME

By: KNIME AG, Corey Weisinger, Maarit Widmann, Daniele Tonini
4.8 (10)
Overview of this book

This book takes you on a practical journey, teaching you how to implement solutions for many use cases involving time series analysis techniques. The learning journey is organized in a crescendo of difficulty: it starts with simple yet effective techniques applied to weather forecasting, then introduces ARIMA and its variations, moves on to machine learning for audio signal classification, trains deep learning architectures to predict glucose levels and electrical energy demand, and ends with an approach to anomaly detection in IoT. No time series analysis book is complete without a solution for stock price prediction, and you'll find this use case at the end of the book, together with a few more demand prediction use cases that rely on the integration of KNIME Analytics Platform with other external tools. By the end of this book, you'll have learned about popular time series analysis techniques and algorithms, KNIME Analytics Platform and its time series extension, and how to apply both to common use cases.
Table of Contents (20 chapters)

Part 1: Time Series Basics and KNIME Analytics Platform
Part 2: Building and Deploying a Forecasting Model
Part 3: Forecasting on Mixed Platforms

Preparing data for modeling

Now that we've converted our single column of high-frequency time series data into multiple columns of frequency amplitudes, by windowing the data and applying the FFT, we're in more familiar territory. Our data now has a shape whose columns we can use as inputs for modeling. However, there are far too many columns.
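In KNIME this transformation is done with workflow nodes, but the windowing-plus-FFT step can be sketched in a few lines of Python to make the resulting table shape concrete. The window size, step, and signal below are illustrative assumptions, not values from the book:

```python
import numpy as np

def windowed_fft_features(signal, window_size=256, step=128):
    """Slice a 1-D signal into overlapping windows and return, for each
    window, the FFT amplitude spectrum as one row of features."""
    rows = []
    for start in range(0, len(signal) - window_size + 1, step):
        window = signal[start:start + window_size]
        # rfft keeps only the non-negative frequencies of a real signal
        amplitudes = np.abs(np.fft.rfft(window))
        rows.append(amplitudes)
    # shape: (n_windows, window_size // 2 + 1)
    return np.array(rows)

# One second of a synthetic 440 Hz tone sampled at 8 kHz
signal = np.sin(2 * np.pi * 440 * np.arange(8000) / 8000)
features = windowed_fft_features(signal)
print(features.shape)  # (61, 129)
```

Each row is now a cross-sectional observation, and each of the 129 columns is the amplitude of one frequency bin, which is exactly why the table ends up so wide.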

Reducing dimensionality

Some modeling algorithms, neural networks for example, can handle very wide tables or very large input sets; many others struggle. But practicality is not the only reason to reduce dimensionality. Overfitting is a serious concern when working with wide datasets: we want our model to generalize well to new data, not just latch onto one frequency that, perhaps by random chance, turned out to be an amazing classifier.

In the following sections, we'll review different types of binning and filtering to reduce the dimensionality of our newly cross-sectional data to a manageable size without losing too much information.
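The binning idea can be sketched before we get to the KNIME nodes: collapse groups of adjacent frequency bins into a single averaged feature. This is a minimal Python sketch under assumed dimensions (61 windows by 129 frequency bins, reduced to 16 binned features); the function name and bin count are illustrative:

```python
import numpy as np

def bin_spectrum(amplitudes, n_bins=16):
    """Reduce a wide amplitude spectrum to n_bins features by averaging
    groups of adjacent frequency bins (simple fixed-width binning)."""
    # Split the frequency axis (last axis) into n_bins near-equal groups
    groups = np.array_split(amplitudes, n_bins, axis=-1)
    # Average within each group and stack the results as new columns
    return np.stack([g.mean(axis=-1) for g in groups], axis=-1)

spectrum = np.random.rand(61, 129)  # 61 windows x 129 frequency bins
reduced = bin_spectrum(spectrum)
print(reduced.shape)  # (61, 16)
```

Averaging neighbors trades frequency resolution for robustness: a model fed 16 binned columns can no longer overfit to a single lucky frequency bin.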
