Generative AI has profoundly changed the world, and Transformers are a key driver of that change. Yet despite their immense potential for time series, Transformers have not been widely adopted in this field. Among other strengths, Transformers can capture long-range dependencies and interactions in the data.
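The mechanism behind that long-range capability is self-attention, which lets any time step relate directly to any other, however far apart. A minimal NumPy sketch (illustrative only, not taken from the book; shapes and values are toy assumptions):

```python
import numpy as np

def self_attention(x):
    """Scaled dot-product self-attention over a series.

    x: (seq_len, d) array of time-step embeddings.
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                        # similarity of every pair of steps
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over all time steps
    return weights @ x, weights                          # each step mixes the whole series

x = np.random.default_rng(0).normal(size=(100, 8))       # 100 time steps, 8 features
out, w = self_attention(x)
# Step 0 receives nonzero weight from step 99: a direct long-range connection,
# with no recurrence needed to bridge the gap.
print(out.shape, w[0, 99] > 0)
```

Unlike a recurrent model, the path between step 0 and step 99 here is a single attention operation, which is why such dependencies are easier to learn.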
In Transformers for Time Series Forecasting, the most recent research findings are presented in a highly practical fashion. Working through real-life projects in PyTorch and TensorFlow, the reader is guided through a variety of use cases. Starting with the most common applications for time series data, such as forecasting and classification, the book introduces both the theory and the implementation. Later, more specialised cases are covered, including anomaly detection, event forecasting, and spatio-temporal modelling.
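To give a flavour of the forecasting workflow, here is a minimal PyTorch sketch of a Transformer encoder that maps a window of past values to a one-step-ahead forecast. It is not code from the book; the class name, layer sizes, and window length are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TinyForecaster(nn.Module):
    """Toy one-step-ahead forecaster (illustrative hyperparameters)."""

    def __init__(self, d_model=32, nhead=4, num_layers=2, window=16):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)                 # scalar series -> d_model
        self.pos = nn.Parameter(torch.zeros(window, d_model))   # learned positional encoding
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)                       # predict the next value

    def forward(self, x):                                       # x: (batch, window, 1)
        h = self.input_proj(x) + self.pos
        h = self.encoder(h)
        return self.head(h[:, -1, :])                           # forecast from the last step

model = TinyForecaster()
batch = torch.randn(8, 16, 1)    # 8 toy windows of 16 past observations each
pred = model(batch)
print(pred.shape)                # one forecast per window: (8, 1)
```

Training this with a mean-squared-error loss on sliding windows of a real series is the kind of end-to-end exercise the book walks through in far more depth.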
The final chapters show how to improve these algorithms further, covering best practices, hyperparameter tuning techniques, and architecture-level modifications. Lastly, we discuss how to scale transformer-based solutions to large amounts of data.