PatchTST is a transformer-based model that has emerged as a strong performer in long-term forecasting tasks. It uses channel-independence to handle multivariate time series and patching to extract local semantic information from the data: by treating groups of time steps, rather than individual time steps, as input tokens, PatchTST can capture meaningful local patterns and dependencies in the series. Patching also reduces the space and time complexity of the model, making it faster and lighter while allowing it to handle longer input sequences.

After patching, each channel's sequence of patches is passed through a transformer encoder, and predictions are generated from the encoded representation. PatchTST can also leverage self-supervised representation learning to learn abstract representations of the data and further improve forecasting performance. Compared with models such as N-BEATS and N-HiTS, PatchTST has shown promising performance on long-term forecasting benchmarks. One such benchmark is the Exchange dataset, which contains the daily exchange rates of eight countries against the US dollar from 1990 to 2016.
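The patching and channel-independence ideas can be illustrated with a minimal NumPy sketch. This is a hypothetical helper, not the official PatchTST implementation; the patch length and stride values are illustrative assumptions.

```python
import numpy as np

def make_patches(series, patch_len=16, stride=8):
    """Split a univariate series into overlapping patches.

    Hypothetical helper for illustration; not the official PatchTST code.
    """
    n = series.shape[0]
    starts = range(0, n - patch_len + 1, stride)
    return np.stack([series[s:s + patch_len] for s in starts])

# Channel-independence: each variable of a multivariate series is patched
# separately, and each channel's patch sequence is fed to a shared backbone.
multivariate = np.random.randn(512, 8)  # 512 time steps, 8 channels
patches_per_channel = [make_patches(multivariate[:, c]) for c in range(8)]
print(patches_per_channel[0].shape)  # (63, 16)
```

Note how patching shrinks the token sequence the encoder sees: a 512-step input becomes 63 patch tokens per channel, which is what reduces the attention cost and lets the model attend over longer histories.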

Source: PatchTST — A Step Forward in Time Series Forecasting – Towards AI

