@@ -25,6 +25,8 @@ The abstract from the paper is the following:
This model was contributed by [elisim](https://huggingface.co/elisim) and [kashif](https://huggingface.co/kashif).
The original code can be found [here](https://github.com/zhouhaoyi/Informer2020).
Tips:
- Check out the Informer blog post on the Hugging Face blog: [Multivariate Probabilistic Time Series Forecasting with Informer](https://huggingface.co/blog/informer)
@@ -25,6 +25,7 @@ The Time Series Transformer model is a vanilla encoder-decoder Transformer for t
Tips:
- Check out the Time Series Transformer blog post on the Hugging Face blog: [Probabilistic Time Series Forecasting with 🤗 Transformers](https://huggingface.co/blog/time-series-transformers)
- Similar to other models in the library, [`TimeSeriesTransformerModel`] is the raw Transformer without any head on top, and [`TimeSeriesTransformerForPrediction`]
adds a distribution head on top of the former, which can be used for time-series forecasting. Note that this is a so-called probabilistic forecasting model, not a
point forecasting model. This means that the model learns a distribution, from which one can sample. The model doesn't directly output values.
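The probabilistic-vs-point distinction above can be sketched as follows. This is a minimal, self-contained illustration using NumPy, not the actual Transformers API: the distribution parameters below are made up to stand in for what a distribution head might emit, and forecasts are obtained by sampling and then summarizing the samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-step Gaussian parameters a distribution head might emit
# for a 4-step forecast horizon (illustrative values, not model output).
mean = np.array([10.0, 11.0, 12.0, 13.0])
scale = np.array([1.0, 1.0, 2.0, 2.0])

# A probabilistic model doesn't output values directly: draw many sample
# paths from the learned distribution instead.
samples = rng.normal(loc=mean, scale=scale, size=(100, 4))

# Summarize the samples into a point forecast plus uncertainty bands.
median = np.median(samples, axis=0)
lower, upper = np.quantile(samples, [0.1, 0.9], axis=0)
```

In the real model, `generate` plays the role of the sampling step, returning sample sequences that you aggregate (e.g. by taking the median) to obtain a point forecast with quantified uncertainty.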