Unverified Commit fc7ce2eb authored by Eli Simhayev's avatar Eli Simhayev Committed by GitHub

[Time-Series] Added blog-post to tips (#24482)

* [Time-Series] Added blog-post to tips

* added Resources to time series models docs

* removed "with Bert"
parent e16191a8
...@@ -29,6 +29,12 @@ The abstract from the paper is the following:
This model was contributed by [elisim](https://huggingface.co/elisim) and [kashif](https://huggingface.co/kashif).
The original code can be found [here](https://github.com/thuml/Autoformer).
## Resources
A list of official Hugging Face and community (indicated by 🌎) resources to help you get started. If you're interested in submitting a resource to be included here, please feel free to open a Pull Request and we'll review it! The resource should ideally demonstrate something new instead of duplicating an existing resource.
- Check out the Autoformer blog post on the Hugging Face blog: [Yes, Transformers are Effective for Time Series Forecasting (+ Autoformer)](https://huggingface.co/blog/autoformer)
## AutoformerConfig
[[autodoc]] AutoformerConfig
...@@ -43,4 +49,4 @@ The original code can be found [here](https://github.com/thuml/Autoformer).
## AutoformerForPrediction
[[autodoc]] AutoformerForPrediction
- forward
\ No newline at end of file
...@@ -29,7 +29,10 @@ The abstract from the paper is the following:
This model was contributed by [elisim](https://huggingface.co/elisim) and [kashif](https://huggingface.co/kashif).
The original code can be found [here](https://github.com/zhouhaoyi/Informer2020).
Tips:
## Resources
A list of official Hugging Face and community (indicated by 🌎) resources to help you get started. If you're interested in submitting a resource to be included here, please feel free to open a Pull Request and we'll review it! The resource should ideally demonstrate something new instead of duplicating an existing resource.
- Check out the Informer blog post on the Hugging Face blog: [Multivariate Probabilistic Time Series Forecasting with Informer](https://huggingface.co/blog/informer)
## InformerConfig
...
...@@ -29,7 +29,6 @@ The Time Series Transformer model is a vanilla encoder-decoder Transformer for t
Tips:
- Check out the Time Series Transformer blog-post in HuggingFace blog: [Probabilistic Time Series Forecasting with 🤗 Transformers](https://huggingface.co/blog/time-series-transformers)
- Similar to other models in the library, [`TimeSeriesTransformerModel`] is the raw Transformer without any head on top, and [`TimeSeriesTransformerForPrediction`]
adds a distribution head on top of the former, which can be used for time-series forecasting. Note that this is a so-called probabilistic forecasting model, not a
point forecasting model. This means that the model learns a distribution, from which one can sample. The model doesn't directly output values.
...@@ -60,6 +59,12 @@ which is then fed to the decoder in order to make the next prediction (also call
This model was contributed by [kashif](https://huggingface.co/kashif).
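The probabilistic-forecasting idea described in the tip above can be sketched without the library: the model's head emits distribution parameters for each future time step, and point forecasts are derived by sampling from that distribution. A minimal NumPy sketch, where the Gaussian parameters stand in for a trained head's output and all values are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

prediction_length = 24
# Pretend a trained distribution head produced these per-step parameters
# (a real model would emit them conditioned on the past values).
loc = np.linspace(10.0, 12.0, prediction_length)   # predicted means
scale = np.full(prediction_length, 0.5)            # predicted std devs

# Draw Monte Carlo sample paths from the predicted distribution...
num_samples = 100
samples = rng.normal(loc, scale, size=(num_samples, prediction_length))

# ...and reduce them to a point forecast plus an uncertainty band.
point_forecast = np.median(samples, axis=0)
lower, upper = np.quantile(samples, [0.1, 0.9], axis=0)
```

In the actual library, `TimeSeriesTransformerForPrediction.generate` plays the role of the sampling step, returning sample paths that the user reduces (e.g. mean or median) to point forecasts.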
## Resources
A list of official Hugging Face and community (indicated by 🌎) resources to help you get started. If you're interested in submitting a resource to be included here, please feel free to open a Pull Request and we'll review it! The resource should ideally demonstrate something new instead of duplicating an existing resource.
- Check out the Time Series Transformer blog post on the Hugging Face blog: [Probabilistic Time Series Forecasting with 🤗 Transformers](https://huggingface.co/blog/time-series-transformers)
## TimeSeriesTransformerConfig
...