Unverified Commit 92e5ddd2 authored by Ella Charlaix, committed by GitHub

Fix typo documentation (#4320)

fix typo documentation
parent 1926331e
@@ -11,7 +11,7 @@ specific language governing permissions and limitations under the License.
 -->
-# How to use the ONNX Runtime for inference
+# How to use ONNX Runtime for inference
 🤗 [Optimum](https://github.com/huggingface/optimum) provides a Stable Diffusion pipeline compatible with ONNX Runtime.
@@ -27,7 +27,7 @@ pip install optimum["onnxruntime"]
 ### Inference
-To load an ONNX model and run inference with the ONNX Runtime, you need to replace [`StableDiffusionPipeline`] with `ORTStableDiffusionPipeline`. In case you want to load a PyTorch model and convert it to the ONNX format on-the-fly, you can set `export=True`.
+To load an ONNX model and run inference with ONNX Runtime, you need to replace [`StableDiffusionPipeline`] with `ORTStableDiffusionPipeline`. In case you want to load a PyTorch model and convert it to the ONNX format on-the-fly, you can set `export=True`.
 ```python
 from optimum.onnxruntime import ORTStableDiffusionPipeline
...
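# --- Minimal usage sketch (not part of the diff above) ---
# This illustrates the sentence changed in the commit: loading a PyTorch
# Stable Diffusion checkpoint with ORTStableDiffusionPipeline and converting
# it to ONNX on the fly via export=True. The checkpoint name and prompt are
# illustrative assumptions, not taken from the commit.
from optimum.onnxruntime import ORTStableDiffusionPipeline

model_id = "runwayml/stable-diffusion-v1-5"  # assumed example checkpoint
pipeline = ORTStableDiffusionPipeline.from_pretrained(model_id, export=True)
image = pipeline("sailing ship in a storm by Leonardo da Vinci").images[0]
image.save("onnx_stable_diffusion.png")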