"src/vscode:/vscode.git/clone" did not exist on "5755d16868ec3da7d5eb4f42db77b01fac842ea8"
Unverified commit 579b4b20 authored by Ella Charlaix, committed by GitHub

Update documentation (#4422)

* update documentation

* minor
parent 6c5bd2a3
@@ -86,12 +86,13 @@ optimum-cli export onnx --model stabilityai/stable-diffusion-xl-base-1.0 --task
### Inference
-To load an ONNX model and run inference with ONNX Runtime, you need to replace `StableDiffusionPipelineXL` with `ORTStableDiffusionPipelineXL` :
+Here is an example of how you can load an SDXL ONNX model from [stabilityai/stable-diffusion-xl-base-1.0](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) and run inference with ONNX Runtime:
```python
from optimum.onnxruntime import ORTStableDiffusionXLPipeline
-pipeline = ORTStableDiffusionXLPipeline.from_pretrained("sd_xl_onnx")
+model_id = "stabilityai/stable-diffusion-xl-base-1.0"
+pipeline = ORTStableDiffusionXLPipeline.from_pretrained(model_id)
prompt = "sailing ship in storm by Leonardo da Vinci"
image = pipeline(prompt).images[0]
```
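If only the PyTorch weights are available, the same pipeline can also export them to ONNX on the fly and save the result for later runs. A minimal sketch, assuming `export=True` and `save_pretrained` work here as they do for other Optimum ONNX Runtime models (the `sd_xl_onnx` directory name is just an example):

```python
from optimum.onnxruntime import ORTStableDiffusionXLPipeline

model_id = "stabilityai/stable-diffusion-xl-base-1.0"
# Export the PyTorch checkpoint to ONNX on the fly (same outcome as `optimum-cli export onnx`)
pipeline = ORTStableDiffusionXLPipeline.from_pretrained(model_id, export=True)
# Save the exported ONNX files so later runs can load them directly and skip the export step
pipeline.save_pretrained("sd_xl_onnx")
pipeline = ORTStableDiffusionXLPipeline.from_pretrained("sd_xl_onnx")
```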
@@ -85,11 +85,13 @@ You can find more examples in the optimum [documentation](https://huggingface.co
### Inference
+Here is an example of how you can load an SDXL OpenVINO model from [stabilityai/stable-diffusion-xl-base-1.0](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) and run inference with OpenVINO Runtime:
```python
from optimum.intel import OVStableDiffusionXLPipeline
model_id = "stabilityai/stable-diffusion-xl-base-1.0"
-pipeline = OVStableDiffusionXLPipeline.from_pretrained(model_id, export=True)
+pipeline = OVStableDiffusionXLPipeline.from_pretrained(model_id)
prompt = "sailing ship in storm by Rembrandt"
image = pipeline(prompt).images[0]
```
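For repeated generations at a fixed resolution, the OpenVINO pipeline can usually be sped up by reshaping it to static input shapes before compiling. A minimal sketch, assuming `OVStableDiffusionXLPipeline` exposes the same `reshape()` and `compile()` helpers as the other Optimum Intel diffusion pipelines:

```python
from optimum.intel import OVStableDiffusionXLPipeline

model_id = "stabilityai/stable-diffusion-xl-base-1.0"
pipeline = OVStableDiffusionXLPipeline.from_pretrained(model_id)
# Fix the batch size, resolution and number of images per prompt so OpenVINO
# can compile the model for static shapes, typically faster than dynamic shapes
pipeline.reshape(batch_size=1, height=1024, width=1024, num_images_per_prompt=1)
pipeline.compile()
image = pipeline("sailing ship in storm by Rembrandt").images[0]
```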