
add note about onnx-fp16 opt

This commit is contained in:
Sean Sube 2023-03-19 22:30:24 -05:00
parent 31e841ab4a
commit d26416465b
Signed by: ssube
GPG Key ID: 3EED7B957D362AF1
1 changed file with 7 additions and 1 deletion


@ -331,9 +331,15 @@ any new packages.
```shell
> git clone https://github.com/microsoft/onnxruntime
> cd onnxruntime/onnxruntime/python/tools/transformers/models/stable_diffusion
> python3 optimize_pipeline.py \
    -i /home/ssube/onnx-web/models/stable-diffusion-onnx-v1-5 \
    -o /home/ssube/onnx-web/models/stable-diffusion-optimized-v1-5 \
    --use_external_data_format
```
You can convert the model into ONNX float16 using the `--float16` option. Models converted into ONNX float16 can only
be run when using [the `onnx-fp16` optimization](server-admin#pipeline-optimizations).
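As a sketch, a float16 conversion could look like the following; the output directory name here is illustrative, not a path the docs prescribe:

```shell
# convert the optimized pipeline to float16 weights
# (output path is an example; choose any directory under your models folder)
> python3 optimize_pipeline.py \
    -i /home/ssube/onnx-web/models/stable-diffusion-onnx-v1-5 \
    -o /home/ssube/onnx-web/models/stable-diffusion-fp16-v1-5 \
    --float16
```

Remember to enable the `onnx-fp16` optimization on the server before loading the resulting model.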
The `optimize_pipeline.py` script should work on any [diffusers directory with ONNX models](#converting-diffusers-models),
but you will need to use the `--use_external_data_format` option if you are not using `--float16`. See the `--help` for
more details.