
fix(docs): add note about MIopen errors (#444)

This commit is contained in:
Sean Sube 2023-12-23 12:46:02 -06:00
parent 52065ef317
commit a4c4877064
Signed by: ssube
GPG Key ID: 3EED7B957D362AF1
1 changed file with 11 additions and 0 deletions


@@ -127,6 +127,7 @@ Please see [the server admin guide](server-admin.md) for details on how to confi
- [Shape mismatch attempting to re-use buffer](#shape-mismatch-attempting-to-re-use-buffer)
- [Cannot read properties of undefined (reading 'default')](#cannot-read-properties-of-undefined-reading-default)
- [Missing key(s) in state\_dict](#missing-keys-in-state_dict)
- [Missing MIOpen.so.1](#missing-miopenso1)
- [Output Image Sizes](#output-image-sizes)
## Outline
@@ -1740,6 +1741,16 @@ RuntimeError: Error(s) in loading state_dict for AutoencoderKL:
Unexpected key(s) in state_dict: "encoder.mid_block.attentions.0.key.bias", "encoder.mid_block.attentions.0.key.weight", "encoder.mid_block.attentions.0.proj_attn.bias", "encoder.mid_block.attentions.0.proj_attn.weight", "encoder.mid_block.attentions.0.query.bias", "encoder.mid_block.attentions.0.query.weight", "encoder.mid_block.attentions.0.value.bias", "encoder.mid_block.attentions.0.value.weight", "decoder.mid_block.attentions.0.key.bias", "decoder.mid_block.attentions.0.key.weight", "decoder.mid_block.attentions.0.proj_attn.bias", "decoder.mid_block.attentions.0.proj_attn.weight", "decoder.mid_block.attentions.0.query.bias", "decoder.mid_block.attentions.0.query.weight", "decoder.mid_block.attentions.0.value.bias", "decoder.mid_block.attentions.0.value.weight".
```
#### Missing MIOpen.so.1
This error can occur when you upgrade Torch to a version that was built against a more recent ROCm release than the
one installed on your machine, so Torch cannot find the MIOpen library version it expects.
This can sometimes be fixed by upgrading ROCm, but check the list of supported GPU architectures before you upgrade,
since newer ROCm releases often drop support for older GPUs.
If you cannot upgrade ROCm, downgrade Torch to a version built for the ROCm libraries available on your machine.
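For example, you can compare the installed ROCm release against the ROCm version your Torch build expects, then install a matching Torch build. The paths, version numbers, and index URL below are illustrative and may differ on your system; check the PyTorch site for the current ROCm wheels:

```shell
# Check the installed ROCm release (path may vary by distribution)
cat /opt/rocm/.info/version

# Check which ROCm/HIP version the installed Torch was built against
# (prints None for CPU-only or CUDA builds)
python3 -c "import torch; print(torch.version.hip)"

# If the versions do not match, install a Torch build for your ROCm
# release, e.g. ROCm 5.6 (see pytorch.org for current index URLs)
pip3 install torch --index-url https://download.pytorch.org/whl/rocm5.6
```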
## Output Image Sizes
You can use this table to figure out the final size for each image, based on the combination of parameters that you are