
225 Commits

Author SHA1 Message Date
Sean Sube 0ecae65f88
fix(api): set VAE attn processor during conversion 2023-09-24 15:02:21 -05:00
Sean Sube 5d3a7d77a5
fix(api): use Torch pipelines while loading models for conversion 2023-09-24 10:04:21 -05:00
Sean Sube c99481f484
fix(api): load replacement VAE from single file for SD v1/v2 2023-09-24 09:49:50 -05:00
Sean Sube 52fdf4f48a
fix(api): test LoRA blending code 2023-09-21 18:24:08 -05:00
Sean Sube a71298ff33
remove unused schema, lint 2023-09-15 08:40:56 -05:00
Sean Sube b851c234fe
more tests 2023-09-14 19:35:48 -05:00
Sean Sube 8a5e211172
lint(api): deduplicate some imports 2023-09-13 17:27:05 -05:00
Sean Sube cf2cf51b17
fix(api): resolve XL VAE within model folder 2023-09-11 20:21:43 -05:00
Sean Sube cd06f9291b
add custom VAE and fp16 support to SDXL conversion 2023-09-11 18:18:38 -05:00
Sean Sube 51b10de265
pass initial device to chain 2023-09-10 22:46:06 -05:00
Sean Sube 956a260db6
fix import 2023-09-10 12:15:39 -05:00
Sean Sube f2d0c2f765
apply lint 2023-09-10 11:53:36 -05:00
Sean Sube fe68670844
feat(api): add conversion for SDXL models 2023-09-10 11:52:46 -05:00
Sean Sube 1d5ac7dde5
optimize XL LoRA node matching 2023-09-04 21:20:54 -05:00
Sean Sube ea9023c2eb
fix(api): only run SDXL LoRA node matching on XL models 2023-09-04 21:20:39 -05:00
Sean Sube ecac0d5770
apply lint 2023-09-04 21:20:33 -05:00
Sean Sube bbacfd1ca0
feat(api): support for SDXL LoRAs 2023-09-04 21:20:30 -05:00
Sean Sube 8e1f188d8f
fix(api): pass replacement VAE to diffusers conversion 2023-09-04 21:19:36 -05:00
Sean Sube 6ec7777f77
lint(api): type fixes and hints throughout 2023-07-04 10:20:28 -05:00
Sean Sube 7c8dc7a6b6
feat(api): create hash file for models without one 2023-06-27 08:17:21 -05:00
Sean Sube 7db53a82df
fix(api): correctly handle sliced kernels 2023-06-17 14:51:58 -05:00
Sean Sube 5713957026
fix(api): keep kernel slices within bounds 2023-06-16 20:49:43 -05:00
Sean Sube c1a4484b75
apply lint 2023-06-16 20:43:28 -05:00
Sean Sube 719b34967f
fix(api): handle blending of mismatched kernels 2023-06-16 20:41:25 -05:00
Sean Sube ac2eceb0ac
fix(api): remove trailing space from param name 2023-06-10 11:54:33 -05:00
Sean Sube e64b813ac0
fix(api): pass inpaint param to checkpoint extraction 2023-06-10 11:52:25 -05:00
Sean Sube 665cf57a86
fix typo 2023-06-10 11:30:47 -05:00
HoopyFreud 22ff0be758
fix inpainting conversion via torch 2023-06-10 12:01:02 -04:00
Sean Sube 4810ca858a
fix(api): correct name of torch extraction var 2023-06-03 09:52:31 -05:00
Sean Sube 4dc192a2c7
apply lint 2023-05-28 19:44:16 -05:00
Sean Sube 00b1bec7a9
fix(api): skip loading custom VAE after torch extraction (#379) 2023-05-27 12:31:08 -05:00
Sean Sube 86d458a0c6
fix(api): avoid passing cache path before sanitizing config name 2023-05-27 12:27:33 -05:00
Sean Sube 24f1ecc0eb
fix(api): load local custom VAE from within model path 2023-05-27 11:26:16 -05:00
Sean Sube 1562f4011e
replace pydantic polyfill 2023-05-20 20:19:41 -05:00
Sean Sube eb53238fd5
do not override source after fetch 2023-05-20 20:15:19 -05:00
Sean Sube df2e854dcc
fix load_yaml import 2023-05-20 20:09:45 -05:00
Sean Sube 9fa1f0387b
add env var for torch extraction 2023-05-20 20:06:54 -05:00
Sean Sube 9c28154dee
feat(api): add option to reload CNet for conversion 2023-05-20 15:19:38 -05:00
Sean Sube 20107f559a
use Torch working model for conversion 2023-05-20 15:12:39 -05:00
Sean Sube 339441cb16
fix(api): restore old checkpoint conversion code path 2023-05-20 13:41:23 -05:00
Sean Sube fe80e61e79
fix CNet conversion condition 2023-05-14 20:53:12 -05:00
Sean Sube b6a4cbaae1
fix(api): collate CNet after unloading UNet 2023-05-14 20:43:16 -05:00
Sean Sube 01ab864a55
collate UNet after unloading original, add option to skip pipeline loading test 2023-05-14 20:29:18 -05:00
Sean Sube 0d51d61728
feat(api): add env vars for controlnet conversion and opset 2023-05-14 20:04:43 -05:00
Sean Sube a99388645a
correct import 2023-05-14 19:56:40 -05:00
Sean Sube e2035c3fbf
fix(api): run GC during diffusers conversion, add flag to skip ControlNet (#369) 2023-05-14 19:30:30 -05:00
Sean Sube d66bf9e54f
fix(api): download pretrained models from HF correctly (#371) 2023-05-03 22:21:04 -05:00
Sean Sube 572a5159ad
fix(api): limit fp16 ops for v2.1 models (#364) 2023-04-30 08:49:34 -05:00
Sean Sube bf2f77d7a2
apply lint 2023-04-29 23:12:15 -05:00
Sean Sube 077845e868
fix log levels 2023-04-29 23:11:46 -05:00
Sean Sube 62abc76f3f
convert checkpoint dtype after loading 2023-04-29 23:08:48 -05:00
Sean Sube e28bbe2c18
set safetensor and dtype when loading 2023-04-29 23:05:51 -05:00
Sean Sube bc71583393
fix(api): add model image size and version hint to extras file 2023-04-29 22:56:52 -05:00
Sean Sube 2690eafe09
fix(api): pass correct text model type when converting v2 checkpoints (#360) 2023-04-29 22:45:48 -05:00
Sean Sube 4eba9a6400
avoid joining none path 2023-04-29 21:16:23 -05:00
Sean Sube 5760be710a
fix(api): restore use of config key when converting from checkpoints 2023-04-29 20:26:01 -05:00
Sean Sube f782f39cce
fix(api): remove size limit on inpainting stage 2023-04-29 14:23:00 -05:00
Sean Sube acde899559
reuse config from loaded UNet 2023-04-29 13:27:39 -05:00
Sean Sube 0175d7edcf
fix(api): specify input channels when converting inpainting models (#356) 2023-04-29 12:18:15 -05:00
Sean Sube 4c126153f5
fix(api): switch to diffusers ckpt loading, add more pipelines to conversion (#337, #356) 2023-04-29 12:00:43 -05:00
Sean Sube dbd9a186ae
feat(api): allow converting inpaint models (#356) 2023-04-27 23:41:04 -05:00
Sean Sube dc6bf330c1
feat(api): add optimization for max attention slicing (#355) 2023-04-24 19:13:32 -05:00
Sean Sube 5f5b01fed0
apply lint 2023-04-22 12:28:46 -05:00
Sean Sube 5f5418132b
fix(api): build sum tokens for TIs using emb_params key 2023-04-22 12:07:52 -05:00
Sean Sube a46f00aeb4
fix(api): use slice multiplication for 3x3 kernels without CP decomp 2023-04-20 22:10:49 -05:00
Sean Sube 49781df2fd
apply lint 2023-04-20 21:26:16 -05:00
Sean Sube ff4c8a9404
handle tensors of the same rank 2023-04-20 20:24:37 -05:00
Sean Sube 3e8f4b3edf
fix(api): improve summing of mismatched weights 2023-04-20 20:06:43 -05:00
Sean Sube f0109d3406
fix(api): blend LoHA and LoRA weights for 1x1 kernels 2023-04-20 19:31:40 -05:00
Sean Sube e9e9d75f9d
fix(api): correct return value after fully converting checkpoints 2023-04-20 18:00:35 -05:00
Sean Sube 161913bb7e
fix(api): add pre-converted ControlNets to base sources, add missing ControlNet strings 2023-04-15 16:55:53 -05:00
Sean Sube 317029356e
apply lint 2023-04-15 14:42:14 -05:00
Sean Sube 95841ffe2b
fix(api): check diffusers version before imports (#336) 2023-04-15 14:32:22 -05:00
Sean Sube 6effbea1eb
fix missing import 2023-04-15 10:57:37 -05:00
Sean Sube 7545d7c73d
add ControlNet UNet to existing checkpoints 2023-04-15 10:54:52 -05:00
Sean Sube 7f16b2f0ae
clean up CNet-only conversion code 2023-04-15 09:08:14 -05:00
Sean Sube 0dd8272285
feat(api): convert CNet for existing diffusion models 2023-04-15 09:00:17 -05:00
Sean Sube 9e017ee35d
feat: add parameter for ControlNet selection 2023-04-12 08:43:15 -05:00
Sean Sube fbf576746c
attention fix for cnet export 2023-04-11 23:11:08 -05:00
Sean Sube f49a4ddfdf
test controlnet pipe 2023-04-11 23:06:32 -05:00
Sean Sube a3daaf0112
initial integration of controlnet pipeline 2023-04-11 19:29:25 -05:00
Sean Sube 62aa7e8473
feat(api): add initial support for BSRGAN and SwinIR upscaling (#153, #154) 2023-04-10 17:49:56 -05:00
Sean Sube 512f41135d
add logging when conv shapes do not match 2023-04-10 08:14:47 -05:00
Sean Sube 4db1737f98
apply lint 2023-04-10 08:09:29 -05:00
Sean Sube 7d2d865c19
convert LoHA T weights to same dtype, log shapes rather than data 2023-04-10 08:06:28 -05:00
Sean Sube 1ed51352c4
handle LoHA nodes with CP decomp 2023-04-10 08:06:22 -05:00
Sean Sube 88c7fd9f60
start handling Gemm nodes 2023-04-10 07:58:21 -05:00
Sean Sube df02e20b34
only use mids once 2023-04-10 07:58:18 -05:00
Sean Sube 4947734add
flip axis 2023-04-10 07:58:15 -05:00
Sean Sube 87bba0e169
reverse factors 2023-04-10 07:58:12 -05:00
Sean Sube d671cb8113
attempt layer by layer recomp 2023-04-10 07:58:08 -05:00
ForserX 7c4e729f85
Fix for LoHA weight and LoRA alpha read 2023-04-10 10:58:47 +03:00
Sean Sube b6bb246032
apply lint 2023-04-09 20:34:10 -05:00
Sean Sube 9698e29268
lint(api): name context params consistently (#278) 2023-04-09 20:33:03 -05:00
Sean Sube ff9ce03af5
fix(api): better handling for errors while converting checkpoint to torch (#165) 2023-04-09 18:05:44 -05:00
Sean Sube 34f1973707
feat(api): add support for PyTorch 2.0 (#292) 2023-04-09 16:07:40 -05:00
ForserX d244ca9a9c
LoHA: Fixed blending error 2023-04-08 13:46:05 +03:00
Sean Sube bd03edc2ad
fix(api): handle 3x3 kernels in LoHA networks 2023-04-07 20:54:42 -05:00
Sean Sube 7f7476bcbb
fix(api): use correct weight pairs in LoHA 2023-04-07 19:51:30 -05:00
Sean Sube 35432f1ab2
feat(api): experimental support for LoHA networks 2023-04-07 18:50:12 -05:00