# onnx-web

onnx-web is designed to simplify the process of running Stable Diffusion and other ONNX models so you can focus on making high-quality, high-resolution art. With hardware acceleration on both AMD and Nvidia GPUs and a reliable CPU software fallback, it offers the full feature set on desktops, laptops, and multi-GPU servers, all with a seamless user experience.

You can navigate the user-friendly web UI, hosted on GitHub Pages and accessible from all major browsers, including your go-to mobile device. There, you can choose diffusion models and accelerators for each image pipeline, with easy access to the image parameters that define each mode. Whether you're uploading images or expressing your artistic touch through inpainting and outpainting, onnx-web provides an environment that's as user-friendly as it is powerful. Recent output images are presented beneath the controls, serving as a handy visual reference for revisiting previous parameters or remixing your earlier outputs.

Dive deeper into the onnx-web experience with its API, compatible with both Linux and Windows. This RESTful interface seamlessly integrates various pipelines from the HuggingFace diffusers library, offering valuable metadata on models and accelerators, along with detailed outputs from your creative runs.
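As a rough illustration of that interface, the sketch below shows how a client might request an image from a locally running server using Python. The host, port, endpoint path, parameter names, and request style are assumptions for illustration, not the documented API; see the API docs for the real routes and parameters.

```python
# Minimal sketch of requesting an image from a locally running onnx-web
# server. The host, port, endpoint path, parameter names, and the use of
# query parameters are all assumptions for illustration; check the API
# documentation for the real interface.
import requests

server = "http://127.0.0.1:5000"  # assumed default host and port

params = {
    "prompt": "a ghostly astronaut eating a weird hamburger",
    "width": 512,
    "height": 512,
}

resp = requests.post(f"{server}/api/txt2img", params=params)  # hypothetical route
resp.raise_for_status()
print(resp.json())  # metadata describing the queued job and its outputs
```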

Embark on your generative art journey with onnx-web, and explore its capabilities through our detailed documentation site. Find a comprehensive getting started guide, setup guide, and user guide waiting to empower your creative endeavors!

Please check out the documentation site for more info.

*Preview: the txt2img tab using SDXL to generate ghostly astronauts eating weird hamburgers on an abandoned space station.*

## Features

This is an incomplete list of new and interesting features:

## Contents

## Setup

There are a few ways to run onnx-web:

You only need to run the server and should not need to compile anything. The client GUI is hosted on GitHub Pages and is included with the Windows all-in-one bundle.

The extended setup docs have been moved to the setup guide.

### Adding your own models

You can add your own models by downloading them from the HuggingFace Hub or Civitai, or by converting them from local files, without making any code changes. You can also download and blend in additional networks, such as LoRAs and Textual Inversions, using tokens in the prompt, as sketched below.
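For example, a prompt that blends in extra networks might look like the following sketch. The `<lora:...>` and `<inversion:...>` token syntax, the network names, and the weights shown here are assumptions for illustration only; see the user guide for the exact prompt grammar.

```python
# Hypothetical prompt blending in extra networks with prompt tokens.
# The <lora:name:weight> and <inversion:name:weight> syntax, the network
# names, and the weights are assumptions; see the user guide for the
# exact prompt grammar supported by the server.
prompt = (
    "<lora:my-style-lora:1.0> "
    "<inversion:my-concept:0.8> "
    "a painting of a lighthouse at sunset, highly detailed"
)
print(prompt)
```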

## Usage

### Known errors and solutions

Please see the Known Errors section of the user guide.

### Running the containers

This has been moved to the server admin guide.

## Credits

Some of the conversion and pipeline code was copied or derived from code in:

Those parts have their own licenses with additional restrictions on commercial usage, modification, and redistribution. The rest of the project is provided under the MIT license, and I am working to isolate these components into a library.

There are many other good options for using Stable Diffusion with hardware acceleration, including:

Getting this set up and running on AMD would not have been possible without guides by: