This works on both Linux and Windows, for both AMD and Nvidia, but requires some familiarity with the command line.
### Install Git and Python
Install Git and Python 3.9 or 3.10 for your environment:
- https://gitforwindows.org/
- https://www.python.org/downloads/
The latest version of Git should be fine. Python should be 3.9 or 3.10, although 3.8 and 3.11 may work if the correct
packages are available for your platform. If you already have Python installed for another Stable Diffusion tool,
that should work, but make sure to verify the version in the next step.
Make sure you have Python 3.9 or 3.10:
```shell
> python --version
Python 3.10
```
If your system differentiates between Python 2 and 3 and uses the `python3` and `pip3` commands for the Python 3.x
tools, make sure to adjust the commands shown here. They should otherwise be the same: `python3 --version`.
Once you have those basic packages installed, clone this git repository:
```shell
> git clone https://github.com/ssube/onnx-web.git
```
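Then change into the repository before continuing; a quick `git log` confirms the clone succeeded:

```shell
> cd onnx-web
> git log -1 --oneline
```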
### Note about setup paths
This project contains both Javascript and Python, for the client and server respectively. Make sure you are in the
correct directory when working with each part.
Most of these setup commands should be run in the Python environment and the `api/` directory:
```shell
> cd api
> pwd
/home/ssube/code/github/ssube/onnx-web/api
```
The Python virtual environment will be created within the `api/` directory.
The Javascript client can be built and run within the `gui/` directory.
### Create a virtual environment
Change into the `api/` directory, then create a virtual environment. The `venv` module ships with Python 3, so no
extra pip package is needed:
```shell
> python -m venv onnx_env
```
This will contain all of the pip libraries. If you update or reinstall Python, you will need to recreate the virtual
environment.
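If you do need to recreate it, for example after a Python upgrade, delete the old directory and run `venv` again
(commands shown for Linux; on Windows, remove the folder with Explorer or `rmdir /s`):

```shell
> deactivate          # only if the old environment is currently active
> rm -rf onnx_env     # remove the stale environment
> python -m venv onnx_env
```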
If you receive an error like `Error: name 'cmd' is not defined`, there may be [a bug in the `venv` module](https://www.mail-archive.com/debian-bugs-dist@lists.debian.org/msg1884072.html) on certain
Debian-based systems. You may need to install venv through apt instead:
```shell
> sudo apt install python3-venv # only if you get an error
```
Every time you start using ONNX web, activate the virtual environment:
```shell
# on Linux:
> source ./onnx_env/bin/activate
# on Windows (cmd; in PowerShell, use .\onnx_env\Scripts\Activate.ps1):
> .\onnx_env\Scripts\activate.bat
```
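To confirm the environment is active, check which interpreter is first on your `PATH`; it should resolve to the copy
inside `onnx_env` (the path below assumes the example layout shown earlier):

```shell
> which python
/home/ssube/code/github/ssube/onnx-web/api/onnx_env/bin/python
```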
Update pip itself:
```shell
> python -m pip install --upgrade pip
```
### Install pip packages
You can install all of the necessary packages at once using [the `requirements/base.txt` file](./api/requirements/base.txt)
and the file in `requirements/` that matches your platform. Install them with separate commands, and make sure to
install the platform-specific packages first:
```shell
> pip install -r requirements/amd-linux.txt
> pip install -r requirements/base.txt
# or
> pip install -r requirements/amd-windows.txt
> pip install -r requirements/base.txt
# or
> pip install -r requirements/cpu.txt
> pip install -r requirements/base.txt
# or
> pip install -r requirements/nvidia.txt
> pip install -r requirements/base.txt
```
Only install one of the platform-specific requirements files, otherwise you may end up with the wrong version of
PyTorch or the ONNX runtime. The full list of available ONNX runtime packages
[can be found here](https://download.onnxruntime.ai/).
If you have successfully installed both of the requirements files for your platform, you do not need to install
any of the packages shown in the following sections and you should [skip directly to testing the models](#test-the-models).
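To check that the right runtime was installed, you can list the available execution providers with
`onnxruntime.get_available_providers()`; the provider names depend on which package you installed (for example,
`CUDAExecutionProvider` for the Nvidia build or `CPUExecutionProvider` for the CPU build):

```shell
> python -c "import onnxruntime; print(onnxruntime.get_available_providers())"
```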
The ONNX runtime nightly packages used by the `requirements/*-nightly.txt` files can be substantially faster than the
latest release, but may not always be stable. Many of the nightly packages are specific to one version of Python and
some are only available for Python 3.8 and 3.9, so you may need to find the correct package for your environment. If
you are using Python 3.10, download the `cp310` package; for Python 3.9, download the `cp39` package, and so on.
Installing with pip will select the correct package for you.
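If you do install a nightly wheel manually, you can print the `cp` tag for your interpreter with a one-liner (output
shown for Python 3.10):

```shell
> python -c "import sys; print('cp%d%d' % sys.version_info[:2])"
cp310
```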
#### For AMD on Linux: PyTorch ROCm and ONNX runtime ROCm
If you are running on Linux with an AMD GPU, install the ROCm versions of PyTorch and `onnxruntime`: