Commit cf26786 ("Update readme again"), 1 parent: 5c18d14

File tree: 1 file changed

README.md
Lines changed: 41 additions & 41 deletions
@@ -1,59 +1,59 @@
 # docker-diffusers-api ("banana-sd-base")
 
-Diffusers / Stable Diffusion in docker with a REST API, supporting various models, pipelines & schedulers. Used by [kiri.art](https://kiri.art/), perfect for [banana.dev](https://www.banana.dev/).
+Diffusers / Stable Diffusion in docker with a REST API, supporting various models, pipelines & schedulers. Used by [kiri.art](https://kiri.art/), perfect for [banana.dev](https://www.banana.dev/).
 
-Copyright (c) Gadi Cohen, 2022. MIT Licensed.
+Copyright (c) Gadi Cohen, 2022. MIT Licensed.
 Please give credit and link back to this repo if you use it in a public project.
 
 ## Features
 
-* Pipelines: txt2img, img2img and inpainting in a single container
-* Models: stable-diffusion, waifu-diffusion, and easy to add others (e.g. jp-sd)
-* All model inputs supported, including setting nsfw filter per request
-* *Permute* base config to multiple forks based on yaml config with vars
-* Optionally send signed event logs / performance data to a REST endpoint
-* Can automatically download a checkpoint file and convert to diffusers.
-* S3 support, dreambooth training.
+- Pipelines: txt2img, img2img and inpainting in a single container
+- Models: stable-diffusion, waifu-diffusion, and easy to add others (e.g. jp-sd)
+- All model inputs supported, including setting nsfw filter per request
+- _Permute_ base config to multiple forks based on yaml config with vars
+- Optionally send signed event logs / performance data to a REST endpoint
+- Can automatically download a checkpoint file and convert to diffusers.
+- S3 support, dreambooth training.
 
 Note: This image was created for [kiri.art](https://kiri.art/).
 Everything is open source but there may be certain request / response
-assumptions. If anything is unclear, please open an issue.
+assumptions. If anything is unclear, please open an issue.
 
 ## Updates and Help
 
-* [Official `docker-diffusers-api` Forum](https://banana-forums.dev/c/open-source/docker-diffusers-api/16):
+- [Official `docker-diffusers-api` Forum](https://banana-forums.dev/c/open-source/docker-diffusers-api/16):
 help, updates, discussion.
-* Subscribe ("watch") these forum topics for:
-  * [notable **`main`** branch updates](https://banana-forums.dev/t/official-releases-main-branch/35)
-  * [notable **`dev`** branch updates](https://banana-forums.dev/t/development-releases-dev-branch/53)
-* Always [check the CHANGELOG](./CHANGELOG.md) for important updates when upgrading.
+- Subscribe ("watch") these forum topics for:
+  - [notable **`main`** branch updates](https://banana-forums.dev/t/official-releases-main-branch/35)
+  - [notable **`dev`** branch updates](https://banana-forums.dev/t/development-releases-dev-branch/53)
+- Always [check the CHANGELOG](./CHANGELOG.md) for important updates when upgrading.
 
 **Official help in our dedicated forum https://banana-forums.dev/c/open-source/docker-diffusers-api/16.**
 
-*[See the `dev` branch for the latest features.](https://github.com/kiri-art/docker-diffusers-api/tree/dev)
-**Pull Requests must be submitted against the dev branch.***
+\*[See the `dev` branch for the latest features.](https://github.com/kiri-art/docker-diffusers-api/tree/dev)
+**Pull Requests must be submitted against the dev branch.\***
 
 ## Usage:
 
 Firstly, fork and clone this repo.
 
-Most of the configuration happens via docker build variables. You can
+Most of the configuration happens via docker build variables. You can
 see all the options in the [Dockerfile](./Dockerfile), and edit them
 there directly, or set via docker command line or e.g. Banana's dashboard
 UI once support for build variables land (any day now).
 
-If you're only deploying one container, that's all you need! If you
+If you're only deploying one container, that's all you need! If you
 intend to deploy multiple containers each with different variables
 (e.g. a few different models), you can edit the example
 [`scripts/permutations.yaml`](scripts/permutations.yaml)] file and
 run [`scripts/permute.sh`](scripts/permute.sh) to create a number
 of sub-repos in the `permutations` directory.
 
-Lastly, there's an option to set `MODEL_ID=ALL`, and *all* models will
+Lastly, there's an option to set `MODEL_ID=ALL`, and _all_ models will
 be downloaded, and switched at request time (great for dev, useless for
 serverless).
 
-**Deploying to banana?** That's it! You're done. Commit your changes and push.
+**Deploying to banana?** That's it! You're done. Commit your changes and push.
 
 ## Running locally / development:
 
@@ -62,16 +62,16 @@ serverless).
 1. `docker build -t banana-sd --build-arg HF_AUTH_TOKEN=$HF_AUTH_TOKEN .`
 1. See [CONTRIBUTING.md](./CONTRIBUTING.md) for more helpful hints.
 1. Note: your first build can take a really long time, depending on
-your PC & network speed, and *especially when using the `CHECKPOINT_URL`
-feature*. Great time to grab a coffee or take a walk.
+your PC & network speed, and _especially when using the `CHECKPOINT_URL`
+feature_. Great time to grab a coffee or take a walk.
 
 **Running**
 
 1. `docker run -it --gpus all -p 8000:8000 banana-sd`
 1. Note: the `-it` is optional but makes it alot quicker/easier to stop the
-container using `Ctrl-C`.
+container using `Ctrl-C`.
 1. If you get a `CUDA initialization: CUDA unknown error` after suspend,
-just stop the container, `rmmod nvidia_uvm`, and restart.
+just stop the container, `rmmod nvidia_uvm`, and restart.
 
 ## Sending requests
 
@@ -91,15 +91,18 @@ The container expects an `HTTP POST` request with the following JSON body:
     "MODEL_ID": "runwayml/stable-diffusion-v1-5",
     "PIPELINE": "StableDiffusionPipeline",
     "SCHEDULER": "LMSDiscreteScheduler",
-    "safety_checker": true,
-  },
+    "safety_checker": true
+  }
 }
 ```
 
 If you're using banana's SDK, it looks something like this:
 
 ```js
-const out = await banana.run(apiKey, modelKey, { "modelInputs": modelInputs, "callInputs": callInputs });
+const out = await banana.run(apiKey, modelKey, {
+  modelInputs: modelInputs,
+  callInputs: callInputs,
+});
 ```
 
 NB: if you're coming from another banana starter repo, note that we
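The request body and SDK call documented in the hunk above can also be exercised with a plain HTTP client. A minimal sketch, assuming the container from the `docker run -p 8000:8000` step is listening on localhost, and using an illustrative `prompt` model input (the exact model-input fields are not spelled out in this diff):

```javascript
// Hypothetical end-to-end call. ASSUMPTIONS: the container is reachable at
// http://localhost:8000/ (per the `docker run` step), and `prompt` stands in
// for whichever model inputs your pipeline expects.
const body = {
  modelInputs: { prompt: "A sunlit cabin in a forest" },
  callInputs: {
    MODEL_ID: "runwayml/stable-diffusion-v1-5",
    PIPELINE: "StableDiffusionPipeline",
    SCHEDULER: "LMSDiscreteScheduler",
    safety_checker: true,
  },
};

async function callContainer(url = "http://localhost:8000/") {
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  return res.json(); // parsed JSON response from the container
}
```

This is the same `{ modelInputs, callInputs }` envelope the `banana.run()` snippet passes; only the transport differs.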
@@ -112,13 +115,14 @@ If provided, `init_image` and `mask_image` should be base64 encoded.
 **Schedulers**: docker-diffusers-api is simply a wrapper around diffusers,
 literally any scheduler included in diffusers will work out of the box,
 provided it can loaded with its default config and without requiring
-any other explicit arguments at init time. In any event, the following
+any other explicit arguments at init time. In any event, the following
 schedulers are the most common and most well tested:
-`DPMSolverMultistepScheduler` (fast! only needs 20 steps!),
+`DPMSolverMultistepScheduler` (fast! only needs 20 steps!),
 `LMSDiscreteScheduler`, `DDIMScheduler`, `PNDMScheduler`,
 `EulerAncestralDiscreteScheduler`, `EulerDiscreteScheduler`.
 
 <a name="testing"></a>
+
 ## Examples and testing
 
 There are also very basic examples in [test.py](./test.py), which you can view
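Because `SCHEDULER` is just a call input, switching schedulers is a per-request change rather than a rebuild. A hedged sketch: the `withScheduler` helper is illustrative and not part of the repo, and `num_inference_steps` is assumed to pass through as a model input (per "all model inputs supported" in the Features list); scheduler names are from the list above.

```javascript
// Illustrative helper (not part of the repo): derive a request that uses a
// different scheduler, without mutating the base request.
// ASSUMPTION: `num_inference_steps` is accepted as a model input.
function withScheduler(request, scheduler, steps) {
  return {
    ...request,
    callInputs: { ...request.callInputs, SCHEDULER: scheduler },
    modelInputs: { ...request.modelInputs, num_inference_steps: steps },
  };
}

const base = {
  modelInputs: { prompt: "A sunlit cabin in a forest" },
  callInputs: {
    MODEL_ID: "runwayml/stable-diffusion-v1-5",
    SCHEDULER: "LMSDiscreteScheduler",
  },
};

// DPMSolverMultistepScheduler is the fast option above: ~20 steps suffice.
const fast = withScheduler(base, "DPMSolverMultistepScheduler", 20);
```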
@@ -150,11 +154,9 @@ Request took 3.0s (init: 2.4s, inference: 2.1s)
 The best example of course is https://kiri.art/ and it's
 [source code](https://github.com/kiri-art/stable-diffusion-react-nextjs-mui-pwa).
 
-
-
 ## Troubleshooting
 
-* **403 Client Error: Forbidden for url**
+- **403 Client Error: Forbidden for url**
 
 Make sure you've accepted the license on the model card of the HuggingFace model
 specified in `MODEL_ID`, and that you correctly passed `HF_AUTH_TOKEN` to the
@@ -165,12 +167,12 @@ The best example of course is https://kiri.art/ and it's
 You have two options.
 
 1. For a diffusers model, simply set the `MODEL_ID` docker build variable to the name
-of the model hosted on HuggingFace, and it will be downloaded automatically at
-build time.
+of the model hosted on HuggingFace, and it will be downloaded automatically at
+build time.
 
 1. For a non-diffusers model, simply set the `CHECKPOINT_URL` docker build variable
-to the URL of a `.ckpt` file, which will be downloaded and converted to the diffusers
-format automatically at build time.
+to the URL of a `.ckpt` file, which will be downloaded and converted to the diffusers
+format automatically at build time.
 
 ## Keeping forks up to date
 
@@ -182,15 +184,15 @@ Per your personal preferences, rebase or merge, e.g.
 
 Or, if you're confident, do it in one step with no confirmations:
 
-`git fetch upstream && git merge upstream/main --no-edit && git push`
+`git fetch upstream && git merge upstream/main --no-edit && git push`
 
 Check `scripts/permute.sh` and your git remotes, some URLs are hardcoded, I'll
 make this easier in a future release.
 
 ## Event logs / performance data
 
 Set `CALL_URL` and `SIGN_KEY` environment variables to send timing data on `init`
-and `inference` start and end data. You'll need to check the source code of here
+and `inference` start and end data. You'll need to check the source code of here
 and sd-mui as the format is in flux.
 
 This info is now logged regardless, and `init()` and `inference()` times are sent
@@ -199,5 +201,3 @@ back via `{ $timings: { init: timeInMs, inference: timeInMs } }`.
 ## Acknowledgements
 
 Originally based on https://github.com/bananaml/serverless-template-stable-diffusion.
-
-
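The `$timings` shape quoted above (`{ $timings: { init: timeInMs, inference: timeInMs } }`) is straightforward to consume client-side. A minimal sketch, with an illustrative response object; only the `$timings` field names are taken from this page:

```javascript
// Summarize the `$timings` fields (milliseconds) from a container response.
function summarizeTimings(response) {
  const { init, inference } = response.$timings;
  const sec = (ms) => (ms / 1000).toFixed(1);
  return `init: ${sec(init)}s, inference: ${sec(inference)}s`;
}

const example = { $timings: { init: 2400, inference: 2100 } };
console.log(summarizeTimings(example)); // → "init: 2.4s, inference: 2.1s"
```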