Vladimir Mandic edited this page Dec 11, 2024 · 59 revisions

Welcome to SD.Next

Step 1: Installation

Tip

For platform-specific information, check out:
WSL | Intel Arc | DirectML | OpenVINO | ONNX & Olive | ZLUDA | AMD ROCm | MacOS | nVidia | Docker

Warning

If you run into issues, check out the troubleshooting and debugging guides

Step 2: Experiment

SD.Next supports a broad range of model types

A good starting point is the built-in Reference model list, which includes a variety of models that are auto-downloaded and configured on first access.

Additionally, you can either download your own models or use the built-in CivitAI and HuggingFace model downloaders.

Tip

For additional model-specific information, check out:
Stable Diffusion XL | Stable Diffusion 3.x | Stable Cascade | FLUX.1 | LCM

Step 3: Learn more

Guides and tutorials

  • Getting started
  • Overview of main features
  • Frequently Asked Questions (FAQ)
  • Prompting guide: what is the syntax and how to get the best results
  • Networks interface overview and search
  • What is LoRA and how to use it guide
  • Control guide and overview: what is it and how to use it
  • Styles guide: how to use them
  • Wildcards how-to: use them to your advantage
  • XYZ grid how-to: experiment with different parameters
  • Image processing overview: postprocess your existing images
  • Intro to IP Adapters: image guidance
  • Work with models: validate, convert, download, etc.
  • Overview of built-in scripts that provide additional functionality

Advanced model handling

  • Some models may require additional access checks before they can be used: Gated access
  • Learn how to optimize your environment: Performance tuning
  • Learn how to manage your resources better
  • Overview of how different compile/device settings impact performance: Benchmarks

User interface

Behind-the-scenes

If you want to use SD.Next via the API or the CLI,
check out the tools examples
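As a rough illustration, a text-to-image request can be sent to a locally running SD.Next instance over its HTTP API. This is a minimal sketch, assuming a server started with the `--api` flag listening on the default `127.0.0.1:7860` address and exposing the A1111-compatible `/sdapi/v1/txt2img` endpoint; the payload fields shown are a small subset, so verify names and defaults against your installation and the tools examples.

```python
# Minimal sketch of calling a local SD.Next txt2img API endpoint.
# Assumptions (verify against your install): server started with --api,
# listening on 127.0.0.1:7860, A1111-compatible /sdapi/v1/txt2img route.
import base64
import json
from urllib import request


def build_payload(prompt: str, steps: int = 20,
                  width: int = 512, height: int = 512) -> dict:
    """Assemble a minimal txt2img request body."""
    return {"prompt": prompt, "steps": steps,
            "width": width, "height": height}


def txt2img(prompt: str, url: str = "http://127.0.0.1:7860") -> bytes:
    """POST the payload and decode the first returned base64-encoded image."""
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = request.Request(url + "/sdapi/v1/txt2img", data=data,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        result = json.load(resp)
    # The response carries images as a list of base64 strings.
    return base64.b64decode(result["images"][0])
```

Usage would be something like `open("out.png", "wb").write(txt2img("a watercolor fox"))`; authentication, sampler selection, and model switching are handled by additional endpoints and payload fields documented in the tools examples.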

If you want to understand more about how Stable Diffusion works

Step 4: Contribute
