This repository has been archived by the owner on Nov 9, 2023. It is now read-only.
What command-line parameters are you using to start the WebUI?
I had no trouble myself with float16 using: --opt-sub-quad-attention --skip-torch-cuda-test --upcast-sampling --no-half-vae --use-cpu interrogate --listen --port 64640 --api --enable-insecure-extension-access
I hardcoded fp16 everywhere because when the UNet moves from VRAM to RAM, torch allocates extra VRAM and instantly releases it, which can cause OOM. Very annoying; I have not figured out a good way to deal with this.
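As an illustration of the offloading step described above (this is a hypothetical sketch, not the extension's actual code), moving a model to CPU and then releasing torch's cached VRAM might look like this:

```python
import gc

import torch


def offload_to_cpu(model: torch.nn.Module) -> torch.nn.Module:
    """Move a model to CPU and release cached VRAM.

    Hypothetical helper: the transient allocation torch makes during the
    transfer can still OOM, but emptying the cache afterwards at least
    returns the freed blocks to the driver.
    """
    model.to('cpu')
    gc.collect()
    if torch.cuda.is_available():
        torch.cuda.empty_cache()  # hand cached blocks back to the CUDA driver
    return model
```

This does not prevent the spike during the transfer itself, which is the problem described above; it only cleans up afterwards.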
To make this work on Apple M1, I had to change float16 to float32.
Why this is needed is beyond my understanding, but maybe you can shed some light / update the script to work when fp32 is needed.
Thanks!
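One way to avoid hardcoding either precision would be to pick the dtype from the available backend. This is only a sketch of the idea, not the extension's code; `preferred_dtype` is a hypothetical helper, and the assumption is that the MPS backend on Apple Silicon is the case where float32 is required:

```python
import torch


def preferred_dtype() -> torch.dtype:
    """Choose a model dtype based on the available backend.

    Assumption: on Apple Silicon (MPS backend) some ops have historically
    misbehaved in float16, so fall back to float32 there; use float16 on
    CUDA, and float32 on plain CPU.
    """
    if torch.backends.mps.is_available():
        return torch.float32
    if torch.cuda.is_available():
        return torch.float16
    return torch.float32
```

The hardcoded `float16` calls in the script could then be replaced with `preferred_dtype()` so the same file works on both CUDA and M1 machines.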
```python
from pathlib import Path

import torch
from modules import scripts, script_callbacks, devices, sd_models, sd_models_config
import gradio as gr
import sgm.modules.diffusionmodules.denoiser_scaling
import sgm.modules.diffusionmodules.discretizer
from safetensors.torch import load_file
from sgm.modules.diffusionmodules.wrappers import OPENAIUNETWRAPPER
from sgm.util import (
    disabled_train,
    get_obj_from_str,
    instantiate_from_config,
)


def safe_import(import_name, pkg_name=None):
    try:
        __import__(import_name)
    except Exception:
        pkg_name = pkg_name or import_name
        import pip
        if hasattr(pip, 'main'):
            pip.main(['install', pkg_name])
        else:
            pip._internal.main(['install', pkg_name])
        __import__(import_name)


safe_import('omegaconf')
from omegaconf import DictConfig, OmegaConf

config_path = Path(__file__).parent.resolve() / '../config.yaml'


class Refiner(scripts.Script):
    def __init__(self):
        super().__init__()
        if not config_path.exists():
            open(config_path, 'w').close()
        self.config: DictConfig = OmegaConf.load(config_path)
        self.callback_set = False
        self.model = None
        self.conditioner = None
        self.base = None
        self.swapped = False
        self.model_name = ''
```
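As a side note on the `safe_import` helper above: `pip.main` is not a supported API in recent pip releases, and pip's own documentation recommends invoking it as a subprocess instead. A more robust variant (a hypothetical rewrite, not part of the extension) could look like this:

```python
import importlib
import subprocess
import sys


def safe_import(import_name, pkg_name=None):
    """Import a module, installing its package first if it is missing.

    Uses `python -m pip install` in a subprocess rather than the
    unsupported `pip.main` API, and returns the imported module.
    """
    try:
        return importlib.import_module(import_name)
    except ImportError:
        subprocess.check_call(
            [sys.executable, '-m', 'pip', 'install', pkg_name or import_name]
        )
        return importlib.import_module(import_name)
```

Usage is the same as in the script (`safe_import('omegaconf')`), with the bonus that the module object is returned directly.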