fooocus generates past/wrong image #207

Open
dscsqrl opened this issue Nov 29, 2024 · 0 comments
dscsqrl commented Nov 29, 2024

Hi! I have Fooocus set up on a Linux server, running as a daemon that activates a Python virtual environment under a dedicated user, with Nginx configured as a reverse proxy. I've noticed a strange bug: if I switch the model in the web interface and start a generation while another person is also generating an image, my generation hangs and the other person receives my image instead of theirs. Nothing happens on my side until I refresh the page. Any ideas?
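
For context, the service is roughly shaped like the sketch below (a minimal illustration only: the user name, paths, port, and launch flags are placeholders, not my exact configuration):

[Unit]
Description=Fooocus web UI (illustrative unit, not the real one)
After=network.target

[Service]
Type=simple
User=fooocus                          # dedicated service user (name assumed)
WorkingDirectory=/opt/fooocus         # assumed install path
# Running the venv's interpreter directly has the same effect as activating the venv first.
ExecStart=/opt/fooocus/venv/bin/python launch.py --listen 127.0.0.1 --port 7860
Restart=on-failure

[Install]
WantedBy=multi-user.target

Nginx then proxies the public hostname to that local port.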

My Fooocus settings:
{
  "advanced_mode": true,
  "image_number": 1,
  "save_metadata_json": true,
  "save_metadata_image": true,
  "output_format": "jpg",
  "seed_random": true,
  "same_seed_for_all": false,
  "seed": 0,
  "styles": ["Default (Slightly Cinematic)"],
  "prompt_expansion": true,
  "prompt": "",
  "negative_prompt": "",
  "performance": "Speed",
  "custom_steps": 24,
  "custom_switch": 0.75,
  "img2img_mode": false,
  "img2img_start_step": 0.06,
  "img2img_denoise": 0.94,
  "img2img_scale": 1.0,
  "control_lora_canny": false,
  "canny_edge_low": 0.2,
  "canny_edge_high": 0.8,
  "canny_start": 0.0,
  "canny_stop": 0.4,
  "canny_strength": 0.8,
  "canny_model": "control-lora-canny-rank128.safetensors",
  "control_lora_depth": false,
  "depth_start": 0.0,
  "depth_stop": 0.4,
  "depth_strength": 0.8,
  "depth_model": "control-lora-depth-rank128.safetensors",
  "keep_input_names": true,
  "revision": false,
  "positive_prompt_strength": 1.0,
  "negative_prompt_strength": 1.0,
  "revision_strength_1": 1.0,
  "revision_strength_2": 1.0,
  "revision_strength_3": 1.0,
  "revision_strength_4": 1.0,
  "resolution": "1152×896 (9:7)",
  "sampler": "dpmpp_2m_sde_gpu",
  "scheduler": "karras",
  "cfg": 7.0,
  "base_clip_skip": -2,
  "refiner_clip_skip": -2,
  "sharpness": 2.0,
  "base_model": "sd_xl_base_1.0_0.9vae.safetensors",
  "refiner_model": "sd_xl_refiner_1.0_0.9vae.safetensors",
  "lora_1_model": "sd_xl_offset_example-lora_1.0.safetensors",
  "lora_1_weight": 0.5,
  "lora_2_model": "None",
  "lora_2_weight": 0.5,
  "lora_3_model": "None",
  "lora_3_weight": 0.5,
  "lora_4_model": "None",
  "lora_4_weight": 0.5,
  "lora_5_model": "None",
  "lora_5_weight": 0.5,
  "freeu": true,
  "freeu_b1": 1.01,
  "freeu_b2": 1.02,
  "freeu_s1": 0.99,
  "freeu_s2": 0.95
}

Even if you run Fooocus without Nginx, directly on its port, the issue persists.
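
By "directly on the port" I mean launching it along these lines (flag names and paths here are only placeholders for whatever the stock launcher accepts):

source /opt/fooocus/venv/bin/activate
# bind the UI to a port directly, with no reverse proxy in front
python launch.py --listen 0.0.0.0 --port 7860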
