- 28 Jun, 2023 1 commit

- hako-mikan authored
- 27 Jun, 2023 7 commits

- AUTOMATIC authored
- AUTOMATIC1111 authored: Zoom and pan: More options in the settings and improved error output
- AUTOMATIC1111 authored: fixed typos
- AUTOMATIC1111 authored: Use os.makedirs(..., exist_ok=True)
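  The `os.makedirs(..., exist_ok=True)` change is the standard replacement for the racy check-then-create idiom; a minimal sketch (the directory name is a placeholder, not from the source):

  ```python
  import os

  out_dir = "outputs/txt2img-images"  # placeholder path for illustration

  # Racy idiom: another process can create the directory between the
  # check and the makedirs call, which then raises FileExistsError.
  if not os.path.exists(out_dir):
      os.makedirs(out_dir)

  # Safe equivalent: succeeds silently if the directory already exists.
  os.makedirs(out_dir, exist_ok=True)
  ```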
- AUTOMATIC1111 authored: Fix typo in hints.js
- AUTOMATIC1111 authored: Strip whitespaces from URL and dirname prior to extension installation
- AUTOMATIC authored
- 25 Jun, 2023 1 commit

- Jabasukuriputo Wang authored: This avoids some cryptic errors caused by accidental spaces around URLs
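  Together with the "Strip whitespaces" commit above, this amounts to sanitizing the pasted URL and target directory name before the extension is cloned; a minimal sketch, with the function name and inputs assumed for illustration:

  ```python
  def normalize_extension_input(url: str, dirname: str) -> tuple[str, str]:
      """Strip accidental surrounding whitespace from a pasted URL and
      target directory name before the extension is installed."""
      return url.strip(), dirname.strip()

  url, dirname = normalize_extension_input(
      " https://github.com/user/extension.git ",  # note the pasted spaces
      "extension ",
  )
  assert url == "https://github.com/user/extension.git"
  assert dirname == "extension"
  ```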
- 18 Jun, 2023 1 commit

- zhtttylz authored
- 14 Jun, 2023 2 commits

- Danil Boldyrev authored
- Danil Boldyrev authored
- 13 Jun, 2023 3 commits

- Aarni Koskela authored
- Danil Boldyrev authored
- Danil Boldyrev authored
- 10 Jun, 2023 9 commits

- arch-fan authored
- AUTOMATIC authored
- AUTOMATIC authored
- AUTOMATIC authored
- AUTOMATIC1111 authored: Allow activation of Generate Forever during generation
- AUTOMATIC1111 authored: persistent conds cache
- AUTOMATIC1111 authored: Don't die when a LoRA is a broken symlink
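  A minimal sketch of such a guard; the listing function and file extensions are assumptions, not the repository's actual code:

  ```python
  import os

  def list_lora_files(lora_dir: str) -> list[str]:
      """Collect LoRA weight files, skipping broken symlinks.

      os.path.exists() follows symlinks and returns False when the target
      is missing, so a dangling link is filtered out here instead of
      raising later when the file is opened.
      """
      found = []
      for name in os.listdir(lora_dir):
          path = os.path.join(lora_dir, name)
          if not os.path.exists(path):  # broken symlink: skip, don't crash
              continue
          if name.endswith((".safetensors", ".pt", ".ckpt")):
              found.append(path)
      return found
  ```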
- AUTOMATIC1111 authored: Split mask blur into X and Y components, patch Outpainting MK2 accordingly
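  A minimal sketch of per-axis mask blur via two separable 1-D Gaussian passes with OpenCV; the function name and the kernel-size heuristic are assumptions:

  ```python
  import cv2
  import numpy as np

  def blur_mask(mask: np.ndarray, blur_x: float, blur_y: float) -> np.ndarray:
      """Blur a mask with independent horizontal and vertical strength."""
      if blur_x > 0:
          kx = 2 * int(4 * blur_x + 0.5) + 1  # odd kernel covering ~4 sigma
          mask = cv2.GaussianBlur(mask, (kx, 1), sigmaX=blur_x)
      if blur_y > 0:
          ky = 2 * int(4 * blur_y + 0.5) + 1
          mask = cv2.GaussianBlur(mask, (1, ky), sigmaX=0, sigmaY=blur_y)
      return mask
  ```

  Keeping the passes separable means blur_x=0 or blur_y=0 cleanly disables blur along that axis, which is presumably what the Outpainting MK2 patch relies on.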
- AUTOMATIC1111 authored: Forcing Torch version to 1.13.1 for RX 5000 series GPUs
- 09 Jun, 2023 3 commits

- Aarni Koskela authored: Fixes #11098
- Splendide Imaginarius authored: Fixes unexpected noise in non-outpainted borders when using the MK2 script
- Splendide Imaginarius authored: Prerequisite to fixing the Outpainting MK2 mask blur bug
- 08 Jun, 2023 2 commits

- 07 Jun, 2023 3 commits

- AUTOMATIC1111 authored: link footer API to Wiki when API is not active
- AUTOMATIC1111 authored: Fix upcast attention dtype error
- Alexander Ljungberg authored: Without this fix, enabling the "Upcast cross attention layer to float32" option while also using `--opt-sdp-attention` breaks generation with an error:

  ```
  File "/ext3/automatic1111/stable-diffusion-webui/modules/sd_hijack_optimizations.py", line 612, in sdp_attnblock_forward
    out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
  RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: float key.dtype: float and value.dtype: c10::Half instead.
  ```

  The fix is to make sure to upcast the value tensor too.
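  A minimal sketch of the fix described above, assuming the surrounding attention code already upcasts q and k but previously left v in half precision (the function name is illustrative, not the repository's):

  ```python
  import torch

  def sdp_attnblock_forward_upcast(q, k, v):
      """Scaled-dot-product attention with all three inputs upcast to the
      same dtype; the result is cast back to the caller's precision."""
      dtype = torch.float32
      out = torch.nn.functional.scaled_dot_product_attention(
          q.to(dtype), k.to(dtype), v.to(dtype),  # upcast v too, not just q and k
          dropout_p=0.0, is_causal=False,
      )
      return out.to(v.dtype)  # back to e.g. float16 for the rest of the block
  ```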
- 06 Jun, 2023 8 commits
