1. 07 Jul, 2023 1 commit
    • Fix UnicodeEncodeError when writing to file CLIP Interrogator Batch Mode · c258dd34
      Neil Mahseth authored
      The code snippet `print(interrogation_function(img), file=open(os.path.join(ii_output_dir, f"{left}.txt"), 'a'))` raises a UnicodeEncodeError with the message "'charmap' codec can't encode character '\u016b' in position 129". This error occurs because the default encoding used by the `open()` function cannot handle certain Unicode characters.
      
      To fix this issue, the encoding parameter needs to be explicitly specified when opening the file. By using an appropriate encoding, such as 'utf-8', we can ensure that Unicode characters are properly encoded and written to the file.
      
      The code should be updated as follows:
      
      ```python
      print(interrogation_function(img), file=open(os.path.join(ii_output_dir, f"{left}.txt"), 'a', encoding='utf-8'))
      ```
      By making this change, the code will no longer raise the UnicodeEncodeError and will correctly handle Unicode characters during the file write operation.
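      The failure and the fix can be sketched in a standalone snippet (the temporary file path and sample string below are illustrative, not taken from the webui code):
      
      ```python
      import os
      import tempfile
      
      # Sample caption containing 'ū' (U+016B), the character from the
      # reported error; narrow default codecs such as cp1252 cannot encode it.
      text = "caption with \u016b"
      
      fd, path = tempfile.mkstemp(suffix=".txt")
      os.close(fd)
      
      # The fix: pass encoding='utf-8' explicitly instead of relying on the
      # platform default (e.g. 'charmap'/cp1252 on Windows).
      with open(path, "a", encoding="utf-8") as f:
          print(text, file=f)
      
      # Read back with the same encoding to confirm the character survived.
      with open(path, encoding="utf-8") as f:
          content = f.read()
      
      os.remove(path)
      ```
      
      Without `encoding='utf-8'`, the `open(..., 'a')` call uses `locale.getpreferredencoding()`, which is where the platform-dependent 'charmap' failure comes from.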
  2. 27 Jun, 2023 7 commits
  3. 25 Jun, 2023 1 commit
  4. 18 Jun, 2023 1 commit
  5. 14 Jun, 2023 2 commits
  6. 13 Jun, 2023 3 commits
  7. 10 Jun, 2023 9 commits
  8. 09 Jun, 2023 3 commits
  9. 08 Jun, 2023 2 commits
  10. 07 Jun, 2023 3 commits
    • Merge pull request #11058 from AUTOMATIC1111/api-wiki · cf28aed1
      AUTOMATIC1111 authored
      link footer API to Wiki when API is not active
    • Merge pull request #11066 from aljungberg/patch-1 · 806ea639
      AUTOMATIC1111 authored
      Fix upcast attention dtype error.
    • Fix upcast attention dtype error. · d9cc0910
      Alexander Ljungberg authored
      Without this fix, enabling the "Upcast cross attention layer to float32" option while also using `--opt-sdp-attention` breaks generation with an error:
      
      ```
        File "/ext3/automatic1111/stable-diffusion-webui/modules/sd_hijack_optimizations.py", line 612, in sdp_attnblock_forward
          out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
      RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: float key.dtype: float and value.dtype: c10::Half instead.
      ```
      
      The fix is to upcast the value tensor as well.
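      The shape of the fix can be sketched as follows (the function name is hypothetical and the snippet upcasts unconditionally for brevity; the actual webui code only upcasts when the float32 option is enabled):
      
      ```python
      import torch
      
      def sdp_attnblock_forward_sketch(q, k, v):
          # If only q and k are upcast to float32 while v stays in half
          # precision, scaled_dot_product_attention raises the dtype
          # mismatch RuntimeError quoted above. Upcasting v too fixes it.
          orig_dtype = q.dtype
          q, k, v = q.float(), k.float(), v.float()  # upcast the value tensor as well
          out = torch.nn.functional.scaled_dot_product_attention(
              q, k, v, dropout_p=0.0, is_causal=False
          )
          return out.to(orig_dtype)
      ```
      
      Passing a float16 `v` alongside float32 `q` and `k` directly to `scaled_dot_product_attention` reproduces the error; routing all three through the same upcast avoids it.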
  11. 06 Jun, 2023 8 commits