novelai-storage / Stable Diffusion Webui / Commits

Commit 0dca0db7, authored Sep 03, 2022 by AUTOMATIC
Update to support embeddings with length greater than 1.
Parent: 4cafad66

Showing 2 changed files, with 10 additions and 7 deletions (+10 -7):

webui.bat  +1 -1
webui.py   +9 -6
webui.bat

@@ -7,7 +7,7 @@ set VENV_DIR=venv
 mkdir tmp 2>NUL
 
-set TORCH_COMMAND=pip install torch --extra-index-url https://download.pytorch.org/whl/cu113
+set TORCH_COMMAND=pip install torch==1.12.1+cu113 --extra-index-url https://download.pytorch.org/whl/cu113
 set REQS_FILE=requirements_versions.txt
 
 %PYTHON% -c "" >tmp/stdout.txt 2>tmp/stderr.txt
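The pin matches the compatibility note in webui.py below, whose comment flags torch 1.12.1 as changing how files saved under torch 1.11 load. As a purely illustrative sanity check (not part of the commit), one could confirm the environment actually picked up the pinned build:

# Illustrative check, not part of the commit: confirm the pinned torch build
# is the one actually installed before debugging embedding loading.
import torch

assert torch.__version__.startswith("1.12.1"), f"unexpected torch {torch.__version__}"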
webui.py

@@ -746,9 +746,9 @@ class StableDiffusionModelHijack:
         if hasattr(param_dict, '_parameters'):
             param_dict = getattr(param_dict, '_parameters')  # fix for torch 1.12.1 loading saved file from torch 1.11
         assert len(param_dict) == 1, 'embedding file has multiple terms in it'
-        emb = next(iter(param_dict.items()))[1].reshape(768)
-        self.word_embeddings[name] = emb
-        self.word_embeddings_checksums[name] = f'{const_hash(emb)&0xffff:04x}'
+        emb = next(iter(param_dict.items()))[1]
+        self.word_embeddings[name] = emb.detach()
+        self.word_embeddings_checksums[name] = f'{const_hash(emb.reshape(-1))&0xffff:04x}'
 
         ids = tokenizer([name], add_special_tokens=False)['input_ids'][0]
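The effect: the loader previously flattened every embedding to a single 768-dim vector with .reshape(768), which fails for files holding more than one vector; it now keeps the stored tensor as-is (detached), so an [n, 768] matrix loads with one row per token slot, and the checksum is computed over the flattened values. A minimal sketch of the new behavior with a synthetic param_dict (the term name '*' and the shapes are assumptions for illustration, not taken from the commit):

# Sketch, not from the commit: a hypothetical 3-vector embedding.
import torch

param_dict = {'*': torch.nn.Parameter(torch.randn(3, 768))}

emb = next(iter(param_dict.items()))[1]
# Old behavior: emb.reshape(768) would raise here, since 3*768 values
# cannot form a single 768-dim vector.
stored = emb.detach()        # new behavior keeps the full [3, 768] tensor
print(stored.shape)          # torch.Size([3, 768])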
@@ -838,9 +838,10 @@ class FrozenCLIPEmbedderWithCustomWords(torch.nn.Module):
                 found = False
                 for ids, word in possible_matches:
                     if tokens[i:i+len(ids)] == ids:
+                        emb_len = int(self.hijack.word_embeddings[word].shape[0])
                         fixes.append((len(remade_tokens), word))
-                        remade_tokens.append(777)
-                        multipliers.append(mult)
+                        remade_tokens += [0] * emb_len
+                        multipliers += [mult] * emb_len
                         i += len(ids) - 1
                         found = True
                         used_custom_terms.append((word, self.hijack.word_embeddings_checksums[word]))
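Where a matched custom term used to consume a single placeholder token (id 777), it now reserves one slot per embedding row: emb_len placeholder ids (0) and emb_len copies of the attention multiplier, with the fix recorded at the position of the first slot. A self-contained sketch with assumed values (the token ids, term name, and multiplier are all illustrative):

# Sketch, not from the commit: how a 3-row embedding expands the token stream.
remade_tokens = [320, 1125]          # tokens already emitted (hypothetical ids)
multipliers = [1.0, 1.0]
fixes = []

word, emb_len, mult = 'my-style', 3, 1.1   # hypothetical custom term

fixes.append((len(remade_tokens), word))   # patch position: first reserved slot
remade_tokens += [0] * emb_len             # one placeholder per embedding row
multipliers += [mult] * emb_len

assert remade_tokens == [320, 1125, 0, 0, 0]
assert multipliers == [1.0, 1.0, 1.1, 1.1, 1.1]
assert fixes == [(2, 'my-style')]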
@@ -903,7 +904,9 @@ class EmbeddingsWithFixes(nn.Module):
         if batch_fixes is not None:
             for fixes, tensor in zip(batch_fixes, inputs_embeds):
                 for offset, word in fixes:
-                    tensor[offset] = self.embeddings.word_embeddings[word]
+                    emb = self.embeddings.word_embeddings[word]
+                    emb_len = min(tensor.shape[0] - offset, emb.shape[0])
+                    tensor[offset:offset+emb_len] = self.embeddings.word_embeddings[word][0:emb_len]
 
         return inputs_embeds
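At injection time, the forward pass now writes the whole [emb_len, 768] block into the prompt's embedding tensor instead of a single row, clamping emb_len so a long embedding near the end of the sequence cannot write past the last token slot. A sketch with assumed shapes (the 77-token CLIP context length and the offset are illustrative):

# Sketch, not from the commit: clamped multi-row injection.
import torch

tensor = torch.zeros(77, 768)   # one prompt's token embeddings
emb = torch.randn(3, 768)       # hypothetical 3-row custom embedding
offset = 75                     # placeholder slots start near the end

emb_len = min(tensor.shape[0] - offset, emb.shape[0])   # clamps 3 -> 2 here
tensor[offset:offset + emb_len] = emb[0:emb_len]        # no out-of-bounds write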