novelai-storage / Stable Diffusion Webui

Commit dab5002c
authored Apr 13, 2023 by Brad Smith
sort self.word_embeddings without instantiating it a new dict
parent 27b9ec60
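The change relies on plain dict behavior rather than collections.OrderedDict: since Python 3.7, regular dicts preserve insertion order, so rebuilding the mapping from values sorted by name is enough to get alphabetical iteration. A minimal sketch with toy data (not code from the webui itself):

    # Plain dicts keep insertion order (Python 3.7+), so inserting entries in
    # sorted order yields alphabetical iteration without OrderedDict.
    embeddings = {"Zeta": 3, "alpha": 1, "Beta": 2}
    by_name = {name: value for name, value in sorted(embeddings.items(), key=lambda kv: kv[0].lower())}
    print(list(by_name))  # ['alpha', 'Beta', 'Zeta']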
Showing 1 changed file with 6 additions and 3 deletions:

modules/textual_inversion/textual_inversion.py (+6, -3)
modules/textual_inversion/textual_inversion.py @ dab5002c

@@ -2,7 +2,7 @@ import os
 import sys
 import traceback
 import inspect
-from collections import namedtuple, OrderedDict
+from collections import namedtuple
 
 import torch
 import tqdm
@@ -108,7 +108,7 @@ class DirWithTextualInversionEmbeddings:
 class EmbeddingDatabase:
     def __init__(self):
         self.ids_lookup = {}
-        self.word_embeddings = OrderedDict()
+        self.word_embeddings = {}
         self.skipped_embeddings = {}
         self.expected_shape = -1
         self.embedding_dirs = {}
@@ -234,7 +234,10 @@ class EmbeddingDatabase:
             embdir.update()
 
         # re-sort word_embeddings because load_from_dir may not load in alphabetic order.
-        self.word_embeddings = {e.name: e for e in sorted(self.word_embeddings.values(), key=lambda e: e.name.lower())}
+        # using a temporary copy so we don't reinitialize self.word_embeddings in case other objects have a reference to it.
+        sorted_word_embeddings = {e.name: e for e in sorted(self.word_embeddings.values(), key=lambda e: e.name.lower())}
+        self.word_embeddings.clear()
+        self.word_embeddings.update(sorted_word_embeddings)
 
         displayed_embeddings = (tuple(self.word_embeddings.keys()), tuple(self.skipped_embeddings.keys()))
         if self.previously_displayed_embeddings != displayed_embeddings:
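The comment added in the last hunk gives the rationale: other objects may hold a reference to self.word_embeddings, so rebinding the attribute to a freshly built dict would leave those references pointing at a stale, unsorted object, while clearing and updating the existing dict mutates it in place and keeps every reference current. A minimal sketch of the difference, using a hypothetical Database class rather than the real EmbeddingDatabase:

    class Database:
        def __init__(self):
            self.word_embeddings = {"b": 2, "a": 1}

    # Rebinding: a new dict is created, so an existing reference goes stale.
    db = Database()
    view = db.word_embeddings                      # another object holds this reference
    db.word_embeddings = dict(sorted(db.word_embeddings.items()))
    print(list(view))                              # ['b', 'a'] -- still the old, unsorted dict

    # Mutating in place: the same object is re-filled, so the reference stays current.
    db = Database()
    view = db.word_embeddings
    sorted_copy = dict(sorted(db.word_embeddings.items()))
    db.word_embeddings.clear()
    db.word_embeddings.update(sorted_copy)
    print(list(view))                              # ['a', 'b'] -- same dict object, now sorted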