novelai-storage / Stable Diffusion Webui / Commits

Commit dc390618, authored Jul 13, 2023 by AUTOMATIC1111
thank you linter
parent 6c5f83b1
1 changed file with 3 additions and 3 deletions

extensions-builtin/Lora/lora.py (+3, -3)
@@ -229,9 +229,9 @@ def load_lora(name, lora_on_disk):
         elif type(sd_module) == torch.nn.Conv2d and weight.shape[2:] == (3, 3):
             module = torch.nn.Conv2d(weight.shape[1], weight.shape[0], (3, 3), bias=False)
         else:
-            print(f'Lora layer {key_diffusers} matched a layer with unsupported type: {type(sd_module).__name__}')
+            print(f'Lora layer {key_lora} matched a layer with unsupported type: {type(sd_module).__name__}')
             continue
-            raise AssertionError(f"Lora layer {key_diffusers} matched a layer with unsupported type: {type(sd_module).__name__}")
+            raise AssertionError(f"Lora layer {key_lora} matched a layer with unsupported type: {type(sd_module).__name__}")
 
         with torch.no_grad():
             module.weight.copy_(weight)
@@ -243,7 +243,7 @@ def load_lora(name, lora_on_disk):
         elif lora_key == "lora_down.weight":
             lora_module.down = module
         else:
-            raise AssertionError(f"Bad Lora layer name: {key_diffusers} - must end in lora_up.weight, lora_down.weight or alpha")
+            raise AssertionError(f"Bad Lora layer name: {key_lora} - must end in lora_up.weight, lora_down.weight or alpha")
 
     if keys_failed_to_match:
         print(f"Failed to match keys when loading Lora {lora_on_disk.filename}: {keys_failed_to_match}")
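The bug class this commit fixes, a stale variable name surviving a rename but lingering inside f-strings, is exactly what a static linter catches, since f-string placeholders are real expressions. A toy stdlib-only sketch of such a check (not the actual linter the commit title thanks, which the source does not name):

```python
import ast
import builtins

def undefined_fstring_names(source: str) -> set[str]:
    """Flag names used inside f-string placeholders that are never bound
    anywhere in the module (a rough approximation of a linter's
    undefined-name check; ignores scoping and format specs)."""
    tree = ast.parse(source)
    # Collect every name the module binds, plus builtins.
    bound = set(dir(builtins))
    for node in ast.walk(tree):
        if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
            bound.add(node.id)
        elif isinstance(node, ast.arg):
            bound.add(node.arg)
        elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            bound.add(node.name)
        elif isinstance(node, ast.alias):
            bound.add(node.asname or node.name.split(".")[0])
    # FormattedValue is one {...} slot of an f-string (JoinedStr).
    used = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.FormattedValue):
            for sub in ast.walk(node.value):
                if isinstance(sub, ast.Name):
                    used.add(sub.id)
    return used - bound

snippet = (
    'key_lora = "lora_unet_conv_in"\n'
    'print(f"Bad Lora layer name: {key_diffusers} - must end in lora_up.weight")\n'
)
print(undefined_fstring_names(snippet))  # → {'key_diffusers'}
```

Run on a module where a parent commit renamed `key_diffusers` to `key_lora` everywhere except the messages, this reports exactly the three stale references the diff above fixes.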