novelai-storage / Stable Diffusion Webui / Commits

Commit 1792e193
authored Mar 16, 2024 by Kohaku-Blueleaf

Use correct implementation, fix device error

parent 851c3d51

Showing 1 changed file with 15 additions and 3 deletions (+15 -3)

extensions-builtin/Lora/network.py
@@ -153,7 +153,7 @@ class NetworkModule:
         self.scale = weights.w["scale"].item() if "scale" in weights.w else None
 
         self.dora_scale = weights.w.get("dora_scale", None)
-        self.dora_mean_dim = tuple(i for i in range(len(self.shape)) if i != 1)
+        self.dora_norm_dims = len(self.shape) - 1
 
     def multiplier(self):
         if 'transformer' in self.sd_key[:20]:
@@ -170,10 +170,22 @@ class NetworkModule:
         return 1.0
 
     def apply_weight_decompose(self, updown, orig_weight):
-        orig_weight = orig_weight.to(updown)
+        # Match the device/dtype
+        orig_weight = orig_weight.to(updown.dtype)
+        dora_scale = self.dora_scale.to(device=orig_weight.device, dtype=updown.dtype)
+        updown = updown.to(orig_weight.device)
+
         merged_scale1 = updown + orig_weight
+        merged_scale1_norm = (
+            merged_scale1.transpose(0, 1)
+            .reshape(merged_scale1.shape[1], -1)
+            .norm(dim=1, keepdim=True)
+            .reshape(merged_scale1.shape[1], *[1] * self.dora_norm_dims)
+            .transpose(0, 1)
+        )
+
         dora_merged = (
-            merged_scale1 / merged_scale1(dim=self.dora_mean_dim, keepdim=True) * self.dora_scale
+            merged_scale1 * (dora_scale / merged_scale1_norm)
         )
         final_updown = dora_merged - orig_weight
         return final_updown
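
For context, below is a minimal standalone sketch (not part of the webui codebase) of what the new apply_weight_decompose computes, assuming a plain 2D Linear weight and a dora_scale stored with a broadcastable shape of (1, in_features); the helper name decompose_norm, the toy shapes, and the dtype mix are illustrative assumptions, not code from the repository. The transpose/reshape/norm chain added in the commit is just a broadcast-friendly way of taking the norm over every dimension except dim 1, which is why the old dora_mean_dim tuple can be replaced by the simple count dora_norm_dims = len(self.shape) - 1.

# Minimal sketch of the DoRA weight-decompose math from this commit.
# Assumptions (not from the repo): a 2D Linear weight, dora_scale of shape
# (1, in_features), and the helper name decompose_norm.
import torch


def decompose_norm(w: torch.Tensor, norm_dims: int) -> torch.Tensor:
    # Norm of w over every dimension except dim 1, returned in a shape that
    # broadcasts against w; mirrors the transpose/reshape/norm chain in the diff.
    return (
        w.transpose(0, 1)
        .reshape(w.shape[1], -1)
        .norm(dim=1, keepdim=True)
        .reshape(w.shape[1], *[1] * norm_dims)
        .transpose(0, 1)
    )


torch.manual_seed(0)
out_features, in_features = 4, 3
orig_weight = torch.randn(out_features, in_features, dtype=torch.float64)    # e.g. checkpoint weight
updown = 0.01 * torch.randn(out_features, in_features, dtype=torch.float32)  # LoRA delta
dora_scale = torch.rand(1, in_features, dtype=torch.float64) + 0.5           # learned magnitude (assumed shape)

# "fix device error": align device/dtype before any arithmetic,
# exactly as the added lines in the commit do.
orig_weight = orig_weight.to(updown.dtype)
dora_scale = dora_scale.to(device=orig_weight.device, dtype=updown.dtype)
updown = updown.to(orig_weight.device)

merged_scale1 = updown + orig_weight
merged_scale1_norm = decompose_norm(merged_scale1, norm_dims=merged_scale1.dim() - 1)

# Rescale the merged weight so its per-dim-1 magnitude becomes dora_scale,
# then keep only the delta that sits on top of the original weight.
dora_merged = merged_scale1 * (dora_scale / merged_scale1_norm)
final_updown = dora_merged - orig_weight

# Sanity check: for a 2D weight the reshape trick equals a plain norm over dim 0.
assert torch.allclose(merged_scale1_norm, merged_scale1.norm(dim=0, keepdim=True))
print(final_updown.dtype, final_updown.shape)

Note that where the learned magnitude already equals the merged weight's norm, the factor dora_scale / merged_scale1_norm is 1 and final_updown collapses back to the plain LoRA delta.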