novelai-storage / Stable Diffusion Webui · Commits

Commit c23f666d
authored Oct 21, 2022 by AUTOMATIC
a more strict check for activation type and a more reasonable check for type of layer in hypernets
parent a26fc283
Showing 1 changed file with 9 additions and 3 deletions:

modules/hypernetworks/hypernetwork.py  (+9, -3)
modules/hypernetworks/hypernetwork.py  (view file @ c23f666d)

@@ -32,10 +32,16 @@ class HypernetworkModule(torch.nn.Module):
         linears = []
         for i in range(len(layer_structure) - 1):
             linears.append(torch.nn.Linear(int(dim * layer_structure[i]), int(dim * layer_structure[i+1])))
+
             if activation_func == "relu":
                 linears.append(torch.nn.ReLU())
-            if activation_func == "leakyrelu":
+            elif activation_func == "leakyrelu":
                 linears.append(torch.nn.LeakyReLU())
+            elif activation_func == 'linear' or activation_func is None:
+                pass
+            else:
+                raise RuntimeError(f'hypernetwork uses an unsupported activation function: {activation_func}')
+
             if add_layer_norm:
                 linears.append(torch.nn.LayerNorm(int(dim * layer_structure[i+1])))
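Before this change, an unrecognized activation_func string was silently ignored and the module was built with no activation between layers; the elif/else chain above fails fast instead. A minimal standalone sketch of the stricter dispatch (the helper name and the Sequential wrapping are assumptions, not part of the diff):

import torch

def build_linears(dim, layer_structure, activation_func=None, add_layer_norm=False):
    # Mirrors the hunk above: an unknown activation now raises instead of
    # being silently skipped.
    linears = []
    for i in range(len(layer_structure) - 1):
        linears.append(torch.nn.Linear(int(dim * layer_structure[i]), int(dim * layer_structure[i + 1])))

        if activation_func == "relu":
            linears.append(torch.nn.ReLU())
        elif activation_func == "leakyrelu":
            linears.append(torch.nn.LeakyReLU())
        elif activation_func == 'linear' or activation_func is None:
            pass
        else:
            raise RuntimeError(f'hypernetwork uses an unsupported activation function: {activation_func}')

        if add_layer_norm:
            linears.append(torch.nn.LayerNorm(int(dim * layer_structure[i + 1])))

    return torch.nn.Sequential(*linears)

build_linears(768, [1, 2, 1], activation_func="relu")   # builds fine
build_linears(768, [1, 2, 1], activation_func="swish")  # now raises RuntimeError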
@@ -46,7 +52,7 @@ class HypernetworkModule(torch.nn.Module):
             self.load_state_dict(state_dict)
         else:
             for layer in self.linear:
-                if not "ReLU" in layer.__str__():
+                if type(layer) == torch.nn.Linear:
                     layer.weight.data.normal_(mean=0.0, std=0.01)
                     layer.bias.data.zero_()
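The old predicate not "ReLU" in layer.__str__() was a proxy for "is this a Linear layer", and it breaks once any other non-activation module sits in the Sequential: with add_layer_norm enabled, a LayerNorm's repr contains no "ReLU", so the old check would also re-initialize LayerNorm's weight to N(0.0, 0.01) and zero its bias, clobbering its default ones/zeros initialization. A quick comparison of the two predicates:

import torch

for layer in [torch.nn.Linear(8, 8), torch.nn.ReLU(), torch.nn.LayerNorm(8)]:
    old = not "ReLU" in str(layer)        # True for Linear AND LayerNorm
    new = type(layer) == torch.nn.Linear  # True for Linear only
    print(type(layer).__name__, old, new)

# Linear    True  True
# ReLU      False False
# LayerNorm True  False  <- old check would have re-initialized this layer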
@@ -74,7 +80,7 @@ class HypernetworkModule(torch.nn.Module):
     def trainables(self):
         layer_structure = []
         for layer in self.linear:
-            if not "ReLU" in layer.__str__():
+            if type(layer) == torch.nn.Linear:
                 layer_structure += [layer.weight, layer.bias]
         return layer_structure
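trainables() presumably feeds the training loop's optimizer, so the same predicate swap matters here: only Linear weights and biases are collected now, where the string check would also have swept in LayerNorm parameters. A hypothetical usage sketch (constructor signature and learning rate are assumptions, not taken from this diff):

import torch

# HypernetworkModule signature assumed from the surrounding file.
module = HypernetworkModule(dim=768, layer_structure=[1, 2, 1],
                            activation_func="relu", add_layer_norm=True)
optimizer = torch.optim.AdamW(module.trainables(), lr=5e-6)  # Linear params only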