novelai-storage / Basedformer

Commit 0879cf91
authored Apr 06, 2022 by novelailab

fix formatting

parent 4141d527

Showing 1 changed file with 4 additions and 0 deletions (+4, -0)

basedformer/optimizer.py @ 0879cf91
@@ -41,12 +41,15 @@ class BasedOptimizer:
        if optimizer == "adamw":
            self.optimizer = optim.AdamW(parameters, lr=0, weight_decay=self.weight_decay, betas=(self.beta1, self.beta2), eps=self.eps)
        elif optimizer == "adamw8bit":
            import bitsandbytes as bnb
            self.optimizer = bnb.optim.Adam8bit(parameters, lr=0, weight_decay=self.weight_decay, betas=(self.beta1, self.beta2), eps=self.eps)
        elif optimizer == "adafactor":
            try:
                from transformers.optimization import Adafactor
            except ImportError:
                raise ImportError("Please install transformers for Adafactor")
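Note that every optimizer above is constructed with lr=0, which suggests the learning rate is driven externally rather than fixed at construction. The following is a minimal sketch (not code from this repo) of that pattern using plain torch.optim, assuming the rate is written into the optimizer's param_groups each step; the set_lr helper and the warmup schedule are hypothetical.

# Minimal sketch: driving an optimizer built with lr=0 by writing the
# scheduled rate into its param_groups before each step.
import torch
from torch import nn, optim

model = nn.Linear(16, 16)
optimizer = optim.AdamW(model.parameters(), lr=0, weight_decay=0.01,
                        betas=(0.9, 0.95), eps=1e-8)

def set_lr(opt, lr):
    # Hypothetical helper; the real BasedOptimizer may expose something different.
    for group in opt.param_groups:
        group["lr"] = lr

for step_num in range(1, 101):
    lr = min(step_num / 10, 1.0) * 3e-4  # toy linear warmup to 3e-4
    set_lr(optimizer, lr)
    loss = model(torch.randn(4, 16)).pow(2).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()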
@@ -55,6 +58,7 @@ class BasedOptimizer:
    def step(self, scaler=None):
        if scaler:
            scaler.step(self.optimizer)
        else:
            self.optimizer.step()
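The optional scaler argument in step matches the torch.cuda.amp.GradScaler API, where scaler.step(optimizer) unscales the gradients and skips the update if they contain infs or NaNs. A minimal sketch of that usage follows, under the assumption that the wrapped optimizer behaves like any torch optimizer; the model, loss, and CUDA setup here are illustrative only, not taken from this repo.

# Minimal sketch: mixed-precision training loop around a step(scaler=...)-style
# optimizer wrapper, using torch.cuda.amp (requires a CUDA device).
import torch
from torch import nn, optim
from torch.cuda.amp import GradScaler, autocast

model = nn.Linear(16, 16).cuda()
optimizer = optim.AdamW(model.parameters(), lr=1e-4)
scaler = GradScaler()

for _ in range(10):
    optimizer.zero_grad()
    with autocast():
        loss = model(torch.randn(4, 16, device="cuda")).pow(2).mean()
    scaler.scale(loss).backward()  # scale the loss before backward
    scaler.step(optimizer)         # unscales grads, skips the step on inf/nan
    scaler.update()                # adjust the loss scale for the next iteration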