novelai-storage / Hydra Node Http
Commit 84cfe3b5, authored Aug 06, 2022 by novelailab
Parent: c6954d2d

    prompting should work

Showing 1 changed file with 4 additions and 3 deletions: hydra_node/models.py (+4 −3)
hydra_node/models.py
@@ -34,6 +34,7 @@ def fix_batch(tensor, bs):
     return torch.stack([tensor.squeeze(0)] * bs, dim=0)
 
 # mix conditioning vectors for prompts
+# @aero
 def prompt_mixing(model, prompt_body, batch_size):
     if "|" in prompt_body:
         prompt_parts = prompt_body.split("|")
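The hunk above sits inside `fix_batch`, which repeats a single-sample tensor along a new batch dimension via `torch.stack([tensor.squeeze(0)] * bs, dim=0)`. A minimal sketch of that behavior, using nested Python lists in place of torch tensors so it runs without the model:

```python
# Sketch of the fix_batch behavior from the hunk header above:
# squeeze(0) drops the size-1 leading batch dimension, then bs copies
# are stacked back along a new dim 0. Nested lists stand in for tensors.

def fix_batch_sketch(tensor, bs):
    squeezed = tensor[0]    # squeeze(0): drop the leading size-1 batch dim
    return [squeezed] * bs  # stack bs copies along a new batch dim

batch = fix_batch_sketch([[1, 2, 3]], 4)  # one sample expanded to a batch of 4
```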
@@ -183,7 +184,7 @@ class StableDiffusionModel(nn.Module):
         ], device=self.device)
 
         prompt = [request.prompt] * request.n_samples
-        prompt_condition = self.model.get_learned_conditioning(prompt)
+        prompt_condition = prompt_mixing(self.model, prompt[0], request.n_samples)
         uc = None
         if request.scale != 1.0:
@@ -227,13 +228,13 @@ class StableDiffusionModel(nn.Module):
         if request.scale != 1.0:
             uc = self.model.get_learned_conditioning(request.n_samples * [""])
-        c = prompt_mixing(self.model, prompt[0], request.n_samples) #(model.get_learned_conditioning(prompts) + model.get_learned_conditioning(["taken at night"])) / 2
+        prompt_condition = prompt_mixing(self.model, prompt[0], request.n_samples)
 
         # encode (scaled latent)
         start_code_terped = None
         z_enc = sampler.stochastic_encode(init_latent, torch.tensor([t_enc] * request.n_samples).to(self.device), noise=start_code_terped)
         # decode it
-        samples = sampler.decode(z_enc, c, t_enc, unconditional_guidance_scale=request.scale,
+        samples = sampler.decode(z_enc, prompt_condition, t_enc, unconditional_guidance_scale=request.scale,
                                  unconditional_conditioning=uc,)
 
         x_samples_ddim = self.model.decode_first_stage(samples)
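The body of `prompt_mixing` is not shown in this diff, but the visible fragments suggest its intent: split the prompt on `"|"`, encode each part, and combine the conditioning vectors, as hinted by the commented-out `(model.get_learned_conditioning(prompts) + model.get_learned_conditioning(["taken at night"])) / 2`. A hedged sketch of that idea, with a hypothetical stub encoder and plain lists standing in for the real model and torch tensors:

```python
# Hypothetical sketch of the mixing logic implied by the diff: split on "|",
# encode each part separately, average the conditioning vectors elementwise.
# get_learned_conditioning_stub is a stand-in, NOT the real model API.

def get_learned_conditioning_stub(prompts):
    # Toy "encoder": maps each prompt string to a fixed-size 2-d vector.
    return [[float(len(p)), float(sum(map(ord, p)) % 10)] for p in prompts]

def prompt_mixing_sketch(encode, prompt_body):
    """Encode each '|'-separated prompt part and average the vectors."""
    if "|" in prompt_body:
        parts = [p.strip() for p in prompt_body.split("|")]
        vectors = [encode([p])[0] for p in parts]
        n = len(vectors)
        return [sum(col) / n for col in zip(*vectors)]
    # No "|" present: fall through to plain single-prompt conditioning.
    return encode([prompt_body])[0]

mixed = prompt_mixing_sketch(get_learned_conditioning_stub, "a castle | at night")
```

In the real code the averaging would happen on torch conditioning tensors of identical shape, so the elementwise mean here maps directly onto the commented-out tensor arithmetic.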