novelai-storage / Stable Diffusion Webui

Commit 73786c04, authored Jan 06, 2024 by Nuullll
Parent: b00b4294

[IPEX] Fix torch.Generator hijack
1 changed file: modules/xpu_specific.py (+16, -4)
```diff
@@ -94,11 +94,23 @@ def torch_xpu_scaled_dot_product_attention(
     return torch.reshape(result, (*N, L, Ev))
 
 
+def is_xpu_device(device: str | torch.device = None):
+    if device is None:
+        return False
+    if isinstance(device, str):
+        return device.startswith("xpu")
+    return device.type == "xpu"
+
+
 if has_xpu:
-    # W/A for https://github.com/intel/intel-extension-for-pytorch/issues/452: torch.Generator API doesn't support XPU device
-    CondFunc('torch.Generator',
-        lambda orig_func, device=None: torch.xpu.Generator(device),
-        lambda orig_func, device=None: device is not None and device.type == "xpu")
+    try:
+        # torch.Generator supports "xpu" device since 2.1
+        torch.Generator("xpu")
+    except:
+        # W/A for https://github.com/intel/intel-extension-for-pytorch/issues/452: torch.Generator API doesn't support XPU device (for IPEX < 2.1)
+        CondFunc('torch.Generator',
+            lambda orig_func, device=None: torch.xpu.Generator(device),
+            lambda orig_func, device=None: is_xpu_device(device))
 
     # W/A for some OPs that could not handle different input dtypes
     CondFunc('torch.nn.functional.layer_norm',
```
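The commit's approach can be summarized as: probe at import time whether `torch.Generator("xpu")` works (it does on IPEX/PyTorch ≥ 2.1), and only install the conditional hijack on older versions, using the new `is_xpu_device` helper so that both string devices (`"xpu"`, `"xpu:0"`) and `torch.device` objects are recognized. Below is a minimal, hedged sketch of that conditional-hijack pattern. Note the assumptions: the real `CondFunc` (from the webui's `modules/sd_hijack_utils.py`) takes a dotted import path and patches the attribute in place, whereas this stand-in wraps a function object directly; `FakeDevice` is a hypothetical stand-in for `torch.device` so the sketch runs without torch installed.

```python
# Sketch of the conditional-hijack pattern from the diff above.
# Assumptions (not from the original source): CondFunc here is a simplified
# stand-in for modules/sd_hijack_utils.CondFunc, and FakeDevice mimics
# torch.device's `.type` attribute so no torch install is needed.
from collections import namedtuple

FakeDevice = namedtuple("FakeDevice", ["type"])  # stand-in for torch.device


def is_xpu_device(device=None):
    """Mirror of the helper added by the commit: handles None, str, and device objects."""
    if device is None:
        return False
    if isinstance(device, str):
        return device.startswith("xpu")
    return device.type == "xpu"


class CondFunc:
    """Call sub_func instead of the original whenever cond_func is truthy."""

    def __init__(self, orig_func, sub_func, cond_func):
        self._orig = orig_func
        self._sub = sub_func
        self._cond = cond_func

    def __call__(self, *args, **kwargs):
        if self._cond(self._orig, *args, **kwargs):
            return self._sub(self._orig, *args, **kwargs)
        return self._orig(*args, **kwargs)


# Hypothetical stand-ins for torch.Generator / torch.xpu.Generator:
def original_generator(device=None):
    return f"torch.Generator({device!r})"


def xpu_generator(orig_func, device=None):
    return f"torch.xpu.Generator({device!r})"


# Same wiring as the IPEX < 2.1 branch of the commit:
generator = CondFunc(
    original_generator,
    xpu_generator,
    lambda orig_func, device=None: is_xpu_device(device),
)
```

With this wiring, `generator("xpu")` and `generator(FakeDevice(type="xpu"))` are rerouted to the XPU constructor, while `generator("cpu")` falls through to the original — the same dispatch the hijacked `torch.Generator` performs on old IPEX versions.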