
Add unit test to investigate torch issues (scaled_dot_product_attention, index_put) #1864

Draft
xadupre wants to merge 3 commits into main

Conversation

@xadupre (Member) commented Sep 11, 2024

@xadupre changed the title from "Add unit test to investigate torch issues" to "Add unit test to investigate torch issues (scaled_dot_product_attention, index_put)" on Sep 11, 2024
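
For context, the new tests exercise torch.nn.functional.scaled_dot_product_attention through the ONNX exporter paths and compare against eager PyTorch. A minimal sketch of that kind of test follows; the module, shapes, and file name here are illustrative and not copied from the diff:

import torch


class ScaledDotProductAttention(torch.nn.Module):
    """Minimal wrapper around F.scaled_dot_product_attention for export tests."""

    def forward(self, query_states, key_states, value_states):
        return torch.nn.functional.scaled_dot_product_attention(
            query_states, key_states, value_states
        )


# Illustrative shapes; the shapes actually used by the tests live in the diff.
batch_size, seq_length_kv, embedding_dim = 2, 8, 16
query_states = torch.randn(batch_size, seq_length_kv, embedding_dim)
key_states = torch.randn(batch_size, seq_length_kv, embedding_dim)
value_states = torch.randn(batch_size, seq_length_kv, embedding_dim)

model = ScaledDotProductAttention()
expected = model(query_states, key_states, value_states)

# Dynamo-based export path (illustrative); the "script" branch in the diff
# calls torch.onnx.export instead.
onnx_program = torch.onnx.dynamo_export(model, query_states, key_states, value_states)
onnx_program.save("scaled_dot_product_attention_dynamo.onnx")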

codecov bot commented Sep 11, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 75.32%. Comparing base (a99e443) to head (2c3179f).

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1864      +/-   ##
==========================================
+ Coverage   75.26%   75.32%   +0.06%     
==========================================
  Files         251      251              
  Lines       27446    27446              
  Branches     5032     5032              
==========================================
+ Hits        20656    20673      +17     
+ Misses       5822     5808      -14     
+ Partials      968      965       -3     

☔ View full report in Codecov by Sentry.

Signed-off-by: xadupre <[email protected]>
key_states = torch.randn(batch_size, seq_length_kv, embedding_dim)
value_states = torch.randn(batch_size, seq_length_kv, embedding_dim)

output = model(query_states, key_states, value_states)

Check warning (Code scanning / CodeQL): Variable defined multiple times (test)

This assignment to 'output' is unnecessary as it is redefined before this value is used.

else:
    raise AssertionError(f"Unknown exporter {exporter!r}")

import onnxruntime

Check notice (Code scanning / lintrunner): PYLINT/C0415 (test)

Import outside toplevel (onnxruntime) (import-outside-toplevel). See import-outside-toplevel; to disable, use # pylint: disable=import-outside-toplevel

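The notice is informational: either hoist import onnxruntime to module scope, or keep the deferred import (which leaves onnxruntime optional for environments that only run the export step) and suppress the check. A minimal sketch of the second option; the helper name and signature are illustrative, not from the diff:

def run_with_onnxruntime(onnx_file_path, feeds):
    """Run an exported model with ONNX Runtime; the deferred import keeps
    onnxruntime optional for environments that only exercise the export."""
    import onnxruntime  # pylint: disable=import-outside-toplevel

    session = onnxruntime.InferenceSession(
        onnx_file_path, providers=["CPUExecutionProvider"]
    )
    return session.run(None, feeds)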
onnx_file_path = f"scaled_dot_product_attention_{exporter}.onnx"

if exporter == "script":
torch.onnx.export(

Check failure (Code scanning / lintrunner): PYLINT/E1123 (test)

Unexpected keyword argument 'opset' in function call (unexpected-keyword-arg). See unexpected-keyword-arg; to disable, use # pylint: disable=unexpected-keyword-arg

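The lint failure reflects a real API mismatch: torch.onnx.export accepts opset_version, not opset. A minimal sketch of the corrected "script" export call, using a placeholder module and illustrative inputs:

import torch

model = torch.nn.Linear(16, 16)  # placeholder module; the test exports its own
example_inputs = (torch.randn(2, 8, 16),)
onnx_file_path = "scaled_dot_product_attention_script.onnx"

# The TorchScript-based exporter takes `opset_version`, not `opset`.
torch.onnx.export(model, example_inputs, onnx_file_path, opset_version=18)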
else:
    raise AssertionError(f"Unknown exporter {exporter!r}")

import onnxruntime

Check notice (Code scanning / lintrunner): PYLINT/C0415 (test)

Import outside toplevel (onnxruntime) (import-outside-toplevel). See import-outside-toplevel; to disable, use # pylint: disable=import-outside-toplevel

@justinchuby marked this pull request as draft on September 11, 2024 at 15:51
@justinchuby (Collaborator) commented:

Do you have plans to merge or is this for investigation only? Marking as draft for now

@justinchuby (Collaborator) commented:

We do need to rewrite our index put implementation. #1749
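
For reference, aten.index_put is typically reached through advanced-indexing assignment. A minimal sketch of the kind of pattern such a test stresses; shapes and indices are illustrative and not taken from this PR or #1749:

import torch


class IndexPutModel(torch.nn.Module):
    """Advanced-indexing assignment, which lowers to aten.index_put during export."""

    def forward(self, x, update):
        x = x.clone()
        x[:, [0, 2]] = update  # -> aten.index_put_
        return x


x = torch.randn(3, 4)
update = torch.randn(3, 2)
model = IndexPutModel()
expected = model(x, update)

A test along these lines would then export the module with each exporter and compare the ONNX Runtime output against expected.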

@xadupre (Member, Author) commented Sep 11, 2024

I don't have time to implement a fix this week, but anybody doing it should check the unit tests I made and decide whether or not they should be kept.

@shubhambhokare1 shubhambhokare1 self-assigned this Oct 3, 2024