Implement `enable_gqa` in `aten::scaled_dot_product_attention` #1802
Labels
contribution welcome: We welcome code contributions for this
topic: torch_lib: Related to the torch/aten function lib in development
We should follow https://pytorch.org/docs/main/generated/torch.nn.functional.scaled_dot_product_attention.html to implement `enable_gqa`.
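For reference, the documented `enable_gqa` semantics are: when the number of query heads is a multiple of the number of key/value heads, each KV head is shared by a group of query heads, which is equivalent to repeating the KV heads along the head dimension before standard attention. The sketch below illustrates that equivalence in plain numpy; it is an illustration of the expected behavior, not the torch_lib implementation, and the function name `sdpa_with_gqa` is made up for this example.

```python
import numpy as np

def sdpa_with_gqa(query, key, value, enable_gqa=False):
    """Sketch of scaled dot-product attention with the enable_gqa semantics.

    query: (B, Hq, L, E); key/value: (B, Hkv, S, E).
    With enable_gqa=True and Hq a multiple of Hkv, each KV head serves
    Hq // Hkv query heads (mirrors torch.repeat_interleave on dim=1).
    """
    if enable_gqa:
        hq, hkv = query.shape[1], key.shape[1]
        assert hq % hkv == 0, "query heads must be a multiple of kv heads"
        rep = hq // hkv
        # Expand KV heads so every query head has a matching KV head.
        key = np.repeat(key, rep, axis=1)
        value = np.repeat(value, rep, axis=1)
    scale = 1.0 / np.sqrt(query.shape[-1])
    scores = query @ key.transpose(0, 1, 3, 2) * scale
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ value
```

This suggests one possible torch_lib strategy: lower `enable_gqa=True` by expanding the key/value head dimension (e.g. with `Expand`/`Tile`-style ops) and then reusing the existing non-GQA attention path.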