Commit Graph

2 Commits (17a886cb5825cd8c26df4e65f7112d404b99fe12)

Author: Ma Zerun · SHA1: b017670e1b · Date: 2023-03-29 15:50:44 +08:00

[Improve] Use PyTorch official `scaled_dot_product_attention` to accelerate `MultiheadAttention`. (#1434)

* [Improve] Use PyTorch official `scaled_dot_product_attention` to accelerate `MultiheadAttention`.
* Support `--local-rank` and `--amp` option for new version PyTorch.
* Fix imports and UT.

Author: mzr1996 · SHA1: 0979e78573 · Date: 2023-02-17 15:20:55 +08:00

Rename the package name to `mmpretrain`.
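
Commit b017670e1b switches `MultiheadAttention` over to PyTorch's built-in `torch.nn.functional.scaled_dot_product_attention` (added in PyTorch 2.0). Below is a minimal sketch of that pattern, not the actual mmpretrain code: the module name `SimpleMultiheadAttention` and its layout are assumptions for illustration, showing the fused-kernel path with a plain softmax fallback for older PyTorch versions.

```python
# Minimal sketch (not the mmpretrain implementation): a multi-head attention
# forward pass that calls torch.nn.functional.scaled_dot_product_attention
# when it is available (PyTorch >= 2.0) and otherwise falls back to an
# explicit softmax(QK^T / sqrt(d)) V computation.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleMultiheadAttention(nn.Module):
    """Hypothetical module for illustration only."""

    def __init__(self, embed_dim: int, num_heads: int, dropout: float = 0.0):
        super().__init__()
        assert embed_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.dropout = dropout
        self.qkv = nn.Linear(embed_dim, embed_dim * 3)
        self.proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, N, C = x.shape
        # (B, N, 3, heads, head_dim) -> (3, B, heads, N, head_dim)
        qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4).unbind(0)

        if hasattr(F, 'scaled_dot_product_attention'):
            # Fused kernel (PyTorch 2.0+); dispatches to flash /
            # memory-efficient / math backends automatically.
            out = F.scaled_dot_product_attention(
                q, k, v, dropout_p=self.dropout if self.training else 0.0)
        else:
            # Plain-PyTorch fallback for older versions.
            attn = (q @ k.transpose(-2, -1)) / math.sqrt(self.head_dim)
            attn = attn.softmax(dim=-1)
            attn = F.dropout(attn, p=self.dropout, training=self.training)
            out = attn @ v

        out = out.transpose(1, 2).reshape(B, N, C)
        return self.proj(out)


# Usage example
x = torch.randn(2, 16, 64)
attn = SimpleMultiheadAttention(embed_dim=64, num_heads=8)
print(attn(x).shape)  # torch.Size([2, 16, 64])
```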