Neighborhood Attention Extension

Bringing attention to a neighborhood near you!

NATTEN is a PyTorch extension that provides the first fast sliding-window attention, with efficient CPU and CUDA kernels. It implements Neighborhood Attention (local attention) and Dilated Neighborhood Attention (sparse global attention, a.k.a. dilated local attention) as PyTorch modules for 1D, 2D, and 3D data.
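As a mental model for what these modules compute, here is an illustrative pure-Python sketch; this is not NATTEN's actual kernels or API, and the scalar-channel simplification and the name `na1d_sketch` are ours:

```python
import math

def na1d_sketch(q, k, v, kernel_size):
    """Illustrative 1D neighborhood attention over scalar-valued tokens.

    Every query attends to exactly `kernel_size` neighbors; near the
    sequence edges the window shifts inward instead of shrinking, which
    is what distinguishes neighborhood attention from zero-padded
    sliding-window attention.
    """
    n = len(q)
    assert 0 < kernel_size <= n
    out = []
    for i in range(n):
        # Center the window on token i, then clamp it to stay inside [0, n).
        start = max(0, min(i - kernel_size // 2, n - kernel_size))
        window = range(start, start + kernel_size)
        scores = [q[i] * k[j] for j in window]
        m = max(scores)                      # stabilize the softmax
        weights = [math.exp(s - m) for s in scores]
        z = sum(weights)
        out.append(sum(w * v[j] for w, j in zip(weights, window)) / z)
    return out
```

With `kernel_size == len(q)` this reduces to ordinary self-attention; smaller kernels give the linear-in-sequence-length cost that NATTEN's fused kernels exploit.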

Start using our new Fused Neighborhood Attention implementation today!

GitHub / PyPI

Neighborhood Attention Transformers

Install with pip

Latest release: 0.17.1

Please select your preferred PyTorch version with the correct CUDA build, or the CPU build if you're not using CUDA, and run the matching command:

PyTorch 2.3.0
  CUDA 12.1: pip3 install natten==0.17.1+torch230cu121 -f https://shi-labs.com/natten/wheels/
  CUDA 11.8: pip3 install natten==0.17.1+torch230cu118 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.17.1+torch230cpu -f https://shi-labs.com/natten/wheels

PyTorch 2.2.0
  CUDA 12.1: pip3 install natten==0.17.1+torch220cu121 -f https://shi-labs.com/natten/wheels/
  CUDA 11.8: pip3 install natten==0.17.1+torch220cu118 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.17.1+torch220cpu -f https://shi-labs.com/natten/wheels

PyTorch 2.1.0
  CUDA 12.1: pip3 install natten==0.17.1+torch210cu121 -f https://shi-labs.com/natten/wheels/
  CUDA 11.8: pip3 install natten==0.17.1+torch210cu118 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.17.1+torch210cpu -f https://shi-labs.com/natten/wheels

PyTorch 2.0.0
  CUDA 11.8: pip3 install natten==0.17.1+torch200cu118 -f https://shi-labs.com/natten/wheels
  CUDA 11.7: pip3 install natten==0.17.1+torch200cu117 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.17.1+torch200cpu -f https://shi-labs.com/natten/wheels

Your build isn't listed? Mac user? Just do:

pip install natten==0.17.1

Careful though: without pre-compiled wheels, NATTEN is compiled from source on your machine, so installation can take a while.

You're also required to have CUDA > 11.7, CMake > 3.20, and PyTorch > 2.0 installed before attempting to install or build NATTEN.

Don't know your torch/CUDA version? Run this:

python3 -c "import torch; print(torch.__version__, torch.version.cuda)"

Note: the CUDA version refers to your torch build, not the actual CUDA version you have installed, which may be higher.

Make sure your Python version ∈ [3.8, 3.12].
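To check, here's a quick sanity script in plain Python; the range mirrors the requirement above, and the `supported` helper is just for illustration:

```python
import sys

# Supported Python range per the requirement above: 3.8 through 3.12, inclusive.
LOW, HIGH = (3, 8), (3, 12)

def supported(version):
    """True if a (major, minor, ...) version tuple falls within [3.8, 3.12]."""
    return LOW <= tuple(version[:2]) <= HIGH

if __name__ == "__main__":
    ok = supported(sys.version_info)
    print(f"Python {sys.version_info.major}.{sys.version_info.minor}:",
          "supported" if ok else "not supported")
```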

NATTEN does not have pre-compiled wheels for Windows (you can try building from source).

PRs and contributions are appreciated.

Older releases

We're only able to support the most recent major torch releases starting in 2024. If you're using an older torch/CUDA build, you can still install previous NATTEN releases.

Pick the command matching your torch/CUDA build:

PyTorch 2.0.0
  CUDA 11.8: pip3 install natten==0.14.6+torch200cu118 -f https://shi-labs.com/natten/wheels
  CUDA 11.7: pip3 install natten==0.14.6+torch200cu117 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.14.6+torch200cpu -f https://shi-labs.com/natten/wheels

PyTorch 1.13.0
  CUDA 11.7: pip3 install natten==0.14.6+torch1130cu117 -f https://shi-labs.com/natten/wheels
  CUDA 11.6: pip3 install natten==0.14.6+torch1130cu116 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.14.6+torch1130cpu -f https://shi-labs.com/natten/wheels

PyTorch 1.12.1
  CUDA 11.6: pip3 install natten==0.14.6+torch1121cu116 -f https://shi-labs.com/natten/wheels
  CUDA 11.3: pip3 install natten==0.14.6+torch1121cu113 -f https://shi-labs.com/natten/wheels
  CUDA 10.2: pip3 install natten==0.14.6+torch1121cu102 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.14.6+torch1121cpu -f https://shi-labs.com/natten/wheels

PyTorch 1.12.0
  CUDA 11.6: pip3 install natten==0.14.6+torch1120cu116 -f https://shi-labs.com/natten/wheels
  CUDA 11.3: pip3 install natten==0.14.6+torch1120cu113 -f https://shi-labs.com/natten/wheels
  CUDA 10.2: pip3 install natten==0.14.6+torch1120cu102 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.14.6+torch1120cpu -f https://shi-labs.com/natten/wheels

PyTorch 1.11.0
  CUDA 11.5: pip3 install natten==0.14.6+torch1110cu115 -f https://shi-labs.com/natten/wheels
  CUDA 11.3: pip3 install natten==0.14.6+torch1110cu113 -f https://shi-labs.com/natten/wheels
  CUDA 10.2: pip3 install natten==0.14.6+torch1110cu102 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.14.6+torch1110cpu -f https://shi-labs.com/natten/wheels

PyTorch 1.10.1
  CUDA 11.3: pip3 install natten==0.14.6+torch1101cu113 -f https://shi-labs.com/natten/wheels
  CUDA 11.1: pip3 install natten==0.14.6+torch1101cu111 -f https://shi-labs.com/natten/wheels
  CUDA 10.2: pip3 install natten==0.14.6+torch1101cu102 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.14.6+torch1101cpu -f https://shi-labs.com/natten/wheels

PyTorch 1.10.0
  CUDA 11.3: pip3 install natten==0.14.6+torch1100cu113 -f https://shi-labs.com/natten/wheels
  CUDA 11.1: pip3 install natten==0.14.6+torch1100cu111 -f https://shi-labs.com/natten/wheels
  CUDA 10.2: pip3 install natten==0.14.6+torch1100cu102 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.14.6+torch1100cpu -f https://shi-labs.com/natten/wheels

PyTorch 1.9.0
  CUDA 11.1: pip3 install natten==0.14.6+torch190cu111 -f https://shi-labs.com/natten/wheels
  CUDA 10.2: pip3 install natten==0.14.6+torch190cu102 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.14.6+torch190cpu -f https://shi-labs.com/natten/wheels

PyTorch 1.8.0
  CUDA 11.1: pip3 install natten==0.14.6+torch180cu111 -f https://shi-labs.com/natten/wheels
  CUDA 10.2: pip3 install natten==0.14.6+torch180cu102 -f https://shi-labs.com/natten/wheels
  CUDA 10.1: pip3 install natten==0.14.6+torch180cu101 -f https://shi-labs.com/natten/wheels
  CPU:       pip3 install natten==0.14.6+torch180cpu -f https://shi-labs.com/natten/wheels


Citation

Please consider citing our papers if you use NATTEN in your work:

@misc{hassani2024faster,
  title        = {Faster Neighborhood Attention: Reducing the O(n^2) Cost of Self Attention at the Threadblock Level},
  author       = {Ali Hassani and Wen-Mei Hwu and Humphrey Shi},
  year         = 2024,
  url          = {https://arxiv.org/abs/2403.04690},
  eprint       = {2403.04690},
  archiveprefix = {arXiv},
  primaryclass = {cs.CV}
}
@inproceedings{hassani2023neighborhood,
  title        = {Neighborhood Attention Transformer},
  author       = {Ali Hassani and Steven Walton and Jiachen Li and Shen Li and Humphrey Shi},
  year         = 2023,
  booktitle    = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)}
}
@misc{hassani2022dilated,
  title        = {Dilated Neighborhood Attention Transformer},
  author       = {Ali Hassani and Humphrey Shi},
  year         = 2022,
  url          = {https://arxiv.org/abs/2209.15001},
  eprint       = {2209.15001},
  archiveprefix = {arXiv},
  primaryclass = {cs.CV}
}