NATTEN is a PyTorch extension that provides the first fast sliding window attention, with efficient CPU and CUDA kernels. It offers Neighborhood Attention (local attention) and Dilated Neighborhood Attention (sparse global attention, a.k.a. dilated local attention) as PyTorch modules for both 1D and 2D data.
Install with pip
Latest release: 0.14.5
Please select your preferred PyTorch version with the correct CUDA build, or the CPU build if you're not using CUDA, and run the matching command:

PyTorch 2.0.0
  CUDA 11.8: pip3 install natten -f https://shi-labs.com/natten/wheels/cu118/torch2.0.0/index.html
  CUDA 11.7: pip3 install natten -f https://shi-labs.com/natten/wheels/cu117/torch2.0.0/index.html
  CPU:       pip3 install natten -f https://shi-labs.com/natten/wheels/cpu/torch2.0.0/index.html

PyTorch 1.13
  CUDA 11.7: pip3 install natten -f https://shi-labs.com/natten/wheels/cu117/torch1.13/index.html
  CUDA 11.6: pip3 install natten -f https://shi-labs.com/natten/wheels/cu116/torch1.13/index.html
  CPU:       pip3 install natten -f https://shi-labs.com/natten/wheels/cpu/torch1.13/index.html

PyTorch 1.12.1
  CUDA 11.6: pip3 install natten -f https://shi-labs.com/natten/wheels/cu116/torch1.12.1/index.html
  CUDA 11.3: pip3 install natten -f https://shi-labs.com/natten/wheels/cu113/torch1.12.1/index.html
  CUDA 10.2: pip3 install natten -f https://shi-labs.com/natten/wheels/cu102/torch1.12.1/index.html
  CPU:       pip3 install natten -f https://shi-labs.com/natten/wheels/cpu/torch1.12.1/index.html

PyTorch 1.12
  CUDA 11.6: pip3 install natten -f https://shi-labs.com/natten/wheels/cu116/torch1.12/index.html
  CUDA 11.3: pip3 install natten -f https://shi-labs.com/natten/wheels/cu113/torch1.12/index.html
  CUDA 10.2: pip3 install natten -f https://shi-labs.com/natten/wheels/cu102/torch1.12/index.html
  CPU:       pip3 install natten -f https://shi-labs.com/natten/wheels/cpu/torch1.12/index.html

PyTorch 1.11
  CUDA 11.5: pip3 install natten -f https://shi-labs.com/natten/wheels/cu115/torch1.11/index.html
  CUDA 11.3: pip3 install natten -f https://shi-labs.com/natten/wheels/cu113/torch1.11/index.html
  CUDA 10.2: pip3 install natten -f https://shi-labs.com/natten/wheels/cu102/torch1.11/index.html
  CPU:       pip3 install natten -f https://shi-labs.com/natten/wheels/cpu/torch1.11/index.html

PyTorch 1.10.1
  CUDA 11.3: pip3 install natten -f https://shi-labs.com/natten/wheels/cu113/torch1.10.1/index.html
  CUDA 11.1: pip3 install natten -f https://shi-labs.com/natten/wheels/cu111/torch1.10.1/index.html
  CUDA 10.2: pip3 install natten -f https://shi-labs.com/natten/wheels/cu102/torch1.10.1/index.html
  CPU:       pip3 install natten -f https://shi-labs.com/natten/wheels/cpu/torch1.10.1/index.html

PyTorch 1.10
  CUDA 11.3: pip3 install natten -f https://shi-labs.com/natten/wheels/cu113/torch1.10/index.html
  CUDA 11.1: pip3 install natten -f https://shi-labs.com/natten/wheels/cu111/torch1.10/index.html
  CUDA 10.2: pip3 install natten -f https://shi-labs.com/natten/wheels/cu102/torch1.10/index.html
  CPU:       pip3 install natten -f https://shi-labs.com/natten/wheels/cpu/torch1.10/index.html

PyTorch 1.9
  CUDA 11.1: pip3 install natten -f https://shi-labs.com/natten/wheels/cu111/torch1.9/index.html
  CUDA 10.2: pip3 install natten -f https://shi-labs.com/natten/wheels/cu102/torch1.9/index.html
  CPU:       pip3 install natten -f https://shi-labs.com/natten/wheels/cpu/torch1.9/index.html

PyTorch 1.8
  CUDA 11.1: pip3 install natten -f https://shi-labs.com/natten/wheels/cu111/torch1.8/index.html
  CUDA 10.2: pip3 install natten -f https://shi-labs.com/natten/wheels/cu102/torch1.8/index.html
  CUDA 10.1: pip3 install natten -f https://shi-labs.com/natten/wheels/cu101/torch1.8/index.html
  CPU:       pip3 install natten -f https://shi-labs.com/natten/wheels/cpu/torch1.8/index.html
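If you'd rather not pick from the list by hand, a small helper like the one below can print a matching command. This is only a convenience sketch: it assumes the URL pattern above (wheels/<cuXYZ or cpu>/torch<version>/index.html), and since the listed torch versions don't always include the patch number (e.g. torch1.13, not torch1.13.1), double-check the result against the list.

import torch

# Derive the CUDA tag from the torch build ("cu118", "cu117", ..., or "cpu").
cuda = torch.version.cuda  # e.g. "11.8", or None for CPU-only builds
tag = "cpu" if cuda is None else "cu" + cuda.replace(".", "")

# Strip local build suffixes such as "+cu118" from the torch version string.
torch_version = torch.__version__.split("+")[0]

print(f"pip3 install natten -f https://shi-labs.com/natten/wheels/{tag}/torch{torch_version}/index.html")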
Your build isn't listed? Mac user? Just run:
pip install natten==0.14.5
Careful though: without precompiled wheels, installation may take a while, since NATTEN will be compiled on your machine.
Don't know your torch/CUDA version? Run this:
python3 -c "import torch; print(torch.__version__, torch.version.cuda)"
Note: the CUDA version refers to the build your torch binary was compiled against, not the CUDA toolkit installed on your system, which may be newer.
Make sure your Python version is between 3.7 and 3.9, unless you're using PyTorch 1.11 or later, in which case Python 3.10 is also supported, or PyTorch 1.13, in which case Python 3.11 is also supported. Note that PyTorch 2.0 dropped support for 3.7 and 3.8.
NATTEN does not have precompiled wheels for Windows, but you can try building from source. PRs and contributions are appreciated.
Build from source
You can build NATTEN from source if your environment isn't supported by the precompiled wheels.
Building with ninja is recommended.
Run these commands:
pip install ninja  # Recommended, not required
git clone https://github.com/SHI-Labs/NATTEN
cd NATTEN
pip install -e .
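Once the build finishes, a quick sanity check is to import the package and confirm which torch build it will run against. The snippet below uses only the bare import and standard torch calls, nothing NATTEN-specific beyond that.

import torch
import natten  # raises ImportError if the build or installation failed

print("torch version:", torch.__version__)
print("CUDA available to torch:", torch.cuda.is_available())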
Requirements
Presently NATTEN only supports PyTorch, and the CUDA build only supports Pascal (e.g. the Tesla P100) and later architectures (SM60 and higher). Make sure your GPU is supported by referring to this link.
Future versions will include support for Maxwell and Kepler.
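If you're not sure which architecture your GPU is, you can query its compute capability through torch; SM60 corresponds to compute capability 6.0, so a major version of 6 or higher is supported. This is a minimal check using only standard torch calls.

import torch

if torch.cuda.is_available():
    # Returns (major, minor), e.g. (6, 0) for a Tesla P100.
    major, minor = torch.cuda.get_device_capability()
    print(f"Compute capability: SM{major}{minor}")
    print("Supported by NATTEN's CUDA kernels:", major >= 6)
else:
    print("No CUDA device visible to torch; only the CPU kernels will be available.")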
Usage
Easy to use; just import and go:
from natten import NeighborhoodAttention1D, NeighborhoodAttention2D

na1d = NeighborhoodAttention1D(dim=128, kernel_size=7, dilation=3, num_heads=4)
na2d = NeighborhoodAttention2D(dim=128, kernel_size=7, dilation=3, num_heads=4)
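To give a sense of the expected tensor shapes, here is a small forward pass. The channel-last layouts below ((batch, length, dim) for 1D and (batch, height, width, dim) for 2D) follow the NAT/DiNAT reference implementations; treat this as an assumption and check the module documentation for your installed version.

import torch
from natten import NeighborhoodAttention1D, NeighborhoodAttention2D

na1d = NeighborhoodAttention1D(dim=128, kernel_size=7, dilation=3, num_heads=4)
na2d = NeighborhoodAttention2D(dim=128, kernel_size=7, dilation=3, num_heads=4)

x1 = torch.randn(2, 64, 128)      # assumed (batch, length, dim) layout
x2 = torch.randn(2, 32, 32, 128)  # assumed (batch, height, width, dim) layout

print(na1d(x1).shape)  # output keeps the input shape
print(na2d(x2).shape)  # output keeps the input shape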
Citation
Please consider citing our papers if you use NATTEN in your work:
@inproceedings{hassani2023neighborhood,
title = {Neighborhood Attention Transformer},
author = {Ali Hassani and Steven Walton and Jiachen Li and Shen Li and Humphrey Shi},
year = 2023,
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)}
}
@article{hassani2022dilated,
title = {Dilated Neighborhood Attention Transformer},
author = {Ali Hassani and Humphrey Shi},
year = 2022,
url = {https://arxiv.org/abs/2209.15001},
eprint = {2209.15001},
archiveprefix = {arXiv},
primaryclass = {cs.CV}
}