The Underrated Interlaced Sparse Self-Attention in Vision Transformers

less than 1 minute read

Published: August 25, 2022

The blog post is available at Zhihu (in Chinese).