Abstract: Vision transformers have recently become very popular. However, deploying them in many applications is computationally expensive, partly due to the Softmax layer in the attention block. We ...
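To make the cost concrete, the softmax the abstract refers to is the row-wise normalization inside standard scaled dot-product attention, softmax(QK^T / sqrt(d))V. The sketch below (a minimal NumPy illustration, not the paper's method; all names and toy sizes are assumptions) shows where that softmax sits in the computation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    x = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(x)
    return e / np.sum(e, axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # (n, n) pairwise score matrix
    weights = softmax(scores, axis=-1)  # row-wise softmax over all n tokens
    return weights @ V

rng = np.random.default_rng(0)
n, d = 4, 8  # toy sizes: 4 tokens, 8-dim embeddings
Q, K, V = rng.standard_normal((3, n, d))
out = attention(Q, K, V)
print(out.shape)  # (4, 8)
```

The exp/normalize step runs over the full n-by-n score matrix, one softmax per token row, which is one reason the attention block is costly to deploy at high token counts.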