path: root/text_recognizer/networks/transformer/axial_attention/utils.py
author    Gustaf Rydholm <gustaf.rydholm@gmail.com>  2022-01-29 15:54:00 +0100
committer Gustaf Rydholm <gustaf.rydholm@gmail.com>  2022-01-29 15:54:00 +0100
commit    57e539eecff8211d1a69de81891796797a2ced38 (patch)
tree      fbd75c3d49ca10519698d565bd8295501e48b77c /text_recognizer/networks/transformer/axial_attention/utils.py
parent    877ee5984dad08379e0c781d35a534b97012e325 (diff)
chore: remove old attention layer block
Diffstat (limited to 'text_recognizer/networks/transformer/axial_attention/utils.py'):
0 files changed, 0 insertions, 0 deletions