path: root/text_recognizer/networks/transformer
Age         Commit message                                Author
2023-08-25  Rename and add flash attention                Gustaf Rydholm
2022-10-04  Format and reword docstring                   Gustaf Rydholm
2022-10-02  Update docstrings                             Gustaf Rydholm
2022-10-02  Add comments                                  Gustaf Rydholm
2022-09-30  Make rotary pos encoding mandatory            Gustaf Rydholm
2022-09-30  Update transformer                            Gustaf Rydholm
2022-09-28  Remove inplace ops for grad flow              Gustaf Rydholm
2022-09-27  Refactor decoder block                        Gustaf Rydholm
2022-09-18  Format imports                                Gustaf Rydholm
2022-09-13  Remove imports                                Gustaf Rydholm
2022-09-13  Rename hidden dim to dim                      Gustaf Rydholm
2022-09-13  Remove axial encoder                          Gustaf Rydholm
2022-09-13  Add axial encoder                             Gustaf Rydholm
2022-09-05  Update norm                                   Gustaf Rydholm
2022-09-05  Steal lucidrains axial encoding               Gustaf Rydholm
2022-09-05  Add absolute pos embedding                    Gustaf Rydholm
2022-06-11  Update ff import                              Gustaf Rydholm
2022-06-10  Fix check for pos emb                         Gustaf Rydholm
2022-06-10  Add imports                                   Gustaf Rydholm
2022-06-10  Move mlp to ff                                Gustaf Rydholm
2022-06-07  Add subsampler layer                          Gustaf Rydholm
2022-06-05  Remove attrs                                  Gustaf Rydholm
2022-06-05  Update transformer init                       Gustaf Rydholm
2022-06-05  Fix kwargs                                    Gustaf Rydholm
2022-06-01  Replace attr with attrs                       Gustaf Rydholm
2022-02-03  chore: remove axial attention                 Gustaf Rydholm
2022-01-29  fix(decoder): typos                           Gustaf Rydholm
2022-01-29  feat(norm): add prenorm                       Gustaf Rydholm
2022-01-29  feat: add RMSNorm                             Gustaf Rydholm
2022-01-29  chore: remove residual block                  Gustaf Rydholm
2022-01-29  chore: remove old attention layer block       Gustaf Rydholm
2022-01-29  feat: add new transformer decoder             Gustaf Rydholm
2022-01-26  fix: refactor AttentionLayers                 Gustaf Rydholm
2021-12-05  Remove ViT once again                         Gustaf Rydholm
2021-12-04  Revert "Remove ViT"                           Gustaf Rydholm
            This reverts commit 3de4312d1796b1ee56d6467d36773df29a831e45.
2021-11-28  Remove ViT                                    Gustaf Rydholm
2021-11-28  Refactor attention layer module               Gustaf Rydholm
2021-11-27  Revert "Remove ViT"                           Gustaf Rydholm
            This reverts commit 4a6550ddef7d1f1971737bc22715db6381441f79.
2021-11-27  Revert "Revert "Remove default transformer""  Gustaf Rydholm
            This reverts commit 89be047a46c8e88511d301f63d7f6795fe04ab81.
2021-11-27  Revert "Remove default transformer"           Gustaf Rydholm
            This reverts commit b3d3b7ddc0796e98d78561bc5ca22728dc0372b0.
2021-11-21  Format files                                  Gustaf Rydholm
2021-11-21  Add axial transformer                         Gustaf Rydholm
2021-11-17  Remove local attention                        Gustaf Rydholm
2021-11-05  Rename mask to input_mask                     Gustaf Rydholm
2021-11-03  Fix output from attn modules                  Gustaf Rydholm
2021-11-03  Fix local attn to work with any input length  Gustaf Rydholm
2021-11-03  Update output shape from attn module          Gustaf Rydholm
2021-11-01  Fix bugs in local attention                   Gustaf Rydholm
2021-11-01  Refactor transformer layer                    Gustaf Rydholm
2021-11-01  Fix self attention module                     Gustaf Rydholm