path: root/text_recognizer/networks/transformer
2022-06-05  Update transformer init  (Gustaf Rydholm)
2022-06-05  Fix kwargs  (Gustaf Rydholm)
2022-06-01  Replace attr with attrs  (Gustaf Rydholm)
2022-02-03  chore: remove axial attention  (Gustaf Rydholm)
2022-01-29  fix(decoder): typos  (Gustaf Rydholm)
2022-01-29  feat(norm): add prenorm  (Gustaf Rydholm)
2022-01-29  feat: add RMSNorm  (Gustaf Rydholm)
2022-01-29  chore: remove residual block  (Gustaf Rydholm)
2022-01-29  chore: remove old attention layer block  (Gustaf Rydholm)
2022-01-29  feat: add new transformer decoder  (Gustaf Rydholm)
2022-01-26  fix: refactor AttentionLayers  (Gustaf Rydholm)
2021-12-05  Remove ViT once again  (Gustaf Rydholm)
2021-12-04  Revert "Remove ViT"  (Gustaf Rydholm)
            This reverts commit 3de4312d1796b1ee56d6467d36773df29a831e45.
2021-11-28  Remove ViT  (Gustaf Rydholm)
2021-11-28  Refactor attention layer module  (Gustaf Rydholm)
2021-11-27  Revert "Remove ViT"  (Gustaf Rydholm)
            This reverts commit 4a6550ddef7d1f1971737bc22715db6381441f79.
2021-11-27  Revert "Revert "Remove default transformer""  (Gustaf Rydholm)
            This reverts commit 89be047a46c8e88511d301f63d7f6795fe04ab81.
2021-11-27  Revert "Remove default transformer"  (Gustaf Rydholm)
            This reverts commit b3d3b7ddc0796e98d78561bc5ca22728dc0372b0.
2021-11-21  Format files  (Gustaf Rydholm)
2021-11-21  Add axial transformer  (Gustaf Rydholm)
2021-11-17  Remove local attention  (Gustaf Rydholm)
2021-11-05  Rename mask to input_mask  (Gustaf Rydholm)
2021-11-03  Fix output from attn modules  (Gustaf Rydholm)
2021-11-03  Fix local attn to work with any input length  (Gustaf Rydholm)
2021-11-03  Update output shape from attn module  (Gustaf Rydholm)
2021-11-01  Fix bugs in local attention  (Gustaf Rydholm)
2021-11-01  Refactor transformer layer  (Gustaf Rydholm)
2021-11-01  Fix self attention module  (Gustaf Rydholm)
2021-11-01  Update rotary embedding  (Gustaf Rydholm)
2021-10-28  Update rotary embedding  (Gustaf Rydholm)
2021-10-28  Remove 2D positional embedding  (Gustaf Rydholm)
2021-10-28  Remove absolute positional embedding  (Gustaf Rydholm)
2021-10-28  Fix multihead local attention  (Gustaf Rydholm)
2021-10-28  Fix bug with local attention  (Gustaf Rydholm)
2021-10-28  Refactor attention module  (Gustaf Rydholm)
2021-10-27  Add axial embedding  (Gustaf Rydholm)
2021-10-27  Remove ViT  (Gustaf Rydholm)
2021-10-27  Remove default transformer  (Gustaf Rydholm)
2021-10-27  Add comments to transformer modules  (Gustaf Rydholm)
2021-10-27  Clean up local attention, add comments and types  (Gustaf Rydholm)
2021-10-27  Add local attn in transformer layer  (Gustaf Rydholm)
2021-10-27  Remove unused import and add comments in attn module  (Gustaf Rydholm)
2021-10-27  Remove imports from __init__ in transformer network  (Gustaf Rydholm)
2021-10-27  Rename transformer embeddings  (Gustaf Rydholm)
2021-10-24  Add local attention  (Gustaf Rydholm)
2021-10-24  Refator attention  (Gustaf Rydholm)
2021-10-03  Refactor rotary embedding  (Gustaf Rydholm)
2021-10-03  Fix rotary embedding  (Gustaf Rydholm)
2021-09-30  Major bug fix in attention layer  (Gustaf Rydholm)
2021-08-03  Training working, multiple bug fixes  (Gustaf Rydholm)