path: root/text_recognizer/networks/transformer
Age         Commit message                                                          Author
2022-06-05  Fix kwargs                                                              Gustaf Rydholm
2022-06-01  Replace attr with attrs                                                 Gustaf Rydholm
2022-02-03  chore: remove axial attention                                           Gustaf Rydholm
2022-01-29  fix(decoder): typos                                                     Gustaf Rydholm
2022-01-29  feat(norm): add prenorm                                                 Gustaf Rydholm
2022-01-29  feat: add RMSNorm                                                       Gustaf Rydholm
2022-01-29  chore: remove residual block                                            Gustaf Rydholm
2022-01-29  chore: remove old attention layer block                                 Gustaf Rydholm
2022-01-29  feat: add new transformer decoder                                       Gustaf Rydholm
2022-01-26  fix: refactor AttentionLayers                                           Gustaf Rydholm
2021-12-05  Remove ViT once again                                                   Gustaf Rydholm
2021-12-04  Revert "Remove ViT"                                                     Gustaf Rydholm
2021-11-28  Remove ViT                                                              Gustaf Rydholm
2021-11-28  Refactor attention layer module                                         Gustaf Rydholm
2021-11-27  Revert "Remove ViT"                                                     Gustaf Rydholm
2021-11-27  Revert "Revert "Remove default transformer""                            Gustaf Rydholm
2021-11-27  Revert "Remove default transformer"                                     Gustaf Rydholm
2021-11-21  Format files                                                            Gustaf Rydholm
2021-11-21  Add axial transformer                                                   Gustaf Rydholm
2021-11-17  Remove local attention                                                  Gustaf Rydholm
2021-11-05  Rename mask to input_mask                                               Gustaf Rydholm
2021-11-03  Fix output from attn modules                                            Gustaf Rydholm
2021-11-03  Fix local attn to work with any input length                            Gustaf Rydholm
2021-11-03  Update output shape from attn module                                    Gustaf Rydholm
2021-11-01  Fix bugs in local attention                                             Gustaf Rydholm
2021-11-01  Refactor transformer layer                                              Gustaf Rydholm
2021-11-01  Fix self attention module                                               Gustaf Rydholm
2021-11-01  Update rotary embedding                                                 Gustaf Rydholm
2021-10-28  Update rotary embedding                                                 Gustaf Rydholm
2021-10-28  Remove 2D positional embedding                                          Gustaf Rydholm
2021-10-28  Remove absolute positional embedding                                    Gustaf Rydholm
2021-10-28  Fix multihead local attention                                           Gustaf Rydholm
2021-10-28  Fix bug with local attention                                            Gustaf Rydholm
2021-10-28  Refactor attention module                                               Gustaf Rydholm
2021-10-27  Add axial embedding                                                     Gustaf Rydholm
2021-10-27  Remove ViT                                                              Gustaf Rydholm
2021-10-27  Remove default transformer                                              Gustaf Rydholm
2021-10-27  Add comments to transformer modules                                     Gustaf Rydholm
2021-10-27  Clean up local attention, add comments and types                        Gustaf Rydholm
2021-10-27  Add local attn in transformer layer                                     Gustaf Rydholm
2021-10-27  Remove unused import and add comments in attn module                    Gustaf Rydholm
2021-10-27  Remove imports from __init__ in transformer network                     Gustaf Rydholm
2021-10-27  Rename transformer embeddings                                           Gustaf Rydholm
2021-10-24  Add local attention                                                     Gustaf Rydholm
2021-10-24  Refactor attention                                                      Gustaf Rydholm
2021-10-03  Refactor rotary embedding                                               Gustaf Rydholm
2021-10-03  Fix rotary embedding                                                    Gustaf Rydholm
2021-09-30  Major bug fix in attention layer                                        Gustaf Rydholm
2021-08-03  Training working, multiple bug fixes                                    Gustaf Rydholm
2021-08-02  Fix log import, fix mapping in datamodules, make nn modules hashable    Gustaf Rydholm
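
Two of the techniques named in the log invite a word of context. The 2022-01-29 commit "feat: add RMSNorm" refers to root-mean-square layer normalization (Zhang & Sennrich, 2019), which drops LayerNorm's mean-centering and rescales activations by their RMS alone, with a learned per-feature gain. Below is a minimal PyTorch sketch of the technique; the class name and signature are illustrative assumptions, not the repository's actual module.

    import torch
    from torch import Tensor, nn

    class RMSNorm(nn.Module):
        """Root-mean-square layer normalization (Zhang & Sennrich, 2019).

        A sketch of the technique, not this repository's implementation.
        """

        def __init__(self, dim: int, eps: float = 1e-8) -> None:
            super().__init__()
            self.eps = eps
            self.scale = nn.Parameter(torch.ones(dim))  # learned per-feature gain

        def forward(self, x: Tensor) -> Tensor:
            # Rescale by the root mean square over the feature dimension;
            # unlike nn.LayerNorm, no mean is subtracted.
            rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).sqrt()
            return x / rms * self.scale

In a prenorm layout (cf. the same-day "feat(norm): add prenorm" commit), such a norm is applied to the block input before attention or feed-forward, rather than after the residual sum.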
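
The rotary embedding commits (2021-10-03 through 2021-11-01) concern rotary position embeddings (RoPE; Su et al., 2021), which encode position by rotating query/key feature pairs through position-dependent angles instead of adding a positional vector. The sketch below follows the common concatenated-halves formulation; all names and shapes are assumptions, and the repository's module may differ.

    import torch
    from torch import Tensor, nn

    def rotate_half(x: Tensor) -> Tensor:
        # Pair feature i with feature i + dim/2 and map (x1, x2) -> (-x2, x1).
        x1, x2 = x.chunk(2, dim=-1)
        return torch.cat((-x2, x1), dim=-1)

    def apply_rotary(x: Tensor, freqs: Tensor) -> Tensor:
        # x: (..., seq, dim); freqs: (seq, dim) of angles in radians.
        return x * freqs.cos() + rotate_half(x) * freqs.sin()

    class RotaryEmbedding(nn.Module):
        def __init__(self, dim: int, base: float = 10_000.0) -> None:
            super().__init__()
            inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2).float() / dim))
            self.register_buffer("inv_freq", inv_freq)

        def forward(self, seq_len: int) -> Tensor:
            t = torch.arange(seq_len, dtype=self.inv_freq.dtype)
            freqs = torch.einsum("i,j->ij", t, self.inv_freq)  # (seq, dim/2)
            return torch.cat((freqs, freqs), dim=-1)           # (seq, dim)

Queries and keys are rotated before the attention dot product, e.g. q = apply_rotary(q, rotary(q.shape[-2])), which makes each query-key score depend only on the tokens' relative positions.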