path: root/text_recognizer/networks/transformer
Age         Commit message                                  Author
2023-08-25  Rename and add flash atten                      Gustaf Rydholm
2022-10-04  Format and rewording docstring                  Gustaf Rydholm
2022-10-02  Update docstrings                               Gustaf Rydholm
2022-10-02  Add comments                                    Gustaf Rydholm
2022-09-30  Make rotary pos encoding mandatory              Gustaf Rydholm
2022-09-30  Update transformer                              Gustaf Rydholm
2022-09-28  Remove inplace ops for grad flow                Gustaf Rydholm
2022-09-27  Refactor decoder block                          Gustaf Rydholm
2022-09-18  Format imports                                  Gustaf Rydholm
2022-09-13  Remove import                                   Gustaf Rydholm
2022-09-13  Rename hidden dim to dim                        Gustaf Rydholm
2022-09-13  Remove axial encoder                            Gustaf Rydholm
2022-09-13  Add axial encoder                               Gustaf Rydholm
2022-09-05  Update norm                                     Gustaf Rydholm
2022-09-05  Steal lucidrains axial encoding                 Gustaf Rydholm
2022-09-05  Add absolute pos embedding                      Gustaf Rydholm
2022-06-11  Update ff import                                Gustaf Rydholm
2022-06-10  Fix check for pos emb                           Gustaf Rydholm
2022-06-10  Add imports                                     Gustaf Rydholm
2022-06-10  Move mlp to ff                                  Gustaf Rydholm
2022-06-07  Add subsampler layer                            Gustaf Rydholm
2022-06-05  Remove attrs                                    Gustaf Rydholm
2022-06-05  Update transformer init                         Gustaf Rydholm
2022-06-05  Fix kwargs                                      Gustaf Rydholm
2022-06-01  Replace attr with attrs                         Gustaf Rydholm
2022-02-03  chore: remove axial attention                   Gustaf Rydholm
2022-01-29  fix(decoder): typos                             Gustaf Rydholm
2022-01-29  feat(norm): add prenorm                         Gustaf Rydholm
2022-01-29  feat: add RMSNorm                               Gustaf Rydholm
2022-01-29  chore: remove residual block                    Gustaf Rydholm
2022-01-29  chore: remove old attention layer block         Gustaf Rydholm
2022-01-29  feat: add new transformer decoder               Gustaf Rydholm
2022-01-26  fix: refactor AttentionLayers                   Gustaf Rydholm
2021-12-05  Remove ViT once again                           Gustaf Rydholm
2021-12-04  Revert "Remove ViT"                             Gustaf Rydholm
2021-11-28  Remove ViT                                      Gustaf Rydholm
2021-11-28  Refactor attention layer module                 Gustaf Rydholm
2021-11-27  Revert "Remove ViT"                             Gustaf Rydholm
2021-11-27  Revert "Revert "Remove default transformer""    Gustaf Rydholm
2021-11-27  Revert "Remove default transformer"             Gustaf Rydholm
2021-11-21  Format files                                    Gustaf Rydholm
2021-11-21  Add axial transformer                           Gustaf Rydholm
2021-11-17  Remove local attention                          Gustaf Rydholm
2021-11-05  Rename mask to input_mask                       Gustaf Rydholm
2021-11-03  Fix output from attn modules                    Gustaf Rydholm
2021-11-03  Fix local attn to work with any input length    Gustaf Rydholm
2021-11-03  Update output shape from attn module            Gustaf Rydholm
2021-11-01  Fix bugs in local attention                     Gustaf Rydholm
2021-11-01  Refactor transformer layer                      Gustaf Rydholm
2021-11-01  Fix self attention module                       Gustaf Rydholm
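
Several commits above name standard transformer techniques: a pre-norm residual layout ("feat(norm): add prenorm"), RMSNorm ("feat: add RMSNorm"), and avoiding in-place ops on the residual path ("Remove inplace ops for grad flow"). As a minimal sketch only, assuming a PyTorch codebase, the snippet below illustrates how those three ideas commonly combine in a decoder block; it is not this repository's actual code, and the names RMSNorm, PreNormBlock, dim, and num_heads are hypothetical.

# Illustrative sketch of prenorm + RMSNorm, not this repository's code.
import torch
from torch import nn


class RMSNorm(nn.Module):
    """Root-mean-square layer norm: rescale by 1/RMS, then a learned gain."""

    def __init__(self, dim: int, eps: float = 1e-8) -> None:
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize by the RMS over the feature dimension.
        inv_rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).rsqrt()
        return x * inv_rms * self.weight


class PreNormBlock(nn.Module):
    """Pre-norm residual block: norm -> sublayer -> residual add."""

    def __init__(self, dim: int, num_heads: int = 8) -> None:
        super().__init__()
        self.attn_norm = RMSNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ff_norm = RMSNorm(dim)
        self.ff = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # No in-place ops on the residual path, so gradients flow cleanly.
        h = self.attn_norm(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        return x + self.ff(self.ff_norm(x))


if __name__ == "__main__":
    block = PreNormBlock(dim=64)
    out = block(torch.randn(2, 16, 64))
    print(out.shape)  # torch.Size([2, 16, 64])

Pre-norm places the normalization before each sublayer rather than after, which generally stabilizes training of deep decoders; RMSNorm drops LayerNorm's mean-centering and bias, keeping only the RMS rescaling and gain.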