path: root/text_recognizer
Age        | Commit message                | Author
2022-06-09 | Fix conformer                 | Gustaf Rydholm
2022-06-09 | Remove abstract mapping       | Gustaf Rydholm
2022-06-08 | Fix conformer net             | Gustaf Rydholm
2022-06-08 | Add efficient self attention  | Gustaf Rydholm
2022-06-08 | Fix bug in lit conformer      | Gustaf Rydholm
2022-06-08 | WIP mappings                  | Gustaf Rydholm
2022-06-07 | Add subsampler layer          | Gustaf Rydholm
2022-06-06 | Add lit conformer model       | Gustaf Rydholm
2022-06-06 | Rename lit models             | Gustaf Rydholm
2022-06-05 | Add efficient net to its init | Gustaf Rydholm
2022-06-05 | Remove attrs                  | Gustaf Rydholm
2022-06-05 | Add init for mappings         | Gustaf Rydholm
2022-06-05 | Remove attrs                  | Gustaf Rydholm
2022-06-05 | Update transformer init       | Gustaf Rydholm
2022-06-05 | Fix kwargs                    | Gustaf Rydholm
2022-06-05 | Update init                   | Gustaf Rydholm
2022-06-05 | Add conformer                 | Gustaf Rydholm
2022-06-05 | Fix conformer block           | Gustaf Rydholm
2022-06-05 | Fix kwargs                    | Gustaf Rydholm
2022-06-05 | Remove depth wise conv class  | Gustaf Rydholm
2022-06-05 | Rename mlp to ff              | Gustaf Rydholm
2022-06-02 | Add conformer block           | Gustaf Rydholm
2022-06-02 | Add conformer conv layer      | Gustaf Rydholm
2022-06-02 | Add mlp conformer layer       | Gustaf Rydholm
2022-06-01 | Format                        | Gustaf Rydholm
2022-06-01 | WIP conformer                 | Gustaf Rydholm
2022-06-01 | Replace attr with attrs       | Gustaf Rydholm
2022-05-30 | WIP conformer                 | Gustaf Rydholm
2022-05-30 | Add a basic cnn encoder       | Gustaf Rydholm
2022-02-11 | fix: move image utils back into data folder        | Gustaf Rydholm
2022-02-06 | chore: remove unused imports                        | Gustaf Rydholm
2022-02-06 | fix: refactor and move image utils                  | Gustaf Rydholm
2022-02-06 | chore: remove word pieces util code                 | Gustaf Rydholm
2022-02-06 | chore: remove word pieces                           | Gustaf Rydholm
2022-02-03 | chore: remove axial attention                       | Gustaf Rydholm
2022-02-03 | chore: remove unused code block in lit trans model  | Gustaf Rydholm
2022-01-29 | fix(decoder): typos                                 | Gustaf Rydholm
2022-01-29 | feat(norm): add prenorm                             | Gustaf Rydholm
2022-01-29 | chore(decoder): add new import path for decoder     | Gustaf Rydholm
2022-01-29 | feat(base): remove output norm                      | Gustaf Rydholm
2022-01-29 | feat: add RMSNorm                                   | Gustaf Rydholm
2022-01-29 | chore: remove residual block                        | Gustaf Rydholm
2022-01-29 | chore: remove old attention layer block             | Gustaf Rydholm
2022-01-29 | feat: add new transformer decoder                   | Gustaf Rydholm
2022-01-26 | fix: refactor AttentionLayers                       | Gustaf Rydholm
2022-01-26 | fix: change Set to Sequence                         | Gustaf Rydholm
2022-01-26 | fix: lint and format                                | Gustaf Rydholm
2021-12-05 | Format files with black                             | Gustaf Rydholm
2021-12-05 | Remove ViT once again                               | Gustaf Rydholm
2021-12-05 | Update conv transformer with inheritance from base network | Gustaf Rydholm