path: root/text_recognizer
Date        Commit message (Author)
2022-01-26  fix: refactor AttentionLayers (Gustaf Rydholm)
2022-01-26  fix: change Set to Sequence (Gustaf Rydholm)
2022-01-26  fix: lint and format (Gustaf Rydholm)
2021-12-05  Format files with black (Gustaf Rydholm)
2021-12-05  Remove ViT once again (Gustaf Rydholm)
2021-12-05  Update conv transformer with inheritance from base network (Gustaf Rydholm)
2021-12-05  Add base network (Gustaf Rydholm)
2021-12-04  Revert "Remove ViT" (Gustaf Rydholm)
2021-11-28  Remove ViT (Gustaf Rydholm)
2021-11-28  Add norm layer to output from decoder (Gustaf Rydholm)
2021-11-28  Refactor attention layer module (Gustaf Rydholm)
2021-11-27  Revert "Remove ViT" (Gustaf Rydholm)
2021-11-27  Revert "Revert "Remove default transformer"" (Gustaf Rydholm)
2021-11-27  Revert "Remove default transformer" (Gustaf Rydholm)
2021-11-22  Fix import efficientnet (Gustaf Rydholm)
2021-11-22  Remove unused utils (Gustaf Rydholm)
2021-11-22  Format conv transformer (Gustaf Rydholm)
2021-11-22  Move efficientnet from encoder dir (Gustaf Rydholm)
2021-11-21  Remove criterion and ctc loss (Gustaf Rydholm)
2021-11-21  Format files (Gustaf Rydholm)
2021-11-21  Add axial encoder to conv transformer (Gustaf Rydholm)
2021-11-21  Remove VQVAE stuff, did not work... (Gustaf Rydholm)
2021-11-21  Add axial transformer (Gustaf Rydholm)
2021-11-21  Remove label smoothing loss (Gustaf Rydholm)
2021-11-17  Update VqTransformer (Gustaf Rydholm)
2021-11-17  Update encoder fun in conv_transformer (Gustaf Rydholm)
2021-11-17  Update vqvae with new quantizer (Gustaf Rydholm)
2021-11-17  Add new quantizer (Gustaf Rydholm)
2021-11-17  Remove local attention (Gustaf Rydholm)
2021-11-17  Replace last two layers in efficientnet with one (Gustaf Rydholm)
2021-11-17  Format efficientnet (Gustaf Rydholm)
2021-11-17  Update vq transformer lit model (Gustaf Rydholm)
2021-11-17  Fix import path vqganloss (Gustaf Rydholm)
2021-11-07  Fix dropout in efficientnet (Gustaf Rydholm)
2021-11-07  Add dropout to final layer in efficientnet (Gustaf Rydholm)
2021-11-07  Fix mbconv sub modules (Gustaf Rydholm)
2021-11-06  Fix efficientnet incorrect channel calculation (Gustaf Rydholm)
2021-11-06  Format efficientnet (Gustaf Rydholm)
2021-11-05  Remove conv attention (Gustaf Rydholm)
2021-11-05  Rename mask to input_mask (Gustaf Rydholm)
2021-11-05  Remove out_channels as a settable parameter in effnet (Gustaf Rydholm)
2021-11-05  Rename iam lines transforms (Gustaf Rydholm)
2021-11-05  Format pad transform (Gustaf Rydholm)
2021-11-03  Add pad transform (Gustaf Rydholm)
2021-11-03  Remove unused args from conv transformer (Gustaf Rydholm)
2021-11-03  Remove pixelcnn (Gustaf Rydholm)
2021-11-03  Fix output from attn modules (Gustaf Rydholm)
2021-11-03  Fix local attn to work with any input length (Gustaf Rydholm)
2021-11-03  Update output shape from attn module (Gustaf Rydholm)
2021-11-01  Add check for positional encoding in attn layer (Gustaf Rydholm)