path: root/text_recognizer/networks
Age         Commit message                                          Author
2021-11-05  Remove out_channels as a settable parameter in effnet  Gustaf Rydholm
2021-11-03  Remove unused args from conv transformer  Gustaf Rydholm
2021-11-03  Remove pixelcnn  Gustaf Rydholm
2021-11-03  Fix output from attn modules  Gustaf Rydholm
2021-11-03  Fix local attn to work with any input length  Gustaf Rydholm
2021-11-03  Update output shape from attn module  Gustaf Rydholm
2021-11-01  Add check for positional encoding in attn layer  Gustaf Rydholm
2021-11-01  Fix bugs in local attention  Gustaf Rydholm
2021-11-01  Refactor transformer layer  Gustaf Rydholm
2021-11-01  Fix self attention module  Gustaf Rydholm
2021-11-01  Update rotary embedding  Gustaf Rydholm
2021-10-28  Update rotary embedding  Gustaf Rydholm
2021-10-28  Remove 2D positional embedding  Gustaf Rydholm
2021-10-28  Remove absolute positional embedding  Gustaf Rydholm
2021-10-28  Add check for position embedding  Gustaf Rydholm
2021-10-28  Fix multihead local attention  Gustaf Rydholm
2021-10-28  Fix bug with local attention  Gustaf Rydholm
2021-10-28  Refactor attention module  Gustaf Rydholm
2021-10-27  Remove Barlow Twins  Gustaf Rydholm
2021-10-27  Add axial embedding  Gustaf Rydholm
2021-10-27  Remove ViT  Gustaf Rydholm
2021-10-27  Remove default transformer  Gustaf Rydholm
2021-10-27  Add comments to transformer modules  Gustaf Rydholm
2021-10-27  Clean up local attention, add comments and types  Gustaf Rydholm
2021-10-27  Add local attn in transformer layer  Gustaf Rydholm
2021-10-27  Remove unused import and add comments in attn module  Gustaf Rydholm
2021-10-27  Remove imports from __init__ in transformer network  Gustaf Rydholm
2021-10-27  Rename transformer embeddings  Gustaf Rydholm
2021-10-24  Add local attention  Gustaf Rydholm
2021-10-24  Remove unused import in conv transformer  Gustaf Rydholm
2021-10-24  Refactor attention  Gustaf Rydholm
2021-10-24  Format files  Gustaf Rydholm
2021-10-11  Add BarlowNetwork  Gustaf Rydholm
2021-10-10  Remove imports  Gustaf Rydholm
2021-10-07  Lint  Gustaf Rydholm
2021-10-07  Add Barlow Twins network and training procedure  Gustaf Rydholm
2021-10-03  Refactor rotary embedding  Gustaf Rydholm
2021-10-03  Fix rotary embedding  Gustaf Rydholm
2021-10-03  Lint files  Gustaf Rydholm
2021-09-30  Major bug fix in attention layer  Gustaf Rydholm
2021-09-30  Bug fix for loading pretrained vq encoder  Gustaf Rydholm
2021-09-30  Set leaky relu inplace to false  Gustaf Rydholm
2021-09-30  Rename context to trg in transformer  Gustaf Rydholm
2021-09-30  Rename vqloss to commitment loss in vqvae network  Gustaf Rydholm
2021-09-30  Refactor residual block  Gustaf Rydholm
2021-09-30  Add num res blocks as a variable  Gustaf Rydholm
2021-09-19  Add loading of encoder in vq transformer network  Gustaf Rydholm
2021-09-19  Add ability to set use norm in vqvae  Gustaf Rydholm
2021-09-18  Add comment for quantization  Gustaf Rydholm
2021-09-18  Remove TODO in conv transformer  Gustaf Rydholm