path: root/text_recognizer/networks/transformer
Age  Commit message  Author
2021-11-01  Fix self attention module  Gustaf Rydholm
2021-11-01  Update rotary embedding  Gustaf Rydholm
2021-10-28  Update rotary embedding  Gustaf Rydholm
2021-10-28  Remove 2D positional embedding  Gustaf Rydholm
2021-10-28  Remove absolute positional embedding  Gustaf Rydholm
2021-10-28  Fix multihead local attention  Gustaf Rydholm
2021-10-28  Fix bug with local attention  Gustaf Rydholm
2021-10-28  Refactor attention module  Gustaf Rydholm
2021-10-27  Add axial embedding  Gustaf Rydholm
2021-10-27  Remove ViT  Gustaf Rydholm
2021-10-27  Remove default transformer  Gustaf Rydholm
2021-10-27  Add comments to transformer modules  Gustaf Rydholm
2021-10-27  Clean up local attention, add comments and types  Gustaf Rydholm
2021-10-27  Add local attn in transformer layer  Gustaf Rydholm
2021-10-27  Remove unused import and add comments in attn module  Gustaf Rydholm
2021-10-27  Remove imports from __init__ in transformer network  Gustaf Rydholm
2021-10-27  Rename transformer embeddings  Gustaf Rydholm
2021-10-24  Add local attention  Gustaf Rydholm
2021-10-24  Refactor attention  Gustaf Rydholm
2021-10-03  Refactor rotary embedding  Gustaf Rydholm
2021-10-03  Fix rotary embedding  Gustaf Rydholm
2021-09-30  Major bug fix in attention layer  Gustaf Rydholm
2021-08-03  Training working, multiple bug fixes  Gustaf Rydholm
2021-08-02  Fix log import, fix mapping in datamodules, make nn modules hashable  Gustaf Rydholm
2021-07-30  Fix attrs bug, load network properly  Gustaf Rydholm
2021-07-28  Reformatting with attrs, config for encoder and decoder  Gustaf Rydholm
2021-07-23  Remove nystromer  Gustaf Rydholm
2021-07-09  Working on CNN transformer, continue with predict  Gustaf Rydholm
2021-07-08  Add comments  Gustaf Rydholm
2021-06-07  Remove docs pipeline in noxfile, reformat code with Black  Gustaf Rydholm
2021-06-07  Feedforward of full transformer arch working in notebook  Gustaf Rydholm
2021-06-06  Working on fixing decoder transformer  Gustaf Rydholm
2021-05-13  Decoder module working  Gustaf Rydholm
2021-05-09  Reformat positional encodings; ViT working  Gustaf Rydholm
2021-05-09  Transformer layer done  Gustaf Rydholm
2021-05-09  Attention layer almost done  Gustaf Rydholm
2021-05-06  Working on attention layer configuration  Gustaf Rydholm
2021-05-04  Bug fix  Gustaf Rydholm
2021-05-04  Nyströmer implemented but not tested  Gustaf Rydholm
2021-05-02  ViT todo  Gustaf Rydholm
2021-05-02  Note to self: implement Nyström attention  Gustaf Rydholm
2021-05-02  Attention layer finished  Gustaf Rydholm
2021-05-02  Working on attention  Gustaf Rydholm
2021-05-02  Implement training script with Hydra  Gustaf Rydholm
2021-05-01  Working on new attention module  Gustaf Rydholm
2021-04-26  Reformatting transformer (work in progress)  Gustaf Rydholm
2021-04-25  EfficientNet and non-working transformer model  Gustaf Rydholm
2021-04-05  Comment out backbone config, change transformer init  Gustaf Rydholm
2021-04-04  Black reformatting  Gustaf Rydholm
2021-04-04  Add new transformer network  Gustaf Rydholm