path: root/text_recognizer/networks/transformer/attention.py
Age         Commit message (Author)
2023-08-25  Rename and add flash atten (Gustaf Rydholm)
2022-09-30  Update transformer (Gustaf Rydholm)
2022-09-18  Format imports (Gustaf Rydholm)
2022-06-05  Remove attrs (Gustaf Rydholm)
2022-06-01  Replace attr with attrs (Gustaf Rydholm)
2021-11-21  Format files (Gustaf Rydholm)
2021-11-05  Rename mask to input_mask (Gustaf Rydholm)
2021-11-03  Update output shape from attn module (Gustaf Rydholm)
2021-11-01  Fix self attention module (Gustaf Rydholm)
2021-10-28  Refactor attention module (Gustaf Rydholm)
2021-10-27  Remove unused import and add comments in attn module (Gustaf Rydholm)
2021-10-27  Rename transformer embeddings (Gustaf Rydholm)
2021-10-24  Refactor attention (Gustaf Rydholm)
2021-10-03  Fix rotary embedding (Gustaf Rydholm)
2021-09-30  Major bug fix in attention layer (Gustaf Rydholm)
2021-08-02  Fix log import, fix mapping in datamodules, fix nn modules can be hashed (Gustaf Rydholm)
2021-07-30  attr bug fix, properly loading network (Gustaf Rydholm)
2021-07-28  Reformatting with attrs, config for encoder and decoder (Gustaf Rydholm)
2021-06-07  Working feedforward of full transformer arch in notebook (Gustaf Rydholm)
2021-05-13  Decoder module working (Gustaf Rydholm)
2021-05-09  Reformatting of positional encodings and ViT working (Gustaf Rydholm)
2021-05-02  Attention layer finished (Gustaf Rydholm)
2021-05-02  Working on attention (Gustaf Rydholm)
2021-05-01  Working on new attention module (Gustaf Rydholm)
2021-04-04  Add new transformer network (Gustaf Rydholm)
2021-03-20  Initial commit for refactoring to lightning (Gustaf Rydholm)