Age | Commit message | Author |
2021-11-28 | Refactor attention layer module | Gustaf Rydholm |
2021-11-27 | Revert "Remove ViT" | Gustaf Rydholm |
2021-11-27 | Revert "Revert "Remove default transformer"" | Gustaf Rydholm |
2021-11-27 | Revert "Remove default transformer" | Gustaf Rydholm |
2021-11-21 | Format files | Gustaf Rydholm |
2021-11-21 | Add axial transformer | Gustaf Rydholm |
2021-11-17 | Remove local attention | Gustaf Rydholm |
2021-11-05 | Rename mask to input_mask | Gustaf Rydholm |
2021-11-03 | Fix output from attn modules | Gustaf Rydholm |
2021-11-03 | Fix local attn to work with any input length | Gustaf Rydholm |
2021-11-03 | Update output shape from attn module | Gustaf Rydholm |
2021-11-01 | Fix bugs in local attention | Gustaf Rydholm |
2021-11-01 | Refactor transformer layer | Gustaf Rydholm |
2021-11-01 | Fix self attention module | Gustaf Rydholm |
2021-11-01 | Update rotary embedding | Gustaf Rydholm |
2021-10-28 | Update rotary embedding | Gustaf Rydholm |
2021-10-28 | Remove 2D positional embedding | Gustaf Rydholm |
2021-10-28 | Remove absolute positional embedding | Gustaf Rydholm |
2021-10-28 | Fix multihead local attention | Gustaf Rydholm |
2021-10-28 | Fix bug with local attention | Gustaf Rydholm |
2021-10-28 | Refactor attention module | Gustaf Rydholm |
2021-10-27 | Add axial embedding | Gustaf Rydholm |
2021-10-27 | Remove ViT | Gustaf Rydholm |
2021-10-27 | Remove default transformer | Gustaf Rydholm |
2021-10-27 | Add comments to transformer modules | Gustaf Rydholm |
2021-10-27 | Clean up local attention, add comments and types | Gustaf Rydholm |
2021-10-27 | Add local attn in transformer layer | Gustaf Rydholm |
2021-10-27 | Remove unused import and add comments in attn module | Gustaf Rydholm |
2021-10-27 | Remove imports from __init__ in transformer network | Gustaf Rydholm |
2021-10-27 | Rename transformer embeddings | Gustaf Rydholm |
2021-10-24 | Add local attention | Gustaf Rydholm |
2021-10-24 | Refator attention | Gustaf Rydholm |
2021-10-03 | Refactor rotary embedding | Gustaf Rydholm |
2021-10-03 | Fix rotary embedding | Gustaf Rydholm |
2021-09-30 | Major bug fix in attention layer | Gustaf Rydholm |
2021-08-03 | Training working, multiple bug fixes | Gustaf Rydholm |
2021-08-02 | Fix log import, fix mapping in datamodules, fix nn modules can be hashed | Gustaf Rydholm |
2021-07-30 | attr bug fix, properly loading network | Gustaf Rydholm |
2021-07-28 | Reformatting with attrs, config for encoder and decoder | Gustaf Rydholm |
2021-07-23 | Remove nystromer | Gustaf Rydholm |
2021-07-09 | Working on cnn transformer, continue with predict | Gustaf Rydholm |
2021-07-08 | Add comments | Gustaf Rydholm |
2021-06-07 | Removed docs pipeline in noxfile, reformatted code w black | Gustaf Rydholm |
2021-06-07 | Working feedforward of full transformer arch in notebook | Gustaf Rydholm |
2021-06-06 | Working on fixing decoder transformer | Gustaf Rydholm |
2021-05-13 | Decoder module working | Gustaf Rydholm |
2021-05-09 | Reformatting of positional encodings and ViT working | Gustaf Rydholm |
2021-05-09 | tranformer layer done | Gustaf Rydholm |
2021-05-09 | Attention layer soon done | Gustaf Rydholm |
2021-05-06 | Working on attention layer configuration | Gustaf Rydholm |