Age | Commit message | Author |
2021-11-01 | Fix bugs in local attention | Gustaf Rydholm |
2021-11-01 | Refactor transformer layer | Gustaf Rydholm |
2021-11-01 | Fix self attention module | Gustaf Rydholm |
2021-11-01 | Update rotary embedding | Gustaf Rydholm |
2021-10-28 | Update rotary embedding | Gustaf Rydholm |
2021-10-28 | Remove 2D positional embedding | Gustaf Rydholm |
2021-10-28 | Remove absolute positional embedding | Gustaf Rydholm |
2021-10-28 | Fix multihead local attention | Gustaf Rydholm |
2021-10-28 | Fix bug with local attention | Gustaf Rydholm |
2021-10-28 | Refactor attention module | Gustaf Rydholm |
2021-10-27 | Add axial embedding | Gustaf Rydholm |
2021-10-27 | Remove ViT | Gustaf Rydholm |
2021-10-27 | Remove default transformer | Gustaf Rydholm |
2021-10-27 | Add comments to transformer modules | Gustaf Rydholm |
2021-10-27 | Clean up local attention, add comments and types | Gustaf Rydholm |
2021-10-27 | Add local attn in transformer layer | Gustaf Rydholm |
2021-10-27 | Remove unused import and add comments in attn module | Gustaf Rydholm |
2021-10-27 | Remove imports from __init__ in transformer network | Gustaf Rydholm |
2021-10-27 | Rename transformer embeddings | Gustaf Rydholm |
2021-10-24 | Add local attention | Gustaf Rydholm |
2021-10-24 | Refactor attention | Gustaf Rydholm |
2021-10-03 | Refactor rotary embedding | Gustaf Rydholm |
2021-10-03 | Fix rotary embedding | Gustaf Rydholm |
2021-09-30 | Major bug fix in attention layer | Gustaf Rydholm |
2021-08-03 | Training working, multiple bug fixes | Gustaf Rydholm |
2021-08-02 | Fix log import, fix mapping in datamodules, make nn modules hashable | Gustaf Rydholm |
2021-07-30 | Fix attr bug, properly load network | Gustaf Rydholm |
2021-07-28 | Reformatting with attrs, config for encoder and decoder | Gustaf Rydholm |
2021-07-23 | Remove nystromer | Gustaf Rydholm |
2021-07-09 | Working on cnn transformer, continue with predict | Gustaf Rydholm |
2021-07-08 | Add comments | Gustaf Rydholm |
2021-06-07 | Removed docs pipeline in noxfile, reformatted code with black | Gustaf Rydholm |
2021-06-07 | Working feedforward of full transformer arch in notebook | Gustaf Rydholm |
2021-06-06 | Working on fixing decoder transformer | Gustaf Rydholm |
2021-05-13 | Decoder module working | Gustaf Rydholm |
2021-05-09 | Reformatting of positional encodings and ViT working | Gustaf Rydholm |
2021-05-09 | Transformer layer done | Gustaf Rydholm |
2021-05-09 | Attention layer almost done | Gustaf Rydholm |
2021-05-06 | Working on attention layer configuration | Gustaf Rydholm |
2021-05-04 | Bug fix | Gustaf Rydholm |
2021-05-04 | Nyströmer implemented but not tested | Gustaf Rydholm |
2021-05-02 | ViT todo | Gustaf Rydholm |
2021-05-02 | Note to self: implement Nyström attention | Gustaf Rydholm |
2021-05-02 | Attention layer finished | Gustaf Rydholm |
2021-05-02 | Working on attention | Gustaf Rydholm |
2021-05-02 | Implemented training script with hydra | Gustaf Rydholm |
2021-05-01 | Working on new attention module | Gustaf Rydholm |
2021-04-26 | Reformatting transformer (work in progress) | Gustaf Rydholm |
2021-04-25 | EfficientNet and non-working transformer model | Gustaf Rydholm |
2021-04-05 | Backbone configuration commented out, transformer init changed | Gustaf Rydholm |