Age | Commit message | Author |
---|---|---|
2022-09-30 | Update transformer | Gustaf Rydholm |
2022-09-28 | Remove inplace ops for grad flow | Gustaf Rydholm |
2022-09-27 | Refactor decoder block | Gustaf Rydholm |
2022-09-18 | Format imports | Gustaf Rydholm |
2022-09-13 | Remove import | Gustaf Rydholm |
2022-09-13 | Rename hidden dim to dim | Gustaf Rydholm |
2022-09-13 | Remove axial encoder | Gustaf Rydholm |
2022-09-13 | Add axial encoder | Gustaf Rydholm |
2022-09-05 | Update norm | Gustaf Rydholm |
2022-09-05 | Steal lucidrains axial encoding | Gustaf Rydholm |
2022-09-05 | Add absolute pos embedding | Gustaf Rydholm |
2022-06-11 | Update ff import | Gustaf Rydholm |
2022-06-10 | Fix check for pos emb | Gustaf Rydholm |
2022-06-10 | Add imports | Gustaf Rydholm |
2022-06-10 | Move mlp to ff | Gustaf Rydholm |
2022-06-07 | Add subsampler layer | Gustaf Rydholm |
2022-06-05 | Remove attrs | Gustaf Rydholm |
2022-06-05 | Update transformer init | Gustaf Rydholm |
2022-06-05 | Fix kwargs | Gustaf Rydholm |
2022-06-01 | Replace attr with attrs | Gustaf Rydholm |
2022-02-03 | chore: remove axial attention | Gustaf Rydholm |
2022-01-29 | fix(decoder): typos | Gustaf Rydholm |
2022-01-29 | feat(norm): add prenorm | Gustaf Rydholm |
2022-01-29 | feat: add RMSNorm | Gustaf Rydholm |
2022-01-29 | chore: remove residual block | Gustaf Rydholm |
2022-01-29 | chore: remove old attention layer block | Gustaf Rydholm |
2022-01-29 | feat: add new transformer decoder | Gustaf Rydholm |
2022-01-26 | fix: refactor AttentionLayers | Gustaf Rydholm |
2021-12-05 | Remove ViT once again | Gustaf Rydholm |
2021-12-04 | Revert "Remove ViT" | Gustaf Rydholm |
2021-11-28 | Remove ViT | Gustaf Rydholm |
2021-11-28 | Refactor attention layer module | Gustaf Rydholm |
2021-11-27 | Revert "Remove ViT" | Gustaf Rydholm |
2021-11-27 | Revert "Revert "Remove default transformer"" | Gustaf Rydholm |
2021-11-27 | Revert "Remove default transformer" | Gustaf Rydholm |
2021-11-21 | Format files | Gustaf Rydholm |
2021-11-21 | Add axial transformer | Gustaf Rydholm |
2021-11-17 | Remove local attention | Gustaf Rydholm |
2021-11-05 | Rename mask to input_mask | Gustaf Rydholm |
2021-11-03 | Fix output from attn modules | Gustaf Rydholm |
2021-11-03 | Fix local attn to work with any input length | Gustaf Rydholm |
2021-11-03 | Update output shape from attn module | Gustaf Rydholm |
2021-11-01 | Fix bugs in local attention | Gustaf Rydholm |
2021-11-01 | Refactor transformer layer | Gustaf Rydholm |
2021-11-01 | Fix self attention module | Gustaf Rydholm |
2021-11-01 | Update rotary embedding | Gustaf Rydholm |
2021-10-28 | Update rotary embedding | Gustaf Rydholm |
2021-10-28 | Remove 2D positional embedding | Gustaf Rydholm |
2021-10-28 | Remove absolute positional embedding | Gustaf Rydholm |
2021-10-28 | Fix multihead local attention | Gustaf Rydholm |