Age | Commit message | Author | |
---|---|---|---|
2022-06-10 | Move mlp to ff | Gustaf Rydholm | |
2022-06-07 | Add subsampler layer | Gustaf Rydholm | |
2022-06-05 | Remove attrs | Gustaf Rydholm | |
2022-06-05 | Update transformer init | Gustaf Rydholm | |
2022-06-05 | Fix kwargs | Gustaf Rydholm | |
2022-06-01 | Replace attr with attrs | Gustaf Rydholm | |
2022-02-03 | chore: remove axial attention | Gustaf Rydholm | |
2022-01-29 | fix(decoder): typos | Gustaf Rydholm | |
2022-01-29 | feat(norm): add prenorm | Gustaf Rydholm | |
2022-01-29 | feat: add RMSNorm | Gustaf Rydholm | |
2022-01-29 | chore: remove residual block | Gustaf Rydholm | |
2022-01-29 | chore: remove old attention layer block | Gustaf Rydholm | |
2022-01-29 | feat: add new transformer decoder | Gustaf Rydholm | |
2022-01-26 | fix: refactor AttentionLayers | Gustaf Rydholm | |
2021-12-05 | Remove ViT once again | Gustaf Rydholm | |
2021-12-04 | Revert "Remove ViT" | Gustaf Rydholm | |
This reverts commit 3de4312d1796b1ee56d6467d36773df29a831e45. | |||
2021-11-28 | Remove ViT | Gustaf Rydholm | |
2021-11-28 | Refactor attention layer module | Gustaf Rydholm | |
2021-11-27 | Revert "Remove ViT" | Gustaf Rydholm | |
This reverts commit 4a6550ddef7d1f1971737bc22715db6381441f79. | |||
2021-11-27 | Revert "Revert "Remove default transformer"" | Gustaf Rydholm | |
This reverts commit 89be047a46c8e88511d301f63d7f6795fe04ab81. | |||
2021-11-27 | Revert "Remove default transformer" | Gustaf Rydholm | |
This reverts commit b3d3b7ddc0796e98d78561bc5ca22728dc0372b0. | |||
2021-11-21 | Format files | Gustaf Rydholm | |
2021-11-21 | Add axial transformer | Gustaf Rydholm | |
2021-11-17 | Remove local attention | Gustaf Rydholm | |
2021-11-05 | Rename mask to input_mask | Gustaf Rydholm | |
2021-11-03 | Fix output from attn modules | Gustaf Rydholm | |
2021-11-03 | Fix local attn to work with any input length | Gustaf Rydholm | |
2021-11-03 | Update output shape from attn module | Gustaf Rydholm | |
2021-11-01 | Fix bugs in local attention | Gustaf Rydholm | |
2021-11-01 | Refactor transformer layer | Gustaf Rydholm | |
2021-11-01 | Fix self attention module | Gustaf Rydholm | |
2021-11-01 | Update rotary embedding | Gustaf Rydholm | |
2021-10-28 | Update rotary embedding | Gustaf Rydholm | |
2021-10-28 | Remove 2D positional embedding | Gustaf Rydholm | |
2021-10-28 | Remove absolute positional embedding | Gustaf Rydholm | |
2021-10-28 | Fix multihead local attention | Gustaf Rydholm | |
2021-10-28 | Fix bug with local attention | Gustaf Rydholm | |
2021-10-28 | Refactor attention module | Gustaf Rydholm | |
2021-10-27 | Add axial embedding | Gustaf Rydholm | |
2021-10-27 | Remove ViT | Gustaf Rydholm | |
2021-10-27 | Remove default transformer | Gustaf Rydholm | |
2021-10-27 | Add comments to transformer modules | Gustaf Rydholm | |
2021-10-27 | Clean up local attention, add comments and types | Gustaf Rydholm | |
2021-10-27 | Add local attn in transformer layer | Gustaf Rydholm | |
2021-10-27 | Remove unused import and add comments in attn module | Gustaf Rydholm | |
2021-10-27 | Remove imports from __init__ in transformer network | Gustaf Rydholm | |
2021-10-27 | Rename transformer embeddings | Gustaf Rydholm | |
2021-10-24 | Add local attention | Gustaf Rydholm | |
2021-10-24 | Refactor attention | Gustaf Rydholm | |
2021-10-03 | Refactor rotary embedding | Gustaf Rydholm | |