Age | Commit message | Author |
---|---|---|
2022-06-10 | Move mlp to ff | Gustaf Rydholm |
2022-06-10 | Fix efficientnet | Gustaf Rydholm |
2022-06-09 | Remove abstract lightning module | Gustaf Rydholm |
2022-06-09 | Refactor padding | Gustaf Rydholm |
2022-06-09 | Fix conformer | Gustaf Rydholm |
2022-06-08 | Fix conformer net | Gustaf Rydholm |
2022-06-08 | Add efficient self attention | Gustaf Rydholm |
2022-06-07 | Add subsampler layer | Gustaf Rydholm |
2022-06-05 | Add efficient net to its init | Gustaf Rydholm |
2022-06-05 | Remove attrs | Gustaf Rydholm |
2022-06-05 | Remove attrs | Gustaf Rydholm |
2022-06-05 | Update transformer init | Gustaf Rydholm |
2022-06-05 | Fix kwargs | Gustaf Rydholm |
2022-06-05 | Update init | Gustaf Rydholm |
2022-06-05 | Add conformer | Gustaf Rydholm |
2022-06-05 | Fix conformer block | Gustaf Rydholm |
2022-06-05 | Fix kwargs | Gustaf Rydholm |
2022-06-05 | Remove depth wise conv class | Gustaf Rydholm |
2022-06-05 | Rename mlp to ff | Gustaf Rydholm |
2022-06-02 | Add conformer block | Gustaf Rydholm |
2022-06-02 | Add conformer conv layer | Gustaf Rydholm |
2022-06-02 | Add mlp conformer layer | Gustaf Rydholm |
2022-06-01 | Format | Gustaf Rydholm |
2022-06-01 | WIP conformer | Gustaf Rydholm |
2022-06-01 | Replace attr with attrs | Gustaf Rydholm |
2022-05-30 | WIP conformer | Gustaf Rydholm |
2022-05-30 | Add a basic cnn encoder | Gustaf Rydholm |
2022-02-06 | chore: remove unused imports | Gustaf Rydholm |
2022-02-03 | chore: remove axial attention | Gustaf Rydholm |
2022-01-29 | fix(decoder): typos | Gustaf Rydholm |
2022-01-29 | feat(norm): add prenorm | Gustaf Rydholm |
2022-01-29 | chore(decoder): add new import path for decoder | Gustaf Rydholm |
2022-01-29 | feat(base): remove output norm | Gustaf Rydholm |
2022-01-29 | feat: add RMSNorm | Gustaf Rydholm |
2022-01-29 | chore: remove residual block | Gustaf Rydholm |
2022-01-29 | chore: remove old attention layer block | Gustaf Rydholm |
2022-01-29 | feat: add new transformer decoder | Gustaf Rydholm |
2022-01-26 | fix: refactor AttentionLayers | Gustaf Rydholm |
2022-01-26 | fix: lint and format | Gustaf Rydholm |
2021-12-05 | Format files with black | Gustaf Rydholm |
2021-12-05 | Remove ViT once again | Gustaf Rydholm |
2021-12-05 | Update conv transformer with inheritance from base network | Gustaf Rydholm |
2021-12-05 | Add base network | Gustaf Rydholm |
2021-12-04 | Revert "Remove ViT" | Gustaf Rydholm |
2021-11-28 | Remove ViT | Gustaf Rydholm |
2021-11-28 | Add norm layer to output from decoder | Gustaf Rydholm |
2021-11-28 | Refactor attention layer module | Gustaf Rydholm |
2021-11-27 | Revert "Remove ViT" | Gustaf Rydholm |
2021-11-27 | Revert "Revert "Remove default transformer"" | Gustaf Rydholm |
2021-11-27 | Revert "Remove default transformer" | Gustaf Rydholm |