Age | Commit message | Author |
2021-11-21 | Remove VQVAE stuff, did not work... | Gustaf Rydholm |
2021-11-21 | Add axial transformer | Gustaf Rydholm |
2021-11-17 | Update VqTransformer | Gustaf Rydholm |
2021-11-17 | Update encoder fun in conv_transformer | Gustaf Rydholm |
2021-11-17 | Update vqvae with new quantizer | Gustaf Rydholm |
2021-11-17 | Add new quantizer | Gustaf Rydholm |
2021-11-17 | Remove local attention | Gustaf Rydholm |
2021-11-17 | Replace last two layers in efficientnet with one | Gustaf Rydholm |
2021-11-17 | Format efficientnet | Gustaf Rydholm |
2021-11-07 | Fix dropout in efficientnet | Gustaf Rydholm |
2021-11-07 | Add dropout to final layer in efficientnet | Gustaf Rydholm |
2021-11-07 | Fix mbconv sub modules | Gustaf Rydholm |
2021-11-06 | Fix efficientnet incorrect channel calculation | Gustaf Rydholm |
2021-11-06 | Format efficientnet | Gustaf Rydholm |
2021-11-05 | Remove conv attention | Gustaf Rydholm |
2021-11-05 | Rename mask to input_mask | Gustaf Rydholm |
2021-11-05 | Remove out_channels as a settable parameter in effnet | Gustaf Rydholm |
2021-11-03 | Remove unused args from conv transformer | Gustaf Rydholm |
2021-11-03 | Remove pixelcnn | Gustaf Rydholm |
2021-11-03 | Fix output from attn modules | Gustaf Rydholm |
2021-11-03 | Fix local attn to work with any input length | Gustaf Rydholm |
2021-11-03 | Update output shape from attn module | Gustaf Rydholm |
2021-11-01 | Add check for positional encoding in attn layer | Gustaf Rydholm |
2021-11-01 | Fix bugs in local attention | Gustaf Rydholm |
2021-11-01 | Refactor transformer layer | Gustaf Rydholm |
2021-11-01 | Fix self attention module | Gustaf Rydholm |
2021-11-01 | Update rotary embedding | Gustaf Rydholm |
2021-10-28 | Update rotary embedding | Gustaf Rydholm |
2021-10-28 | Remove 2D positional embedding | Gustaf Rydholm |
2021-10-28 | Remove absolute positional embedding | Gustaf Rydholm |
2021-10-28 | Add check for position embedding | Gustaf Rydholm |
2021-10-28 | Fix multihead local attention | Gustaf Rydholm |
2021-10-28 | Fix bug with local attention | Gustaf Rydholm |
2021-10-28 | Refactor attention module | Gustaf Rydholm |
2021-10-27 | Remove Barlow Twins | Gustaf Rydholm |
2021-10-27 | Add axial embedding | Gustaf Rydholm |
2021-10-27 | Remove ViT | Gustaf Rydholm |
2021-10-27 | Remove default transformer | Gustaf Rydholm |
2021-10-27 | Add comments to transformer modules | Gustaf Rydholm |
2021-10-27 | Clean up local attention, add comments and types | Gustaf Rydholm |
2021-10-27 | Add local attn in transformer layer | Gustaf Rydholm |
2021-10-27 | Remove unused import and add comments in attn module | Gustaf Rydholm |
2021-10-27 | Remove imports from __init__ in transformer network | Gustaf Rydholm |
2021-10-27 | Rename transformer embeddings | Gustaf Rydholm |
2021-10-24 | Add local attention | Gustaf Rydholm |
2021-10-24 | Remove unused import in conv transformer | Gustaf Rydholm |
2021-10-24 | Refactor attention | Gustaf Rydholm |
2021-10-24 | Format of files | Gustaf Rydholm |
2021-10-11 | Add BarlowNetwork | Gustaf Rydholm |
2021-10-10 | Remove imports | Gustaf Rydholm |