Date        Commit message (Author)
2021-11-17  Update vq transformer lit model (Gustaf Rydholm)
2021-11-17  Fix import path vqganloss (Gustaf Rydholm)
2021-11-17  Test new efficientnet params in notebook (Gustaf Rydholm)
2021-11-07  Fix dropout in efficientnet (Gustaf Rydholm)
2021-11-07  Add dropout to final layer in efficientnet (Gustaf Rydholm)
2021-11-07  Update efficientnet notebook (Gustaf Rydholm)
2021-11-07  Update configs (Gustaf Rydholm)
2021-11-07  Add D105 to ignore in flake8 (Gustaf Rydholm)
2021-11-07  Fix mbconv sub modules (Gustaf Rydholm)
2021-11-06  Fix efficientnet incorrect channel calculation (Gustaf Rydholm)
2021-11-06  Format efficientnet (Gustaf Rydholm)
2021-11-06  Update efficientnet notebook (Gustaf Rydholm)
2021-11-05  Add cross entropy config (Gustaf Rydholm)
2021-11-05  Remove out_channels from effnet config (Gustaf Rydholm)
2021-11-05  Update paragraphs config (Gustaf Rydholm)
2021-11-05  Update lines config (Gustaf Rydholm)
2021-11-05  Add warp transform to paragraphs (Gustaf Rydholm)
2021-11-05  Remove barlow twins transform for lines (Gustaf Rydholm)
2021-11-05  Remove conv attention (Gustaf Rydholm)
2021-11-05  Rename mask to input_mask (Gustaf Rydholm)
2021-11-05  Remove out_channels as a settable parameter in effnet (Gustaf Rydholm)
2021-11-05  Rename iam lines transforms (Gustaf Rydholm)
2021-11-05  Format pad transform (Gustaf Rydholm)
2021-11-05  Try new efficientnet in notebook (Gustaf Rydholm)
2021-11-05  Look at warped lines in notebook (Gustaf Rydholm)
2021-11-03  Update dependencies (Gustaf Rydholm)
2021-11-03  Update configs (Gustaf Rydholm)
2021-11-03  Remove unused configs (Gustaf Rydholm)
2021-11-03  Add pad config (Gustaf Rydholm)
2021-11-03  Add pad transform (Gustaf Rydholm)
2021-11-03  Remove unused args from conv transformer (Gustaf Rydholm)
2021-11-03  Remove pixelcnn (Gustaf Rydholm)
2021-11-03  Fix output from attn modules (Gustaf Rydholm)
2021-11-03  Fix local attn to work with any input length (Gustaf Rydholm)
2021-11-03  Update output shape from attn module (Gustaf Rydholm)
2021-11-03  Update notebook (Gustaf Rydholm)
2021-11-01  Update readme (Gustaf Rydholm)
2021-11-01  Update config (Gustaf Rydholm)
2021-11-01  Add check for positional encoding in attn layer (Gustaf Rydholm)
2021-11-01  Fix bugs in local attention (Gustaf Rydholm)
2021-11-01  Refactor transformer layer (Gustaf Rydholm)
2021-11-01  Fix self attention module (Gustaf Rydholm)
2021-11-01  Update rotary embedding (Gustaf Rydholm)
2021-11-01  Update notebooks with new conv trans config (Gustaf Rydholm)
2021-11-01  Delete barlow twins notebook (Gustaf Rydholm)
2021-10-28  Update readme (Gustaf Rydholm)
2021-10-28  Update rotary embedding (Gustaf Rydholm)
2021-10-28  Remove 2D positional embedding (Gustaf Rydholm)
2021-10-28  Remove absolute positional embedding (Gustaf Rydholm)
2021-10-28  Add check for position embedding (Gustaf Rydholm)