Age         Commit message                                         Author
2021-11-06  Format efficientnet  Gustaf Rydholm
2021-11-06  Update efficientnet notebook  Gustaf Rydholm
2021-11-05  Add cross entropy config  Gustaf Rydholm
2021-11-05  Remove out_channels from effnet config  Gustaf Rydholm
2021-11-05  Update paragraphs config  Gustaf Rydholm
2021-11-05  Update lines config  Gustaf Rydholm
2021-11-05  Add warp transform to paragraphs  Gustaf Rydholm
2021-11-05  Remove barlow twins transform for lines  Gustaf Rydholm
2021-11-05  Remove conv attention  Gustaf Rydholm
2021-11-05  Rename mask to input_mask  Gustaf Rydholm
2021-11-05  Remove out_channels as a settable parameter in effnet  Gustaf Rydholm
2021-11-05  Rename iam lines transforms  Gustaf Rydholm
2021-11-05  Format pad transform  Gustaf Rydholm
2021-11-05  Try new efficientnet in notebook  Gustaf Rydholm
2021-11-05  Look at warped lines in notebook  Gustaf Rydholm
2021-11-03  Update dependencies  Gustaf Rydholm
2021-11-03  Update configs  Gustaf Rydholm
2021-11-03  Remove unused configs  Gustaf Rydholm
2021-11-03  Add pad config  Gustaf Rydholm
2021-11-03  Add pad transform  Gustaf Rydholm
2021-11-03  Remove unused args from conv transformer  Gustaf Rydholm
2021-11-03  Remove pixelcnn  Gustaf Rydholm
2021-11-03  Fix output from attn modules  Gustaf Rydholm
2021-11-03  Fix local attn to work with any input length  Gustaf Rydholm
2021-11-03  Update output shape from attn module  Gustaf Rydholm
2021-11-03  Update notebook  Gustaf Rydholm
2021-11-01  Update readme  Gustaf Rydholm
2021-11-01  Update to config  Gustaf Rydholm
2021-11-01  Add check for positional encoding in attn layer  Gustaf Rydholm
2021-11-01  Fix bugs in local attention  Gustaf Rydholm
2021-11-01  Refactor transformer layer  Gustaf Rydholm
2021-11-01  Fix self attention module  Gustaf Rydholm
2021-11-01  Update rotary embedding  Gustaf Rydholm
2021-11-01  Update to notebooks with new conv trans config  Gustaf Rydholm
2021-11-01  Delete barlow twins notebook  Gustaf Rydholm
2021-10-28  Update readme  Gustaf Rydholm
2021-10-28  Update rotary embedding  Gustaf Rydholm
2021-10-28  Remove 2D positional embedding  Gustaf Rydholm
2021-10-28  Remove absolute positional embedding  Gustaf Rydholm
2021-10-28  Add check for position embedding  Gustaf Rydholm
2021-10-28  Fix multihead local attention  Gustaf Rydholm
2021-10-28  Fix bug with local attention  Gustaf Rydholm
2021-10-28  Refactor attention module  Gustaf Rydholm
2021-10-27  Update conv transformer experiment config  Gustaf Rydholm
2021-10-27  Remove transducer  Gustaf Rydholm
2021-10-27  Fix import of criterion in VQGAN  Gustaf Rydholm
2021-10-27  Fix mapping import in base model  Gustaf Rydholm
2021-10-27  Rename to criterion  Gustaf Rydholm
2021-10-27  Remove Barlow Twins  Gustaf Rydholm
2021-10-27  Update dependencies  Gustaf Rydholm