text-recognizer.git
Branches: lightning-refactor, master, qa-training
A neural network that reads handwritten paragraphs.
Owner: Gustaf Rydholm
Age | Commit message | Author
2021-11-17 | Update vq transformer lit model | Gustaf Rydholm
2021-11-17 | Fix import path vqganloss | Gustaf Rydholm
2021-11-17 | Test new efficientnet params in notebook | Gustaf Rydholm
2021-11-07 | Fix dropout in efficientnet | Gustaf Rydholm
2021-11-07 | Add dropout to final layer in efficientnet | Gustaf Rydholm
2021-11-07 | Update efficientnet notebook | Gustaf Rydholm
2021-11-07 | Update configs | Gustaf Rydholm
2021-11-07 | Add D105 to ignore in flake8 | Gustaf Rydholm
2021-11-07 | Fix mbconv sub modules | Gustaf Rydholm
2021-11-06 | Fix efficientnet incorrect channel calculation | Gustaf Rydholm
2021-11-06 | Format efficientnet | Gustaf Rydholm
2021-11-06 | Efficientnet updated notebook | Gustaf Rydholm
2021-11-05 | Add cross entropy config | Gustaf Rydholm
2021-11-05 | Remove out_channels from effnet config | Gustaf Rydholm
2021-11-05 | Update paragraphs config | Gustaf Rydholm
2021-11-05 | Update lines config | Gustaf Rydholm
2021-11-05 | Add warp transform to paragraphs | Gustaf Rydholm
2021-11-05 | Remove barlow twins transform for lines | Gustaf Rydholm
2021-11-05 | Remove conv attention | Gustaf Rydholm
2021-11-05 | Rename mask to input_mask | Gustaf Rydholm
2021-11-05 | Remove out_channels as a settable parameter in effnet | Gustaf Rydholm
2021-11-05 | Rename iam lines transforms | Gustaf Rydholm
2021-11-05 | Format pad transform | Gustaf Rydholm
2021-11-05 | Try new efficientnet in notebook | Gustaf Rydholm
2021-11-05 | Look a warped lines in notebook | Gustaf Rydholm
2021-11-03 | Update dependencies | Gustaf Rydholm
2021-11-03 | Update configs | Gustaf Rydholm
2021-11-03 | Remove unused configs | Gustaf Rydholm
2021-11-03 | Add pad config | Gustaf Rydholm
2021-11-03 | Add pad transform | Gustaf Rydholm
2021-11-03 | Remove unused args from conv transformer | Gustaf Rydholm
2021-11-03 | Remove pixelcnn | Gustaf Rydholm
2021-11-03 | Fix output from attn modules | Gustaf Rydholm
2021-11-03 | Fix local attn to work with any input length | Gustaf Rydholm
2021-11-03 | Update output shape from attn module | Gustaf Rydholm
2021-11-03 | Update notebook | Gustaf Rydholm
2021-11-01 | Update readme | Gustaf Rydholm
2021-11-01 | Update to config | Gustaf Rydholm
2021-11-01 | Add check for positional encoding in attn layer | Gustaf Rydholm
2021-11-01 | Fix bugs in local attention | Gustaf Rydholm
2021-11-01 | Refactor transformer layer | Gustaf Rydholm
2021-11-01 | Fix self attention module | Gustaf Rydholm
2021-11-01 | Update rotary embedding | Gustaf Rydholm
2021-11-01 | Update to notebooks with new conv trans config | Gustaf Rydholm
2021-11-01 | Delete barlow twins notebook | Gustaf Rydholm
2021-10-28 | Update readme | Gustaf Rydholm
2021-10-28 | Update rotary embedding | Gustaf Rydholm
2021-10-28 | Remove 2D positional embedding | Gustaf Rydholm
2021-10-28 | Remove absolute positional embedding | Gustaf Rydholm
2021-10-28 | Add check for position embedding | Gustaf Rydholm