text-recognizer.git: A neural network that reads handwritten paragraphs.
Owner: Gustaf Rydholm
Branches: lightning-refactor, master, qa-training
Path: text_recognizer/networks
Date        Author          Commit message
2022-01-26  Gustaf Rydholm  fix: refactor AttentionLayers
2022-01-26  Gustaf Rydholm  fix: lint and format
2021-12-05  Gustaf Rydholm  Format files with black
2021-12-05  Gustaf Rydholm  Remove ViT once again
2021-12-05  Gustaf Rydholm  Update conv transformer with inheritance from base network
2021-12-05  Gustaf Rydholm  Add base network
2021-12-04  Gustaf Rydholm  Revert "Remove ViT"
2021-11-28  Gustaf Rydholm  Remove ViT
2021-11-28  Gustaf Rydholm  Add norm layer to output from decoder
2021-11-28  Gustaf Rydholm  Refactor attention layer module
2021-11-27  Gustaf Rydholm  Revert "Remove ViT"
2021-11-27  Gustaf Rydholm  Revert "Revert "Remove default transformer""
2021-11-27  Gustaf Rydholm  Revert "Remove default transformer"
2021-11-22  Gustaf Rydholm  Fix import efficientnet
2021-11-22  Gustaf Rydholm  Remove unused utils
2021-11-22  Gustaf Rydholm  Format conv transformer
2021-11-22  Gustaf Rydholm  Move efficientnet from encoder dir
2021-11-21  Gustaf Rydholm  Format files
2021-11-21  Gustaf Rydholm  Add axial encoder to conv transformer
2021-11-21  Gustaf Rydholm  Remove VQVAE stuff, did not work...
2021-11-21  Gustaf Rydholm  Add axial transformer
2021-11-17  Gustaf Rydholm  Update VqTransformer
2021-11-17  Gustaf Rydholm  Update encoder fun in conv_transformer
2021-11-17  Gustaf Rydholm  Update vqvae with new quantizer
2021-11-17  Gustaf Rydholm  Add new quantizer
2021-11-17  Gustaf Rydholm  Remove local attention
2021-11-17  Gustaf Rydholm  Replace last two layers in efficientnet with one
2021-11-17  Gustaf Rydholm  Format efficientnet
2021-11-07  Gustaf Rydholm  Fix dropout in efficientnet
2021-11-07  Gustaf Rydholm  Add dropout to final layer in efficientnet
2021-11-07  Gustaf Rydholm  Fix mbconv sub modules
2021-11-06  Gustaf Rydholm  Fix efficientnet incorrect channel calculation
2021-11-06  Gustaf Rydholm  Format efficientnet
2021-11-05  Gustaf Rydholm  Remove conv attention
2021-11-05  Gustaf Rydholm  Rename mask to input_mask
2021-11-05  Gustaf Rydholm  Remove out_channels as a settable parameter in effnet
2021-11-03  Gustaf Rydholm  Remove unused args from conv transformer
2021-11-03  Gustaf Rydholm  Remove pixelcnn
2021-11-03  Gustaf Rydholm  Fix output from attn modules
2021-11-03  Gustaf Rydholm  Fix local attn to work with any input length
2021-11-03  Gustaf Rydholm  Update output shape from attn module
2021-11-01  Gustaf Rydholm  Add check for positional encoding in attn layer
2021-11-01  Gustaf Rydholm  Fix bugs in local attention
2021-11-01  Gustaf Rydholm  Refactor transformer layer
2021-11-01  Gustaf Rydholm  Fix self attention module
2021-11-01  Gustaf Rydholm  Update rotary embedding
2021-10-28  Gustaf Rydholm  Update rotary embedding
2021-10-28  Gustaf Rydholm  Remove 2D positional embedding
2021-10-28  Gustaf Rydholm  Remove absolute positional embedding
2021-10-28  Gustaf Rydholm  Add check for position embedding