text-recognizer.git
A neural network that reads handwritten paragraphs.
Owner: Gustaf Rydholm
path: root / text_recognizer / networks / transformer
Age         Commit message                                          Author
2022-06-11  Update ff import                                      Gustaf Rydholm
2022-06-10  Fix check for pos emb                                 Gustaf Rydholm
2022-06-10  Add imports                                           Gustaf Rydholm
2022-06-10  Move mlp to ff                                        Gustaf Rydholm
2022-06-07  Add subsampler layer                                  Gustaf Rydholm
2022-06-05  Remove attrs                                          Gustaf Rydholm
2022-06-05  Update transformer init                               Gustaf Rydholm
2022-06-05  Fix kwargs                                            Gustaf Rydholm
2022-06-01  Replace attr with attrs                               Gustaf Rydholm
2022-02-03  chore: remove axial attention                         Gustaf Rydholm
2022-01-29  fix(decoder): typos                                   Gustaf Rydholm
2022-01-29  feat(norm): add prenorm                               Gustaf Rydholm
2022-01-29  feat: add RMSNorm                                     Gustaf Rydholm
2022-01-29  chore: remove residual block                          Gustaf Rydholm
2022-01-29  chore: remove old attention layer block               Gustaf Rydholm
2022-01-29  feat: add new transformer decoder                     Gustaf Rydholm
2022-01-26  fix: refactor AttentionLayers                         Gustaf Rydholm
2021-12-05  Remove ViT once again                                 Gustaf Rydholm
2021-12-04  Revert "Remove ViT"                                   Gustaf Rydholm
2021-11-28  Remove ViT                                            Gustaf Rydholm
2021-11-28  Refactor attention layer module                       Gustaf Rydholm
2021-11-27  Revert "Remove ViT"                                   Gustaf Rydholm
2021-11-27  Revert "Revert "Remove default transformer""          Gustaf Rydholm
2021-11-27  Revert "Remove default transformer"                   Gustaf Rydholm
2021-11-21  Format files                                          Gustaf Rydholm
2021-11-21  Add axial transformer                                 Gustaf Rydholm
2021-11-17  Remove local attention                                Gustaf Rydholm
2021-11-05  Rename mask to input_mask                             Gustaf Rydholm
2021-11-03  Fix output from attn modules                          Gustaf Rydholm
2021-11-03  Fix local attn to work with any input length          Gustaf Rydholm
2021-11-03  Update output shape from attn module                  Gustaf Rydholm
2021-11-01  Fix bugs in local attention                           Gustaf Rydholm
2021-11-01  Refactor transformer layer                            Gustaf Rydholm
2021-11-01  Fix self attention module                             Gustaf Rydholm
2021-11-01  Update rotary embedding                               Gustaf Rydholm
2021-10-28  Update rotary embedding                               Gustaf Rydholm
2021-10-28  Remove 2D positional embedding                        Gustaf Rydholm
2021-10-28  Remove absolute positional embedding                  Gustaf Rydholm
2021-10-28  Fix multihead local attention                         Gustaf Rydholm
2021-10-28  Fix bug with local attention                          Gustaf Rydholm
2021-10-28  Refactor attention module                             Gustaf Rydholm
2021-10-27  Add axial embedding                                   Gustaf Rydholm
2021-10-27  Remove ViT                                            Gustaf Rydholm
2021-10-27  Remove default transformer                            Gustaf Rydholm
2021-10-27  Add comments to transformer modules                   Gustaf Rydholm
2021-10-27  Clean up local attention, add comments and types      Gustaf Rydholm
2021-10-27  Add local attn in transformer layer                   Gustaf Rydholm
2021-10-27  Remove unused import and add comments in attn module  Gustaf Rydholm
2021-10-27  Remove imports from __init__ in transformer network   Gustaf Rydholm
2021-10-27  Rename transformer embeddings                         Gustaf Rydholm
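The 2022-01-29 commits above add RMSNorm and a prenorm wrapper to the new transformer decoder. As a minimal sketch of what those two techniques mean (the class names, signatures, and eps default below are illustrative assumptions, not this repository's actual code): RMSNorm rescales each feature vector by its root mean square instead of subtracting a mean, and a prenorm block normalizes the sublayer's input rather than its output.

    # Hedged sketch in PyTorch; not the repository's implementation.
    import torch
    from torch import Tensor, nn

    class RMSNorm(nn.Module):
        """Root-mean-square norm: rescale by RMS, no mean subtraction or bias."""

        def __init__(self, dim: int, eps: float = 1e-8) -> None:
            super().__init__()
            self.eps = eps
            self.weight = nn.Parameter(torch.ones(dim))  # learned per-feature gain

        def forward(self, x: Tensor) -> Tensor:
            # x / sqrt(mean(x^2) + eps), then scale by the learned gain.
            inv_rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).rsqrt()
            return x * inv_rms * self.weight

    class PreNorm(nn.Module):
        """Pre-norm residual wrapper: norm -> sublayer -> residual add."""

        def __init__(self, dim: int, fn: nn.Module) -> None:
            super().__init__()
            self.norm = RMSNorm(dim)
            self.fn = fn

        def forward(self, x: Tensor, **kwargs) -> Tensor:
            return x + self.fn(self.norm(x), **kwargs)

For example, PreNorm(256, nn.Linear(256, 256)) wraps a feed-forward sublayer so the residual path stays un-normalized, the property that generally makes deep pre-norm transformer decoders more stable to train than post-norm ones.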