text-recognizer.git
A neural network that reads handwritten paragraphs.
Gustaf Rydholm
path: root/text_recognizer/networks
Age         Commit message                                          Author
2021-11-03  Remove pixelcnn                                         Gustaf Rydholm
2021-11-03  Fix output from attn modules                            Gustaf Rydholm
2021-11-03  Fix local attn to work with any input length            Gustaf Rydholm
2021-11-03  Update output shape from attn module                    Gustaf Rydholm
2021-11-01  Add check for positional encoding in attn layer         Gustaf Rydholm
2021-11-01  Fix bugs in local attention                             Gustaf Rydholm
2021-11-01  Refactor transformer layer                              Gustaf Rydholm
2021-11-01  Fix self attention module                               Gustaf Rydholm
2021-11-01  Update rotary embedding                                 Gustaf Rydholm
2021-10-28  Update rotary embedding                                 Gustaf Rydholm
2021-10-28  Remove 2D positional embedding                          Gustaf Rydholm
2021-10-28  Remove absolute positional embedding                    Gustaf Rydholm
2021-10-28  Add check for position embedding                        Gustaf Rydholm
2021-10-28  Fix multihead local attention                           Gustaf Rydholm
2021-10-28  Fix bug with local attention                            Gustaf Rydholm
2021-10-28  Refactor attention module                               Gustaf Rydholm
2021-10-27  Remove Barlow Twins                                     Gustaf Rydholm
2021-10-27  Add axial embedding                                     Gustaf Rydholm
2021-10-27  Remove ViT                                              Gustaf Rydholm
2021-10-27  Remove default transformer                              Gustaf Rydholm
2021-10-27  Add comments to transformer modules                     Gustaf Rydholm
2021-10-27  Clean up local attention, add comments and types        Gustaf Rydholm
2021-10-27  Add local attn in transformer layer                     Gustaf Rydholm
2021-10-27  Remove unused import and add comments in attn module    Gustaf Rydholm
2021-10-27  Remove imports from __init__ in transformer network     Gustaf Rydholm
2021-10-27  Rename transformer embeddings                           Gustaf Rydholm
2021-10-24  Add local attention                                     Gustaf Rydholm
2021-10-24  Remove unused import in conv transformer                Gustaf Rydholm
2021-10-24  Refator attention                                       Gustaf Rydholm
2021-10-24  Format of files                                         Gustaf Rydholm
2021-10-11  Add BarlowNetwork                                       Gustaf Rydholm
2021-10-10  Remove imports                                          Gustaf Rydholm
2021-10-07  Lint                                                    Gustaf Rydholm
2021-10-07  Add Barlow Twins network and training proceduer         Gustaf Rydholm
2021-10-03  Refactor rotary embedding                               Gustaf Rydholm
2021-10-03  Fix rotary embedding                                    Gustaf Rydholm
2021-10-03  Lint files                                              Gustaf Rydholm
2021-09-30  Major bug fix in attention layer                        Gustaf Rydholm
2021-09-30  Bug fix for loading pretrained vq encoder               Gustaf Rydholm
2021-09-30  Set leaky relu inplace to false                         Gustaf Rydholm
2021-09-30  Rename context to trg in transformer                    Gustaf Rydholm
2021-09-30  Rename vqloss to commitment loss in vqvae network       Gustaf Rydholm
2021-09-30  Refactor residual block                                 Gustaf Rydholm
2021-09-30  Add num res blocks as a variable                        Gustaf Rydholm
2021-09-19  Add loading of encoder in vq transformer network        Gustaf Rydholm
2021-09-19  Add ability to set use norm in vqvae                    Gustaf Rydholm
2021-09-18  Add comment for quantization                            Gustaf Rydholm
2021-09-18  Remove TODO in conv transformer                         Gustaf Rydholm
2021-09-18  Add Vq transfomer model                                 Gustaf Rydholm
2021-09-18  Add more residual blocks in vqvae                       Gustaf Rydholm