# Text Recognizer
An implementation of the text recognizer project from the ["Full Stack Deep Learning Course"](https://fullstackdeeplearning.com/march2019) (FSDL) in PyTorch, built in order to learn best practices for structuring a deep learning project. I have expanded on the project with additional features and ideas presented by Claudio Jolowicz in ["Hypermodern Python"](https://cjolowicz.github.io/posts/hypermodern-python-01-setup/).


## Setup

TBC


### Build word piece dataset

Extract text from the IAM dataset:
```
poetry run extract-iam-text --use_words --save_text train.txt --save_tokens letters.txt
```
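
The command writes the extracted transcriptions to `train.txt` and the character tokens to `letters.txt`. As a rough sketch of what word-level extraction looks like, assuming the standard IAM ASCII annotation file (`words.txt`); the actual `extract-iam-text` implementation may differ:
```
# Hypothetical sketch only; paths and helper names are illustrative,
# not the project's actual implementation.
from pathlib import Path


def extract_words(words_file: Path) -> list[str]:
    """Read word transcriptions from the IAM ASCII annotation file."""
    words = []
    for line in words_file.read_text().splitlines():
        if line.startswith("#"):  # skip header comments
            continue
        fields = line.split()
        if fields[1] != "ok":  # keep only correctly segmented words
            continue
        words.append(" ".join(fields[8:]))  # transcription is the trailing field
    return words


if __name__ == "__main__":
    words = extract_words(Path("iamdb/ascii/words.txt"))
    Path("train.txt").write_text("\n".join(words))
    Path("letters.txt").write_text("\n".join(sorted({c for w in words for c in w})))
```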

Create word pieces from the extracted training text:
```
poetry run make-wordpieces --output_prefix iamdb_1kwp --text_file train.txt --num_pieces 1000
```
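
Here `make-wordpieces` is assumed to train a subword vocabulary over the extracted text, similar in spirit to a SentencePiece unigram model. The snippet below is only an illustrative sketch of that idea, not the project's actual implementation:
```
# Illustrative sketch, assuming a SentencePiece-style unigram model; the real
# make-wordpieces command may use a different algorithm and file format.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="train.txt",          # text extracted in the previous step
    model_prefix="iamdb_1kwp",  # matches --output_prefix
    vocab_size=1000,            # matches --num_pieces
    model_type="unigram",
)

# Encode a sample target string into word pieces.
sp = spm.SentencePieceProcessor(model_file="iamdb_1kwp.model")
print(sp.encode("move to stop", out_type=str))
```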

Optionally, build a transition graph for word pieces:
```
poetry run build-transitions --tokens iamdb_1kwp_tokens_1000.txt --lexicon iamdb_1kwp_lex_1000.txt --blank optional --self_loops --save_path 1kwp_prune_0_10_optblank.bin --prune 0 10
```
(TODO: not working at the moment; needed for the GTN loss function.)
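
The transition graph is meant to feed a GTN-based transducer loss. A minimal, self-contained sketch of the underlying idea, assuming the `gtn` Python bindings (this is not the project's actual loss function):
```
# Compose a linear "emissions" graph with a transition graph and take the
# forward score. Illustration only; the real loss lives in the training code.
import gtn

num_tokens = 3
num_frames = 2

# Emissions: a chain with one node per frame and an arc per token (log-probs).
emissions = gtn.Graph(calc_grad=False)
emissions.add_node(True)                            # start node
for f in range(num_frames):
    emissions.add_node(False, f == num_frames - 1)  # last node accepts
    for token in range(num_tokens):
        emissions.add_arc(f, f + 1, token, token, -1.0)

# Transitions: a single-state graph that allows any token (weights would
# normally come from the graph built by build-transitions).
transitions = gtn.Graph(calc_grad=False)
transitions.add_node(True, True)
for token in range(num_tokens):
    transitions.add_arc(0, 0, token, token, 0.0)

# Constrain emissions by transitions and compute the negative forward score.
loss = gtn.negate(gtn.forward_score(gtn.compose(emissions, transitions)))
print(loss.item())
```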

## Todo
- [x] Create word pieces
  - [x] make_wordpieces.py
  - [x] build_transitions.py
  - [x] Transform that encodes IAM targets to word pieces
  - [x] Transducer loss function
- [ ] Train with word pieces
- [ ] Local attention in the first layer of the transformer
- [ ] HaloNet encoder
- [ ] Implement CPC
  - [ ] https://arxiv.org/pdf/1905.09272.pdf
  - [ ] https://pytorch-lightning-bolts.readthedocs.io/en/latest/self_supervised_models.html?highlight=byol


- [ ] Predictive coding
  - https://arxiv.org/pdf/1807.03748.pdf
  - https://arxiv.org/pdf/1904.05862.pdf
  - https://arxiv.org/pdf/1910.05453.pdf
  - https://blog.evjang.com/2016/11/tutorial-categorical-variational.html


## Run Sweeps
Run the following commands to launch a hyperparameter sweep with Weights & Biases (W&B):

```
wandb sweep training/sweep_emnist_resnet.yml
export SWEEP_ID=...
wandb agent $SWEEP_ID
```
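
`wandb sweep` prints the sweep ID that `wandb agent` consumes. The sweep configuration itself is a standard W&B sweep file; the exact contents of `training/sweep_emnist_resnet.yml` are not reproduced here, but such a file generally looks like the illustrative example below (the program path and parameter names are assumptions):
```
# Illustrative example of the W&B sweep config format; the actual
# training/sweep_emnist_resnet.yml will differ in program and parameters.
program: training/run_experiment.py   # hypothetical entry point
method: bayes
metric:
  name: val_loss
  goal: minimize
parameters:
  learning_rate:
    min: 0.0001
    max: 0.01
  batch_size:
    values: [32, 64, 128]
```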