From f4688482b4898c0b342d6ae59839dc27fbf856c6 Mon Sep 17 00:00:00 2001
From: Gustaf Rydholm
Date: Thu, 13 May 2021 23:02:42 +0200
Subject: Remove bloat packages

---
 README.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index ed93955..1c6c0ef 100644
--- a/README.md
+++ b/README.md
@@ -11,24 +11,24 @@ TBC
 Extract text from the iam dataset:
 ```
-poetry run python extract-iam-text --use_words --save_text train.txt --save_tokens letters.txt
+python extract-iam-text --use_words --save_text train.txt --save_tokens letters.txt
 ```
 
 Create word pieces from the extracted training text:
 ```
-poetry run python make-wordpieces --output_prefix iamdb_1kwp --text_file train.txt --num_pieces 100
+python make-wordpieces --output_prefix iamdb_1kwp --text_file train.txt --num_pieces 100
 ```
 
 Optionally, build a transition graph for word pieces:
 ```
-poetry run python build-transitions --tokens iamdb_1kwp_tokens_1000.txt --lexicon iamdb_1kwp_lex_1000.txt --blank optional --self_loops --save_path 1kwp_prune_0_10_optblank.bin --prune 0 10
+python build-transitions --tokens iamdb_1kwp_tokens_1000.txt --lexicon iamdb_1kwp_lex_1000.txt --blank optional --self_loops --save_path 1kwp_prune_0_10_optblank.bin --prune 0 10
 ```
 (TODO: Not working atm, needed for GTN loss function)
 
 ## Todo
 - [ ] Reimplement transformer from scratch
-- [ ] Implement Nyström attention (for efficient attention)
-- [ ] Dino
+- [x] Implement Nyström attention (for efficient attention)
+- [ ] Implement Dino
 - [ ] Efficient-net b0 + transformer decoder
 - [ ] Test encoder pre-training ViT (CvT?) with Dino, then train decoder in a separate step
--
cgit v1.2.3-70-g09d2