From 76098a8da9731dd7cba1a7334ad9ae8a2acc760e Mon Sep 17 00:00:00 2001
From: Gustaf Rydholm
Date: Thu, 3 Feb 2022 21:40:50 +0100
Subject: chore: update readme

---
 README.md | 34 +++++++++++++++++++++++++++++++++-
 1 file changed, 33 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 6e06a74..a2a4a41 100644
--- a/README.md
+++ b/README.md
@@ -28,8 +28,40 @@ make download
 make generate
 ```
 
+## Train
+
+Use, modify, or create a new experiment config under `training/conf/experiment/`.
+To run an experiment, first enter the virtual environment:
+
+```sh
+poetry shell
+```
+
+Then train a new model by running:
+
+```sh
+python main.py +experiment=conv_transformer_paragraphs
+```
+
+## Network
+
+Create a picture of the network and place it here
+
+## Graveyard
+
+Ideas of mine that unfortunately did not work:
+
+* Use a VQ-VAE to pre-train a good latent representation
+  - Tests with various compression rates showed no performance increase over training directly end-to-end; more like a decrease, to be honest
+  - This is very unfortunate, as I really hoped this idea would work :(
+  - I still really like this idea, and I might not have given up on it just yet...
+
+
+* Axial Transformer encoder
+  - Added a lot of extra parameters with no gain in performance
+  - Cool idea, but on a single GPU, nah... not worth it!
 
-## TODO
+## Todo
 
 - [ ] remove einops
-- 
cgit v1.2.3-70-g09d2
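
A note on the `+experiment=` flag in the Train section: the leading `+` is Hydra's append-override syntax, which adds the named file from `training/conf/experiment/` to the composed config. The same composition can be done from Python; the sketch below assumes a config root at `training/conf` and a primary config named `config`, neither of which is confirmed by the patch itself.

```python
# Sketch only. Assumes Hydra >= 1.2, a config root at training/conf,
# and a primary config named "config"; none of these are confirmed
# by the patch itself.
from hydra import compose, initialize

with initialize(version_base=None, config_path="training/conf"):
    # The leading "+" appends the experiment group to the composed
    # config, mirroring `python main.py +experiment=...` on the CLI.
    cfg = compose(
        config_name="config",
        overrides=["+experiment=conv_transformer_paragraphs"],
    )
    print(cfg)
```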
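
For context on the VQ-VAE idea in the Graveyard section: a VQ-VAE snaps each encoder output to the nearest entry of a learned codebook and passes gradients around the non-differentiable lookup with a straight-through estimator (van den Oord et al., 2017). Below is a minimal PyTorch sketch of the quantization bottleneck, illustrative only and not the code this repo actually tested.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorQuantizer(nn.Module):
    """Nearest-neighbour codebook lookup with a straight-through estimator."""

    def __init__(self, num_codes: int = 512, dim: int = 64, beta: float = 0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)
        self.codebook.weight.data.uniform_(-1 / num_codes, 1 / num_codes)
        self.beta = beta  # weight of the commitment term

    def forward(self, z: torch.Tensor):  # z: (batch, ..., dim)
        flat = z.reshape(-1, z.shape[-1])
        # Squared L2 distance from every latent vector to every code.
        dist = (
            flat.pow(2).sum(1, keepdim=True)
            - 2 * flat @ self.codebook.weight.t()
            + self.codebook.weight.pow(2).sum(1)
        )
        codes = dist.argmin(dim=1)
        z_q = self.codebook(codes).reshape(z.shape)
        # Codebook loss pulls codes toward encoder outputs; the commitment
        # term keeps encoder outputs close to their assigned codes.
        loss = F.mse_loss(z_q, z.detach()) + self.beta * F.mse_loss(z, z_q.detach())
        # Straight-through: gradients flow to z as if quantization were identity.
        z_q = z + (z_q - z).detach()
        return z_q, loss
```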
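
Likewise, for the Axial Transformer encoder: axial attention factorizes full 2-D self-attention over an H×W feature map into attention along rows and then along columns, reducing cost from O((HW)²) to roughly O(HW(H + W)) at the price of separate attention blocks per axis, which matches the extra-parameter complaint in the Graveyard. A rough sketch, again illustrative only and not the repo's implementation:

```python
import torch
import torch.nn as nn

class AxialAttention2d(nn.Module):
    """Row attention followed by column attention over a (B, H, W, D) map."""

    def __init__(self, dim: int = 128, heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor):  # x: (B, H, W, D)
        b, h, w, d = x.shape
        # Attend along each row: fold the height axis into the batch.
        rows = x.reshape(b * h, w, d)
        rows, _ = self.row_attn(rows, rows, rows)
        x = rows.reshape(b, h, w, d)
        # Attend along each column: fold the width axis into the batch.
        cols = x.transpose(1, 2).reshape(b * w, h, d)
        cols, _ = self.col_attn(cols, cols, cols)
        x = cols.reshape(b, w, h, d).transpose(1, 2)
        return x
```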