From c51707edb0e5634b487b62fecb44c68f14f63638 Mon Sep 17 00:00:00 2001
From: Dirk Groeneveld
Date: Wed, 12 Oct 2022 11:02:20 -0700
Subject: [PATCH] Add a shout to allennlp-light to the README

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 9bc070b2189..3a4f395a7a4 100644
--- a/README.md
+++ b/README.md
@@ -29,6 +29,7 @@ AllenNLP has been a big success, but as the field is advancing quickly it's time to focus on new initiatives.
 We're working hard to make [AI2 Tango](https://github.com/allenai/tango) the best way to organize research codebases.
 If you are an active user of AllenNLP, here are some suggested alternatives:
 * If you like the trainer, the configuration language, or are simply looking for a better way to manage your experiments, check out [AI2 Tango](https://github.com/allenai/tango).
+* If you like AllenNLP's `modules` and `nn` packages, check out [delmaksym/allennlp-light](https://github.com/delmaksym/allennlp-light). It's even compatible with [AI2 Tango](https://github.com/allenai/tango)!
 * If you like the framework aspect of AllenNLP, check out [flair](https://github.com/flairNLP/flair). It has multiple state-of-art NLP models and allows you to easily use pretrained embeddings such as those from transformers.
 * If you like the AllenNLP metrics package, check out [torchmetrics](https://torchmetrics.readthedocs.io/en/stable/). It has the same API as AllenNLP, so it should be a quick learning curve to make the switch.
 * If you want to vectorize text, try [the transformers library](https://github.com/huggingface/transformers).