An autocompleter for code editors based on OpenAI GPT-2.
🏠 Homepage
Galois is an auto code completer for code editors (or any text editor) based on OpenAI GPT-2. It is trained (finetuned) on a curated list of approximately 45K Python files (~470MB) gathered from GitHub. Currently, it works best on Python, but it is not bad at other languages either (thanks to GPT-2's power).
This repository now contains the very first release of the Galois Project. With this project, I aim to create a deep-learning-based autocompleter that anyone can easily run on their own computer. Thus, coding will be easier and more fun!
We strongly recommend running Galois on a GPU, as response times are significantly faster than on a CPU.
To run on a GPU, you need an environment with the necessary NVIDIA/CUDA libraries installed, as described in the official NVIDIA Guide.
If Galois cannot reach the NVIDIA/CUDA libraries, or if you are running on a machine without a GPU, it falls back to the CPU, so the same steps work for CPU-only setups.
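Before installing, you can check whether your environment actually sees the GPU. A minimal sketch, assuming the TensorFlow backend that the GPT-2 checkpoints use:

```python
import tensorflow as tf

# Reports whether a CUDA-capable GPU is visible to TensorFlow;
# if this prints False, Galois will fall back to the CPU.
print("GPU available:", tf.test.is_gpu_available())
```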
For building the image, run the following command:
docker build https://github.com/GabrielTamujo/galois-autocompleter.git -t galois/nvidia
Once the image is built, you need a model to run. Steps for finetuning your own model are covered later in this file, but you can download and uncompress the default Galois model with the following command:
curl -SL https://github.com/iedmrc/galois-autocompleter/releases/latest/download/model.tar.xz | tar -xJC ./opt
Run the container with the following command. Note that you need to mount the model you wish to run as a volume.
docker run --name galois_autocompleter --hostname galois_autocompleter --runtime nvidia -dit -p 3030:3030 --volume ./model:/galois/model galois/nvidia
Clone the repository:
git clone https://github.com/iedmrc/galois-autocompleter
Download the latest Galois model from the releases page (or whichever model you wish to run) and uncompress it into the directory:
curl -SL https://github.com/iedmrc/galois-autocompleter/releases/latest/download/model.tar.xz | tar -xJC ./galois-autocompleter
Install dependencies:
pip3 install -r requirements.txt
Run the autocompleter:
python3 main.py
Currently, there are no extensions for code editors, but you can use Galois over HTTP. When you run main.py, it serves an HTTP (Flask) server. You can then make a POST request to http://localhost:3030/autocomplete with a JSON body like the following:
{"text": "your python code goes here"}
An example curl command:
curl -X POST \
http://localhost:3030/autocomplete \
-H 'Content-Type: application/json' \
-d '{"text":"import os\nimport sys\n# Count lines of codes in the given directory, separated by file extension.\ndef main(directory):\n line_count = {}\n for filename in os.listdir(directory):\n _, ext = os.path.splitext(filename)\n if ext not"}'
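The same request from Python, as a rough sketch (the requests package and the response handling below are illustrative assumptions, not part of this repository):

```python
import requests

# Code context to complete; the model continues from the end of this snippet.
payload = {"text": "import os\nimport sys\ndef main(directory):\n    "}

# Assumes the server started by main.py is listening on port 3030.
response = requests.post("http://localhost:3030/autocomplete", json=payload)

# The exact response schema may vary between releases; inspect the raw JSON.
print(response.json())
```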
Check out the gist here for a docker-compose file.
You can even finetune (re-train) the model on your own code files. Just follow Max Woolf's gpt-2-simple or Neil Shepperd's gpt-2 repository with the 345M version, but don't forget to replace the checkpoint (model) with the one from this repository.
You can train it on Google Colaboratory for free, but if you need a production-grade (i.e. more accurate) model, you may need to train it for much longer. In my case, it took ~48 hours on a P100 GPU.
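A minimal finetuning sketch using Max Woolf's gpt-2-simple package; the dataset filename, run name, and step count below are illustrative assumptions, and to build on Galois you would place its checkpoint under checkpoint/<run_name> rather than starting from the vanilla 345M weights:

```python
import gpt_2_simple as gpt2

# Download the base 345M GPT-2 weights (skip this if you start from the
# Galois checkpoint placed under checkpoint/<run_name>).
gpt2.download_gpt2(model_name="345M")

sess = gpt2.start_tf_sess()

# "python_corpus.txt" is a placeholder for your concatenated code files;
# the step count is illustrative and should be tuned to your dataset size.
gpt2.finetune(sess,
              dataset="python_corpus.txt",
              model_name="345M",
              run_name="galois",
              steps=1000)
```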
- Train the model to predict in the most common programming languages.
- Create extensions for the most common code editors to use Galois as an autocompleter.
- Create a new, more lightweight but powerful model so that anyone can run it on their computer easily.
Contributions are welcome. Feel free to create an issue or a pull request.
👤 Ibrahim Ethem DEMIRCI
Twitter: @iedmrc | Github: @iedmrc | Patreon: @iedmrc
Ibrahim's open-source projects are supported by his Patreon. If you found this project helpful, any monetary contributions to the Patreon are appreciated and will be put to good creative use.
This project is licensed under the MIT License, as found in the LICENSE file.
This repo has no affiliation or relationship with OpenAI.