# wiki-llm

A simple text-based LLM trained on Wikipedia data. It uses the GPT-2 tokenizer and the base GPT-2 model, retrained (fine-tuned) for 10 epochs; the resulting weights are in `model_output_10_epochs.zip` and the training code is in `LLM-gpt.ipynb`.
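
The fine-tuning loop might look roughly like the following minimal sketch using the Hugging Face `transformers` library. This is an illustration, not the notebook's actual code: the input file name (`wiki.txt`), block size, and batch size are assumptions.

```python
# Hypothetical sketch of fine-tuning GPT-2 on a local Wikipedia text dump.
# Assumptions (not from this repo): input file "wiki.txt", block_size=512,
# batch size 4. Only the 10 epochs and GPT-2 base model come from the repo.

def chunk_tokens(ids, block_size):
    """Group a flat list of token ids into fixed-length training blocks,
    dropping the incomplete tail (standard causal-LM data preparation)."""
    usable = len(ids) // block_size * block_size
    return [ids[i:i + block_size] for i in range(0, usable, block_size)]

def finetune():
    import torch
    from transformers import (GPT2TokenizerFast, GPT2LMHeadModel,
                              Trainer, TrainingArguments)

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    text = open("wiki.txt", encoding="utf-8").read()  # assumed local dump
    ids = tokenizer(text)["input_ids"]
    blocks = chunk_tokens(ids, block_size=512)

    # For causal LM training, labels are the input ids themselves;
    # the model shifts them internally to predict the next token.
    dataset = [{"input_ids": torch.tensor(b), "labels": torch.tensor(b)}
               for b in blocks]

    args = TrainingArguments(output_dir="model_output",
                             num_train_epochs=10,
                             per_device_train_batch_size=4)
    Trainer(model=model, args=args, train_dataset=dataset).train()
    model.save_pretrained("model_output")

# Call finetune() to run; it downloads the base GPT-2 weights on first use.
```

`chunk_tokens` is the data-prep step: it discards the trailing partial block so every training example has a uniform length and no padding is needed.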