@sierra i think anything less than 1 billion parameters on ollama will train within a reasonable amount of time. but people mostly just use markov bots and gpt2 for that stuff
@sierra yeah it's pretty shit for actual things, but if you just need to make posts you can use that, it's a bit more advanced than markov.
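a markov bot is basically just this, btw (rough sketch, the corpus and settings are just placeholder values, swap in your own posts):
```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    # map each tuple of `order` words to the words that can follow it
    chain = defaultdict(list)
    words = text.split()
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=30):
    key = random.choice(list(chain.keys()))
    out = list(key)
    for _ in range(length):
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

corpus = "example post text goes here " * 50  # placeholder corpus
print(generate(build_chain(corpus)))
```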
i don't really know how fine-tuning LLMs would work tho
@sierra probably a good place to start would be the huggingface or ollama docs
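from what i can tell, the huggingface route looks roughly like this (just a sketch, not tested — the model name, "posts.txt" data file, output dir and hyperparams are all placeholders):
```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"  # placeholder, any small causal LM should work
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# "posts.txt" is a placeholder: one training post per line
dataset = load_dataset("text", data_files={"train": "posts.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)  # causal LM, not masked

args = TrainingArguments(
    output_dir="finetuned-post-bot",   # placeholder output dir
    per_device_train_batch_size=4,
    num_train_epochs=3,
    learning_rate=5e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
trainer.save_model("finetuned-post-bot")
```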
@sierra i was wrong by accident: there is no Ministral 1B, and even the 3B is commercial-only. fuck. anyways
i don’t think you’ll be able to avoid Python, sadly, as the two biggest open-source frameworks for training AI models are PyTorch and TensorFlow, which are both Python…
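like, the raw PyTorch side of it is basically just a loop like this (sketch only, the model and data are stand-ins to show the shape of it):
```python
import torch
import torch.nn as nn

# stand-in model and fake data, just to show what a PyTorch training loop looks like
model = nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(32, 10)          # fake batch of inputs
y = torch.randint(0, 2, (32,))   # fake labels

for step in range(100):
    optimizer.zero_grad()        # clear old gradients
    loss = loss_fn(model(x), y)  # forward pass + loss
    loss.backward()              # backprop
    optimizer.step()             # update weights
```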
as for how, i’m looking into it right now; i haven’t done it since GPT-2, but i’m very curious! :)