Hello everyone!
I’ve been trying out the new Claude-code challenge, which I was excited for. I tried to run the code locally, but it seems the OpenRouter service is paid, and unfortunately I can’t afford it right now. Is there a way to use another LLM service with a free tier? (At least locally, so I can test and try my code before committing.)
Thanks!!
Most LLM providers (local or hosted) expose an OpenAI-compatible API, so I think small changes to your code would get it working against another provider. Haven’t tried this myself yet though!
Thanks for your reply!
Yeah, I ended up running Ollama locally in Docker with a gemma3 model.
@tomasdepi nice! What changes did you have to make? I assume just pointing the base url to localhost + changing the model name worked?
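For anyone else hitting this, here’s a rough sketch of what that change can look like, using only the standard library. Assumptions (not from the thread): Ollama’s OpenAI-compatible endpoint on its default port 11434, and a model name (`gemma3`) matching one you’ve already pulled.

```python
import json
import urllib.request

# Assumption: Ollama's default OpenAI-compatible endpoint; swap in any
# other provider's base URL (e.g. OpenRouter's) the same way.
BASE_URL = "http://localhost:11434/v1"


def build_chat_request(prompt, model="gemma3"):
    """Build a POST request for an OpenAI-compatible /chat/completions call."""
    payload = {
        "model": model,  # must match a model pulled with `ollama pull`
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def chat(prompt, model="gemma3"):
    """Send the request and return the assistant's reply text."""
    with urllib.request.urlopen(build_chat_request(prompt, model)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Usage (needs a running `ollama serve` with the model pulled):
#   print(chat("Hello!"))
```

So yes, in most cases it really is just the base URL and the model name; the request/response shapes stay the same across OpenAI-compatible providers.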