I’ve been trying out the new Claude-code challenge, which I was excited about. I tried to run the code locally, but it seems the OpenRouter service is paid and unfortunately I can’t afford it right now. Is there a way to use another LLM service with a free tier? (At least locally, so I can test and try my code before committing.)
Most LLM providers (local or hosted) support an OpenAI-compatible API, so small changes to your code should get it working against any other provider. I haven’t tried this myself yet, though!
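For example, once a local server is up you could sanity-check its OpenAI-compatible endpoint with a raw request like this (a sketch only; the port, model name, and prompt are assumptions based on the Ollama setup below):

```shell
# Minimal probe of an OpenAI-compatible chat endpoint (assumes Ollama is
# running on localhost:11434 with the gemma3 model already pulled).
PAYLOAD='{"model": "gemma3", "messages": [{"role": "user", "content": "Say hello"}]}'
curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD" || echo "Ollama not reachable on :11434"
```

If the server is up, the response is a JSON chat completion, same shape as OpenRouter/OpenAI would return.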
Run Ollama locally: `docker run -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama`
Pull the model: `docker exec -it ollama ollama pull gemma3`
By the way, gemma3 does not support tools, which are required for the next steps in the challenge, so I’m exploring other models.
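If it helps: the Ollama model library tags some models with tool support; qwen2.5 and llama3.1 are two I’ve seen listed, though I haven’t verified them in this challenge. Swapping one in would look like:

```shell
# Pull a tool-capable model instead of gemma3 (assumption: qwen2.5 is
# tagged with tool support in the Ollama library; substitute any other).
MODEL=qwen2.5
docker exec ollama ollama pull "$MODEL" || echo "docker/ollama not available here"
```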
Set the env vars (the API key may have any value, since local Ollama doesn’t authenticate): `OPENROUTER_API_KEY=ollama` and `OPENROUTER_BASE_URL=http://localhost:11434/v1`
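Putting that together, the environment setup might look like this (variable names taken from the steps above; the key value is arbitrary):

```shell
# Point the challenge code at local Ollama instead of OpenRouter.
# The API key can be any non-empty string; local Ollama does not check it.
export OPENROUTER_API_KEY=ollama
export OPENROUTER_BASE_URL=http://localhost:11434/v1
```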