Claude Code API Token

Hello everyone!

I’ve been trying out the new Claude Code challenge, which I was excited for. I tried to run the code locally, but it seems the OpenRouter service is paid and unfortunately I can’t afford it right now. Is there a way to use another LLM service with a free tier? (At least locally, so I can test and try my code before committing.)

Thanks!!

Most LLM providers (local or hosted) support an OpenAI-compatible API, so small changes to your code should get it working against any other provider. I haven’t tried this myself yet, though!
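For example, here’s a minimal sketch of what that change might look like. This assumes the challenge code reads the `OPENROUTER_API_KEY` / `OPENROUTER_BASE_URL` env vars mentioned later in this thread, and that it uses an OpenAI-compatible client; adjust the names to whatever your code actually reads.

```python
import os

def build_client_config():
    """Sketch: assemble OpenAI-compatible client settings from env vars.

    These env var names are assumptions based on this thread, not part of
    any official API -- rename them to match your own code.
    """
    return {
        # Local servers like Ollama ignore the key, but most clients
        # still require a non-empty string.
        "api_key": os.environ.get("OPENROUTER_API_KEY", "ollama"),
        # Point this at any OpenAI-compatible endpoint
        # (local Ollama shown as the default here).
        "base_url": os.environ.get("OPENROUTER_BASE_URL",
                                   "http://localhost:11434/v1"),
    }

cfg = build_client_config()
# With the official `openai` package you would then do something like:
#   client = openai.OpenAI(**cfg)
#   client.chat.completions.create(model="gemma3", messages=[...])
```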

Thanks for your reply!

Yeah, I ended up running Ollama locally using Docker and a gemma3 model.


@tomasdepi nice! What changes did you have to make? I assume just pointing the base url to localhost + changing the model name worked?

here are the commands:

  1. Run ollama locally
    docker run -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

  2. Pull the model
    docker exec -it ollama ollama pull gemma3
    btw, gemma3 does not support tools, which are required for the next steps in the challenge, so I’m exploring other models

  3. Set the env vars (the API key can be any value, since local Ollama doesn’t authenticate)
    OPENROUTER_API_KEY=ollama
    OPENROUTER_BASE_URL=http://localhost:11434/v1

  4. Change the model name
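Putting the steps above together, here’s a rough sketch of a chat request against the local endpoint, using only the Python standard library. Nothing is sent until you actually call `urlopen`, so it only works end-to-end with Ollama running as in step 1; `gemma3` is the model pulled in step 2.

```python
import json
import urllib.request

# Base URL from step 3 -- local Ollama's OpenAI-compatible endpoint.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model, prompt):
    """Build (but don't send) an OpenAI-style chat-completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Any value works here: local Ollama does not check auth.
            "Authorization": "Bearer ollama",
        },
    )

req = build_chat_request("gemma3", "Say hello")
# With the container from step 1 running, you could then do:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```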

2 posts were split to a new topic: Error code 134 in TypeScript