
Add OpenRouter as a supported provider #29

Open · wants to merge 3 commits into main from openrouter-support
Conversation

@khasinski commented Mar 16, 2025

Addresses #8

This adds support for OpenRouter.

You can use it by configuring the provider with:

RubyLLM.configure do |config|
  config.openrouter_api_key = 'your key'
end

Then call models by their OpenRouter ids. (It might be wise to prefix those with openrouter or something similar, in case more multi-model providers are added in the future.)

chat = RubyLLM.chat(model: "google/gemini-2.0-flash-lite-preview-02-05:free")
chat.ask "What's the best way to learn Ruby?"

OpenRouter doesn't support embedding models and image generation so those are not implemented here.

Getting model details requires multiple calls to the OpenRouter API, since it doesn't expose function-calling and structured-output support in the model list response. However, it does allow filtering models by those capabilities. I memoized those calls, so subsequent model lookups are faster.
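The memoization described above can be sketched roughly like this (a minimal illustration, not the PR's actual code; the class and the stubbed fetcher are hypothetical):

```ruby
# Caches the per-capability model lists so each filtered API call
# happens at most once per process.
class CapabilityCache
  def initialize(fetcher)
    @fetcher = fetcher # callable that queries the models endpoint with a capability filter
    @cache = {}
  end

  # Returns the model ids supporting a capability, fetching only on the first lookup.
  def model_ids(capability)
    @cache[capability] ||= @fetcher.call(capability)
  end
end

calls = 0
fetcher = lambda do |_capability|
  calls += 1
  ["google/gemini-2.0-flash-lite-preview-02-05:free"] # stubbed API response
end

cache = CapabilityCache.new(fetcher)
cache.model_ids(:tools)
cache.model_ids(:tools)
puts calls # => 1
```

The second lookup hits the in-memory hash instead of the API, which is what makes repeated model fetching faster.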

The code is based on the OpenAI provider, since the two share most of their API, but I've kept it separate since they might diverge in the future.

A small note: this adds a lot of data to models.json. It might be wise to generate that file on first use instead of bundling it with the gem, or to document how to refresh it, as the Ollama PR does (#10).

@khasinski khasinski force-pushed the openrouter-support branch 2 times, most recently from 1c1bfbe to 399c9eb Compare March 16, 2025 14:13
@khasinski khasinski changed the title Add OpenRouter integration Add OpenRouter as a supported provider Mar 16, 2025
@crmne (Owner) commented Mar 17, 2025

This is an excellent PR! You've clearly understood RubyLLM's design philosophy and implemented OpenRouter support in a way that fits seamlessly with the rest of the library.

I'll test it out and do a comprehensive code review soon. Your concern about models.json is valid - we need a better long-term solution. Requiring users to refresh models has its own issues (filesystem access, rate limits), so maybe we need something like a CDN that we update via GitHub Actions.

For multi-provider model selection, we might want to introduce a provider: parameter to .chat and a .with_provider(:sym) method to let users choose explicitly, defaulting to each model's original provider. This would let us cleanly handle cases where multiple providers offer the same model.
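The proposed selection API could look roughly like the following. This is a hypothetical sketch of the interface being discussed, not RubyLLM's implementation; the Chat class and the default-provider heuristic here are made up for illustration:

```ruby
# Sketch of explicit provider selection with a sensible default.
class Chat
  attr_reader :model, :provider

  def initialize(model:, provider: nil)
    @model = model
    @provider = provider || default_provider_for(model)
  end

  # Lets users override the provider after construction.
  def with_provider(sym)
    @provider = sym
    self
  end

  private

  # Illustrative heuristic only: OpenRouter ids contain a vendor prefix like "google/".
  def default_provider_for(model)
    model.include?("/") ? :openrouter : :openai
  end
end

chat = Chat.new(model: "google/gemini-2.0-flash-lite-preview-02-05:free")
puts chat.provider # => openrouter
puts chat.with_provider(:openrouter).provider # => openrouter
```

Returning self from with_provider keeps the method chainable, matching the fluent style of the rest of the library.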

@crmne crmne mentioned this pull request Mar 17, 2025
@khasinski (Author) commented Mar 17, 2025

Models.json issue

Should I propose some provider options? If so, should I do it in another PR or extend this one? I think the Ollama PR already has some progress here, introducing a concept of enabled/disabled providers and fetching models on demand, but of course we can do the CDN option as well :)

Python's AI ecosystem often downloads config/model files on first run (and some Ruby gems follow this pattern, for example the Tokenizers gem), so it wouldn't be unusual to fetch models.json during initialization. We could also leverage GitHub or Hugging Face to host that models.json.
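A fetch-on-first-use registry along those lines could be sketched like this (a minimal illustration of the idea; the class, path, and fetcher are hypothetical, not code from this PR):

```ruby
require "json"
require "fileutils"
require "tmpdir"

# Lazily downloads models.json on first access, then serves it from disk.
class ModelRegistry
  def initialize(path:, fetcher:)
    @path = path
    @fetcher = fetcher # callable returning the models JSON as a string
  end

  def models
    @models ||= JSON.parse(cached_json)
  end

  private

  def cached_json
    return File.read(@path) if File.exist?(@path)

    @fetcher.call.tap do |json|
      FileUtils.mkdir_p(File.dirname(@path))
      File.write(@path, json) # persist so later runs skip the network
    end
  end
end

path = File.join(Dir.mktmpdir, "models.json")
registry = ModelRegistry.new(path: path, fetcher: -> { '[{"id":"gpt-4o"}]' })
puts registry.models.first["id"] # => gpt-4o
puts File.exist?(path) # => true
```

This keeps the gem itself small while still giving offline runs a local copy after the first fetch; it does, however, reintroduce the filesystem-access concern mentioned above.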

Rate limits and depending on third-party APIs

One thing I've noticed about OpenRouter is that its models API is public and doesn't require any keys, which simplifies the implementation a bit. However, it's also somewhat undocumented: the supported capabilities aren't mentioned in the docs, but they are used by the OpenRouter website.

@khasinski khasinski force-pushed the openrouter-support branch from 6a18236 to 7ccdea5 Compare March 19, 2025 21:01
@khasinski (Author) commented

Rebased to resolve a conflict in README.md

@crmne crmne added the new provider New provider integration label Mar 21, 2025
@crmne crmne linked an issue Mar 23, 2025 that may be closed by this pull request