How to Use Any OpenRouter Model with Google Agent Development Kit (ADK)

See how you can use any model you want in the Google Agent Development Kit (ADK) with LiteLLM, and configure OpenRouter models with ease.

The Google Agent Development Kit (ADK) is engineered for adaptability, empowering developers to integrate virtually any Large Language Model (LLM) into their agents, far beyond its native Google Gemini ecosystem. Launched on April 9, 2025, as an open-source Python framework, ADK achieves this flexibility through a dual integration approach: direct model strings for Google’s infrastructure (e.g., Gemini via google-genai) and wrapper classes like LiteLlm for external or custom models. This design allows ADK to tap into diverse LLM providers—such as OpenRouter, Anthropic, or even local setups—without requiring extensive code changes. By abstracting model interactions into a consistent LlmAgent interface, ADK enables developers to swap models (e.g., from Gemini to Grok) by simply adjusting the model parameter, while its tool and session management features remain intact. This modularity, combined with LiteLLM’s broad compatibility, ensures ADK agents can leverage the strengths of any model—speed, reasoning, or cost-efficiency—making it a versatile platform for building tailored AI solutions.
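
To make this concrete, here is a minimal sketch of the two integration styles side by side. It assumes the google-adk and litellm packages are installed; the model names are illustrative, not recommendations.

    # Minimal sketch of ADK's two model-integration styles.
    from google.adk.agents import Agent
    from google.adk.models.lite_llm import LiteLlm

    # Style 1: a direct model string, resolved through Google's
    # own infrastructure (google-genai)
    gemini_agent = Agent(
        name="gemini_agent",
        model="gemini-1.5-flash-latest",
        instruction="You are a helpful assistant.",
    )

    # Style 2: the LiteLlm wrapper for any external provider
    # (expects OPENROUTER_API_KEY in the environment)
    external_agent = Agent(
        name="external_agent",
        model=LiteLlm(model="openrouter/openrouter/quasar-alpha"),
        instruction="You are a helpful assistant.",
    )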

What is OpenRouter?

OpenRouter is a platform that simplifies access to a diverse array of Large Language Models by providing a single, OpenAI-compatible API endpoint. Launched to democratize AI model usage, it acts as a gateway to models hosted by various providers, including open-source options like Mistral and proprietary ones like GPT-4, all accessible with one API key. As of April 2025, OpenRouter supports over 50 models, making it a treasure trove for developers seeking flexibility.
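
Because the endpoint is OpenAI-compatible, you can call it with the standard openai Python client before bringing ADK into the picture at all. A minimal sketch, assuming OPENROUTER_API_KEY is set in your environment and using an example model ID:

    # Calling OpenRouter directly through its OpenAI-compatible endpoint.
    import os
    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",  # OpenRouter's unified endpoint
        api_key=os.getenv("OPENROUTER_API_KEY"),
    )

    response = client.chat.completions.create(
        model="openrouter/quasar-alpha",  # any OpenRouter model ID works here
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(response.choices[0].message.content)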

Benefits of Using OpenRouter with ADK

  • Variety: Choose from models optimized for different tasks—speed (e.g., Grok), reasoning (e.g., Claude), or cost (e.g., LLaMA variants).
  • Cost Efficiency: OpenRouter’s pricing model often undercuts direct provider rates, with free tiers for experimentation.
  • Unified Interface: No need to juggle multiple SDKs or APIs; OpenRouter’s standardized endpoint works seamlessly with LiteLLM, which ADK supports natively.
  • Scalability: Easily switch models without rewriting code, thanks to LiteLLM’s abstraction layer.

Supported Models and LiteLLM Integration

OpenRouter integrates with LiteLLM, a lightweight library that translates OpenRouter’s API calls into a format ADK understands. Whether you want to use xAI’s Grok, Anthropic’s Claude, or an open-source model like Mixtral, LiteLLM ensures compatibility by wrapping OpenRouter’s endpoint into ADK’s LlmAgent. The hands-on walkthrough later in this article uses openrouter/quasar-alpha, but you can swap in any OpenRouter-supported model by adjusting the model string.

With OpenRouter and ADK, you’re not just building an agent—you’re building a gateway to the future of AI flexibility.

What is LiteLLM?

LiteLLM is a lightweight, open-source Python library designed to simplify interactions with a wide variety of Large Language Models (LLMs) by providing a standardized, OpenAI-compatible API interface. Developed to bridge the gap between diverse LLM providers and developers, LiteLLM abstracts away the complexities of individual model APIs, allowing seamless integration with over 100 models from providers like OpenAI, Anthropic, Hugging Face, and platforms like OpenRouter. As of April 2025, LiteLLM has become a go-to tool for AI developers seeking flexibility without the overhead of managing multiple SDKs.
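
To illustrate that unified interface, the sketch below sends the same request to two different providers; only the model string changes. The model IDs are examples, and the relevant provider keys (OPENROUTER_API_KEY, ANTHROPIC_API_KEY) are assumed to be set as environment variables, from which LiteLLM picks them up automatically.

    # The same call shape works across providers with LiteLLM.
    from litellm import completion

    messages = [{"role": "user", "content": "Summarize LiteLLM in one line."}]

    # An OpenRouter-hosted model (note the 'openrouter/' prefix)
    r1 = completion(model="openrouter/openrouter/quasar-alpha", messages=messages)

    # An Anthropic model, called with the identical code shape
    r2 = completion(model="anthropic/claude-3-5-sonnet-20240620", messages=messages)

    print(r1.choices[0].message.content)
    print(r2.choices[0].message.content)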

Key Features of LiteLLM

  • Unified API: LiteLLM translates calls to different LLMs into a consistent format, mimicking OpenAI’s API structure. This means you write code once and swap models with minimal changes.
  • Broad Model Support: It supports models hosted on cloud platforms (e.g., OpenRouter, Anthropic), self-hosted endpoints (e.g., vLLM, Ollama), and even local setups, covering both proprietary and open-source options.
  • Tool Calling: LiteLLM supports function/tool calling for models that enable it, making it ideal for ADK’s tool integration needs (e.g., Tavily search in this guide); see the sketch after this list.
  • Lightweight and Fast: With minimal dependencies and efficient request handling, it adds little overhead to your application.
  • Customizable: You can specify API bases, keys, headers, and other parameters to connect to custom or private endpoints.
  • Error Handling: It provides robust fallbacks and logging, helping debug issues across different providers.
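
Here is the tool-calling sketch mentioned above: an OpenAI-style tool schema passed through litellm.completion. The get_weather tool is hypothetical and exists only for illustration, and the chosen model must actually support function calling.

    # Passing an OpenAI-style tool schema through LiteLLM.
    from litellm import completion

    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool, for illustration only
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }]

    response = completion(
        model="openrouter/openrouter/quasar-alpha",
        messages=[{"role": "user", "content": "What's the weather in Paris?"}],
        tools=tools,
    )

    # If the model decided to call the tool, the request shows up here:
    print(response.choices[0].message.tool_calls)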

How LiteLLM Works with ADK

In the context of Google’s Agent Development Kit (ADK), LiteLLM acts as a bridge between ADK’s LlmAgent and external models. ADK natively supports Google’s Gemini models via the google-genai library, but for non-Google models—like those on OpenRouter—you use the LiteLlm wrapper class. This class takes a model identifier (e.g., “openrouter/x-ai/grok-beta”) and configuration details (e.g., API key, base URL) and handles the communication, ensuring ADK can leverage the model’s capabilities. LiteLLM’s support for tool calling is particularly valuable, as it enables ADK agents to use external tools seamlessly, even with models not originally designed for Google’s ecosystem.

Why Use LiteLLM?

LiteLLM’s simplicity and versatility make it a perfect companion for ADK. It eliminates the need to rewrite agent logic for each provider, supports rapid prototyping by letting you test multiple models, and ensures compatibility with ADK’s architecture. For OpenRouter, LiteLLM connects to its unified endpoint (https://openrouter.ai/api/v1), passing requests to the chosen model and returning responses in a format ADK understands.

Switching Your ADK Agent to OpenRouter Models

In the previous part, we built an ADK agent capable of using Tavily search and remembering information, powered by a Google Gemini model connected via AI Studio. One of ADK’s strengths is its flexibility in using different Large Language Models (LLMs).

This part will guide you through modifying the agent to use a model hosted on OpenRouter, specifically openrouter/quasar-alpha, leveraging ADK’s integration with the LiteLLM library.

Why Switch Models?

OpenRouter provides access to a vast array of LLMs from different providers through a unified API. Switching allows you to experiment with models that might offer different performance characteristics, pricing, or capabilities, like quasar-alpha, which is noted for long context and coding tasks.

Prerequisites

  • You should have the complete agent code from the previous article (with InMemorySessionService, Runner, Tavily search tool, and remember_something tool).
  • You will need an API key from OpenRouter.

1. Setup for OpenRouter

First, let’s configure the necessary components to connect to OpenRouter via LiteLLM.

  • Get OpenRouter API Key: Sign up or log in to OpenRouter to obtain your API key.
  • Update .env File: Add your OpenRouter API key to the .env file in your project’s root directory:
    # Add this line to your existing .env file
    OPENROUTER_API_KEY="PASTE_YOUR_OPENROUTER_API_KEY_HERE"
    
    # Keep your existing keys as well
    GOOGLE_API_KEY="YOUR_GOOGLE_API_KEY" # Still needed if other parts use it
    TAVILY_API_KEY="YOUR_TAVILY_API_KEY"
  • Install LiteLLM: If you haven’t already, install the LiteLLM library using uv:
    # Run in your activated virtual environment
    uv add litellm
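
With the key in your .env file and LiteLLM installed, a short standalone script can verify the connection before you touch any agent code. This is a sketch that assumes python-dotenv is installed to load the file:

    # smoke_test.py -- sanity-check the OpenRouter key before
    # modifying the agent. Assumes python-dotenv is installed.
    import os
    from dotenv import load_dotenv
    from litellm import completion

    load_dotenv()  # reads OPENROUTER_API_KEY from the .env file

    response = completion(
        model="openrouter/openrouter/quasar-alpha",
        api_key=os.getenv("OPENROUTER_API_KEY"),
        api_base="https://openrouter.ai/api/v1",
        messages=[{"role": "user", "content": "Reply with the word 'ok'."}],
    )
    print(response.choices[0].message.content)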

2. Modifying the Agent Code

Now, we’ll update the agent.py script to use the OpenRouter model.

  • Import LiteLlm: Add the necessary import at the top of your script:

    # Add this import alongside other ADK imports
    from google.adk.models.lite_llm import LiteLlm
    import os # Ensure 'os' is imported to read environment variables
  • Update Agent Definition: Modify the model parameter within your Agent definition. Instead of directly providing the Gemini model string, we’ll use the LiteLlm wrapper:

    # Find your root_agent definition and modify the 'model' parameter
    
    root_agent = Agent(
        name="my_adk_agent_openrouter", # Consider updating the name
        # --- MODIFIED PART ---
        model=LiteLlm(
            # LiteLLM prefix 'openrouter/' followed by the model's
            # OpenRouter ID ('openrouter/quasar-alpha'), hence the
            # doubled prefix
            model="openrouter/openrouter/quasar-alpha",
            # Explicitly provide the API key from environment variables
            api_key=os.getenv("OPENROUTER_API_KEY"),
            # Explicitly provide the OpenRouter API base URL
            api_base="https://openrouter.ai/api/v1"
        ),
        # --- END MODIFIED PART ---
        description="A helpful assistant using OpenRouter Quasar Alpha, capable of web search and memory.",
        instruction="""You are a friendly and helpful assistant powered by Quasar Alpha.
    1. If the user asks for information that might require up-to-date details, recent events, or specific web searching, use the 'TavilySearchResults' tool.
    2. If the user asks you to remember something specific, use the 'remember_something' tool to save it. Also mention what was previously remembered if anything.
    3. Use information remembered earlier (from session state) if relevant to the current query.
    4. Otherwise, answer directly based on your general knowledge.""",
        tools=[adk_tavily_tool, remember_something] # Keep the tools the same
    )
    
    print(f"Agent '{root_agent.name}' updated to use OpenRouter model.")
    
    # IMPORTANT: Ensure the Runner uses this updated agent instance
    # If 'runner' was defined after 'root_agent', it should pick up the change.
    # If not, redefine the runner:
    # runner = Runner(agent=root_agent, app_name=APP_NAME, session_service=session_service)

Explanation of Changes:

  • We replaced the direct model string (like "gemini-1.5-flash-latest") with an instance of LiteLlm.
  • Inside LiteLlm, we set:
    • model: The LiteLLM identifier, formed by the openrouter/ prefix followed by the model’s OpenRouter ID (openrouter/quasar-alpha), which is why the prefix appears twice in the code.
    • api_key: Explicitly read the OpenRouter key from the environment variables.
    • api_base: Explicitly set the standard OpenRouter API endpoint URL (https://openrouter.ai/api/v1). While LiteLLM might auto-detect some settings, being explicit is often clearer and safer.
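
Because everything else stays the same, trying a different OpenRouter model later is essentially a one-line change. A hedged sketch (the model ID is an example; check OpenRouter’s catalog for current IDs):

    # Swapping to another OpenRouter model only changes the model string;
    # the agent's tools and instructions stay intact.
    import os
    from google.adk.models.lite_llm import LiteLlm

    claude_model = LiteLlm(
        model="openrouter/anthropic/claude-3.5-sonnet",  # example model ID
        api_key=os.getenv("OPENROUTER_API_KEY"),
        api_base="https://openrouter.ai/api/v1",
    )
    # Then pass it to your agent: Agent(..., model=claude_model, ...)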

3. Running the Modified Agent

That’s it! The rest of your code – the Runner, InMemorySessionService, tool definitions (remember_something, adk_tavily_tool), and the interaction loop (run_conversation) – should remain unchanged.

When you run the agent.py script again:

uv run python -m agent_module.agent

or

adk web

The Runner will now direct requests for the root_agent through the LiteLlm wrapper to OpenRouter, using the openrouter/quasar-alpha model to handle reasoning, instruction following, and decisions about when to use the Tavily or memory tools.
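
If you prefer a quick scripted check over the web UI, a single turn might look like the sketch below. It leans on the runner, USER_ID, and SESSION_ID already defined in the previous part, so treat those names as placeholders for your existing code:

    # One conversational turn through the existing Runner.
    # 'runner', USER_ID, and SESSION_ID come from the previous part.
    from google.genai import types

    query = "Remember that my favorite color is green."
    content = types.Content(role="user", parts=[types.Part(text=query)])

    for event in runner.run(user_id=USER_ID, session_id=SESSION_ID, new_message=content):
        if event.is_final_response():
            print(event.content.parts[0].text)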

Conclusion

You’ve successfully switched the underlying LLM of your ADK agent from Gemini to an OpenRouter model using the LiteLlm wrapper. This highlights the power of ADK’s flexible architecture, allowing you to experiment with and leverage a wide variety of models while keeping your core agent logic and tool integrations intact. You can now easily adapt this process to try other models available through OpenRouter or different providers supported by LiteLLM.
