Google Gemini

Google offers two ways of calling Gemini via API:

  1. Via the Vertex APIs (docs)
  2. Via the Gemini API (docs)

Vertex API

Full Weave support for the Vertex AI SDK Python package is currently in development; however, you can still integrate Weave with the Vertex API today.

The Vertex API offers OpenAI SDK compatibility (docs). If you build your application this way, Weave will automatically track your LLM calls via our OpenAI SDK integration.

  * Please note that some features may not work fully, as the Vertex API does not implement the complete OpenAI SDK surface.
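As a rough sketch of this setup, the snippet below points the OpenAI SDK at Vertex's OpenAI-compatible endpoint after initializing Weave. The base-URL format, the `google/gemini-1.5-flash` model identifier, the access-token credential flow, and the project/location names are assumptions drawn from Vertex's OpenAI compatibility docs; verify them against the current documentation.

```python
# Sketch: tracking Vertex AI calls in Weave via the OpenAI-compatible endpoint.
# The URL layout and credential flow below are assumptions; check them against
# the current Vertex OpenAI-compatibility docs before relying on them.


def vertex_openai_base_url(project_id: str, location: str) -> str:
    """Build the assumed OpenAI-compatible endpoint URL for a Vertex project."""
    return (
        f"https://{location}-aiplatform.googleapis.com/v1beta1/"
        f"projects/{project_id}/locations/{location}/endpoints/openapi"
    )


def main() -> None:
    # Imports are local so the helper above stays importable without these deps.
    import weave
    from openai import OpenAI

    weave.init("my-project")  # hypothetical Weave project name

    # Vertex accepts a short-lived OAuth access token as the API key,
    # e.g. the output of `gcloud auth print-access-token`.
    client = OpenAI(
        base_url=vertex_openai_base_url("my-gcp-project", "us-central1"),
        api_key="<your-access-token>",
    )

    # Because the call goes through the OpenAI SDK, Weave's OpenAI
    # integration traces it automatically.
    response = client.chat.completions.create(
        model="google/gemini-1.5-flash",
        messages=[{"role": "user", "content": "Hello, Gemini!"}],
    )
    print(response.choices[0].message.content)


# Call main() yourself; it requires the weave and openai packages plus
# valid Google Cloud credentials, so it is not run at import time.
```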

Gemini API

info

Weave's native client integration with the google-generativeai Python package is currently in development.

While we build the native integration for the Gemini API Python package, you can easily integrate Weave with the Gemini API yourself: initialize Weave with weave.init('<your-project-name>'), then wrap the functions that call your LLMs with weave.op(). See our guide on tracing for more details.
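The pattern above can be sketched as follows. The project name, model identifier, and `generate` helper are placeholders for illustration; the snippet assumes the google-generativeai package is installed and a Gemini API key is configured.

```python
# Sketch: manual Weave tracing for the Gemini API by wrapping the
# LLM call in weave.op(). Names below are illustrative placeholders.


def main() -> None:
    # Imports are local so this file can be inspected without the deps.
    import weave
    import google.generativeai as genai

    weave.init("my-project")  # hypothetical Weave project name

    # Decorating the function with weave.op() records each invocation
    # as a traced op in Weave, with its inputs and outputs logged.
    @weave.op()
    def generate(prompt: str) -> str:
        model = genai.GenerativeModel("gemini-1.5-flash")
        return model.generate_content(prompt).text

    print(generate("Write a haiku about tracing."))


# Call main() yourself; it requires the weave and google-generativeai
# packages plus a configured Gemini API key, so it is not run at import time.
```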