Generative AI on Google Cloud: Python Samples

This directory contains the official Python code samples featured in the Google Cloud Generative AI documentation. These scripts demonstrate how to integrate and build with Vertex AI.

Looking for interactive, step-by-step tutorials? Check out our extensive collection of Colab notebooks.

Getting Started

Note: An active Google Cloud Project is required.

We recommend running these code samples using Google Cloud Shell Editor or Google Colab to minimize environment setup.

Feature folders

Browse the folders below to find the Generative AI capabilities you're interested in.

| Python samples folder | Google Cloud product | Short description (written with the help of Gemini 3.1) |
| --- | --- | --- |
| Embeddings | https://cloud.google.com/vertex-ai/generative-ai/docs/embeddings | Learn how to use Vertex AI's text and multimodal embedding models. These samples show you how to convert your unstructured data into numerical vectors to power semantic search, clustering, and RAG applications. |
| Extensions | https://cloud.google.com/vertex-ai/generative-ai/docs/extensions/overview | These samples show how to connect Gemini to external APIs and databases so your models can retrieve live data and execute real-world actions. **Note:** as Google Cloud transitions to the Gemini Enterprise Agent Platform, standalone Vertex AI Extensions are evolving into *Tools* managed within the centralized Agent Registry. While these examples teach the core mechanics of model-to-API communication, future production applications should adopt the new Agent Platform architecture. |
| Function Calling | https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling | Function calling gives Gemini the ability to interact with your codebase. The model predicts which of your local functions should be run and returns the formatted arguments, leaving the actual execution up to your application. |
| Image Generation | https://cloud.google.com/vertex-ai/generative-ai/docs/image/overview | Learn how to integrate the Imagen model into your applications. These examples cover text-to-image generation, editing, and using advanced parameters to get the exact visual output you need. |
| Model Garden | https://cloud.google.com/vertex-ai/generative-ai/docs/model-garden/explore-models | These examples show you how to provision endpoints and serve predictions from first-party, third-party, and open-source foundation models available in the Vertex AI Model Garden. |
| Model Tuning | https://cloud.google.com/vertex-ai/generative-ai/docs/models/tune-models | Tailor Gemini and other foundation models to your specific domain. These examples cover how to format your datasets, kick off tuning jobs on Vertex AI, and deploy your custom-tuned models or adapters to production endpoints. |
| RAG | https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/rag-api | These examples cover the end-to-end RAG architecture: ingesting data, generating embeddings, querying a vector database, and passing the retrieved context to Gemini to generate informed, accurate answers. |
| Reasoning Engine | https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/reasoning-engine | These examples cover how to use Vertex AI Reasoning Engine to build custom agents. |
| Text Generation | https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/send-chat-prompts-gemini | These samples demonstrate how to use Vertex AI's Gemini models to generate, summarize, and extract information from text. |
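The embeddings and RAG rows above share one core retrieval step: rank documents by vector similarity to a query. The sketch below illustrates that step with tiny hand-made 3-dimensional vectors; in a real application the vectors would come from a Vertex AI embedding model, and the document names and query here are purely illustrative.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Stand-ins for real embeddings; a Vertex AI embedding model would
# produce vectors with hundreds of dimensions, not three.
documents = {
    "pricing page":  [0.9, 0.1, 0.0],
    "api reference": [0.1, 0.9, 0.2],
    "blog post":     [0.3, 0.3, 0.9],
}
query_vector = [0.85, 0.15, 0.05]  # embedding of e.g. "how much does it cost?"

# Retrieval step shared by semantic search and RAG: rank the corpus by
# similarity to the query and keep the best match as context for the model.
best_doc = max(
    documents,
    key=lambda name: cosine_similarity(query_vector, documents[name]),
)
print(best_doc)  # → pricing page
```

In a full RAG pipeline the retrieved document's text would then be prepended to the user's question and sent to Gemini, which is what the RAG samples in this directory demonstrate end to end.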
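The function-calling flow described in the table can also be sketched without a live API call: the model only *predicts* which declared function to run and with which arguments, while the application performs the execution. The declaration below follows the OpenAPI-style schema the Gemini function-calling docs describe; because a real request needs a Google Cloud project, the model's response is simulated here, and the function name and rates are hypothetical.

```python
import json

# A local function the model is allowed to call (illustrative only).
def get_exchange_rate(currency_from: str, currency_to: str) -> dict:
    rates = {("USD", "EUR"): 0.92, ("USD", "JPY"): 151.3}  # made-up rates
    return {"rate": rates[(currency_from, currency_to)]}

# OpenAPI-style declaration sent to the model so it knows the
# function's name, purpose, and parameter types.
exchange_rate_declaration = {
    "name": "get_exchange_rate",
    "description": "Get the exchange rate between two currencies.",
    "parameters": {
        "type": "object",
        "properties": {
            "currency_from": {"type": "string"},
            "currency_to": {"type": "string"},
        },
        "required": ["currency_from", "currency_to"],
    },
}

# Registry mapping declared names to real callables.
FUNCTIONS = {"get_exchange_rate": get_exchange_rate}

def execute_function_call(function_call: dict) -> dict:
    """Run the function the model selected, with the arguments it produced."""
    fn = FUNCTIONS[function_call["name"]]
    return fn(**function_call["args"])

# Simulated model output: the model returns a name plus JSON arguments
# but never executes anything itself.
model_function_call = {
    "name": "get_exchange_rate",
    "args": {"currency_from": "USD", "currency_to": "EUR"},
}
result = execute_function_call(model_function_call)
print(json.dumps(result))  # the result is sent back to the model as context
```

The samples in the Function Calling folder show the same loop against the real Gemini API, where the function-call prediction comes from the model's response rather than a hard-coded dictionary.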

Contributing

Contributions are welcome! See the Contributing Guide.

Getting help

Please use the issues page to provide suggestions, share feedback, or submit a bug report.

Disclaimer

This repository itself is not an officially supported Google product. The code in this repository is for demonstrative purposes only.