Description
I would like to use Google's Gemini embedding models (e.g., `gemini-embedding-001`) with OpenEvolve. Currently, the `EmbeddingClient` in `openevolve/embedding.py` only supports OpenAI and Azure OpenAI models.
Motivation
Gemini embedding models provide a cost-effective (often free) alternative to OpenAI's embeddings. Enabling support for them would allow users to run the framework with lower costs.
Proposed Solution
Modify openevolve/embedding.py to:
- Recognize Gemini model names (e.g., `gemini-embedding-001`).
- When a Gemini model is selected, configure the `openai` client to point to Google's OpenAI-compatible endpoint: `https://generativelanguage.googleapis.com/v1beta/openai/`.
- Use `GEMINI_API_KEY` or `GOOGLE_API_KEY` for authentication.
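The routing logic could be sketched roughly as below. This is only an illustration of the proposed behavior, not OpenEvolve's actual code; the helper name `resolve_embedding_backend` and the `gemini-` prefix check are assumptions about how the detection might be implemented.

```python
import os

# Google's OpenAI-compatible endpoint for Gemini models
GEMINI_BASE_URL = "https://generativelanguage.googleapis.com/v1beta/openai/"

def resolve_embedding_backend(model: str) -> dict:
    """Return keyword arguments for constructing the openai client.

    Hypothetical helper: Gemini model names are routed to Google's
    OpenAI-compatible endpoint, authenticated via GEMINI_API_KEY or
    GOOGLE_API_KEY; other names fall through to the default OpenAI setup.
    """
    if model.startswith("gemini-"):
        api_key = os.environ.get("GEMINI_API_KEY") or os.environ.get("GOOGLE_API_KEY")
        return {"base_url": GEMINI_BASE_URL, "api_key": api_key}
    # Default: standard OpenAI configuration
    return {"api_key": os.environ.get("OPENAI_API_KEY")}
```

With such a helper, the existing client construction could become something like `OpenAI(**resolve_embedding_backend(model))`, after which `client.embeddings.create(model=model, input=texts)` works unchanged, since Google's endpoint accepts the same embeddings request shape.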
Example Configuration
Users should be able to set:
```yaml
database:
  embedding_model: "gemini-embedding-001"
```
And have it work seamlessly provided the API key is set.