Gemini - Google AI Studio Configs
When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings under the LLM tab:
- LLM Provider to Gemini
- LLM Model to the model you will be using. If the model is not in the list, enable Advanced options and enter it in Custom Model (e.g. `gemini/<model-name>` like `gemini/gemini-2.0-flash`).
- API Key to your Gemini API key
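If you want to confirm that the key and model name work before entering them in the UI, one quick check is a request against the Gemini REST API. The sketch below assumes your key is exported as `GEMINI_API_KEY` and uses `gemini-2.0-flash`; adjust both to match your setup:

```bash
# Minimal sanity check for a Gemini API key (assumes GEMINI_API_KEY is exported).
# The model in the URL should match the one you plan to use in OpenHands.
curl -s -X POST \
  "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=${GEMINI_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"contents": [{"parts": [{"text": "Say hello"}]}]}'
```

A JSON response containing generated text means the key is valid; an error body will usually name the problem (invalid key, unknown model, etc.).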
VertexAI - Google Cloud Platform Configs
To use Vertex AI through Google Cloud Platform when running OpenHands, you'll need to set the following environment variables using `-e` in the docker run command:
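The exact variable names depend on the OpenHands/LiteLLM version you are running; the sketch below assumes LiteLLM's standard Vertex AI variables (GOOGLE_APPLICATION_CREDENTIALS, VERTEXAI_PROJECT, VERTEXAI_LOCATION) and leaves your usual OpenHands docker run flags and image as a placeholder:

```bash
# Sketch only: variable names assume LiteLLM's standard Vertex AI configuration.
# GOOGLE_APPLICATION_CREDENTIALS is passed here as the JSON content of a GCP service account key.
docker run -it --rm \
  -e GOOGLE_APPLICATION_CREDENTIALS="$(cat /path/to/service-account.json)" \
  -e VERTEXAI_PROJECT="<your-gcp-project-id>" \
  -e VERTEXAI_LOCATION="<your-gcp-location>" \
  <your-usual-openhands-flags-and-image>
```

Depending on the version, GOOGLE_APPLICATION_CREDENTIALS may instead be expected as a path to the key file inside the container rather than its raw contents, so check the current OpenHands documentation for the exact form.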
Then set the following in the OpenHands UI through the Settings under the LLM tab:
- LLM Provider to VertexAI
- LLM Model to the model you will be using. If the model is not in the list, enable Advanced options and enter it in Custom Model (e.g. `vertex_ai/<model-name>`).

