Building Generative AI-powered Chat Android apps with Google’s Gemini

Source: https://blog.google/technology/ai/google-gemini-ai/

With Generative AI stealing the limelight and being explored across so many domains, mobile applications that leverage it are very much the need of the hour.

Google launched Gemini, a multimodal model, in December 2023, and to enable Android developers to build apps on top of these models, it also released the Google AI Client SDK.

In this blog, we’ll see how we can build generative AI-powered, chat-based apps using the Google AI Client SDK.

All the examples in this blog need an API key from Google AI Studio, so begin by creating one there.
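Keeping that key out of source control is worth a quick detour. As a minimal sketch (one common approach, not the only one), you can store the key in local.properties and expose it through BuildConfig, which is how BuildConfig.apiKey is referenced later in this post:

// app/build.gradle.kts — illustrative sketch; assumes the key is saved in
// local.properties (not checked into version control) as: apiKey=YOUR_API_KEY
import java.util.Properties

val localProperties = Properties().apply {
    rootProject.file("local.properties").inputStream().use { load(it) }
}

android {
    buildFeatures {
        buildConfig = true
    }
    defaultConfig {
        // Generates BuildConfig.apiKey for use in Kotlin code
        buildConfigField("String", "apiKey", "\"${localProperties.getProperty("apiKey")}\"")
    }
}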

Let’s start building a chatbot or customer-support application, beginning with adding the Generative AI dependency to our app.

dependencies {
    ...
    implementation("com.google.ai.client.generativeai:generativeai:0.9.0")
}

Now, let’s create a GenerativeModel.

GenerativeModel takes your model’s name (which can be gemini-1.0-pro, gemini-1.0-pro-vision, gemini-1.5-pro or gemini-1.5-flash), the API key and the generation config.

val generativeModel = GenerativeModel(
    modelName = "gemini-1.5-flash-latest",
    apiKey = BuildConfig.apiKey,
    generationConfig = config
)

Here, we are using the generationConfig helper and setting just a temperature; however, you can add safety settings as well (if you want to block harassment or hate-speech content, for example), as sketched after the config below.

val config = generationConfig {
    temperature = 0.7f
}
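As a minimal sketch of those safety settings: the SafetySetting, HarmCategory and BlockThreshold types come from the same SDK’s type package, and the thresholds used here are only illustrative.

// Illustrative sketch: block harassment and hate-speech content at a medium threshold.
val safetySettings = listOf(
    SafetySetting(harmCategory = HarmCategory.HARASSMENT, threshold = BlockThreshold.MEDIUM_AND_ABOVE),
    SafetySetting(harmCategory = HarmCategory.HATE_SPEECH, threshold = BlockThreshold.MEDIUM_AND_ABOVE)
)

// The same GenerativeModel constructor also accepts these settings.
val moderatedModel = GenerativeModel(
    modelName = "gemini-1.5-flash-latest",
    apiKey = BuildConfig.apiKey,
    generationConfig = config,
    safetySettings = safetySettings
)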

Because there will be a continuous conversation between the user and the bot (the LLM in this case), we’ll need to use the multi-turn chat API of the Gemini AI SDK. We initialise the chat with the startChat() function, to which we can also provide the conversation history.

private val chat = generativeModel.startChat(
    history = listOf(
        content(role = "user") { text("Hello, I need some help.") },
        content(role = "model") { text("Great to meet you. How can I help you?") }
    )
)

Next, we can send a message to the LLM and get the response using sendMessage(), as shown below:

val response =…
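A minimal sketch of that call, assuming it runs inside a ViewModel coroutine (sendMessage() is a suspend function) and that the reply text is read from response.text:

// Minimal sketch: send the user's message through the chat and read the reply.
// Assumes this lives in a ViewModel, so viewModelScope is available.
fun askSupportBot(userMessage: String) {
    viewModelScope.launch {
        val response = chat.sendMessage(userMessage)
        val reply = response.text   // the model's reply as plain text (may be null)
        // update your chat UI state with `reply` here
    }
}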


Monika Kumar Jethani
