
I caused Google's Gemini 1.5 Pro to fail with my first prompt

A very basic request for language translation was rejected with a bizarre message about 'unsafe content.' It's unclear whether this is merely a bug or a more fundamental flaw in Google's new language model.
Written by Tiernan Ray, Senior Contributing Writer

Google's Gemini 1.5 Pro, a large language model that became available in preview this week, boasts enhanced translation abilities, especially with what are known as low-resource languages -- languages for which there isn't a lot of material available on the internet.

Oddly enough, I was able to cause Gemini to fail with the very first prompt I entered into the thing, a query pertaining to language translation. It wasn't intentional on my part, but the failure, accompanied by a warning about "unsafe" content, suggests something very basic is buggy in this preview release.

Also: I put GPT-4o through my coding tests and it aced them - except for one weird result

The failure comes despite the fact that Google's technical report on Gemini 1.5 Pro touts the program's superior test results on low-resource languages.

The preview is available for anyone to try for free on Google's AI Studio site.

Because I like translation tasks, my first impulse was to try out a somewhat challenging language task that I've often played with in ChatGPT: translating words into English from the Georgian language. Georgian is spoken by almost 4 million people. It is the chief language of the former Soviet republic of Georgia, which lies in the Caucasus, the region between the Black Sea and the Caspian Sea.

Also: ChatGPT vs. Microsoft Copilot vs. Gemini: Which is the best AI chatbot?

While there are many materials for the linguistic study of Georgian, online resources are nowhere near as plentiful as those for English, French, or German. In my experience with ChatGPT, the program has sometimes given me incorrect translations of various Georgian verbs, including the verb "to be," which, in the Georgian alphabet, is spelled "ყოფნა."

So, I tried out one of my standard Georgian queries on Gemini. Instead of asking Gemini to simply translate a sentence, I typed in the prompt, "Conjugate the Georgian verb 'ყოფნა,'" which should list the verb's forms for first person, second person, and so on, singular and plural, just as in a grammar textbook.

Also: Google I/O 2024: 5 Gemini features that would pull me away from Copilot

After some initial problems a year ago, ChatGPT can now answer that query, listing several tenses of the verb in a nice, organized, tabular format.

ChatGPT prompt with Georgian conjugations
Tiernan Ray/ZDNET

However, the response from Gemini was a surprise. It began to reply, then stopped. A little warning triangle appeared; clicking it produced the statements, "Probability of unsafe content" and "Content not permitted."

Google Gemini prompt
Tiernan Ray/ZDNET

Even more bizarre, when I opened the safety settings and turned off all the controls, so that the program should block nothing, nothing changed: the same error cropped up.

Google Gemini settings
Tiernan Ray/ZDNET

Recasting the question as a richer prompt did not help. I reformulated it as a question about an example text written in Georgian, taking the first paragraph of the Georgian-language Wikipedia entry on Tbilisi, the Georgian capital.

Also: Is OpenAI sweating? 9 Google features announced for Gemini, Search, Android, and more

I asked Gemini to conjugate the verb as it appeared in the paragraph. Again, it began to reply, then halted with the same error.

Google Gemini prompt with Georgian text
Tiernan Ray/ZDNET

Again, ChatGPT responded to the same prompt with perfect output.

ChatGPT prompt with Georgian text
Tiernan Ray/ZDNET

A little more experimenting revealed that the error is quite specific. With other languages, such as French, the same query poses no problem for Gemini: the French verb "être," for example, is conjugated as expected.

Moreover, when asked for a simple translation from English into Georgian, Gemini has no problem. For example, the prompt, "How do I say 'I am' in Georgian?", elicits a sensible reply like, "Here's how you use it: ვარ student -- I am a student. ვარ happy -- I am happy."

But the problem is not simply one of script. When I asked my conjugation question without actually typing the Georgian word itself, using only English, Gemini still failed: I typed, "Conjugate the present tense of the Georgian verb To Be," and received the same error message.

Those individual examples suggest there's nothing wrong with the form of the query, since it works with other languages, and there's nothing wrong with translating Georgian per se, since example translations work just fine. 

Also: Your Google Search results are about to look very different. Here are 4 reasons why

There could be a bug in Gemini code, or there could be a more fundamental problem in how the language model's safeguards for content are structured. At this point, it's unclear which one it is. 

I've reached out to Google for comment. This article will be updated if and when ZDNET receives a response from Google.
