Google Translate shows bizarre messages about the end of the world and the second coming of Jesus

Biblical prophecies appearing through Google Translate are the result of issues with machine learning, the tech giant says (AFP/Getty Images)

A glitch with Google Translate has resulted in a series of mysterious messages and prophecies appearing when gibberish text is entered into the app.

The translation service, which supports more than 100 languages and is used by over 500 million people each day, relies on machine learning to improve accuracy. But the technology has also caused problems with some of the lesser-used languages.

Typing the word "dog" 18 times into Google Translate and setting the input language to Maori produces the following message: "Doomsday Clock is three minutes at twelve We are experiencing characters and a dramatic developments in the world, which indicate that we are increasingly approaching the end times and Jesus' return."

The quirk, first reported by Motherboard, is not new: users of the translation service have repeatedly noticed nonsensical input being interpreted in strange ways. There is even an entire 'TranslateGate' forum on Reddit dedicated to the bizarre phenomenon.

Typing the word "goo" 13 times in succession and translating it from Somali to English, for example, results in the message: "Cut off the penis into pieces, cut it into pieces."

The Somali language also throws up other oddities, such as the text "do nal d try mp do nal d tru mp do nal d tru mp" being interpreted as "Do not do this anymore".

Many of the most peculiar translations appear to reference biblical prophecies. But while some people on social media have blamed demons, the real reason is likely to be more scientific: experts have suggested that, for lesser-used languages such as Somali and Maori, religious texts including the Bible make up a large share of the translated material available online, so the system tends to fall back on that language when fed nonsense.

Google claims that the Translate app takes its understanding from translations found elsewhere on the internet.

"Google Translate learns from examples of translations on the web and does not use 'private messages' to carry out translations, nor would the system even have access to that content," a Google spokesperson said.

"This is simply a function of inputting nonsense into the system, to which nonsense is generated."

Google has not fixed the issue: all of the examples tested by The Independent still produced the same mistranslations, although some previously reported examples appear to have been corrected.