A mistake by Facebook’s AI translator caused a disaster for a Palestinian man. A construction worker at the West Bank settlement of Beitar Illit, near Jerusalem, posted a picture of himself alongside a bulldozer, which led to his arrest.
The picture’s caption was “يصبحهم” (“yusbihuhum”), which translates to “good morning” in English. However, Facebook’s translator rendered it as “attack them.”
The Israel Defense Forces arrested the man after learning of his social media activity. Surprisingly, not a single Arabic-speaking person was consulted before the arrest was made.
Israel openly admits to tracking the social media activity of Palestinians. It has reportedly developed an automated system that scans for certain phrases, such as “sword of Allah.”
The incident led to an apology from Facebook. In a statement, the company said:
Unfortunately, our translation systems made an error last week that misinterpreted what this individual posted.
Even though our translations are getting better each day, mistakes like these might happen from time to time and we’ve taken steps to address this particular issue. We apologise to him and his family for the mistake and the disruption this caused.
The man was detained and questioned by Israeli police for hours. The police suspected that he might use the bulldozer as a weapon in a vehicle attack.
Arabic is considered particularly difficult for many machine translation services because of the large number of dialects in use around the world, in addition to Modern Standard Arabic, the international form of the language.
Regardless of the bug, it falls on Facebook and the Israeli armed forces to apologize to the innocent citizen and compensate him. Simply apologizing in a public statement does not resolve the matter.
China’s WeChat
Facebook developed its own artificial intelligence translator after ending its partnership with Bing Translate in 2016. This is not the first time an AI translator has made a mistake.
Earlier, China’s social networking app WeChat made a similar mistake, translating “black foreigner” into a highly offensive racial slur.
When I ran the translator, the n-word came up and I was gobsmacked.
The incident was reported by Ann James, who was chatting with a friend when she noticed the system’s mistake.