A mistake by Facebook’s AI translator caused serious trouble for a Palestinian man. A construction worker at the West Bank settlement of Beitar Illit, near Jerusalem, posted a picture of himself alongside a bulldozer, which led to his arrest.
The picture’s caption was “يصبحهم” (“yusbihuhum”), which translates to “good morning” in English. Facebook’s translator, however, rendered it as “attack them”.
The Israeli Defence Force arrested the man after learning of his social media activity. Surprisingly, not a single Arabic-speaking person was consulted before the arrest was made.
Israel openly admits to tracking the social media activity of Palestinians. It has reportedly developed an automated system that scans posts for certain phrases, such as “sword of Allah”.
The incident prompted an apology from Facebook, which said in a statement:
Unfortunately, our translation systems made an error last week that misinterpreted what this individual posted.
Even though our translations are getting better each day, mistakes like these might happen from time to time and we’ve taken steps to address this particular issue. We apologise to him and his family for the mistake and the disruption this caused.
The man was detained and questioned by Israeli police for several hours. The police suspected that he might use the bulldozer as a weapon in a vehicle-ramming attack.
Arabic is considered particularly difficult for many machine translation services because of the large number of dialects in use around the world on top of Modern Standard Arabic, the international form of the language.
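To see why this kind of ambiguity trips up automated systems, here is a toy sketch (in no way Facebook’s actual system) of a naive translator that simply picks the highest-scoring candidate translation. The candidate list and the scores are invented for illustration only; the two English readings of “يصبحهم” are the ones reported in this story.

```python
# Toy illustration of dialect ambiguity in machine translation.
# The candidate translations come from this story; the numeric
# scores are purely hypothetical, chosen to show how a small
# scoring error flips the output to the wrong sense.
candidates = {
    "يصبحهم": [
        ("good morning (to them)", 0.48),  # intended dialectal greeting
        ("attack them", 0.52),             # mistaken reading, assumed score
    ],
}

def naive_translate(phrase):
    # Return the candidate with the highest model score,
    # with no human review and no confidence threshold.
    options = candidates.get(phrase, [("<unknown>", 0.0)])
    return max(options, key=lambda pair: pair[1])[0]

print(naive_translate("يصبحهم"))  # picks "attack them" over the greeting
```

A system like this has no notion of how costly a wrong answer is; one design mitigation is to flag low-margin decisions (here, 0.52 vs 0.48) for human review instead of publishing the top candidate outright.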
Bug or not, it falls to Facebook and the Israeli armed forces to apologise to this innocent citizen and compensate him. Simply apologising in a public statement does not resolve the matter.
Facebook developed its own artificial intelligence translator after ending its partnership with Bing Translate in 2016. This is not the first time an AI translator has made such a mistake.
Earlier, China’s social networking app WeChat translated “black foreigner” into a highly offensive racial slur.
That incident was reported by Ann James, who was chatting with a friend when she noticed the system’s mistake: “When I ran the translator, the n-word came up and I was gobsmacked.”