Microsoft’s Chatbot ‘Zo’ Described the Quran as ‘Extremely Violent’

Microsoft’s new chatbot, equipped with Artificial Intelligence (AI), has created controversy by describing the Quran, the holy book of Islam, as “very violent”.

According to BuzzFeed News, ‘Zo’ is a chatbot Microsoft built for teenagers on the Kik messaging app. Microsoft designed it to sidestep all discussion of politics and religion. Despite this, it recently told a user that the Quran is ‘very violent’.

According to Microsoft, it has taken steps to stop such behavior.

The remark came in only the fourth message of the conversation, which suggests Microsoft still faces difficulties in its use of artificial intelligence technology.

As BuzzFeed’s report notes, the company’s previous chatbot, ‘Tay’, was mired in controversy in March, when it began posting racist and inflammatory comments.

Despite the issue, Microsoft has said it is very happy with the new bot’s progress and intends to keep it running.
