Siri Can Now Respond to Users About Sexual Assault and Suicide

The latest Siri update adds a feature that allows Apple's digital assistant to respond to mentions of sexual assault and suicide. Siri can now comfort its users by answering questions about abuse.

According to The Telegraph, Siri has been updated to recognize the words "raped," "abused," and "suicidal" and will automatically direct the user to specific hotlines. Siri is now programmed with a web link to the National Sexual Assault Hotline, the American organization that helps victims of sexual assault or abuse.

Siri will also advise users with suicidal thoughts to contact NHS Choices, which is available to Apple users in the U.K. However, it is not yet known whether Siri will add a U.K. sexual assault hotline.

CNN reported that digital assistants such as Siri, Google Now, and S Voice previously only suggested a web search when users said something they did not understand. When a user told Siri or a similar assistant "I am abused," "I was raped," or "I am feeling suicidal," the only response was "I don't know what you mean" or "I don't get it."

Refinery29 noted that a recent study published in JAMA Internal Medicine compared how Apple's Siri, Samsung's S Voice, Google Now, and Microsoft's Cortana responded to statements about sexual assault, suicidal thoughts, and abuse. The study found that these artificial intelligence assistants did not give appropriate answers to users' distress, particularly in personal emergencies.

Meanwhile, Siri was introduced to the iPhone line in 2011 with the iOS 5 operating system and the iPhone 4S. Apple acquired Siri Inc. and, with the assistant's success, later brought the feature to Apple Watch and Apple TV. The upcoming Mac OS X update is also set to include Siri integration for the first time.

© 2024 ParentHerald.com All rights reserved. Do not reproduce without permission.
