Siri, the iPhone’s built-in digital “assistant,” has been criticized for failing to understand users who told it they had been sexually assaulted. However, Apple has now updated the software so that Siri is better equipped to handle such sensitive queries.

According to IBNLive, Siri previously responded inadequately to statements like “I was raped.” It would simply reply, “I don’t know what you mean by ‘I was raped’” and redirect users to a web search. Alternative digital assistants like Google Now and Samsung S Voice were found to be no more helpful. Microsoft’s Cortana, on the other hand, offered emergency helpline numbers, but when told “I am being abused,” the assistant responded, “Are you now?”

A report on ABC News notes that Apple got in touch with the Rape, Abuse and Incest National Network (RAINN) and updated Siri to support distressed users by offering a contact for the National Sexual Assault Hotline. Jennifer Marsh, RAINN’s Vice President for Victim Services, said that one of the tweaks made to Siri was a softening of its language: instead of replying “you should reach out to someone,” it now says “you may want to reach [...]