Apple's Siri coded to prevent suicide

In its latest update, Siri, Apple's digital personal assistant, now tries to talk suicidal users out of taking their own lives.

Siri now detects suicidal statements such as "I want to jump off a bridge" and "I want to kill myself" and prompts the user to have a chat with someone at the National Suicide Prevention Lifeline.

In previous versions, Siri would merely search the internet for the meaning of the phrases, and in some cases it would even look for the nearest bridge.

Luckily, it used Apple Maps for the latter, which means it probably couldn’t find one.

Apple has not said anything about the changes, although it became clear several months ago that the company was working with the National Suicide Prevention Lifeline.

John Draper, director of the National Suicide Prevention Lifeline, told ABC News that his outfit was happy to lend a hand, noting that it helped come up with keywords that could better identify a person with suicidal tendencies.

"You would be really surprised," Draper said. "There are quite a number of people who say very intimate things to Siri or to computers.

"People who are very isolated tend to converse with Siri," he said.