Some cries for help are clearer than others, and sometimes there are no cries to be heard at all. But even when a so-called "red flag" is (metaphorically) held aloft by someone contemplating self-harm or suicide, the first person to see it is unlikely to be a professional crisis counselor willing or able to take action.

That, it seems, is why Facebook wants to ensure that all its users (anyone who might recognize a friend in visible despair) have access to professional resources that could potentially save that person's life. To put it grimly: Imagine if it were as easy to express concern about, or summon professional assistance in response to, a post on the world's largest social network as it is to "like" it. That, at least, appears to be Facebook's vision.

Facebook's suicide prevention project appears to have been first reported by The New York Times, which wrote that the company has dedicated more than a dozen engineers and researchers to its development. Users can flag posts that raise an explicit threat, or the mere specter, of self-harm or suicide.

"With the help of these new tools, if someone posts something on Facebook that makes you concerned about their well-being, you can reach out to them directly — and you also can also report the post to us," Facebook's Global Head of Safety Antigone Davis and Researcher Jennifer Guadagno wrote to the website, "We have teams working around the world, 24/7, who review reports that come in."

Sadly, the inspiration for the effort hit close to home for the company: the rash of suicides among Palo Alto teens. That town, of course, is where Facebook CEO Mark Zuckerberg owns a home and where the company was first headquartered. The suicide crisis among Palo Alto teens has been covered, forgive me, practically to death, and has been addressed here to some small degree.

These "tools" were “developed in collaboration with mental health organizations and with input from people who have personal experience with self-injury and suicide,” and, as TechCrunch explains, appear on a dropdown menu beneath posts. Users may share words of support, in phrases suggested by Facebook in conjunction with its research, either under their own name or anonymously. Further, posts can be flagged for review by Facebook employees, who will "reach out to this person with information that might be helpful to them.” If a user suspects a member of their social network to be in immediate danger, the company recommends contacting police.

While the project's inspiration may date back years (per The New York Times, it has been as much as a decade in the making), it arrives at a moment of particular need. The U.S. suicide rate, a National Center for Health Statistics study released this April found, is at a 30-year high. An estimated 72 percent of Americans use Facebook, as do roughly 1.65 billion people worldwide.

If someone you know exhibits warning signs of suicide: do not leave the person alone; remove any firearms, alcohol, drugs, or sharp objects that could be used in a suicide attempt; and call the U.S. National Suicide Prevention Lifeline at 800-273-TALK (8255), take the person to an emergency room, or seek help from a medical or mental health professional.

Related: The Atlantic Is The Latest To Ask Why Palo Alto Teens Kill Themselves