Your app can't love you (but does it have a duty to help?)
On the heightened emotions of Valentine's Day, the blandness of programmatic messages from our apps, and the option to simply say nothing at all
Valentine's Day may be the peak of the "Hallmark holidays", but that doesn't stop institutions of all sorts from trying to weigh in on the theme of love. Snapchat, for instance, pushed a bland, programmatic message to its users for the holiday: "Happy Valentine's Day! Snap some love to those around you and save some for yourself of course".
■ Considering how much social-media applications depend upon user engagement to remain viable, it's no surprise that any opportunity to tug on emotional cues and nudge people back into an app is likely to turn into such a "push" message. But generic messages of this sort always seem hopelessly hollow, and that's odd, considering the centuries-long history of beautiful writing about love (and by those in love) throughout the canon of literature.
■ It's probably too much, of course, to expect anyone working in corporate branding at a place like Snapchat to come up with a few lines that would achieve with an economy of words (and heart emojis) the kinds of sentiments it took Shakespeare 25,545 words to express in "Romeo and Juliet". But maybe it isn't too much to ask.
■ The half-hearted attempt to suggest that the user engage in "self-care" by "sav[ing] some for yourself of course" hints at the inkling of a sense of duty: That we ought to know that Valentine's Day isn't a purely joyful day for all -- whether because of love lost or love never found. Romance may be a nearly-universal aspect of human experience, but so are heartbreak and longing. Knowing that, is it really enough to commit the digital equivalent of a drive-by shooting with Cupid's arrow?
■ We have a long way to go before we truly grasp what our ever-present devices, our dopamine-triggering applications, and our complex senses of digital community are doing to us. For all the exceptional good they are capable of doing, they are also risky: To at least the same extent that computers can help monitor our social and emotional well-being, they are also capable of creating insidious hazards for vulnerable brains.
■ The science of it all is still so young that it is undoubtedly premature to think that the duty to prevent harm can be effectively imposed by regulation or other forms of legal control. But that duty exists nonetheless, and the people who work on these things we so casually call "platforms" must be reminded constantly of their human responsibility to ensure that they do the right thing.
■ There may be nothing wrong with sending a push message on an emotionally charged holiday like Valentine's Day, but it cannot escape the attention of reasonable people that there is a big difference between offering an aisle full of candy and cards for sale in a grocery store, and sending a message virtually unfiltered straight into a user's brain on a day when they might be in a vulnerable state.
■ It's not just the quality of the language that matters (though surely someone ought to look at examples like the love letters of John and Abigail Adams to see that they can do better than "Snap some love"); it's the psychology at work in both what is said and unsaid that has consequences, too. The choice to "do no harm" is available to everyone, including those of us outside the medical professions. And sometimes, rather than saying things poorly or risking saying something unintentionally harmful, it may be best to refrain from saying anything at all.