Apps now remind us to contact significant others, boost our willpower, provide us with moral guidance, and encourage us to be civil. Taken together, we’re observing the emergence of tech that doesn’t just augment our intellect and lives, but is beginning to automate and outsource our humanity.
While I am far from a Luddite who fetishizes a life without tech, we need to consider the consequences of this latest batch of apps and tools.
But let’s take a concrete example. Instead of doing the professorial
pontification thing we tech philosophers are sometimes wont to do, I
talked to the makers of BroApp, a “clever relationship wingman” (their
words) that sends “automated daily text messages” to your significant
other. It offers the promise of “maximizing” romantic connection through “seamless relationship outsourcing.”

Now, it’s perfectly possible that this app is a parody (the promo video includes bitcoin creator Satoshi Nakamoto and feminist voice Germaine Greer among the demo contacts), and its creators “James” and “Tom” didn’t share their last names with me. But my 29-year-old interlocutors — one who apparently has a degree in Engineering and Mathematics, the other in Design and Applied Finance — had clearly thought deeply about why relationship management tools are socially desirable and will be increasingly integrated into our everyday lives.
Drawn here and shared with permission is their rationale, which I believe goes beyond just this one app. So even if it’s a parody (indeed, sadly “we can’t tell”), it captures a real automation-app trend and widely held convictions in the tech community that we need to pay attention to.

First, some quick background on how BroApp works: It not only sends scheduled texts, but comes preloaded with 12 messages to help users get started. The developers also took steps to conceal the automation going on behind the scenes; in places designated “no bro zones,” the app is automatically disabled. (After all, the jig is up if your girlfriend receives an automated text from you while you’re at her place.) The app even has a rating system that lowers the risk of the same message being sent too frequently.
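To make those mechanics concrete, here is a minimal, hypothetical sketch of what such a scheduler could look like. It is my own illustration, not BroApp’s actual code, and every name in it (STARTER_MESSAGES, NO_BRO_ZONES, pick_message, maybe_send) is invented for the example.

```python
import random
from datetime import datetime, timedelta

# Hypothetical sketch only -- not BroApp's actual implementation.
# A handful of preloaded starter messages, echoing the "12 messages" idea.
STARTER_MESSAGES = [
    "Thinking of you!",
    "Hope your day is going well.",
    "Can't wait to see you tonight.",
]

# Coordinates where sending is suppressed ("no bro zones"), e.g. the partner's home.
NO_BRO_ZONES = [(40.7128, -74.0060)]  # placeholder (lat, lon) pairs


def in_no_bro_zone(lat, lon, radius_deg=0.01):
    """Crude geofence check: is the phone near any suppressed location?"""
    return any(
        abs(lat - zone_lat) < radius_deg and abs(lon - zone_lon) < radius_deg
        for zone_lat, zone_lon in NO_BRO_ZONES
    )


def pick_message(recently_sent, window=5):
    """Avoid repeating any of the last few messages (the 'rating system' idea)."""
    candidates = [m for m in STARTER_MESSAGES if m not in recently_sent[-window:]]
    return random.choice(candidates or STARTER_MESSAGES)


def maybe_send(now, scheduled_at, lat, lon, recently_sent, send_fn):
    """Send a scheduled text, unless it is not due yet or the phone is in a no-bro zone."""
    if now < scheduled_at:
        return None  # not time yet
    if in_no_bro_zone(lat, lon):
        return None  # stay silent: an automated text here would give the game away
    message = pick_message(recently_sent)
    send_fn(message)
    recently_sent.append(message)
    return message


if __name__ == "__main__":
    # Demo: the message is already due, the phone is far from any no-bro zone,
    # and send_fn is just print() so nothing actually gets texted.
    history = []
    maybe_send(datetime.now(), datetime.now() - timedelta(minutes=1),
               51.5074, -0.1278, history, print)
```

The point of the sketch is simply that this kind of simulated attentiveness is cheap to build: a timer, a geofence, and a de-duplication check are all it takes.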
While the app currently advertises the core benefit of spending “more time with the bros,” the developers say initial testing included other scenarios: “A girl who used it to message her boyfriend.” Someone who “used it to message her Mum a few times a week.” But let’s put aside the many gender implications for a moment. There’s certainly much to discuss there, and by no means do I want to dismiss the fact that this type of thing exacerbates power differentials and perpetuates the problem of sexism in the tech industry.
Yet the app also suggests something else, more subtly problematic, which provoked me to focus more on how it functions than on the obvious concerns around how it is depicted.
Technology that optimizes for efficiency is good for society
BroApp is good for society, its makers argue, because it can make people happy without adverse consequences. To persuade me of this point, James and Tom presented me with this rosy scenario:

“A guy starts using BroApp with his girlfriend, set to send a message around 12pm each weekday. Guy observes that girlfriend is now much happier when he arrives home from work. Guy is no longer stressed about finding time during a busy day to text. Girl is much happier because her boyfriend is more engaged with their relationship.”
Most interestingly, the BroApp makers depicted this functionality in
economic terms — as increasing both agents’ happiness. As they observed,
“Isn’t this a Pareto optimal (everybody happier, nobody unhappier)
outcome?” But as other economists have observed, Pareto efficiency doesn’t necessarily optimize for individual freedom.

And that’s not to mention the very algorithmic, linear way of thinking James and Tom share here that glosses over the non-linear, tricky negotiations and nuances of relationships. Narratives of frictionless bliss like the one espoused by BroApp persuade because they depict scenarios where interpersonal exchanges become efficient without degrading the quality of communication. But just as laconic expressions of gratitude undermine the pro-social dimensions of etiquette, using duplicitous technological contrivance to increase the frequency of exchanges between romantic partners chips away at the moral commitments that make these relationships special.
Tech progress is inevitable; it’s “what technology wants”
The makers of BroApp believe it is one “small step” toward the world depicted in the movie Her, where the main character falls in love with an intelligent OS. Even if autonomous OSes remain in the realm of science fiction, the digital assistants that end up attending to our desires will inevitably anticipate our needs and much more. Embracing this inevitability, the makers of BroApp argue that “The pace of technological change is past the point where it’s possible for us to reject it!”

When pushed to elaborate, they cited the influence of Kevin Kelly’s What Technology Wants and made several strong predictions. “Do we believe that widespread adoption of self-driving cars are inevitable? Yes. … Do we believe that greater than human-level AIs are inevitable? Yes.” And so on. James and Tom went on to declare:
“If there is a niche to be filled: i.e. automated relationship helpers, then entrepreneurs will act to fill that niche. The combinatorial explosion of millions of entrepreneurs working with accessible technologies ensures this outcome. Regardless of moral ambiguity or societal push-back, if people find a technology useful, it will be developed and adopted.”

It’s funny that everyone mentions Her. Certainly, the movie promises a new vision for the future of UI design — one where artificial intelligence isn’t isolated tech but a given part of our lives. But to me the film demonstrated how relationships diminish when others represent our intimate feelings for us — feelings we might not have or be attuned to. Meanwhile, things that seem useful in the moment can be disastrous long-term, not least because of emergent behavior. Even the ethics of saving lives with autonomous cars are far murkier than we might think, as my friend Patrick Lin shares here in WIRED. That’s why Lin and I argue that companies like Google should have a critically minded A.I. ethics board — the issues are too complex for their moral ambiguities to be ignored.
We can’t (and shouldn’t) reject automation
The other presupposition the makers of BroApp — and arguably other tech-centric developers — make is that as artificial intelligence becomes more expert, we’ll find it harder to reject algorithmic judgment.

If a smart yet inexpensive piece of technology can take some of life’s burdensome weight off our shoulders, isn’t it irrational — an outdated sense of humans-are-better-than-machines pride — to avoid accepting assistance that leads to better sleeping, eating, working, exercising, and even loving?
Of course, there’s all kinds of stuff we’re bad at doing or don’t want to do, and digital assistants, apps, and algorithms can help. I too see our relying on some kinds of outsourcing technology as both likely and helpful. But I also believe extreme dependency is a problem to be aware of. The line separating a beneficial from a self-undermining type of assistance isn’t always clear, and tipping points do exist. We can’t afford to overlook them, much less pretend they aren’t there in the first place.
Tech change elicits discomfort only at first before it changes the norm
Finally, James and Tom observe, many people who are uncomfortable with the kind of innovation that changes relationships will experience unease, but only momentarily; over time, the anxiety or dismay will fade and a new normal will emerge. Proof of this point, the BroApp makers told me, lies in familiar examples of temporary moral panic: kids didn’t forget how to communicate because of text messaging; accusations that friendships on Facebook aren’t real died down; and so on.
The implication here is that after a little time passes, the folks
who hyperventilate over automating “sweet messages” will get over it.

It’s true that what we find upsetting and even creepy can change over time. But there are also plenty of cases where we course-correct because of pushback (sometimes leading to a better end result). And when meaningful distinctions aren’t drawn between different types of cases, we too easily draw false equivalencies. By the logic the BroApp makers use, we should accept that privacy is dead and embrace living in public. But if the Snowden fallout has taught us anything, it’s that the public can be roused to demand accountability and change when it realizes the consequences of seemingly minor decisions in aggregate.
* * *
Ultimately, the reason technologies like BroApp are problematic is that they’re deceptive. They take situations where people make commitments to be honest and sincere, but treat those underlying moral values as irrelevant — or, worse, as obstacles to be overcome. If they weren’t, BroApp’s press document wouldn’t contain cautions like: “Understandably, a girl who discovers their guy using BroApp won’t be happy.”
In our correspondence, James and Tom focus on managing subjective perceptions
as opposed to realities. The key, they say, is that a girlfriend will
be happy because she’ll “perceive her boyfriend as more engaged”. But
focusing on perception misses the point. When we commit to someone, we
basically promise to do our best to be aware of their needs and desires —
to be sensitive to signs of distress and respond accordingly, not give
the appearance of this fidelity and sensitivity. Time-delayed
notes do just the opposite: They allow the sender to focus on other
things, while simulating a narrow range of attention that obscures the
person’s real priorities.

It’s easy to think of technologies like BroApp as helpful assistants that just do our bidding and make our lives better. But the more we outsource, the more of ourselves we lose.
Now, what if people actually use these apps in a meaningful way — to customize and program in their own personal messages, so the app only offloads the logistics? That could be useful. But the reality is that inertia is a powerful force in human affairs; people are unlikely to take that extra step. And even if users do, there still remains an important difference between messages becoming items crossed off a to-do list and conveying them in a heartfelt manner during the actual moments when it feels appropriate to express them.
James and Tom compared using BroApp to lying to kids about the existence of Santa Claus. But that actually validates my argument: The relationship parents have with young children is a relationship between unequal parties. I would hope that relationships between adult romantic partners are predicated on equality, and don’t revolve around infantilizing behavior.
Read more: http://www.wired.com/2014/02/outsourcing-humanity-apps/