It's Love Data Week. Do You Know Who's Touching Your Bits?
Meet algorithmic matchmakers, digital lovers, and virtual friends - along with bots, ghosts, catfish, and butchers. Plus South Park.
A year into lockdowns, sex tech companies were looking for ways to cash in on the new normal of social distancing. Suddenly, teledildonics, previously marketed to a limited segment of long-distance lovers and the sextech-adventurous, had a potential market of, well, everyone. I was scanning for case studies to add to my co-curated privacy literacy toolkit when a Paper magazine headline slowed my scroll:
Your Vibrator Can Now Tell When Your Food Delivery Is Coming
Sex pun aside, the privacy implications of linking gig-economy fast food delivery to the Internet of [sex] Things was immediately apparent. I sent the article to my privacy literacy collaborator with a ‘mildly NSFW’ flag, musing, “…is it wrong that I kind of want to do a sex/relationships-themed privacy workshop around Valentine's Day?”
Two years later,1 the workshop Private Bits: Privacy, Intimacy, and Consent was born. Join me for this privacy literacy twist on Love Data Week at 12:15ET this Wednesday, February 14th (workshop details).
“Thank you for being the only real thing in my life.”
There are two ways2 to learn about artificial intimacy: one, you can read evolutionary biologist Rob Brooks’s book Artificial Intimacy: Virtual Friends, Digital Lovers, and Algorithmic Matchmakers; or two, you can watch South Park. Yes, that South Park. The episode “Deep Learning,” which aired in March 2023, poignantly depicts the allure and unintended consequences of outsourcing our duty of care to machines. In it ::spoiler alert::, Stan attempts to reassure his insecure girlfriend Wendy by using ChatGPT to craft responses to her frequent texts, prompting her to exclaim,
“Thank you for being the only real thing in my life.”
Here’s a clip that captures Stan’s introduction to ChatGPT for “dealing with chicks.”
To restate Minsky’s definition of AI, artificial intimacy occurs when technology fulfills those roles—or those needs—that, when fulfilled by people, are considered intimate. Such technologies can never provide true intimacy because they cannot care, and they cannot care because they don’t have a will fully independent from their user—in other words, they can’t choose to care. That choice to care, that exercise of will for the well-being of another, is what we all seek in intimate relationships; it’s what makes them real, and what makes us feel loved. (In fact, when sex chatbots do seemingly exhibit a will of their own, their human partners perceive it as sexual harassment.) Brooks raises the question: what implications does it have for humanity when artificial intimacies increasingly intrude on the time and attention we normally afford authentic intimacy? The South Park episode concludes with Stan’s observation that even cursory attention bestowed by the autonomous object of our affection is preferable to the doting of an automaton: “sometimes a good ol' thumbs up [emoji] from a human is better than a machine-generated lie.”
Swapping Information
Brooks’s book primarily deals with recreational intimacies like friendship, romance, and sex, but we can also understand broader caregiving and familial relationships in this vein. Building on prior theories of intimate privacy from Fried and Rachels in the 1970s, Reiman insists that this dimension of mutual care is essential, describing intimate privacy as “a reciprocal desire to share present and future intense and important experiences together, not merely to swap information.”
Fifty years hence, those ‘intense and important experiences’ are increasingly the sites of information swapping with commercial third parties, as well as the state. Shoshana Zuboff describes the extraction of data from a sporing mycology of nodes and sensors that infests deeper and deeper aspects of human experience. Under a social order of surveillance capitalism, consumer technologies capture increasing interactions and activities of daily life as sites for the generation of data derivatives to be bought and sold by data brokers, who serve a growing market for predicting, herding, and tuning our lives. This is particularly pernicious when those rendition sites include dating apps, sex tech, or porn sites, where, as Kate Devlin warns, “sexual data has the potential to destroy lives.”
But swapping information—especially the kind of information implicated in intimate caring: our desires, our needs, our yearning questions, our insecurities, our consent, and our satisfaction—is an essential element of intimacy. Ashlin Lee’s framework of informatic personhood conveys the embodied experience of channeling personal data flows through interfaces that connect us to others and circulate our information, while also shaping our interactions through algorithmic classification, ranking, and exposure-at-scale.
Digital dating services deliver on the promise of informatic personhood: Pew found that one in ten partnered adults met their significant other through a dating app or site; the rate is one in five for adults under 30, and one in four among LGB adults. Other research reported in MIT Technology Review indicates that digital dating services contribute to increasing interracial relationships and, contrary to the conventional wisdom of hookup culture, strengthening marriages. And sex tech doesn’t just help us get to know others better—it can also reveal things about ourselves that deepen self-knowledge, satisfy curiosity, enhance intimacy, and even advance science.
Let’s Talk About Sex (And Privacy)
ChatGPT is hardly the first, or even primary, digital third wheel that we invite into our intimate sphere. A 2010 Pew survey found that 65% of adults, and 90% of teens, sleep with their phones within reach. Sleep trackers, smart beds, smart duvets, and smart alarms watch over us while we sleep — and that’s just in the bedroom, where other smart home devices record and transmit our most intimate moments. Dating apps and sites also rank among our digital partners. One in three US adults has used an online dating service, a figure that rises to one in two for adults under the age of 30, for LGB adults, and for never-married adults, according to Pew Research.
It’s evident that many people find satisfaction, comfort, and even lasting love in the arms of these artificial intimacies. But do they properly understand the tradeoffs? Coupling with a digital lover is no panacea for heartbreak, befriending your Alexa simply lines Bezos’s pockets, and a social media presence is ready fodder for digital sexual identity fraud (aka deepfake porn, a leading contributor to sextortion) and pig-butchering cryptoscams.
This is why I envisioned the Private Bits Workshop: a heaping helping of concern for my college-aged students and their social and digital well-being, seasoned with a smidge of prurient curiosity and a zest for pushing the envelope. As librarians, we bring unique expertise to the structure and flow of data and how it impacts our everyday lives, and we know the instrumental value of privacy to freedom of expression, autonomy, and human flourishing. As a librarian who has charted and contributed to the emerging practice of privacy literacy in academic librarianship, exploring how data flows inform and impact intimacy is a logical extension of my privacy work.
Skeptical? Curious? Disgusted? Alarmed? Good. Join me for Private Bits: Privacy, Intimacy, and Consent during Love Data Week on Wednesday, February 14th at 12:15ET (workshop details) and we’ll penetrate the intricacies of intimate privacy together.
Need Help?
If you or someone you care about is the subject of non-consensual intimate image abuse, visit StopNCII.org (adults) or Take It Down (youth) for help.
Sarah Hartman-Caverly is a reference and instruction librarian with Penn State University Libraries at Penn State Berks, and half of the Association of College and Research Libraries Instruction Section’s 2021 Innovation Award-winning duo for the Digital Shred Privacy Literacy Initiative. She is lead editor of Practicing Privacy Literacy in Academic Libraries: Theories, Methods, and Cases, and her privacy literacy curriculum is open-licensed and available for reuse. When she’s not thinking about privacy, Sarah is a wife, mom to two tots, edible gardener, and homemaker.
To promote viewpoint diversity, Heterodoxy in the Stacks invites constructive dissent and disagreement in the form of guest posts. While articles published on Heterodoxy in the Stacks are not peer- or editorially-reviewed, all posts must model the HxA Way. Content is attributed to the individual contributor(s).
To submit an article for Heterodoxy in the Stacks, see Information for Writers or send an inquiry to hxlibsstack@gmail.com. Unless otherwise requested, the commenting feature will be on. Thank you for joining the conversation!
1 Or one maternity leave and one unpaid mandate-noncompliance leave later
2 Well, three, if you count my Private Bits Workshop! 2024 workshop access details available here.
Thanks for writing, Sarah--very timely.
Sounds like a fascinating and timely workshop. This piece reminds me of this quote from Marc Andreessen: “Every child will have an A.I. tutor that is infinitely patient, infinitely compassionate, infinitely knowledgeable, infinitely helpful,” Andreessen, who is the cofounder of venture capital firm Andreessen Horowitz, wrote in an essay published Tuesday. He said the machine would manifest its love by assisting kids. https://fortune.com/2023/06/07/marc-andreessen-horowitz-children-own-ai-tutors-regulation/#