
2018 has produced a huge range of examples of the unintended consequences of algorithms. From the ACLU’s research in July, which showed how Amazon’s facial recognition software incorrectly matched images of members of Congress with mugshots, to the revelation that Amazon had scrapped an experimental recruiting algorithm that discriminated against women, this has been a year in which the damage algorithms can cause has become apparent.

But this week, an open letter by Gillian Brockell, who works at The Washington Post, highlighted the traumatic impact algorithmic personalization can have.

In it, Brockell detailed how personalized ads accompanied her pregnancy, and speculated on how the major platforms that dominate our digital lives share data about their users with one another. “…I bet Amazon even told you [the tech companies to which the letter is addressed] my due date… when I created an Amazon registry,” she wrote.

But she went on to explain how those very algorithms were incapable of processing the tragic death of her unborn baby, blind to the grief that would unfold in the aftermath. “Did you not see the three days’ silence, uncommon for a high-frequency user like me?”

Brockell’s grief was compounded by the way these companies continued to engage with her through automated messaging. She explained that although she clicked the “It’s not relevant to me” option those ads offer users, this only led the algorithms to ‘decide’ that she had given birth, serving her deals on strollers and nursing bras.

As Brockell notes in her letter, stillbirths aren’t as rare as many think, with 26,000 happening in the U.S. alone every year. This fact only serves to emphasize the empathetic blind spots in the way algorithms are developed. “If you’re smart enough to realize that I’m pregnant, that I’ve given birth, then surely you’re smart enough to realize my baby died.”

Brockell’s open letter garnered a lot of attention on social media, to such an extent that a number of the companies at which Brockell had directed her letter responded.

Speaking to CNBC, a Twitter spokesperson said, “We cannot imagine the pain of those who have experienced this type of loss. We are continuously working on improving our advertising products to ensure they serve appropriate content to the people who use our services.”

Meanwhile, a Facebook advertising executive, Rob Goldman, responded, “I am so sorry for your loss and your painful experience with our products.” He also explained how these ads could be blocked: “We have a setting available that can block ads about some topics people may find painful — including parenting. It still needs improvement, but please know that we’re working on it & welcome your feedback.” Experian did not respond to requests for comment.

However, even after taking Goldman’s advice, Brockell revealed that she was then shown adoption adverts.

“It crossed the line from marketing into Emotional Stalking,” said one Twitter user.

While the political impact of algorithms drew sustained commentary and criticism in 2018, this story reveals the personal impact algorithms can have. It highlights that as artificial intelligence systems become ever more embedded in everyday life, engineers will need acute sensitivity to the potential use cases and consequences of the algorithms they build.

You can read Brockell’s post on Twitter.
