A few years ago, a father called customer service at Target to admonish the company for repeatedly sending ads about baby products to his teenage daughter. Unbeknownst to him, his daughter actually was pregnant. This uncomfortable story became the classic illustration of something known in the business intelligence community as “the creep factor” in marketing—the unnerving realization that companies may infringe on your privacy via your computer or mobile phone data in a single-minded effort to drive product sales.
While working as a data analyst for a large supermarket, I looked into the Target story further with an academic inquiry in mind. What was it about the daughter’s purchases at Target that accurately singled her out for pregnancy advertisements? Her purchase of cotton balls and unscented lotion, as it turns out. Statistical models funnel loads of data through algorithms to infer traits about shoppers. This black-box method of computation is part of what makes the big data craze so alarming. Seemingly innocuous data, taken in large quantities, may reveal more about you than you know.
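To make the inference concrete, here is a minimal sketch of how purchases can be funneled through a model to score pregnancy likelihood. The product weights and baseline are purely illustrative assumptions, not Target’s actual model; the point is only that ordinary items, summed together, can tip a score.

```python
import math

# Hypothetical weights a retailer might fit from historical purchase data.
# These numbers are illustrative assumptions, not any company's real model.
PRODUCT_WEIGHTS = {
    "unscented lotion": 1.2,
    "cotton balls": 0.8,
    "prenatal vitamins": 2.5,
    "scented candle": -0.6,
}
BIAS = -2.0  # baseline log-odds: most shoppers are not pregnant

def pregnancy_score(basket):
    """Estimated probability that a shopper is pregnant, given a basket."""
    log_odds = BIAS + sum(PRODUCT_WEIGHTS.get(item, 0.0) for item in basket)
    return 1.0 / (1.0 + math.exp(-log_odds))  # logistic function

basket = ["cotton balls", "unscented lotion", "prenatal vitamins"]
print(round(pregnancy_score(basket), 2))  # → 0.92
```

No single item here is revealing on its own; it is the accumulation across a basket that pushes the score past a threshold, which is exactly why innocuous-seeming data in large quantities can expose so much.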
This week, a new app called Castlight created a statistical model that predicts when users might become pregnant. The app flags women who have stopped filling birth-control prescriptions as well as women who have made fertility-related searches on their health app. On the surface at least, this seems like just another version of the Target story—a company uses data to predict pregnancy. The trouble is that Castlight interfaces with both employers and their employees, providing a means of circumventing the legalities that prevent employers from knowing about their employees’ health conditions. A new concern is that employers may learn women are pregnant via third-party apps, placing them at risk of serious discrimination.
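The two reported signals amount to simple rules. The sketch below shows how such flagging might look; the field names, 90-day lapse threshold, and search terms are my assumptions for illustration, not Castlight’s actual logic.

```python
from datetime import date, timedelta

# Illustrative rules modeled on the two reported signals.
# The threshold and term list are assumptions, not Castlight's real criteria.
REFILL_LAPSE = timedelta(days=90)
FERTILITY_TERMS = {"fertility", "ovulation", "conceive"}

def flag_user(last_birth_control_refill, searches, today):
    """Flag a user whose refills lapsed or whose searches mention fertility."""
    lapsed = (today - last_birth_control_refill) > REFILL_LAPSE
    searched = any(term in q.lower() for q in searches for term in FERTILITY_TERMS)
    return lapsed or searched

print(flag_user(date(2016, 1, 1), ["best ovulation tracker"], date(2016, 2, 1)))  # → True
```

Rules this crude will inevitably flag people incorrectly, and either way the output flows to a party the user never chose to inform, which is the crux of the concern.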
Before faulting big data, profit-hungry companies, or misogynistic work practices, let’s consider the role of algorithms generally. A statistician named George Box said, “All models are wrong, but some are useful.” The trouble, of course, with being useful is that you have to be useful to someone, and more often than not, the creators and beneficiaries of algorithms are people in positions of power. To me, that’s the real issue at stake here.
There’s a lot of misunderstanding about the neutrality of algorithms. There’s an illusion that because they’re built of numbers, algorithms are inherently unbiased. The reality is that humans build their own biases into the algorithms. Furthermore, all data is biased toward the environment in which it was collected. That means health insurance companies collect data in service of running a sustainable, high-functioning business. We expect our insurance companies to have solid business aims. The trouble, of course, is that an algorithm’s output can be at odds with other priorities.
As long as companies are interested in profits, predictive algorithms are here to stay. On the surface, cutting healthcare costs is a perfectly reasonable goal. The trouble is that algorithms have a feedback loop with the outside world. Thus, private information about pregnancy can be made public without the consent of the app users. Companies aren’t going to stop predicting the pregnancies of women. The harsh reality is that pregnant women are the most profitable demographic to Internet advertisers. Similarly, from a coldly quantitative view, extended employee absence represents a substantial cost to employers. The cultural trend of reducing women’s pregnancies to “zeroes and ones” with a profit-minded outlook is a symptom of severely disordered societal values. Unfortunately, we can’t expect the business intelligence community to adjust their worldview to accommodate a more holistic value system focused on the value raising children contributes to a flourishing society. A line often attributed to Einstein says it best: “Not everything that counts can be counted, and not everything that can be counted counts.”
This news has several messages for women mindful of their data. For starters, it doesn’t make you an alarmist to be mindful of what apps you use. Read the fine print carefully: third-party redistribution of app data may seem harmless on the surface, but it has real consequences.
On a related note, the legally grey territory of third party information passing between employer and employee merits quick and thorough investigation so that preventative privacy measures may be put in place, both for the sake of women who may become pregnant as well as women or men who may suffer from critical health conditions. As the proliferation of data and algorithms threatens to outstrip the nuances of our ethical codes, employers would do well to remember that no one is immune from the creep factor.
Lastly, the Castlight app is a symptom of the larger issues at stake concerning pregnant women in the workforce. Hopefully this app prompts real, open conversation about creating a work culture that is hospitable to pregnant employees—a culture mindful of the longer-term value of employees with healthy families that also acknowledges the pressures on companies to stay afloat.
Photo Credit: Adobe Stock