Last week, the Connecticut Attorney General published a privacy enforcement update that made my stomach turn.
A consumer had sent a complaint to the AG’s office because they received an advertisement in the mail for cremation services after recently completing chemotherapy. Apparently, the individual had been part of a list sold to the cremation company by a data broker. We can guess how the marketing logic flowed from there: “Inference: Terminal Illness?” “Interest: Likely Purchase – Burial/Cremation.”
Then came the news of Publicis’ $350M settlement for its role in marketing opioids to doctors as Purdue Pharma’s longtime ad agency.
Publicis issued a statement denying any wrongdoing, reassuring its shareholders that the Rosetta unit responsible for the reprehensible campaigns in question has long been shuttered. Publicis’ statement goes on to say that its “role was limited to performing many of the standard advertising services that agencies provide to their clients, for products that are to this day prescribed to patients.”
That’s interesting. I didn’t realize that “standard advertising services” included recording and analyzing intimate patient-doctor conversations to help drug companies guide doctors on how to “proactively address patient pushback against opioids” and get patients “comfortable with taking OxyContin, including in higher and higher doses.”
But at least Publicis has stopped these invasive and deceptive practices. Right?
Wrong. Publicis altogether fails to acknowledge that Verilogue, the unit that performs “proprietary exam room dialogue research” by recording doctor-patient conversations, is still active.
In fact, in 2023, Verilogue became part of a new Publicis Health unit called Insagic, which purportedly combines Verilogue’s capabilities with those of its data broker unit, Epsilon. A press release announcing the launch of Insagic touts that its data set includes “more than 1.6 million minutes of exclusive, real-world doctor-patient dialogues.”
We’ve already seen Publicis Health’s willingness to farm this intimate data for insights that help pharmaceutical manufacturers push more prescriptions in higher doses of potentially harmful drugs. What else could possibly go wrong?
I’m not suggesting that this data can only be used for nefarious purposes. Nor am I disputing that such health data could be used in advertising to drive positive outcomes.
I’m disputing the misguided belief that the advertising industry has any interest in exercising the self-restraint necessary to not exploit this sensitive data in ways that cause harm. Not when there’s money to be made.
I would love to see the math on how much revenue Publicis generated through its work for Purdue versus its $350M settlement. We can even ignore the indirect costs: the reputational harm and the higher liability caps savvy clients will now demand when engaging its data solutions. Alas, I digress.
After the news of Publicis’ settlement broke, a brave woman named Emily Deschamps, an employee of Publicis unit Performics, posted on LinkedIn, sharing the story of both of her parents, who tragically passed away from opioid overdoses before she finished college.
Deschamps suggested that, given how widespread the opioid crisis is, she’s probably not the only Publicis employee to have lost loved ones to addiction. She went on to ask: “What is the company going to do for individuals like me after this fine? […] What do you say to the employees like me?”
As Emily’s story poignantly illustrates, there are real-world harms that result from the industry’s ethical failures. Just like there were when Publicis’ Epsilon and WPP’s KBM Group sold data that enabled elder fraud. Just like there will be as a result of IPG’s Acxiom selling data to credit rating agencies.
The scope of harm only increases given how often this data is wrong, according to academics and insiders alike. (For example, I am child-free and married, but I am a single mother of two according to a recent data broker access request I submitted.) So it isn’t just the risk that invasively collected, ‘accurate’ data will be misused, but also that incorrect assumptions and erroneous inferences can be exploited in ways that are virtually undetectable and nearly impossible to correct.
Take, for example, this heart-wrenching comment submitted to the Federal Trade Commission in response to its Commercial Surveillance proposed rulemaking:
“When I had a miscarriage in 2016, due to my preparations for the baby up until the day the fetus expired, I was plagued with ads for infant supplies for months. […] Everywhere I looked I was reminded of my loss.” She ends by lamenting her powerlessness in avoiding these distressing targeted ads and the “advertiser surveillance” that enables them.
Is this what brands are paying for? Are these unethically extracted, low-quality data and off-base assumptions worth the human cost?
These are inferences and assumptions we have no right to make that impact futures that we have no right to limit. These are falsehoods we have no right to spread. These are real parents. Real people. Real lives. Real deaths.
So, no, I am not impressed by assertions like those of Publicis that its work was “compliant with the law.”
To be frank, if manipulating patients or recording doctor-patient conversations for advertising is legitimate, that is an indictment of the US legal system’s failure to protect citizens from harm. If the loopholes have grown so large, and if new laws have been this badly diluted and distorted by industry lobbyists on their way to signature, then the laws themselves are unfair and deceptive.
We shouldn’t be talking about the letter of the law on this issue.
We should be talking about Emily and how nobody else should have to lose their parents due in part to our industry’s broken moral compass. We should be talking about making sure that nobody undergoing treatment for a serious illness will receive ads targeted based on an inference of their impending death.
We have to do better. Holding companies. Advertisers. The industry. Congress. All of us.
A closing thought for brands: If your agency has been willing to abuse data in ways that can harm people, but you think it isn’t willing to do the same to your data and your budget, then I have a bridge to sell you.
“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.