09-20, 10:05–10:55 (Europe/Amsterdam), Van Gogh
Recent advances in causal inference have led to sophisticated targeting methods, which consumers often perceive as intrusive. In response, policymakers have recently imposed bans on targeting because of its privacy-invasive nature (e.g., the restrictions placed on Meta). In this talk, we introduce two private targeting strategies that we prove satisfy differential privacy, a mathematical definition of privacy. These strategies allow analysts to target customers while simultaneously establishing a guaranteed level of privacy risk. The first strategy, "Private Causal Neural Networks" (PCNNs), estimates the causal or incremental effect of a targeting intervention. The second strategy randomizes the targeting decision itself. In two increasingly complex simulation studies, we benchmark how accurately the two private targeting strategies learn the population average treatment effect, the conditional average treatment effect (CATE), and the resulting targeting profitability. In a field experiment with over 400,000 customers, we empirically apply the privacy protection strategies and visualize the inherent trade-off between privacy risk and profitability.
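For reference, differential privacy has a precise mathematical form (this is standard background, not wording from the talk): a randomized mechanism M is (ε, δ)-differentially private if, for any two datasets D and D' that differ in a single customer's record and any set of outputs S,

\Pr[M(D) \in S] \le e^{\varepsilon} \cdot \Pr[M(D') \in S] + \delta.

Smaller ε and δ mean that any one customer's data can change the released model or targeting decision only to a limited degree.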
Conditional Average Treatment Effects (CATEs) are a cornerstone of causal inference, enabling estimation of the impact of a specific treatment or targeting intervention. However, CATE estimation frequently requires access to private information, raising concerns about violations of customer privacy. This issue is exemplified by the recent ban on Meta's use of personal data for targeting purposes.
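In standard potential-outcomes notation (added here as background, not taken from the talk), the CATE for a customer with covariates x is

\tau(x) = E[\,Y(1) - Y(0) \mid X = x\,],

where Y(1) and Y(0) denote the customer's outcome with and without the targeting intervention.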
This talk presents two private targeting strategies that mathematically guarantee privacy protection through differential privacy. I will start by introducing the double/debiased machine learning (DML) framework and targeting policies, on which the proposed strategies build. Following this, I will introduce the concept of differential privacy, which precisely quantifies customers' privacy risk. Subsequently, I will apply the proposed strategies in two increasingly complex simulation studies and in a field experiment with over 400,000 customers.
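To give a flavour of the DML building block, the snippet below is a minimal, hypothetical sketch of cross-fitted DML for the average treatment effect in a partially linear model. It is illustrative only, does not use the talk's code or Python package, and all variable names and the simulated data are my own assumptions.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n, p = 2000, 5
X = rng.normal(size=(n, p))                 # customer covariates
T = rng.binomial(1, 0.5, size=n)            # randomized targeting decision
Y = 2.0 * T + X[:, 0] + rng.normal(size=n)  # outcome; true effect is 2.0

# Cross-fitting: estimate the nuisance functions E[Y|X] and E[T|X] on one fold
# and form residuals on the held-out fold.
res_y, res_t = np.zeros(n), np.zeros(n)
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    m_y = RandomForestRegressor(random_state=0).fit(X[train], Y[train])
    m_t = RandomForestRegressor(random_state=0).fit(X[train], T[train])
    res_y[test] = Y[test] - m_y.predict(X[test])
    res_t[test] = T[test] - m_t.predict(X[test])

# Residual-on-residual regression yields the DML estimate of the effect.
theta = np.sum(res_t * res_y) / np.sum(res_t ** 2)
print(f"Estimated treatment effect: {theta:.2f}")  # should be close to 2.0

The same residualization idea underlies CATE estimators; the strategies discussed in the talk build on this DML pipeline.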
The talk will be structured as follows:
0-5: Intro to double/debiased machine learning and targeting policies.
5-10: Intro to differential privacy.
10-15: Introduction to private targeting strategy 1: the Private Causal Neural Networks (PCNNs).
15-20: Introduction to private targeting strategy 2: the Private Targeting Policy (see the illustrative sketch after this outline).
20-25: Application of the strategies in two simulation studies and the field experiment + Python package.
25-30: Conclusions.
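As a generic illustration of what randomizing a targeting decision can look like, the sketch below applies randomized response to a vector of binary targeting decisions: flipping each decision with probability 1/(1 + e^ε) makes the released decisions ε-differentially private. This is a standard mechanism shown for illustration only; the Private Targeting Policy presented in the talk may differ, and the function name and parameters are hypothetical.

import numpy as np

def randomize_decision(decisions: np.ndarray, epsilon: float,
                       rng: np.random.Generator) -> np.ndarray:
    """Keep each binary decision with probability e^eps / (1 + e^eps), flip otherwise."""
    keep_prob = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    keep = rng.random(decisions.shape) < keep_prob
    return np.where(keep, decisions, 1 - decisions)

rng = np.random.default_rng(42)
decisions = (rng.random(400_000) < 0.3).astype(int)  # hypothetical policy output
private = randomize_decision(decisions, epsilon=1.0, rng=rng)
print(private.mean())  # noisy share of targeted customers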
Target audience: The talk will benefit policymakers, data scientists, researchers, and practitioners who are interested in causal inference but also care about privacy. Attendees should have a basic understanding of the statistical methods used in data science.
I am an Assistant Professor of Marketing at the Rotterdam School of Management (Erasmus University Rotterdam). My research focuses on (differential) privacy and marketing analytics.