On your smartphone, you’re not much more than a data machine, generating reams of valuable information that tech companies can mine for insights, sell to advertisers and use to optimize their products.
The Cambridge Analytica scandal highlights anew the vulnerability of consumer data in this digital age. A third-party Facebook app harvested data not only from the roughly 270,000 users who consented to its terms of service but also from their Facebook friends, reaching well beyond the people who had agreed to share anything, and that data was later used in political campaigns, including Donald Trump’s 2016 bid for the presidency.
But it’s easy to forget these risks to personal privacy and security while tapping out messages to friends or scrolling endlessly through the web. The distraction machines at our fingertips ask for access and we give it up quickly, hastily agreeing to unread privacy policies and terms of service in exchange for a fresh jolt of content.
Studies highlight this “digital privacy paradox”: people express concerns about their privacy but then act in ways that undermine those stated beliefs, for example, handing over personal data for a small incentive. This review features research on consumer attitudes toward digital privacy, as well as studies of the supply side: research on the practices of app developers and other tech companies that shape data collection and use policies.
Summary: This paper looks at the risks big data poses to consumer privacy. The author describes the causes and consequences of data breaches and the ways in which technological tools can be used for data misuse. She then explores the interaction between privacy risks and the U.S. market. For example, the author highlights the “self-conflicting” views consumers hold about their privacy, citing literature in which consumers give away personal data for small incentives despite attitudes that might indicate otherwise. On the supply side, similar paradoxes exist — for example, despite an awareness of cyber risks, firms “tend to deploy new technology… before adopting security measures to protect them.” The author discusses how market forces might motivate firms to strengthen privacy settings in response to consumer concerns, but also mentions how market mechanisms can have the opposite effect, using the example of password policies and consumers’ demand for convenience (in the form of weaker password requirements). The author then describes how artificial intelligence might be used to mitigate data security and privacy risks. Lastly, she provides an overview of U.S. policy on consumer privacy and data security and describes future challenges in the field.
Abstract: “‘Notice and Choice’ has been a mainstay of policies designed to safeguard consumer privacy. This paper investigates distortions in consumer behavior when faced with notice and choice which may limit the ability of consumers to safeguard their privacy using field experiment data from the MIT digital currency experiment. There are three findings. First, the effect small incentives have on disclosure may explain the privacy paradox: Whereas people say they care about privacy, they are willing to relinquish private data quite easily when incentivized to do so. Second, small navigation costs have a tangible effect on how privacy-protective consumers’ choices are, often in sharp contrast with individual stated preferences about privacy. Third, the introduction of irrelevant, but reassuring information about privacy protection makes consumers less likely to avoid surveillance, regardless of their stated preferences towards privacy.”
Summary: This paper looks at strategies mobile app developers use to collect data, which apps are most likely to practice intrusive data collection, and what factors predict problematic personal data usage. By examining the variations in data collection strategies of different apps created by the same developers over a period of four years, the researchers uncover three trends. 1) With time and experience, developers adopt more intrusive data collection tactics. 2) Apps with intrusive data collection strategies most commonly target adolescents. 3) Apps that request “critical and atypical permissions” (i.e., access to various data sources) are linked with an increased risk of problematic data practices later on.
“The Economics of Privacy”
Acquisti, Alessandro; Taylor, Curtis R.; Wagman, Liad. Journal of Economic Literature, 2016. DOI: 10.1257/jel.54.2.442.
Abstract: “This article summarizes and draws connections among diverse streams of theoretical and empirical research on the economics of privacy. We focus on the economic value and consequences of protecting and disclosing personal information, and on consumers’ understanding and decisions regarding the trade-offs associated with the privacy and the sharing of personal data. We highlight how the economic analysis of privacy evolved over time, as advancements in information technology raised increasingly nuanced and complex issues associated with the protection and sharing of personal information. We find and highlight three themes that connect diverse insights from the literature. First, characterizing a single unifying economic theory of privacy is hard, because privacy issues of economic relevance arise in widely diverse contexts. Second, there are theoretical and empirical situations where the protection of privacy can both enhance, and detract from, individual and societal welfare. Third, in digital economies, consumers’ ability to make informed decisions about their privacy is severely hindered, because consumers are often in a position of imperfect or asymmetric information regarding when their data is collected, for what purposes, and with what consequences. We conclude the article by highlighting some of the ongoing issues in the privacy debate of interest to economists.”