Despite firms' attempts to anonymize user data, identities can still be detected
Data privacy is hard for consumers to understand, so the onus may truly be on companies to develop better anonymization schemes
Data is all around us, to the point that it can feel unnerving. But Jiaming Xu, assistant professor in decision sciences at Duke University’s Fuqua School of Business, stressed that although many people are concerned about data privacy, they also experience many benefits from the ways their data is used.
Predictive analytics can help people find what they’re looking for faster, Xu said. In health care, algorithms can help identify which patients are at risk for certain diseases, or which are most likely to benefit from a certain drug or treatment, he said.
However, even as firms attempt to protect users by anonymizing their data, there is the possibility their identities could be discovered. In a live discussion on Fuqua’s LinkedIn page, Xu explained some of his research on network data privacy and how easily information can be traced back to individual users.
“Due to the rapid developments of machine learning and data science, companies and individuals rely more and more on data to solve decision problems in businesses,” Xu said. “However, it turns out that collecting and disseminating a large amount of data in bulk can potentially expose customers to serious privacy breaches,” he said, noting a 2021 incident in which anonymized data for as many as 700 million LinkedIn users was scraped from a public dataset and could be used to reveal personal data such as user names, phone numbers and email addresses.
Even with standard anonymization and sanitization, in which a user’s identity is removed and some of their activity is redacted, users have unique patterns of behavior and can still be re-identified from these signatures, Xu said.
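A toy example makes the point concrete. The Python sketch below is not Xu’s method; it simply illustrates how a unique behavioral signature (here, a frequency distribution over activity types) can link an “anonymized” record back to an identified profile in auxiliary data. All names, records, and field choices are hypothetical.

```python
# Hypothetical illustration (not Xu's method): re-identifying "anonymized"
# users by matching behavioral signatures against auxiliary, identified data.
from collections import Counter

# Anonymized records: user identities removed, activity patterns retained.
anonymized = {
    "anon_1": ["news", "sports", "sports", "finance"],
    "anon_2": ["music", "music", "travel", "news"],
}

# Auxiliary data an attacker might hold (e.g. scraped public profiles),
# where similar activity is already linked to real identities.
auxiliary = {
    "alice@example.com": ["sports", "news", "finance", "sports"],
    "bob@example.com":   ["travel", "music", "news", "music"],
}

def signature(activities):
    """Represent a user's behavior as a frequency distribution."""
    counts = Counter(activities)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def similarity(sig_a, sig_b):
    """Overlap between two frequency distributions (1.0 = identical)."""
    keys = set(sig_a) | set(sig_b)
    return sum(min(sig_a.get(k, 0), sig_b.get(k, 0)) for k in keys)

# Link each anonymized record to the most similar identified profile.
for anon_id, activities in anonymized.items():
    anon_sig = signature(activities)
    best = max(auxiliary,
               key=lambda who: similarity(anon_sig, signature(auxiliary[who])))
    print(f"{anon_id} most closely matches {best}")
```

Even this crude matching links both anonymized records to the right profiles, which is the essence of the risk: removing names does little if the remaining behavior is distinctive enough to act as a fingerprint.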
[This article has been reproduced with permission from Duke University's Fuqua School of Business. This piece originally appeared on Duke Fuqua Insights]