How AI could potentially manipulate consumers

Professor Ali Makhdoumi says companies may be able to exploit a growing information advantage over consumers

Published: Apr 19, 2024 10:33:27 AM IST
Updated: Apr 19, 2024 10:38:51 AM IST

AI-powered platforms have more information about consumer preferences than consumers have about themselves. Image: Shutterstock

The rise of artificial intelligence is enhancing online platforms’ ability to predict customer preferences. The results can benefit users, who may find value in personalized advertising and targeted product recommendations.

But according to Professor Ali Makhdoumi of Duke University’s Fuqua School of Business, the new AI tools are also making companies better at understanding customers’ vulnerabilities, which may enable them to manipulate user behavior in ways beneficial for the company but not for customers.

“Platforms have an information advantage,” Makhdoumi said in a talk on Fuqua’s LinkedIn page. “The question is whether this is good or bad for consumers.”

The company’s information advantage

Makhdoumi said that AI-powered platforms have more information about consumer preferences than consumers have about themselves.

Consumers learn about products as they use them, he said, but it takes time for them to work out whether a product’s attractiveness reflects its real quality.

He explained that platforms collect data about what draws consumers to products, regardless of their inherent quality—which gives platforms a greater understanding of a product’s “glossiness,” or its attractiveness to the consumer.

“We know for example about the mechanisms behind impulse buying,” Makhdoumi said. “Platforms can predict these forces and they may use them for gain.”

A model of behavioral manipulation

In a working paper shared by the National Bureau of Economic Research, Makhdoumi and co-authors Daron Acemoglu and Asu Ozdaglar of MIT, and Azarakhsh Malekian of the University of Toronto, built a model to predict the conditions under which platforms may find it profitable to manipulate consumers’ behavior by using techniques to increase ‘glossiness,’ such as enhanced packaging, hidden costs, and exaggerated benefits.

The researchers found that it is optimal for companies to manipulate consumer perception—and drive them to low-quality products—when the “glossiness effect” is persistent, but not when customers can learn quickly about the products’ quality.

“If the platform can fool consumers for a long time, it will engage in behavioral manipulation,” Makhdoumi said.
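The intuition behind that finding can be illustrated with a toy simulation (this is not the authors’ model; the function name, parameter values, and decay-based learning rule are all illustrative assumptions). A consumer keeps buying a low-quality product while its perceived value—true quality plus a glossiness boost that fades as the consumer learns—exceeds their outside option. When glossiness decays slowly, the platform profits for many periods; when consumers learn fast, manipulation barely pays.

```python
def platform_profit(true_quality, gloss, decay, outside_option=0.5,
                    periods=20, margin=1.0):
    """Illustrative sketch (not the NBER paper's model): total profit from
    promoting a glossy product. The consumer buys while perceived value
    exceeds their outside option; each period the glossiness effect decays
    geometrically as the consumer learns the product's true quality."""
    profit = 0.0
    for t in range(periods):
        perceived = true_quality + gloss * (decay ** t)
        if perceived > outside_option:
            profit += margin  # consumer buys; platform earns the margin
        else:
            break  # consumer has learned enough and stops buying
    return profit

# A low-quality product (true quality 0.2) dressed up with glossiness 0.6.
slow_learning = platform_profit(0.2, 0.6, decay=0.95)  # persistent gloss
fast_learning = platform_profit(0.2, 0.6, decay=0.50)  # consumers learn fast
```

Under these assumed numbers, the persistent-glossiness case yields purchases for many periods while the fast-learning case collapses almost immediately—matching the paper’s qualitative finding that manipulation is profitable only when the glossiness effect persists.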

This means companies may use the information imbalance to steer consumers toward products that benefit only the company and may even be “harmful” to customers, he said, mentioning possibilities involving surveillance and price discrimination.

Potential regulation

Constant improvement in large language models and machine learning capabilities will likely offer even more opportunities for behavioral manipulation, Makhdoumi said.

Companies will be able to collect more and more information about consumers’ tendencies towards multiple products at a time, compounding the information imbalance even further, he said.

“It's sort of a double whammy,” he said. “When the consumer eventually learns about the low quality of a product, then the platform will switch to another product.”

He said it is not clear whether competition among companies for user data will do anything to moderate this practice. Competition could actually make the problem worse, he said, since once a user joins multiple platforms, each platform can exploit the same data.

Public policy may play a role, Makhdoumi said, but any regulation will need to strike a delicate balance.

“Preventing the harvesting of user information is not easy and may not always be desirable,” he said, “because AI-powered data collection may also have benefits for consumers.”

The other potential avenue for regulation could be to limit the way platforms offer tailored recommendations, Makhdoumi said.

“You put a limit on personalized recommendations, so that the platform can’t take the process to the extreme. This may also help reduce behavioral manipulation.”

[This article has been reproduced with permission from Duke University's Fuqua School of Business. This piece originally appeared on Duke Fuqua Insights]
