AI-POWERED PREDICTIVE ANALYTICS IN MARKETING: ETHICAL CONCERNS SURROUNDING CONSUMER MANIPULATION AND PRIVACY
Author: Tuba Sameen
ABSTRACT
AI has taken marketing to a new level by enabling predictive analytics that shape marketing strategies to address clients' needs accurately. While these technologies transform a corporation's ability to meet customer needs efficiently, they raise ethical issues of consumer manipulation, privacy violation, and misuse of customer data. This study uses a quantitative method with a deductive approach to investigate these issues, testing hypotheses on 300 online consumers exposed to AI-based marketing advertisements. Data were collected through closed-ended questionnaires and analyzed using the Statistical Package for the Social Sciences (SPSS). Among participants, 72% of consumers consider AI-driven advertisements invasive, and 58% report purchasing products they never intended to buy because of AI recommendations. Only 35% of the consumers polled understood how their data are collected and used, indicating that transparency gaps persist in current practices. In addition, 81% of participants expressed strong support for stricter regulation of AI in marketing to protect privacy and self-determination. These findings underscore the need for ethical AI tools and measures, stronger accountability procedures, and improved data-protection legislation. The work contributes to ongoing discussions of the responsible use of AI in marketing and offers empirical results for policymakers and other stakeholders. By addressing these ethical issues, businesses can improve user trust and sustain the use of AI-based predictive analytics in today's dynamic business environment.
Keywords: AI-powered predictive analytics, consumer manipulation, data privacy, ethical marketing, quantitative research