Sat. Jan 4th, 2025

Artificial intelligence (AI) tools could be used to manipulate online audiences into making decisions – ranging from what to buy to who to vote for – according to researchers at the University of Cambridge.

The paper highlights an emerging new marketplace for “digital signals of intent” – known as the “intention economy” – where AI assistants understand, forecast and manipulate human intentions and sell that information on to companies that can profit from it.

The intention economy is touted by researchers at Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI) as a successor to the attention economy, in which social networks keep users hooked on their platforms and serve them adverts.

The intention economy involves AI-savvy tech companies selling what they know about your motivations, from plans for a stay in a hotel to opinions on a political candidate, to the highest bidder.

“For decades, attention has been the currency of the internet,” said Dr Jonnie Penn, a historian of technology at LCFI. “Sharing your attention with social media platforms such as Facebook and Instagram drove the online economy.”

He added: “Unless regulated, the intention economy will treat your motivations as the new currency. It will be a gold rush for those who target, steer and sell human intentions.

“We should start to consider the likely impact such a marketplace would have on human aspirations, including free and fair elections, a free press and fair market competition, before we become victims of its unintended consequences.”

The study claims that large language models (LLMs), the technology that underpins AI tools such as the ChatGPT chatbot, will be used to “anticipate and steer” users based on “intentional, behavioural and psychological data”.

The authors said the attention economy allows advertisers to buy access to users’ attention in the present, via real-time bidding on ad exchanges, or in the future, by acquiring a month’s worth of ad space on a billboard.

LLMs will be able to access attention in real time as well, by, for instance, asking whether a user has thought about seeing a particular film (“Have you thought about seeing Spider-Man tonight?”), and by making suggestions relating to future intentions, such as asking: “You mentioned feeling overworked, shall I book you that movie ticket we’d talked about?”

The study raises a scenario in which these examples are “dynamically generated” to match factors such as a user’s “personal behavioural traces” and “psychological profile”.

“In an intention economy, an LLM could, at low cost, leverage a user’s cadence, politics, vocabulary, age, gender, preferences for sycophancy, and so on, in concert with brokered bids, to maximise the likelihood of achieving a given aim (eg to sell a film ticket),” the study suggests. In such a world, an AI model would steer conversations in the service of advertisers, businesses and other third parties.
