You check prices online for a flight to Melbourne today. It's $300. You leave your browser open. Two hours later, it's $320. Half a day later, $280. Welcome to a world where technology tries to figure out what price you're willing to pay.
Artificial intelligence (AI) is quietly remaking how companies set prices. Not only do prices shift with demand (dynamic pricing), but firms are increasingly tailoring prices to individual customers (personalised pricing).
This change isn't just technical: it raises big questions about fairness, transparency and regulation.
How different pricing models work
Dynamic pricing reacts to the market and has been used for years on websites.
Algorithms track supply, demand, timing and competitor prices. When demand peaks, prices rise for everyone. When it eases, they fall. Think Uber's surge fares, airline ticket jumps in school holidays, or hotel rates during major events. This kind of variable pricing is now commonplace.
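The mechanics can be sketched in a few lines of code. Everything here is invented for illustration, including the multiplier, floor and cap; real platforms use far richer models, but the basic demand-over-supply logic is the same.

```python
# Illustrative sketch of demand-based dynamic pricing (hypothetical numbers,
# not any real platform's formula): the price scales with the ratio of
# current demand to available supply, clamped between a floor and a cap.

def dynamic_price(base_price: float, demand: int, supply: int,
                  floor: float = 0.8, cap: float = 2.0) -> float:
    """Scale the base price by a demand/supply multiplier, clamped to [floor, cap]."""
    multiplier = demand / max(supply, 1)
    multiplier = max(floor, min(cap, multiplier))
    return round(base_price * multiplier, 2)

# Quiet period: 40 riders, 100 drivers -> price falls to the floor.
print(dynamic_price(10.0, demand=40, supply=100))   # 8.0
# Peak period: 300 riders, 100 drivers -> price hits the cap.
print(dynamic_price(10.0, demand=300, supply=100))  # 20.0
```

Note that every customer sees the same output at any given moment: the price depends only on market conditions, not on who is asking.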
Personalised pricing goes further. AI uses personal data, such as your browsing history, purchase habits, device, even postcode, to predict your willingness to pay. The price varies with the individual.
Two people looking at the same product at the same time might see different prices. A person who always abandons carts might get a discount, while someone who rarely shops might see a premium price.
A study by the European Parliament defines personalised pricing as "price differentiation for identical products or services at the same time based on information a trader holds about a potential customer".
Whereas dynamic pricing depends on the market, personalised pricing depends on the individual consumer.
It started with airfares
This shift began with the airline industry. Since deregulation in the 1990s, airlines have used "yield management" to alter fares depending on how many seats are left or how close to the departure date a booking is made.
More recently, airlines have combined that with personalisation. They draw on shopping behaviour, social media context, device type and past browsing history, all to craft fare offers.
Hotels followed. A hotel might raise its base rate, but send a special "member only" discount to someone who has stayed before, or offer a price drop to someone lingering on a booking page. In hotel revenue management, such tactics enable companies to target distinct customer segments with different benefits (such as leisure versus business travellers).
AI enhances this process by enabling the translation of large amounts of customer data into individual prices.
Now the trend is spreading. E-commerce platforms such as Booking.com offer personalised discounts, depending on your profile. From grocery promos to digital subscription plans, the reach can be broad.
How AI-driven personalised pricing works
At its core, such systems mine a lot of data. Every click, the amount of time spent on a web page, prior purchases, abandoned carts, location, device type, browsing path: these all feed into a profile. Machine learning models predict your "willingness to pay". Using these predictions, the system picks a price that maximises revenue while hoping not to lose the sale.
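The loop described above (profile in, revenue-maximising price out) can be sketched roughly as follows. The logistic model, its weights and the profile features are invented for illustration, not drawn from any real system, but they show how two shoppers can be quoted different prices for the same product.

```python
import math

# Hypothetical sketch of revenue-maximising personalised pricing.
# The model and weights below are invented; real systems learn them
# from clicks, dwell time, purchase history and similar signals.

def purchase_probability(price: float, profile: dict) -> float:
    """Logistic model: estimated probability the customer buys at a given price."""
    score = (2.0
             - 0.04 * price                      # higher price, lower odds
             + 1.0 * profile["loyal"]            # repeat buyers convert more
             - 0.5 * profile["abandons_carts"])  # cart-abandoners are price-sensitive
    return 1 / (1 + math.exp(-score))

def pick_price(profile: dict, candidates: list[float]) -> float:
    """Choose the candidate price with the highest expected revenue."""
    return max(candidates, key=lambda p: p * purchase_probability(p, profile))

prices = [40.0, 50.0, 60.0, 70.0]
bargain_hunter = {"loyal": 0, "abandons_carts": 1}
regular = {"loyal": 1, "abandons_carts": 0}
print(pick_price(bargain_hunter, prices))  # 40.0 -> discount to win the sale
print(pick_price(regular, prices))         # 60.0 -> a premium the model expects them to pay
```

Same product, same moment, two different prices: the only input that changed was the profile.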
Some platforms go further. At Booking.com, machine learning has been used to select which users should receive a special offer, while meeting budget constraints. This drove a 162% increase in sales, while limiting the cost of promotions for the platform.
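A minimal sketch of that kind of budget-constrained offer targeting, assuming a simple greedy heuristic (the users, predicted uplifts and promotion costs below are invented, and real systems solve this far more carefully):

```python
# Hedged sketch of budget-constrained promotion targeting: give offers to the
# users with the best predicted sales uplift per dollar of promotion cost,
# stopping when the budget would be exceeded. All numbers are hypothetical.

def select_offer_recipients(users: list[dict], budget: float):
    """Greedily pick users by uplift-per-dollar until the budget runs out."""
    chosen, spent = [], 0.0
    for user in sorted(users, key=lambda u: u["uplift"] / u["cost"], reverse=True):
        if spent + user["cost"] <= budget:
            chosen.append(user["id"])
            spent += user["cost"]
    return chosen, spent

users = [
    {"id": "a", "uplift": 12.0, "cost": 5.0},  # 2.4 uplift per dollar
    {"id": "b", "uplift": 8.0,  "cost": 2.0},  # 4.0 uplift per dollar
    {"id": "c", "uplift": 3.0,  "cost": 4.0},  # 0.75 uplift per dollar
]
print(select_offer_recipients(users, budget=8.0))  # (['b', 'a'], 7.0)
```

The greedy rule funds users "b" and "a" and skips "c", whose offer would blow the budget for little predicted return.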
So you might not be seeing a standard price; you might be seeing a price engineered for you.
The risk is consumer backlash
There are, of course, risks to the strategy of personalised pricing.
First, fairness. If two households in the same suburb pay different rent or mortgage rates, that seems arbitrary. Pricing that uses income proxies (such as device type or postcode) might entrench inequality. Algorithms may discriminate (even unintentionally) against certain demographics.
Second, alienation. Consumers often feel cheated when they find a lower price later. Once trust is lost, customers might turn away or seek to game the system (clear cookies, browse in incognito mode, switch devices).
Third, accountability. Currently, transparency is low; firms rarely disclose the use of personalised pricing. If AI sets a price that breaches consumer law by being misleading or discriminatory, who's liable: the firm or the algorithm designer?
What the regulators say
In Australia, the Australian Competition and Consumer Commission (ACCC) is taking notice. A report published in June 2025 flagged algorithmic transparency, unfair trading practices, and consumer harms as central issues.
The commission said:
current laws are insufficient and regulatory reform is urgently needed.
It recommended stronger oversight of digital platforms, economy-wide unfair trading rules, and mechanisms to force algorithmic disclosure.
Is this efficient, or creepy?
We're entering a world where your price might differ from mine, even in real time. That can unlock efficiency, new forms of loyalty pricing, or targeted discounts. But it can also feel Orwellian, unfair or exploitative.
The challenge for business is to deploy AI pricing ethically and transparently, in ways customers can trust. The challenge for regulators is to catch up. The ACCC's actions suggest Australia is moving in that direction, but many legal, technical and philosophical questions remain.
, Professor of Marketing,
This article is republished under a Creative Commons license.