CMA poised to get personal on pricing algorithms!

27 January 2021

The CMA has published an eye-opening paper on algorithms, showing how they can, if misused, reduce competition in digital markets and cause harm to consumers.

So, if you sell online, take note: this paper shows the ‘direction of travel’ on key issues such as personalised pricing, personalised ads and product recommendations. The CMA has also launched an accompanying call for evidence.

The CMA acknowledges that machine learning, algorithms and AI can benefit consumers, for example by generating individual recommendations (saving consumers time) and by allowing businesses to optimise their interfaces and interactions with customers (‘choice architecture’). However, it is deeply concerned that algorithmic tools and processes can cause online harms and discrimination, not helped by the lack of transparency when they are deployed.

The paper is extensive, so here are our key takeaways:

1. The CMA has set up a new Digital Markets Unit (DMU) to promote competition, which will have a proactive monitoring role.

2. It has also recruited engineers, technologists and behavioural scientists as part of a new ‘Data, Technology and Analytics’ (DaTA) team, to deploy new analytical and investigative techniques, and to broaden the CMA’s range of evidence and intelligence.

3. The CMA is looking at algorithms through both competition and consumer law. From a consumer law perspective, the CMA’s lens is the Consumer Protection from Unfair Trading Regulations 2008 (the CPRs), including the requirements not to trade unfairly by acting contrary to professional diligence and not to mislead consumers, in each case in ways that distort consumers’ purchasing decisions.

4. Dynamic and personalised pricing is on the CMA’s radar. This, it says, can be harmful because it is difficult for consumers to detect, can target vulnerable consumers and/or can have unfair ‘distributive’ effects. These harms can occur through the manipulation of consumer choices, often without the consumer being aware. Personalised pricing can involve the use of large amounts of personal data relating to consumer spending habits, phone data and loyalty card data. An example is the higher ‘surge’ pricing applied by taxi apps when, say, a user’s phone battery is nearly flat; the sketch below illustrates the mechanism.
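
To make the mechanism concrete, here is a minimal, purely hypothetical sketch of the kind of pricing rule the CMA describes. The signals, weights and numbers are invented for illustration and do not come from the paper:

```python
# Illustrative only: a toy personalised-pricing rule of the kind the CMA
# describes. All signal names and weights are hypothetical.

BASE_FARE = 10.00

def personalised_price(base: float, battery_pct: int, loyalty_tier: str,
                       past_peak_purchases: int) -> float:
    """Adjust a base price using personal signals the consumer cannot see."""
    multiplier = 1.0
    if battery_pct < 10:          # urgency signal: user unlikely to shop around
        multiplier += 0.25
    if past_peak_purchases > 5:   # spending history suggests price insensitivity
        multiplier += 0.10
    if loyalty_tier == "new":     # introductory discount to acquire the customer
        multiplier -= 0.15
    return round(base * multiplier, 2)

# Two users requesting the identical product see different prices:
print(personalised_price(BASE_FARE, battery_pct=5,  loyalty_tier="gold", past_peak_purchases=8))  # 13.5
print(personalised_price(BASE_FARE, battery_pct=80, loyalty_tier="new",  past_peak_purchases=0))  # 8.5
```

Neither user can tell that the other was quoted a different price for the same product, which is exactly why the CMA emphasises how hard these practices are for consumers to detect.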

5. Other topics of concern include:

  • product recommendations and filtering algorithms;
  • manipulation of user purchasing journeys, including the introduction of bias into product ratings and reviews;
  • algorithmic discrimination and geographic targeting, including online sharing-economy platforms that fail to mitigate discrimination arising from aggregate user behaviour, especially where platforms attempt to build trust and facilitate transactions by reducing anonymity (such as ‘freelance marketplaces’);
  • unfair ranking and design, including where platforms manipulate search result rankings to obtain higher commissions or revenues (‘self-preferencing’);
  • ‘dark patterns’, including where consumer biases or vulnerabilities are exploited such as via the use of ‘scarcity’ messages;
  • exclusionary self-preferencing practices, where dominant firms use algorithms to deter competitors from challenging their market position; and
  • the use of automated pricing systems and collusion tactics, including:
    • businesses using the same algorithmic systems to set prices, whether by using the same software or by delegating pricing to a common intermediary (known as a ‘hub and spoke’ structure, illustrated in the sketch after this list); and
    • autonomous tacit collusion, where algorithms learn to collude without any explicit information sharing or coordination.
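
To illustrate the ‘hub and spoke’ concern, the hypothetical sketch below shows two ‘competing’ sellers that both delegate pricing to the same intermediary rule: their prices align and settle above the starting level without the sellers ever communicating directly. The rule and figures are invented:

```python
# Hypothetical sketch of the 'hub and spoke' risk: two competitors delegate
# pricing to the same intermediary algorithm, so their prices align without
# any direct contact between them.

def intermediary_price(market_demand: float, rival_last_price: float) -> float:
    """One shared rule (the 'hub') sets prices for every client (the 'spokes')."""
    # A rule that never undercuts the rival by more than 1% stops prices
    # competing downwards.
    return max(market_demand * 1.2, rival_last_price * 0.99)

price_a = price_b = 10.0
for _ in range(10):
    price_a = intermediary_price(market_demand=9.0, rival_last_price=price_b)
    price_b = intermediary_price(market_demand=9.0, rival_last_price=price_a)

# Both sellers converge on the same price (10.8), above the 10.0 they
# started at, with no direct coordination between them.
print(price_a, price_b)
```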

6. The CMA considers various techniques for investigating these harms, including:

  • investigating automated systems without direct access to the underlying code, for example through mystery digital shoppers (which the CMA has used in the past), scraping audits, the use of APIs, reverse engineering, and emulating web apps to submit queries with specific data and intercept the output (a simple sketch of this approach follows this list); and
  • investigating automated systems with direct access to data and algorithms, including dynamic analysis such as automated testing through use of the code. The CMA might also undertake ‘randomised controlled trials’ (RCTs) to run end-to-end audits of the kind often run internally by web-facing businesses.
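
As a rough illustration of the first technique, the sketch below emulates a ‘mystery digital shopper’: it submits the same product query under two simulated user profiles and compares the quoted prices. The endpoint, parameters and response fields are all assumptions, not a real API, and a real audit would need many repeated, randomised probes:

```python
# A minimal sketch of 'mystery shopping' an automated system without access
# to its code: submit the same query under different simulated user profiles
# and compare the returned prices. Endpoint, parameters and the "price"
# field are hypothetical.

import requests

ENDPOINT = "https://example.com/api/quote"  # hypothetical price-quote API

profiles = {
    "control":      {"device": "desktop", "loyalty": "none"},
    "mobile_loyal": {"device": "mobile",  "loyalty": "gold"},
}

quotes = {}
for name, profile in profiles.items():
    resp = requests.get(ENDPOINT, params={"product": "SKU-123", **profile},
                        timeout=10)
    quotes[name] = resp.json()["price"]

# A persistent gap between profiles for an identical product suggests
# personalisation worth investigating further.
print(quotes)
```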

7. The CMA concludes that there is a strong case for intervention for these key reasons:

  • The opacity of algorithmic systems and the lack of operational transparency make it hard for consumers and customers to effectively discipline traders. Many practices regarding ‘choice architecture’ are likely to become even more subtle, sophisticated and difficult to detect; and
  • Some of the practices outlined above involve the algorithmic systems of large businesses that occupy strategic positions in the economy.

8. The CMA points out that there is already some guidance available to businesses, for example from the Centre for Data Ethics and Innovation, especially in relation to concerns around bias. Organisations should carry out risk assessments along the lines of DPIAs (data protection impact assessments), and developers can use open-source tools and share best practices. The ICO has also produced a framework for auditing AI, as well as guidance on explaining how algorithms work.

Do get in touch if you would like help with assessing the impact of algorithmic tools and processes, explaining them to customers and/or advice on associated equality and discrimination issues.

 
