The UK’s competition watchdog is ratcheting up its probe into how companies are using algorithms – most notably through artificial intelligence – after publishing yet another study exposing how the often murky technology can stifle competition in digital markets and harm consumers.
The Competition & Markets Authority (CMA) is now seeking evidence and intelligence from academics and industry experts so that it can take further action where harm is found.
The move is the latest in a long line of investigations into the impact of algorithms on competition and consumers, including a 2019 probe into the pricing practices of online travel agents.
As a result, leading hotel booking sites, including Expedia, Trivago, and Booking.com, were forced to clean up their act after being found to regularly engage in pressure selling, offer misleading discounts and prioritise hotels which pay the biggest commissions.
And more than a year ago, the Government’s own advisory body, the Centre for Data Ethics & Innovation (CDEI), urged ministers to bring in new legislation to tighten up the use of algorithms that drive content on social media, including advertising, videos and posts. Insisting that existing legislation was “out of step” with consumers’ expectations, it called on the Government to hand control back to consumers rather than let big tech firms ride roughshod over everyone.
The majority of algorithms used by private firms online are currently subject to little or no regulatory oversight, and the latest CMA research concludes that more monitoring and action is required from regulators, including the CMA itself.
While conceding that many online activities – including shopping, social media, dating and booking holidays – could not exist without algorithms, the CMA believes they can have a negative impact on consumers in numerous ways.
Algorithms can be used to personalise services in ways that are difficult to detect, the CMA said, and search results can be manipulated to reduce choice or artificially change consumers’ perceptions. One example is misleading messages suggesting a product is in short supply.
Companies can also use algorithms to change the way they rank products on websites, pushing their own products to the fore and excluding competitors. More complex algorithms could aid collusion between businesses without firms directly sharing information, which could lead to sustained higher prices for products and services, the report claims.
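To make the self-preferencing risk concrete, below is a minimal, hypothetical sketch of how a ranking algorithm could quietly push a platform’s own products up the results. It is not taken from the CMA report; the product names, relevance scores and boost value are invented purely for illustration.

```python
# Hypothetical illustration only: a toy ranking function showing how a
# platform could quietly favour its own products. All names and weights
# are invented for this sketch; they do not come from the CMA report.

from dataclasses import dataclass


@dataclass
class Product:
    name: str
    relevance: float   # how well the product matches the search query (0-1)
    own_brand: bool    # True if the product is sold by the platform itself


OWN_BRAND_BOOST = 0.3  # hidden bonus applied to the platform's own products


def rank(products: list[Product]) -> list[Product]:
    """Sort products by relevance plus a hidden own-brand boost."""
    def score(p: Product) -> float:
        return p.relevance + (OWN_BRAND_BOOST if p.own_brand else 0.0)
    return sorted(products, key=score, reverse=True)


catalogue = [
    Product("Rival kettle", relevance=0.90, own_brand=False),
    Product("Own-label kettle", relevance=0.70, own_brand=True),
]

for p in rank(catalogue):
    print(p.name)
# Prints the own-label kettle first, even though it is the weaker match.
```

To the shopper, the results page looks like a neutral relevance ranking; the boost is applied inside the scoring function and is invisible without access to the code or careful testing, which is why the CMA argues this kind of behaviour is hard to detect from the outside.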
CMA director of data science Kate Brand said: “Algorithms play an important role online but, if not used responsibly, can potentially do a tremendous amount of harm to consumers and businesses. Assessing this harm is the first step towards being able to ensure consumers are protected and complements our wider work in digital markets to promote greater competition and innovation online.
“We want to receive as much information as possible from stakeholders in academia, the competition community, firms, civil society and third sector organisations in order to understand where the harm is occurring and what the most effective regulatory approach is to protect consumers in the future.”
The work is being led by the CMA’s Data, Technology & Analytics (DaTA) unit, claimed to be the largest team of data and technology experts in any competition or consumer agency worldwide.
As well as playing a vital role in informing the work of the proposed digital regulator, the Digital Markets Unit, which was announced late last year, the investigation will support the CMA’s wider digital markets strategy to protect consumers online. The CMA says it intends to work closely with the Information Commissioner’s Office and Ofcom, through the Digital Regulation Cooperation Forum, in this initiative.
In response, Which? director of policy and advocacy Rocio Concha said: “Algorithms can help consumers find suitable products and services as well as good deals, but can also be used to track and monitor behaviours in ways they are unaware of, leading to them being manipulated or misled – either accidentally or by design.
“From pressure-selling tactics by online accommodation booking sites to unscrupulous sellers using fake reviews to game their way to a valuable Amazon’s Choice endorsement, too often algorithms can lead to consumers losing out. If the regulator finds companies are using algorithms to harm consumers, it must work alongside the new Digital Markets Unit and be prepared to take action.”
One example of a flawed algorithm emerged last summer, when qualifications watchdog Ofqual used an equation to predict 2020 A-level and GCSE grades. The move sparked uproar after it was revealed that disadvantaged students had been worst hit by downgrades, while private school pupils’ results were boosted. University admissions were thrown into chaos and public confidence in the exams system plummeted, forcing the Government into an embarrassing U-turn which saw the algorithm scrapped and replaced by teachers’ predicted results instead.