
The UK government is seeking views on regulating the algorithms used to serve up results on Google and its few search engine rivals. A small percentage of people use DuckDuckGo, Yahoo or Microsoft's Bing, but over 80% of search traffic goes through Google. The main aim of the exercise seems to be reducing so-called bias in search engine results, or bias in algorithms used by AI to make financial decisions. In other words, an equality of outcome, regardless of past data or the contextual data that may surround a postcode, for example.
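To make the "equality of outcome" idea concrete, here is a minimal sketch of the kind of check an algorithm audit might run: compare decision rates across groups (for example, postcode areas) and flag large gaps. This is purely illustrative and not anything proposed by the DRCF; the data, group labels and the 5% tolerance are hypothetical.

```python
# Illustrative only: a minimal demographic-parity check of the kind an
# algorithm audit might run. Data, group labels and the 5% tolerance
# are hypothetical, not taken from the DRCF papers.
from collections import defaultdict

def approval_rates_by_group(decisions):
    """decisions: iterable of (group_label, approved: bool) pairs."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}

def parity_gap(rates):
    """Largest difference in approval rate between any two groups."""
    return max(rates.values()) - min(rates.values())

if __name__ == "__main__":
    # Hypothetical loan decisions keyed by postcode area.
    sample = [("AB", True), ("AB", True), ("AB", False),
              ("CD", True), ("CD", False), ("CD", False)]
    rates = approval_rates_by_group(sample)
    print(rates)                  # e.g. {'AB': 0.67, 'CD': 0.33}
    if parity_gap(rates) > 0.05:  # hypothetical 5% tolerance
        print("Outcome disparity exceeds tolerance; flag for review.")
```

The tension the article points to is visible even in this toy version: a gap in outcomes between postcodes may reflect genuine differences in past data, yet an equality-of-outcome standard would flag it regardless.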
The move may make it slightly easier to showcase your insurance offers online. As every insurer, broker or comparison site knows, Google alters the way results are served according to its own algorithm, and sometimes that seems to favour big brands spending big money on advertising with Google or its associated publishers. But the everyday reality of low-ranking search results is impossible to prove, because you have no way of knowing what your rivals are spending on Google.
As regards the suppression of independent information that may help consumers, it's worth noting how much Google – and Facebook – spend on sponsored media. For example, Google not only sponsors regional and national media, it has multi-million-dollar partnerships with Reuters and the BBC, and also funds individual journalists.
All of which raises the question: if you regulate the algorithm, what difference does that make if the 'independent' news content is being directly sponsored by Google?
Here's the word from the UK government:
So-called "algorithmic processing" is commonplace and often beneficial, underpinning many of the products and services we use in everyday life. From detecting fraudulent activity in financial services to connecting us with friends online or translating languages at the click of a button, these systems have become a core part of modern society. However, algorithmic systems, particularly modern Machine Learning (ML) or Artificial Intelligence (AI) approaches, pose significant risks if used without due care. They can introduce or amplify harmful biases that lead to discriminatory decisions or unfair outcomes that reinforce inequalities. They can be used to mislead consumers and distort competition.
Regulators need to work together to articulate the nature and severity of these risks and take measures to mitigate them. That’s how they can help empower the development and deployment of algorithmic processing systems in safe, responsible ways that are pro-innovation and pro-consumer.
The four digital watchdogs – the Competition and Markets Authority, Financial Conduct Authority, Information Commissioner’s Office and Ofcom – today invite views on what more is needed from regulators and where industry should step up. The four organisations are working together through the Digital Regulation Cooperation Forum (DRCF), which today publishes its annual report, its workplan for the year ahead and two papers on algorithms with a call for comments.
REGULATION OF DECISION-MAKING
The DRCF workplan for 2022/23 includes projects that will help to tackle some of our biggest digital challenges, including:
- Promoting competition and privacy in online advertising – foster competitive online advertising markets that deliver innovation and economic growth, while respecting consumer and data protection rights, via joint ICO and CMA work.
- Supporting improvements in algorithmic transparency – support the use of algorithmic processing to promote its benefits and mitigate the risks to people and to competition, by exploring ways of improving algorithmic transparency and auditing.
- Enabling innovation in the industries we regulate – encourage responsible innovation and explore different models for how we coordinate our work with industry to support innovation.
Gill Whitehead, DRCF Chief Executive, said:
The task ahead is significant – but by working together as regulators and in close co-operation with others, we intend for the DRCF to make an important contribution to the UK’s digital landscape to the benefit of people and businesses online. Just one of those areas is algorithms. Whether you’re scrolling on social media, flicking through films or deciding on dinner, algorithms are busy but hidden in the background of our digital lives. That’s good news for a lot of us a lot of the time, but there’s also a problematic side to algorithms. They can be manipulated to cause harm or misused because firms plugging them into websites and apps simply don’t understand them well enough. As regulators, we need to make sure the benefits win out.
Speaking on behalf of the algorithms project team, Stefan Hunt, CMA Chief Data and Technology Insight Officer, said:
Much work has already been done on algorithms by the CMA, FCA, ICO and Ofcom but there is more to do. We’re asking now, what more is needed, including from us as regulators and also from industry?
Today marks the chance for anyone involved in or with a view on the use of algorithms to have their say, particularly on how we might move to an effective, proportionate approach to audit to help ensure they are being used safely. The opportunity to offer views is open until Wednesday 8 June 2022.
We invite comments and discussion on the DRCF’s workplan and priorities for the year ahead. These should be submitted to DRCF@ofcom.org.uk.