Press release: The Risks of Technology in Business


Does technology work to level the playing field, or actually deepen bias?

Preference-based medical ads can lead to wrong treatments for users

Interacting algorithms (e.g. credit-reporting and information algorithms) may worsen disparities

Embargo (for quotes) Monday 5 June 2023 7pm (UK time)

I would like to invite you to a lecture on The Risks of Technology in Business by Gresham Professor Raghavendra Rau on Monday 5 June at 6pm (UK time), at Gresham College or online. Professor Rau will examine the issues that can arise when we apply technology uncritically, and the dangers individuals face when algorithms target their preferences and monitor their 'risks' and behaviour.

Rau will say, "Observed behaviour is not the same as preferences. And basing ads on what we think are real preferences can have important real-world consequences. For example, pharmaceutical companies use Facebook to target and advertise to people with certain medical conditions. Citizen Browser collected data on Facebook advertisements and found that pharmaceutical companies use specific language and images to target users with medical conditions such as diabetes, arthritis, and cancer. This type of targeted advertising can be harmful, as it can lead to users receiving biased information and potentially inappropriate treatments."

It is not clear, he will explain, who has your data or how it is being used; and he will explore how hidden information, for example in financial institutions, may introduce more bias: "Many factors that appear neutral, such as where someone shops or takes vacations, can act as proxies for race, which goes against the Fair Housing Act of 1968. Worse, hidden relationships across credit interactions can inject biases across a number of different areas. For example, if a person is charged more for an auto loan, they could also be charged more for a mortgage," Rau will say.

Algorithms used by the justice system carry even more risks, Rau suggests: "Consider, for example, the algorithms commonly used to recommend the release of incarcerated people awaiting trial. The use of predictive algorithms in the criminal justice system can perpetuate racial biases and lead to unfair treatment of defendants. These algorithms, used to assess the risk of a defendant committing a future crime, are based on factors such as criminal history, age, and employment status, but they often don't consider other important factors such as poverty and racial disparities in the criminal justice system. There are numerous examples of people being deemed high risk by the algorithms, resulting in harsher sentences, even though they had not committed any crimes in the past."

Rau will go on to discuss the issue of interacting algorithms, from credit-reporting algorithms to those used by government agencies, to information algorithms. Examples of issues created by these interacting algorithms are the cases in the UK, America and Australia where people were wrongly accused of fraud, such as the UK Post Office scandal.

“Credit-reporting algorithms can interact with information algorithms that are used by media services such as Facebook, LinkedIn, TikTok, or Twitter to direct your attention to particular news items. This interaction can amplify biased and discriminatory information, with significant negative consequences. These algorithms can easily target vulnerable individuals who may already have financial difficulties, making them more susceptible to misleading or fraudulent advertisements and information. Moreover, the biased information can impact people's decision-making, including financial decisions, and may reinforce social inequalities.”

ENDS

Notes to Editors

You can sign up to watch the hybrid lecture online or in person; or email us for an embargoed transcript or to speak to the lecturer: l.graves@gresham.ac.uk / 07799 738 439

Read more about Professor Rau

Sign up to our monthly newsletter to get advance notice of our events.