How biased computer engineers affect lives

What you need to know:

Have you, or someone you know, ever applied for a loan, a job, a travel visa or admission to a school and been rejected?

Was a computer system blamed for rejecting the application? Did you feel the system was fair or did you wish that you were given a chance to explain yourself?

Denied opportunities

Millions of people around the world are denied opportunities by computer systems whose faulty code, whether by intention or by oversight, produces results that unfairly lock some people out.

The computer engineers who stitch together these systems are sometimes influenced by their blind spots, values, desires and biases.

This is a big deal given that computer systems have become the tools of choice for making life-changing decisions: whom to hire for a job, whom to lend money to, how much one should pay for health insurance, or whom to admit to university.

Slew of data

Computer systems are unmatched in their power to sift through a slew of data and spit out results that would otherwise take human beings ages to analyse.

Thanks to their efficiency, we have become overly dependent on computer systems. We must, however, not take their output as God’s word. Computer engineers need to be careful not to allow their values and desires to tilt the scales in favour of some and lock out others who merit a service.

The biases could be in the form of tribe, race, clan, gender, geography, marital status, income and many others.

There are stereotypes that tag people of a certain region, profession, religion, age or marital status as loan defaulters, lazy, thieves or violent. Systems engineers and data scientists should inoculate themselves against such mindsets when crafting systems that so many depend on. Building them in deliberately would be unethical.

Systems are modelled on how people have behaved in the past to predict the future.

For example, if people who fit certain characteristics have in the past defaulted on their loans, a computer engineer may code those parameters into the system so that new loan applicants with the same characteristics are labelled “high risk”.
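To make this concrete, here is a minimal, hypothetical sketch of how such rules end up in code. The field names, regions and thresholds are illustrative assumptions, not any real lender’s system; the point is that once historical patterns are hard-coded, the bias travels with them.

```python
# Hypothetical loan-screening rule built from past default patterns.
# All field names, regions and thresholds below are illustrative assumptions.

def label_applicant(applicant: dict) -> str:
    """Label a loan applicant "high risk" or "standard" using crude rules."""
    # Rules copied from historical defaults; if that history reflects bias
    # (e.g. against a region or an age group), the bias is now in the code.
    if applicant.get("region") in {"Region A", "Region B"}:
        return "high risk"
    if applicant.get("age", 0) < 25 and applicant.get("income", 0) < 30_000:
        return "high risk"
    return "standard"

print(label_applicant({"region": "Region A", "age": 40, "income": 90_000}))
# -> "high risk", regardless of this applicant's actual repayment record
```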

But the past is not always a predictor of the future. People and circumstances change.

When a computerised system labels someone a “bad hire”, “risky borrower”, “potential criminal” or “perennial loser” based on past information about people like them, it could be discriminating against them.

Cathy O’Neil, a bestselling author and data scientist, aptly calls biased computer systems “weapons of math destruction”, because they use mathematical algorithms to ruin lives by placing people in buckets where they don’t belong.

Of course, no system can be perfect, which is why regularly auditing and updating systems should be part of a systems engineer’s modus operandi.
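One routine audit, for instance, is to compare outcomes across groups and flag large gaps for human review. The sketch below uses made-up decision records, and the 0.8 cut-off (the common “four-fifths” rule of thumb) is purely an illustrative assumption.

```python
# Minimal audit sketch: compare approval rates across groups and flag gaps.
# The decision records and the 0.8 threshold are illustrative assumptions.

from collections import defaultdict

decisions = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
]

totals, approvals = defaultdict(int), defaultdict(int)
for d in decisions:
    totals[d["group"]] += 1
    approvals[d["group"]] += d["approved"]  # True counts as 1, False as 0

rates = {group: approvals[group] / totals[group] for group in totals}
best = max(rates.values())
for group, rate in rates.items():
    flag = "REVIEW" if rate < 0.8 * best else "ok"
    print(f"group {group}: approval rate {rate:.0%} [{flag}]")
```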

The writer is an informatics specialist. Email: [email protected] @samwambugu2