24 Sep, 18
Explained, Technology

Artificial intelligence hates the poor and disenfranchised

Machine learning algorithms, whether in the form of “AI” or simple shortcuts for sifting through data, are incapable of making rational decisions because they don’t rationalize — they find patterns.

By Tristan Greene

The biggest actual threat humans face, when it comes to AI, has nothing to do with robots. It’s biased algorithms. And, like almost everything bad, they disproportionately affect the poor and marginalized.

Machine learning algorithms, whether in the form of “AI” or simple shortcuts for sifting through data, are incapable of making rational decisions because they don’t rationalize — they find patterns. That government agencies across the US put them in charge of decisions that profoundly impact human lives seems incomprehensibly unethical.
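
To make “finding patterns” concrete, here is a minimal sketch on synthetic data (the feature names, weights, and numbers are invented for illustration, not taken from any real system): a model is trained without ever seeing the protected attribute, yet a correlated proxy feature smuggles the historical bias right back in.

    # A minimal, hypothetical sketch: a pattern-finding model reproduces
    # bias it was never explicitly given. The protected attribute is
    # withheld from training, but a correlated proxy carries it anyway.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    group = rng.integers(0, 2, n)          # protected group; never shown to the model

    # A proxy that correlates with group membership, e.g. a segregated zip code.
    zip_code = np.where(rng.random(n) < 0.9, group, 1 - group)

    merit = rng.normal(size=n)             # a feature that actually matters

    # Historical labels are biased: group 1 was flagged more often, regardless of merit.
    label = ((merit + 1.5 * group + rng.normal(size=n)) > 1).astype(int)

    X = np.column_stack([zip_code, merit])  # note: no `group` column
    model = LogisticRegression().fit(X, label)

    pred = model.predict(X)
    for g in (0, 1):
        print(f"group {g}: flagged {pred[group == g].mean():.1%} of the time")
    # The model "discovers" the historical disparity through the proxy,
    # without ever being told about the protected attribute.

The model isn’t malicious; it is doing exactly what it was built to do, which is the point.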

When an algorithm manages inventory for a grocery store, for example, machine learning helps humans do things that would otherwise be harder. The manager probably can’t keep track of millions of items in his head; the algorithm can. But when it’s used to take away someone’s freedom or children, we’ve given it too much power.

Two years ago, the bias debate broke wide open when ProPublica published a damning article exposing apparent bias in the COMPAS algorithm, a risk-assessment system used to inform sentencing decisions by scoring an accused criminal’s likelihood of reoffending. Race isn’t one of the system’s explicit inputs, but the report showed case after case where the big fancy algorithm’s recidivism predictions effectively tracked skin tone.
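
ProPublica’s headline finding was a gap in error rates: among defendants who did not go on to reoffend, black defendants were flagged as high risk far more often than white defendants. The check itself is simple; here is a minimal sketch on synthetic numbers (the scores and rates below are invented, only the method mirrors the analysis):

    # Compare false positive rates between groups: of the people who did
    # NOT reoffend, how many did the model flag as high risk anyway?
    # All data here is synthetic; ProPublica used real COMPAS scores and
    # two-year recidivism records.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5_000

    group = rng.integers(0, 2, n)                 # two demographic groups
    reoffended = rng.random(n) < 0.35             # ground truth, two years later
    # Hypothetical risk scores that skew higher for group 1:
    score = rng.normal(loc=4 + 1.5 * group, scale=2, size=n).clip(1, 10)
    flagged = score >= 6                          # "high risk" threshold

    for g in (0, 1):
        innocent = (group == g) & ~reoffended     # never reoffended...
        fpr = flagged[innocent].mean()            # ...but flagged anyway
        print(f"group {g}: false positive rate = {fpr:.1%}")
    # A model can look accurate overall and still brand one group
    # "high risk" far more often among people who never reoffend.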

In an age where algorithms are “helping” government employees do their jobs, if you’re not straight, not white, or not living above the poverty line, you’re at greater risk of unfair bias.

That’s not to say straight, white, rich people can’t suffer at the hands of bias, but they’re far less likely to lose their freedom, children, or livelihood. The point here is that we’re being told the algorithms are helping. They’re actually making things worse.

Writer Elizabeth Rico believes unfair predictive analysis software may have influenced a social services investigator to take away her children. She wrote about her experience in an article describing how social services, whether intentionally or not, prey upon those who can’t afford to avoid the algorithm’s gaze. Her research revealed a system that equates being poor with being bad.

In the article, published on UNDARK, she says:

… the 131 indicators that feed into the algorithm include records for enrollment in Medicaid and other federal assistance programs, as well as public health records regarding mental-health and substance-use treatments. Putnam-Hornstein stresses that engaging with these services is not an automatic recipe for a high score. But more information exists on those who use the services than on those who don’t. Families who don’t have enough information in the system are excluded from being scored.

If you’re accused of being an abusive or neglectful parent, and you’ve had the means to treat any addictions or mental health problems in a private facility, the algorithm may just skip you. But if you use government assistance or carry a state- or county-issued medical card, you’re in the crosshairs.
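
A minimal sketch of that dynamic, with entirely hypothetical records, thresholds, and weights (the real tool weighs 131 indicators): because the score is computed from public administrative records, families who pay privately generate no input at all, while every public service leaves a scoreable trail.

    # Hypothetical stand-in for a records-driven risk score: every public
    # record bumps the score, and families without enough records are
    # never scored at all. Names, weights, and thresholds are invented.
    from dataclasses import dataclass

    @dataclass
    class Family:
        name: str
        public_records: list[str]   # Medicaid claims, assistance enrollment, ...

    def risk_score(family: Family) -> float | None:
        if len(family.public_records) < 2:
            return None              # "not enough information" -> excluded
        return min(20.0, float(len(family.public_records)))

    families = [
        Family("treated addiction via Medicaid", ["medicaid", "snap", "wic"]),
        Family("treated addiction privately", []),   # invisible to the system
    ]
    for f in families:
        print(f.name, "->", risk_score(f))
    # Identical circumstances, but only the family that used public
    # services produces a score that can flag them.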

And that’s the problem in a nutshell. The best intentions of researchers and scientists are no match for capitalism and partisan politics. Take, for example, the Stanford researchers’ algorithm that purportedly predicts gayness: it doesn’t, but that won’t stop people from thinking it does.

It isn’t dangerous in the Stanford machine learning lab, but the GOP-helmed federal government is increasingly anti-LGBTQ+. What happens when it decides that applicants have to pass a “gaydar” test before entering military service?

Matters of sexuality and race may not be intrinsically related to poverty or disenfranchisement, but the marginalization of minorities is. LGBTQ+ individuals and black men, for example, already face unfair legislation and systemic injustice. Using algorithms to perpetuate that is nothing more than automating cruelty.

We cannot fix social problems by reinforcing them with black-box AI or biased algorithms; it’s like trying to fight fire with fire. Until we develop 100 percent bias-proof AI, using these systems to take away a person’s freedom, children, or future is just wrong.


This article was previously published on The Next Web.






