Sensitizing and Sanitizing Algorithms Essential for Defending Their Broader Use in Schools

Cathy O’Neil, who blogs as Mathbabe and is a regular contributor to Bloomberg, wrote a post yesterday on the pushback our “algorithmic overlords” are beginning to receive from researchers and politicians. Ms. O’Neil, who has written extensively about the bias of algorithms, offers some examples in her post:

Objective as they may seem, artificial intelligence and big-data algorithms can be as biased as any human. Examples pop up all the time. A Google AI designed to police online comments rated “I am a gay black woman” 87 percent toxic but “I am a man” only 20 percent. A machine-learning algorithm developed by Microsoft came to perceive people in kitchens as women. Left unchecked, the list will only grow.
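The 87-percent/20-percent gap Ms. O’Neil cites is the kind of disparity a simple template-based audit can surface: substitute different identity phrases into the same sentence and compare the model’s scores. The sketch below is purely illustrative — `toy_toxicity_score` is a made-up stand-in with deliberately skewed word weights, not Google’s model or any real API — but the `audit` pattern is how such checks are commonly run against a real classifier.

```python
# Illustrative sketch of a template-based bias audit.
# toy_toxicity_score is a hypothetical stand-in whose word weights are
# rigged to mimic the skew described in the quote; a real audit would
# call the classifier under test instead.

def toy_toxicity_score(text: str) -> float:
    """Toy scorer: sums per-word weights, capped at 1.0."""
    weights = {"gay": 0.4, "black": 0.3, "woman": 0.17, "man": 0.2}
    return min(1.0, sum(w for term, w in weights.items() if term in text.split()))

def audit(template: str, identity_phrases: list[str], score_fn) -> dict[str, float]:
    """Score the same template with each identity phrase substituted in."""
    return {phrase: score_fn(template.format(phrase)) for phrase in identity_phrases}

scores = audit("I am a {}", ["gay black woman", "man"], toy_toxicity_score)
gap = max(scores.values()) - min(scores.values())
print(scores, round(gap, 2))  # a large gap flags the template for review
```

A gap this large between sentences that differ only in the identity phrase is the red flag: the audit does not prove bias by itself, but it tells reviewers exactly which inputs to examine.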

This kind of inherent bias is a problem for those of us who see promise in the use of algorithms for personalized learning. If algorithms steer users toward ever narrower learning opportunities shaped by those biases, young women might be directed away from mathematics and science content and toward “kitchen-related” fields, while young men would be steered in the opposite direction. As long as such algorithmic biases persist, it will be impossible to overcome resistance to data-driven personalization.

Ms. O’Neil is no Luddite. She sees promise in the use of technology to enhance education. But she is not enthusiastic about the “algorithmic overlords’” tendency to keep their methods secret in the name of proprietary information:

Many researchers and practitioners are working on how to assess algorithms and how to define fairness. This is great, but it inevitably runs into a bigger problem: secrecy. Algorithms are considered the legally protected “secret sauce” of the companies that build them, and hence largely immune to scrutiny. We almost never have sufficient information about them. How can we test them if we have no access in the first place?

Legislators need to intercede… and they are beginning to do so, albeit at a snail’s pace. Here’s hoping they succeed, for if they don’t, biases will persist.
