Each week, we review the week’s news, offering analysis about the most important developments in the tech industry.
Hi, I’m Jamie Condliffe. Greetings from London. Here’s a look at the week’s tech news:
I’m about to suggest something that sounds controversial: Maybe it’s good that we keep discovering biased algorithms?
Let’s rewind. A pair of articles this past week revealed that software used to make decisions on behalf of humans appears to do so with gender bias.
First, it came to light that the algorithm that calculates credit limits for Apple’s new credit card may give higher limits to men than to women. Goldman Sachs, which issues the card, said its credit decisions were “based on a customer’s creditworthiness and not on factors like gender, race, age, sexual orientation or any other basis prohibited by law.”
And my colleague Cade Metz reported that artificial intelligence services from Google and Amazon both failed to recognize the word “hers” as a pronoun, but correctly identified “his.” Other, similar algorithms “generally don’t give women enough credit” when analyzing text, he added.
This isn’t surprising. Algorithms are written by humans, who are inherently biased — and that can seep into the way they frame the analysis that underlies their code. Artificial intelligence software is trained on data that contains all kinds of human biases, which can then appear in its own inferences.
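A toy sketch can make that mechanism concrete. The corpus, words and counts below are entirely made up for illustration and have nothing to do with any company’s actual system; the point is only that a model which scores associations from skewed text will reproduce the skew.

```python
# Toy illustration of bias inherited from training data: we count how
# often gendered pronouns co-occur with the word "engineer" in a tiny,
# deliberately skewed corpus, then treat those counts as association
# scores. The "model" simply mirrors its data.
from collections import Counter

corpus = [
    "he is an engineer",
    "he works as an engineer",
    "she is a nurse",
    "he is an engineer too",
    "she is an engineer",
]

counts = Counter()
for sentence in corpus:
    words = sentence.split()
    if "engineer" in words:
        for pronoun in ("he", "she"):
            if pronoun in words:
                counts[pronoun] += 1

# Association score: share of "engineer" sentences containing each pronoun.
total = sum(counts.values())
scores = {p: counts[p] / total for p in counts}
print(scores)  # "he" dominates purely because the training text is skewed
```

Nothing in the code singles out either pronoun; the imbalance in the output comes entirely from the imbalance in the text it was given, which is the dynamic the paragraph above describes.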
“What algorithms are doing is giving you a look in the mirror,” Sandra Wachter, an associate professor in law and A.I. ethics at Oxford University, told me. “They reflect the inequalities of our society.”
That we often see inequality is troubling. If software is going to choose a credit limit, it should do so fairly. If A.I. is supposed to parse information for us, we would like it to do so objectively.
The problem is, algorithms are everywhere, making decisions on our behalf in ways that are often opaque to us. “How many times am I not seeing certain job ads? How many times do I get advertised higher prices?” Professor Wachter asked. “I often don’t know that I’m being treated unfairly.”
In that sense, the more often we unearth cases of bias, the better. Each discovery means one fewer algorithm left unchecked, and more evidence of why we must solve the problem.
Not that that’s easy, of course. Computer scientists are trying to work out how to spot and remove bias in data; others are developing ways to make algorithms better able to explain their decisions. And there are pushes to force companies to be more transparent and accountable about how they use algorithms. It’s a slog, but it is happening.
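One of the simpler auditing ideas alluded to above can be sketched in a few lines: comparing outcome rates across groups, sometimes called a demographic parity check. The function, group names and sample decisions below are hypothetical, and real audits are far more involved than this.

```python
# A minimal sketch of one bias-auditing idea: measure the gap in
# favorable-outcome rates between groups (a demographic parity check).
# All data here is invented for illustration.

def demographic_parity_gap(decisions):
    """Return (gap, rates) for a list of (group, approved) tuples,
    where approved is 1 for a favorable decision and 0 otherwise."""
    rates = {}
    for group in {g for g, _ in decisions}:
        outcomes = [a for g, a in decisions if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return max(rates.values()) - min(rates.values()), rates

# Hypothetical credit decisions: 1 = approved, 0 = denied.
sample = [("men", 1), ("men", 1), ("men", 1), ("men", 0),
          ("women", 1), ("women", 0), ("women", 0), ("women", 0)]
gap, rates = demographic_parity_gap(sample)
print(rates, gap)  # men approved at 0.75, women at 0.25 -> gap of 0.5
```

A large gap does not by itself prove unlawful discrimination, but it is the kind of signal that flags a system for the closer scrutiny regulators and researchers are pushing for.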
“It’s unrealistic to assume that we’ll ever have a neutral system,” Professor Wachter said. “But with the right systems in place, we can mitigate some of the biases.”
Banking With Big Tech
With that Apple Card article fresh in your mind, now is a good moment to think about Big Tech’s push into personal finance.
Google announced that it was teaming up with Citigroup and the Stanford Federal Credit Union to offer a “smart checking” account next year. Exactly what that will entail is unclear, but Google says it will help bank customers “benefit from useful insights and budgeting tools,” via its Pay app.
Also, Facebook unveiled a digital payment system, called Pay, that will let users make payments across its Messenger, Instagram and WhatsApp platforms. That’s alongside its separate initiative to revolutionize the world of global finance with its own cryptocurrency, Libra.
This is all unsurprising. The companies are attracted to the prospect of a new revenue stream. The question is whether they can make a go of it.
Regulators and lawmakers are already unimpressed. New York State regulators said they would investigate the potentially discriminatory algorithm used by Apple Card. Libra has come under immense criticism from all sides because people don’t trust it. And Senator Mark Warner, Democrat of Virginia, said of Google’s banking effort to CNBC: “There ought to be very strict scrutiny.”
But there’s another question here: Do people even want to use Big Tech to bank? There’s a lot of inertia in the world of personal finance, and persuading people to switch from their bank to Apple or Google or Facebook will require features with genuine utility, or higher interest rates, or something else stellar. Fear about financial data privacy is probably a concern for consumers, too. In other words, it’s a high bar.
Hunger for Health Data
News surfaced that a partnership between Google and the hospital provider Ascension could allow the data of the health care company’s patients — 50 million in total — to be uploaded to Google’s cloud-computing platform. Some of that data, including names, dates of birth, lab tests and diagnoses, was already being uploaded, without patients or doctors being notified.
Ascension said it was exploring “machine-learning applications that will have the potential to support improvements in clinical quality” through the deal. Google said it would provide “tools that Ascension could use to support improvements in clinical quality and patient safety.”
This may all be totally O.K. It’s perfectly legal for health care providers to share patients’ medical information with business partners like electronic medical record companies. Still, the Office for Civil Rights in the Department of Health and Human Services plans to seek more information about whether the arrangement complies with the law.
Nonetheless, many people found it unnerving. That’s probably because of Google’s motivations: According to The Wall Street Journal, Google hasn’t charged Ascension for the work because it hopes to develop systems based on what it learns. It would eventually sell those systems to other health care providers.
Again, there’s nothing wrong with that. But it comes off as a little tone-deaf to slurp up data as intimate as health care records without informing patients, at a time when concerns about data privacy are higher than ever.
Source: The Week in Tech: Algorithmic Bias Is Bad. Uncovering It Is Good.
By Jamie Condliffe
Techylawyer and its authors do not claim to have written this article; we acknowledge the work of the original author.