Weapons of Math Destruction Podcast

carlyshepard

00:00-11:06

CSCI 112 Podcast assignment

Transcription

The podcast recording discusses the book "Weapons of Math Destruction." They talk about various topics, including the use of algorithms in baseball statistics and how they can be beneficial without putting human characteristics into numbers. They also discuss how algorithms can disadvantage certain groups of people, such as the use of credit scores by employers and the targeting of poor areas for criminal profiling. The conversation highlights the negative impact of algorithms on disadvantaged communities and the need for more awareness and consideration of their effects on humans. They also express interest in exploring WMDs that are used for charitable purposes.

Okay, so this is our podcast recording for our book, which was Weapons of Math Destruction. I'm Carly. I'm Sam. I'm Rebecca. Okay, so we can start with what we all found most interesting from the book. There was a lot. I liked when she talked about baseball. That was probably my favorite part, because I'm from Ohio and she opens up talking about the Cleveland Indians. Even though I'm not a baseball fan, I thought this book was going to be kind of dry, and that was a good way to open it up; it made me more engaged, because I was like, oh, it's Cleveland. Nice. I thought it was a great example of what a non-harmful, or less harmful, WMD would look like. In my opinion, that's less of a WMD and more of just a large-scale algorithm that's used to streamline information and increase productivity. And I was kind of hoping she would bring it up, because I'm a huge fan of the movie Moneyball starring Jonah Hill. I don't know if you've seen it. I have not. I've not seen it. But I like Jonah Hill. I really liked it, and I didn't really understand baseball statistics until I watched the movie. My brother's really into sports too, so he was able to fill me in more on what those statistics look like. But they're super beneficial, and I think the key component, and what makes them not so harmful, is that they don't put human characteristics into numbers. They just use stats, like how many feet the ball was hit or the number of times someone strikes out, and they don't get into the humanistic qualities of the sport. And in my opinion, that is exactly where WMDs go wrong: they start to put human characteristics into numbers, and that's just never quantifiable, in my opinion.

Yeah, I think I saw that too in the intro, when she talks about those schools in Washington, D.C. that were trying to gauge how well students learn under a certain teacher. Yeah. First of all, it pushed teachers to cheat, and then that messed up the algorithm in general, obviously. Yeah. And with algorithms in general, the author talked about how a lot of the time developers can't really quantify what's going on within human action and human emotion. Absolutely. So, yeah, I think Moneyball in itself is a good way of showing an algorithm without that, and once you get into that gray area of trying to figure out what a human would do in every scenario, I think it does get a little scary. Yeah. Yeah.
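As a rough illustration of the contrast drawn above (our sketch, not anything from the book or the episode), the kind of baseball model the speakers call less harmful takes only countable on-field events as inputs; nothing about the player as a person is turned into a number. The player line and figures below are invented.

```python
# Toy baseball metrics built only from measurable events.
def batting_average(hits: int, at_bats: int) -> float:
    """Hits divided by at-bats -- a purely countable input."""
    return hits / at_bats if at_bats else 0.0

def strikeout_rate(strikeouts: int, plate_appearances: int) -> float:
    """Share of plate appearances that end in a strikeout."""
    return strikeouts / plate_appearances if plate_appearances else 0.0

# Invented season line for illustration.
player = {"hits": 142, "at_bats": 510, "strikeouts": 98, "plate_appearances": 575}
print(f"AVG: {batting_average(player['hits'], player['at_bats']):.3f}")
print(f"K%:  {strikeout_rate(player['strikeouts'], player['plate_appearances']):.1%}")
```

Nothing in those inputs tries to score effort, character, or potential, which is the line the speakers say a WMD crosses.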
Something towards the beginning, too, is she talks about how she's a mom and the things she does every day. And when you think this book is all about algorithms, you probably think, oh my gosh, that sounds complicated, I could not understand an algorithm. But she breaks it down into how she's a mom who cooks every night, and there are things she needs to do to make meals and take care of her kids, which in and of itself is kind of an algorithm. She calls it an informal model. So I think that's a nice way of introducing a reader to what an algorithm is if they don't already know, or if you're just confused, it's a good way to break it down into something relatable and more understandable. I completely agree. It also gives a little bit of insight into how much goes into decision making, because you don't just decide what's for dinner; you decide how long it's going to take, who's going to like it, and how many ingredients you need. And so for anybody who has no experience with algorithms, which is me, it's nice to think about it in the sense of a real-life scenario, like cooking dinner for your kids. Yeah. Go ahead.

Okay, so something I thought was pretty interesting, and that I was really shocked to hear her talk about, was how employers check the credit scores of employees, which ties into one of her main themes of inequality with algorithmic components. She talks about how companies will check the credit scores of employees who are up for a promotion, and if your credit score is low, then in a lot of cases you basically just get tossed aside in consideration. Which is pretty unfair when you think about it, because I don't know if that's something a company should morally have access to for its employees; I don't really see how it corresponds to being an employee, or being a good employee. And it kind of sucks for the employee, because if you can't get a job because of your credit score, then you're stuck in this loop of, okay, well, my credit score is going to get worse because I don't have a job, and I can't get a job because of my bad credit score. It's just a loop. Yeah. Sucks. Yeah. I think it's one of those things that really only benefits the employer. Yeah. That algorithm is really only for them. Yeah. I mean, I guess that's the point of a lot of these algorithms, to make things more efficient, but it really is at the cost of the people trying to find a job. Yeah.

Yeah. From what I understand of all the points in the book, WMDs really only benefit a very small group of people. And it's not that they benefit a small group of people and then behave neutrally towards everyone else. Right. They really disadvantage the group that is not benefiting from the algorithm. Like back to that case in Washington with the teachers. Mm-hmm. And this is used nationwide; the algorithm is there to evaluate teachers' performance. The districts benefit from maybe higher test scores, but students are losing out on teachers who provide more than academic resources; they provide social-emotional benefits. Yeah. Right. And I mean, teachers impact so much of a growing student's mind. Yeah. And the students are disadvantaged. The teachers and their families are disadvantaged. Mm-hmm. And maybe the districts on a general scale get more funding and benefit. Yeah. But there are so many people who are disadvantaged by that. Yeah.
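The credit-score loop described a moment ago (no job because of a low score, and a lower score because there's no job) can be made concrete with a toy simulation. This is our illustration, not a model from the book; the cutoff and step sizes are invented.

```python
# Toy simulation of the credit-score hiring loop described above.
HIRING_CUTOFF = 650   # assumed score an employer screens on (invented)
MONTHS = 6

def simulate(score: int) -> None:
    for month in range(1, MONTHS + 1):
        hired = score >= HIRING_CUTOFF
        # Without income the score keeps slipping; with a job it recovers.
        score += 15 if hired else -20
        print(f"month {month}: score={score}, hired={hired}")

simulate(640)  # starts just below the cutoff and never escapes the loop
```

Someone who starts just below the cutoff never gets hired, so the score only moves further from the threshold, which is the self-reinforcing cycle the speakers describe.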
That reminds me of a point she makes toward the very end of the book, where she talks about how some models are used to target potential criminals. A specific model she discusses is in foster care, and how algorithms are used in the field of social work. She talks about how these particular models target poor areas more than wealthy neighborhoods. Yeah. And abuse within foster care doesn't just happen in poor neighborhoods; it can also happen in wealthy neighborhoods. It can happen anywhere. Yeah. Absolutely. And these models skew what they're looking at to target more poor areas, which is not fair. Yep. Yeah. Yeah.

One of the main things I've realized about WMDs from reading this is that they perpetuate bad cycles for people who are already disadvantaged. Yeah. Mm-hmm. And then they feed all those things back. If you live in a poor neighborhood, you're likely exposed to more crime. Mm-hmm. If you're a minority, you're more likely, due to WMDs, to be stopped and frisked. You're more likely to be arrested. Mm-hmm. You're more likely to be in for longer because of where you're from. And then when you get out, you're less likely to get a job because of a higher recidivism score. Yeah. Yeah. Based on those surveys, which are also large-scale WMDs. It's just insane. They kick people when they're down, and they make people stay there generationally. Mm-hmm. And it's not something you'd think about, because algorithms tend not to be thought of as affecting humans. Yeah. They're considered to just be numbers. Calculating numbers. Facts. Yeah. Yeah. Which is not true.

And I think the problem, too, that she addressed at the end was that a lot of the time algorithms are made and then people try to shape the real world around the algorithms. Yeah. So whatever issues or discrepancies the algorithms actually have never get addressed. Yeah. Because they're just happy with whatever results they get, and if it's benefiting the company, then they're happy with that algorithm. So. Right. Do you wish she had addressed anything else, or made any points that you were hoping for? I kind of want to look at some of the WMDs that are used for, I guess, good. I mean, I know she points out some, like Moneyball and stuff like that, but I do want to know about ones that are used for, I don't know, charitable things or something like that. Yeah. They're around for a reason. Yeah. Exactly. Yeah.
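The targeting cycle described above (models that send scrutiny to poor neighborhoods, which then generates more data pointing back at the same neighborhoods) can also be sketched as a toy feedback loop. The areas and numbers here are invented for illustration and are not from the book.

```python
# Toy feedback loop: a targeting model sends scrutiny wherever the most
# incidents were recorded, and the extra scrutiny produces more recorded
# incidents there, which the model then reads as confirmation.
recorded = {"poor_neighborhood": 30, "wealthy_neighborhood": 28}

for year in range(1, 4):
    # The model targets the area with the most recorded incidents so far.
    target = max(recorded, key=recorded.get)
    # Extra patrols there mean more incidents get recorded there, even if
    # the true underlying rates in the two areas are similar.
    recorded[target] += 20
    print(f"year {year}: {recorded}")
```

Even though the two areas start out nearly identical, the recorded gap grows every year, which is the kind of self-confirming cycle the speakers are pointing at.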
