Redefining Big Data Research with Dr. Eric Stephens Cybertraps 007


Dr. Eric Stephens’s dissertation research was all about how researchers fall into traps using datasets they don’t curate themselves: they adopt the assumptions of the database design and the methodology. Stephens discusses how social justice researchers employ the doctrine of double effect to justify unintentional exploitation of their subjects, and proposes a big data research method to counter those exploitations by focusing on data created by institutions rather than users.

  • Bias and unbiased research: where researchers go to do research already reveals a bias.
  • We could also talk about questions schools should ask before adding their information to big data sets. 
  • Bag of words method
  • What is the doctrine of double effect? (Stanford Encyclopedia of Philosophy link)
  • Explain the research method for big data
  • Institutional genre analysis – study what an institution’s goals are across time.

[spp-transcript]

 

[00:00:00] Hi folks. Welcome to the Cybertraps Podcast. I’m Jethro Jones, host of the podcast Transformative Principal and author of the book SchoolX, which is all about redesigning your school for the people in front of you.

[00:00:14] Fred Lane: [00:00:14] Greetings, folks. I’m Frederick Lane, an author, attorney, and educational consultant based in New York. I’m the author of 10 books, including most recently Cybertraps for Educators 2.0, Raising Cyberethical Kids, and Cybertraps for Expecting Moms and Dads. Jethro and I are teaming up to bring timely, entertaining, and useful information

[00:00:35] to teachers, parents, and others about the risks arising from the use and misuse of digital devices.

[00:00:43] Jethro Jones: [00:00:43] In the coming weeks and months, we’ll be talking to some of the nation’s leading experts from the fields of education, parenting, cyber safety, sociology, and probably more than that. Join us as we look at what it takes to better navigate our increasingly high-tech world. Good afternoon, Fred. I am so excited to talk today.

[00:01:04] Fred Lane: [00:01:04] Likewise. What is this, number three?

[00:01:06] Jethro Jones: [00:01:06] Number three today. How cool is that?

[00:01:10] Fred Lane: [00:01:10] We’re just loading them up. This is good.

[00:01:12] Jethro Jones: [00:01:12] That’s right. So our guest today is Eric Stephens, and his dissertation research was all about how researchers fall into traps (good for this podcast) using data sets that they don’t create themselves. They adopt assumptions of the database design and the methodology. And he discusses how social justice researchers employ the doctrine of double effect to justify unintentional exploitation of their subjects.

[00:01:36] And he proposes a big data research method to counter those exploitations by focusing on data created by institutions and not users. So Eric, welcome to the Cybertraps Podcast.

[00:01:49] Eric Stephens: [00:01:49] Hello, gentlemen. Thank you so much for having me. I’m really happy to be here. Thank you.

[00:01:54] Fred Lane: [00:01:54] Yeah, it’s good to have you. Thank you.

[00:01:56] Jethro Jones: [00:01:56] Yeah, we’re excited to have you. So first, let’s just start by talking about how you got into doing a dissertation on the traps researchers fall into using data sets that they don’t create. I mean, I would’ve never in a million years thought of that for a PhD, and yet you did your dissertation on it. So tell us about how you got into that.

[00:02:17] Eric Stephens: [00:02:17] I knew a couple of things going into my doctorate program. I knew that I wanted to do big data research. I just thought the potential that was out there was really cool. It was a buzzword, a lot of fun, and I knew that I wanted to work with large data sets, specifically using corpus linguistics,

[00:02:41] so using the text itself as a data source. I also knew that I wanted to do social justice research. I wanted to do something that meant something, right? I didn’t want to research just to research something; I wanted it to matter. And I think I still do.

[00:03:06] Jethro Jones: [00:03:06] Yeah, that’s good. So tell us what the traps are that people fall into when they’re using these data sets, and how they bring their own biases into that, or the biases of the data that’s been collected, biases they may not even recognize exist.

[00:03:22] Eric Stephens: [00:03:22] Yeah, I think that

[00:03:28] when someone wants to go in and do research, often they don’t realize something. Researchers want to be as unbiased as possible when they’re doing their research. What they don’t understand is that where they go to do their research is already dictating a bias.

[00:03:50] One of the readers for my dissertation, Michael Mang, was a historian and did a lot of cool archival work with history. Historians make these claims that are too factual: this is what happened. Without realizing, well, this is what happened

[00:04:11] according to a set of artifacts that were found, located, and archived in this one particular location. So the bias already exists. The exact same thing happens with a data set. If you’re going into a dataset, you are not only adopting the inherent structures that are there, but also the biases that were built in when that table was created.

[00:04:46] Fred Lane: [00:04:46] So Eric, if you would, to inform me and anyone else who’s listening: what are some examples of the kinds of big data sets that you’re referencing? I mean, obviously I can think of something like Google Books, right? Which is going to be a data set driven by who got published, so there are biases built into that.

[00:05:05] What other examples are you working on, or thinking of, when you discuss this?

[00:05:10] Eric Stephens: [00:05:10] When I was doing my particular research, I was focusing specifically on prisons. I knew that I wanted to do research in prisons; I had watched a great episode of Last Week Tonight with John Oliver, and I knew I wanted to do social justice work, as I mentioned before. And so I picked prisons.

[00:05:30] When you’re doing research with prisons, you have to be very careful with the type of data that you’re collecting, especially with IRB approval processes. They can be lengthy, and they can be really problematic when you’re working with prison populations.

[00:05:49] And that’s why I wanted my research to help prison populations, but I didn’t want to use the prison populations as my site of study. Even though I could go through every IRB approval process to show that I was not exploiting these people, what it came down to was that my research was helping me hopefully get tenure or get more research funding.

[00:06:17] They’re not getting a slice of that pie. They’re not getting an author credit. And so I wanted to find a data source that was not reliant on user information, and I can kind of get into the philosophical thinking behind that.

[00:06:40] My background is in rhetoric and philosophy, and so I got really into this deep thinking about what happens when your identity is replicated over and over and over and over. We can go in that direction

[00:06:56] if we want to, though.

[00:06:57] Fred Lane: [00:06:57] Well, if I can interject just a little bit here. One of the books that I have written was called American Privacy, and it was really what I described as a biography of the right to privacy in the United States. One of the things I was fascinated by was the unintended consequences of the creation of a massive dataset of financial information

[00:07:22] when we invented credit cards, and when we linked credit cards to specific Social Security numbers. All of a sudden, two things that hadn’t existed in 1930 were, 50 years later, really dictating how people moved around in the world. And of course, that’s expanded now with mobile telephones narrowing down our particular digital pictures, if you will.

[00:07:48] So there are huge ethical and philosophical implications in all of this.

[00:07:54] Eric Stephens: [00:07:54] Yeah, absolutely. And that’s what I wanted to do, because who I am, genuinely, as a person is someone who wants to disrupt things. I want to find out what things are and how they work so that I can go in and tinker with them and make them into something that, in my eyes, is more ethical.

[00:08:19] I claim to be an ethics scholar; what I want to do is good ethical work. And if I approach my research with the intention of helping a group of people, but I’m using data that they themselves have created, their own personal identity replicated over and over and over,

[00:08:47] my research is already flawed ethically. For some people that’s not a big thing; for me, it was problematic, because I didn’t want to feel like I was exploiting people, but I still wanted to help. So what I ended up creating came from wanting to understand the prison system at a language level, across

[00:09:24] time and across space in the United States. Basically, I wanted to understand: if we send a person to prison, we’re sending them to a correctional facility with correctional officers, and we give them handbooks that say, hey, this is what you should be doing. What I wanted to answer was, at the language level, with the technical documents that we hand to an inmate, what are we correcting them to?

[00:09:54] To what standard are we asking them to perform at the language level? Not at a political level, when someone is making a speech, but when you write it down. Now, we have two ways to do that. You can sit down and read 350 documents ranging from three pages to 150 pages, right? Or you can train a computer to do it.

[00:10:19] And so that’s what I did. I partnered with a great friend of mine, Ben Webster, and a friend of his, Katie Blake. They are two established data scientists out of NLP Logics; they specialize in doing this kind of thing. For me it was a fun side project. It took them a total of a weekend, really, to smash out the code and do what we needed to do.

[00:10:43] But I created this corpus. It was over 350 documents; the exact numbers are kind of blanking on me. We ended up with, I think, over 400,000 unique phrases that carried a definition. So we were able to predict what a word meant

[00:11:13] by its proximity to all the other words in the document. So we created a dictionary based on the corpus itself.
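
For readers curious what building that kind of corpus might look like, here is a minimal sketch using the gensim library to read a folder of documents and surface recurring multiword phrases. The directory layout, file naming, and thresholds are illustrative assumptions on our part, not details from Stephens’s dissertation.

```python
# A toy corpus-building pass: tokenize a folder of plain-text handbooks and
# detect word pairs that co-occur far more often than chance would predict.
# Paths and hyperparameters are hypothetical, for illustration only.
from pathlib import Path

from gensim.models.phrases import Phrases, Phraser
from gensim.utils import simple_preprocess

corpus_dir = Path("handbooks")  # hypothetical directory of handbook text files
sentences = [
    simple_preprocess(line)
    for path in corpus_dir.glob("*.txt")
    for line in path.read_text(errors="ignore").splitlines()
    if line.strip()
]

# Merge strongly associated word pairs into single tokens, e.g.
# "correctional_officer" or "disciplinary_hearing".
bigrams = Phraser(Phrases(sentences, min_count=5, threshold=10.0))
phrased_sentences = [bigrams[s] for s in sentences]

unique_phrases = {tok for s in phrased_sentences for tok in s if "_" in tok}
print(f"{len(unique_phrases)} recurring phrases found across the corpus")
```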

[00:11:25] Jethro Jones: [00:11:25] Okay, hold on a second. Let me wrap my mind around this. You were talking before about user-generated data versus institution-generated data. So instead of surveying all these inmates over the course of 30 years and collecting the data of their opinions, you went to the documents that the prisons created to tell what they were doing, so that you would have institution-generated data instead of user-generated data.

[00:12:00] Am I understanding that correctly?

[00:12:02] Eric Stephens: [00:12:02] So coming into it, I wanted to fix the prison system. I wanted to make an attempt, like, here’s a crack in the prison system, let me fix it. And I wanted to do it in a way that wasn’t using data created by inmates.

[00:12:19] Jethro Jones: [00:12:19] So if I could draw a parallel to education: for example, instead of evaluating and using the data of student responses to test questions, you’re using the data of what the test questions are, to see what the institution is actually asking the kids, and creating a new dictionary based on all that.

Am I interpreting that correctly?

[00:12:46] Fred Lane: [00:12:46] Boy, I’ve got my mind racing on this, Eric. This is really amazing stuff. But first of all, an observation, just to follow up on Jethro’s analysis, or analogy, I guess, is the better word: evaluating and understanding what the school expects of the students from the wording of the tests administered over, say, eight decades is a really valuable way to understand the expectations of the society.

[00:13:25] Right. And what you’re looking at is the prison society. So you’re looking at what these institutions expect of the people within them. But it’s not necessarily going to tell us anything about whether the end result was what was sought. You can have this language, either on the tests or in the prison manuals, but in a way it’s aspirational: this is what we hope people will do.

[00:13:53] And so you would need some of that user-generated information to determine the success of the particular documents in achieving the goal they’re trying to achieve. So that’s just one observation. But the other thing that fascinates me, because I write about the impact of technology on society:

[00:14:13] what you’re doing is the kind of thing that is getting fed into AI. You could produce the ideal prison manual based on the corpus of material that you have, if you feed it into the right, you know, Deep Blue, or Deep Prison, I guess we’d call it.

[00:14:32] Eric Stephens: [00:14:32] Yeah, I think that there are definitely some really cool implications. One of the aspirational things I wanted to do with, you know, phase two of my project, which I pitched when I was on the tenure-track job circuit, was to take this corpus, this database that I had built.

[00:14:52] Because honestly, with my dissertation, with my research, I didn’t answer any questions. I didn’t really do anything like that. What I did was create something and provide an ethical reasoning for why I didn’t want to answer a question. Right?

[00:15:09] Fred Lane: [00:15:09] You created a mode of analysis, though.

[00:15:12] Eric Stephens: [00:15:12] Exactly, exactly. So now what I can do is empower other people to ask really interesting questions and let them do really cool research. That’s what I can do as a person to help other people and their research, right?

[00:15:30] Fred Lane: [00:15:30] We need ethicists to figure out how to do this.

[00:15:33] Eric Stephens: [00:15:33] Yeah. And it helps to study ethics, honestly. So here’s something that we found in this research: there is a high correlation between the word punishment and the word woman. I looked at 350 handbooks in the United States. Every state was accounted for, at federal, local, and county levels, right?

[00:16:01] At every level, there’s a strong correlation between those two words. Why? I have no idea.

[00:16:09] Fred Lane: [00:16:09] Can you elaborate a little bit on what the correlation looks like? How did that play out?

[00:16:14] Eric Stephens: [00:16:14] The method is called the bag of words, right? You put all of the words from these 350 manuals into a big giant bag, and then you shake it, and things that are similar fall together. It does that through some really cool vectoring and math stuff that my friends Ben and Katie could talk about for hours,

[00:16:38] I’m sure. I just know buzzwords about it. But essentially, if you give the algorithm a sentence that is 25 words long, it can accurately predict what the 26th word will be. It’s not going to be exact, but there’s a certain percent chance it could be this word, and a certain percent chance it could be that one.

[00:17:09] Right. So what it does is let you define the word you’re looking for by the other words it thinks it is.
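
As a rough, hedged illustration of the kind of prediction Stephens is describing, here is a sketch using gensim’s Word2Vec, whose default CBOW training learns to guess a word from its context. The toy corpus and the word pair being compared are invented stand-ins; this is not the NLP Logics code from the dissertation.

```python
# Toy demonstration of context-based word prediction and word similarity.
# The three sentences below are invented stand-ins for the 350+ handbooks.
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess

docs = [
    "inmates who violate facility rules are subject to punishment",
    "a woman housed in this facility is subject to disciplinary action",
    "visitors must follow all facility rules at all times",
] * 200  # repeat so the tiny model has enough training examples

sentences = [simple_preprocess(d) for d in docs]
model = Word2Vec(sentences, vector_size=50, window=5, min_count=1, seed=1)

# Rank likely words for a slot given surrounding context words (CBOW with
# negative sampling, which is gensim's default training setup).
for word, prob in model.predict_output_word(["inmates", "violate", "rules"], topn=5):
    print(f"{word}\t{prob:.4f}")

# Cosine similarity between two words' vectors: a crude stand-in for the
# punishment/woman association described in the episode.
print(model.wv.similarity("punishment", "woman"))
```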

[00:17:16] Fred Lane: [00:17:16] Well, I don’t know if you use Gmail, but Gmail uses the corpus of my emails to do predictive text when I’m writing an email.

[00:17:25] Eric Stephens: [00:17:25] Oh, absolutely.

[00:17:27] Fred Lane: [00:17:27] It’s getting a little scary, actually.

[00:17:29] Eric Stephens: [00:17:29] Yeah. And that’s kind of going back to this idea that big data in and of itself is not an evil thing. It’s not necessarily a trap. It’s amoral; it depends on how the person is using it. Which is why I think that it is important

[00:17:52] for everybody, especially if you’re working with data, to study ethics.

[00:17:59] Fred Lane: [00:17:59] There’s the Latin phrase scientia potentia est, which is “knowledge is power.” And one of the concerns that I have about big data as a concept, as a construct, is that the more data we bring together, the more that is known and the more that can be predicted about our behaviors. I mean, Cambridge Analytica and its use of Facebook data

[00:18:24] is a perfect example of how this stuff can be misused. So, you know, it’s a power source, like big oil or big sunshine or whatever else you want to talk about. It can do good

[00:18:34] things or it can be really destructive. 

[00:18:37] Jethro Jones: [00:18:37] And this is where, when you have this idea or this big thing going on, it takes on a life of its own. You’re talking about the correlation between the word punishment and the word woman, and when that correlation happens, it starts to feed into other areas in a way where it can get out of control, and you can’t control what it’s doing at that point.

[00:19:06] And I think that, for me, is one of my concerns around big data: eventually things start happening that nobody intended, that nobody expected. And it goes back to reaffirm why, if you’re doing big data, you should be an ethicist as well.

[00:19:25] Eric Stephens: [00:19:25] I think when it comes down to the debate about data, I feel like it’s at the point now where it’s no longer about whether this should be happening, because it has already happened. So that’s a fun thing to talk about.

[00:19:52] But then we have to ask, what’s the next step? If we just accept that that is the world in which we live, a world that is filled with data, then what is the next thing that we do? For me, what that turned into for that research project was: here’s a tool that someone else can use to do good work, right? It’s basically building an intentional tool that is designed not to exploit people. What I coined, institutional genre analysis, is going into an institution and studying what it is and what its goals are across time, by the documents they themselves produce, so that we can understand who these giant corporations are at the core level. We can figure that out, and we can analyze it and we can study it.
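
As a concrete, if simplified, illustration of that idea: one way to study an institution’s goals across time through its own documents is to track how often goal-laden terms appear in each era’s publications. The sketch below is an assumption about what such an analysis could look like, not Stephens’s actual method; the directory, filename convention, and term list are hypothetical.

```python
# A toy version of institutional genre analysis: track how often selected
# goal-laden terms appear in an institution's own documents, grouped by
# year, to see how its stated goals shift over time. The directory, the
# "1995_handbook.txt" filename convention, and the term list are hypothetical.
from collections import Counter, defaultdict
from pathlib import Path
import re

TERMS = {"rehabilitation", "punishment", "safety", "discipline"}

counts_by_year = defaultdict(Counter)
for path in Path("institution_docs").glob("*.txt"):
    year = path.name[:4]  # assumes filenames start with a four-digit year
    words = re.findall(r"[a-z]+", path.read_text(errors="ignore").lower())
    counts_by_year[year].update(w for w in words if w in TERMS)

# Report each term's share of the goal vocabulary per year.
for year in sorted(counts_by_year):
    total = sum(counts_by_year[year].values()) or 1
    shares = ", ".join(
        f"{t}: {counts_by_year[year][t] / total:.0%}" for t in sorted(TERMS)
    )
    print(year, shares)
```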

[00:21:04] Fred Lane: [00:21:04] It’s a really fascinating way to approach this. Let me ask you this, though, because I’m trying to get my head around all of these amazing concepts that you’re introducing us to; it’s really terrific. In terms of this genre analysis for institutions, doesn’t that imply kind of a unitary approach to institutions? I mean, it’s almost as if you’re imbuing them with a personality, and of course

[00:21:33] institutions and organizations are made up of myriad individuals. Is there a groupthink, then, that emerges? Is that an implication of what you’re talking about?

[00:21:44] Eric Stephens: [00:21:44] To think that a company doesn’t have a personality is to underestimate how much money they spend on branding. I mean, I just came off a whole bunch of job market stuff, right?

[00:22:03] You can look at a job ad and just kind of get a feel for the tone of the company, right? You can train a computer to make that feeling judgment hundreds of thousands of times, and it can get pretty good at that kind of thing. I mean, this is the same algorithm,

[00:22:22] the same algorithm branch, I would call it, that Rotten Tomatoes uses to make all of their predictions about movies. If a movie is at 76%, there’s not someone going through and reading every review out there; they’re using a sentiment analysis algorithm to see whether each one is a positive review or a negative review.

[00:22:41] You can do that with a company’s core mission statement. That’s something that was produced by committee, by a company, and they are trying to imbue a certain personality. What this does is take a look at everything. Look at one of my other readers for my dissertation, Austin Herzog: he made an incredible analysis tool.

[00:23:09] He took every speech that every senator made over a certain period of time and ran his learning algorithms on it. Then he took how the person voted, and he did an analysis and said, this is how conservative or liberal this person said they were, and this is how they actually voted.

[00:23:32] And you could go in, engage with the data, look at your senator, and say, oh, this person is just kind of BSing here; they’re saying all of this. And he’s doing that through a language analysis.
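
To make the sentiment-scoring idea concrete, here is a hedged sketch using NLTK’s off-the-shelf VADER analyzer. This is just one common approach; the episode does not say which tools Rotten Tomatoes or Herzog actually used, and the sample texts are invented.

```python
# A minimal sentiment-scoring sketch with NLTK's VADER lexicon: label short
# texts (reviews, mission statements) as positive or negative. The sample
# texts are invented for illustration.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

texts = [
    "A thrilling, beautifully acted film that earns every minute.",
    "Dull, overlong, and a waste of a talented cast.",
    "Our mission is to empower every community we serve.",
]
for text in texts:
    compound = sia.polarity_scores(text)["compound"]  # -1 (neg) .. +1 (pos)
    label = "positive" if compound >= 0 else "negative"
    print(f"{label:8} {compound:+.3f}  {text}")
```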

[00:23:43] Fred Lane: [00:23:43] I’m not sure that’s groundbreaking analysis for politicians.

[00:23:48] Eric Stephens: [00:23:48] But here’s what it is, right? This is why I wanted to do what I wanted to do: we understand that that is an underlying assumption we can make, but it’s always the thing you can just never quite point to. That’s what institutional genre analysis does.

[00:24:09] It allows you to give empirical evidence to anecdotal gut feelings. Everybody knows that there’s institutional racism. How can you prove it? Institutional genre analysis.

[00:24:31] Jethro Jones: [00:24:31] So what’s really cool here, Eric, is what I was thinking as we were discussing this: I’ve had these gut feelings about different companies, how much they value me, or about different schools, whether or not they value kids or teachers or whatever the case may be.

[00:24:50] And I’ve seen how that has played out and been proven true. What I like that you said there at the end is that this shows there is empirical evidence that that gut feeling checks out. What I think is so amazing about that is that we as humans who aren’t doing PhD research on this particular topic should probably listen to those gut feelings

[00:25:20] and make decisions based on some of those gut feelings, because they probably are pretty accurate, and you’re showing empirical evidence that they in fact are. So how does this affect us in a day-to-day way as

[00:25:34] normal human beings?

[00:25:35] Eric Stephens: [00:25:35] Here’s what I did, right, here’s how I internalized it, and this is what I taught my students: any document that you create, understand that it has its own personality. Once you send it, it becomes its own agent in a network of living and non-living things, and it has influence, just like you do as a person.

[00:26:00] So understand that that’s how things function, right? And that you are engaging with all of those data points at a micro level. What I did with my teaching, when I was at my previous university, where I was laid off from due to COVID, was this: go into any first-year classroom and look at the syllabus, look at the learning outcomes.

[00:26:27] There are probably five or six of them, because that’s the genre. Look at how far down the word writing actually appears. It’s not very high up.

[00:26:40] Fred Lane: [00:26:40] I find this particularly painful as an author, and Jethro does as well; we’ve both had a chance to put some books out there. I’m struck by your comment about the fact that documents, things we write, have lives of their own. I mean, we’re a male panel here: putting a book out in the world is as close to children as we’ll ever have, and you don’t know what that

[00:27:05] book will do. You don’t know how people will react to it, what its implications will be. It’s fascinating stuff, but now you’ve got me thinking about what would happen if someone ran this analysis on the corpus of the books I’ve written; that would be both a really interesting and a scary analysis to have done on you.

[00:27:26] Eric Stephens: [00:27:26] Yeah. Because what it does, and this is essentially what I was trained to do as an English major and as a rhetorician, is close analysis of documents, of objects, of a particular artifact. What I wanted to do was perform that same level of analysis on hundreds of documents at the same time,

[00:27:53] and to understand those relationships. For example, I was able to understand that the idea of discipline is not very prevalent in northeastern local prisons. It’s not a thing, and that makes sense, because those are usually holding cells, right? People are just there for transport, and the handbooks are three-page documents. You get into

[00:28:21] Texas state penitentiaries, and it’s a very different story. Their documents are, on average, about 85 pages long. So there are some really cool things that you can understand about society. What I wanted to do was really just make change happen, and do it in a way

[00:28:44] where I wasn’t using the user. I wanted to critique the institution with what the institution produced.

[00:28:52] Fred Lane: [00:28:52] Cool stuff. So then the question becomes: what is the mechanism for getting the institutions to be self-reflective enough to look at this research?

[00:29:09] Eric Stephens: [00:29:09] Well, I think the first hurdle would be to get industry to listen to academia.

[00:29:15] Fred Lane: [00:29:15] Ouch, but yes.

[00:29:19] Eric Stephens: [00:29:19] No, but yeah, I think what it would involve is actually kind of what’s happening already. You can kind of feel the trend building up around the idea of data literacy: the ability to engage with data and understand that data has a story, right?

[00:29:44] And once you understand that, that’s what data is: a tool that you can use to tell a story. I mean, we’ve been telling stories since we’ve been

[00:29:55] around. 

[00:29:56] Fred Lane: [00:29:56] Well, right. And I think you’re really latching onto something, because I would argue that businesses and institutions have been using data to tell stories for quite a long time, in ways often detrimental to society. What’s so encouraging about the work that you’re doing is that it is an opportunity to tell much more socially positive, ethically driven stories, which is great.

[00:30:25] Eric Stephens: [00:30:25] Yeah, thank you. And that’s what I wanted to do. I understood, going into the social justice research realm, that I had to come to terms, I guess, for lack of a better word, with my identity, with who I was as a person, who I am as a person. And I am a person who really embodies every privilege that is out there.

[00:30:46] And you know, we didn’t really need another voice like mine out there advocating for this thing that I had arbitrarily chosen because I watched John Oliver. So I wanted to build a tool that other people who are invested in this thing can use. And I think that’s what it’s going to take to answer that question about what industries can and should be doing.

[00:31:15] As much as we need data literacy, we need ethical literacy. That’s something I can shout from the rooftops and work into my sphere of circles, or however that phrase goes. But yeah, I think it’s an ethical learning.

[00:31:37] Jethro Jones: [00:31:37] Yeah. Well, one of the things that I’m frustrated about is the idea that we should be ashamed of our privilege. Sometimes people say that outright, but I think we need to recognize what our privilege is and use that privilege to lift up and help other people who don’t have that privilege.

[00:31:56] To me, that’s what having privilege is: being able to use it to do something worthwhile and meaningful. And that’s exactly what you’re talking about here, and I really appreciate that, because it’s easy to think, well, I shouldn’t do this work because I’m privileged; somebody who’s not privileged should be in here doing it.

[00:32:16] But the reality is that it’s because I’m privileged. You know, in our last episode we interviewed Charles Logan, and he talked about how, at higher education institutions, the people with privilege and power are the very people who need to be refusing the use of educational technology that is harmful to students or faculty, how they’re really the only ones who have the ability

[00:32:42] to do that, because of their privilege. So I appreciate what you’re talking about here. And going back to Fred’s question of how we get institutions to change on this: it starts with them being aware enough, first of all, as you mentioned, Eric, to recognize that their materials have a personality of their own,

[00:33:05] and that it is infused through the people who are creating them. Just recognizing that, first and foremost. As an English teacher, we called that voice, but there’s more to it than that. It can be damaging and hurtful to others, but it certainly doesn’t have to be. If we’re paying attention, we can make

[00:33:27] things better for people. Let me ask you another question then, Eric. You have a way to deal with big data as a researcher, but how should we as normal, everyday humans be interacting with big data, and

[00:33:45] what should our approach be?

[00:33:46] Eric Stephens: [00:33:46] This is something that I always love to talk about with my students as a writing teacher. When you understand that we live in a world of consumerism, you have a couple of options, right? You can just continue to consume, or you can be critical while you consume.

[00:34:07] Or you can just not consume anything and be better about everything, but, you know, you’ve got to eat. So I think what you should do is understand that you are always a target audience. You’re always being manipulated by something. One of the

[00:34:35] cool things that I love about rhetoric is understanding that if a document has its own life, then other things do too. The way that a university designs and places a building has intent, has a reason; they’re trying to direct you to do something. So understand that in everything you

[00:34:59] are doing, someone is trying to manipulate you to do something.

[00:35:05] And be critical of that.

[00:35:06] Fred Lane: [00:35:06] Right. And that’s exactly what I was going to say: so many of our podcasts, I think, Jethro, are organizing themselves around this idea of critical thinking, which is one of the fundamental skills that we need to elevate in our school system, K through graduate school.

[00:35:27] There’s so much need for people to be able to do that, whether you’re talking about politics or toaster ovens or television shows. You’re absolutely right: the goal is to tap into your emotions, to manipulate your actions in some way. And if you don’t think through why that’s being done and what the implications are, then, you know, what’s the term? Sheeple. That’s where we’re at.

[00:35:56] And I hope we can encourage more critical thinking at all levels of schooling.

[00:36:03] Eric Stephens: [00:36:03] And here’s the cool thing to think about with critical thinking: when it comes down to it, what is the actionable thing you do when you critically think about something? At its core, it’s a question. You’re asking a question. So to answer that previous question about what we should be doing at the small scale when we’re engaging with data: just keep asking questions. Have fun with it, out of curiosity. Hmm,

[00:36:34] I wonder, why does the Walmart have this aisle display right here? Somebody made a decision to put it there. Why? One of the best moments that I had as a teacher was when a student came up to me after they did a rhetorical analysis of a Taco Bell packet.

[00:36:58] And they said, look, here’s this thing that I understand now about what Taco Bell is trying to do with their branding campaign. And I was like, yes! Be critical, be skeptical about everything, and ask questions about everything. That’s what I would do.

[00:37:15] No matter what you’re doing, ask questions about why you’re doing it.

[00:37:20] Fred Lane: [00:37:20] A shout-out to the old magazine The Skeptic, which is just a wonderful publication. I actually don’t know if it’s still in print, but it’s a fantastic source of questioning about different things. One thing I think is inherently challenging, though, about what you’re talking about: do you remember the character Pig-Pen from Charlie Brown and Snoopy?

[00:37:43] I mean, the guy who walks around with a dust cloud all the time. The problem is that we do that with data now. With mobile devices, we have a perpetual cloud of data that we kind of emit. And I think part of the problem that we’re grappling with as a society is that a lot of times we don’t even realize we’re interacting with data, because it’s so infused into the ecosystem around us.

[00:38:09] And that’s a level of questioning I think it’s hard to get any of us to reach effectively.

[00:38:17] Eric Stephens: [00:38:17] Yeah, I think so. But I think it starts with just trying to grow a natural curiosity about things. Because the idea of data manipulating our lives is not anything new, right?

[00:38:37] That’s been going around for a long time. What is powerful about it is what we talked about before: when you start pairing that data with other data sets, that’s when you can start really making interesting assumptions and predictions about people and behavior and things like that.

[00:38:56] It comes down to actively having that questioning attitude, no matter what you’re doing. And I think that you’re absolutely right that there needs to be a little bit more of a heightened level of it when it comes to social media.

[00:39:13] I mean, people are still doing those quizzes about who your personality is, and I’m just like, have we not learned anything from 2016? I mean, there’s an exhibit now in the Spy Museum in DC.

[00:39:25] Fred Lane: [00:39:25] Mom, you don’t need to know which puppy you would buy.

[00:39:30] Eric Stephens: [00:39:30] Well, yeah. I think it’s interesting to think about how things should have gone down, or whatever it is. But what it comes down to is what’s next. After we make that critique, what’s the next thing that we do? We can do it at a larger level,

[00:39:55] and that’s what I chose to do with my research. Or we can do it at an internal level, which is trying to question everything that I

[00:40:01] do when I engage with

[00:40:03] a website.

[00:40:05] Jethro Jones: [00:40:05] You know, it’s interesting: I have kids who are between nine and 14 right now, and they ask to play certain video games or games on devices or whatnot. It’s really interesting to hear the things that they ask, the things that they are interested in, and the things that they are picking up on, which we didn’t really have when the three of us were kids, because we didn’t have all these devices and ways to get our attention so much.

[00:40:38] And it’s just been interesting to see how my kids have navigated that. I don’t have a perfect story that illustrates, here’s what you should do, or whatnot. But what I have noticed is that they have gone through phases where they’ve experienced things in a different way, and it made them pay attention, and they realized, wait a minute, what happened there?

[00:41:01] Actually, I do have an example; I lied. There was this little game they were playing, with ads between the levels, right? One of the ads showed a person playing the game, and the person kept dying. And my daughter said, oh, this is how I would solve that. I bet

[00:41:25] I could do better than that person. It was a beautiful illustration of how the game company created this ad showing the person making a mistake, so that you would say, I could do better, I should go download that app. It totally worked on my daughter. She was totally into it and said, we’ve got to get that,

[00:41:44] and I get to show that I can do better than that. Just completely fascinating. And I said, why do you think you can do better? And she said, well, because they should have gone this way or that way. And I said, so the ad that’s trying to get you to download the game, did it work?

[00:42:00] And she said, oh yeah, it did. And then it was just like, never mind, I don’t want it. It was so fascinating to see.

[00:42:07] Eric Stephens: [00:42:07] Yeah. Once you understand to what extent you’re being manipulated, it kind of changes how you see other things, and it’s like, oh, I don’t actually want to do that.

[00:42:22] Jethro Jones: [00:42:22] Eric, this has been a fantastic conversation. I’m so interested in the work that you’re doing, and thank you for sharing it here with us on the Cybertraps Podcast. It’s just been a real pleasure having you here.

[00:42:35] Eric Stephens: [00:42:35] Yeah. Thank you so much for having me. Thank you. I

[00:42:37] appreciate it.


[00:42:37] Fred Lane: [00:42:37] That wraps up this episode of the Cybertraps Podcast. In the coming weeks, we will continue our coverage of emerging trends in a variety of areas, including digital misconduct, cyber safety, cybersecurity (which is not in the news at all), privacy, and the challenges of high-tech parenting. Along the way,

[00:42:55] we’ll talk to our growing collection of interesting experts who are helping us to understand the risks and the

[00:43:00] rewards of digital technology.

[00:43:02] Jethro Jones: [00:43:02] You can find the Cybertraps Podcast on all your favorite podcast apps, and we hope that you’ll share the show with your friends and colleagues and reach out to us if you have questions or topic suggestions. If you’d like to follow us on Twitter, I’m @jethrojones and Fred is @cybertraps, and you can also find us on Facebook at Transformative Principal and Cybertraps, and all over the internet.

[00:43:22] If you enjoyed this podcast (and if you’re still listening, you probably did), please leave us a rating and review and share it with your friends!

[/spp-transcript]
