Surveillance cameras and other student monitoring tools are becoming more and more common in schools today. Laptops are distributed to students with software installed on them for their protection, but it is important to know what is being done with the data collected outside of classroom use, and whether your student's privacy, or your own, is being invaded.
Today’s guest is Jason Kelley. Jason is the Associate Director of Digital Strategy at the Electronic Frontier Foundation, focusing on how privacy and surveillance impact digital liberties. Before joining EFF, Jason managed marketing strategy and content for a software company that helps non-programmers learn to code, and advertising and marketing analytics for a student loan startup.
“We’re hoping that the pandemic has made it more clear to people where technology fails us, especially in the educational realm.” - Jason Kelley
Show Notes:
- [0:53] – Jason describes his current role at Electronic Frontier Foundation.
- [2:32] – Big tech companies that offer devices to schools collect data from those devices.
- [4:17] – Physical surveillance has increased due to the continuous problem of school shootings.
- [6:01] – Surveillance cameras can be accessed directly by local police. Jason explains how this can be controversial.
- [8:34] – Jason and Chris discuss the reason for using school-issued devices only for education purposes.
- [9:53] – Surveillance cameras do have blind spots. Facial recognition also has some issues.
- [11:03] – When devices are provided, parents, young people, and even administrators don’t always know the capabilities.
- [12:22] – Jason shares an example of one of the pitfalls of student monitoring apps that are on school issued devices.
- [14:07] – Schools can take screen captures from issued devices which isn’t done out of malice but does raise questions about privacy.
- [15:12] – We have to choose which is more important: safety or privacy.
- [16:37] – Students and parents need to know that school issued devices have features that will impact privacy.
- [17:32] – Jason describes some of the differences between the types of alerts school administrators receive.
- [19:12] – Sometimes software blocks things that are safe and doesn’t block things that could potentially be inappropriate.
- [20:50] – Teachers cannot have their eyes on every student’s computer at all times and often rely on software to help.
- [22:04] – Teachers shouldn’t be expected to know how surveillance software works.
- [23:01] – Jason describes a recent problem at Dartmouth with Canvas logs.
- [26:27] – This issue at Dartmouth was very serious and could have impacted the students’ careers drastically.
- [28:21] – There is an epidemic of misunderstandings of technology.
- [29:24] – EFF offers guides for students on what to do and expect with school-issued devices.
- [30:42] – There have been a lot of successful petitions in recent years about data tracking in universities. Parents have some leverage here as well.
- [33:00] – Sometimes, there’s not anything you can do about student surveillance.
- [34:20] – The Covid-19 pandemic made things very challenging as students needed access to education remotely very quickly.
- [36:50] – Jason describes some of the features of remote proctoring programs.
- [38:33] – Remote proctoring vastly impacted thousands of students who took the bar exam.
- [40:36] – EFF has been pushing back on proctoring and Jason explains a recent win.
- [42:18] – Jason is hopeful that the pandemic has made it more clear where technology fails us.
Thanks for joining us on Easy Prey. Be sure to subscribe to our podcast on iTunes and leave a nice review.
Links and Resources:
- Podcast Web Page
- Facebook Page
- whatismyipaddress.com
- Easy Prey on Instagram
- Easy Prey on Twitter
- Easy Prey on LinkedIn
- Easy Prey on YouTube
- Easy Prey on Pinterest
- Electronic Frontier Foundation Website
- Jason Kelley on LinkedIn
- Jason Kelley on Twitter
- EFF on Twitter
- EFF on Facebook
- EFF on Instagram
- EFF on YouTube
Transcript:
Can you give myself and the audience a little bit of background on yourself and your involvement in privacy and surveillance?
Sure. I am an Associate Director of Digital Strategy at the Electronic Frontier Foundation, which is a digital rights organization that focuses in many different areas. One of them is protecting privacy and fighting surveillance. As the Associate Director of Digital Strategy, I do a lot of different things. My primary goal is to make sure we get the message out to people. I also focus on a couple of specific issue areas, mostly privacy and surveillance, but in particular, student privacy and the ways that surveillance has crept into the school setting and into the educational world.
I'm sure we'll come back to this, but specifically, this pandemic, lots has changed in the last year and a half or 18 months, depending on when this goes live.
Spoiler, yeah. It didn't get better.
No surprise there. Let's talk about some of the earlier, older privacy and surveillance stuff, then we'll finish out with stuff that has gotten worse during the pandemic. What do you see as some of the primary ways that privacy is being eroded for students? Whether it's elementary, high school, or college—we’ll lump all those in together.
The surveillance apparatus is coming at everybody from all angles. There are a lot of different things that kids, teenagers, and parents, frankly, need to be aware of around surveillance in schools. It was a few years ago, or maybe a decade ago, I think, that it started becoming clear that big tech companies were offering devices. There were lots of programs from places like Google where they offered devices to schools. The school would then give those devices out to students.
As with many things tech, it was not 100% benevolent, unfortunately. Google required students to make a Google account, and then it collected certain amounts of their data. And this wasn't just Google—that's just an easy name to mention. Google made some changes after a lot of pushback, somewhere around 2017, I believe. They stopped collecting data from young people in the ways that they had been.
That was a really big focus for EFF for a while. Essentially, making sure that students who were receiving these free devices through schools weren’t also then required to give up private information to big tech companies.
That was a real fight in the last decade. I wouldn't exactly say we've won that fight, but it's moved to different places in terms of device surveillance. Of course, as everyone knows, the last 20 years have unfortunately been one school shooting tragedy after another. That has also caused a lot of schools to focus heavily on surveillance in more traditional ways. Cameras, for example.
I graduated high school in 2001. There weren't any cameras in my school. It's hard for me to imagine that there are now, but there are. I'm sure if I went back to my school—and thankfully I haven't had to do that—there would be cameras there. It's one thing to talk about the need for something like a camera in a hallway to ensure safety and to be able to look back over an event that occurs there. Over the last 20 years, you get more and more of these cameras with live feeds to administrators throughout schools. But then, technology over the last decade has made it easier and easier for police to essentially connect straight to those cameras.
Ten years ago, people might not have said, “Oh, yeah, let's go ahead and install cameras in our schools with a feed that goes straight to the police.” But once the cameras get installed, and then there's an incident, the school board passes a new agreement with the police, and suddenly police have a live feed of the schools. Even the traditional surveillance—traditional in the sense that it's not super surprising and a person might not normally be opposed to it—has in the last few decades gotten a little bit more insidious.
There are lots of arguments a person can make about why the police should have direct access to a camera in the school, but there are certainly arguments you could make for the reverse as well. Safety is important, but there's also a real question of what happens when cameras that only record certain things in hallways are monitored by police who already have a biased interpretation of events. You go down that road and you see how this feeds our already tilted justice system into an even more problematic area. Now we're doing it in schools.
That's always going to be the balance in the fight over surveillance, privacy, and safety. They're not necessarily mutually exclusive, but there is definitely a tension between security and safety on one hand, and privacy and surveillance on the other. It's good that we talk about it, that we figure out what those boundaries are, and whether it's decided locally. Probably the more local the better.
The more local the better, yeah. Ideally you don't want someone at the state level deciding that all schools in the state should have cameras that feed to the police. On the other hand, there are some pretty strong opinions at the local level to that effect as well. That ends up happening regardless, unfortunately.
To me, this kind of technology seems almost more invasive. At least with a camera, there's the thought that it can only see where it can see. But when you have technology and kids are bringing it home, it's in their room. What should parents do? Tell your kids that if you get a tablet or a laptop from school, you only use it for your school stuff?
I think that's probably a good idea, unfortunately. You mentioned that cameras have both a benefit and a failure point. The benefit is that they don't move around—unless it's something like a drone, which some schools and universities have—which makes them a little bit easier to evade but also safer for privacy. One knows where the camera is and what it's recording.
We've all seen a movie or a TV show—maybe this happened to us—where a bully picks on a kid and the kid finally has had enough. He chases him down and punches him. The camera might catch that and nothing else, because that's where the camera is. Unfortunately, the surveillance technology that schools use has changed and shifted, and those blind spots continue to exist in different ways. Because of how technology tends to work, those blind spots tend to exacerbate real problems that already exist within society.
If you have facial recognition that you use to enter the school, it's more likely to fail with Black and brown students. Again, this isn't done with malice; it's not intended to be nefarious. But that's one of many, many examples of technology's blind spots—blind spots that once upon a time most of us could understand and imagine, like, “OK, a camera can't see everything.”
People, as you say, don't really know what those blind spots are. They don't know the extent of the surveillance either. They don't know if they should put the tablet in a drawer. If the school emails and says, “Your kid was doing XYZ. We caught it on [random app name] that we use,” how does the parent interrogate the child? How does the child defend him or herself? It's definitely gotten worse over the last decade, as more and more of these tools have expanded their capabilities without a real educational component for parents, children, and young people about what they do.
I suppose in some sense, when it's technology provided by a third party, the school administrators may or may not even know what the technology does. They're thinking, “Hey, great, my kids who don't have computers finally have a computer. Here, use this.” They don't know that it gives this other entity the capability to record audio and video. They're just ecstatic: “Oh, finally, the underprivileged kid in my class can get online and have access to homework and research. This is awesome.” But…
There’s a downside.
There’s a downside to that.
Yeah, and we've seen that happen. I won't be able to come up with the exact example, but one of the student monitoring tools—there's probably an even nicer euphemism for them than that—is often tied to the device, not to the user. You get a laptop from school. It's got a program on it, let's say GoGuardian, which is one of the many tools that schools use. GoGuardian is a pretty well-known one. It's on about 18 million student laptops or devices. You take the laptop home, and you start using it for personal stuff as a student.
You might not know that the administrator can set up scheduled times to do screen captures, and they can run for eight hours at a time. I'm not saying that schools are necessarily doing this with malice. This certainly isn't a widespread “teachers are looking at student laptop screens at home” issue. But privacy invasions can happen, and have happened, when students don't know that a teacher or administrator can look at what they're doing on their laptop and see private information. Parents don't know either.
If you're a parent and you use your child's laptop, you probably have no idea what information is both going to school and, as you’ve said, going to this third-party company that's collecting this information, supposedly de-identifying the data and aggregating it before they do anything with it. But as we know, that often leaves gaps where data can be still tied to individual users. Regardless, what's the benefit from the school's perspective of knowing what information a student is searching at 8:00 PM on the laptop?
Obviously, there are instances one can think of where that can be helpful, but to subject tens of millions of students to that sort of privacy invasion every day on the off-chance that someone searches “how to build a bomb” and they're serious—I guarantee you most of the time, they're not—that's a real question of the balance that we have to make as a society and, as you said, to choose privacy over safety or safety over privacy. The reality is that unfortunately, many of these tools—I could talk about proctoring as an example—don't even actually do the thing they're supposed to do very well. It's a band-aid on a problem. I'm trying to think of a good analogy for a band-aid that doesn't work well. It's a bad band-aid.
That’s a real question of the balance we have to make as a society: to choose privacy over safety or safety over privacy. - Jason Kelley
You try to put a band-aid on a broken bone. It really does not solve the problem.
Exactly. Yes, precisely. It's not the solution. Thank you for realizing that my problem in my analogy was the wound and not the bandage.
It's almost like—whether your child is in college, high school, or any educational institution, and honestly, you could probably think about an employer the same way—if someone else is providing you technology, it's best to look at it from the perspective of: this tech doesn't belong to me. Everything I do on it, or anything I do near it, has the potential to be recorded. Almost assume it's been compromised and it's feeding data to the bad guy. Not to say that it is doing that, but that's the mindset you need to have around the tech.
From a threat-modeling perspective, if you're a student, it's probably smart to start to try to figure out what exactly is installed. But again, it's difficult to know. The school probably had parents sign an agreement. That agreement probably says what the software is on the device. It's really unclear what the capabilities are and which are turned on. They're often turned on and off at the administrative level by the school.
The teacher might not even know what GoGuardian is capable of doing. As an example, there are three different types of GoGuardian tools—GoGuardian Admin, GoGuardian Teacher, and GoGuardian Beacon. These are all things that live more or less on a student device. GoGuardian Admin lets administrators see certain information about students' search history, screens, and apps that they've used. GoGuardian Teacher does something different but overlaps with that. GoGuardian Beacon lets the school essentially get alerted when students visit certain websites or search for certain things as an attempt at stopping mental health crises or violence.
I'm extremely online, so forgive me for mentioning this dumb Twitter thing. There's an image that goes around the internet every six months of one of these apps popping up a notification to a parent or an administrator—a warning that your student or your child is searching for dangerous information—and the dangerous search in question is how to teach lobsters to read. So there's a question about the accuracy of these tools as well. Not only are they invading the privacy of students, and potentially even of parents, they're also not very good at what they do.
On a similar note, the apps that block content for students often block a lot of things like LGBTQ content in the name of blocking sexually explicit material, while—because the internet is vast and ever-changing—letting through neo-Nazi content. If you can't block it all, the best you can hope to do is stem the tide. These tools simply can't solve every problem that society has, but they're trying to.
I think I've had this conversation with other guests. We like the thought that technology can solve all of our problems. We're social creatures. Technology is not going to solve our social issues.
Exactly.
It can help when properly used, but it's not the solution to our social problems. If your kid is going online and looking at pornography, that's potentially a social issue, not a technology issue. That's a conversation that needs to happen. Blocking that particular site at that particular moment on that particular day doesn't address the why.
Exactly, and when it comes to the school setting, it's often even more complicated. We have teachers with 60 students who are trying their best to use technology to help them do their very difficult job. Obviously, parents are busy, too—much credit and kudos to parents as well—but there's a part of me that understands why a teacher would say, “I can't watch every student search for content and figure out what they're looking at. I can't watch them all take a test. Now that we're in a pandemic, I'm going to try this tool that everyone else I know is using in every other school.”
You don't know how accurate it is. You don't know how invasive it is. It's an understandable reach that you would use such a thing without really being able to do the deep dive into what happens when you do.
Should we expect teachers to be well-versed in everything involving programming and data gathering?
No, of course not.
That’s not their job. Their job is to teach kids, not to understand the depths of how the interconnectedness of devices works.
Exactly. To give you an example of a thing that happened—a public relations fiasco a few months back at Dartmouth. The point I want to get at here is: kudos to teachers, less kudos to the companies who are selling these tools and aren't necessarily explaining what they do, or aren't explaining to teachers how to use them in certain ways and not in others.
EFF has an intake desk, so we get legal help requests. Sometimes those end up in court. Usually they don't, because a strongly worded letter, an activism campaign, or something like that usually solves the problem a little more easily. In February, we got an intake request from a person who is a technologist. This technologist was the partner of a woman at Dartmouth Medical School.
What this technologist sent us was basically a help beacon, a bat signal that said, “All the students at Dartmouth Medical School are having the logs of their e-learning platform, called Canvas, searched back a year.” The university was trying to compare your remote exam-taking—let's say I took a 9:00 AM exam on a Tuesday—with the logs in Canvas, which is the platform that holds course material, to see if you were accessing that material while you were taking an exam.
Were you treating it like an open-book test when it was supposed to be a closed-book test while remote?
Exactly.
OK, I got it.
While remote, precisely. Dartmouth didn't have a proctoring tool at that time to record students taking the exam with a webcam or what have you. They learned that one person might have been cheating and were told by their legal advisors that they needed to apply the same methodology to every student. They went back a year and looked at log data. The log data they were looking at is in this platform called Canvas, where a student will log on with their phone, with their laptop, with their tablet, often with all three. They won't close out the tab. They won't close the app because there's no reason to.
Yeah, how many tabs have you opened on your browser right now?
Precisely, yeah. If you have a Gmail tab open, you've committed the same mistake that these students did, which was that they didn't close the tab. What that meant was that the app refreshed some of the course material at random, because that's just how it works. It made it look like students were manually accessing material when it was actually being loaded automatically. We took a look at the data that this technologist provided. I would love to call it their log, but it was what the school gave them. All the exculpatory evidence had been removed. The school had essentially left only the access patterns that seemed most nefarious on the students' part.
We looked at this with some of the technologists at EFF and said, “This isn't how logs work.” A reasonable person who does understand technology would look at these logs and say, “Well, that doesn't seem manual, because here they loaded this file, then seven seconds later, they loaded this file. They answered a question at the exact same second, then they loaded this file 10 seconds later.” It just looks automatic. And that's after the university had already removed the data that would have made its case look even weaker.
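To make that timing argument concrete, here is a minimal sketch of the kind of reasoning involved. It is not EFF's actual analysis, and the log format, field names, and five-second threshold are invented purely for illustration:

```python
from datetime import datetime

# Hypothetical, simplified log entries: (timestamp, action). This is not
# Canvas's real log schema, and the threshold below is an illustrative guess.
log = [
    ("2021-02-02 09:00:03", "answered_question"),
    ("2021-02-02 09:00:03", "loaded_file"),   # same second as the answer
    ("2021-02-02 09:00:10", "loaded_file"),   # 7 seconds later
    ("2021-02-02 09:00:13", "loaded_file"),   # 3 seconds after that
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")

def looks_automated(entries, min_manual_gap=5.0):
    """Heuristic: file loads that land in the same second as answering a
    question, or that arrive in bursts only a few seconds apart, read as
    background refreshes rather than a person deliberately opening notes."""
    loads = [parse(ts) for ts, action in entries if action == "loaded_file"]
    answers = [parse(ts) for ts, action in entries if action == "answered_question"]
    if any(load == answer for load in loads for answer in answers):
        return True
    gaps = [(b - a).total_seconds() for a, b in zip(loads, loads[1:])]
    return bool(gaps) and min(gaps) < min_manual_gap

print(looks_automated(log))  # True: this pattern looks automatic, not manual
```

The point is simply that sub-second and few-second gaps are far more consistent with an app refreshing itself than with a student frantically opening course files mid-question, and either a human reviewer or a few lines of code can see that.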
They singled out 20 students after looking at these logs and were on the verge of either expelling them or putting a mark on their transcript. In med school, a mark on your transcript might mean that you don't get the rotation you want. You don't end up at the hospital you want. Your career can literally be completely derailed by a transcript change like this.
We worked with an organization called the Foundation for Individual Rights in Education, also known as FIRE, which focuses on free speech at the university level. You can visit them at thefire.org. We sent a letter to the Dean of Dartmouth and said, “We think you should probably rethink the way you're looking at this data.” He sent us a “Thanks, but no thanks” reply.
That was when we thought, “OK, well, we don't know what to do.” Students were definitely hiring legal counsel in this situation, but that wasn't really our bailiwick at that moment. So we reached out to The Boston Globe, because Dartmouth is in its news area. The Boston Globe did a big story after interviewing students. A week later, the story ended up on the front page of The New York Times' Sunday edition.
Long story short, Dartmouth rescinded all of their allegations, apologized to the students, and said they wouldn't do it again, which is great news. For about two months, students were unsure whether or not they would be able to be doctors, after getting into an Ivy League school—who knows what they're paying to go there—and this is all because the school didn't recognize what technology can and can't do.
It’s an epidemic of misunderstandings of technology at the education level, from proctoring tools that mistakenly flag head movements, coughing, or a person walking across your background during an exam as cheating, to these kinds of more technically complicated but still incorrect analyses of data. It's a real misunderstanding by the administrations of a lot of schools of what technology can and can't do when it comes to validating or invalidating claims of misconduct.
What are the students supposed to do? Not specific to the Dartmouth students, but on a broader scale, what should students be doing when they're dealing with campus technology?
You mentioned earlier to be careful with the tech that your school gives you. I think that's super important. EFF has a resource that's part of our Surveillance Self-Defense website, at ssd.eff.org. We've got a specific student privacy-focused guide there. It gives students some tips about what to do, like knowing what data is being collected and making sure to set social media settings correctly. As you said, sometimes the answer is only using the school-issued device when you have to, and making sure you're logged out of platforms or apps that the school might have installed on your personal device.
In the case of proctoring, it's a different story because those are tools that you use in the moment during an exam, and you often don't have a choice to get out of them. But there are some technical measures that students can take, and then there are a few social ones.
If you're at a school where there's any sort of “appetite for revolt,” we've seen lots of petitions that were successful in the last year specific to proctoring saying, “We don't want these proctoring apps that record our biometrics, our personal data, and potentially give them to a third party. We don't want these to be used during tests.”
They were successful. The University of California, the entire system basically said, “We think that all teachers within the UC system should try to find an alternative to remotely proctored testing.”
You can fight back in a social way. If you're a parent, which is maybe more likely for a listener of this podcast, you can do the same thing. Frankly, just make sure teachers and administrators understand the dangers these apps pose to student privacy. A report came out about four months ago that looked at how much schools and parents understood about the technology their students were using. You can imagine the answers weren't good.
They're very low scores.
Yeah, but they're interested in learning. If you have the ability to dig into that, I think it's a little bit incumbent on you if you care about such a thing to talk to the school, talk to the teacher, talk to other parents, and talk to the kids about what is and isn't happening on their devices. There's a lot you can do.
One of the most unfortunate things about this situation is that surveillance like this—called disciplinary technology—happens in places where there's a power imbalance. Sometimes even knowing doesn't solve the problem. You can learn that your boss is using productivity tracking software. That doesn't mean you can say, “I'm going to delete this.” The same is true for a lot of privacy-invasive student surveillance, whether it's on an app or in a classroom, or what have you. Knowing about it is really the first step.
I'm not advocating cheating, but I will say that if you search Reddit for just about any of these apps along with “workaround,” there are usually ways to figure out what they're doing and why they're bad—even, potentially, how to stop them from collecting data that often just shouldn't be collected by a third-party app for a school while you're at home.
All of this gets exacerbated because we are in a pandemic. While your kids might be in school today, in two weeks there may be an outbreak in the community and everyone's going to be at home again. In a situation with that much flux, the chance of these products being used without people on both sides understanding what they do and how they work is probably much higher.
That's exactly right. EFF didn't actually have what we call a student privacy working group—we have different working groups at EFF that focus on certain issues—until the pandemic started. We started it partly because the number of students using proctoring tools went from—I'm going to make up a number, but it's relatively accurate—somewhere in the millions to somewhere in the 20, 30, 40, 50 million range.
Once that happened, it seemed like a good idea to start focusing on what kinds of surveillance were going to be forced on people because of the pandemic. Student surveillance is just one of the many kinds of surveillance that we've implemented—some good, some bad, some much, much worse than bad—to try to make up for the problems of the pandemic.
Proctoring is the area where we've focused most heavily. Just to give you a statistic—every time I hear it, I just can't believe it's real. I really started focusing on proctoring because every year there's a bar exam—several bar exams throughout the year—where people who graduated from law school and intend to be lawyers take the bar. It's a horrible experience. It's two days, like six hours each.
In the best of cases.
Exactly. If you're lucky, you don't have to retake it. With the pandemic, it had to be remote. That was true pretty much across the country. It was true for barristers in the UK. I've talked to some of them about this. They have their own legal action set up because all of the National Bar Association people and the individual state bar associations for the most part said, “You have to take this exam through a remote proctoring tool.” In the US, that was primarily a company called ExamSoft.
To give you a sense: you're sitting at your computer, and ExamSoft proctors, who are either logged in live or can look at the video later, can see your video. It locks things down so that you can't open other apps. It does all the things you can imagine it would do, and a few that you wouldn't expect, like determining, based on your typing, whether or not you're the person you say you are. It's recording the patterns of your keystrokes.
It's got facial recognition as well—what they call facial detection—which is there to determine whether or not you're looking at the screen. I'm one of those people who finds it harder to think while looking someone in the eye. If I'm taking an exam, the same thing is true. I'm looking down at the keyboard, or I'm not looking at notes, I'm just looking out the window, because I'm thinking, “What is the square root of that?”
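As a rough illustration of what “determining whether you're you based on your typing” can mean: keystroke dynamics compares the timing of your keystrokes against an enrolled baseline. ExamSoft's actual model isn't public; the sketch below, including the averaging and the 25% tolerance, is a toy assumption, not the real system:

```python
from statistics import mean

def inter_key_gaps(timestamps_ms):
    """Milliseconds between consecutive keystrokes."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def same_typist(baseline_ms, sample_ms, tolerance=0.25):
    """Crude keystroke-dynamics check: is the sample's average gap within
    +/- 25% of the enrolled average? Real systems use far richer features
    (key hold times, digraph timings, per-key models); this is illustrative."""
    base = mean(inter_key_gaps(baseline_ms))
    new = mean(inter_key_gaps(sample_ms))
    return abs(new - base) / base <= tolerance

enrolled = [0, 180, 350, 540, 700]    # ms offsets from a practice session
exam = [0, 420, 900, 1350, 1800]      # much slower typing under exam stress
print(same_typist(enrolled, exam))    # False: flagged as "possibly not you"
```

Notice that nerves, a different keyboard, or an injured hand can shift the timing as much as a different typist would, which is part of why checks like this can produce false flags.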
These tools started getting used by the bar. We wrote a letter to the Supreme Court of California—the state Supreme Court—to say, “Please don't require ExamSoft for the California bar because it's got problems with bias. It's inaccurate. It's a privacy invasion. Other proctoring companies have had data breaches.”
They implemented a few measures. They made it clear that the data collected by ExamSoft was going to be deleted. Up until that point, there was a real problem of not knowing who was responsible when students asked for their data to be deleted under the CCPA—whether you asked the school, the bar, or ExamSoft. No one knew what to do. The state Supreme Court said ExamSoft isn't going to profit from collecting this data and selling it to third parties in this specific instance.
So the bar exam happens. There are 9000 test takers. ExamSoft flags 3000 of them as cheating. That means that the state bar has to review the video for 3000 tests and make sure that the student or the examinee wasn't cheating. They did that over the course of several weeks. Hundreds of students got notices saying that they had been flagged.
The end result was that something like 70 students actually received a disciplinary action, out of the 3000 that the software said were potentially cheating. ExamSoft says that's how it's supposed to work: they don't determine whether anyone cheated. They only determine whether someone did something that could be cheating, and it's the school's responsibility, or whoever's, to actually figure it out. What's the value in that, right?
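For scale, the figures Jason cites work out roughly like this. This is back-of-the-envelope arithmetic on the podcast's round numbers, not ExamSoft's or the state bar's own reporting:

```python
# Back-of-the-envelope arithmetic on the figures cited above; these are the
# podcast's round numbers, not official ExamSoft or state bar statistics.
test_takers = 9000
flagged = 3000
disciplined = 70

flag_rate = flagged / test_takers        # share of examinees flagged
upheld_rate = disciplined / flagged      # share of flags upheld after review
cleared = flagged - disciplined          # people flagged, then cleared

print(f"{flag_rate:.0%} of test takers flagged")        # 33%
print(f"{upheld_rate:.1%} of flags upheld on review")   # 2.3%
print(f"{cleared} people flagged and later cleared")    # 2930
```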
Yeah. “We think a third of the people taking the test were cheating—but we don't know it was cheating. We think you just need to look at it in more detail.” How do the people who are reviewing it know what is or isn't cheating? They looked to the left for just a little too long. Maybe the cat was over there.
Right. I think luckily, in this case, there was a little bit of leeway. I think the state bar understood that when you have a number like that, you have to understand that this software doesn't detect cheating. It detects abnormal behavior, which could mean anything based on what the idea […]
Thirty percent of the takers exhibited abnormal…so how abnormal is it really?
A great question. Yes, exactly. We've been pushing back against proctoring and we had a pretty big win about a month ago. There are three big proctoring companies—ExamSoft, Proctorio, and ProctorU. One of the big companies, ProctorU, is no longer allowing schools or customers to use its AI-only tool. What was happening before with this ExamSoft example is there's no reviewer on the company side; it’s just an algorithm that says, “Here's 3000 people we think cheated.” It's just a robot that does that.
ProctorU basically said, “The algorithm is wrong so often that we're not going to sell that feature unless you also pay us to review the video, because we're giving too many people false positives.” Which again really makes you wonder…
Get rid of the feature if it doesn't work.
Yeah, it's not a good feature. It's really useless, but this is what's happened over the pandemic in many places. A technology—maybe it's been around for a while, maybe it's brand new—gets pushed to the front line in the hope that it's going to be the solution to all of our problems. It takes a year, or however long, before people say, “No, this actually doesn’t work very well.”
We've seen a lot of that over the last year and we're hoping that actually the pandemic has made it more clear to people where technology fails us, especially in the educational realm. That's the best case scenario. You have to be optimistic about these things.
That's how I sleep at night: telling myself that when you have these giant messes, hopefully people learn from them. Hopefully, school surveillance over the last year has been one of those messes. There is still a lot to be determined about whether or not that's the case, but that's our hope.
Hopefully that will lead to improvement and at least help people be more aware of what the issues are.
Exactly, yes. That's the best you can do at least to start.
As we wrap up here, if people want to find out more about you, EFF’s mission, and specifically the content around student privacy—we mentioned ssd.eff.org and the student privacy guide—are there any other specific resources?
Those are great places to start. You can basically just go to eff.org and search for student privacy there. We have a tag, or what have you, on our blog where all the blog posts related to student privacy pop up. That's a good place to read about and stay up to date on what's going on, but SSD—the Surveillance Self-Defense guide—is a great place to start for anyone, whether you need assistance figuring out what your Facebook privacy settings should be, or how to encrypt your iPhone, or what have you. Those are both really good places to start.
You can find me on Twitter—I’m at @JGKelley. Although, as with most people on Twitter, I only tweet about student privacy occasionally. I am more often tweeting about how annoyed I am at local politics or pictures of sandwiches and things. Apologies in advance.
Obviously, people can visit EFF and follow EFF, if they’re looking specifically for that type of content.
That's probably the better choice than my Twitter. Go to Twitter and follow @EFF or go to eff.org. You can also find us on Facebook and Instagram at @efforg. And one day, TikTok. One day.
Where we need to get all of our surveillance and privacy content.
It's a perfect combination.
Isn’t it an oxymoron for EFF to go on TikTok?
Sometimes you have to be where the people are. Even if it seems ironic, it's still the best thing. Every time we post on Facebook, someone's like, “Why are you on Facebook?” I agree. I don't want to have to be on Facebook. I want to be on a dozen different platforms that all have different terms of service, different goals, and different audiences, but right now, you have to be where the people are.
That's a perfect way to close out. Jason, thank you so much for coming on the Easy Prey Podcast today.
Thanks so much, Chris.