The Social Impact Podcast with Bree Jensen
Social Change in Tech | All Tech Is Human Founder, David Polgar
Tech is a massive part of our world; we use it in every aspect of our lives. How do we use technology and not let it use us? Bree discusses digital citizenship, legislation, our kids and social media, and our digital lives with the founder of All Tech is Human. Listen to our conversation and let us know what you think.
TheSocialimpact.co
instagram.com/bree_jensen_/
Things to note:
All Tech Is Human is a non-profit committed to building the Responsible Tech ecosystem and uniting a diverse range of stakeholders to co-create a better tech future. The organization is holding a Responsible Tech Summit in NYC on May 20th, and has a range of activities including a mentorship program, a Slack community of 3k members, open working groups, a Responsible Tech Job Board, regular summits and mixers, and much more. Its most recent resource is the HX Report: Aligning Our Tech Future With Our Human Experience. You can find all of these projects on the All Tech Is Human website.
Guest Bio:
David Ryan Polgar is a pioneering tech ethicist, Responsible Tech advocate, and expert on ways to improve social media and our information ecosystem. David is the founder of All Tech Is Human, an organization committed to building the Responsible Tech pipeline by making it more diverse, multidisciplinary, and aligned with the public interest. As the leader of All Tech Is Human, he has spearheaded the development of three recent reports: Responsible Tech Guide: How to Get Involved & Build a Better Tech Future, The Business Case for AI Ethics: Moving From Theory to Action, and Improving Social Media: The People, Organizations and Ideas for a Better Tech Future.
In March 2020, David became a member of TikTok’s Content Advisory Council, providing expertise around the delicate and difficult challenges facing social media platforms to expand expression while limiting harm.
An international speaker with rare insight into building a better tech future, David has been on stage at Harvard Business School, Princeton University, Notre Dame, The School of the New York Times, TechChill (Latvia), The Next Web (Netherlands), FutureNow (Slovakia), the Future Health Summit (Ireland), InfoShare conference (Poland, upcoming), NATO, the State Department, and many more. His commentary has appeared on CBS This Morning, TODAY show, BBC World News, Fast Company, The Guardian, SiriusXM, Associated Press, LA Times, USA Today, and many others.
David is a monthly expert contributor to Built In (writing about the Responsible Tech movement), an advisory board member for the Technology and Adolescent Mental Wellness (TAM) program, and a participant in multiple working groups focused on improving tech and aligning it with our values.
The main throughline of David’s work is that we need a collaborative, multi-stakeholder, and multidisciplinary approach in order to build a tech future that is aligned with the public interest.
Speaker1: Today on the Social Impact podcast, we are talking about digital wellbeing with the founder of all tech is human.
Speaker2: Technology is affecting how I live, love, learn, even die. It affects how we get our news. And if it affects how you get your news, that affects your worldview. That affects who you vote for. That affects how you see the world. That alters the human condition. That is a big deal. Full stop. So the point there, and what motivated me to really drastically change around my life to make this my entire career, was that I saw such a need for increasing the level of consideration around the impact of technology.
Speaker1: Welcome to the Social Impact podcast, where people like you and I are making sustainable change throughout the world. Learn why they do it and how you can be a change maker in your community and across the globe. Each week we hear from people on their social issue and what compels them to make an impact. Hi, my name is Bree Jensen and I am your host. Today's guest is David Ryan Polgar. He's a pioneering tech ethicist, responsible tech advocate, and expert on ways to improve social media and our information ecosystem. David is the founder of All Tech Is Human, an organization committed to building the responsible tech pipeline by making it more diverse, multidisciplinary, and aligned with the public interest. As the leader of All Tech Is Human, he has spearheaded the development of three recent reports: Responsible Tech Guide: How to Get Involved and Build a Better Tech Future; The Business Case for AI Ethics: Moving From Theory to Action; and Improving Social Media: The People, Organizations and Ideas for a Better Tech Future. In March of 2020, David became a member of TikTok's Content Advisory Council, providing expertise around the delicate and difficult challenges facing social media platforms to expand expression while limiting harm. Let's get to this incredibly important conversation. All right. So, David, thank you so much for being on the Social Impact Podcast. I think that you are absolutely the best person to be on this first season because, for one thing, my social issue and my passion is digital wellbeing, and in 2021, oh my goodness, there's so much to look at when it comes to wellbeing in the digital space and kind of putting that on everyone's minds. So can you share a little bit about how you even got into this space? Did you just see the need? Like, why? Why this?
Speaker2: Well, first off, thank you for having me on the show. Secondly, we won't say the best guest, because that puts the pressure a little too high. So we'll say, you know, it's very topical.
Speaker1: It's a very one off.
Speaker2: Yeah, we'll say one of them. So then.
Speaker1: You can at.
Speaker2: Least hedge their bets here. But how did I get into digital wellness and wellbeing? One, you know, my background is actually as an attorney and educator, but I saw kind of early on, around 2010, inspired by a lot of great articles. There was a great article at that time period called "Is Google Making Us Stupid?" by Nicholas Carr. It was an Atlantic article that got a lot of attention. You look at that article now and most people would probably accept the thesis; at the time it was very, very controversial, a lot of debate around it. And I remember a couple incidents that really solidified why I got into this space. One of them I remember early on, this would have been about 2011: I had jury duty. And if anybody remembers jury duty, it can be kind of boring. You're just sitting there in a waiting room. You don't know what's going to happen for hours and hours at a time. And I recall forgetting my cell phone. I forgot my phone, okay. And I remember just waiting in the room there for a couple of hours, and I started feeling a vibration, right? There was a ringing that was happening. So I reached into my pocket and there was nothing in my pocket. I thought, well, that's kind of strange. How do I feel the sensation of a ringing phone in my pocket when there is no phone in my pocket? So, like any 21st century person does, when I got home after jury duty, I Googled my symptoms.
Speaker2: And sure enough, what I found was that at that time, Pew Research had come out with some research saying that two thirds of Americans have experienced these phantom vibrations. This is pretty intense, right? Because phantom vibrations, that's the same phenomenon that happens when somebody loses a limb. Right. If you lost your leg, let's say, in a car accident, you might still have the feeling that you still have a leg, because you had a leg for so long. Right. So that really blew me away, and it made me realize the significance of what we're developing and deploying. What I always felt from that, and a couple of other incidents, was that there needs to be a lot more thoughtfulness in how we develop and deploy technology, because although we define this under this broad category of technology, and we wrongly think that it's just about technologists, technology is affecting how I live, love, learn, even die. It affects how we get our news. And if it affects how you get your news, that affects your worldview. That affects who you vote for. That affects how you see the world. That alters the human condition.
Speaker2: That is a big deal. Full stop. So the point there, and what motivated me to really drastically change around my life to make this my entire career, was that I saw such a need for increasing the level of consideration around the impact of technology, because I really saw it as, in my mind, the most important issue facing us. Because again, although we frame it under technology, this is really about society. This is about democracy. This is about the future of conversation, the future of humanity, the future of what it means to be human in the digital age. That's a big deal. And that's something that strongly motivated me to shift around everything that was going on in my life, to say, this is an area I want to speak about and write about and research and consult on and talk to people about. And I think that there's a huge, huge need for all different types of backgrounds to get involved in this space. So again, with my own background as an attorney and educator, I saw that as a positive, right? Because here you have tech founders who tend to be overly optimistic, and here you have attorneys who tend to be overly pessimistic. That's actually a good thing, because attorneys are known for adding a certain level of consideration, of thinking of all the worst case scenarios. And given all the bad things that can happen with tech, and have happened with the tech industry, I saw that as a need to have more individuals who can say, yes, I see your best case scenario, but let's also think about the people who are being impacted by this.
Speaker2: Let's start thinking about the unintended consequences, the negative externalities. Let's think about how a bad actor could exploit this. So on one hand, with something like social media that I'm heavily involved in, right, currently sitting on TikTok's Content Advisory Council, and the nonprofit I run, All Tech Is Human, is also really focused on improving these digital spaces. That is an area where you need all different types of thinkers to start realizing how these spaces are impacted and ensuring that we don't just have one type of thinker, in other words. This is a multifaceted, complex area where we're dealing with digital wellness, where we're dealing with digital citizenship and responsible tech and ethics. It is deserving of multiple perspectives that can work together in a process. So I'm really for moving away from this lone wolf type of thought leader, you know, and moving toward a more collaborative environment where different perspectives can come together to shine a light on the blind spots that we always have with technology.
Speaker1: Mm hmm. That's incredible. We do need all of us, everybody, because it is a big task. And we're the first ones, right? The first generation of parents, the first generation of leaders, the first generation of kids growing up in this tech age. And I love, you know, being a social entrepreneur as you are, because you see the need and then you find ways to move the needle forward. And that was kind of my catalyst for getting into the digital wellbeing space: I was a youth advocate, and I was traveling around talking about anti-bullying, and the things that had worked to challenge kids to do better weren't working anymore, because guess what? They were being bullied at home too, right? And it was like, all of a sudden, as parents we were trying to figure it out, but yet we were still giving the Christmas presents, like, here, have the keys to the world, without knowing how to use those keys, and I hope you don't crash the car. And unfortunately, some bad things have happened, right, that have created awareness, and things have really changed. So what have you seen with the evolution of awareness, and now, kind of, what's next?
Speaker2: I have thought a lot about the what next?
Speaker1: Yeah.
Speaker2: It always comes down to the complexity of all the different parts of the puzzle. So as you know, Bree, from being heavily involved in the space for years, we've really kind of shifted over time. There's been this evolution. So I would say early on, and I remember even doing some work at the Center for Internet and Technology Addiction and dealing with Dr. Dave Greenfield, who popularized the dopamine connection, here was a, you know, a cyber psychologist who's been working in this space since the late nineties. So this issue of the impact of technology has actually been around for a really long time. People have been talking about this for decades. But what has happened is we started kind of framing the impact of tech around the larger systemic problems. So early on, when we were talking about something like digital wellness, the debate always came down to, well, can't you just turn off the phone? Right. Is this about personal autonomy? Is this you against the phone? Whereas over the last couple of years it started to shift to say, well, think about how we deal with any type of issue. You take an issue like food. Right. And that's why I think a lot of times people use food analogies when they're thinking about digital wellbeing and digital wellness. Mm hmm. Well, with food, we have the personal autonomy and the idea that you want to motivate a person to think about diet and exercise. But at the same time, we also put pressure and regulation on industry, because we wouldn't want somebody to come out with something that is so wildly unhealthy.
Speaker2: We would then say, well, maybe we need to list the calories, the caloric intake, maybe we need to list the ingredients, because how can you make a wise decision as an individual if you don't have access to how this is made? And I think that's kind of where we've been going the last couple of years: now people start looking at that equation a little differently, to say, well, it's not really just about personal decisions, even though that's how we like to be, especially as Americans. Right. We like to think that we're a hyper-individualistic society and that everything is our decision. Whereas now, when we look at social media, we say, wow, it's a little more complex than this. Because, one, we're dealing with our neighbors and our friends and our frenemies and everyone else who's kind of interconnected, and how they affect us, and how they affect the larger world, and how they affect the information ecosystem. But you also have, you know, an industry that is trying to prompt a person to have higher levels of engagement. Right. So how do you judge that in that kind of food analogy? And I think you would say, well, now you have an individual who doesn't have the access to make proper decisions, because there's no way of knowing how an algorithm works.
Speaker2: Right. There's no way of you knowing what's happening behind the scenes at a large social media platform. And therein lies the rub right now. So I see the future of where we're at right now as very similar to, and this, I think, goes to something you just said around kind of giving somebody the keys to the car, it goes towards that safe driving analogy that I always like to bring up. Because when we think about driving, it's actually a combination of personal decisions, governmental regulation, and socially responsible companies. So over the course of safe driving, let's say as a concept, we've seen a dramatic shift towards saying, well, maybe we need to put seatbelts in the car. Right? It wasn't that many years ago when we didn't actually have seatbelts. We didn't have airbags. We didn't have the intense regulatory environment that we have now. And then we also have law enforcement that will actually pull you over if you're not wearing a seatbelt. So think about that. That's obviously, objectively, an affront to personal autonomy, because on one hand, you could argue, well, it's your personal decision to wear a seatbelt or to not wear a seatbelt. But what we do, since we're living in a society, is you have your individual rights, but then you also have your societal understanding of how you interact with other people, because you're affecting the larger system.
Speaker2: So that's where we really are with social media: we need to start moving away from saying, all right, well, is this all about digital citizenship education? Is this all about parenting? Is this all about getting Mark Zuckerberg, you know, to be more responsible? And oftentimes the mistake, I would argue, that we fall into is that we're looking for a simple solution. And I have to tell you, I've got good news and bad news. The bad news, I'll start with that. The bad news is, I'm sorry, but there is no simple solution. This is a complex, thorny problem that deals with human behavior, similar to everything in our political system. This is an issue that's actually always going to be chipped away at somewhat, like crime, right? Nobody solves crime. It's an underlying human behavior issue where we're trying to create environmental differences that improve the situation. That's where we're at with social media. We're trying to improve social media; there's no magic switch to fix it, per se. Right. So we're trying to improve it, because the status quo of where we're at is not appropriate and it's not sustainable. So I think that's where we're going to be moving ahead. It's going to be a lot more collaborative. And I think that's what we even saw in recent days with the Facebook whistleblower talking about the quote unquote Facebook Files, The Wall Street Journal's kind of exposé on whether Facebook knew, from some of their internal research, about the potential dangers, specifically for a teen audience on Instagram.
Speaker2: And what I liked about what was said there is: this is going to need an actual collaborative effort. This is actually a time where we're going to have to roll up our sleeves. Right. Right now, it's easy to score points. It's easy for politicians to say, hey, let me grill this person from industry. It's a lot harder, though, to create an environment that is more conducive to wellbeing. It's harder to actually say, well, how do we do this if we're going to keep social media in our lives? Right. Because, again, that's maybe not a given; we could argue that this is not sustainable. But if we say, okay, well, we're still going to have teens, let's say, use social media in something like its current environment, and we want to make it safer, we want to make it better. Well, let's imagine that's where we're at, right? If that's where we're at, well, we're really going to have to work with lawmakers. We're going to have to work with parents. We're going to have to work with educators. We're going to have to work with everyone. We're going to have to bring in the saints and the poets and the attorneys and the sociologists and the psychologists as well. We're going to bring in everyone.
Speaker1: Right? We need all the players. We need the whole team. We really do.
Speaker2: We do.
Speaker1: Let's bring in the whole team. Let's bring them in.
Speaker2: And I'm ready. I'm ready.
Speaker1: I know. And you know, there's so many things to unpack there, but I'm kind of thinking of the two bookends, you know, the individual and then the big player, or the business. So two questions. Let's start with the individual. First of all, I love the name of your organization. It just says it all: all tech is human. Because that's something that, when I speak to young people, I just really want them to get. I say to them, if there's one thing that I want you to know, it is that you are still you when you're on your device, and the person on the other side, whether or not you know them face to face, they're still a person. And so they still have feelings. You still have feelings. This is a thing. So I have them get out their fake eraser. I mean, I always get the eye roll, you know, but I do it because I love the eye roll. And they take their eraser to the divider that they've created of the screen, you know, because this is a human responsibility. Can you talk a little bit more about that and how to help people? I mean, even as adults, I don't love the new expression "in real life," or IRL. I don't love that. And I know everyone kind of uses it and it's fun, but I don't love it, because I do think it perpetuates that thought of, like, you can be someone else on a screen, and sometimes that can be hurtful or harmful. Do you have any thoughts about that?
Speaker2: Oh, I have lots of thoughts. Yeah. That's a fundamental battle that we've had since the founding of the Web, right? The Web recently had its 30th birthday with Tim Berners-Lee last year. And it's something we've always debated, right? What are we like online? There's a famous New Yorker cartoon, right, with a dog on the Internet: on the Internet, nobody knows you're a dog, right? The idea behind that is, should you be you no matter what type of environment you're in? There's always been a lot of debate on what causes toxic behavior online. Is it around anonymity? But then there's a huge flip side of that, because here we are in kind of a privileged position sitting in the United States. You would want some level of anonymity if you're in a country like Afghanistan, where your identity could get you injured or worse. Right. Right, right. Especially if you're talking about, you know, some level of civil rights, some level of ability to protest the status quo. So it's not a simple issue of, well, if we just got rid of anonymity, all would be good. No, there's plenty of people whose identities are clearly listed who still act like jerks online.
Speaker2: So there's something a little deeper to that. On one hand, you could argue that a certain level of accountability, social accountability, does usually affect human behavior, right? Like, if you know you're being kind of watched and you have social pressures on your behavior, that is obviously going to alter your behavior. But the fundamental problem with a lot of this, especially for a teen audience, is that teens are not stable like you and I. And what I mean by that is their identities are not stable. So I think one of the areas I might push back a little bit on is, if you lock in teen behavior in concrete early on, that's actually the opposite of what we wanted the Web to be. If the Web is supposed to be about increasing creativity and free expression. Mm hmm. Well, if you make it where you think that one bad tweet can destroy your life. And a lot of times we have given examples of these, and we say, hey, do you know what happened to Justine Sacco? And we tell this story about a tweet that travels across the world and then leads to her firing. So that, I think, can really backfire, in the sense that, are we still trying to create an environment that increases connectivity and increases expression? Because teens are trying to try on different identities.
Speaker2: If I think back to my high school years, there's probably 12 of me, because I was constantly trying on different identities. So how do we create spaces that allow for that? I think that's why recent years have also seen the rise of, you know, platforms like Snapchat, because there is a certain level of pressure that obviously teens would want to move away from. So it was kind of interesting to see the reaction of, hey, make sure you don't post anything that can come back to you, you know, make sure that it doesn't get you expelled from college. But at the same time, when you have this disappearing content, that also freaked people out, because then they said, wait a minute, now we don't know what you're posting. So there's a little bit of a conflict in what we want. So I think that's a larger societal argument, you know, so there's always a lot of debates. And I think this is right now a cluster, and this is why I'm going to be busy for the next 20 years.
Speaker1: Yes. Yes, exactly. Exactly. So along those lines, I think that kind of segues into all that you're doing, because really, just the thoughts that you're bringing up, and the conversation, and the things that all these players in this collective that you're bringing to the table are discussing. I mean, like you said, there's so many layers to each one of these. There's really no answer yet, is what I'm guessing.
Speaker2: There's no answer, but that's why we need to make sure we have conversations. So I think what we've really struggled with is making sure that we can have deep conversations around this. Because if you and I, right, we've been actively involved in the space and we've done a lot of media interviews and things like that. So what I've been frustrated with is, oftentimes they'll bring you on and they'll say, all right, David, so is social media good or bad? And in 30 seconds.
Speaker1: Yeah, he's going to say it in 30 seconds.
Speaker2: In 30 seconds, I want you to tell me three tips that you can give to parents that will solve all of their problems. Go. The issue is, I should spend those 30 seconds saying this is going to take longer than 30 seconds. Right. We need an environment where we can unpack these issues. And frankly, that's why podcasts are a good medium for these discussions, because you can do longer unpacking. Right. This is complex. This is dealing with emotions and psychology. Right. This is actually not easy. And it deals with design and capitalism. Yeah. Yeah. Right. So there's a lot going on, and it's not usually given the level of respect, in terms of its complexity, that it actually deserves. We're always, again, looking for: raise your hand and tell me which side you stand on, as opposed to, how do...
Speaker1: We get radio voice?
Speaker2: Thank you. I'm going to start using that more often then. But the point is, right, how do we roll up our sleeves and actually tackle these issues? That's what I focus on. Like, that's what inspires me. That's what I'm trying to do with All Tech Is Human: to say, let's bring in people from academia, from civil society, from government, from industry. We're creating this agnostic space where we want people to disagree. We should disagree. This is about the future of our society. We're not going to agree on that. But what everyone involved agrees with is that the status quo is not sustainable, that technology deeply impacts us, and that we need to co-create a better tech future. That's what we're trying to do, and that's what inspires people. And that's where I think we're having a positive impact, by really just uniting a cross section of multiple stakeholders, because that's what oftentimes doesn't happen. It's usually a group of academics sitting over here; it's industry saying Chatham House rules and everything's off the record; and it's politicians shaking their tiny fists in the air. You usually don't bring all of those groups into the same room. All of those groups should be in the same room.
Speaker1: Mm hmm. Absolutely. And kind of going back to those two bookends, you know, talking about the individual and then the business. And you mentioned it when we first started, about the business plan. Right. And initially, the big hitters' business plans were not as ethical. I mean, not that they weren't ethical, but maybe things turned out in ways that they weren't expecting, or all these other things that have happened. Do you think that it's possible to create an ethical business in this media space? And I do know there are some trying, and I want to give them big kudos. You know, I could list a few that I just feel are doing incredible work. But I want to hear from you, with all of your experience: is it possible to make the big bucks, which we know is a driving force for a lot of these businesses, and also have an ethical tech business?
Speaker2: It is, but there are environmental cues that need to dramatically change on the back end. And what I mean by that is, if we don't like the way people are playing the game, then we as a society can change the rules. So even the name of the organization, All Tech Is Human, is supposed to have multiple meanings, but one of the meanings is that our tech is human, right? So it depends on the rules that we create or don't create, the guidelines that we're creating or not creating; that's what's going to affect our tech future. So around the question of, can we make an ethical tech company, or specifically, can we make an ethical social media company? Mm hmm. It's hard, because there's a famous saying, right: why do you rob a bank? Because that's where the money is. Why do people exploit our data? Because that's where the money is. Why do people use an ad-based model for social media? Because that's where the money is. Right. Because Silicon Valley, through their venture capitalist type of system, has created, since early on with the Web, an environment where it's usually based on: how do we create rapid user growth early on, burn a bunch of money in the beginning, you know, make it seamless and frictionless to use a product and to join a group, get a large network, and then try to figure out, well, now do we have enough eyeballs that we can get that attention? Then throw in some ads and now turn on the cash machine.
Speaker2: That's effectively the business model of most companies. And that's part of the problem: that doesn't jibe with what we as a society want. And I remember, you know, seeing this firsthand, organizing an event with the group I used to work with as co-founder of the Digital Citizenship Summit. And we held this big event in the fall of 2016 at Twitter headquarters. And, you know, just seeing that, wow, misinformation is a huge deal. So we know that that's bad for society and that's bad for us as individuals. Yet all of the companies who, you know, were contributing to this, it's not like they were being financially penalized. Right. It's quite the opposite. So Google is actually making money off of misinformation, because ads are being placed on some of this fake news. So therein lies the problem, because usually we try to say, you know, do well by doing good. That's what we've tried to do with the environmental movement: saying, well, you know, you can create a product that's better for the environment, and that's also something that you're going to be rewarded for in a capitalistic system with higher profits. Right. So can we do that with ethical social media? Well, I think that you would need to start changing some of the underlying laws for that. So I see this as changing.
Speaker2: So I guess, if you create an environment where the legal structure starts altering the equation, then it alters the incentives. So, for example, if we actually say, okay, we know that hyper-personalization of ads creates a lot of problems and it creates a lot of discrimination, and if we say, okay, well, we're going to prevent this type of behavior that we think is not good for society, then that actually cuts off a certain cash cow for major companies, who then have to pivot and say, well, I guess I can't make money doing this. We do that. I mean, that sounds different now, but that's what we do with every single industry. If you have a certain behavior, like, let's imagine there was, you know, a car that everyone was buying, but it went, you know, 180 miles per hour. And so, well, that's a consumer decision. Well, then we might say, no, let's create a speed limit and let's make it so that car can't go 180 miles per hour. Let's alter the equation. And then that creates a different incentive, to say, well, they can't compete on that. And so we always have a very hybrid system where we, as individuals, through the political process, alter the equation of how a company operates, a balanced structure where we are promoting innovation but at the same time trying to limit the harms.
Speaker1: Right. It's fascinating, because it really aligns with what you're doing, that community focus. Right. So, like, social media is a community, an I've-got-your-back type of situation when it comes to monitoring content right now. You know, you have to report things, and there's just this level of, like, ethics and looking out for your brother on social media if somebody's being cyberbullied. You know, it's really like a big community. I could probably start going into all the ways that we can be global citizens when it comes to our social media and all of that. So, you know, we're kind of concluding here, but I do want to unpack just what you're doing and how you've brought everyone to the table to create social impact. Because the way I'm looking at it is, we've had about a decade of community assessment, you know, of seeing how this is all working out and how it's impacting people for the good and for the bad. You know, I think there's been so much good that's come out of it, so many movements and all that good stuff as well. So can you share a little bit about how you're mobilizing the community to make a positive impact moving forward?
Speaker2: Yeah, what I do with that is actually kind of have a system where we're trying to initially attract a diverse range of participants, right? So it's a three-part system. Oftentimes the problem, at least in my opinion, has been that we haven't involved the diverse range of participants that social media in particular affects. It affects everyone, yet it is developed and deployed by a small sliver of society. You think about it, right? You have one part of one state of one country that is making decisions that deeply impact the globe at large. That's not sustainable. That's not a system that we want. We really need to greatly diversify that. So the first part is to create that system. So, for example, we recently started a Slack group for All Tech Is Human, and we now have about 2,000 members all across the globe. It's a great way to just reach out, hear all these different opinions, bring people together, and promote this culture of knowledge sharing and collaboration. And then on the mobilization, we usually do that around working groups. So we recently had a working group around improving social media. If you go to improving social media dot com, you can see the report that we published based on this collaborative group from all different backgrounds.
Speaker2: That's what we're trying to do, and then we usually hold an event around that. So the mobilization has really been about saying this is actually a movement that could be a lot more cohesive. Because, as I jokingly like to say, no amplification without representation, right? If something online, if the digital space, if a social media platform impacts you, then you actually should have a seat at the table, in the sense that you should impact its development. So that's where we're trying to move this. That's why we're trying to mobilize people: people largely feel left out of the process. They feel like something is being developed and it has deep impact, but they are very limited in their ability to alter it. We're seeing this as a more participatory movement, as opposed to opting in or opting out. So how can we, instead of opting out of the system, opt into improving it? That's a lot of what we're trying to mobilize people around.
Speaker1: I like that. How can you opt in to improve it? That's fantastic. How do you opt in? That's an incredible line. How do I opt in? And that can be applied across all social change. How do you opt in? How do you make a difference? And so I have some rapid fire questions for you, but before I get to them, you know, just as kind of a final "what would you do," I'm curious. I know you just had a summit. If you want to share a little bit about that, and how people, if they are feeling compelled to get involved in what you're doing, you know, maybe they're just feeling compelled to do their part, but maybe they want to get involved in the group at large, how can they do that? Share about the summit and all that good stuff.
Speaker2: Thank you. Yeah, there's a lot of different ways to get involved. So you can go to all tech is human dot org, or you can go to our new site that we just launched that's talking about the people, organizations, and ideas of this nascent responsible tech movement; that's at responsible tech guide dot com. We recently had a Responsible Tech Summit where we had about 1,200, actually over 1,200, registered attendees from 60 countries. We saw a huge need from so many different people to say, wow, this is a big issue. Like, I want to get involved, I want to meet other people. And there was a tweet that really spoke to me about the work that I've been doing for the last bunch of years. I'm paraphrasing, but somebody tweeted that, hey, you know, before this Responsible Tech Summit, they felt alone, right? But now they've joined over a thousand tech ethicists and philosophers and tech policy professionals and others who are committed to co-creating this better tech future. And now they know they're not alone. Now they're inspired. Right. That's what we're trying to do: to weave together this community so we can start sharing knowledge, so we can start working together, so we can start challenging each other, so we can start building a process to improve where we're at right now. It's going to take a process. It's going to take a community approach. And that's everything we're doing. And so if you're listening and you want to get involved, hey, write me an email: David at all tech is human dot org.
Speaker1: So, a few questions, rapid fire, in a couple sentences. And you kind of shared this throughout, but what is your purpose or motivation for change?
Speaker2: My motivation for change is that we need greater knowledge sharing and collaboration, and that change is not going to happen through a lone wolf thought leader; it's going to happen through a greater process. So even me being one individual, my point is to bring other people together.
Speaker1: I love that. That's very good. And what are your wellbeing tips? You know, on the Social Impact podcast, we are encouraging people to also take care of themselves as they're taking care of others, to make sure they're full. Do you have any tips for wellbeing?
Speaker2: I do. So much of this is about thinking about your thinking, being able to check in with yourself about how things are going. I really try to move away from the quantification of everything, where everything is based on metrics. So I don't believe that it's based on a certain amount of screen time or other aspects that are very concrete. I think it's more based on the qualitative: How are you thinking? How are your relationships? Are you able to have eye contact with your loved ones? How do you feel things are going? How is technology impacting you, good, bad, or indifferent? And that ability to reflect on its impact is what actually causes a lot of personal change. And that's what I always do: I check in with myself and say, all right, well, now I'm feeling exhausted, therefore I'm going to put my phone away, or I'm going to close down my screen and I'm going to go for a walk. You need to start thinking of your brain as something that's constantly being affected by what you give it, and ask how you can create a better environment so it can be conducive to creative thoughts, and allow for that.
Speaker1: Check in on yourself and make room for innovation. Great. That's really good. I'm taking that one on. And the last one is, how can we all make an impact? So I know this is an enormous question, but really what I'm asking is, how can our audience find their thing, whether it's the small thing or the big thing? How do you get started?
Speaker2: You get started by finding others. I really would say you do, because it's a lonely trek. If you're inspired by a book or a podcast or a video, and then you're just trying to do it on your own, you can't do...
Speaker1: It on your own.
Speaker2: But by finding other people who believe in you and share your passion, that's what feeds it. Like, I couldn't do this on my own. I have to find others that challenge me and that inspire me and that motivate me, because there's a lot of back and forth. There's a lot of ups and downs with this movement. Right. This is going to be something where a community is helpful, because the community provides support. So I always go back to that: get involved by finding other people that want to collaborate, that want to hear what you have to say, that want to work together. I think that's where this is going to happen.
Speaker1: Mm. Absolutely. Such great insights, David, and thank you for this conversation. I know it's just a catalyst to bigger conversations and the ripple effect of making change in the digital wellbeing space and the tech space. And I just want to thank you for all that you're doing. I know it can't be easy sometimes, but hopefully you find joy in the small wins and the big wins along the way. And thank you for spending your time with us. I know it's valuable. Really appreciate it.
Speaker2: Well, thank you for having me. And I do think this should be a catalyst for deeper conversations. So, yeah, if you feel inspired by it, if you're listening, reach out and get involved, because it takes everyone, and, you know, no change is too small, and everything is making a difference for the larger movement. And we need everyone involved.
Speaker1: That's great. Let's continue the conversation. Right. Keep it going. Thank you, David.
Speaker2: Thank you, Bree.
Speaker1: Thank you so much for joining today. If you are liking the Social Impact Podcast, please make sure you're subscribing on YouTube or on all the places you listen to your podcasts, and drop us a review. I'd love to hear how you are making a difference in this world through your projects or your business. If you need a little help, please feel free to reach out. You can find me at the social impact dot co. We provide consulting services, business coaching, and much more. So please feel free to find us and follow on social media at bree underscore jensen underscore. Next week on the Social Impact podcast, we are going to be speaking with Dr. Denise Burger, or as I like to call her, Dr. D. She has been one of my professors, and you all are going to love this conversation if you're interested in corporate social responsibility, leadership, how to make an impact there, and entrepreneurship. So please make sure you subscribe so you'll get alerted when that podcast drops. I'll see you next week on the Social Impact Podcast. Talk soon.