The Evolving Discourse of Social Media (w/ Renée DiResta)

ReImagining Liberty

March 16, 2024 | 00:51:52
Show Notes

Digital expression is weird. When we move our communities and communications into digital spaces, such as social media, the result is an uncertain landscape of new incentives, mechanisms of influence, vectors of information and disinformation, and evolving norms. All of which have profound effects on our personal lives, our culture, and our politics.

Few people have put as much thought into how these platforms function, or dysfunction, as social ecosystems as Renée DiResta, Research Manager at the Stanford Internet Observatory. In today's conversation, we dig into what makes social media distinct, how communities form and interact online, and what evolving technologies mean for the future of digital expression.

Produced by Landry Ayres. Podcast art by Sergio R. M. Duarte. Music by Kevin MacLeod.


Episode Transcript

[00:00:03] Speaker A: Welcome to ReImagining Liberty, a show about the emancipatory and cosmopolitan case for radical social, political, and economic freedom. I'm Aaron Ross Powell. Digital expression is weird. When we move our communities and communications into digital spaces such as social media, the result is an uncertain landscape of new incentives, mechanisms of influence, vectors of information and disinformation, and evolving norms, all of which have profound effects on our personal lives, our culture, and our politics. Few people have put as much thought into how these platforms function, or dysfunction, as social ecosystems as Renée DiResta, Research Manager at the Stanford Internet Observatory. In today's conversation, we dig into what makes social media distinct, how communities form and interact online, and what evolving technologies mean for the future of digital expression. If you enjoy ReImagining Liberty, I encourage you to subscribe to my free newsletter, where I write frequently about the kinds of issues we discuss on the show. And if you want to support my work, you can become a member and get early access to all new episodes. Learn more by heading to reimaginingliberty.com. With that, let's turn to my conversation with Renée. A lot of the issues that seem to be hot button when we're talking about digital conversations and communities on social media aren't new. There have always been communities of humans talking with each other and passing information around. And sometimes that information is good, sometimes it's bad. Sometimes misinformation spreads through these communities. There have always been levels of toxicity and hostility and canceling each other and so on. But we don't get as worked up about these issues, or see them as pressing, or demand that stuff be done about them outside of digital spaces the way we do now. So what's different?
What makes digital spaces flashpoints for worries about these kinds of standard features of human communication? [00:02:26] Speaker B: That's a great question, with, I think, a very multipart answer. I would say the first thing that comes to mind is norms, right? In a lot of ways, as you note, when people talk to each other in person, or even virtually, as we're doing now, you've kind of become culturally attuned to how to have a conversation with somebody, right? And even if you're doing that in a big group, lecturing at a town square, sitting in the proverbial public square, we're sitting there, we're having conversations. There are expectations about how we comport ourselves, right? Somebody who comes in and is a heckler, that's actually really frowned upon. That's not a thing that we want to have. We want to have that discourse, we want to have that dialogue. And that's because, actually, when you're sitting there in the confines of a physical space, you're listening with your ears, you're looking at the person. There are ways in which you engage and ways in which you ascertain whether somebody is communicating with you in good faith, signals you look at to determine if they're lying to you, right, how trustworthy they are. And virtual removes the vast majority of that. So you're operating on a very, very different structure. And this is the tired media theory trope, but it's actually quite true: the structure really shapes the outputs and the style of engagement. And so social media became this vast, many-to-many environment where we could all talk to each other, we could all say whatever we wanted to whomever we wanted. And there were certain aspects of it where the norms had not yet been established. And in some ways, I think a lot of the norms around how to have a discourse... now it's a meme, right? The discourse, how do you have the discourse?
But how to have conversations online became very much rooted in: are you dunking on your enemies? Are you owning your enemies? You want to get attention, right? You want to differentiate yourself from the horde of other users that are posting on the Internet as well. So you have a system of incentives that are mediated in part by algorithms, not by interpersonal communication. So you're performing not only for the people that you are trying to reach, your human audience, but you also have an algorithmic sort of overseer, right? That's going to determine whether your content, your commentary, is seen by more people. So there's a very, very different incentive structure and a very different set of norms that's evolved in this space that I think feels very, at times, hostile and toxic, in a way that the same two people who would fight on the Internet might be able to have a perfectly civil conversation in person. There was actually an effort looking at this called America in One Room, right? Could you bring people who were politically divergent into a physical space? Turns out they actually engage quite well, right? Put them back online, and all of a sudden you are a member of a warring faction, performing your identity for all of the observant bystanders who are taking cues on how to behave. And in many ways, the most toxic people, the most extreme viewpoints, got the most attention and came to serve as, like, avatars of what it meant to be a good member of a particular identity or community. And I think that that really, over the last maybe seven years or so, has solidified into actually some pretty terrible online norms. And that's where I think that tension is felt by a lot of people who don't necessarily know how to react to it or how to behave in that space. [00:05:54] Speaker A: Is this a result, then, of the digital mediation or intermediation between us?
So you and I are talking digitally right now, but it's this kind of weird halfway, because I can see you, and so you can read my body language and I can read yours, which is different than when we're interacting on Twitter or Threads or Bluesky or something like that. But that's somewhat distinct from the algorithm creating the incentives, because we could imagine, say, social media without the algorithm. This is the reverse chronological feed that all of the people demand should be the standard for every social media platform. We could also imagine... I mean, in person has its own sorts of algorithms. As you were describing this: you become kind of more radical, more shrill. That signals that you're more serious in your community. It gets you more attention. This is the history of emerging religious faiths. [00:06:57] Speaker B: Yeah, absolutely. [00:06:58] Speaker A: I think Cass Sunstein has done a bunch of stuff on this in the pre-Internet era, too. [00:06:59] Speaker B: Yeah. [00:07:02] Speaker A: And so is this just a matter of, if we could tweak the algorithm to emphasize more civil communication, that would fix it? Or is this basically an unfixable problem within digital spaces where we don't have that? If you and I disagree, I might yell horrible things at you in kind of the anonymity of an online space. But if I have to actually see your pained reactions to my words, I soften them. I'm not as much of a jerk in person as I might be online, and that seems unfixable unless we just all go to video chats. [00:07:45] Speaker B: I think there was this interesting experiment during the pandemic: Clubhouse, which I know still exists, but I think it was really kind of a big thing during the pandemic. And I thought it was very interesting. I really enjoyed it, actually. I really enjoyed it a lot. And it's very much sort of of the moment, but I enjoyed it because you could hear people's intonation. You felt like you were having...
There was no video. It was an audio chat, for those who weren't on it. But it felt like you could have a little more of a dialogue, where you could hear the other person. And people who were kind of good storytellers, good interviewers, would really thrive in that environment. Anonymity is an interesting one, because it's a double-edged sword. For many, many years now, you've seen people have various opinions on how to reform the Internet, and ending anonymity is one that comes up constantly. You might recall that one of Facebook's differentiators was the idea that you were speaking to someone who was a validated, true-named account, right? That was actually what it sort of set out to be. Turns out, in any system, you're going to have some group of people that are trying to manipulate it. But the problem of scale, I think, might be the real differentiator here, in that the more you try to have larger and larger communities of people, you do see that kind of jockeying for position, as you noted. You do see that kind of desire to rise to the top. Certain voices become more representative. One platform that I really like a lot, that I think does this very well, is actually Reddit. And what I like about Reddit is it has persistent pseudonymity, right? So you're not deluged with random anons. You can kind of tell if somebody's been a member of the community for a while. In some communities, if you post and you're brand new, there's a delay, right? You don't get to just spam the subreddit with 10,000 thoughts, right? So there's a friction there. There's also the upvote/downvote dynamics, where other members of the community are serving as kind of reinforcement, right? So somebody who goes and leaves a terrible comment is going to be sort of downvoted into oblivion. It'll eventually be kind of hidden, actually, from view as a non-constructive comment. You have moderation that's set by the moderators.
So there are kind of top-level platform dynamics that say, like, there are certain things that we are going to ban here. They are a very generic kind of list up at the top. But then after that, you actually have very, very detailed rules, rules that shape the norms of the community, that say: this is our community for posting pictures of cats, and if you post a picture of a dog, we're going to kick you out. And that's just it, right? That's the social contract that you enter into when you become a member of this community. And if you decide to be an asshole and post a dog and they kick you out, you can scream censorship if you want to, but that doesn't actually matter, right? You violated a norm that a community has set. This is the space that it wants to create. It's very clearly articulated. You know what you're getting into when you join. Some Facebook groups are like this also. You join the group, it posts a whole list of... you know, I don't know if you've engaged in Facebook groups. The mom groups are a whole thing. They have a whole list of rules. And you accept the rules as conditional upon entry. And then these volunteer moderators, who are also members of the community, are in there, and they will enforce the rules. Right. And so it is not the kind of thing that you can do at mass scale, but when you have a few dozen, a few thousand, even, people, you can create that kind of environment. And it's not necessarily anonymity that's the problem, because Reddit is very pseudonymous. But it is a question of, like, we have all chosen to opt into this rule set. We've kind of come to an agreement that this is how we are going to engage in this space. [00:11:39] Speaker A: How is that mechanism different, other than maybe just size of community, than, say, Twitter or Facebook? Because what you're describing is also how those places operate.
There are moderators in the sense that there's Meta's, or there was Twitter's, trust and safety teams and their content moderation policies. And if you violated the community norms, you wouldn't get engagement, or you might get deplatformed. And that happening is what has caused the current conflagration about social media. We're recording this a few days after the Supreme Court heard oral arguments in, I think it was, the NetChoice case, about whether you can kick conservatives off. And I think about it in the... [00:12:34] Speaker B: About to be... [00:12:38] Speaker A: There's that. It seems like a lot of what we're seeing now is people saying they're not really upset about the cat community on Reddit kicking out someone who posts a dog picture, but the broader community on Twitter kicking off someone who posts a swastika, and saying that's an unfair abridgement. And so what's the difference there? [00:13:05] Speaker B: So I think, first, you called it the community on Twitter. And I guess what I would push back on is the idea that there is a Twitter community, right? Twitter is an infrastructure for many communities, the way Reddit is an infrastructure for many communities. But you might think of Reddit as more of a bordered set of communities, right? More clearly delineated. You know that you're going into the cat subreddit, and this is how the cat people want to do their thing. On Twitter, you are organized into networks, meaning you choose to follow and be followed back, right? You form your communities. It's like an open crowd, I think, is the way that I would describe it, the way I have described it in the past. Closed crowds, right? You have people who are there. They're forming around a particular religion, a particular group. They're the Kiwanis Club. They're the cat group on Reddit. And there's a sense of, like, we are here because we are of this community, and we are adhering to these norms.
What you have on Twitter is an open crowd, where you have many, many different people with many, many different types of norms, and they all see each other, right? And so it is an infrastructure for having conversations, but it only has the top-level mod. So on Reddit, you have the top-level mod, and then you have the community mods. And the community mods are the ones making the cat rules, right? On Twitter, you have to try to create one set of rules for all of these different factions that have all assembled on your platform, that are all very kind of tightly networked on your platform, who are also then constantly seeing the people that they hate in their field of view. And in fact, as people hate-read content, right, the algorithm does not understand that you have just hate-read it, you know. So you're going to see more of it. There's tons of social science on this at this point. Chris Bail's book comes to mind, Breaking the Social Media Prism. He studies polarization. He's at the Duke Polarization Lab. I don't study polarization. I'm not an expert in polarization. But what they find time and time again is that a person who is on Twitter sees, let's say, these conservatives saying these things, and feels very, very angry about it, and feels that it is her responsibility to respond, right, to correct the record, to tell them that they're full of shit, to go and fight back. Because Twitter is not the public square. Twitter is the gladiatorial arena, right? We are there to do a thing. The incentives of that platform are not to have nice, closed-crowd conversations about cats. It's actually to say: your norm is censoring my expression. The conversation around trans people is, I think, possibly one of the clearest examples.
But the sort of norm that the platform tried to set about how you speak to people, what name you use, what pronouns you use, the sorts of rules that they tried to put in place, was in response to legitimate harassment in which individual people were targeted, and so they tried to create these rules. But then you can actually, it turns out, rile up your faction more by arguing that the rules are foundationally illegitimate, right? That they are biased against you, and then your faction feels aggrieved, whatever the rule is. You see this on Bluesky, too. Whatever the rule is, your faction is now going to be unified in its hatred for that rule. And so a lot of it is actually this ref-working, where, when you have that one central top-down moderation regime, if you will, you are working the refs to try to make it the most favorable for you and your group. And that's because the thing that we haven't really touched on here is that we're not just talking about cats, right? We've got world leaders on the platform. We've got people in pursuit of political power. We have people in pursuit of massive wealth. And so the ability to be algorithmically rewarded versus to be throttled translates into material wins and losses in terms of either elections or financial lucre, if you will. And so that's why this idea of Twitter as this incredibly high-stakes arena is foundationally different than the ways that people behave elsewhere. Even as you can have a strong identity on Reddit in the cat subreddit, you're not constantly out there like, well, as a liberal, let me tell you... You don't have to do that. That's not how you're performing. But on Twitter, you are. [00:17:43] Speaker A: I think there's also a further factor that adds to these problems on Twitter and other centralized social media platforms, where kind of everyone's thrown into the same space and then builds what you called an open crowd.
And that's that when you're on Reddit, or... I cut my teeth in the era of web forums, and so when you're on the forum, you have this clear indicator that you are in a specific community. So you can see: I'm in this subreddit. It's labeled as such. That's the community I'm in. The other people I'm talking to are in that community. Or I've gone to this particular URL that is running this forum software and joined this particular forum, and it's this community. On Twitter and similar kinds of platforms, you're all thrown into the same space. And we know that, like, conceptually, everybody's on it. There are several hundred million people that have been thrown into that space, but you're seeing a very narrow slice of that, because you're seeing the people that you chose to follow. You're interacting with the people who chose to follow you. And so you've basically constructed what looks like a Reddit cat community, in the sense that it's a small network around a shared set of interests, but you don't have the visual indicator that it is a subcommunity. And instead, what you think is: what I am seeing is Twitter, and the conversations I'm having and the norms and the kind of shared epistemology just are Twitter. So when you get the context collapse and someone from outside of it pops in, you're like, this person is a fringe outsider, as opposed to, they're just another community that I have bumped into. And the things that we know to be true in my community must be what everybody knows, because it's just what everybody I see on Twitter is talking about. And so when people push back on that, it feels less like they're in a different community and more like they're challenging kind of the whole. This, I think, contributes to the sense that the norms should reflect me, not just because of working the refs and wanting them to reflect my subcommunity, but because this is just what all of Twitter is.
And so, of course they should support it, because it would be nonsense for them to support something else that is not representative of Twitter as a whole. [00:20:22] Speaker B: I think you really saw that on Bluesky. I think you were kind of early there, as I recall. I was very early there. I had had a chat with the CEO. I'm at the Stanford Internet Observatory. We're very interested in new and emergent platforms, and we're interested in them for several reasons. One: what is the sort of trust and safety framework that they envision for community governance? But the second is, whenever you have a new entrant into the information space, things kind of reshuffle, right? New communities form. And so we're always just interested in what changes, how do narratives move with the new entrant or the new technology? And I remember joining Bluesky when it was very, very early, and I actually didn't even know what to post, right? I've been online forever, and I got there and I was like, all right, people are posting AI-generated art. This was right when... March of 2023, so about a year ago. And they were posting a lot of AI-generated art, and then there were a lot of nudes. I was like, okay, I'm not really sure how I fit here. I don't really know what to do here. So I lurked. I posted some AI-generated art. In some ways I liked it, because it felt like a little bit of a reset, right? I didn't have to post about the stuff I normally talk about. I wasn't there to grow an audience or talk about my work or anything like that. I posted, like, here are these crafts I made with my kid. I found these random gardening people, before it had formally organized into feeds, and I had a garden, and I would post my gardening stuff, and I would have really nice conversations about it. And I felt like it hadn't yet factionalized, if you will.
It was small enough that even though I couldn't quite figure out where to enter the conversation a lot of the time, it also felt like people weren't fighting about politics constantly, or fighting about whatever social culture war issue constantly. But I do remember, though, Bluesky had a couple of things early on where they screwed up some basics, right? Letting people register slurs in their usernames, things like that, things that were not good. But there were also these moments where you would see that it became a very left-leaning community, right, because it was an invitation-based network at the time. And so people invited their friends. And so it was a lot of left-leaning people who invited other leftists, antifascists, anarchists, and those communities kind of sprung up. And what was very interesting about it was, even though the entire premise of the platform, from its sort of foundational reason for being, was that it was going to be composable moderation, right? Different communities were eventually going to run servers and moderate themselves, akin to Mastodon. People really didn't seem to actually want that, right? They wanted very specific moderation types and moderation rules. And they wanted them because they felt that Twitter... at this point, Elon owned Twitter... that Twitter had rolled back some of those things. So in a sense, what they had lost, if you will, in the ref-working fight over on Twitter, you began to see manifest in the ref-working fights over on Bluesky, as people tried to say, these are the rules that we want to establish for our community on this thing. So you do see, even as the network is forming, even in the early days, the recognition among a lot of users that they wanted to establish this as a space... I hate "safe space," but, like, a friendly space for their community. In a sense, they tried to turn Bluesky into what Parler and Truth Social were for the right.
And so it's going to be interesting to see how the platform, now that it's open to the public and is still moving into this decentralized world, how they're going to handle that. Because the way I see it is, between Threads and Mastodon and Bluesky, the trend is towards decentralization. And when you have decentralization, there are actually no refs to work, right? Eventually, there are no refs. Those other communities that you don't like will set their rules for their piece of the platform, and you choose to federate or defederate, which just determines whether or not you see it. But it is still out there. And I think that this is where, unfortunately, the kind of content moderation culture wars on the big centralized platforms are leading more towards decentralization and everybody moving more off into their own world, as opposed to finding ways to bridge those gaps and say, okay, here is how we will create top-level, all-community norms that will achieve kind of maximum happiness for the greatest number of people on the site. And I think in some ways, that experiment is maybe being shown to be simply just a loss. It's just not possible. And so this is where it's almost like this collective retreat into smaller spaces. [00:25:22] Speaker A: Yeah, I'm a big fan of decentralization, but the dynamics you just described have been really fascinating to watch play out, because it gets to those conceptual confusions. I think a lot of these problems arise because we're human. We communicate a lot. We're very used to communicating. It's our jam. But everything has been built around in-person communication. It's only recently that we have had these new platforms, and these platforms introduce a lot of conceptual weirdness to the way that communities function, the way that communities interact, the distribution of what we say: you feel like you're having a private conversation, but it's really a public conversation.
And it's not quite the way that you and I could be sitting at a coffee house having a conversation, and it's kind of private, but there could be other people listening in. It's that hundreds of millions of people, anyone in the world who wants to listen in on our conversation, can just click on it and do it. And as you were talking about this, it put me in mind of a fight going on right now. Bluesky's opened up, there's Mastodon and the fediverse that Threads will eventually join, and there's a guy building a bridge between Bluesky and Mastodon so that a Bluesky user could follow someone on Mastodon and see their posts within Bluesky. And there's been this huge fight in the Mastodon community, because on the one hand, they want to use this platform where anyone can follow them, but on the other hand, they don't want anyone who's using Bluesky to be able to listen to them, even though the conversations they're having are public. And it feels like a lot of navigating this is a tension in interests. On the one hand, we want our tight-knit community with our norms, where we can have a good conversation and we can feel like we belong. On the other hand, we want engagement and a large audience and a place to push links to our newsletter and to promote ourselves. We want to be influencers with the highest follower count, and those things aren't really compatible. [00:27:40] Speaker B: Well, it's an interesting point. So I don't know that we all necessarily want to be influencers. I think that is an interesting dynamic in and of itself. I am trying to remember the stat I saw. Oh, it was TikTok. You have what's called the 90-9-1 problem, right? 90% of the people are simply lurkers. 1% create the majority of the content on the platform. I'm trying to remember what the 9% do. I think they're in there contributing sometimes, but they're not trying to be the sort of influential 1%.
And I think on TikTok, it was something like 25% were creating the vast majority of the content, which actually is quite large. [00:28:23] Speaker A: Right. [00:28:23] Speaker B: It indicates a pretty big dynamic in creators. I was actually surprised by how many were creating. But the question of, like, do you want to be an influencer that is known by people outside of your faction, right? There's a very sort of small, I'll use the word, elite group of people who think that way, who run newsletters and things like that. Then there are the people who want to have what you might call local influence, right? So the way that I've described it is, you have the influencer and then you have the crowd, right? Using that same word, the crowd. And people want local influence within the crowd. They want to be seen as authoritative in their small community, but they don't necessarily want that massive visibility and reach. They're not looking to shape the national conversation. They're not activists on a particular issue. Or if they are activists, they're content to be amplifiers. Right. Many, many people see their role, particularly on Twitter, as boosting their side. They know that they have to put out content. This has been sort of, like, taught to them since maybe the 2016 presidential election, the 2015 campaign. They know that they have to do the work to amplify their viewpoint. And so they're there, and they're performing that role. They will put in their bio things like "retweeted by Charlie Kirk," you know. They have these sort of, like, local icons. And so they're indicating that they are locally important, but they're not necessarily running off to start a Substack and trying to become a conservative or liberal influencer. So I think that there is a divide there. You do see a lot of people who showed up to Threads, and I thought it was interesting.
You know, Chris Cox, the chief product officer at Meta, expressed this: we just want a place for sane conversation. Right? Sane moderation, I think, is how he put it. But you just saw these people who found Twitter exhausting, who no longer wanted to be in that crowd dynamic, who just found the whole thing tiresome. And they're not on Threads trying to... The people who are resistant to Threads, in some ways, are the people who have the massive followings on Twitter, because they don't want to leave that behind. Right? They did a lot of work to amass that. And so they have that large-scale visibility. And so you do see some entrenchment, I think, among people who have worked to really build a platform, because they either want to monetize a large audience or they want to be able to influence a large audience, versus what we might call the vast majority of normies, who just want to be in a place where they can have a conversation without feeling like they're about to become the main character of something. Right? And I think that fear of becoming the main character, saying the wrong thing, the context collapse of having somebody who runs a nut-picking account grab your tweet, screenshot it, retweet it, blast it all over the Internet... people are afraid of that, actually. People really don't want that to happen. I think creating spaces where you're not going to get massive amplification, where we don't have to have trending algorithms pushing things at us at all times, is something that people are looking for. And I think that there is some evidence that places like Mastodon or Bluesky are really thinking: how do you have that community feel, that community moderation?
But that thing that they're afraid of, that visibility, is actually the fear of having somebody go and pick up their stuff and turn it into fodder for harassment, having people actually begin to threaten you, and all of the other things that go along with visibility on the Internet. And I don't know what the solution to that is. I did set my Bluesky to private for off-Bluesky viewers. I just didn't see the point, actually, in leaving it public. If people want to see me publicly, they can see me publicly on Threads, and I feel fine with that. Or Mastodon. But you don't need to be public in all places at all times, in my opinion. [00:32:18] Speaker A: Yeah. It makes me think of one of the really interesting things that I have noticed. When I was a kid, when I was in elementary school, everybody wanted to grow up to be either a sports star or a movie star. Those were the things. My kids... the thing that all the elementary school kids and middle school kids talk about is wanting to grow up to be a TikToker or a YouTuber. That's their version of celebrity and their ideal career path. But at the same time, the interactions that they have with each other online seem to be trending away from public platforms and social media. And I was surprised that Snapchat had come back. But Snapchat is huge again, or it's just small group chats in iMessage. They're not using Twitter or its various analogs. They're on TikTok, but it seems like very few of them are actually creating anything or really have a desire to do that, versus just kind of imagining the lifestyle of one of these influencers. So I think you're right. It feels like: I don't need to have these conversations in public anymore. I can just have my small friend group. But all of the energy in building out these platforms is in building things designed from the beginning to be public.
Bluesky is, like, radically public, in the sense that you can't make your account private, and even if you turn off visibility outside of Bluesky, the way that Bluesky stores its data, anybody can query your data. There's no way to have private conversations on it in the way that you can at least make a private account on Threads or Twitter and so on. So it just seems like all of our technological energy is building this thing that a lot of people don't actually want. As you said, Bluesky was designed to demonstrate a certain protocol, it was just a reference implementation of a protocol, but everybody who used it fundamentally wanted it to be something else. And so is the way out of this to just resolve this tension and say: if you don't want to have public conversations, don't have them on platforms that are public? [00:34:43] Speaker B: I think I moved into the WhatsApp groups. The WhatsApp-groupification trend, I think, happened among a lot of folks who work on controversial topics, or who even just don't want constant visibility. You want a place to be wrong, or to have a debate, or to actually learn something, as opposed to broadcast. Right. So moving into the various WhatsApp groups of friends and friends of friends, I found actually really constructive. I had a bunch that were extremely politically diverse, where I was the only liberal in the group, sometimes the only woman in the group, and it would just be a place to have debates and conversations. Some of those actually did collapse as the vitriol and polarization increased on the outside, and it would kind of make its way into the group in terms of what was shared and how we reacted to the main-character drama or whatever the latest culture war was. Others persisted. Right. So it was kind of interesting to me to see which directions things went. But there was a real feeling. I think Venkatesh Rao called it the cozyweb.
You know, the places you could go where you wanted to ask a question or have a debate. I was always kind of sad that we couldn't do that in public. There's the transitive property of bad people, right, where you talk to so-and-so, and the mere act of engagement means it's a liability for both of you, in a weird way. Like, how are you talking to her? Whoa. And there's, of course, the equivalent on the other side. And so I found it sort of depressing, actually. I like debating. I like engaging. I like arguing. And yet it would just turn into, people would literally, with Twitter, speaking of public things, just go through your mentions. I remember one example. I once had a one-sentence conversation with a man I had never heard of. He replied to me; I liked the tweet. And then all of a sudden it turned into, Renee liked a tweet by a Nazi. And I'm like, who even is that guy? I don't know who he is. And I'm not famous. I'm not important in that way. But there's this sense of, oh, well, she liked that tweet, she follows this person, she reads this. It just turned into such a caustic, nasty environment that, I think, you might be surprised at how many people are actually texting with members of the other side and just don't want to do it in public. And I think that's actually bad. That is the worst part of what Twitter created, right: the sense that you constantly had to be arguing, that that was what it meant to be a good member of a particular identity or political party or what have you. And it's actually terrible. I think the open crowds at this point are terrible. [00:37:43] Speaker A: Yeah. I remember the discussions of block lists you could download that were basically, here's a bad person's account on Twitter, usually some figure on the far right, and you can just auto-block everyone who follows them.
And I remember being involved in conversations about that, because these major figures in far-right circles are followed by a lot of researchers and journalists who study or cover the far right. So yes, you're going to block a lot of far-right weirdos, but you're also going to block a lot of people who are actively engaged in combating far-right ideologies. And the response was just, so what? They shouldn't actually be following these people. They shouldn't be engaging with them. They should be finding other ways to see what they're up to if they need to, rather than padding their follower counts. And it was deeply weird, and yes, I think you're right that it's caustic to the way we should interact with each other, and with knowledge, and with paying attention to what's happening in the world. It's died down a little bit, but in the last week there were big fights over Taylor Lorenz's interview of the TikTok woman. And the very fact of interviewing her, even though it was an interview that portrayed her as, I mean, just made it very clear that she's not terribly bright, not thoughtful, just kind of a bundle of grievances and base urges. But the very fact of interviewing her was beyond the pale. And it seems like there's this kind of performative non-engagement or performative ignorance that is part of these cultures: the proper way to respond to bad ideas is to pretend you don't even listen to them. [00:39:40] Speaker B: I think it's a very early-2010s attitude. Right? So there was this selective amplification argument. It's a really interesting paper, I think by danah boyd and Joan Donovan, from back then. The question was, how should media cover these figures? And at the time, they still did not have very massive followings. Right.
So there was a really interesting question about platforming, a term often used in reference to mainstream media, which was still very much where the center of attention, the center of gravity, was: how should mainstream media cover these kinds of rising niche figures? And they were still niche figures at the time. Right. I think Jack Posobiec had, when I went looking this up for my book, something like 50,000 followers when Pizzagate started, compared to the 2 million he has today. Right. So it's a very different dynamic. I would say that as the rise of influencers, as that powerful communication system became ascendant, we still, in some ways, talk about this idea of platforming as if it's still mainstream media elevating a person who otherwise would be unknown. And that's just not true. Fourteen years later, that is just not true. So there's this, I would say, legacy mindset: what might have been, back then, a way to cover it carefully, all the different news-literacy ways of thinking about how you profile or cover a controversial figure who has some influence and reach without elevating that influence and reach. When you've got somebody who's got 2 million followers, they have the influence and reach. It's done. The ship has sailed. Right. And so thinking about it in terms of platforming is, I think, completely wrong. You have to be engaging the ideas. You have to be counterspeaking. And my residual frustration with, particularly, the center left is that it continues to believe that that is not necessary. It doesn't see these two systems as equal, which I do. I really do at this point. Maybe I'm wrong about that, but I just wrote a whole book about it. People can critique it.
But I think that as the centralized media era is also waning, as things have also decentralized on that front, you have got to be engaging with the content and with the ideas. And one of the things that's interesting about this: I remember in 2018 I wrote an article, I think it was actually called "Free Speech Is Not the Same as Free Reach," or whatever Wired called it. But that was where that phrase came from. Aza Raskin and I wrote it because we were actually writing about recommender systems and curation. Right? This was in 2018, as this grievance was emerging that moderation was inherently, quote unquote, censorship, or quote unquote, biased. Right. How could you think about the best path forward, when the idea was that you would simply take something down and then the idea would go away? Even in 2018 it was very, very clear that (a) that was wrong and (b) that was not actually even possible, right? And so then we started to talk about, well, how does curation figure into this? How do you have the Nazis or the anti-vaxxers on a platform and not amplify them? Do you choose to have them both? Maybe you have the anti-vaxxers but not the Nazis, right? And you see the platforms struggling with these same questions. You begin to see, okay, explicitly hateful ideologies come down. Okay, but then you have the conspiracy theory groups, and then there begins to be this big debate about what to do about them, right? And I think in that regard, there was this mindset that because it was, like, three centralized platforms, if you took an account down, it would go away and the ideas would go away. And that is just not what happened. What actually happened is a lot of it went to Telegram and got much more extreme, right? And then it came back.
And this is where I think that our understanding of amplification and platforming is rooted in an ecosystem that no longer exists, or that at a minimum is very significantly waning. And we need to be thinking much more about countering those ideas and how to do that ethically, but also quite clearly. [00:44:00] Speaker A: So if we are moving to this decentralized world, where you can pick your particular server with your particular rules, or compose your own moderation system if it's on Bluesky, but you can interact with anyone in the broader ecosystem of these servers, there might be big players. When Threads enters the Mastodon fediverse, it will be unquestionably the biggest player there. But if you don't like what Threads is doing, you can just move your account to a different server without losing all your followers. So, as you said, it feels like a lot of the arguments we're having now are based on a technology stack and ecosystem that still exists but is clearly waning. Looking ahead, then, assuming that we are accurately describing the new, emerging, decentralized social media landscape, what lessons should we be taking from the last 10 to 15 years of our social media experience in order to try to make this new one better? [00:45:19] Speaker B: That's a great question. I think the initial network establishment is a really interesting question. Right. Bluesky didn't have a way for you to import your old follower graph. Mastodon kind of tried to. Right? There was an effort in November of 2022 or so where you could use these tools; they were semi-reliable. But beyond that question of how you think about what people see, I would say recommendations is actually the thing that needs to be significantly rethought. Recommendations and trending. Those are the two things I always come back to. And the reason for that: content moderation is an end-stage process. Right.
You are already assuming that something has been created, that it is somehow bad or wrong or what have you, and you are dealing with the problem at the end stage, as opposed to asking: are there ways to design systems better earlier, upstream? And groups like New Public are doing this. Ethan Zuckerman is a big person in this space; Eli Pariser is at New Public. The question is: if you design it from the ground up, what do you want to see? And I think there are some real questions around how you decide who to recommend to someone. Are there better ways to think about what the incentive is? There's kind of a flywheel effect with a lot of this. Even in the very early days of Twitter, when Twitter began to create suggested-follower lists, what they essentially do is reinforce. You have some reach, some power, some follower count, and you get more, because the platform thinks, okay, this is a person I should suggest, because this is a person a lot of people follow. And then that becomes a self-fulfilling, self-reinforcing situation. And that's not necessarily, I think, the best way to do it. [00:47:19] Speaker A: Right. [00:47:19] Speaker B: Remember Klout? I feel like we're probably about the same age, right? Okay, so Klout, for those listening who don't know, sort of sat on top of Twitter, and you received Klout scores for how influential you were or how much of an expert you were. Not writ large: it actually recognized that expertise was local. Right. Maybe you were an expert in dinosaurs, and you could have Klout in dinosaurs. And so if somebody wanted to find the dinosaur guy, there was a way to do it. And people made fun of it. I definitely made fun of it. But it was really an idea before its time, I think, in that, you might remember, during COVID you saw Twitter actually scrambling to find doctors to give blue checks to. Right.
To say, like, okay, we've got to find a way. Who are these randos talking about this disease, and how many of them have any medical background whatsoever? Maybe we should be blue-checking and elevating the frontline physicians. So there was this kind of scramble to go and find them and do that. But I think even something as basic as recommender systems that are a little more local, that say, these are the topics I'm interested in, instead of just showing you the people with the largest follower count, who may have built that follower count by being sensational or being grievance mongers or what have you. Maybe there are different ways to surface that, to establish those networks. And then I think on the curation front, it really has to be much more control in the hands of users. There are a lot of people who, having never actually played with the levers of what a curation system can do, think that they want reverse chronological. They don't necessarily recognize that that creates its own incentives. [00:49:04] Speaker A: Right. [00:49:04] Speaker B: Which is for people to be very frequent posters. So actually, in a lot of ways, it becomes very spammy quite quickly. But if you let people have tools where they can kind of push one thing up and push another thing down: I want to see more Black women in my feed. Okay, here we go. [00:49:19] Speaker A: Right? [00:49:19] Speaker B: This is what Ethan Zuckerman's Gobo tool actually did. I don't know if it was as granular as Black women; I feel like definitely women was in there. But you could play with these different levers and see, oh, look at how my feed changes when the algorithmic curation changes. And that, even beyond the ability to shape the feed, gives people some visibility into, oh, this is why I'm seeing so much of this stuff and not this other stuff.
And so it kind of helps people realize, even in that moment: this is the power the algorithms have over what is being pushed into my field of view. What I am reacting to, how I am feeling, is in part based on this kind of stuff. And here is where I have both more control and, at a minimum, can understand a little more about how the system works. I remember in the olden days, in 2018, on Twitter, having conversations with people, this was when the "I'm shadow banned" narrative really began to take off. I would talk to some of these people, right? And I would say, why do you think you're shadow banned? These were accounts with, like, 100 followers. These are not the kind of accounts that would even have risen to the level of the platform being aware of their existence. And they would say things to me like, my friends don't see all of my posts; the platform is shadow banning me. They foundationally did not understand how a curated feed actually worked. And so they felt that it was some sort of reflection of the platform not liking their posts or their content. And that made them feel very aggrieved, which really opened the door, I think, to this persistent belief that the algorithms are out to get you. [00:51:04] Speaker A: Thank you for listening to Reimagining Liberty. If you like the show and want to support it, head to reimaginingliberty.com to learn more. You'll get early access to all my essays, as well as be able to join the Reimagining Liberty Discord community and book club. That's reimaginingliberty.com, or look for the link in the show notes. Talk to you soon.
