Why Tribe Divides Us
Q&A with Jay Van Bavel, NYU psychologist, author, and leading research scientist on the effects of group identities, values, and beliefs, especially in politics
Note from Annie: I did this Q&A with Jay before the terrorist attacks in Israel this past Saturday. The conversation we had seems all the more timely now.
Jay Van Bavel is a professor of psychology and neuroscience at New York University, where he is also the director of the Social Identity and Morality Lab. He is an influential and prolific research scientist, studying how collective concerns – group identities, moral values, and political beliefs – shape our minds, brains, and behavior.
Since the start of 2020, he has contributed to 58 published academic papers, with an additional 9 in press. During that same period, according to Google Scholar, his work has been cited nearly 14,000 times. In addition, he is coauthor (with Dominic Packer of Lehigh University) of The Power of Us: Harnessing Our Shared Identities to Improve Performance, Increase Cooperation, and Promote Social Harmony. Packer and Van Bavel also collaborate on the Substack newsletter, The Power of Us, where they regularly share the science and stories of social identity, group dynamics, and collective behavior.
Triggering tribalism: The sneaker story
Annie: I want to start with a surprising story that you tell in your book. When we think about group belongingness and identity and tribalism, I think our intuition is that we form these tribes based on important things—deeply held values, morality, protection of territory and resources, etc. But one of the first stories you tell in your book shows how small and seemingly insignificant the things are that can cause us to get into this tribal mode. Could you share the story about the sneakers?
Jay Van Bavel: While writing the book, we were trying to find similar things happening in the real world to what we were seeing in the lab and in the science that we were doing. One of the most interesting examples we found was in this town in southern Germany, known as the Town of Bent Necks. The town’s actual name is Herzogenaurach, and the Aurach River divides the north of the town from the south. But what REALLY divided this town was a feud between the Dassler brothers, Adolf and Rudolf, who started making shoes in their mother’s kitchen in the 1920s. Their little business grew. They even made the track shoes worn by Jesse Owens, who dominated the 1936 Berlin Olympics, winning four gold medals.
During World War II, one brother was hiding in a bunker and the other brother and his family came running to dive into the bunker. There are lots of legends of what happened, but one of the stories was that the brother already in the bunker swore, “The bastards are back again,” as the other brother was diving in. He may have been talking about the Allied bombers, but his brother took it as a personal insult. The ensuing feud caused the brothers to break up their shoe company, each opening their own factory on opposite sides of the river. That comment in the bunker, or how it was interpreted, started a rivalry that lasted until they were dead. In fact, it lasted past their deaths, because after they died, they insisted on being buried at opposite ends of the town cemetery.
The people in the town worked at one shoe factory or the other. The factory you worked at determined where you lived, who you could marry or date, where you could eat, or what pubs you could go to. People would walk around looking down to see what shoes people were wearing to see if they could be friendly to them or if they needed to discriminate against them. That’s why it became known as the Town of Bent Necks.
The thing that was most striking was people hating each other because of the shoes they wore, which is not something we normally think of as being an identity that people have, or something that would divide you or determine who you might date or marry or cause you to mistreat others.
Just as a fun fact, the two companies that started from this rivalry became Adidas and Puma, which are two of the biggest shoe companies in the world.
Belonging, distinctiveness, and status
Annie: What do we get out of being part of a tribe? Whether it’s the Adidas tribe or the Puma tribe or tribalism in general, what are the human needs that are being satisfied by being in a tribe?
Jay Van Bavel: The three that we talk about in our book are belonging, distinctiveness, and status.
The first one is belonging.
It turns out that belonging, or a sense of connection with other people, is one of the most basic and fundamental social needs that people have. Among people who are lonely or lack that connection, mortality skyrockets and psychological wellbeing plummets. Belonging is one of the things that makes us happy and healthy. Connection with others can come from being part of a small group, a group of friends who get together and play poker, or people in an organization who value each other and working together, or all kinds of groups: religious groups, communities.
The second need, which seems to oppose belonging, is a need for distinctiveness.
People often want to have a sense of distinctiveness, that they’re not identical to other people. The groups that tend to be stickiest are ones that give you a sense of belonging but also are distinctive from other groups. You can think of this with college. The U.S. News rankings of universities in the United States just got updated and sent out last week. People fight over it, and universities advertise it on the front page of their website. One of the things universities love to brag about is how hard they are to get into, how exclusive. If you get admitted there, you get access to an identity that's very exclusive, where you both belong to the group of people who got into the school but are also distinct from all those who didn’t. Companies have that as well. Getting into a nightclub with a velvet rope and a long line outside confers the feeling that you are part of a distinctive group.
Finally, the third big factor that scratches the itch for most humans is status.
People have a fundamental need for a sense of status. This can be connected to distinctiveness. Like I was saying with universities, getting a high ranking on U.S. News doesn’t just mean you're especially hard to get into; it also gives you status over other universities. People put stickers on their cars showing what university or college they went to in order to signal its status. There are all kinds of ways that we get status: being a high-ranking member in a group, like a president or a manager or a star on a sports team, or being part of a group that's high in status.
Minimal groups: How easy it is to divide us and create ingroup favoritism and outgroup hate
Annie: I want to delve into how easy it is for groups to split people apart. In the Adidas-Puma example, for instance, in addition to giving people in the town a sense of belongingness, it also told them who they shouldn’t socialize with, who they shouldn’t marry, who they shouldn’t like. I’d like to start with some work I know you’ve done, about how these negative feelings can develop, even when groups are formed on a random, arbitrary, or seemingly irrelevant basis.
Jay Van Bavel: Yes, we can see how easy it is to create the corrosive and destructive nature of tribalism by studying minimal groups.
This was research I started doing my very first year of graduate school. I read this study from the 1970s, one of several famous studies on the subject by Henri Tajfel, that blew my mind. Teenage boys were assigned to one of two groups based on whether they overestimated or underestimated the number of dots on a screen. Basically, the two groups were formed on the equivalent of a coin flip. What he and colleagues found in these studies is that the moment the boys were assigned a group, even if they never got to meet members of their own group, even if they had no reliance on members of their own group, even if their group didn't have a name or culture or logo, when they were given the opportunity to allocate money to the other boys, they would allocate more money to members of their own group than to an outgroup. They wanted to maximize the amount of money that their ingroup members got relative to the outgroup.
I've studied this and found the same thing. Just being part of a group, even one formed for the most trivial reason (like a coin flip), leads people to prefer members of their own group. It's called ingroup favoritism. That seems to be something that is very much like a button in the brain that gets pressed and triggers a way of thinking about the world.
I've seen this. I grew up in a small town and we'd play sports all the time. At the hockey rink, we’d determine the teams by putting every stick in the middle and having someone blindly grab them and throw them to two sides. Within a few minutes, people would be violently body checking one another. We would be aggressively opposed to someone who had been a best friend a few minutes earlier. I lived it out long before I knew what was happening scientifically.
What we find is that, initially, the division is mostly driven by ingroup favoritism. In other words, it's mostly about liking ingroup members. You look to them more, you pay attention to them, you think of them as individuals, you care about them, you want to be friends with them.
But once you layer on other things, it flips to the outgroup derogation or negativity or hate. That often happens when you add competition or you add moral differences. When you think of people fighting over sacred land or religious differences or political differences, those are the things that often have been added in to this basic us-versus-them system that can make it go in a negative or hateful or violent direction.
How we see ingroup members (but, notably, not outgroup members) as individuals
Annie: There's something that you said that I want to explore further, about seeing the people in your group more as individuals than people outside of your group, which I assume is then exacerbated once you layer on competition, political or religious differences, and other things. Can you explain what you mean by that? In what way are we seeing people in our group as individuals in a way that we're not necessarily seeing people outside of our group?
Jay Van Bavel: The most famous example of this is called the they-all-look-alike effect, which means that people think that members of other racial groups all look alike, and they get them confused. This happens when white people are looking at black people, and the same when black people are looking at white people, or between white people and Asian people, etc. They can remember individuals in their group, but they get them confused when they're part of an outgroup. Another term for it is outgroup homogeneity.
The members of the outgroup all seem the same to us and we use stereotypes to decide how we're thinking about them. But when we're thinking about ingroup members, once we meet them, we're paying more attention to individual characteristics that define them, and we're more motivated to figure out who they are.
That’s the first phase of a lot of group memberships, but it's persistent, because it affects racial interactions and so forth. A lot of it is just about motivation, who we're motivated to pay attention to. They've done eye tracking studies where they find that if you're assigned to a group, you look at the eyes of ingroup members more. That's one of the ways you distinguish who they are. If it’s an outgroup member, you're just not paying that much attention to those key features.
Political groups and what seems to have gone so wrong with them
Annie: I want your opinion on whether part of the polarization between political groups is related to outgroup homogeneity. Politics is obviously the place where we can most easily map this idea of identity and ingroup and outgroup membership. Are you saying that conservatives are more likely to think that liberals all hold the same opinions and liberals are more likely to think that conservatives hold the same opinions? Or maybe that all liberals are the same type of person, or all conservatives are the same type of person? Is that in that zone or is that a different idea?
Jay Van Bavel: That's totally in that zone. A lot of political beliefs about the other party are badly wrong. Liberals are badly mistaken about what the average conservative thinks. And conservatives tend to be badly mistaken about what the average liberal thinks.
Annie: Why are they badly mistaken? Is it that part of what we need is distinctiveness and if we took the average conservative and the average liberal, they might not be that far apart? But if we took the tails, like the most extreme conservative and the most extreme liberal, now we have a lot of distinctiveness between them. My guess is, and correct me if I'm wrong, that when a conservative is thinking all liberals are alike, they're assigning the most extreme version of liberal, and vice versa, the most extreme version of conservative from the other side.
Jay Van Bavel: Yeah, I think that's a big part of what happens. There's a part of the dialogue on social media, or even in mainstream media, that they call nut picking, where they'll pull the nuttiest person from the other side and hold them up as if they're representative of that group. If you watch Fox News, they have segments like campus crazies, showing the most extreme, absurd things happening on campus. They're not showing the average student who's just going to their classes and has a pretty boring existence. They're showing the most extreme thing. I see the same thing all the time among liberals, doing this to conservatives on social media. They'll find somebody who's really extreme or insane and hold them up as if they're representative. That's something that is really corrosive to developing accurate understandings of what other people believe, because we're more segregated than ever politically in this country.
People move into communities of people like them and develop internet communities of people who are like them. In the rare times they're interacting with people who differ from them politically, they're only seeing these wacky versions, and that is jarring. A dramatic example of nut picking is Libs of TikTok. The account basically just shows videos of liberals on TikTok talking about their political beliefs. But it finds the people with the most extreme beliefs and ways of acting and ways of talking and presents them as average liberals on TikTok. I watch it and immediately feel bad for the people it’s showing, because they get immediately harassed. But I also see what's happening: they're being taken as representative. The people watching that account probably have no idea what the average liberal believes.
Is political tribalism interfering with the accuracy of our beliefs?
Annie: One of the things that concerns me in terms of decision-making in the political sphere is the degree to which tribes are interfering with the relationship between facts and beliefs. We think that our beliefs are informed by the facts on the ground and what is rationally true, and that will inform the decisions we make. But my feeling, from some of the things that you've been talking about, is that beliefs are very much molded by the tribe that you belong to. How much is tribe actually delivering to us our epistemology, what it is we believe to be true?
Jay Van Bavel: I love that question. I wrote my first big paper on partisan identity and beliefs six years ago because it was driving me crazy, seeing all these examples of how much people's political beliefs would flip to match the beliefs of their political leaders or party.
A well-known recent example is the NFL. I remember looking at this data right around the time Colin Kaepernick was kneeling to protest police violence against black Americans. The NFL had been the most popular sport in the United States for many decades. Someone posted a plot of the data showing the percentage of Democrats who liked the NFL and the percentage of Republicans, and it was pretty much 75% for each party, for years and years. Then, all of a sudden, you see how much Republicans like the NFL plummet, and on the graph they had marked when Donald Trump tweeted criticism of the NFL because he didn't like Colin Kaepernick kneeling. It was a dramatic shift, it was sudden, and it immediately followed those criticisms.
One way of thinking about it is just that if you're part of a political party and the only people you trust are members of that party or news sites or news media connected with that party or that leader, that's going to be something that shapes your political beliefs pretty dramatically. It’s going to allow you to tune out what other media sources or leaders are saying. I think we've reached a point in America, thanks to extreme polarization, where there are large segments of the population who only trust a small subset of media sources or political elites, and their beliefs shift dramatically based on what those elites say.
You see this in America, where belief systems are moving further and further away from any sense of shared reality of what's true, and it's affecting things. It's one thing if you suddenly shift what you think about the NFL. But if you don't trust vaccines anymore, that has huge consequences for your health in a pandemic. Or if you don’t trust nuclear power, that has huge consequences for climate. That's where it starts to become really potentially dangerous for people.
How do we maintain a consistent sense of self when our political tribe shifts our beliefs?
Annie: I’d love for you to address what seems to me like a potential paradox. Our deeply held beliefs, as well as feeling those beliefs are valid and consistent, make up an important part of our identity. But doesn’t that conflict with our identity based on political tribe, where the tribe can abruptly shift positions on some of those beliefs? How can we maintain our sense of self as consistent over time when we see these big shifts in beliefs? I feel there’s some rationalizing and retrofitting going on. Do I have that wrong?
Jay Van Bavel: You have it right. You've cued in on these weird things that go on in the world around us. I also feel like maybe we're living in a weird time where we just see more evidence of this craziness nonstop. I do think there are some beliefs that are probably more deeply held, maybe your attitude toward your spouse or parenting in general.
Annie: Do we process moral transgressions, as an example, of our own tribe differently than the way that we process moral transgressions of the other side? When we're looking at the other side, we're saying, “That's so hypocritical because they were letting this person get a pass,” and now when it's our side, they're just all over it. Is it hypocrisy or is it something else about the tribal nature of the way that our brains work?
Jay Van Bavel: There's a famous study on this by Piercarlo Valdesolo and David DeSteno, that we’re about to run a replication on. There’s individual hypocrisy, where people will say something's wrong when someone else does it, but then they'll do the same thing and say it's not wrong. They found that same thing happens in groups. We will think it's wrong when an outgroup does something, but it's okay if our group does the exact same thing.
Annie: The way I’m using “hypocrisy” includes self-knowledge and self-awareness that you are applying a different standard to one thing than the other. Is it hypocrisy without that knowledge and awareness? Or is it inconsistency in the rules that you're applying to your ingroup versus your outgroup, where you are not necessarily aware that those are wildly different standards? In the group dynamic, it's not necessarily that people are like, “I'm going to go lie and apply a different standard to my ingroup than my outgroup.” This is just naturally the way that we're processing the world. We may not even be aware that we're applying these different standards.
Jay Van Bavel: Your instinct is right. We have some studies on this, run by a former student here, Daniel Yudkin. He's now at More in Common, where he works on these things in the real world. We found that, when people were looking to punish ingroup or outgroup members who did a bad thing, they punished outgroup members more harshly than ingroup members. They were more biased, and punished outgroup members more harshly, when they made rapid judgments or when they were under cognitive load. In other words, when they were distracted and thinking about something else, they treated outgroup members worse.
It's exactly consistent with what you're saying. It's like a lot of these biases are happening either when we're just making gut decisions or when we’re distracted, and they might not be fully deliberative and carefully calculated. For the average person, our data suggests it's happening spontaneously or reactively, and that's where you're getting the biggest biases or the most hypocrisy in judgments of right and wrong and good and bad.
When one group tries to cut you off from other groups
Annie: We have different groups that we belong to, but sometimes a group we belong to can come into competition with another group we belong to, and I'm curious about how that gets resolved. Obviously, one of the big ways that we belong is belonging to our own families. But people end up in groups where they abandon old groups, including their families. I think that happens quite a bit in cults.
Jay Van Bavel: One of the things that cults do really well is disconnect you from other sources of social support and love, but also from people who can provide you with a reality or sanity check. They get you to spend less time with those people. They get you to leave your old job. They become your main social support. You can see this happening, at least as a means of separating people from other sources of information, with political groups.
What they're trying to do is cut you off from other sources of reality. You rely more and more on information from your own group, and you become more and more dependent on them about what's true, what's false, and then they do it socially. That’s something that happens all the way up to political parties.
It's a way that these groups cut you off from others and it comes from cults. Cults cut you off from other people, and then your family stops talking to you. That gives the cult even more power over you. Then, it becomes even harder for you to leave the cult. Let's say you start getting abused by the cult or your money gets taken by them or you get mistreated. It's harder and harder to leave because you don't have a family that you're connected to that you're in touch with to go back to.
It's actually one of the reasons why I tell people, if you have a family member who's gone to QAnon, you don't want to cut them off, as much as they're offensive to you or deranged. Because if they're ready to leave, they might need you as a voice that they can trust to listen to them and help them leave. If you have a friend in an abusive relationship and they won't leave, they might at some future time be ready. If there's no social support there, it's impossible for them to leave. You have to keep that line of communication open.
Can you vaccinate people against misinformation? And if so, how?
Annie: Obviously, misinformation is mainly a problem when it's coming from within your tribe. If it’s from someone who is out of your tribe, the opposing political party, you're probably not going to believe it, and vice versa. But I've seen some good news that we can vaccinate people against misinformation or we can correct people's beliefs.
Here's the intuitive belief, I think: most people are rational. If you believe something that's wrong, and I tell you what the actual truth is, then your belief will move in the direction of the truth. But I feel like our lived experience, to use that term, is that nobody listens to anybody who's not in their group. But then some of the scientific literature is saying, no, they do. You can vaccinate people against misinformation, and you can correct misinformation. What's right? Is our lived experience correct? Or is the science correct?
Jay Van Bavel: The biggest thing, by far more powerful than whether something's true or false, is simply who said it. Democrats trust other Democrats whether it's true or false. And Republicans trust other Republicans, whether it's true or false. Fact checks can correct for that but have pretty small effects. Fact checks from an outgroup member have the smallest effects. Whereas if you hear something wrong and an ingroup member fact checks it, you're much more likely to update your belief. We find the same thing for Democrats and Republicans.
That’s something that tells me we still need fact checks. But ideally what we would have is robust fact checkers across the partisan spectrum doing this job. I don't know how to do that well. Maybe what you need for the upcoming election is someone from Fox News, someone from the New York Times, and someone from the Associated Press who's totally neutral, all on a panel that fact checks every major claim. Then, when they converge on the same answer, that might provide the most trustworthy evidence, no matter what your place is on the political spectrum. We should try that as a study, but I think that's why it's important.
Annie: When we think about exposure to different points of view, it used to be that maybe there were block parties or neighborhood parties or whatever. Your neighbor might be a Democrat and you're a Republican, and you're hanging out, and you see them as an individual in the way that you don't necessarily see outgroup members as individuals. You figure out that you have more in common than differences. I think we all get the sense that the introduction of online communities has broken that apart. You're pointing out that you're going to see the weirdest of those people who disagree with you, and that's going to distort your view. Are we communicating across tribe, or are we mostly communicating within tribe and amplifying the effects of tribe on our identity and what we believe?
Jay Van Bavel: The cool thing about the internet is you're exposed to more perspectives than you ever would have been before. That held all this promise that it was going to be a globalizing force that would make people more open-minded. But the reality is much different. There's a paper by a computer scientist at Cornell who analyzed Twitter's algorithm. If you compare what you see on the algorithmic feed versus a feed of the accounts you choose to follow, the algorithmic feed serves up more polarizing, moralized, extreme content that often makes you feel more negative about the other party. You might see positions from the other party, but they're the most extreme ones, and it makes you more negative about them.
That’s the consequence of the technology and the platform. The way it has been monetized is that's what keeps people online. That's what keeps them engaged, that's what makes them comment. They get worked up. The company makes money by sending you these extreme representations of other groups that you don't like, rather than a reasonable, nuanced person from another party who might open your mind or make you realize there's a perspective you hadn't heard.
I got hired to analyze data from International House, which is this residence in New York where visiting graduate students from all over the world can live. We gave this big survey to 50 years of residents who lived there about what their experience was, and what they learned from living there.
I was reading the comments, thousands of them, and almost all of them are like, “I'm way more open-minded about people from different cultures,” and “I never understood their perspective,” and “it changed how I think about how I interact with people,” and “it made me more willing to engage with people.” They would talk about the need for the art of compromise and just living there for a year or two with all of these people from different backgrounds. I'm not just talking about political backgrounds. That was part of it, but also cultural, religious, ethnic, and national identities. Having to live with them and talk with them and engage with them about political issues and stuff like that over dinner every day changed their attitudes. It made people understand diversity in a deep way and how to navigate it. I just think we're creating a world where people are not getting trained how to navigate diversity of any sort.
The good news – and there is some
Annie: Alright, this has been a lot of bad news. Do you have any good news for us, Jay?
Jay Van Bavel: I guess the good news is that there is a rich science on this. A lot of the lessons about how to make things better have been tested and found consistent over and over and over again. I'll give you an example of it because I'm all about data. This group out of Stanford developed the Strengthening Democracy Challenge. They invited teams of practitioners, scientists, and members of the public from around the world to come up with ideas to reduce polarization and increase support for democracy and reduce partisan violence. Out of 252 teams that submitted proposals, they picked the top 25 and then tested them in a huge sample of 32,000 Americans, a diverse representative sample.
We submitted an intervention based on my book, on the notions of building shared identities and challenging these false stereotypes about the other party. It was tied for first among the most effective interventions at reducing polarization. Actually, the effect size was third largest, but my students like to point out it was statistically identical to the best one. We submitted something that's pretty consistent with 70 years of research in social psychology, and it had a huge effect size. They followed up with these people two weeks later, and the effect persisted. Huge sample, the best study I've ever seen of this type. I was not even involved in running the study, which was also great: the researchers invested in finding evidence for the theory had no hands on the data whatsoever. That's a great scientific approach.
One of the other best ones was basically showing that Heineken commercial of people bonding while building a bar together, despite learning of their political disagreements on key issues. Another was about simply letting Republicans and Democrats know and understand that they're being mischaracterized by the media. Those were the three best interventions. All of them are basically things we talked about today. All of them had pretty big effect sizes, in a diverse sample, in a pre-registered experiment with randomization. We have solutions. Those are three of them that have big effect sizes, work well across the political spectrum, and last for multiple weeks. We need to scale stuff like that a lot more.
The other thing is that I'm a co-founding partner of a group called Starts With Us, from the Lubetzky Family Foundation, creators of KIND bars. They see the kind of moralized, intractable, violent conflict you find in the Middle East emerging in the U.S. They have this incredible group of people, but their whole premise, their whole model, is that if you see a problem, you should join our movement and make changes in yourself. It's very individualistic and has a massive selection bias problem. I think the only people who are going to opt into this are people like you and me, who probably don't need it nearly as much.
You might remember, from a conference we both attended, Josh Fryday, Chief Service Officer in the California Governor's office. They've created a special ministry for him, California Volunteers, and he wants to have something like the Peace Corps, but in California, communities helping people and talking to people not like them.
I think those are the types of solutions: if you take the principles of the science we talked about and embed them in large, scalable, real, practical environments, that's where you'll get change. I don't think you'll get change otherwise. I don't think enough people will opt into this change.
Annie: We need team building and trust exercises, the universal trust falls that have been done in the corporate world for so long, but we need them for society.
Jay Van Bavel: You need something like that, where everybody's forced to go and spend a day doing trust falls at a team retreat and they can't opt out of it. You need something like that, done at the school level across the country.
Annie: Have you seen differences in tribalism within a country for countries that have mandatory military service of some sort?
Jay Van Bavel: I have not looked at that. My understanding is that it builds a sense of shared identity. Maybe something like that, a civic service or civil service. I like the California model, which is, if you're in a college or university there, you get a $10,000 reduction on your tuition each year just for doing a certain number of volunteer hours. You get this personal benefit, but it's also a benefit that you end up giving back to society. If only there were some way to do civil service like that at scale.
Annie: That's what we have to figure out. A big American trust fall.
Jay Van Bavel: I've been in conversations like this for two years with the people doing this in California. They originally wanted me and Dominic Packer to come in and measure it and test it. We’re like, “If we're going to do a study for you, it'll cost you a lot of money. And if you hire us and it doesn't work, we're going to have to tell you in the report. You've got to be ready to be wrong before you ever hire me.”
Annie: That would be incredible.
Jay Van Bavel: But they have to be ready to be wrong and this guy's job probably hinges on it. That's the tricky thing. I would love for the government to commit to funding research on whether this works and then scaling it. I'm always a fan of test-small-and-if-it-works-scale.
I think corporations should also be involved, because guess what? Corporations are getting sick of being polarized. They're getting sick of taking stances and then getting boycotted by everybody who doesn't agree with whatever stance they took. I think there's probably an incentive there somewhere. I could imagine that being a thing that could catch on. That’s one place where I could see potential wide-scale mobilization around some cultural changes.
Or else in the educational system. That's another place where I can imagine that it becomes mandatory in some state to take a civics class or something like that, and it involves these types of things.
Annie: I’m glad we ended on a hopeful note, which probably doesn’t happen very much these days in talks about our political divide. Thank you.