NHPBS Presents
Social Media and Democracy
Special | 50m 15s | Video has Closed Captions
A William W. Treat Lecture with Harvard Law School professor Lawrence Lessig. This free event will focus on the influence social media has on democracy in the United States and we'll discuss ways to honor our values in this changing landscape.
NHPBS Presents is a local public television program presented by NHPBS
So you might remember Steve Martin's 1979 film The Jerk.
1979 was a simpler, maybe happier time in America.
And Steve Martin in that film plays a character, Navin Johnson, whose story is the very best of America.
Navin invents a clever device to keep your glasses from slipping off your nose.
The Opti-Grab takes off; the country goes wild.
And this great innovator is rewarded with extraordinary wealth.
It is the American dream.
But then it is discovered that the Opti-Grab causes its users to go cross-eyed.
And very quickly, as Navin is held responsible for that harm, his wealth disappears.
The Opti-Grab is discontinued.
Navin finds happiness despite no longer having endless wealth.
Now the part of the story that's significant to me is the combination of great reward for innovation and true responsibility for harm done.
Navin was innocent at the start.
Maybe negligent, but certainly not evil.
But by discontinuing the product that harmed others, he became innocent at the end as well.
Now, this is not a typical pattern in America today.
Think of two products along the way to considering the focus of my talk today: social media.
First, think about cigarettes.
It's easy to forget that there was a time when smoking was uncomplicated.
People liked it.
No one had a fear about it.
Indeed, some thought that smoking was actually helpful for asthma and other lung diseases.
It was the age of innocence for cigarettes, and in that age, great entrepreneurs were incredibly innovative in increasing the production of cigarettes. In 1881, James Bonsack invented a cigarette rolling machine that could produce 120,000 cigarettes per day, and that capacity only increased through the First and Second World Wars, when cigarettes were provided in the daily soldier rations.
But then in the 1950s, first the British and then the Americans began to release robust research reports demonstrating the devastating health effects of cigarettes.
For a time these studies were contested, but by the early 1960s, no one could doubt that they were correct.
So what did the tobacco companies do?
Well, they certainly didn't follow the example of the inventor of the Opti-Grab, Navin Johnson, though they too were innocent at first.
Rather than accept the science, they dissembled to sow doubt about the science.
For almost 50 years, they continued to press their claim that cigarettes were not harmful and pressed through lobbying and campaign contributions to ensure that Congress did nothing against them.
Only when the courts began to find them liable for the harm they knowingly imposed on the world did they finally back down. Millions have died because of their dissembling; Stanford's Robert Proctor estimates 6 million souls per year.
Or think about a second product.
Ultra-processed food.
Food production, of course, is the quintessential good product.
America became the breadbasket of the world because we became so fantastically efficient at producing food.
Inventors developed extraordinary technologies to improve farm yields.
Early in the last century, we became net exporters of food, producing more than we could eat.
Then the food scientists came along, exploring and experimenting with ways to make food even more compelling, long lasting, easier to ship.
The scientists sought the perfect mix of salt, fat and sugar to basically hack evolution and drive us to eat more than we otherwise did.
These companies that did this, the growing agribusiness of America, were not evil.
They were just trying to do what every good business does make more money.
But as their scientists continued their work and processed food became ultra-processed food, health officials began to voice concerns. The consequences of the diet America was increasingly consuming, because of the innovations of ultra-processed foods, were devastating for our health.
In the early 1990s, the adult obesity rate in the United States was about 12%.
By 2020, the rate was 41.9%.
Childhood obesity in 1970 was about 5%.
By 2020, it was 19.7%, affecting about 15 million children and adolescents.
Now, don't get me wrong, I don't believe any of these companies hated America.
I don't believe they produced what they produced because they wanted to make America sick.
Too bad for us that the food that is most compelling to us is also the food that is most unhealthy for us.
Agribusiness didn't create evolution, but agribusiness exploited evolution.
And when companies like Kraft tried to reform what they sold to offer healthier foods, we were not interested.
We had been trained to like what we liked.
And when Kraft wouldn't give it to us, we went somewhere else.
Okay, so now consider social media, like cigarettes and processed food.
When social media was born, the world celebrated this extraordinary innovation.
I still remember its highlight, the Arab Spring, when we all told ourselves that social media had actually felled dictators in Tunisia and Egypt, and that the same technology was giving all of us a chance to see what happened to that high school friend who wouldn't date us, or who did date us, but dumped us on the night before the prom.
That actually didn't happen.
But anyway, what could be better?
What could possibly go wrong?
Well, advertising for starters, though for many years no one quite knew how the internet would ever pay for itself.
Early in the 2010s, social media platforms decided it would be advertising that would pay for social media.
So the geeks, like the food scientists with processed food, began to explore how best to exploit evolution to keep us glued to our screens, so that there would be more time for them to show us the ads they wanted to sell.
We are evolved to respond best to random rewards, so they built random rewards into the platform.
We can't resist bottomless pits of content, so you'll notice your Facebook feeds or news feeds or Twitter feeds just never end.
Now, at the start, no one quite knew what kinds of content would work best.
Some were very hopeful, but very quickly, as the AIs they deployed determined what would engage us more, they discovered that the more extreme, hate-filled, and misleading the content was, the more we watched it.
We just couldn't turn away.
Tucker Carlson would always beat Walter Cronkite, Donald Trump would beat Jeb Bush.
So by the middle of the 2010s, the machines had gotten very good at understanding what sort of stuff would engage us the most.
The answer?
The worst sort of stuff that could be spewed across the platforms.
The stuff that would make us hate our neighbor if our neighbor was from a different political party, or his kids loved in a different way, the stuff that would make Mark Zuckerberg rich was the stuff that would make American democracy extremely poor.
Now, again, as with food and cigarettes, none of these companies wanted things to turn out as they did.
Mark Zuckerberg doesn't hate America, and I'm sure he wishes that the most profitable content for his platform would also be the most edifying or solidifying for American culture.
But when it turned out that it was not, Zuckerberg in particular, and social media in general, did not do what Navin Johnson did: none discontinued their damaging products.
They instead continued to push and to develop the AI that would even more effectively get us to engage, even though the consequence of that engagement is to weaken and maybe destroy the fabric of our democracy.
This is the part that always gets me.
I would understand the argument that we need to destroy democracy to save the planet.
I don't believe it, but I would understand it. Or I would understand the argument that we need to pause democracy to deal with a famine.
I don't believe that either, but I would understand it.
What I don't understand is the idea that we need to destroy the fabric of our democracy, so that Mark Zuckerberg can be even richer.
I don't understand why we are sacrificing all that we are sacrificing, just so that advertising revenues can continue to climb, because that is what drives our media today.
Advertising revenue.
It is advertising revenue that makes social media and cable television be as it is.
All of the harm that these platforms are causing is being caused to make what we used to call Madison Avenue, but what is now 1,854 square miles in California, Silicon Valley, insanely rich.
Now, we're not going to solve this problem by November.
But there are things that we can do, that we must do, each of us, independently and together, now, to resist that problem.
First, each of us must reflect constantly that if we don't understand how others believe as they do, then that is because we don't understand.
We need to keep at the front of our minds that we are all living in a world where we are all being played, where the platforms of media for all of us are focused not on spreading truth or understanding or strengthening democracy, but instead focused on how best to engage us and to keep us engaged.
And so that means, second, that we must learn how to speak across the gaps that this technology is rending in the heart of our society.
They don't want to poison us.
They just have to.
And though I do believe that we need to develop a stronger and more united response to those who would poison us just so they could be richer, the more important response right now is for us to learn how to love our neighbor again.
Now that's easy for someone like me.
I live in Brookline, Massachusetts, just across the river from the Republic of Cambridge.
There are no MAGA hats, or if there are, they are worn in the privacy of someone's own home.
But it's different here.
New Hampshire is beautiful because of its diversity.
The most important diversity for you here: ideological diversity.
The challenge I am describing is something you confront daily.
And so maybe you can teach us.
Maybe you can develop the practices that we all need as citizens, as neighbors, as parents to practice the love of country first and of the potential this country has.
If only we could be liberated from a business model that profits from teaching us to hate one another.
Now they may well say, echoing The Godfather, that it's not personal.
It's strictly business.
But it is personal to me.
And it should be to you, because what they're destroying is not our business.
It's our country.
It is our ability to govern ourselves.
It is the future for our children.
None of that is worth sacrificing forever for anything.
Certainly not for a corner of California that needs, it says, to become even more insanely rich.
This is a terrifying moment in the arc of American democracy, and the hopeful will tell us the arc bends towards justice.
And I hope they're right.
But what I know is that if we all don't take responsibility for how we engage with our neighbor, if we don't understand that we're being steered as we are for reasons completely unrelated to the values of democracy, if we don't demand a return to the culture that can debate and engage on democratic issues the way New Hampshire has in the tradition of the presidential primaries for so many years, this democracy will be lost.
And if there's a history in 100 years, they will look back and ask, how could so much be sacrificed for so little?
How could they sit back and allow what they had built to be taken away?
So a tiny, tiny few could be even more privileged.
We can stop this.
I want to ask you to try.
Thank you very much.
I've got questions, many of them from high school students, some from adults.
And I think that they will give you the opportunity to go a little deeper into some of the themes that you've raised.
Our first question is from a student named Ellie, who's 16 years old.
Do you agree with this statement?
Social media helps democracy because we as teens are more actively making connections and learning new information.
So I agree that technology could help democracy by enabling people to engage exactly as Ellie, and I think I know which Ellie this is, describes, if there weren't a constant pressure on the platform to steer people in directions that lead them to engage in unproductive, destructive ways for democracy. You know, the European parties, as revealed in the Facebook Papers (I had the honor to be the lawyer for Frances Haugen, the Facebook whistleblower), came to Facebook leading up to the 2016 and later elections and said: you know, you changed your algorithm, and now the only way we can say things that will be spread to our whole list is if we say inciting and destructive and hateful things; you're forcing us to become hateful just to compete.
And Facebook's answer was, that's the way we get people to engage.
That's our business model.
And so I don't want to target the technology.
I want to target the business model.
Their model is to force engagement.
And if that's their model, they will do it by exploiting our weaknesses.
And those weaknesses turn us into...

For young people, just to follow up: they're experiencing that reinforcement all the time when they post something on TikTok and lots of people watch it, which feels good. But our kids are also feeling the pain, maybe more directly than anybody else, because we understand the mental health effects of being body-shamed or of always thinking that they're not good enough. You've probably spoken to rooms full of kids. How do you help them understand what's happening to them, and help them see that they have some power over the thing they're getting sucked into?
There's a fantastic book called The Outrage Machine, and it has a great metaphor. Imagine you have a dog, and in the morning the dog brings you three things. One thing it brings you is a cuddly toy. Another thing it brings is a pair of socks. And the third thing it brings you is a dead rat.
And then you, of course, react to the dead rat.
You're outraged: oh, my God, how could you do this, puppy? You don't bring a dead rat in here. But the dog just records the fact that it's the dead rat that has excited you.
So what does the dog do?
It goes out and gets another dead rat.
And more dead rats.
Because the dead rat reaction is what it's seeking.
It's seeking the attention.
Now, when you think about it like that, I think kids are pretty sophisticated in understanding that they're being played, because they will see it themselves. They can post something that's really smart and insightful, and it gets two likes or two retweets; and then they can post something that's really hateful or aggressive or outrageous, or where they're dressed in ways that none of us parents would want them to be dressed.
And they get hundreds, maybe thousands of retweets.
So they see that the mechanism is driving them to the kind of content and behavior that they themselves don't want.
And many of them, if they have the courage, find a way to to back off, to pull away.
But I think Jonathan Haidt's recent book, The Age of... oh, it's a title I always blank on.
Yeah.
So it's a fantastic book.
But it's just come out, and it describes the mechanism, especially for children, especially for girls. And it's hard not to be terrified about its effect on our kids, independent of its effect on democracy.
But it's the same dynamic.
Yeah.
And we all fall prey.
I was just told that NHPR has its most-viewed TikTok ever.
Over a million views.
And it's of protesters at Dartmouth shouting at each other over Gaza.
Yes.
So, yes.
Amanda from Penicuik asks, this is a simple one, so you can answer it in just a couple of sentences.
How do we balance the values of free speech and free press with the dangers of misinformation and deliberately, deliberately divisive rhetoric on social media?
Yeah.
So it's an important question, because some people look at the problem of social media and say, we need to filter out the bad content and keep only the good content; we need to filter out the misinformation, or mark the misinformation.
But what the engineers at Facebook discovered, when they first began to see this problem, is that it's very simple technical decisions that the platform makes that help amplify this terrible content.
So, for example, Facebook did an experiment where, after two reposts, if you wanted to repost the content, you would have to copy the link and post it yourself.
So it kind of slows the process down.
And it's not picking between, you know, outrageous content and not-outrageous content, or Republican content. It's just saying: after two reposts, you're going to have to take an extra step to repost it.
And that simple friction led people to think a little bit about what they were posting, and it radically reduced the amount of misinformation or inflammatory information that was being posted.
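The experiment described here amounts to adding friction once content is far from its source. Here is a minimal sketch of the idea, assuming a hypothetical two-reshare limit and a toy post structure; none of these names or thresholds are Facebook's actual implementation:

```python
# Hypothetical sketch of repost friction: past a reshare threshold, the
# one-click reshare is refused and the user must copy the link and post
# it manually. Threshold and data model are illustrative assumptions.

RESHARE_LIMIT = 2  # assumed value, echoing the "after two reposts" experiment

def can_one_click_reshare(reshare_depth: int) -> bool:
    """Allow the frictionless reshare button only near the original post."""
    return reshare_depth < RESHARE_LIMIT

def reshare(post: dict) -> dict:
    """One-click reshare; post is a dict like {"text": ..., "reshare_depth": 0}."""
    if can_one_click_reshare(post["reshare_depth"]):
        return {"text": post["text"], "reshare_depth": post["reshare_depth"] + 1}
    # Friction: the caller must copy the link and create a fresh post instead.
    raise PermissionError("Copy the link and post it yourself.")
```

The point is not the particular threshold; it is that removing the cheapest action forces a moment of deliberation, which is exactly the effect the Facebook researchers reported.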
Now, they took this report, this research, back to Mark Zuckerberg and said we could solve a huge chunk of this problem just by slowing down the repost.
And the response was: yeah, but that would slow down engagement, and therefore reduce our numbers, and therefore we would be punished on Wall Street in our stock price.
So they didn't do it, and they didn't do it leading up to the events of January 6th either, because they had turned off all of the dampening technologies that they had put in place during the election.
So there are ways of doing this that should not violate the fundamental principle that we don't want people picking and choosing the content that we are uttering, within the realm of decent, non-hurtful, non-destructive engagement.
We could have that if we could give up on the idea that the objective of the platform is to maximize engagement no matter what.

At the risk of inviting you to nerd out, which will satisfy me but may make me lose the audience: I'm friends with a guy who's a telecommunications lawyer in Washington.
You probably know him.
A guy named Larry Irving.
Yeah.
Who basically takes credit, and blame, for the section of the Telecommunications Act that makes the platforms not susceptible to defamation claims the way NHPR is, or Fox News is, or the New York Times is. Do you believe it would make a big difference if Facebook and Twitter and TikTok were subject to those laws and could be sued for libel and slander and such?
So the geeks will recognize you're talking about Section 230, right, of the Communications Decency Act. And what Section 230 says is that, for a broad range of content, the platform is immune from liability, with certain exceptions, like copyright: you can be sued over copyright, and over certain child sexual abuse images. But beyond those, you're immune.
That provision was crafted at a time when the computers were not smart enough to understand what was on the platform. The platforms were not targeting or spewing or pushing or amplifying or suppressing messages based on their content.
Today, the AI in these platforms is extremely sophisticated, and it knows what the content is.
So the conditions that made it make sense to have Section 230, in 1996 I guess the statute was passed, no longer exist.
There's no reason today to say that if your platform is amplifying certain kinds of content, then, at least once you've been notified that the content is defamatory, you shouldn't be liable for continuing to amplify it. There's no reason not to create that kind of liability.
But here's I think the most important point about this.
You know, lawyers, especially like to think that liability through courts will solve the problem.
And I'm sorry to tell you this, but the law is pretty bad.
It's pretty inefficient, and it's pretty incapable of dealing with this just by liability.
The only way we address this problem is by addressing the business model of engagement, so long as they are focused on how to steal every minute of your time, they will do whatever they have to do to get you to engage.
And if they drop the misleading content or the false content, they'll find other ways to get you engaged.
Now, I think we could address the problem of the business model. I think we can. For, you know, here's a word you can't utter in Washington, but let me utter it: you could tax engagement, right? You could, to be really geeky here, have a quadratic tax on engagement, whatever the unit is. If it's one unit of engagement, the tax is one. If it's two units of engagement, the tax is four. If it's three units of engagement, the tax is nine.
So the point is that the tax goes up.
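The quadratic tax described in the talk is easy to state as a formula: tax = rate × units². A minimal sketch, where the unit of engagement and the rate are assumptions for illustration rather than any proposal's actual parameters:

```python
# Sketch of the "quadratic tax on engagement" idea from the talk.
# The engagement unit and rate are illustrative assumptions.

def quadratic_engagement_tax(units: int, rate: float = 1.0) -> float:
    """Tax grows with the square of engagement: 1 unit -> 1, 2 -> 4, 3 -> 9."""
    return rate * units ** 2

# The marginal cost of each additional unit keeps rising, so a platform's
# incentive flips from maximizing time-on-site to limiting it.
for units in (1, 2, 3):
    print(units, quadratic_engagement_tax(units))
```

Because the marginal tax on the next unit grows linearly (the difference between n² and (n-1)² is 2n-1), each extra minute of captured attention costs the platform more than the last, which is the whole point of the design.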
And all of a sudden it's as if Mark Zuckerberg is saying, hey, go get a life; like, you know, you've been on long enough, we don't want you here anymore.
And the point is, it would change the way the platform tried to do its work.
So it wasn't about pulling you in at all hours of the day, constantly checking your feed.
It was instead trying to find business models that weren't prey to that type of incentive.
I think the only way we solve this is to get them away from this particular business model.
And again, I think that ties up why this is so outrageous.
Wait: a business model for, you know, literally five companies, the most profitable in the space, a business model for these five companies that is having this destructive psychological effect on individuals and on the democracy, is the thing we can't do anything about?
I mean, this is what governments are for, to decide when intervention is necessary, based on whether what you're doing is harming society.
And this is a classic example of that.
And we see efforts to regulate get going; Senator Klobuchar, I remember, a couple of years ago there were hearings when the Democrats controlled the Senate and the White House. Is it the fact that Mark Zuckerberg has all the money in the world that is stopping Washington from doing something about this?
Well, surprise, surprise, money is at the root. You know, Granny D's message, when she made her walk across the United States at the turn of the last century, is still true today. But true two orders of magnitude more. And so right now, you see, I just read an article today about how the lobbyists for the tech companies behind AI have successfully neutralized the efforts in Congress to try to do something about AI.
And so, you know, the reality is pretty trivial: when you've got members of Congress spending 30 to 70% of their time raising money, they are extremely dependent on the people who give money.
And that's not even talking about the super PACs, which get to sweep in and spend millions of dollars in these particular races.
And the threat of the super PACs means that you're constantly making sure you don't upset the people who are going to spend that super PAC money.
So we have a Congress that is extraordinarily dependent on the tiniest fraction of the 1%.
You know, Madison promised us a democracy where the House would be, quote, dependent on the people alone.
And to clarify, he went on to say that by the people he meant, quote, not the rich more than the poor.
Well, we've totally corrupted his vision.
We have a Congress that is extremely dependent on the extremely rich.
That's who they care about first.
The first step to getting into Congress is: can I excite the attention of enough super-rich people to fund my campaign?
And so long as that's the reality, the idea of our Congress taking on hugely significant issues will just not be possible.
Going in a different direction, because I'll just get sadder and sadder.
I'm sorry.
All of this has obviously, you know, pulled the business model out from under local news media, among other things, which is obviously something dear to my heart.
Kaylee, who is 18, asks, should news organizations be allowed to self-police their own content for misinformation?
Why or why not?
Yes, news organizations absolutely are in the business and should be in the business of policing their own content.
So that's called editorial judgment.
As long as it's humans making the editorial judgment.
I think, you know, lawyers especially are quick to slide the values of the First Amendment from humans over to machines.
Google about a decade ago funded a whole bunch of lawsuits to establish that an algorithm is protected by the First Amendment.
And the reality is, if the First Amendment protects these machines, we're toast, because there's no way we can regulate, against the burden of the First Amendment, to make sure these machines are not causing harm to us.
So as long as it's a human, absolutely, the law should step aside and say, you make your judgment and your readers decide whether to continue to subscribe to you on the basis of that judgment.
But when it's machines, I think it's a different world. It's not that there's no protection of any kind.
So you couldn't have a law that says the machines have to filter out democratic content and only show Republican content.
You can't have content-based or viewpoint-based restrictions, but I do think you can begin to have restrictions on the machines to make sure that they are not engaging with us in ways that are destructive to our democracy.
We've got, basically, and I think of it this way as an old New York newspaper guy, machines that are like the best tabloid front-page headline writers of all time.
Right.
And would you let the headline writers at the New York Post, you know, control society?
Yeah, they basically do.
I mean, there's a great story about this. Most of the time these companies say, well, it's actually not machines, it's humans.
But here was one case where Facebook was willing to admit it was the machine.
In 2017, ProPublica, one of the great journalism resources in America today, published an article demonstrating that Facebook's ad machine was selling ads to, quote, Jew haters, meaning you could target Jew haters if you wanted to sell a certain product.
And when this was discovered, Facebook quickly said: hey, wait, we didn't do that.
No human created that category.
The category was created by Facebook's AI. Facebook's AI watched the public on Facebook and figured out there was a segment of the public, Jew haters, that it could target if it just created this category.
So it created the category.
So, we're not responsible, they said. They were not responsible.
Right.
But the point is, if that's true, if they're not responsible, then those machines shouldn't have any First Amendment protection.
They should be subject to liability and restriction and regulation to make sure that they're not actually harming the democratic process.
And what about our government? I mean, one of the answers that you always hear is, well, we can't keep up with Silicon Valley, and we still have senators who don't know what email is.
Are they pulling the wool over our eyes in Washington when they claim to be that much stupider than the rest of us?
So money makes people stupid?
Yeah. Money can make you very stupid in Washington. Anywhere; money can make you stupid anywhere.
But the point is, they're not actually stupid. They just realize what they must do to make sure they don't lose their access to money.
Leslie Byrne, a Democrat from Virginia, said that when she came to Congress, a colleague told her, quote, always lean to the green.
And then she went on, that colleague was not an environmentalist.
So the point was, you just know how your position will affect your ability to raise money, and you know, through a sixth sense, how to avoid saying things or doing things that will make it impossible to raise money.
And so that invisible force, like the part of an iceberg under the water, is enormously powerful in affecting what our government can do.
If anybody in the audience has stupid money and they want some advice about what to do with it, please see us in the foyer after the talk.
Caitlin, who's 17, asks, and again, this is one of the conundrums of all this:
Does enforcing rules around content stifle creativity?
Yeah, I think if you enforce rules around the substance of the content, where you're saying you can talk about this but you can't talk about that, inevitably it stifles creativity. Or it could trigger more creativity, because you're trying to evade the rule by being extremely creative about the double entendre of what you're saying. Which is why I think the solution is not to regulate content like that.
It's to regulate the infrastructure, not necessarily by the government; even the platforms themselves can make sure the content is slowed down enough that humans can actually make a judgment about it.
The point is, it runs too fast. You know, there's a wonderful movement called the Slow Food movement that sort of addresses the problem of processed food by saying: look, if you just cook your own food, you can't make it as poisonous as the processed food companies can, because you just don't have the chemicals in your kitchen.
So if you cook your own food and you eat it with friends, taking time to just talk, then that would be a perfect way to get your nutrition because the body can process food at that speed.
There's an equivalent in the democracy space called the Slow Democracy movement, which says when we try to do democracy at the speed of Twitter, and yes, I'm just going to call it Twitter, no matter what anybody says, Twitter or Facebook, if you're constantly being bombarded by this stuff, we can't do democracy well.
But if you slow democracy down the way many of you experienced when New Hampshire was still the center of the Democratic and Republican primaries, when you have people in your house talking to you about their ideas, when you do it at a human speed, you can do democracy well.
So I think we have to have a slow democracy movement, and that platforms need to help us by making it so that the processing we have to do to engage with the content is not beyond our capacity and doesn't trigger the kind of worst in who we are.
You can only listen to the radio at one speed, I will point out. Olivia from Franklin brings this home.
New Hampshire is working on a bill to have disclosure of AI political ads.
This is no doubt based on the deepfake of President Biden that was exposed before the primary.
What other policies are needed to slow the spread of misinformation in elections?
Well, I think obviously we need a rule at the federal level, and if not at the federal level, at the state level, that says that anybody using AI technology has to mark it; just like you need to say "paid for by Joe Schmo," you need to say "created by a bot," right?
And that would at least enable people to see the content for what it is, and not be misled by it.
It's extremely hard, though, to do that completely, given how easy it is for anybody to create viral content and just put it on the internet.
If you're talking about regulating what campaigns do, and even what super PACs do, you can have rules that say that you, as a campaign treasurer, need to swear that none of your money went into any ads that were not properly certified and marked, and if you lie about that, that's perjury.
So if you did that, then I think we could make sure that the real campaigns and the PACs would behave properly.
But that wouldn't stop the Russians.
It wouldn't stop the Chinese, and it wouldn't stop, you know, some troublemaker who decides he's got a really cool way to throw the election in the last couple of minutes.
And we should recognize there are democracies around the world, the Czech Republic just had this, where deepfakes released just before the election have changed the results completely.
False claims credibly made have changed the results because people have reacted to them.
So it's a very serious problem. And as these open source AI models are out there, available for anybody to deploy, even contrary to the permission given by the original licensor, it's going to be extremely difficult to control.
Kind of building on that: you began your remarks referencing the way we've changed the tide of history around cigarettes. We've changed the tide of history to some degree around ultra-processed foods. What are the limits of those analogies? Has there ever been anything like these algorithms we've faced, where we've been able to put the genie back in the bottle?
Well, I actually think, you know, the point of the story is we really haven't solved the problem of cigarettes. Many of us fortunately live without cigarettes, but many people still have them. They've been, you know, led into an addiction which is extremely hard for them to escape.
Ultra-processed foods? I can't imagine how we remove them from society.
Like it's a problem that we've just grown into.
And it's hard to imagine how we back out of it.
I think social media is the same.
I think we're going to come to recognize that if we continue to run democracy in this completely unprotected space, where people are vulnerable to whatever strings happen to be pulled, whether, you know, it's the unintended consequence of just trying to make a lot of money or the intended consequence of trying to disrupt the democracy, which of course is what China and Russia are doing right now,
our democracy is going to be vulnerable.
And I think ultimately, Phil, I'm in the middle of a book right now, and the metaphor, I don't mean to be so oppressive, but the metaphor I use is the Titanic.
So the idea is, you know, they hit the iceberg, and the captain comes out and he sees all the overturned tables and he says, okay, well, we can fix all this.
That's fine.
And then his crew comes to him and says, there's a gash in the hull.
And six of the compartments are now filling, and you realize it doesn't matter if they fix the tables.
The ship is going down, and then he's got to figure out how to convince people to get into the lifeboats, which, in the middle of the night, in the middle of the winter, in the middle of the Atlantic, when you're on the Titanic, is an extremely hard thing to do, because people are like, it's the Titanic.
It can't sink.
So the parallel is, you know, I've been working, you know, many people here have been working for many years.
I've been working for 18 years on democracy reform.
And I think we know what has to be done.
We came within two votes of the Senate passing the Freedom to Vote Act, which would have been the most important democracy reform legislation passed by Congress, I think, since the Civil War.
So we know what to do, and we are close to being able to do it. And that would get us very close to a representative representative democracy.
But I think there's a gash in the hull. I mean, that reform is the equivalent of the overturned tables; we could fix it, but there's a gash in the hull, and the gash in the hull is this infrastructure of engagement-based media.
Even if we had a perfectly representative representative democracy, we're still being pulled into these radically polarized corners, where what we intuitively feel is: I hate that person. I hate that person. I see that person who wears a hat or a certain kind of shirt, and you're just triggered, because you've been processed to be triggered.
And so long as that's the reality of how we engage with media around politics, I don't think a perfectly representative, representative democracy is enough.
And then the question is, what are the lifeboats? Like, what are the reforms that could begin to give us a way to have confidence in our democratic process again?
And the argument I make in the book is that places like here, that have tried to do politics in a different way, more bottom-up, face-to-face, talking and listening and engaging, are more likely to be able to withstand this corrupting influence than places, you know, like New York City, or, I mean, New York State, or big places where the only information you get is the information you're being fed by these machines of engagement.
You can't help but think of Croydon, where, you know, people ignored what was going on, the school budget was cut in half, and then people were awake, got together, knocked on each other's doors, met in coffee shops, whatever, and changed things and fixed it.
Last question. Robert Putnam also teaches at Harvard.
All right. You might have to put on your Robert Putnam hat for this one. This is from Kevin, who's 16: I'm connecting in my community without social media, through scouting and volunteering. How can we encourage more of this kind of social connection?
It's critically important.
Yeah.
And, and I think what we need to do is to multiply the opportunities for that and make some of them feel like civic opportunities.
So at Harvard we just purchased a deliberation platform, a virtual deliberation platform that puts people into small groups where they can deliberate about issues. And we're going to make it freely available through open-source licensing, to encourage as many different groups around the world as possible to use it, to try to facilitate this kind of face-to-face engagement.
Because I think what we see, especially through the citizen assembly movement that's happening all across the world, is that when humans just face each other, you learn the other side is not a reptile. You know, the other side has kids. He or she has dreams for their kids. They're just trying to get by. They're not conspiring to kill you, not conspiring to make your life miserable.
And then, if we had experiences where we lived that regularly,
I'm not talking about every day.
I don't want to be doing politics every day of my life.
But regularly, that could begin to change our understanding or appreciation of what it is to be a citizen.
At the beginning of the Republic, there was a study in Philadelphia that established how often the average juror served. Now, of course, at the beginning of the Republic, a juror was a white male property owner. But among white male property owners, the average man sat on a jury three times a year.
So three times a year that person had the experience of listening to people tell him what to do, what to think, how to think.
Sitting with his colleagues and deliberating, and then exercising governmental power, like deciding whether somebody was going to go to jail or whether someone was going to have to pay substantial amounts of damages.
And I think a culture where people do that regularly is a culture where they begin to feel like they are part of the government. The government is not an other; the government is them. They are part of the government. It's a system of rotating through government.
And there's a strong movement across the world, and in America too, to push the idea of citizen assemblies, which is one way to make this manifest. And I think we need to experiment with many of these, to begin to multiply the opportunities for people to recognize they are responsible for their government.
It's not being done to them.
That's why I said at the end of the speech: this is a problem we only solve if we solve it, if we take responsibility for it.
And I know how hard it is.
I mean, I know how I'm triggered when I see people making certain arguments or I see them saying certain things, or I see them carrying certain flags or wearing certain hats, I know what happens inside of me.
And then I also know that what I am trying to do is to think, how is it that they see the world as they see it?
What is it that has led them to see it like that?
Not that they're idiots, or that they're wrong, or that they're to be despised, but: what do I not understand, such that I see them as a completely different type of person?
If we can't do that, all of us.
Then I'm not sure there's hope for us as a people.
Now that's not optimistic.
What's the optimistic part?
The optimistic part, if I can, is that the people in this room and the people watching, live or whenever they see this on PBS, have a charge. You've given us a responsibility, and we all have our part to do. So thank you very much.
Thank you.
Professor Lessig.
Thank you, thank you.
Okay.
That's nice.
Thank you very much.