
Let a jury decide what content should appear on social media

Photograph: Pat A Robinson/Alamy

One of the most instructive experiences of my life was serving as a juror in a criminal trial. When the summons to report for jury service arrived, though, I was anything but enthusiastic. I was bringing up two young children on my own at the time and the last thing I needed was to be locked down for an unknown number of days. So I headed into the crown court feeling pretty glum.

The trial was a serious one: the charge was of causing grievous bodily harm with intent. It went on for two weeks. A number of witnesses gave evidence, much of which seemed (to me) unconvincing, sometimes contradictory, occasionally horrifying. We learned more about what goes on at night in an economically depressed East Anglian town than is good for anyone. And then, when the lawyers and the judge had summed up, we retired to reach a verdict.

What happened next was remarkable. My fellow jurors, chosen – as I was – at random, were all “ordinary” citizens: none of us had any legal training or knowledge. But the process that unfolded in the room was deeply impressive: 12 people, confronted with a morass of contradictory evidence, quietly sorted through it, identified the one witness who seemed to all of us to be the most credible, and unanimously reached a verdict: guilty. Only later did we discover that the defendant had a set of previous convictions for violent assault – knowledge that had properly been withheld from us during the trial.

The jury system isn’t perfect; no system is. And juries sometimes make mistakes or are misled. But it’s no accident that law-abiding societies have for a long time used juries as a way of deciding whether someone is guilty or innocent. “In the court system,” writes the eminent Harvard legal scholar Prof Jonathan Zittrain, “legislators write laws, and lawyers argue cases, but juries of ordinary people are typically the finders of fact and judges of what counts as ‘reasonable’ behaviour. This is less because a group of people plucked from the phone book is the best way to ascertain truth – after all, we don’t use that kind of group for any other fact-finding. Rather, it’s because, when done honourably, with duties taken seriously, deliberation by juries lends legitimacy and credibility to the machinations of the legal system.”


This quote comes from an essay in which Zittrain wrestles with two of the most pressing problems of the digital age: what kinds of content should be allowed (and disallowed) on social media platforms? And who should make those decisions? At the moment, we have no good answers to either question, so decisions are left to the bosses of those platforms via their shifting terms and conditions, their legal obligations in different jurisdictions (eg, Germany, China, Saudi Arabia, India), the incentives to permissiveness implicit in their business models and lust for growth, and their lack of transparency and accountability.

The result is a chaotic vacuum at a critical time for democracies grappling with acute polarisation, a looming presidential election and what will be an epic post-pandemic economic depression. Small wonder, then, that legislators in the US and elsewhere are beginning to think about pulling the internet kill-switch buried in Section 230 of the 1996 Communications Decency Act – the crucial clause that grants internet companies immunity from liability for the content posted by their users. If that section were repealed – and Joe Biden, for one, has said that he thinks it should be – then the business model of Facebook, YouTube, Twitter and a host of other smaller companies would become toast overnight.

But even those of us who abhor these platforms should be careful what we wish for. As Anna Wiener writes, in the New Yorker: “Without Section 230, only platforms with the resources for constant litigation would survive; even there, user-generated content would be heavily restricted in service of diminished liability. Social media startups might fade away, along with niche political sites, birding message boards, classifieds, restaurant reviews, support-group forums and comments sections.” What we might wind up with is “a desiccated, sanitised, corporate internet – less like an electronic frontier than a well-patrolled office park”.

This is a fate that Zittrain also wishes to avoid. His big idea is that difficult decisions about allowable (and forbidden) content could be taken by applying the jury system to the problem. Why shouldn’t public librarians, he writes, “be asked in small panels, real or virtually convened, to evaluate ads? Today only 33% of Americans have trust in the news media, but 78% trust libraries to help them find information that is trustworthy and reliable.” Or, better still, empanel juries of high-school seniors and their teachers to provide a system that could be scaled up to match the magnitude of the task.

Original? Definitely. Utopian? Maybe. But then I expect that the idea of asking a dozen average folks to decide guilt or innocence once seemed utopian too.

What I’ve been reading

Panic attack
There’s an intriguing academic article, The Sisyphean cycle of technology panics, by Amy Orben in the journal Perspectives on Psychological Science.

Virtually perfect
Zoom is great for virtual meetings, but if you want to use it for seminars you need to know what you're doing. Quentin Stafford-Fraser's checklist on statusq.org is genuinely helpful.

Every loser wins
A Newsweek article, How Trump could lose the election – and still remain president, by a former US senator and a senior journalist, could keep you up at night.