Most tech pros believe Facebook should do more to stop election misinformation

If you are worried about misinformation swaying the upcoming US election, you are not alone

As bot-driven misinformation campaigns flood social feeds in an attempt to steer voter choice, and fake accounts intended to undermine elections proliferate, users need reassurance that the information they see is authentic.


In 2020, responsibility for electoral integrity falls on US tech companies nearly as much as on the government. Social media platforms are so prevalent that misinformation, if left unchecked, could cause a massive swing in sentiment among voters.

As the US presidential election draws closer, questions remain about whether bots significantly influenced the 2016 election. Facebook noted that in 2016 there were "coordinated online efforts by foreign governments and individuals to interfere in our elections."

It also recently "took down a network of 13 accounts and 2 pages that were trying to mislead Americans and amplify division." But what do users across the tech industry think?

San Francisco-based anonymous professional network Blind surveyed 1,332 users, asking each the same two questions. It wanted to gauge how tech employees at large, compared with Facebook employees, felt about Facebook's accountability for election misinformation.

It asked "Do you believe it is the responsibility of Facebook to prevent misinformation about the election?" and "Are you surprised by Zuckerberg's stance given his previous 'free speech' stance?"

In October 2019, Zuckerberg spoke at Georgetown University about the importance of protecting free expression and promised to:

"1. Write policy that helps the values of voice and expression triumph around the world; 2. Fend off the urge to define speech we don't like as dangerous; and 3. build new institutions so companies like Facebook aren't making so many important decisions about speech on our own."

Two in three tech professionals believe Facebook should do more to stop election misinformation.

Image: TeamBlind

The survey results showed that almost seven in 10 (68%) of surveyed tech professionals believe it is the responsibility of Facebook to prevent misinformation about the election.

This contrasted markedly with Facebook employees, only 47% of whom believe Facebook should be responsible for preventing misinformation.

One in three (33%) of surveyed tech professionals are surprised by Zuckerberg's position given his previous "free speech" stance, compared with only 27% of Facebook employees.

Considering Facebook's adherence to its "free speech" policy, any deviation from its political ad policies is worth examining.

Last week, Zuckerberg said in a Facebook post that the platform will block new political and issue ads in the week leading up to the election, to prevent last-minute misinformation.

It will also expand its voter suppression policies and will remove posts with claims that people will get COVID-19 if they take part in voting.

These survey results suggest that Facebook's employees disagree with other tech professionals about the company's accountability for misinformation.

Is it Facebook's job to sway voters one direction or the other in November? Should people across Facebook be allowed to speak their minds, share their opinions, and come to their own conclusions based on the information they see?

If President Donald Trump is swaying public opinion via social media, then should former Vice President Joe Biden use social media to sway voters in the other direction?

Is it up to Facebook to decide who will win this election -- or is it up to the voters getting the information they need across social platforms to make the right choice?