Shannon Bond

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.

Bond joined NPR in September 2019. She previously spent 11 years as a reporter and editor at the Financial Times in New York and San Francisco. At the FT, she covered subjects ranging from the media, beverage and tobacco industries to the Occupy Wall Street protests, student debt, New York City politics and emerging markets. She also co-hosted the FT's award-winning podcast, Alphachat, about business and economics.

Bond has a master's degree in journalism from Northwestern University's Medill School and a bachelor's degree in psychology and religion from Columbia University. She grew up in Washington, D.C., but is enjoying life as a transplant to the West Coast.

During the pandemic, Reesha Howard got hooked on doing live audio chats from her smartphone. First she used Clubhouse, the buzzy, invitation-only app that surged in popularity last year with freewheeling conversations, game shows and celebrity appearances.

MUMBAI AND SAN FRANCISCO — One night last month, police crowded into the lobby of Twitter's offices in India's capital, New Delhi. They were from an elite squad that normally investigates terrorism and organized crime, and said they were trying to deliver a notice alerting Twitter to misinformation allegedly tweeted by opposition politicians.


SCOTT SIMON, HOST:

Facebook suspended Donald Trump after his supporters attacked the U.S. Capitol on January 6. Now Facebook's rolling out new rules for all politicians and says under those rules, Trump can't come back for at least two years.

Updated June 4, 2021 at 4:43 PM ET

Facebook has extended former President Donald Trump's suspension for two years and says it will only reinstate him "if the risk to public safety has receded."

A group of Democratic senators is urging Google parent company Alphabet to investigate how its products and policies may be harming Black people.

In a letter to the tech giant's CEO, Sundar Pichai, and other executives, Sens. Cory Booker of New Jersey, Ron Wyden of Oregon, Mark Warner of Virginia, Ed Markey of Massachusetts and Richard Blumenthal of Connecticut said they worried about bias and discrimination, both in the products Google makes and the way it's handled workplace diversity.

Social media companies prohibit kids under 13 from signing up because of federal privacy law. But parents like Danielle Hawkins can tell you a different story.

"She got on Instagram and Snapchat without my approval when she was about 12," Hawkins, a mom of four who lives near Detroit, said of her eldest daughter.

The tech companies are well aware of this problem. Facebook CEO Mark Zuckerberg told a congressional hearing in March that his company knows kids get around the age limits on apps like Instagram, the photo-sharing network Facebook owns.

Updated May 14, 2021 at 11:48 AM ET

Researchers have found just 12 people are responsible for the bulk of the misleading claims and outright lies about COVID-19 vaccines that proliferate on Facebook, Instagram and Twitter.

Facebook has almost 2 billion daily users, annual revenue that rivals some countries' gross domestic product, and even its own version of a Supreme Court: the Oversight Board, which the company created to review its toughest decisions on what people can post on its platforms.

This week, the board faced its biggest test to date when it ruled on whether Facebook should let former President Donald Trump back on its social network.

Updated May 5, 2021 at 11:36 AM ET

Facebook was justified in its decision to suspend then-President Donald Trump after the Jan. 6 insurrection at the U.S. Capitol, the company's Oversight Board said on Wednesday.


Last year, in the middle of the pandemic, Sinead Boucher offered $1 to buy Stuff, New Zealand's largest news publisher.

Boucher was already the company's chief executive and was worried that its Australian media owner would shut down the publisher. Things had started to look really grim: The economy had ground to a halt and advertising revenue had evaporated.

"I knew that they ... would potentially just decide to wind us up," said Boucher. "So it was just a punt."

Facebook is making changes to give users more choice over what posts they see in their news feeds, as the social media company defends itself from accusations that it fuels extremism and political polarization.

The changes, announced Wednesday, include making it easier for people to switch their feeds to a "Most Recent" mode, where the newest posts appear first, and allowing users to pick up to 30 friends or pages to prioritize. Users can now limit who can comment on their posts.

Tech workers say they have experienced more harassment based on gender, age and race or ethnicity while working remotely during the pandemic, according to a survey from a nonprofit group that advocates for diversity in Silicon Valley.

The increases were highest among women, transgender and nonbinary people, and Asian, Black, Latinx and Indigenous people.

Support for the siege on the U.S. Capitol. Bogus promises of COVID-19 cures. Baseless rumors about vaccines.

Who should be held accountable for the spread of extremism and hoaxes online?

Lina Khan, a prominent antitrust scholar who advocates for stricter regulation of Big Tech, may be about to become one of the industry's newest watchdogs.

President Biden on Monday nominated Khan to the Federal Trade Commission, an agency tasked with enforcing competition laws. She is the splashiest addition to Biden's growing roster of Big Tech critics, including fellow Columbia Law School professor Tim Wu, who announced earlier this month he would join the National Economic Council.

If there's one business that has come out ahead after a very hard year, it's Zoom.

The Silicon Valley upstart has become synonymous with video chat over the course of the pandemic. It has fulfilled our need to see and be with each other, even when we can't do that in person. And it has beaten out some of the biggest names in tech along the way.

Kelly Steckelberg, the company's chief financial officer, can pinpoint the day when everything changed: March 15, 2020.

"Almost overnight, the demand grew exponentially," she told NPR.

Facebook is failing to enforce its own rules against falsehoods about COVID-19, vaccines, election fraud and conspiracy theories when it comes to posts in Spanish, according to a coalition of advocacy groups.

"There is a gap, quite an enormous gap, in fact, in English and Spanish-language content moderation," Jessica González, co-CEO of the advocacy group Free Press, told NPR.

Instagram recommended false claims about COVID-19, vaccines and the 2020 U.S. election to people who appeared interested in related topics, according to a new report from a group that tracks online misinformation.

"The Instagram algorithm is driving people further and further into their own realities, but also splitting those realities apart so that some people are getting no misinformation whatsoever and some people are being driven more and more misinformation," said Imran Ahmed, CEO of the Center for Countering Digital Hate, which conducted the study.

There's a saying in Silicon Valley: Solve your own problems. Tracy Chou didn't have to look further than her social media feeds to see those problems.

"I've experienced a pretty wide range of harassment," she said. "Everything from the casual mansplaining-reply guys to really targeted, persistent harassment and stalking and explicit threats that have led me to have to go to the police and file reports."

On Feb. 1, the editor of an award-winning Indian magazine got a call from his social media manager: The magazine's Twitter account was down.

"I said, 'Are you sure? Can you just refresh, and check again?' " recalled Vinod K. Jose, executive editor of The Caravan, which covers politics and culture. "But she said, 'No, no, it's real.' "


AILSA CHANG, HOST:

All right. Well, for more on this dilemma facing Twitter in India, we're going to turn now to NPR tech correspondent Shannon Bond.

Hey, Shannon.

SHANNON BOND, BYLINE: Hey, Ailsa.

Twitter users aren't known for staying quiet when they see something that's flat-out wrong, or with which they disagree. So why not harness that energy to solve one of the most vexing problems on social media: misinformation?

With a new pilot program called Birdwatch, Twitter is hoping to crowdsource the fact-checking process, eventually expanding it to all 192 million daily users.

"I think ultimately over time, [misleading information] is a problem best solved by the people using Twitter itself," CEO Jack Dorsey said on a quarterly investor call on Tuesday.

Facebook is expanding its ban on vaccine misinformation and highlighting official information about how and where to get COVID-19 vaccines as governments race to get more people vaccinated.

"Health officials and health authorities are in the early stages of trying to vaccinate the world against COVID-19, and experts agree that rolling this out successfully is going to be helping build confidence in vaccines," said Kang-Xing Jin, Facebook's head of health.

January brought a one-two punch that should have knocked out the fantastical, false QAnon conspiracy theory.

After the Jan. 6 attack on the U.S. Capitol, the social media platforms that had long allowed the falsehoods to spread like wildfire — namely Twitter, Facebook and YouTube — got more aggressive in cracking down on accounts promoting QAnon.

Updated at 3:16 p.m. ET

Facebook's oversight board on Thursday directed the company to restore several posts that the social network had removed for breaking its rules on hate speech, harmful misinformation and other matters.

The decisions are the first rulings for the board, which Facebook created last year as a kind of supreme court, casting the final votes on the hardest calls the company makes about what it does and does not allow users to post.

The alternative social network MeWe had 12 million users at the end of 2020. Barely three weeks into 2021 — and two since a right-wing mob attacked the U.S. Capitol — the company says it's now passed 16 million.

CEO Mark Weinstein says this popularity is a testament to the reason he launched MeWe in 2016 as an alternative to Facebook. MeWe markets itself as privacy-forward: it doesn't harness users' data to sell ads or decide what content to show them.

Two weeks ago, Facebook indefinitely suspended former President Donald Trump from its social network and Instagram, after a mob of his supporters stormed the U.S. Capitol. CEO Mark Zuckerberg said the risks of allowing Trump to keep using the social network were "too great."

Now, Facebook wants its newly formed independent oversight board to weigh in and decide whether it should reinstate Trump.

Updated at 3:05 p.m. ET

Willy Solis never saw himself as an activist.

"I'm an introvert, extreme introvert," he said. "That's my nature."

But 2020 changed that — like so many other things.

Los Angeles County Supervisor Sheila Kuehl's district sweeps from the beaches of Santa Monica to the San Fernando Valley. Among the two million people she represents are Latino communities hit especially hard by the coronavirus pandemic.

"Many essential workers, many market and pharmacy and food service and restaurant and hotel workers and a lot of health care workers," she said. "So a lot of people just had to go to work."
