New bill would require social media sites to report terrorist activity
Two senators have introduced a bill to give social media sites more responsibility for halting terrorists. If passed, here's what it would do. - photo by Payton Davis
Around the time Tashfeen Malik and her husband went on a shooting rampage that killed 14 in San Bernardino, California, on Dec. 2, she posted a message of support for the Islamic State group online, according to USA Today.

That instance, along with last month's terror attacks in Paris, has prompted some U.S. lawmakers to put pressure on tech companies to "do more to halt terrorists' use of their services to recruit and communicate," Elizabeth Weise wrote for USA Today. Sen. Dianne Feinstein and Sen. Richard Burr on Tuesday introduced a bill that would require sites like Facebook, Twitter and YouTube to notify law enforcement when they find "terrorist activity on their platforms."

Tami Abdollah noted for Associated Press the legislation's similarities to a law that requires sites to report online child pornography.

But Abdollah wrote that detecting pornographic images is more straightforward than identifying supposed terrorist activity.

"The bill models a law that requires reporting of online child pornography," the AP article said. "But in that situation, images are automatically matched to an existing database that helps with swift removal. Social media companies have robust terms of use that help them ensure civil discourse online, but they also work to ensure they're not limiting free speech."

That "limiting of free speech" part and the potential for "unhelpful data" has critics skeptical, according to AP.

Among the negatives is the possibility of less reporting of terrorist activity, the opposite of what Feinstein and Burr hope to accomplish, Sen. Ron Wyden said in a statement, according to The Hill.

"(The bill) would create a perverse incentive for companies to avoid looking for terrorist content on their own networks, because if they saw something and failed to report it they would be breaking the law, but if they stuck their heads in the sand and avoided looking for terrorist content they would be absolved of responsibility," The Hill quoted Wyden as saying.

James Vincent wrote for The Verge that companies including Facebook, Google and Twitter also expressed concerns, arguing the bill would create an "impossible compliance problem" and that many of the items reported might not be of concern.

Still, proponents of such steps are high-profile, Vincent said: Both leading presidential candidates have called for "tighter Web controls."

According to The Verge, Donald Trump suggested the U.S. "close up" the Internet with the goal of combating terrorism, and "his views are not far from those of Hillary Clinton."

"Both Trump and Clinton have preemptively waved off objections about 'free speech,' and seem determined to press for more action, of any sort. If Feinstein and Burr's bill emerges from committee and ends up being voted on, then it's likely to find support," Vincent wrote.

Whether or not the legislation passes, social media sites could do more than they do now, Rabbi Abraham Cooper of the Simon Wiesenthal Center's digital terrorism and hate project told USA Today.

"I think weve collectively reached a tipping point," Cooper told USA Today. "If these companies took this seriously, they could put an immediate dent into the marketing capabilities of ISIS, al-Qaida, al-Shabab and the rest."