Did you know every 83 seconds someone posts an anti-Semitic message on social media? That's 44 messages every hour, according to a new World Jewish Congress study that analyzed tens of millions of social posts. Startlingly, 63 percent of all anti-Semitic content online is on Twitter.

"We knew that anti-Semitism online was on the rise, but the numbers revealed in this report give us concrete data as to how alarming the situation really is," said World Jewish Congress CEO and Executive Vice President Robert R. Singer. A separate study, conducted by SafeHome.org, supports Singer's assessment.

Between 2014 and 2015, the number of likes on tweets and comments from hate groups on Twitter more than tripled, and between 2015 and 2016 it more than tripled again. In just three years, the average number of likes on hateful or racist tweets from the groups studied rose from less than one to almost eight: according to the study, a 900 percent increase from 2014 to 2016.

Anti-immigrant hate groups have the largest average number of followers on Twitter, followed by anti-Muslim, anti-LGBT, black separatist, white supremacist and anti-Semitic groups.

Growing hate speech on social media has real-world impacts. For example, Charlottesville's Unite the Right rally was organized on Facebook. Only after the rally began to make national news did Facebook take down the event page.

Following the Charlottesville incident, Facebook CEO Mark Zuckerberg said in a statement, "There is no place for hate in our community. That's why we've always taken down any post that promotes or celebrates hate crimes or acts of terrorism including what happened in Charlottesville."

From April to June of this year, Facebook said it removed about 66,000 hate speech posts a week, or roughly 288,000 posts a month. In a blog post, Richard Allan, Facebook's vice president for public policy in EMEA, explains how the company decides what to remove.

Keegan Hankes, an analyst at the Southern Poverty Law Center, told CNN that social media sites "are not doing nearly enough to combat organized hate groups on the platform."

Yet the topic has raised questions about free speech and the First Amendment. David Snyder, executive director of the First Amendment Coalition, also told CNN, "Do we the people really want private entities calling the shots as to who can or can't participate in the discussion on the internet? There's a risk that the pendulum could swing too far and those who are now cheering ... might find that their speech ... ultimately is banned."

While social media companies and CEOs debate the best way to handle hate speech in the U.S., the European Union's European Commission is demanding action now and threatening legal action if tech companies don't take "swift action." Mariya Gabriel, the European Commission's commissioner for digital economy and society, said websites like Facebook and Twitter take too long — often more than a week — to remove illegal posts.

Germany is leading the way. In June, the country passed a law that would impose fines of up to $57 million on websites that do not remove reported hate speech, including racist speech and Nazi symbols, within 24 hours.

Instagram, meanwhile, just announced an update to its comment moderation tools. Users can now limit comments to specific groups of people, such as only their followers, or block accounts entirely. That's in addition to the comment filters Instagram rolled out in June of this year.

With pressure mounting from the European Union and social pressure building in the U.S., you'll likely see more actions and tools like Instagram's latest to moderate and eliminate hate speech on social media.