This is not a particularly good time for the social media service Twitter. Its value has plummeted in the past year, its stock price has steadily fallen since its November 2013 initial public offering, and its C-suite has been a revolving door in recent months.

The service, perpetually in the shadow of Facebook by nearly every important business metric, has recently proposed changes deeply unpopular with its core users — such as possibly lengthening tweets to 10,000 characters from the current 140 and replacing a user's chronological timeline of tweets with an out-of-order one. Each controversial change threatens to pull Twitter away from the one advantage it still holds over every other major social network: real-time commentary and developments around news and other live events.

As if all that weren't enough, Twitter has also deservedly come under fire for not doing enough about online harassment and abusive behavior. Death threats, revenge porn and the publication of someone's home address or contact information have repeatedly crossed the line from the Web's healthy culture of free speech into real-life danger.

In a memo last February, former Twitter CEO Dick Costolo admitted the company "sucks at dealing with abuse and trolls on the platform, and we've sucked at it for years," and pledged to fix the problem. Costolo stepped down from the CEO position in June.

Twitter co-founder Jack Dorsey replaced Costolo, but the problem remains. In the public sphere alone, elected officials, activists and celebrities have all been subjected to abuse in the past several months.

Twitter now appears to be taking online abuse more seriously, and is spelling out more explicitly how it will act when people are threatened. Yet some are understandably unconvinced.

After Costolo's frank admission a year ago, Twitter devoted more manpower to moderating harassment and added new reporting mechanisms. In December, the social media service updated its Twitter Rules to prohibit harassment, threats and the promotion of self-harm, among other behaviors.

The Verge, however, characterized the update as "rearranging paragraphs in its terms of service." Casey Newton, the site's Silicon Valley editor, wrote, "The truth is that updated rules are meaningless unless the company strictly enforces them."

Now, Twitter is looking outward for help in rooting out abuse on its platform. The company started by shutting down 125,000 terrorism-related accounts on Feb. 5. Then it turned its attention to harassment.

On Feb. 9, the social media company announced the formation of the Twitter Trust and Safety Council, made up of 40 outside organizations from around the world, spanning Internet free speech advocacy, civil rights, women's rights and domestic violence support, as "a new and foundational part of our strategy to ensure that people feel safe expressing themselves on Twitter."

The obvious question remains: Can this step finally be the one that gets Twitter to well and truly crack down on abusive content and behavior?

It is as promising as anything Twitter has done to this point, and the consultation of such diverse groups amounts to a concession that a problem this serious — one that must still be balanced against valid free speech concerns — cannot and will not be solved by Silicon Valley techies and executives alone.

To be clear, even if the Trust and Safety Council is highly successful in rooting out abuse, Twitter will still have many of the same problems it has today.

However, its user base has stagnated at 320 million users for a while, just one-fifth of Facebook's reach. A significant reason for that stagnation is harassment: it has driven many users off Twitter and discouraged potential users from joining and tweeting, thanks to the site's reputation as a place where trolls and troublesome users can run amok without consequence.

If that can change, and if Twitter doesn't try to be something it's not, the company may see better days.