Facebook has been at the centre of the controversy surrounding the spread of fake news ever since the 2016 U.S. Presidential Election. Especially after the Cambridge Analytica privacy scandal, the social networking giant has been under fire.

So far, Facebook has implemented multiple measures to help users detect fake news on the platform and to reduce its spread. The company is now taking things a step further by rating users on trustworthiness. Yes, Facebook is assigning users a reputation score to better fight fake news on its platform.

Exactly how the new reputation score is being used is unclear.

Facebook says it measures a number of different factors to determine a user’s reputation score, though it specifically monitors things like which publishers on Facebook a user considers trustworthy and what kinds of posts they flag as false. The point of the reputation score is to gauge how reliable a user’s judgment is so that the company can use it to better fight fake news.
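Facebook hasn’t disclosed how these signals are combined, but conceptually a score like this could be a weighted blend of per-user behavior. The Python sketch below is purely illustrative: the signal names, the weights, and the smoothing are all assumptions, not anything Facebook has confirmed.

    # Purely illustrative: Facebook has not disclosed its formula.
    # Signal names, weights, and the 0-to-1 scale are assumptions.
    def reputation_score(flags_upheld: int, flags_rejected: int,
                         trusted_publisher_overlap: float) -> float:
        """Blend hypothetical signals into a 0-to-1 trustworthiness score.

        flags_upheld / flags_rejected: the user's past "false news"
        reports that fact checkers confirmed or dismissed.
        trusted_publisher_overlap: 0-to-1 measure of how closely the
        publishers this user trusts match those the wider community
        rates as trustworthy.
        """
        total = flags_upheld + flags_rejected
        # Laplace smoothing so users with no history start near 0.5.
        flag_accuracy = (flags_upheld + 1) / (total + 2)
        # Made-up weights favoring demonstrated flagging accuracy.
        return 0.7 * flag_accuracy + 0.3 * trusted_publisher_overlap

Under a scheme like this, a user whose flags are consistently upheld by fact checkers would drift toward 1, while one whose flags are routinely dismissed would drift toward 0.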

Facebook already lets users report fake news on the platform, but the problem is that users who disagree with certain articles or publishers have been falsely reporting them as fake news. As a result, Facebook’s fact checkers waste a ton of time reviewing reports from users who flag content simply because they disagree with it. The new reputation score could help prevent that, as Facebook will now give more weight to reports from users with higher trustworthiness scores than to those from users with lower ones.
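In practice, that weighting might look like ranking the fact-checking queue by the combined reputation of an article’s reporters rather than by a raw report count, so coordinated flagging by low-score users carries less weight than a single credible report. Again, this is a minimal sketch under assumed names; nothing here is Facebook’s actual pipeline.

    # Hypothetical queueing: rank flagged articles by the combined
    # reputation of their reporters instead of raw report counts.
    from collections import defaultdict

    def rank_for_review(reports):
        """reports: iterable of (article_id, reporter_reputation) pairs.

        Returns article ids ordered by total reputation-weighted flags.
        """
        weighted = defaultdict(float)
        for article_id, reputation in reports:
            weighted[article_id] += reputation
        return sorted(weighted, key=weighted.get, reverse=True)

    # Three reports from low-score users (0.1 each) rank below one
    # report from a high-score user (0.9): returns ["a2", "a1"].
    print(rank_for_review([("a1", 0.1), ("a1", 0.1), ("a1", 0.1),
                           ("a2", 0.9)]))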

The reputation score still raises a ton of questions, though. Facebook is being very tight-lipped about whether the score affects the News Feed itself, and whether users with a lower reputation score are less likely to have their posts shown to their friends in the News Feed. It’s also unclear whether publishers whose audiences skew toward low reputation scores are penalized in the News Feed.

Thurrott