Since its inception, the Internet has never stopped growing, filling with more and more content every minute. Unfortunately, much of that content is garbage. Propaganda, misinformation, and pure rumor clutter many search results, and sorting through the digital detritus can be a real challenge.

In response, Google has devised a potential fix for the problem of establishing truth on the internet: a ranking algorithm based on the accuracy of a website’s content.

The search engine giant currently considers the number of incoming links to a webpage to be the gold standard for establishing its reputation for quality. The more links that lead to a page, the higher it will rank in Google’s search results. However, just because something is popular doesn’t mean it is high quality. Under the current ranking system, a website full of disinformation can climb the rankings if enough people link to it.

Google has tasked a dedicated research team with adapting its current algorithm to measure the factual accuracy of a page rather than just its popularity across the web. Instead of counting only incoming links, the new system (which is not yet live) would cross-reference the page’s claims and tally the number of factually incorrect items it contains. “A source that has few false facts is considered to be trustworthy,” the team says. The result is a score for the page, called its Knowledge-Based Trust score.
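To make the intuition concrete, here is a minimal sketch in Python of how such a score could be computed, assuming it is simply the fraction of a page’s extracted claims that agree with a reference fact store. The function and data names are hypothetical, and the published Knowledge-Based Trust model is probabilistic and accounts for extraction errors; this simplification only illustrates the idea of rewarding pages with few false facts.

```python
# Hypothetical illustration of a Knowledge-Based-Trust-style score: the share
# of a page's extracted facts that agree with a reference fact store. The real
# model is probabilistic; this is only a sketch of the intuition.

# Facts are represented as (subject, predicate, object) triples.
REFERENCE_FACTS = {
    ("barack obama", "born_in", "honolulu"),
    ("eiffel tower", "located_in", "paris"),
}

def kbt_score(page_facts):
    """Return the fraction of a page's facts found in the reference store."""
    if not page_facts:
        return 0.0  # no checkable claims, so no evidence of trustworthiness
    correct = sum(1 for fact in page_facts if fact in REFERENCE_FACTS)
    return correct / len(page_facts)

page = [
    ("barack obama", "born_in", "honolulu"),   # agrees with the store
    ("eiffel tower", "located_in", "rome"),    # contradicts the store
]
print(kbt_score(page))  # 0.5
```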

How does Google separate the facts from so much fiction? Well, being one of the largest repositories of information on the web has its advantages. Google’s truth-seeking algorithm works by tapping into the Knowledge Vault, a vast store of information that Google has culled from the internet, made up of facts that virtually everyone agrees on and that constitute a reasonable basis for establishing verifiable truth. Under this system, pages containing verifiably factual content would rise in the rankings, while those containing contradictory information would be demoted.
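A hypothetical sketch of how that promotion and demotion could work is below: a trust score is blended with a link-based popularity signal when ordering results. The field names and the equal weighting are illustrative assumptions, not Google’s actual ranking formula.

```python
# Hypothetical blend of link popularity and factual trust for ordering results.
# The weighting and field names are assumptions for illustration only.

def blended_rank_key(page, trust_weight=0.5):
    """Combine a normalized link signal and a trust score into one value."""
    popularity = page["link_score"]  # e.g. normalized incoming-link signal, 0..1
    trust = page["kbt_score"]        # e.g. output of kbt_score() above, 0..1
    return (1 - trust_weight) * popularity + trust_weight * trust

results = [
    {"url": "https://example.com/popular-but-wrong", "link_score": 0.9, "kbt_score": 0.2},
    {"url": "https://example.com/accurate-but-niche", "link_score": 0.4, "kbt_score": 0.95},
]

# Pages with verifiably factual content rise; contradictory ones are demoted.
for page in sorted(results, key=blended_rank_key, reverse=True):
    print(page["url"])
```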

There are already apps available to help users uncover the truth. Emergent, a project from the Tow Center for Digital Journalism at Columbia University, investigates content found on various websites, then verifies or rebuts it by cross-referencing it against other sources the project considers reliable. LazyTruth is a browser extension that filters out known fake and hoax emails that continuously re-emerge.

Matt Stempeck, LazyTruth’s developer and director of civic media at Microsoft New York, wants to develop software that makes information verified by fact-checking sites such as Snopes, PolitiFact, and FactCheck.org easily accessible to everyone. Tools like this are useful online, but challenging the erroneous beliefs that underpin misinformation is harder. “How do you correct people’s misconceptions? People get very defensive,” Stempeck says. “If they’re searching for the answer on Google they might be in a much more receptive state.”