Web Content Quality | Web Development Company NYC

Web Content Quality for Media Sites


The NY Times published an online article today titled "Web Words That Lure the Readers," which featured the Huffington Post and reported that 35% of the Huffington Post's traffic comes from organic search. On the surface, this seems like discouraging news for their competitors; the article makes it sound as though they have a stranglehold on the market. Dig a bit deeper, though, and a very different conclusion can be reached.

Huffington Post has had success writing articles built around "trending" phrases scraped from Google. They have a monster Google PageRank of 9 because they have so many inbound links (over 20 million). The article makes it sound as though they come up first for every possible phrase. That is really not the case, and the explanation is simple: Google PageRank is NOT a measure of quality, and its effect can actually be diluted by poor quality content. Ironically, both the NY Times' own website and HuffPo illustrate this.

Measuring Authority – Inbound Links

There is only one measure of authority: inbound hyperlinks from other websites. Of all the search engines, Google does the best job of respecting it. The Google algorithm measures not only the quantity of the links, but also their relevance and quality. However, the story does not end there. If it did, the oldest site with the most content would always come up first. That does not happen, because the quality of the content itself indirectly, but profoundly, influences the search results.

Measuring Quality – Inbound Links vs Content

If authority were the only thing that mattered, then the best SEO strategy would be to launch loads of content and hope that enough people would find it and link to it. Sound familiar? That's the failed AOL strategy (and the new leadership wants even more of it). Adding content that has no value actually weakens the value of the entire domain. This is caused by rank being distributed throughout a website via internal links, navigation, HTML sitemaps, etc. (too complex to discuss further here).

True Measure of Quality – Ratio of Inbound Links to Content

The actual measure of the quality of content on a website is the ratio of inbound links to the number of pages. The more visitors like the content and find it unique and compelling, the better the chance they will link to it. Though Google does not have any accurate way to present this data, Yahoo actually does (wow, something Yahoo does better than Google). Here is an example of four media websites I am very familiar with:

           Pages        Inbound Links   Ratio
NY Times   31,700,000   41,900,000      1.32
Gawker      1,788,000    9,227,000      5.16
HuffPo     20,029,000   20,328,000      1.01
Mediaite      123,000      889,000      7.23
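The ratios above are just simple division. A minimal Python sketch, using the Yahoo figures quoted in the table (not live data), makes the calculation explicit:

```python
# Link-to-page ratios for the four sites discussed in the article.
# Figures are the Yahoo counts quoted above, not freshly pulled data.
sites = {
    "NY Times": (31_700_000, 41_900_000),
    "Gawker":   (1_788_000,  9_227_000),
    "HuffPo":   (20_029_000, 20_328_000),
    "Mediaite": (123_000,    889_000),
}

for name, (pages, links) in sites.items():
    ratio = links / pages  # higher ratio = more inbound links earned per page
    print(f"{name:<10} {ratio:.2f}")
```

A site can have the most inbound links in absolute terms (NY Times) and still land at the bottom of this quality ranking once its page count is taken into account.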

So the sites with the worst quality are actually the NY Times and the Huffington Post. They certainly have more inbound links than anyone else (a good thing). However, they also have much more content and the worst ratios of inbound links to content. The best quality can actually be found at relative newcomer Mediaite and the more established Gawker. There are several reasons for this, but one of the biggest is having unique, compelling content (though Gawker has several sister domains that help boost its link count).

Google's Goal

The goal of the folks at Google is simple: display the very best unique, compelling search results, and web visitors will just keep coming back to Google. No algorithm can read and curate the content on the page, so instead they rely on the votes (inbound links) cast from other domains. The better the quality of the domain casting the vote, the better the quality of the vote.
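The "weighted votes" idea can be illustrated with a toy version of the PageRank calculation. This is a simplified sketch of the published algorithm, not Google's actual implementation; the example graph and function names are my own:

```python
# Toy PageRank: each page splits its score evenly among the pages it
# links to, so a vote from a high-scoring page is worth more.
DAMPING = 0.85  # standard damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outs in links.items():
            for target in outs:
                # A page's vote is its own rank divided among its outlinks.
                new[target] += DAMPING * rank[page] / len(outs)
        rank = new
    return rank

# "hub" is linked to by both other pages, so it earns the highest score;
# "b" receives no links at all and ends up with the lowest.
graph = {"a": ["hub"], "b": ["hub"], "hub": ["a"]}
print(pagerank(graph))
```

Note that the score a page passes along depends on its own score, which is exactly why a link from a high-quality domain counts for more than one from a content farm.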

AOL is tapping Arianna Huffington to guide the AOL content strategy going forward. They have chosen poorly…