It’s evident that Google wants forum administrators to work harder on managing the user-generated content Googlebot ‘rates’ as part of your site. User-generated content is counted as part of the page, and these comments are taken into consideration when Google rates the page.
Think about providing a way for users to report a broken link. Make sure it does not inadvertently produce pages with little to no unique content on them. If you are forced to employ these types of pages, you need to do it in a better way.
Beware of using client-side JavaScript to render pages, as some content served using client-side JavaScript is not readable by Googlebot. If you believe in ‘first link priority’, you are going to have to take it into account when creating the main navigation system that appears on every page, and where that sits in the template. Do not link to low-quality pages on your site or off-site. Ensure all pages on your site are linked to from other pages on your site. High-quality supplementary content should “contribute to a satisfying user experience on the page and website”, and it should NOT interfere with or distract from the MC. Webmasters need to be careful when optimising a website for CONVERSION first if that gets in the way of a user’s consumption of the main content on the page.
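The client-side JavaScript point is easy to sanity-check: fetch the raw server response and see whether a phrase you expect on the page is present in the HTML before any scripts run. A minimal sketch using only the Python standard library; the URL, user-agent string and phrase are illustrative placeholders, not anything specific to your site:

```python
# Sketch: check whether a key phrase appears in the raw, server-rendered HTML
# a crawler receives before any client-side JavaScript runs.
from urllib.request import Request, urlopen

def phrase_in_html(html: str, phrase: str) -> bool:
    """Case-insensitive check for the phrase in already-fetched markup."""
    return phrase.lower() in html.lower()

def phrase_in_raw_page(url: str, phrase: str) -> bool:
    """Fetch the raw HTML (no JavaScript executed) and look for the phrase.

    If this returns False but the phrase is visible in a browser, the content
    is probably injected client-side and may not be readable by every crawler.
    """
    # Placeholder user-agent; swap in whatever identifies your tool.
    req = Request(url, headers={"User-Agent": "raw-html-check/0.1"})
    with urlopen(req, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return phrase_in_html(html, phrase)
```

For example, `phrase_in_raw_page("https://example.com/", "opening hours")` returning False while the phrase shows in a browser is a hint that the content depends on client-side rendering.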
If you run an affiliate website or have content that appears on other sites, this is even more important. This article aims to cover the most significant challenges of writing ‘SEO-friendly’ text and web page copy for Google. High-quality content is one aspect of a high-quality page on a high-quality site. Content quality is one area to focus on if you are to avoid demotion in Google. This concept is a bit like a leaky or reversed version of PageRank applied ON-SITE. In the original patent, I believe pages did not ‘lose’ PR; they ‘donated’ PR to other pages in the ‘set’.
Moz has a good video on the Google organic quality score theory. It goes into a lot of stuff I have been blogging about for the last few years. Make sure Google can crawl your website and index and rank all your primary pages by only serving Googlebot high-quality, user-friendly and fast-loading pages to index. I think that advice is relevant for any site with lots of content.
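On the crawlability point, one quick check is that robots.txt is not accidentally blocking Googlebot from your primary pages. Python's standard-library `urllib.robotparser` can test this; the rules and paths below are illustrative, and in real use you would point the parser at your live robots.txt with `set_url()` and `read()`:

```python
# Sketch: verify Googlebot can fetch key URLs under a given robots.txt,
# using only the standard library. Rules and paths are illustrative.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Paths you expect Google to crawl, plus one you expect to be blocked.
for path in ["/", "/products/widget", "/checkout/"]:
    allowed = parser.can_fetch("Googlebot", "https://example.com" + path)
    print(path, "allowed" if allowed else "BLOCKED")
```

Here `/` and `/products/widget` come back allowed while `/checkout/` is blocked; if a primary page shows up as BLOCKED, fix robots.txt before worrying about anything else.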
Helpful SC is content that is specifically targeted to the content and purpose of the page. Most of this advice is relevant to the desktop version of your site, and has actually been removed from recent quality rater guidelines, but I think this is still worth noting, even with mobile-first indexing. Write naturally and include the keyword phrase once or twice on-page. Slow load times are a primary reason visitors abandon a checkout process. The average load time for mobile sites is 19 seconds over 3G connections. Models predict that publishers whose mobile sites load in 5 seconds earn up to 2x more mobile ad revenue than those whose sites load in 19 seconds.
A 2-second delay in load time resulted in abandonment rates of up to 87%. The web is changing very fast, and a fast website is a good user experience. Consider linking to important pages on your site from your home page and from other important pages. NOTE – If a page exists only to make money from Google’s free traffic, Google calls this spam. If you are just starting out, don’t think you can fool Google about everything all the time.
When you expand the content on a website, you apparently dilute the ‘value’ you already have, “all else being the same“. I have long thought this was the direction we were heading in. Google probably has a quality score of some sort, and your site probably has some sort of rating, too. Deleting content is not always the optimal way to handle MANY types of low-quality content – far from it, in fact. Nuking it is the last option, unless the pages really are ‘dead‘ content. We are dealing with algorithms designed to target old-style SEO – which focused on the truism that DOMAIN ‘REPUTATION’ plus LOTS of PAGES equals LOTS of keywords equals LOTS of Google traffic. But the explanation of the quality score is a good introduction for beginners.
To avoid creating pages that might be considered thin content in 6 months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold all this together into a single topic-centred page helping a user to understand something related to what you sell. A website that does not link out to ANY other website could accurately be interpreted as, at the least, self-serving. I can’t think of a website that is the true end-point of the web. However, we do expect websites of large companies and organizations to put a great deal of effort into creating a good user experience on their website, including having helpful SC. For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating. Smaller websites such as websites for local businesses and community organizations, or personal websites and blogs, may need less SC for their purpose.
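One rough way to find the ‘thinner’ pages worth folding into a single topic page is a simple word-count pass over each page’s main content. A sketch, assuming placeholder page copy and an arbitrary 200-word threshold (neither figure comes from the article):

```python
# Sketch: flag 'thin' pages that are candidates for folding into one
# topic-centred page. Page copy and the 200-word cut-off are illustrative.
import re

def word_count(text: str) -> int:
    """Count word-like tokens in the main content."""
    return len(re.findall(r"\b\w+\b", text))

pages = {
    "/widgets-intro": "A few short sentences about widgets.",
    "/widgets-sizes": "Another very short page about widget sizes.",
    "/widgets-guide": " ".join(["word"] * 500),  # stands in for a long guide
}

THIN_THRESHOLD = 200  # arbitrary cut-off for 'thin' main content
thin = [url for url, body in pages.items() if word_count(body) < THIN_THRESHOLD]
print("Fold these into one topic page:", thin)
```

In this toy data the two short pages are flagged and the long guide is not; in practice you would feed it the extracted main content of real URLs, not raw HTML.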
NOTE – The ratio of duplicate content on any page is going to hurt you if you have more duplicate text than unique content. A simple check of the pages, page to page, on the site is all that’s needed to ensure each page is DIFFERENT page-to-page. The number 1 way to do ‘copywriting‘ will be to edit the actual page copy to continually add unique content and improve its accuracy, uniqueness, relevance, succinctness and usefulness. Google’s raters are given examples of what to avoid: copying an entire report from an encyclopedia, paraphrasing content by changing words or sentence structure here and there, or filling a report with large pictures or other distracting content. You must avoid boilerplate text, spun text or duplicate content when creating pages – or you are Panda bamboo – as Google hinted at in the 2015 Quality Rater’s Guide. Optimising low-quality pages without value-add is self-defeating, now that the algorithms – and manual quality-rating efforts – have got that stuff nailed down.
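That ‘simple check of the pages, page to page’ can be approximated with the standard-library `difflib`: compare each pair of pages and flag any whose main content is mostly the same text. A sketch with illustrative page texts and an arbitrary 0.8 similarity threshold:

```python
# Sketch: a rough page-to-page duplicate-text check using stdlib difflib.
# The texts and the 0.8 threshold are illustrative assumptions.
from difflib import SequenceMatcher
from itertools import combinations

pages = {
    "/a": "Our widget service covers London and the surrounding areas.",
    "/b": "Our widget service covers Leeds and the surrounding areas.",
    "/c": "A completely different page about delivery times and returns.",
}

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1]; 1.0 means the two texts are identical."""
    return SequenceMatcher(None, a, b).ratio()

for (u1, t1), (u2, t2) in combinations(pages.items(), 2):
    score = similarity(t1, t2)
    if score > 0.8:
        print(f"{u1} vs {u2}: {score:.2f} - near-duplicate, rewrite one")
```

Here `/a` and `/b` differ only by a place name, so they score as near-duplicates – exactly the boilerplate ‘location page’ pattern the note above warns about – while `/c` passes. `SequenceMatcher` is quadratic in the worst case, so for a large site you would compare extracted main content, not whole pages.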
If we think ‘value’ is the new PR score, and a sitewide quality score is the primary way Google measures your site, then every page you add without adding value drags that score down. This certainly seems to be an ‘answer‘ to ‘domain authority‘ abuse and ‘doorway page‘ abuse. It is also going to make webmasters think twice about the type of “SEO friendly content” they publish.
If you have a lot of pages to address, the main priority is to create a UNIQUE couple of paragraphs of text, at least, for the MC. They just need to MEET A SPECIFIC USER INTENT and not TRIP ‘LOW_QUALITY’ FILTERS.
If you run a legitimate online business, you’ll want to ensure your website never looks obviously out of date. Consider also the Distance Selling Regulations, which contain other information requirements for online businesses that sell to consumers. If your business has a VAT number, it should be stated even if the website is not being used for e-commerce transactions. As long as a page renders as a browser-compatible document, it appears Google can read it these days. I prefer PHP these days, even with flat documents, as it is easier to add server-side code to that document if I want to add some sort of function to the site.