QUOTE: “The preferred domain is the one that you would like used to index your site’s pages (sometimes this is referred to as the canonical domain). Links may point to your site using both the www and non-www versions of the URL (for instance, http://www.example.com and http://example.com). The preferred domain is the version that you want used for your site in the search results.” Google, 2018
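To illustrate the idea, here is a minimal Python sketch (not part of the quoted guidance) that normalises incoming URLs onto a single preferred host. Treating www.example.com as the preferred domain is purely an assumption for the example.

```python
# Minimal sketch: rewrite URLs that point at an alias host onto the preferred
# (canonical) domain. "www.example.com" is an assumed preference, not a rule.
from urllib.parse import urlparse, urlunparse

PREFERRED_HOST = "www.example.com"
ALIASES = {"example.com", "www.example.com"}

def canonicalize(url):
    """Return the URL rewritten onto the preferred host if it uses an alias."""
    parts = urlparse(url)
    if parts.netloc.lower() in ALIASES:
        parts = parts._replace(netloc=PREFERRED_HOST)
    return urlunparse(parts)

print(canonicalize("http://example.com/page?x=1"))
# -> http://www.example.com/page?x=1
```

In practice you would pair something like this with a permanent (301) redirect on the server, so both versions of the URL resolve to the one you want indexed.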

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] Google holds a similarly dominant market share in a number of other countries.
Critics will point out that the higher the cost of expert SEO, the more cost-effective AdWords becomes, but AdWords will only get more expensive, too. At some point, if you want to compete online, you’re going to HAVE to build a quality website with a unique offering that satisfies returning visitors – the sooner you start, the sooner you’ll start to see results.
If you take money online, in any way, you NEED to have an accessible and satisfying ‘customer service’ type page. Google says, “Contact information and customer service information are extremely important for websites that handle money, such as stores, banks, credit card companies, etc. Users need a way to ask questions or get help when a problem occurs. For shopping websites, we’ll ask you to do some special checks. Look for contact information—including the store’s policies on payment, exchanges, and returns.” Google urges quality raters to be a ‘detective’ in finding this information about you – so it must be important to them.
QUOTE: “Keyword Stuffed” Main Content Pages may be created to lure search engines and users by repeating keywords over and over again, sometimes in unnatural and unhelpful ways. Such pages are created using words likely to be contained in queries issued by users. Keyword stuffing can range from mildly annoying to users, to complete gibberish. Pages created with the intent of luring search engines and users, rather than providing meaningful MC to help users, should be rated Lowest.” Google Search Quality Evaluator Guidelines 2017
One of its content headers and SEO page titles is: Why Understanding the Four Major Learning Styles Will Help You Reach More Employees This Open Enrollment. The highest ranking for that page is 23rd for the phrase “benefits of learning styles” (30 monthly searches), which appears on the third page of Google. Maybe it’s good content – just not an effective string of words for SEO.
Yes, you need to build links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling the actions it takes a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action against your site for paid links – they call this a ‘Manual Action’ and you will get notified about it in Webmaster Tools if you sign up.
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
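As a hypothetical illustration of that root → topic listing → specific page structure, here is a short Python sketch; the category and product names are made up for the example.

```python
# Hypothetical sketch: a root -> category listing -> specific page hierarchy
# expressed as URL paths. Category pages describe the pages beneath them.
site = {
    "widgets": ["blue-widget", "red-widget"],
    "gadgets": ["mini-gadget"],
}

def list_paths(root="/"):
    paths = [root]                                   # home / root page
    for category, products in site.items():
        paths.append(f"{root}{category}/")           # related-topic listing page
        paths += [f"{root}{category}/{p}/" for p in products]  # specific pages
    return paths

for path in list_paths():
    print(path)
```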

Mobile-first design has been a best practice for a while, and Google is finally about to support it with mobile-first indexing. Learn how mobile-first indexing will give digital marketers their first real swing at influencing Google’s new AI (Artificial Intelligence) landscape. Marketers who embrace an accurate understanding of mobile-first indexing could see a huge first-mover advantage, similar to the early days of the web, and we all need to be prepared.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
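A short Python sketch using the standard library’s robotparser makes the point: only polite crawlers consult these rules, and anyone else can simply request the URL directly. The robots.txt address and blocked path below are placeholders.

```python
# Sketch: robots.txt is advisory. A well-behaved crawler checks it (as below),
# but nothing stops a rogue client or a curious user from fetching the URL anyway.
# The robots.txt URL and blocked path are illustrative placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser("https://www.example.com/robots.txt")
rp.read()                                      # fetch and parse the rules

blocked_url = "https://www.example.com/private/report.html"
print(rp.can_fetch("Googlebot", blocked_url))  # a polite bot would skip this URL

# Anyone can still request blocked_url directly; use authentication or a
# noindex response if the content must genuinely stay out of reach.
```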

The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google has begun experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.


Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
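As a rough illustration of that point, here is a small standard-library Python sketch that flags images used as links with no alt text; the markup it parses is invented for the example.

```python
# Sketch: flag <img> tags used inside links that have no alt text, since the alt
# attribute on an image link is treated much like the anchor text of a text link.
from html.parser import HTMLParser

class ImageLinkAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            self.in_link = True
        elif tag == "img" and self.in_link and not attrs.get("alt"):
            self.missing.append(attrs.get("src", "(no src)"))

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False

checker = ImageLinkAltChecker()
checker.feed('<a href="/dogs"><img src="/img/puppy.jpg"></a>')
print(checker.missing)   # ['/img/puppy.jpg'] – image links that need alt text
```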
Search engine optimization is a method for sustainably influencing search engine rankings. Google and other search engines calculate their search results for keywords using highly complex algorithms. The individual ranking factors and their weighting within the ranking calculation are well-guarded intellectual property that belongs to the search engines and is not publicly disclosed.

The existing content may speak to core audiences, but it isn’t producing many strong organic results. For example, the content header Capitalizing on the Right Skills at the Right Time With Business Agility may seem OK, but it doesn’t include a keyword phrase within striking distance. The lengthy URL doesn’t help matters. Extraneous words prevent any focus and the URL is bogged down by “business” and “agility” duplication:
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
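Here is a minimal Python sketch of that idea: render user-submitted comment links with rel="nofollow" (plus the newer "ugc" hint) so the page is not vouching for them. The helper name and example URL are made up.

```python
# Sketch: render user-submitted comment links with rel="nofollow ugc" so the
# page does not pass its reputation to them. Escaping keeps the output safe.
from html import escape

def render_comment_link(url, text):
    return '<a href="{}" rel="nofollow ugc">{}</a>'.format(
        escape(url, quote=True), escape(text))

print(render_comment_link("http://spammy.example", "great post!!"))
# <a href="http://spammy.example" rel="nofollow ugc">great post!!</a>
```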
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
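As a rough sketch, this standard-library Python snippet pulls the description meta tag out of a page and sanity-checks its length; the 160-character cutoff is only an illustrative figure, not a Google rule.

```python
# Sketch: extract the description meta tag and check its length.
# 160 characters is used purely as an illustrative cutoff.
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content", "")

parser = MetaDescriptionParser()
parser.feed('<head><meta name="description" content="Hand-made wooden clocks."></head>')

if not parser.description:
    print("No description meta tag – Google will pick its own snippet text.")
elif len(parser.description) > 160:
    print("Description is long and may be truncated in the snippet.")
else:
    print("Description found:", parser.description)
```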
Google and Bing each use a crawler (Googlebot and Bingbot) that spiders the web looking for new links and pages. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site if all your pages are linked together. If your website has an XML sitemap, for instance, Google will use it to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE. Google will crawl and index every single page on your site – even pages not listed in an XML sitemap.
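To make the sitemap point concrete, here is a minimal Python sketch that writes a bare-bones sitemap.xml with the standard library; the URLs are placeholders, and remember the file is a hint to crawlers, not a gatekeeper.

```python
# Sketch: generate a minimal XML sitemap with the standard library. A sitemap is
# inclusive, not exclusive – Google can still crawl linked pages it doesn't list.
# The URLs below are placeholders.
import xml.etree.ElementTree as ET

urls = [
    "https://www.example.com/",
    "https://www.example.com/widgets/",
    "https://www.example.com/widgets/blue-widget/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = loc

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(open("sitemap.xml").read())
```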
He is the co-founder of NP Digital. The Wall Street Journal calls him a top influencer on the web, Forbes says he is one of the top 10 marketers, and Entrepreneur Magazine says he created one of the 100 most brilliant companies. Neil is a New York Times bestselling author and was recognized as a top 100 entrepreneur under the age of 30 by President Obama and a top 100 entrepreneur under the age of 35 by the United Nations.
QUOTE: “Either porn SC or Ads containing porn on non-Porn pages can be very distracting or even upsetting to users. Please refresh the page a few times to see the range of Ads that appear, and use your knowledge of the locale and cultural sensitivities to make your rating. For example, an ad for a model in a revealing bikini is probably acceptable on a site that sells bathing suits. However, an extremely graphic porn ad may warrant a Low (or even Lowest) rating.” Google Search Quality Evaluator Guidelines
In short, nobody is going to advise you to create a poor UX, on purpose, in light of Google’s algorithms and human quality raters who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as frustrating UX – although on certain levels what Google classes as ‘UX’ might be quite far removed from what a UX professional is familiar with, in the same way that Google’s mobile rating tools differ from, for instance, W3C mobile testing tools.

All you need to notice from this kind of article is what I, and most other newbies, get wrong when focusing on SEO link building. I have seen many bloggers spending time on different ways of building links instead of adding value to their content and promoting it socially. You may call it ignoring Google, but we all know that the Google bot doesn’t ignore anchored dofollow or nofollow backlinks when calculating your PageRank.


Thank you for the great checklist. It is really useful and gave us a couple of new hints on the optimization process. We’ve been using WebSite Auditor and found it extremely helpful for site structure and content analysis, bringing all the statistical information on validation errors, social mentions, duplicate content, etc. under one roof. Still, there is so much different SEO information that you sometimes cannot work out where to start. Fortunately, you answered this question. Thank you again!