Note that Google is pretty good these days at removing any special characters you have in your page title – and I would be wary of trying to make your title or Meta Description STAND OUT using special characters. That is not what Google wants, evidently, and they do give you a further chance to make your search snippet stand out with RICH SNIPPETS and SCHEMA mark-up.
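For example, here is a minimal sketch of what that schema mark-up can look like, using JSON-LD. The product name, rating and review count below are invented placeholders, not values to copy:

```html
<!-- Illustrative Schema.org mark-up in JSON-LD; all values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```

Mark-up like this is what makes a page eligible for rich results (review stars, for instance) under the snippet – something special characters in the title will never achieve.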

In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings from the practice; Google's new system, however, punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query, rather than to a few individual words.[39] With regard to the changes this made to search engine optimization for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
In short, nobody is going to advise you to create a poor UX on purpose, in light of Google’s algorithms and human quality raters, who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as frustrating UX – although on certain levels what Google classes as ‘UX’ might be quite far apart from what a UX professional is familiar with, in the same way that Google’s mobile rating tools differ from, for instance, W3C mobile testing tools.
Expertise and authoritativeness of a site increase its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources can help users recognise an article’s expertise. Representing well-established consensus on scientific topics is good practice where such consensus exists.
Repeat this exercise for as many topic buckets as you have. And remember, if you're having trouble coming up with relevant search terms, you can always head on over to your customer-facing colleagues -- those who are in Sales or Service -- and ask them what types of terms their prospects and customers use, or common questions they have. Those are often great starting points for keyword research.
Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and content on mobile as well as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link-elements, and other meta-tags - on all versions of the pages.
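If you serve a separate mobile site (for example, on an m. subdomain) rather than a responsive one, the documented way to tie the two versions together is the rel="alternate"/rel="canonical" annotation pair – a sketch, with example.com as a placeholder domain:

```html
<!-- On the desktop page (http://www.example.com/page-1): -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/page-1">

<!-- On the corresponding mobile page (http://m.example.com/page-1): -->
<link rel="canonical" href="http://www.example.com/page-1">
```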
However, we do expect websites of large companies and organizations to put a great deal of effort into creating a good user experience on their website, including having helpful SC. For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.

QUOTE: “The preferred domain is the one that you would like used to index your site’s pages (sometimes this is referred to as the canonical domain). Links may point to your site using both the www and non-www versions of the URL (for instance, http://www.example.com and http://example.com). The preferred domain is the version that you want used for your site in the search results.” Google, 2018
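Alongside any redirects you set up, you can declare the preferred version in the page itself with a canonical link element – a minimal sketch, assuming www is your preferred domain and example.com is a placeholder:

```html
<!-- In the head of every page, pointing at the preferred (www) URL: -->
<link rel="canonical" href="http://www.example.com/this-page/">
```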


The last time I looked, Google displayed as many characters as it could fit into a block element about 600px wide that doesn’t exceed one line of text (on desktop). So – THERE IS NO BEST PRACTICE AMOUNT OF CHARACTERS any SEO could lay down as exact best practice to GUARANTEE a title will display in full, in Google at least, as the search snippet title on every device. Ultimately, only the characters and words you use will determine whether your entire page title is seen in a Google search snippet.
Google is a link-based search engine. Google doesn’t need content to rank pages, but it needs content to give to users. Google needs to find content, and it finds content by following links, just as you do when clicking on one. So you first need to tell the world about your site so that other sites link to yours. Don’t worry about reciprocating links to more powerful sites, or even to real sites – I think all of this adds to your domain authority, which is better to have than ranking for just a few narrow key terms.
In particular, the Google web spam team is currently waging a PR war on sites that rely on unnatural links and other ‘manipulative’ tactics (and handing out severe penalties if it detects them). And that’s on top of many algorithms already designed to look for other manipulative tactics (like keyword stuffing or boilerplate spun text across pages).
Disney initially stated it wouldn’t exceed one million dollars in donations, but ended up donating two million after the campaign blew up. The #ShareYourEars campaign garnered 420 million social media impressions and increased Make-A-Wish’s social media reach by 330%. The campaign is a powerful example of using an internet marketing strategy for a good cause: #ShareYourEars raised brand awareness, cultivated a connected online community, and positively affected Disney’s brand image.

Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These create a very poor user experience, and users may feel tricked or confused. We will call these “sneaky redirects”. Sneaky redirects are deceptive and should be rated Lowest.
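For comparison, a legitimate redirect takes the user exactly where they expected to go. Server-side 301 redirects are the usual tool for a permanent move; where you cannot touch server configuration, an HTML meta refresh is one client-side fallback – a sketch, with a placeholder destination URL:

```html
<!-- A client-side redirect, used only when a server-side 301 isn't possible;
     the destination must match what the user expected to reach. -->
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page/">
```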

Do you regularly publish helpful, useful articles, videos or other types of media that are popular and well produced? Do you write for actual human beings rather than for the search engine itself? Well, you should. The latest research from Searchmetrics on ranking factors indicates that Google is moving further towards longer-form content that understands a visitor’s intention as a whole, instead of relying on keywords based on popular search queries to create content.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] Similar market shares have been recorded in a number of other countries.
Sometimes Google turns up the dial on its demands for ‘quality’, and if your site falls short, a website traffic crunch is assured. Some sites invite problems by ignoring Google’s ‘rules’, and some sites inadvertently introduce technical problems after the date of a major algorithm update and are then negatively impacted by later refreshes of the algorithm.
QUOTE: “To summarize, a lack of helpful SC may be a reason for a Low quality rating, depending on the purpose of the page and the type of website. We have different standards for small websites which exist to serve their communities versus large websites with a large volume of webpages and content. For some types of “webpages,” such as PDFs and JPEG files, we expect no SC at all.” Google Search Quality Evaluator Guidelines 2017
Many think that Google won’t allow new websites to rank well for competitive terms until the web address “ages” and acquires “trust” in Google – I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while, then disappear for months. A “honeymoon period” to give you a taste of Google traffic, perhaps, or a period for Google to better gauge your website’s quality from an actual user perspective.
I used to think it could take more to get a subfolder trusted than, say, an individual file, and I guess this swayed me towards using files on most websites I created (back in the day). Once subfolders are trusted, it’s six of one, half a dozen of the other what the actual difference is in terms of ranking in Google – usually, rankings in Google are determined more by how RELEVANT or REPUTABLE a page is for a query.
The above information does not need to feature on every page – rather, on a clearly accessible page. However, with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high-quality websites post), ANY signal you can send to an algorithm or a human reviewer’s eyes that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course).
And so on and so on. The point of this step isn't to come up with your final list of keyword phrases. You just want to end up with a brain dump of phrases you think potential customers might use to search for content related to that particular topic bucket. We'll narrow the lists down later in the process so you don't have something too unwieldy. Once you have your final list, there are several data-driven tools available to you for finding out which keywords you're most likely to rank well for. 

Ever wonder how major search engines such as Google, Bing and Yahoo rank your website within their searches? Or how content such as videos or local listings is shown and ranked based on what the search engine considers most relevant to users? Welcome to the world of Search Engine Optimization (SEO). This course is the first within the SEO Specialization and it is intended to give you a taste of SEO. You will be introduced to the foundational elements of how search engines work, how the SEO landscape has changed and what you can expect in the future. You will discuss core SEO strategies and tactics used to drive more organic search results to a specific website or set of websites, as well as tactics to avoid in order to prevent penalization by search engines. You will also discover how to position yourself for a successful career in SEO, should this subject prove interesting to you. We hope this taste of SEO will entice you to continue through the Specialization.
QUOTE: “I think it makes sense to have unique content as much as possible on these pages but it’s not not going to like sink the whole website if you don’t do that we don’t penalize a website for having this kind of deep duplicate content and kind of going back to the first thing though with regards to doorway pages that is something I definitely look into to make sure that you’re not running into that so in particular if this is like all going to the same clinic and you’re creating all of these different landing pages that are essentially just funneling everyone to the same clinic then that could be seen as a doorway page or a set of doorway pages on our side and it could happen that the web spam team looks at that and says this is this is not okay you’re just trying to rank for all of these different variations of the keywords and the pages themselves are essentially all the same and they might go there and say we need to take a manual action and remove all these pages from search so that’s kind of one thing to watch out for in the sense that if they are all going to the same clinic then probably it makes sense to create some kind of a summary page instead whereas if these are going to two different businesses then of course that’s kind of a different situation it’s not it’s not a doorway page situation.” John Mueller, Google
Once you have your keyword list, the next step is actually implementing your targeted keywords into your site’s content. Each page on your site should be targeting a core term, as well as a “basket” of related terms. In his overview of the perfectly optimized page, Rand Fishkin offers a nice visual of what a well (or perfectly) optimized page looks like.
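In rough mark-up terms, a page built along those lines might look something like the sketch below – the store, URLs and copy are invented for illustration, and this is a simplification rather than Fishkin’s full diagram:

```html
<!-- A page targeting the core term "dog food reviews" plus related phrases;
     all names and URLs are invented placeholders. -->
<head>
  <title>Dog Food Reviews and Comparisons | Example Pet Store</title>
  <meta name="description"
        content="Independent dog food reviews, comparisons and rankings for popular brands.">
</head>
<body>
  <h1>Dog Food Reviews</h1>
  <img src="/images/dog-food-reviews.jpg" alt="Comparing popular dog food brands">
  <p>Body copy that naturally works in the core term and related phrases
     such as "dog food comparison" and "dog food rankings"...</p>
</body>
```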
One concern we hear frequently is whether it is beneficial or harmful to repeat keywords. In other words, should we vary keywords (dog food, puppy food, and Purina) or repeat keywords (dog food reviews, dog food comparison, and dog food rankings)? The short answer is that repetition is just fine, as long as the meaning of the phrase as a whole is sufficiently varied. In other words, dog food and dog food online are basically synonymous, and the content that one might expect to find associated with both keywords is the same. However, dog food reviews and dog food comparison indicate somewhat different content and are therefore appropriate to use in tandem as keywords.
QUOTE: “Search engine optimization is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site’s user experience and performance in organic search results.” Google Starter Guide, 2008
This Specialization will teach you to optimize website content for the best possible search engine ranking. You'll learn the theory behind Google search and other search engine algorithms; you'll also build practical, real-world skills that you can apply to a career in digital marketing or online content development, including on-page and off-page optimization, optimizing for local and international audiences, conducting search-focused website audits, and aligning SEO with overall business strategies. Each course is intended to build on the skills from the previous course, thus we recommend you take the courses in the order they are listed. The Specialization culminates in a hands-on Capstone Project, in which you will apply your skills to a comprehensive SEO consulting task.
I think the anchor text of links in internal navigation is still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don’t underestimate the value of a clever keyword-rich internal link architecture, and be sure to understand, for instance, how many words Google counts in a link – but don’t overdo it. Too many links on a page can be seen as a poor user experience. Avoid lots of hidden links in your template navigation.

A page title that is highly relevant to the page it refers to will maximise usability, search engine ranking performance and user experience ratings, as Google measures these. It will probably be displayed in a web browser’s window title bar, in bookmarks, and in the clickable search snippet links used by Google, Bing and other search engines. The title element is the “crown” of a web page, with your important keyword phrase featuring AT LEAST ONCE within it.
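As a reminder of where it lives, a minimal sketch (the title text here is an invented example):

```html
<!-- The title element sits in the head of the document; the important
     keyword phrase features once, near the front. -->
<head>
  <title>Buy Children's Shoes Online | Example Store</title>
</head>
```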
Google decides which pages on your site are important or most relevant. You can help Google by linking to your important pages and ensuring at least one page is well optimised amongst the rest of your pages for your desired key phrase. Always remember Google does not want to rank ‘thin’ pages in results – any page you want to rank – should have all the things Google is looking for. That’s a lot these days!

Effective anchor text should be used to help users navigate your website and find what they are looking for. It should also include keywords and phrases related to what you do. If you own a shoe store, for example, the words, “Check out our selection of children’s shoes,” on your homepage can link via anchor text to your online store that is stocked full of – you guessed it – children’s shoes.
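In mark-up, that example looks something like this sketch (the URL is a placeholder):

```html
<!-- Descriptive, keyword-relevant anchor text on the homepage: -->
<p>Check out our selection of
   <a href="/shop/childrens-shoes/">children's shoes</a>.</p>
```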


QUOTE: “Anytime you do a bigger change on your website if you redirect a lot of URLs or if you go from one domain to another or if you change your site’s structure then all of that does take time for things to settle down so we can follow that pretty quickly we can definitely forward the signals there but that doesn’t mean that’ll happen from one day to next” John Mueller, Google 2016
QUOTE: “So if you have different parts of your website and they’re on different subdomains that’s that’s perfectly fine that’s totally up to you and the way people link across these different subdomains is really up to you I guess one of the tricky aspects there is that we try to figure out what belongs to a website and to treat that more as a single website and sometimes things on separate subdomains are like a single website and sometimes they’re more like separate websites for example on on blogger all of the subdomains are essentially completely separate websites they’re not related to each other on the other hand other websites might have different subdomains and they just use them for different parts of the same thing so maybe for different country versions maybe for different language versions all of that is completely normal.” John Mueller 2017
For example, let’s say you were running the website of an online pet store. You might be wise to create one keyword grouping for all your dog-related products, then one for all of your parakeet-related products, etc. The next step would be to segment each individual group into smaller subgroups (parakeet cages, parakeet toys, parakeet snacks) and then even smaller groups for each type of product (low-fat parakeet snacks, luxury parakeet snacks… you get the idea). Now your pet store can create individual pages optimized for each small keyword group, as sketched below.
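The resulting structure could then be reflected in navigation, with one landing page per small keyword group – a sketch with invented URLs:

```html
<!-- One optimised landing page per keyword subgroup: -->
<ul>
  <li><a href="/parakeets/cages/">Parakeet cages</a></li>
  <li><a href="/parakeets/toys/">Parakeet toys</a></li>
  <li><a href="/parakeets/snacks/">Parakeet snacks</a>
    <ul>
      <li><a href="/parakeets/snacks/low-fat/">Low-fat parakeet snacks</a></li>
      <li><a href="/parakeets/snacks/luxury/">Luxury parakeet snacks</a></li>
    </ul>
  </li>
</ul>
```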
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
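In practice, nofollowing a user-submitted link is just an attribute on the anchor – a sketch, with a placeholder commenter URL:

```html
<!-- A comment link marked rel="nofollow" so you don't vouch for the destination: -->
<p class="comment">
  Great post!
  <a href="http://example.com/commenter-site/" rel="nofollow">Sam's blog</a>
</p>
```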