Everything for Google these days comes down to “customer experience”. Page speed, a table of contents, the relevancy, length, and uniqueness of content, jump links, video in the content, relevant headers, fewer broken links, customized 404 pages, and so on are all indicative of a better customer experience on the website, and hence help you rank better.

Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
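As a rough illustration, here is a minimal Python sketch of such automatic generation; the first-paragraph heuristic and the ~160-character limit are assumptions for the example, not Google requirements.

```python
# Minimal sketch: derive a description meta tag from page content.
# The truncation length and the "first paragraph" heuristic are
# illustrative assumptions, not a prescribed approach.
import html
import re

def generate_description(page_text: str, max_len: int = 160) -> str:
    """Build a plain-text summary from the first paragraph of a page."""
    paragraphs = [p.strip() for p in page_text.split("\n\n") if p.strip()]
    summary = re.sub(r"\s+", " ", paragraphs[0]) if paragraphs else ""
    # Truncate on a word boundary so the snippet does not cut mid-word.
    if len(summary) > max_len:
        summary = summary[:max_len].rsplit(" ", 1)[0] + "..."
    return summary

def description_meta_tag(page_text: str) -> str:
    """Render the <meta> element, escaping HTML-sensitive characters."""
    content = html.escape(generate_description(page_text), quote=True)
    return f'<meta name="description" content="{content}">'

print(description_meta_tag("Hand-made cotton t-shirts shipped worldwide.\n\nMore copy..."))
```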

SEM is better for testing than SEO. Because you can turn SEM paid ads off and on immediately, you can quickly revise your ad copy, target new audiences, and change landing page content to test new tactics. This flexibility allows you to see the differences between your strategies right away. You cannot accomplish this through SEO, where changes take far longer to make and to show up in results.
And finally, the other really important bucket is authority. Google wants to show sites that are popular. If they can show the most popular t-shirt seller to people looking to buy t-shirts online, that’s the site they want to show. So you have to convince Google by sending them signals that your site is the most popular site for the kind of t-shirts you sell.

The errors in technical SEO are often not obvious, and therefore among the most common. Mistakes in robots.txt and 404 pages, pagination and canonical URLs, hreflang tags and 301 redirects, http vs https and www vs non-www versions: each of them can seriously undermine all your efforts to promote the site. One thorough technical SEO audit is usually enough to find and fix the main problems in this area.
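For instance, the http-vs-https and www-vs-non-www issues are usually fixed with permanent (301) redirects at the server level. Here is a minimal nginx sketch, assuming the canonical host is https://www.example.com; the domain and certificate paths are placeholders, not a prescribed setup.

```nginx
# Minimal sketch: canonicalize every request to https://www.example.com.
# example.com and the certificate paths are placeholders.

# Redirect all plain-http traffic, with or without www.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://www.example.com$request_uri;
}

# Redirect https on the bare domain to the www host.
server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/ssl/example.com.pem;
    ssl_certificate_key /etc/ssl/example.com.key;
    return 301 https://www.example.com$request_uri;
}
```

With this in place, every variant of a URL answers with a single 301 hop to one canonical address, which avoids splitting link signals across four versions of the same page.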
In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollow links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the usage of nofollow led to evaporation of PageRank. In order to avoid the above, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[32]
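For context, a nofollowed link is simply an ordinary anchor with a rel attribute; the URLs below are placeholders.

```html
<!-- A normal link can pass PageRank; adding rel="nofollow" asks
     search engines not to treat it as an endorsement. -->
<a href="https://example.com/partner">Partner site</a>
<a href="https://example.com/ad" rel="nofollow">Sponsored link</a>
```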

You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
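As a minimal sketch, a robots.txt like the one below would keep a hypothetical /internal-search/ directory out of crawlers while leaving the rest of the site open; all paths and the sitemap URL are illustrative placeholders.

```
# robots.txt lives at the root of the (sub)domain it governs.
# Paths below are illustrative placeholders.
User-agent: *
Disallow: /internal-search/   # low-value search-result pages
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```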
SEM is a broader term than SEO, and is used to encompass the different options available for using a search engine’s technology, including paid ads. SEM is often used to describe the activities associated with researching, submitting, and positioning a website within search engines. It includes things such as search engine optimization, paid listings, and other search-engine-related services and functions that increase exposure and traffic to your website.
In the last year, Google and Bing have both indicated a shift to entity-based search results as part of their evolution. Google has underscored this point with rich snippets and the Knowledge Graph, and Bing has now upped the ante on personal search results with Bing Snapshots. Find out how you can adopt strategies to stay ahead of the curve in the new world of semantic search results.
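Rich snippets are typically driven by structured data embedded in the page. Here is a minimal JSON-LD sketch marking up a product with schema.org vocabulary; the product name, price, and rating values are invented for the example.

```html
<!-- Minimal schema.org Product markup; all values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Hand-printed cotton t-shirt",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```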
The depth of your articles impresses and amazes me. I love all the specific examples and tool recommendations. You discuss the importance of backlinks. How important is it to use a tool to list you on directories (Yext, Moz Local, Synup or JJUMP)? Will Google penalize you for listing on unimportant directories? Is it better to avoid these tools and get backlinks one at a time and avoid all but a few key directories?
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[54] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[55]
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well Google's algorithms render and index your content. This can result in suboptimal rankings.
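If a broad Disallow would otherwise block rendering assets, you can carve them back out with more specific Allow rules, since Google applies the most specific matching rule. A minimal sketch with placeholder paths:

```
# Placeholder paths: keep CSS, JavaScript, and images crawlable
# even when a parent directory is otherwise blocked.
User-agent: Googlebot
Disallow: /assets/
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/images/
```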

Social media has a pivotal role – Last but not least, social media has evolved from a basic communication tool into a highly profitable marketing channel. Many users start their searches on social media and make their way to a business’s site. Sharing up-to-date, engaging, and personalized content will attract more people to your profile, and eventually to your website.

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated that Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user: depending on their history of previous searches, Google crafted results for logged-in users.[29]