A navigational page is a simple page on your site that displays the structure of your website, usually as a hierarchical listing of its pages. Visitors may turn to this page when they have trouble finding content on your site. While search engines will also visit it, and it gives them good crawl coverage of your pages, it's mainly aimed at human visitors.
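In its simplest form, a navigational page is just a nested list of links mirroring that hierarchy. A bare-bones sketch (the page names here are invented for illustration):

```html
<!-- A minimal navigational (HTML sitemap) page -->
<ul>
  <li><a href="/cards/">Cards</a>
    <ul>
      <li><a href="/cards/vintage/">Vintage cards</a></li>
      <li><a href="/cards/modern/">Modern cards</a></li>
    </ul>
  </li>
  <li><a href="/news/">News</a></li>
  <li><a href="/about/">About</a></li>
</ul>
```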

Great SEO is increasingly dependent on having a website with a great user experience. Making your user experience great requires carefully tracking what people do so that you always know where to improve. But what do you track? In this 15-minute talk, I’ll cover three effective and advanced ways to use event tracking in Google Analytics to understand a website's users.

One thing Google has indicated it likes to do is penalize sites, stores, and companies that consistently have poor reviews. If you have many poor reviews, Google will in time figure out not to show your site in its rankings, because Google doesn’t want to send searchers to those sites. So prove to Google’s algorithm that you are trustworthy: get other highly authoritative websites to link to you. Get newspaper articles, get industry links, get other trusted sites to link to you: partners, vendors, happy customers. Get them to link to your website to show that you are highly credible and trustworthy.
Your website is the “hub” of your online brand, so it’s important to have regular checkups to ensure everything is in order. It’s also important to note that your website is a living digital property; it’s typically not stagnant for long periods of time. In any given year, content is added to and/or removed from your site. It is for this reason that audits should occur on a regular basis. We recommend that websites be audited at least once per year, which allows your teams to fix critical issues as they arise.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher-PageRank page is more likely to be reached by the random web surfer.
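In its standard published form (not spelled out in the text above), that "quantity and strength of inbound links" idea is written as:

```latex
PR(p) = \frac{1-d}{N} + d \sum_{q \in B_p} \frac{PR(q)}{L(q)}
```

Here N is the total number of pages, B_p is the set of pages linking to p, L(q) is the number of outbound links on page q, and d is the damping factor (conventionally 0.85), which models the chance that the random surfer keeps clicking links rather than jumping to a random page.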

Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different sized snippets depending on how and where they search), and that it contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
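As an illustration (the site and wording here are made up), a description meta tag sits in the page's <head>:

```html
<head>
  <title>Example Cards - Vintage and Modern Baseball Cards</title>
  <meta name="description" content="Example Cards sells vintage and modern
    baseball cards, publishes daily card news, and lists upcoming trade shows.">
</head>
```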
Most small business owners and marketers know a little something about SEO (search engine optimization) and the different tactics to help your website rank well in organic search engine results. Another important tactic for any Internet business to know about is SEM (search engine marketing), which includes things such as search engine optimization, paid listings and other search engine related services.
Search engines use complex mathematical algorithms to interpret which websites a user seeks. Picture each website as a bubble in a graph of the web: programs sometimes called spiders examine which sites link to which other sites, with arrows representing those links. Websites getting more inbound links, or stronger links, are presumed to be more important and what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through": website C, even though it only has one inbound link, has an inbound link from a highly popular site (B), while site E does not.
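A minimal PageRank sketch makes this concrete. The toy graph below is an educated guess at the diagram described above (B collects most inbound links, C's single inbound link comes from B, E's comes from the minor site D); it is an illustration, not the original data:

```python
# Power-iteration PageRank on a small, hypothetical link graph.
links = {
    "A": ["B"],
    "B": ["C"],
    "C": ["B"],
    "D": ["B", "E"],
    "E": ["B"],
    "F": ["B"],
}
damping = 0.85
nodes = list(links)
rank = {n: 1 / len(nodes) for n in nodes}

for _ in range(50):  # iterate until the scores (roughly) converge
    new_rank = {n: (1 - damping) / len(nodes) for n in nodes}
    for page, outlinks in links.items():
        if not outlinks:  # dangling page: spread its rank evenly (none here)
            for n in nodes:
                new_rank[n] += damping * rank[page] / len(nodes)
        else:
            for target in outlinks:
                new_rank[target] += damping * rank[page] / len(outlinks)
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

Running it, B scores highest; C comes second on the strength of its single link from B, while E trails far behind because its single link comes from the unpopular D.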
The audit is for all pages, not only one. What happens in the majority of cases is that pages and posts have similarities, so you can group them together. For example, the pages of a website may be OK while the blog post pages are missing titles. It’s a lot of work, especially for a 500-page website, but you can start from the most important pages first and work your way through the rest.
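Even a small script helps with that first pass. A minimal sketch (the URLs are placeholders; in practice you would feed in your sitemap) that flags pages missing a <title> tag:

```python
import urllib.request
from html.parser import HTMLParser

class TitleFinder(HTMLParser):
    """Records whether a <title> tag appears anywhere in the document."""
    def __init__(self):
        super().__init__()
        self.has_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.has_title = True

pages = [
    "https://example.com/",
    "https://example.com/blog/first-post",
]
for url in pages:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    finder = TitleFinder()
    finder.feed(html)
    if not finder.has_title:
        print("Missing <title>:", url)
```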
Hi Noya, all the info suggests that dwell time IS taken into account in search ranking, and we know that Google measures time on page and bounce rate in Analytics, too. Plus the search engine gets smarter all the time. With the machine learning component of RankBrain, we wouldn’t be surprised if Google can tell the difference between sites where visitors stick around, bounces where the visitor gets an answer immediately, and bounces where the visitor keeps searching.

Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.


I’m really grateful for your generous post, Brian. I’m definitely going to implement a TOC on some of my 4k+ word posts, where I’m trying to become the source. 😉 And I will also use the stats on some new posts. Thanks to you, I also researched big keywords, which I’d stayed away from, and found that many of the high-CPC and top-ranking articles are from 2014. Hoping some of my fresh new content helps rank me higher. Love what you do, sir!
What would be the purpose of, or reason for, moving back to a different URL? If it’s been a few years, I’d leave it alone unless you watched everything decline since moving to the main URL. Moving the forum to a new URL now would probably be a bit chaotic, not only for your main URL but for the forum itself… The only reason I could imagine moving the forum in this scenario would be if all those links were really awful and unrelated to the URL it currently sits on…
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
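As a concrete illustration (the paths are hypothetical), a robots.txt that keeps internal search results and cart pages away from crawlers might look like this:

```
# https://example.com/robots.txt
User-agent: *
Disallow: /search/   # internal search results
Disallow: /cart/     # shopping cart and other user-specific pages
```

The robots meta tag works per page instead and, unlike robots.txt, tells engines to drop a crawled page from the index rather than merely discouraging crawling.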

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[70][71]

Ego and assumptions led me to choose the wrong keywords for my own site. How did I spend three years optimizing my site and building links, finally crack the top three for six critical keywords, only to find out that I had wasted all that time? However, in spite of targeting the wrong words, Seer grew the business. In this presentation, Will shows you the mistakes made and shares the approaches that can help you build content that gets you thanked.
At Yoast, we practice what we call ‘holistic SEO’. This means that your primary goal should be to build and maintain the best possible website. Don’t try to fool Google, but use a sustainable long-term strategy. Ranking will come automatically if your website is of extremely high quality. Google wants to get its users to the right place, as its mission is to index all the world’s online information and make it universally accessible and useful.
When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".

“When I decided to take the plunge and bring an SEO partner onboard my web project, I thought it would be hard – no – impossible! As a not for profit site my budget was very tight, but then I found SEO Rankings. After explaining my situation and my goals Easy Internet Service worked with me to design a payment plan which meant I got everything I needed at a price I could afford. What’s more, they never once limited their support or assistance, and being new to the SEO field I had a lot to learn, but David from Easy Internet Services had answers and reassurance for all of my questions. This is why I recommend Easy Internet Services to all my friends, and I will continue to use them for as long as the internet exists.”


Another example where the “nofollow” attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check whether it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site which are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the “nofollow” attribute. If you create a widget for functionality or content that you provide, make sure to include the nofollow on links in the default code snippet.
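For instance (the vendor URL below is hypothetical), a widget-injected link can be neutralized like so:

```html
<!-- Link inserted by a third-party widget, not an editorial choice -->
<a href="https://widget-vendor.example/" rel="nofollow">Powered by ExampleWidget</a>
```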

Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
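One common way to consolidate those variants is a pair of 301 redirects onto a single canonical version. A sketch, assuming an nginx server and that https://example.com is your canonical choice (certificate directives omitted for brevity):

```nginx
# Send all http:// traffic to the canonical https://example.com
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}

# Send the https:// "www" variant there as well
server {
    listen 443 ssl;
    server_name www.example.com;
    # ssl_certificate / ssl_certificate_key go here
    return 301 https://example.com$request_uri;
}
```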
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
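A single broad disallow rule is often the culprit, because it hides CSS, JavaScript, and images along with everything else. A sketch (the directory names are assumptions) that blocks private pages while explicitly keeping page resources crawlable:

```
User-agent: Googlebot
Disallow: /private/
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/images/
```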
John Lincoln is CEO of Ignite Visibility, one of the top digital marketing agencies in the nation and a 2017, 2018 and 2019 Inc. 5000 company. Lincoln is consistently named one of the top marketing experts in the industry. He has been a recipient of the Search Engine Land "Search Marketer of the Year" award, named the #1 SEO consultant in the USA by Clutch.co, a most admired CEO, and a 40 Under 40 honoree. Lincoln has written two books (The Forecaster Method and Digital Influencer) and made two movies (SEO: The Movie and Social Media Marketing: The Movie) on digital marketing. He is a digital marketing strategy adviser to some of the biggest names in business.
There are other parts of SEO which you should pay attention to after your audit to make sure you stay competitive. After all, the technical foundation isn't the end of the road for SEO success. It's important to pay attention to your competition's SEO activity, keep an eye on the newest search engine best practices, and maintain local SEO best practices if your business depends on customers visiting a physical address. All of these are elements of a successful SEO strategy and should complement your audit and ongoing SEO maintenance.
The world is mobile today. Most people are searching on Google using a mobile device, and the desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.

And finally, the other really important bucket is authority. Google wants to show sites that are popular. If they can show the most popular t-shirt seller to people looking to buy t-shirts online, that’s the site they want to show. So you have to convince Google: send them signals that your site is the most popular site for the kind of t-shirts that you sell.
The depth of your articles impresses and amazes me. I love all the specific examples and tool recommendations. You discuss the importance of backlinks. How important is it to use a tool to list you on directories (Yext, Moz Local, Synup or JJUMP)? Will Google penalize you for listing on unimportant directories? Is it better to avoid these tools and get backlinks one at a time and avoid all but a few key directories?
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
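Sketched as a hierarchy (the categories are placeholders), that progression from general to specific might look like:

```
example.com/                      root page
├── cards/                        related topic listing
│   ├── cards/vintage/            specific topic
│   └── cards/modern/             specific topic
└── news/
```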

A recent study by Search Engine Watch showed that the top position on Google gets over 30% of the visitors for any given search. I have seen businesses with the worst-looking websites bring in client after client simply because they rank higher than their competition. You might know what I am talking about if you run a small business with a website that is not currently at number 1. Hiring a local SEO pro is a must for your business.
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]