I like the competition analysis tools; they provide paid and organic data, which gives me an idea of how to catch up to and outrank the immediate competition for my clients. They also provide data on potential traffic, which helps show clients the potential gains of the campaign. And with the marketing plan, I know what needs to be improved in order to get results for my clients.

Off-page SEO builds a website’s reputation and authority by connecting it to other high-quality websites. Off-page SEO techniques include link building (acquiring high-quality backlinks from other websites) and managing local listings and directory profiles. When many websites link to a brand’s website, it shows search engines that the brand’s website is trustworthy, reliable, and reputable, which increases its search rankings.
Hey Sharon, great post! Re. dwell time – I’ve read conflicting opinions, some saying that Google DOES consider it an ‘important’ ranking signal, and others saying that it doesn’t, because dwell time can sometimes be a misleading indicator of content quality. For example, when a user searches for something specific and finds the answer immediately in the recommended page (meaning that the content on the page is actually spot on), they return to the SERPs very quickly. I have been unable to locate any definitive statements (written or spoken) from anyone at Google suggesting that dwell time IS still a factor in ranking considerations, but it makes sense (to me, anyway) that it should be. Do you have any ‘proof’ one way or the other re. whether Google definitely considers dwell time or not?
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
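To illustrate, a robots.txt along these lines keeps rendering assets crawlable. This is only a sketch: the /assets/ paths below are hypothetical placeholders, so substitute your site's own directories.

    # robots.txt - allow crawlers to fetch the assets needed for rendering.
    # The /assets/ paths are hypothetical examples, not a prescription.
    User-agent: Googlebot
    Allow: /assets/css/
    Allow: /assets/js/
    Allow: /assets/images/

    # Avoid blanket rules like the following, which would block the
    # CSS/JS/image files Googlebot needs to render the page:
    # Disallow: /assets/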

Our local Frisco SEO services can help your business get found by people in your area who are looking for a business like yours. We believe in supporting local small to medium-sized businesses, and would love to help you grow. We are partnered with some of the best SEOs in the country, which allows us to virtually guarantee first-page rankings for our clients. We do both website SEO and video SEO, as well as many other aspects of digital marketing. Rebel Base SEO can handle just about anything your business requires.

The ranking of your website is partly decided by on-page factors. On-page SEO factors are all the things you can influence from within your actual website. These include technical aspects (e.g. the quality of your code and your site speed) and content-related aspects, like your site's structure and the quality of its copy. These are all crucial on-page SEO factors.
SEM search results have ad extensions. SEO search results have featured snippets. When comparing SEM vs. SEO, you’ll also find differences in the appearance of the search results. SEM search results may include ad extensions, which can add on additional links, phone numbers, and callouts. On the other hand, SEO results may appear with featured snippets in search.

Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
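For example, compare a generic anchor with a descriptive one (the URL and link text here are invented for the illustration):

    <!-- Vague: tells users and Google nothing about the target page -->
    <a href="https://www.example.com/guide">click here</a>

    <!-- Descriptive: the link text summarizes the destination -->
    <a href="https://www.example.com/guide">beginner's guide to keyword research</a>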
A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.
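A minimal sketch of that markup as schema.org JSON-LD, with hypothetical page names and URLs (the current page, last in the trail, can omit its "item" URL):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1,
          "name": "Home", "item": "https://www.example.com/" },
        { "@type": "ListItem", "position": 2,
          "name": "Blog", "item": "https://www.example.com/blog/" },
        { "@type": "ListItem", "position": 3,
          "name": "SEO Basics" }
      ]
    }
    </script>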
Technical SEO optimizes the non-content elements of a website, and the website as a whole, to improve its backend structure and foundation. These strategies relate to site speed, mobile-friendliness, indexing, crawlability, site architecture, structured data, and security. Technical SEO improves both the user and search crawler experience, which leads to higher search rankings.
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay per click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns.[56] Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. Its purpose concerns prominence more than relevance; website developers should treat SEM as critically important for visibility, since most users navigate to the primary listings in their search results.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[59] which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, which analyzed 2.5 million websites and found that 51.3% of the pages were loaded by a mobile device.[60] Google has been one of the companies capitalizing on the popularity of mobile usage by encouraging websites to use its Search Console and Mobile-Friendly Test, which allow companies to check how their website measures up in the search results and how user-friendly it is.

Thank you for the great checklist. It is really useful and gave us a couple of new hints on the optimization process. We’ve been using WebSite Auditor and found it extremely helpful in analyzing a site’s structure and content, bringing all the statistical information on validation errors, social mentions, duplicate content, etc. under one roof. Still, there is so much different SEO information that you sometimes cannot understand what to start with. Fortunately you answered this question. Thank you again!
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words.[39] With regard to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve these issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content and rely on its writers as 'trusted' authors.
John Lincoln (MBA) is CEO of Ignite Visibility (a 2017, 2018 & 2019 Inc. 5000 company), a highly sought-after digital marketing strategist, industry speaker, and winner of the coveted Search Engine Land "Search Marketer of the Year" award. With 16+ years of experience, Lincoln has worked with over 1,000 online businesses, including clients such as Office Depot, Tony Robbins, Morgan Stanley, Fox, USA Today, COX and The Knot Worldwide.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
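As a rough sketch of that spider-and-indexer loop, here is a toy version in Python. It is not any engine's actual implementation; all names and the seed URL are illustrative, and real crawlers add politeness rules, robots.txt handling, and far more.

    # Toy spider/indexer pipeline: fetch a page, index its words,
    # and feed its links back to the scheduler (a simple queue).
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkAndTextParser(HTMLParser):
        """Extracts outgoing links and visible words from a page."""
        def __init__(self):
            super().__init__()
            self.links, self.words = [], []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)
        def handle_data(self, data):
            self.words.extend(data.split())

    def crawl(seed_url, max_pages=10):
        index = {}                    # word -> set of URLs containing it
        queue = deque([seed_url])     # the "scheduler": pages to crawl later
        seen = set()
        while queue and len(seen) < max_pages:
            url = queue.popleft()
            if url in seen:
                continue
            seen.add(url)
            try:                      # the "spider" step: download the page
                html = urlopen(url).read().decode("utf-8", errors="ignore")
            except OSError:
                continue              # skip unreachable pages
            parser = LinkAndTextParser()
            parser.feed(html)
            for word in parser.words:  # the "indexer" step
                index.setdefault(word.lower(), set()).add(url)
            for link in parser.links:  # extracted links go back to the scheduler
                queue.append(urljoin(url, link))
        return index

    # index = crawl("https://www.example.com/")  # hypothetical seed URL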