Great SEO increasingly depends on having a website with a great user experience. Making your user experience great requires carefully tracking what people do, so that you always know where to improve. But what do you track? In this 15-minute talk, I’ll cover three effective and advanced ways to use event tracking in Google Analytics to understand a website’s users.
A few years back we decided to move our community forum from a different URL (myforum.com) to our main URL (mywebsite.com/forum), thinking all the community content could only help drive additional traffic to our website. We currently have 8,930 site links, of which probably 8,800 are forum or blog content. Should we move our forum back to a different URL?
After the audit has been completed, your team will be invited to a presentation in which your SEO specialist will talk through the findings and recommendations. The Three Deep SEO team will walk you and your team through the roadmap to completion so you know what to expect and when. In addition, you will receive a comprehensive analysis of your site’s health. All of these are customized to you and your specific situation.
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.
Our online SEO training courses teach you vital SEO skills you can apply immediately. Find out how to outrank your competition and become the best result through our training courses! Whether you’re a blogger, developer, online marketer, or own a business, big or small: we believe in SEO for everyone. We’ve got a great variety of courses, from Keyword Research, Site Structure, and SEO Copywriting to the more technical aspects of SEO: Structured Data, Multilingual SEO, and Technical SEO training. There’s something for everyone, so be sure to check them out!
Social media has a pivotal role – Last but not least, social media is an evolving platform that has changed from a basic communication platform to a highly profitable marketing channel. Many users start their searches on social media and make their way to a business’s site. Sharing up-to-date, engaging, and personalized content will attract more people to your profile, and eventually to your website.
Your article reaches me at just the perfect time. I’ve been working on getting back to blogging and have been at it for almost a month now. I’ve been fixing SEO-related stuff on my blog, and after reading this article (which, by the way, is way too long for one sitting) I’m kind of confused. I’m looking at bloggers like Darren Rowse, Brian Clark, and so many other bloggers who use blogging or their blogs as a platform to educate their readers more than to think about search rankings (but I’m sure they do).
Errors in technical SEO are often not obvious, which is exactly why they are among the most common. Mistakes in robots.txt and 404 pages, pagination and canonical URLs, hreflang tags and 301 redirects, http vs. https and www vs. non-www versions: each of them can seriously undermine all efforts to promote the site. One thorough SEO website analysis is usually enough to uncover the main problems in this area.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review. Google offers Google Search Console, through which an XML sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links; this complements its URL submission console. Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click; however, this practice was discontinued in 2009.
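To make the sitemap feed concrete, here is a minimal sketch (not Google's own tooling) that builds a sitemaps.org-style XML document with Python's standard library; the example.com URLs are placeholders:

```python
# Minimal sketch: build an XML sitemap suitable for submission via
# Google Search Console. The URLs below are placeholders, not real pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    # The sitemaps.org protocol requires this namespace on <urlset>.
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(sitemap_xml)
```

A real sitemap would typically also carry `<lastmod>` entries per URL, but the `<loc>` list alone is valid and enough for discovery.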
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots meta tag (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
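Python's standard library ships a parser that follows the same robots.txt rules a well-behaved crawler applies; the sketch below uses invented rules and URLs to show how the internal-search and cart pages mentioned above would be excluded:

```python
# Sketch: how a crawler interprets robots.txt rules, using Python's
# stdlib parser. The rules and example.com URLs are illustrative only.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse("""
User-agent: *
Disallow: /search
Disallow: /cart
""".splitlines())

# Internal search results are blocked; ordinary product pages are not.
print(rp.can_fetch("*", "https://example.com/search?q=shoes"))
print(rp.can_fetch("*", "https://example.com/products/shoe"))
```

Note that robots.txt only controls crawling; a page blocked here can still be indexed from external links, which is why the noindex meta tag exists as a separate mechanism.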
Simple navigation reigns and quality content is king – A user-friendly website, with interesting and easy-to-find information, is what will boost your traffic. Each page needs to be built around keyword themes, with unique content, so search engines can easily index your pages and rank them higher. Positive behavior from site visitors is your best bet for a better ranking, so keep the content natural and focused; avoid jargon and keyword stuffing to keep users from leaving the site unhappy and hurting its ranking.
Off-page SEO builds a website’s reputation and authority by connecting it to other high-quality websites. Off-page SEO techniques include: link building (acquiring high-quality backlinks) from other websites and managing local listings and directory profiles. When many websites link to a brand’s website, it shows search engines that the brand’s website is trustworthy, reliable, and reputable, which increases its search rankings.
Many websites rely on other traffic generation methods such as traffic from social media, email, referrals, and direct traffic sources over search engines. For sites like these, SEO errors aren’t as important because search engines aren’t their #1 traffic source. For a smaller website, a couple of errors can have a much bigger negative effect than those same errors on a larger website.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Everything for Google these days comes down to “customer experience”. Page speed, tables of contents, relevance of content, length of content, uniqueness of content, jump links, video in the content, relevant headers, fewer broken links, customized 404 pages, etc. are all indicative of improving customer experience on the website and hence help you rank better.
There are other parts of SEO which you should pay attention to after your audit to make sure you stay competitive. After all, the technical foundation isn't the end of the road for SEO success. It's important to pay attention to your competition's SEO activity, keep an eye on the newest search engine best practices, and maintain local SEO best practices if your business depends on customers visiting a physical address. All of these are elements of a successful SEO strategy and should complement your audit and ongoing SEO maintenance.
Love how you just dive into the details in this Site Audit guide. Excellent stuff! Yours is much, much easier to understand than other guides online, and I feel like I could integrate this into how I audit my websites and actually cut down the time it takes to make my reports. I only need to do more research on how to remove “zombie pages”. If you could do a step-by-step guide to that, it would be awesome! Thanks!
To prevent users from linking to one version of a URL and others linking to a different version (this could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from non-preferred URLs to the dominant URL is a good solution. If you cannot redirect, you can instead use the rel="canonical" link element to point at the preferred URL.
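As a rough illustration of settling on one dominant URL, here is a hedged Python sketch that collapses common variants (http vs. https, www vs. non-www, trailing slash) to a single canonical form; the preferred host example.com and the chosen conventions are assumptions for illustration, not a universal policy:

```python
# Sketch: normalize URL variants to one canonical form before linking,
# so reputation isn't split across duplicates. Conventions assumed here:
# https scheme, non-www host, no trailing slash (except the root).
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, preferred_host="example.com"):
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host == "www." + preferred_host:
        host = preferred_host          # collapse www to the bare host
    path = parts.path.rstrip("/") or "/"  # drop trailing slash, keep root
    return urlunsplit(("https", host, path, parts.query, ""))

print(canonicalize("http://www.example.com/page/"))
```

In practice the same mapping would be enforced server-side as a 301 redirect; this function only shows the normalization logic itself.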
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. Its purpose concerns prominence more than relevance; website developers should give SEM serious consideration with regard to visibility, as most users navigate to the primary listings in their search results. A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate. In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public, which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking desktop use, as StatCounter showed in October 2016 when it analyzed 2.5 million websites and found that 51.3% of pages were loaded by a mobile device. Google has capitalized on the popularity of mobile usage by encouraging websites to use the Mobile-Friendly Test in Google Search Console, which allows companies to measure how mobile-friendly their website is.
Your website is the “hub” of your online brand – so it’s important to have regular checkups to ensure everything is in order. It’s also important to note that your website is a living digital property; it’s typically not stagnant for long periods of time. In any given year, content is added and/or removed from your site. It is for this reason that audits should occur on a regular basis. We recommend that websites be audited at minimum once per year. That allows your teams to fix critical issues as they arise.
Site. Migration. No two words elicit more fear, joy, or excitement in a digital marketer. When the idea was shared three years ago, the company was excited. They dreamed of new features and efficiency. But as SEOs we knew better. We knew there would be midnight strategy sessions with IT. More UAT environments than we could track. Deadlines, requirements, and compromises forged through hallway chats. ... The result was a stable transition with minimal dips in traffic. What we didn't know, however, was the amount of cross-functional coordination required to pull it off. Learn more in this video!
However, if possible, I would like you to expand a bit on your “zombie pages” tip. We run a site where there are definitely enough pages to delete (no sessions, no links, probably not even relevant to the main theme of the site, not even important for the architecture of the site). Nonetheless, I am not sure what the best technical decision for these pages is: just deleting them from my CMS, redirecting (if there is a relevant alternative), or something else? De-indexing them in Search Console? What response code should they have?
Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, where each bubble represents a website, programs sometimes called spiders examine which sites link to which other sites, with arrows representing those links. Websites that receive more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks highly in a web search. And the links "carry through": website C, even though it has only one inbound link, receives that link from a highly popular site (B), while site E does not, so C ranks more highly than E.
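The "carry through" behaviour described above is the core idea behind PageRank-style scoring. Below is a simplified sketch over an invented five-site link graph (not the exact algorithm any engine uses); the damping factor 0.85 is the conventional textbook choice, and the links are made up to mirror the A–E example:

```python
# Simplified PageRank-style scoring: each site splits its score among
# the sites it links to; a damping factor models random navigation.
def pagerank(links, iterations=50, d=0.85):
    nodes = set(links) | {t for ts in links.values() for t in ts}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1 - d) / len(nodes) for n in nodes}
        for src, targets in links.items():
            if targets:
                share = d * rank[src] / len(targets)
                for t in targets:
                    new[t] += share  # inbound links "carry" score forward
        rank = new
    return rank

# B receives several inbound links; C's only link comes from popular B;
# E's only link comes from a minor site (A).
links = {"A": ["B", "E"], "D": ["B"], "B": ["C"], "C": [], "E": []}
ranks = pagerank(links)
print(ranks["C"] > ranks["E"])  # C benefits from B's popularity
```

Even though C and E each have exactly one inbound link, C ends up ahead because its link originates from the high-scoring site B, which is the carry-through effect in miniature.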