All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
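The root page → topic listing → specific page progression described above might map onto a URL structure like this (a sketch only; the domain and paths are hypothetical):

```text
example.com/                          ← root (home) page
example.com/products/                 ← general topic listing
example.com/products/cameras/         ← subcategory listing
example.com/products/cameras/x100/    ← specific product page
```

A shallow, predictable hierarchy like this makes it easier for both visitors and crawlers to move from general pages to specific content.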
Millions of people search the internet every day, looking for something or someone to help them out. If your business is not showing up in the search results, you are probably losing money and customers to your competitors. Building a website that doesn't rank well in the search engines is like opening a store in the middle of a major city and never advertising. If you've ever wondered why some businesses are doing great online and others get zero business, the answer is often how they rank in the local search results. Every online business needs good Search Engine Optimization.
As stated above, we are a local Frisco SEO agency serving Collin County and Denton County, as well as other clients across the country. As a small business, we love to work with other entrepreneurs and small businesses who are looking to make their own mark. When small businesses are profitable, the whole community benefits. That said, not every business is a perfect match for our services. Some clients just need a one-time project (like having a website built); others need long-term services.

Google is falling into a familiar pattern. First, they offer web publishers increased visibility and SERP display options. Next, they incent participation in specific formats and data structures. Finally, they take that data for themselves, changing the SERPs to favor advertising, their own properties, and/or instant answers that can reduce publisher traffic. For web marketers, it's a prisoner's dilemma. In this presentation, Rand will show data on how Google is being used today, how it's changing, then dive into strategic initiatives and specific examples of how savvy players can build a moat to protect against long-term risk.

Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
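The kind of accidental blocking described above, and its fix, can be sketched in a robots.txt file like this (the directory paths are hypothetical, chosen only for illustration):

```text
# robots.txt — a sketch with made-up paths.
# A rule such as "Disallow: /assets/" would hide the CSS and JavaScript
# below from Googlebot, so the page could not be rendered as a mobile
# browser would see it. Explicitly allowing those resources avoids that:
User-agent: Googlebot
Allow: /assets/css/
Allow: /assets/js/
Allow: /images/
Disallow: /private/
```

The general principle: only disallow paths that genuinely should not be fetched, and make sure the resources a page needs to render (CSS, JavaScript, images) remain crawlable.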
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[54] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[55]
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
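A description meta tag sits in the page's `<head>`; a minimal sketch looks like this (the site name and wording are invented for illustration):

```html
<!-- Hypothetical example page head; Google may show the description as the snippet. -->
<head>
  <title>Example Baseball Cards - Buy Cards, News, and Prices</title>
  <meta name="description"
        content="Example Baseball Cards offers a large selection of vintage
                 and modern baseball cards, plus daily news and price guides.">
</head>
```

Each page should get its own accurate, human-readable description rather than a single boilerplate description reused site-wide.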
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
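In markup, nofollowing a user-added link is just a matter of adding `rel="nofollow"` to the anchor; a sketch of a blog comment (the URL and text are placeholders):

```html
<!-- A user-submitted comment link marked nofollow so it passes no reputation. -->
<p>Great post! Check out my page: <a href="https://example.com/some-page"
   rel="nofollow">my site</a></p>
```

Many blog platforms add this attribute to comment links automatically, but it is worth verifying in your page source.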
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[41] in addition to their URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
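An XML Sitemap submitted through Google Search Console follows the sitemaps.org protocol; a minimal sketch (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/products/cameras/</loc>
  </url>
</urlset>
```

Listing pages here is especially useful for URLs that crawlers cannot reach by following links alone.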
Your website is the “hub” of your online brand, so it’s important to have regular checkups to ensure everything is in order. It’s also important to note that your website is a living digital property; it’s typically not stagnant for long. In any given year, content is added to and/or removed from your site. For this reason, audits should occur on a regular basis. We recommend that websites be audited at least once per year, which allows your teams to fix critical issues as they arise.
As stated above, we are a local small business that loves to work with other entrepreneurs and small businesses who are looking to make their own mark. When small businesses are profitable, the whole community benefits. That said, not every business is a perfect match for our services. Some clients just need a one-time project (like website design); others need long-term services and don’t want to deal with a Dallas SEO agency.
Alt text (alternative text), also known as the "alt attribute," describes the appearance and function of an image on a page. Alt text has three main uses: 1. Adding alternative text to photos is first and foremost a principle of web accessibility: visually impaired users using screen readers will hear the alt attribute read aloud, helping them understand an on-page image. 2. Alt text is displayed in place of an image if the image file cannot be loaded. 3. Alt text provides better image context/descriptions to search engine crawlers, helping them index an image properly.
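In markup, alt text is simply the `alt` attribute on an `img` element; a sketch (the filename and wording are invented for illustration):

```html
<!-- Descriptive alt text: read by screen readers, shown if the file fails
     to load, and used by crawlers to understand the image. -->
<img src="/images/dachshund-puppy.jpg"
     alt="Dachshund puppy playing fetch in a park">
```

Good alt text is specific and concise; avoid keyword stuffing, which can read as spammy to both users and search engines.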
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.