Site Discovery, Site Indexing, Ranking - 3 SEO Components

Lecture




In the era of search engines, SEO can be reduced to three main elements or functions: crawling a site (discovery), indexing (which also includes filtering), and ranking (algorithms).


Today, in the first part of the three, we will talk about how search engines discover a site.

Reducing SEO to these basic elements lets SEOs create a kind of template for working on search engine promotion. With this template you do not have to dig into the specifics and subtleties of every individual action.

And sometimes that is good for us, because search engines are top-secret monsters... monster robots.

SEO priorities

Discovery - Indexing - Ranking

This thesis is based on the following assumptions:

  • Googlebot (or any other search engine bot) comes to the page to read the information. This will always be the case, at least until Google learns to absorb content by osmosis.
  • Based on information obtained during discovery, Google selects and analyzes URLs in real time in order to make indexing decisions.
  • Based on the two steps above, the ranking process takes place, and it is usually very accurate.
  • The process is repeated.

Search engines discover, index, and rank web pages, so SEOs should base all their tactics on these steps. Here are the simple conclusions, useful regardless of how competitive the niche is:

  • It's all about page discovery
  • It's all about indexing
  • It's all about ranking

But of course, Google, like every other search engine, exists with one goal: to grow a business by satisfying user needs. Therefore, we must constantly remember the following:

  • Everything revolves around users

And of course, since SEO is by its nature a competition in which you need to beat your competitors to reach the top of the list, we should note the following:

  • It's all about competition

Considering the above, we can derive several SEO tactics for each phase of the search engine's work.

Page discovery process

SEO begins with studying how a search bot discovers pages. Googlebot's visits to our site, for example, leave details in the server logs, which carry the following information:

  • Discovery difficulty
  • Pages that should not have been crawled
  • Duplicate content
  • Frequency and depth of page crawls
  • Server-side response codes: 302, 304, 307, 5xx, etc.
  • Redirect chains
  • Unnecessary 404 errors

Special scripts, such as AudetteMedia's logfilt, which help to quickly parse large volumes of such logs, are used in all sorts of situations.

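A minimal sketch in Python of what such a log-filtering script might do. The combined log format and the access.log path are assumptions; adjust the pattern to your server's configuration:

import re
from collections import Counter

# Assumed: a standard "combined" access log (the Apache/Nginx default).
# Fields: ip - - [timestamp] "METHOD path HTTP/x" status size "referer" "user-agent"
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

status_counts = Counter()
crawled_paths = Counter()

with open("access.log") as log:  # path is an assumption
    for line in log:
        match = LOG_LINE.match(line)
        if not match or "Googlebot" not in match.group("agent"):
            continue  # keep only search-bot hits
        status_counts[match.group("status")] += 1
        crawled_paths[match.group("path")] += 1

# Redirect chains, 5xx errors and unnecessary 404s from the list above
# all surface in these two tallies.
print("Status codes seen by Googlebot:", dict(status_counts))
print("Most-crawled pages:", crawled_paths.most_common(10))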

At the corporate level there are analysis tools such as http://www.splunk.com/, which can slice all of this information by user-agent and sort it by server status code, time, and date.


Xenu is a full-fledged application that gives the most complete picture of what a search bot encounters when visiting a site. It copes with more than 10,000 pages, so it is best used at the corporate level. Here are some tips for working with Xenu or a similar tool:

  • Run the domain or a part of the site through SEMrush
  • Export all of the request data to CSV
  • Load this file into Xenu or another application
  • Sort by status code

This quick and simple technique helps to find codes such as 302 and others; the sketch below shows the same sort-by-code step.
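A minimal sketch of that step in Python. The file name and the url/status column names are assumptions that depend on the tool you exported from:

import csv
from collections import defaultdict

# Group the exported URLs by their HTTP status code.
by_status = defaultdict(list)

with open("crawl_export.csv", newline="") as f:  # assumed export file
    for row in csv.DictReader(f):
        by_status[row["status"]].append(row["url"])

for status in sorted(by_status):
    print(f"{status}: {len(by_status[status])} URLs")

# Temporary redirects such as 302 are often the first thing to fix.
for url in by_status.get("302", []):
    print("302 ->", url)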

Google Webmaster Tools also offers useful information for SEOs about page discovery, duplicate content, and how often the bot visits the site.


There are also other tools that let you look at a site's pages through the eyes of a search bot (for example, Lynx and SEO Browser).
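A rough code equivalent of such a text-browser view, as a sketch: fetch the page with a Googlebot user-agent string and strip it to visible text. The URL is a placeholder, and a real bot's rendering is more involved than a header swap:

from urllib.request import Request, urlopen
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    # Collect visible text, roughly what a text browser like Lynx shows.
    def __init__(self):
        super().__init__()
        self.chunks, self._skip = [], 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

# Identify as Googlebot to approximate what the bot is served.
req = Request(
    "https://example.com/",  # placeholder URL
    headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
)
parser = TextOnly()
parser.feed(urlopen(req).read().decode("utf-8", errors="replace"))
print("\n".join(parser.chunks))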


SEO is action

The most important thing in SEO is accurate information, but what matters even more is what you can do with that information and how. The point is how you interpret the information, not how you obtain it.

By systematizing their SEO work, savvy SEOs build a pattern of actions and methods based on the theory of discovery, indexing, and ranking, which subsequently leads their sites straight to the top of the search results.

Last time, in the first of this series of three articles, we looked at the first stage of a search engine's work: site discovery. We also outlined possible SEO methods for working with each of the stages.

Before continuing, I think it would be appropriate to refresh our memory of what was discussed in the first article:

A search engine discovers, indexes, and ranks web pages, and SEOs should base their promotion tactics on these three stages of its work. Therefore, the following conclusions were made:

  • It's all about page discovery
  • It's all about indexing
  • It's all about ranking

But of course, Google, like any other search engine, exists with one goal: to build and grow a business by satisfying user needs. Therefore, we must constantly remember the following:

  • It's all about the users

Knowing all of the above, we can develop several promotion methods for each phase of the search engine's work, which in the end can merge into a single SEO strategy.

Site Indexing

Indexing is the next step after page discovery. Identifying duplicate content is the main function of this step. It may not be an exaggeration to say that all large sites have at least some non-unique content.

Online stores can carry the same content in the form of identical goods. We can state this with confidence, having extensive experience working with vendors such as Zappos and Charming Shoppes.

News portals of famous newspapers and publications have even more problems. Marshall Simmonds and his team, working on The New York Times and other publications, confront duplicate content daily; it is a major part of their SEO job.

A site will never be explicitly penalized for having duplicate content on its pages. But there are filters that can detect identical or slightly modified content across multiple pages, and this problem is one of the main ones for SEO.

Duplicates also affect the visibility of the site, so you need to drive their number down to zero. Multiple versions of the same content in a search engine's index is also not the best optimization result.
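One simple way to spot exact duplicates is to hash the normalized text of each page: identical pages collide on the same hash. A minimal sketch, assuming the page text has already been extracted (the sample data is purely illustrative, and near-duplicates would need shingling or simhash instead):

import hashlib
from collections import defaultdict

def fingerprint(text: str) -> str:
    # Hash of lowercased, whitespace-normalized text.
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Illustrative input: {url: extracted page text} from your own crawl.
pages = {
    "/product/42":       "Blue canvas sneaker, sizes 6-12.",
    "/product/42?ref=a": "Blue canvas sneaker, sizes 6-12.",
    "/product/43":       "Red canvas sneaker, sizes 6-12.",
}

groups = defaultdict(list)
for url, text in pages.items():
    groups[fingerprint(text)].append(url)

for urls in groups.values():
    if len(urls) > 1:
        print("Duplicate group:", urls)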

Matt Cutts, in an interview with Eric Enge, confirmed the existence of a "crawl cap" (a cap on site visibility), which depends on the site's PageRank (not the toolbar PR), and described what problems can arise from duplicate content:

Imagine we crawl three pages of a site and then discover that two of them are copies of the third. We drop those two pages and keep only one, and so it turns out the site has much less good content... But the higher the PR, the better the chances that the pages will not disappear from the search...

The full version of the interview with Matt Cutts contains the most complete information on duplicate content for any serious SEO. Most of what you hear there will not be news, but it is never superfluous to confirm the guesses and decisions we face every day.

Links, especially from well-structured sites with relevant, high-quality pages, will not only improve the indexing of your site but also its visibility.

Determining how deeply the search engine "penetrates" the site (the "crawl cap"), finding duplicate content, and then eliminating it will improve both the site's visibility in the eyes of the search engine and its indexation.

How do you determine the quality of your site's indexing?

There are several great ways to find out:

  • Log analysis, or analysis of traffic by URL. Charting this data will show you which parts of the site search engines dislike, which points to indexing problems.
  • Analysis of the site's internal linking. Which parts of the site have the fewest internal links? Which parts sit 6-7 clicks away from the main page?
  • Queries such as site:jcrew.com inurl:72977 will reveal duplicate pages. The worst duplicates for online stores are duplicated product description pages. Use similar queries with the intitle: and allintitle: operators.
  • Use rel="canonical" to merge duplicates into the main version of the content. But be careful: using rel="canonical" incorrectly can harm your site.
  • Let Google and Yahoo know (through their webmaster panels) which duplicate content appears in the search, so they can ignore it on the next crawl.
  • Use robots.txt to keep unnecessary pages and content not intended for search engines from being crawled. In addition, you can use the meta robots noindex tag to keep individual pages out of the index.
  • Use XML sitemaps and Google Webmaster Tools to compare what you submitted against what is indexed (Google will show indexing problems for the URLs in the submitted sitemap); a sketch of this comparison follows the list.
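A minimal sketch of that sitemap-versus-crawl comparison in Python. The file names are assumptions; the crawled-URL list could come from the log analysis shown earlier:

import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(path):
    # Extract the <loc> entries from a standard XML sitemap.
    tree = ET.parse(path)
    return {loc.text.strip() for loc in tree.findall(".//sm:loc", NS)}

in_sitemap = sitemap_urls("sitemap.xml")    # assumed local copy
with open("crawled_urls.txt") as f:         # assumed log-analysis output
    crawled = {line.strip() for line in f if line.strip()}

print("In sitemap but never crawled:", in_sitemap - crawled)
print("Crawled but missing from sitemap:", crawled - in_sitemap)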


The number of duplicates and site search results

The problem of duplicate counts is a complex one for SEO and deserves separate consideration. In short, such duplicates can be handled with rel="canonical" and a standard "View All" page that serves as the main version.
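To make the mechanics concrete, here is a sketch of the tag every paginated page would carry, with placeholder URLs: each page in the series points its canonical at the View All version, so the search engine folds the series into one indexable document.

# Placeholder URLs for illustration only.
VIEW_ALL = "https://example.com/category/shoes/view-all"

def canonical_tag() -> str:
    # The same <link> goes into the <head> of every page in the series.
    return f'<link rel="canonical" href="{VIEW_ALL}">'

for n in range(1, 4):
    print(f"https://example.com/category/shoes?page={n} ->", canonical_tag())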

A site's own search results are another unique situation. There are many ways to manage them.

One good way to manage on-site search results is to canonicalize them to a specific search page, which is then built up into a quality page with contextual, useful links. This is certainly not the best way out of every situation, and it is better used for search pages that are not the backbone of site navigation.

Identifying URL Indexing Issues

While diagnosing site indexing problems, any "weaknesses" in the URL structure of the site's pages will surface. This is especially true of corporate-level sites, where you will encounter all kinds of unexpected results in the search engine's index.

These problems arise when a site has many different types of users and administrators. Of course, we often make mistakes ourselves; SEO is not the solution to every problem.

Site indexing is a core component of site visibility and ranking, and is usually the main focus for SEOs. Clean up your site's index and enjoy efficient crawling and faster indexing.

Stay with us for the third and final article in this series.

This article considers ranking, the last step a search engine performs with a web page. Before this, we covered the two previous stages: discovery and indexing.

Ranking

According to inexperienced SEOs, ranking is the simplest of the three parts of the search engine's work. However, it is not.

Experience plays the major role in ranking. SEOs who have long worked on promoting sites in search engines know all the nuances that help move a site up in one way or another. Based on that experience, they begin to form ideas about what affects ranking.

The whole essence of SEO comes down to the search results. If your competitors' URLs suddenly rank higher than yours, you should analyze the situation and put the following question on the table:

What made the page rank better?

You need to collect data for each URL and for each domain. Both URL-level and domain-level data are very important, but you should not weigh every factor that could move a page up in the results. Consider the most important ones, the ones you have already dealt with:

  • What is the link weight of the pages? What does the internal linking look like? Yes, collecting link data for every URL is not an easy task, but be patient. Be careful and do not try to do all of this work with automated tools. Use Yahoo Site Explorer and Linkscape to analyze page backlinks; for RuNet, MegaIndex is also relevant.
  • As mentioned above, the link profiles of both the whole domain and individual pages matter, but pay more attention to the pages. Look at how many unique domains link to the page (MajesticSEO is the best tool here).
  • What is the link profile of a unique domain that links to your competitor's page? This profile can be judged by the number, variety, and quality of links, crawl frequency, domain age, PR, and mozRank.
  • What is the unique link profile of a page that links to the competitor's page? It builds on the domain-level profile from the previous point and gives more detailed data on each link.
  • In the end, you should arrive at the question: what average values of these metrics do you need in order to compete in the results? You will know once you lay out the data for each link and domain in columns and compare them. Then the pros and cons of each URL become visible; a sketch of this comparison step follows the list.
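A minimal sketch of that final comparison step, with purely illustrative numbers; in practice the rows come from exports of the tools above:

from statistics import mean

# Illustrative metrics per ranking URL (your exports replace these).
serp = [
    {"url": "competitor-a.com/page", "root_domains": 120, "backlinks": 900, "domain_age": 9},
    {"url": "competitor-b.com/page", "root_domains": 75,  "backlinks": 400, "domain_age": 6},
    {"url": "yoursite.com/page",     "root_domains": 30,  "backlinks": 150, "domain_age": 4},
]

# The competitors' average is a rough bar to clear for each metric.
for metric in ("root_domains", "backlinks", "domain_age"):
    bar = mean(row[metric] for row in serp if "yoursite" not in row["url"])
    yours = next(row[metric] for row in serp if "yoursite" in row["url"])
    print(f"{metric}: average to compete ~{bar:.0f}, you have {yours}")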

SEO Quake is a great tool for quickly pulling basic SEO metrics right from the search results. Moreover, all data can easily be exported to Excel, and the information you receive can be presented in various ways.


Summary

So, what are you left with after this analysis? The information obtained will show you clearly what needs to be done to overtake your competitors in the search results, and where your site stands among them in general.

The task here is to "identify and conquer" the competitors' sites.

By systematically analyzing the three component stages of a search engine's work (discovery, indexing, and ranking), competent SEOs compose their own methods of improving ("pumping up") their sites, which then reward them with excellent positions in the search results.

