Buying links: classification of parameters for screening


Link buying should be automated as much as possible, but that requires a lot of preparatory work first. A robot will follow your instructions exactly, so a handful of mistakes in those instructions can ruin what looked like a finished, excellent scheme.


The pool of resources on the link exchange only seems large; in reality, donors need to be spent quite carefully. That is why the question of correctly classifying the parameters for screening out links is so pressing.

In fact, the filters you create correlate with results rather weakly; to put it more precisely, it is not the choice of filter by itself that determines effectiveness.


I have more than 50 different filters across my accounts in Sape (including client accounts), but even they do not determine the donor-selection strategy by themselves.


A site with zero metrics can work no worse than one with impressive vanity indicators (TIC, PR); an off-topic site can outperform a 100% on-topic one.


If you do not have a large sample of sites from which to build whitelists, you will have to act a bit differently (by the way, white sites often turn gray and even black, so they need to be re-checked constantly).

The main focus is on building blacklists, and for this you need a clear classification of parameters.

I know plenty of people who have accumulated blacklists of 70 thousand sites in a year or two, and many of them consider this perfectly normal and do not bother classifying the parameters.

Blacklisting a site on the strength of a single parameter is neither a correct nor a rational approach.

Collecting the parameters for classification requires working with your hands and eyes, and partly with your brain. After that, all the results should go into automated processing, because automation saves a great deal of time.

The screening methodology for the plugin is as follows: LF, H, Cn, TF, BC, HC, YAP, YAC.
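
Before walking through each barrier, here is a minimal Python sketch of how I would express that ordering in code. This is not the plugin itself: screen_link and every stage function here are hypothetical placeholders, to be wired up to your own checks, which the rest of the article describes barrier by barrier.

```python
# Minimal sketch of the screening pipeline order. Every stage function is a
# placeholder; a verdict of "blacklist" or "cancel" stops the pipeline early.

def screen_link(link, stages):
    for name, check in stages:
        verdict = check(link)          # returns "blacklist", "cancel" or None
        if verdict is not None:
            return name, verdict
    return None, "keep"                # the link survived all barriers

STAGES = [
    ("LF",  lambda link: None),        # filter by URL
    ("Cn",  lambda link: None),        # pages in the index (after H refresh)
    ("TF",  lambda link: None),        # content filter
    ("BC",  lambda link: None),        # link counts and content volume
    ("HC",  lambda link: None),        # nesting level of the page
    ("YAP", lambda link: None),        # page in the Yandex index
    ("YAC", lambda link: None),        # page in the Yandex cache
]

print(screen_link("http://example.com/page.html", STAGES))  # (None, 'keep')
```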

The first barrier sites pass through is the filter by URL (LF), with which we try to identify in advance both the sites that have a fairly high probability of dropping out and the undesirable pages.

As you can see, we already have two classifiers:

  • The probability of a ban or filter on the site.
  • The probability of the page dropping out of the index.

A fair number of sites can carry a high probability of a ban or filter. For example, it may be a gallery site created specifically for selling links on Sape; for these we have entered the following parameters:


  • displayimage (image gallery)
  • thumbnails (image gallery)
  • dn=photos (image gallery)
  • .php?vac_id= (parsed job-vacancy database)

We mark these sites and can safely add them to the global blacklist or to the blacklist of the project.
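
As a sketch of how this site-level part of the LF filter could be automated: the patterns are the ones from the list above, while the function name and verdict strings are my own conventions, not the plugin's.

```python
# Site-level URL patterns: matching any of them says something about the
# whole site, so the verdict is a blacklist entry, global or per-project.

SITE_LEVEL_PATTERNS = [
    "displayimage",    # image gallery
    "thumbnails",      # image gallery
    "dn=photos",       # image gallery
    ".php?vac_id=",    # parsed job-vacancy database
]

def lf_site_verdict(url):
    url = url.lower()
    if any(pattern in url for pattern in SITE_LEVEL_PATTERNS):
        return "blacklist"     # site-level signal: blacklist the site
    return None                # no site-level signal in this URL

print(lf_site_verdict("http://example.com/gallery/displayimage.php?pid=7"))
# -> blacklist
```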

The next parameters have no clear attachment to the characteristics of the site; rather, they are characteristics of a specific page:


  • User
  • Tag
  • Archive

With these parameters we can screen out system pages that may currently be in the index but will sooner or later drop out: tag and archive pages as duplicate content, and user profiles, registration pages and other service pages.


This classifier does not give us the right to blacklist the site, because these are not characteristics of the site as a whole. So we cancel these pages without adding the site to any blacklist; next time we will be shown other pages of the same site.
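
Continuing the sketch above, the page-level patterns get a different verdict, and the split into two verdicts is exactly the two classifiers just described. The names remain my own placeholders.

```python
# Page-level URL patterns: they describe one page, not the site, so the
# verdict is "cancel" -- skip the page, wait for other pages of the site.

SITE_LEVEL_PATTERNS = ["displayimage", "thumbnails",
                       "dn=photos", ".php?vac_id="]   # from the sketch above
PAGE_LEVEL_PATTERNS = ["user", "tag", "archive"]

def lf_verdict(url):
    url = url.lower()
    if any(p in url for p in SITE_LEVEL_PATTERNS):
        return "blacklist"     # property of the whole site
    if any(p in url for p in PAGE_LEVEL_PATTERNS):
        return "cancel"        # property of this page only
    return None

print(lf_verdict("http://example.com/tag/linux/page/2/"))   # -> cancel
```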

I hope I have made it clear that there are site-level and page-level signs, and they must be clearly distinguished; otherwise you will waste your donors instead of spending them correctly.

The second barrier is the number of pages in the index; here you need to be careful. Its main goal is to weed out, by the number of indexed pages, sites that have fallen under the AGS filter.

I would advise not to rely on the indicators cached by Sape: press H (re-check the number of indexed pages) before the Cn button (filter by page count), which gives us real data for the present moment. As a result we get today's figures and can then press the cherished Cn button, which will blacklist sites containing, say, fewer than 15 pages in the Yandex index.
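
A sketch of this barrier, assuming a fetch_indexed_pages helper of your own that returns a fresh Yandex index count; the helper is hypothetical, and the threshold is the example value from the text.

```python
# Second barrier: refresh the indexed-page count first (the H step), then
# blacklist sites whose Yandex index is too small (the Cn step).

MIN_INDEXED_PAGES = 15          # the example threshold from the text

def cn_verdict(site, fetch_indexed_pages):
    pages = fetch_indexed_pages(site)   # today's data, not stale exchange data
    if pages < MIN_INDEXED_PAGES:
        return "blacklist"              # likely under the AGS filter
    return None

# Usage with a fake checker that pretends the site has 8 indexed pages:
print(cn_verdict("example.com", lambda site: 8))    # -> blacklist
```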

The third barrier is the content filter, or TF. I have already partially described this methodology; it lets us screen out quite a few donors by various parameters. Suppose we noticed that a site contains the word "porno". Without going into details, that parameter alone is worthy of a blacklist.


But not every site containing the parameters listed below deserves a blacklist:

  • keygen
  • iFolder
  • Rapidshare
  • Megaupload

Here we can either cancel (and be offered new pages) or add the site to the blacklist of the specific project (BL), because it may not deserve the global one. I have already touched on TF, so we will not dwell on it; I will answer questions in the comments, if there are any.
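
One way to express this two-tier logic in code: the word lists come from the text, while the tier names and the strict_project switch are illustrative assumptions of mine.

```python
# Content filter (TF) with two tiers: hard stop words taint the whole site
# (global blacklist); soft ones only warrant the project blacklist (BL) or
# a plain cancellation, depending on the project.

HARD_STOP_WORDS = ["porno"]
SOFT_STOP_WORDS = ["keygen", "ifolder", "rapidshare", "megaupload"]

def tf_verdict(page_text, strict_project=False):
    text = page_text.lower()
    if any(word in text for word in HARD_STOP_WORDS):
        return "blacklist"               # global: site-level signal
    if any(word in text for word in SOFT_STOP_WORDS):
        return "project-blacklist" if strict_project else "cancel"
    return None

print(tf_verdict("Download keygen and crack here"))   # -> cancel
```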

The fourth barrier is the number of external links, the number of internal links, and the amount of content on the page (BC).

Everything is simple here: you need to clearly understand that these are characteristics of the page, which means they do not belong in the global blacklist. It is best to cancel, and we will be offered a new page.

At the same time, if we clearly see 500 or more internal links on a page, we can blacklist the site before even examining the donor further (such pages are often produced by pixel-sitemap plugins and the like, and can also be caught with the TF filter). That part we can safely blacklist, which means we again have two classifiers here. But still, I insist on the need to distinguish page parameters from site parameters.
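
A sketch of this barrier using only the standard library: the 500-link limit is from the text, while the external-link and content-length thresholds are assumptions you would tune yourself.

```python
# Fourth barrier (BC): count internal/external links and content volume.
# 500+ internal links is a site-level signal (blacklist); other violations
# are page-level (cancel).
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self, own_host):
        super().__init__()
        self.own_host, self.internal, self.external = own_host, 0, 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        if href.startswith("/") or self.own_host in href:
            self.internal += 1
        else:
            self.external += 1

def bc_verdict(html, own_host, text_length):
    counter = LinkCounter(own_host)
    counter.feed(html)
    if counter.internal >= 500:
        return "blacklist"               # sitemap-like page: site-level
    if counter.external > 30 or text_length < 500:   # assumed thresholds
        return "cancel"                  # page-level: ask for a new page
    return None

html = '<a href="/about">about</a>' + '<a href="http://other.ru/">x</a>' * 40
print(bc_verdict(html, "example.com", text_length=2000))   # -> cancel
```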

The last stage we pass is the nesting level (HC), together with the presence of the page in the Yandex index (YAP) and in the Yandex cache (YAC). These are also characteristics of the page; we have no right to blacklist such sites, since we would be drawing conclusions from a single page.
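
A final sketch of this stage. Here I approximate nesting level by URL path depth, the index and cache flags are assumed to come from your own YAP/YAC checks, and the depth limit is an illustrative value.

```python
# Last stage: nesting level (HC) plus presence in the Yandex index (YAP)
# and cache (YAC). All three are page characteristics, so the only verdict
# we allow ourselves is "cancel", never a blacklist entry.
from urllib.parse import urlparse

MAX_NESTING = 3    # assumed limit on URL path depth

def final_verdict(url, in_index, in_cache):
    depth = len([part for part in urlparse(url).path.split("/") if part])
    if depth > MAX_NESTING or not in_index or not in_cache:
        return "cancel"        # judge the page, never the site
    return None

print(final_verdict("http://example.com/a/b/c/d/page.html", True, True))
# -> cancel
```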

Today I received 5,413 links to confirm; the results are as follows:

  • LF: 35 sites to the blacklist, 1,766 sites cancelled.
  • H, Cn: 81 AGS-suspect sites screened out.
  • TF: 320 sites to the blacklist, 840 sites cancelled.
  • BC: 15 sites to the blacklist, 2,140 sites cancelled.
  • YAP and YAC: 230 sites cancelled.

As a result of this balanced work, we blacklisted 361 sites and cancelled 4,076. That means that out of the 5,413 links we eliminated 4,976, keeping 437 links for more detailed consideration.

If we had carelessly burned through the link mass and added everything to blacklists without classifying the factors, we would have put 4,976 sites into the blacklist in just one day.

In my opinion that is not reasonable, yet I know many people who do exactly this, without demarcating and classifying the parameters.

To date, my personal account has accumulated only 41 thousand sites over several years, all screened out by very strict parameters; if I had sent everything to the blacklist indiscriminately, there would already be 3-4 times as many sites.

Client accounts have smaller blacklists, but then they have also been working for less time.

How do you classify donors during selection? Do you distinguish the cases, or are halftones not for you and you see only black and white?
