Google does not check all spam reports in manual mode (Oct 08/2017)

During the last video conference with webmasters, Google employee John Mueller stated that the search team does not check all spam reports manually. The question to Mueller was the following: "Some time ago we sent a spam report, but we still have not seen any changes. Do you check each and every report manually?"

Googlebot still refuses to crawl HTTP/2 (Oct 08/2017)

During the last video conference with webmasters, Google representative John Mueller said that Googlebot still refrains from crawling over HTTP/2:

"No, at the moment we do not crawl HTTP/2. The reason is that the crawler already fetches content fast enough, so the benefits a browser receives (reduced page load time) are not that important to us. We are still investigating what we can do about it. In general, the difficult part is that Googlebot is not a browser, so it does not get the same speed effects that are observed within a browser when HTTP/2 is used. We can cache data and make requests in a different way than a regular browser. Therefore, we do not see the full benefits of crawling over HTTP/2. But with more websites implementing the push feature, Googlebot developers are on the point of adding support for HTTP/2 in the future."

It should be recalled that in April 2016 John Mueller said that the use of the HTTP/2 protocol on a website does not directly affect its ranking in Google, but it does improve user experience thanks to faster page loading. Therefore, if you have a chance, it is recommended to move to this protocol.
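To make the protocol discussion more concrete, here is a minimal sketch, unrelated to anything Google has published, of parsing the fixed 9-octet header that every HTTP/2 frame starts with, as defined in RFC 7540. The PUSH_PROMISE frame type shown in the lookup table is the mechanism behind the server-push feature mentioned above; the function and table names are this sketch's own.

```python
from collections import namedtuple

FrameHeader = namedtuple("FrameHeader", "length type flags stream_id")

# A few of the frame types defined in RFC 7540, section 6.
FRAME_TYPES = {
    0x0: "DATA",
    0x1: "HEADERS",
    0x4: "SETTINGS",
    0x5: "PUSH_PROMISE",  # carries a request the server promises to push
    0x8: "WINDOW_UPDATE",
}

def parse_frame_header(data: bytes) -> FrameHeader:
    """Parse the fixed 9-octet HTTP/2 frame header (RFC 7540, section 4.1):
    24-bit payload length, 8-bit type, 8-bit flags,
    1 reserved bit plus a 31-bit stream identifier."""
    if len(data) < 9:
        raise ValueError("need at least 9 bytes for a frame header")
    length = int.from_bytes(data[0:3], "big")
    ftype = data[3]
    flags = data[4]
    stream_id = int.from_bytes(data[5:9], "big") & 0x7FFFFFFF  # clear reserved bit
    return FrameHeader(length, ftype, flags, stream_id)
```

For example, the connection preface's empty SETTINGS frame arrives as the bytes `00 00 00 04 00 00 00 00 00`, which this helper decodes as length 0, type SETTINGS, flags 0, stream 0.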
At the Brighton SEO event that took place last week, Google representative Gary Illyes shared his opinion on the importance of auditing a website's link profile. This information was reported by Jennifer Slagg in the TheSEMPost blog.

According to Gary Illyes, auditing links is no longer necessary for all websites. Since Google Penguin was turned into a real-time update and started ignoring spam links instead of imposing sanctions on websites, the value of auditing external links has decreased.

"I talked to a lot of SEO specialists from big enterprises about their business, and their answers differed. These companies have different opinions on the reasons why they disavow links. I don't think that holding too many audits makes sense, because, as you noted, we successfully ignore the links, and if we see that the links are of an organic nature, it is highly unlikely that we will apply manual sanctions to a website. If your links are ignored by Penguin, there is nothing to worry about. I've got my own website, which receives about 100,000 visits a week. I have had it for four years already, and I do not have a disavow file. I do not even know who is linking to me."

Thus, if a website owner was previously engaged in buying links or used other prohibited link-building methods, auditing the link profile and disavowing unnatural links is necessary in order to avoid future manual sanctions. It is important to remember that disavowing links can lead to a drop in a site's positions in search results, since many webmasters disavow links that actually help the website rather than harm it. Therefore, link audits are needed if there were violations in the history of the resource. For many website owners they are not necessary, and it is better to spend this time on improving the website itself, says Slagg.
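Since the discussion turns on whether a site even needs a disavow file, a short illustration of the format may help. The line format is the one Google Search Console accepts for disavow uploads (one URL or one "domain:" rule per line, "#" for comments); the parse_disavow helper and the sample domain names are this sketch's own inventions, not part of any Google tool.

```python
def parse_disavow(text: str):
    """Split a disavow file into domain rules and URL rules.

    Google's format: one entry per line; lines starting with '#'
    are comments; 'domain:example.com' disavows every link from
    that domain; any other non-empty line is a single URL.
    """
    domains, urls = [], []
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        if line.startswith("domain:"):
            domains.append(line[len("domain:"):].strip())
        else:
            urls.append(line)
    return domains, urls


sample = """\
# Links bought from a link network in 2014
domain:spammydirectory.example
http://blog.example/paid-post.html
"""
```

Running `parse_disavow(sample)` yields one domain rule, `spammydirectory.example`, and one URL rule, `http://blog.example/paid-post.html`, which matches Illyes' point: a site with no such history simply never needs to create this file.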