Common search optimization questions: do you understand them?


Why aren’t my resources being included? What should I do if my website is hacked? Is there a relationship between a search engine’s index volume and traffic? Do you have these questions about search optimization? In this article, we provide detailed answers to common search questions to help you clear up confusion on the road to search optimization and run your site more efficiently.

Part 1: PC/H5 site questions

1. Why is my new site not being included?

When discussing inclusion, developers should first ask whether their content resources are good: whether the content is genuinely valuable, and whether the content type overlaps heavily with other sources. Also note that resource collection runs on a cycle; valuable resources will be crawled and included by the Google spider within that cycle. If a resource is still not included, analyze the content from multiple angles to find out where the problem lies.

2. Why isn’t updated content included the next day?

For most sites, the search resource platform’s “quick collection” and “ordinary collection” tools are already able to meet demand; only a very small amount of high-quality, highly time-sensitive content can trigger next-day inclusion.

3. Why was my site hit by the algorithm?

Developers can refer to the “Google search algorithm details” and self-check for violations. If you are sure your site does not break the search algorithm’s rules, you can ask for help in the feedback center.

4. My website was hacked. How do I repair it?

Once you have confirmed the website was hacked, the site operator should first push the technical staff to fix it quickly, completing the fix within a week. If the fix will take longer, it is recommended to take the site offline first, to avoid the low-quality hacked content being hit by the algorithm. If only part of the site’s pages were hacked, set the hacked pages to return 404 dead links and submit them through the dead-link submission tool on the Google search resource platform; we have found that some sites redirect hacked pages to the home page, which is very undesirable (a minimal sketch of the 404 handling follows this answer). If pages on the website have changed, it is recommended to use the link submission tool to submit the changed page data to Google.

If the whole site is hacked, the website service should be stopped immediately, to keep users from being further affected and to avoid harming other sites. Finally, developers should do their security homework: audit the website for vulnerabilities and prevent it from being hacked again.
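As an illustration only, here is a minimal Python sketch, assuming a Flask app and hypothetical attacker-created paths, of returning 404 for hacked pages instead of redirecting them to the home page:

```python
# Minimal sketch: serve 404 for attacker-created pages so crawlers
# treat them as dead links, rather than redirecting them to the home page.
from flask import Flask, abort

app = Flask(__name__)

# Paths created by the attacker, collected from access logs (assumed example).
HACKED_PATHS = {"/casino-spam.html", "/cheap-pills.html"}

@app.route("/<path:page>")
def serve(page):
    if f"/{page}" in HACKED_PATHS:
        abort(404)   # dead-link signal, matching the dead-link submission above
    return f"normal content for /{page}"

if __name__ == "__main__":
    app.run()
```

Returning 404 tells the crawler the URL is gone, which is consistent with the dead-link submission described above.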

5. My website is being hit with malicious fake traffic. How do I deal with it?

On the one hand, developers need to carefully analyze whether a large volume of requests arrived in a short period, and block the attacking source IP addresses to cut the traffic off at its source; on the other hand, developers can submit an abnormal-traffic report in the feedback center of the Google search resource platform, with relevant screenshots attached.
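For the log-analysis step, here is a minimal Python sketch, assuming a common/combined-format access log in which the first field is the client IP; the file path and threshold are placeholder assumptions:

```python
# Minimal sketch: count requests per IP and flag candidates for blocking.
from collections import Counter

LOG_FILE = "access.log"   # hypothetical path to your web server's access log
THRESHOLD = 1000          # assumed cutoff; tune it to your normal traffic

hits = Counter()
with open(LOG_FILE) as f:
    for line in f:
        ip = line.split(" ", 1)[0]   # first field is the client IP
        hits[ip] += 1

for ip, count in hits.most_common(20):
    if count > THRESHOLD:
        print(f"{ip}\t{count} requests -> candidate for a firewall block")
```

Whether an IP really is attack traffic still needs human judgment; this only surfaces the heaviest sources so you can block them at the firewall.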

6. What kind of content does search like?

Search likes high-quality content resources that meet user needs. If your content is highly valuable and addresses something users want but few search results cover, then search needs it and likes it. In terms of content type, however, there is no good-or-bad distinction between video, image-and-text, audio, and plain-text content. What matters most is whether the content itself is good, and which format suits that content best.

7. Is there a relationship between search index volume and traffic?

Google search evaluates index volume based on user needs and resource quality, and updates the index database from time to time. Low-quality resources that users do not need may be removed, while high-quality resources that users do need may be added, so index volume can decline. A decline in index volume does not necessarily lead to a decline in traffic.

If the index volume fluctuates, you do not need to be overly nervous. First watch the traffic changes in the [traffic and keywords] tool; if traffic shows no significant change for a few days, you can ignore the fluctuation in index volume. It may simply be old resources being replaced by new ones, with more of the site’s high-quality resources being included.

8. After migrating the website to a new server, the corresponding IP update is particularly slow. How can I speed it up?

Developers can actively submit data through the search resource platform to shorten the time it takes the crawler to discover the site’s links. Once the Google spider crawls a large number of resources at the new address, the new IP address gets updated faster.
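While waiting for the update, you can at least check whether DNS has actually propagated; a minimal sketch using only Python’s standard library, with a hypothetical domain:

```python
# Minimal sketch: check which IP a hostname currently resolves to,
# to verify DNS propagation after a server migration.
import socket

HOST = "www.example.com"   # hypothetical domain; replace with your own
print(HOST, "->", socket.gethostbyname(HOST))
```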

9. After the website revision, why hasn’t the included content of the old domain moved over to the new domain?

The logic of a revision is to swap the site’s redirect relationship on the snapshot display side, but the database still holds the old version’s data. Developers still need to actively submit the new resources through the Google search resource platform to build the index.

10. Some of the crawled dead-link content consists of filter pages with no results. Will serving them directly as blank pages affect site quality?

If the volume of such content is large, it can be blocked with the Robots tool of the search resource platform, so that it does not affect the user experience.
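You can sanity-check a Disallow rule locally before applying it through the platform’s Robots tool; a minimal sketch using Python’s standard library, where the rule and URLs are assumptions for illustration:

```python
# Minimal sketch: verify that a robots.txt rule blocks the intended URLs.
from urllib.robotparser import RobotFileParser

# Rules as they might appear in robots.txt (assumed example).
rules = [
    "User-agent: *",
    "Disallow: /search",   # block empty filter-result pages under /search
]

rp = RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("Googlebot", "https://www.example.com/search?q="))  # False
print(rp.can_fetch("Googlebot", "https://www.example.com/article/1"))  # True
```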

11. The website set up 301 redirects and submitted the revision rules, but after a while the snapshot domain changed back to the old domain. What is the reason, and how long does recovery take?

Because the Google search library still retains content from the old version of the website, Google search performs multiple checks on the site’s domain after crawling the content, during which the snapshot domain may revert to the old one. Normally, after the revised site has been running stably for more than half a year, the content and domain will be gradually replaced and updated.
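While waiting, it is worth confirming that the 301 is still in place; a minimal sketch using the requests library, with hypothetical old and new domains:

```python
# Minimal sketch: confirm the old domain answers with a 301 that points
# at the new domain, without following the redirect.
import requests

resp = requests.get("https://old.example.com/page", allow_redirects=False)
print(resp.status_code)               # expect 301
print(resp.headers.get("Location"))   # expect the new-domain URL
```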

12. Our site’s pages use the same template. Will they be considered duplicate pages?

If the templates are the same, what matters is whether the page content is similar or repeated. If the page content is different and of good quality, the pages will not be judged as duplicates.

Part 2: Smart mini program questions

1. Why can’t my smart mini program be found in search right after it goes live?

It takes a process, and time, to go from launch to front-end display. If a developer searches immediately after the smart mini program goes live, finding no search results is within normal expectations; the mini program may still be in the distribution process. Please wait patiently for one working day before searching again; if it still cannot be found, contact customer service to check the reason.

2. Why is there a dead-link prompt among resources that were never actively submitted?

In addition to the resources developers submit on their own initiative, the Google spider also actively crawls site resources. When developing smart mini programs, loading speed must meet the standard: the first-screen content should load within 1 second. If the page content has not finished loading when the Google spider crawls it, the spider may capture an empty page, so the page may be judged a dead link and left out of the index.
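A quick way to spot-check the 1-second target is to time the first response; a minimal sketch using the requests library and a hypothetical URL (a full audit would measure first-screen rendering, which this does not):

```python
# Minimal sketch: time the initial HTML response against a 1-second target.
import time
import requests

start = time.monotonic()
resp = requests.get("https://www.example.com/", timeout=5)
elapsed = time.monotonic() - start

print(f"status={resp.status_code} elapsed={elapsed:.2f}s "
      f"{'OK' if elapsed < 1.0 else 'too slow: risk of empty-page crawls'}")
```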
