和顺纵横信息网


Incorrect handling of URL parameters


Posted on 2024-12-18 16:48:23
Parameters are the portions of a URL that follow a question mark and pass key-value data to the server; rather than identifying a directory path, they modify or filter what the page returns. They are typically generated when a user selects one or more attributes on an e-commerce listing page, but also when pagination is applied to a site archive or when UTM tags are added to track traffic sources.

If not properly managed, parameters can give rise to thousands of duplicates of a page. The spider, detecting different URLs with the same content and unable to determine which canonical version to attribute more value to, may decide to index all of them.
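As a sketch of how equivalent parameterized URLs multiply, and how they can be normalized, the following Python snippet collapses reordered and UTM-tagged variants into one canonical form. The shop URLs and the list of tracking parameters are hypothetical, chosen only for illustration:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking parameters that never change page content
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign"}

def canonicalize(url):
    """Drop tracking parameters and sort the rest, so that
    equivalent URLs compare equal."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    pairs = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(sorted(pairs)), ""))

# Three URLs that all serve the same product listing
urls = [
    "https://shop.example.com/shoes?color=red&size=42",
    "https://shop.example.com/shoes?size=42&color=red",
    "https://shop.example.com/shoes?color=red&size=42&utm_source=newsletter",
]
print({canonicalize(u) for u in urls})  # a single canonical URL
```

This is the same deduplication a crawler has to attempt on its own when the site gives it no canonical signal.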

Not only that: if handled incorrectly, parameters could cause URLs to be excluded from search results. A classic error of this type is an incorrect configuration of the URL Parameters tool within Google Search Console.

Specifically, remember that:

If the parameter is to be indexed, select: “Yes, modifies, reorders, or limits the contents of the page”, then indicate the function of the parameter (e.g. sorts, limits, specifies, paginates, translates) and the pages where it applies.

If the parameter is not to be indexed, select: “No, does not affect the contents of the page”. Alternatively, you can choose “Yes, modifies, reorders, or limits the contents of the page” and then select “No URL”.
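The two rules above can be expressed as a small lookup. This is only an illustrative sketch: the parameter names and the `PARAMETERS` table are invented for the example and are not part of any Google API:

```python
# Hypothetical parameter inventory mirroring the Search Console choices:
# parameters that change page content get a declared function, while pure
# tracking parameters do not affect content and should not be indexed.
PARAMETERS = {
    "sort":       {"changes_content": True,  "function": "sorts"},
    "per_page":   {"changes_content": True,  "function": "limits"},
    "page":       {"changes_content": True,  "function": "paginates"},
    "lang":       {"changes_content": True,  "function": "translates"},
    "utm_source": {"changes_content": False, "function": None},
}

def should_index(param):
    """Index only parameters that actually change what the page shows."""
    spec = PARAMETERS.get(param)
    return bool(spec and spec["changes_content"])

print(should_index("sort"), should_index("utm_source"))
```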

7. JavaScript not loading correctly

JavaScript is an important programming language used to create interactive effects within a website. Search engines are making great efforts to improve their ability to interpret these resources, but there are still critical areas that can be optimized.

How does JavaScript work? When a user or spider visits a page on a website, the browser requests information from the server: the base HTML and various external resources, including JavaScript files. The browser then combines everything it has received to load the complete web page with all its features.

When the spider encounters a page that relies on JavaScript, it uses its own renderer to execute it. Because rendering takes significant time and computational resources, JavaScript-based pages are deferred until Googlebot has the resources available to process that content. In practice, the bot indexes the content in two waves (first the raw HTML, then the rendered page), and some details can be lost between the two.
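A minimal way to see what a crawler gets in the first wave, before any JavaScript runs, is to extract the text of the server-sent HTML alone. The sample page below is hypothetical; the review counter exists only after the script executes, so a text extraction of the base HTML misses it:

```python
from html.parser import HTMLParser

# Hypothetical server-sent HTML: the review count is injected by script
PAGE = """
<html><body>
  <h1>Red running shoes</h1>
  <div id="reviews"></div>
  <script>
    document.getElementById('reviews').textContent = '132 reviews';
  </script>
</body></html>
"""

class TextExtractor(HTMLParser):
    """Collect visible text, skipping the contents of <script> tags."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.text = []
    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True
    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False
    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.text.append(data.strip())

p = TextExtractor()
p.feed(PAGE)
print(p.text)  # ['Red running shoes'] -- the review count is invisible
```

Anything that appears only after rendering, like the review count here, waits for the second wave.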




Improperly configured JavaScript can cause your site to be crawled incorrectly, which can negatively impact your site's indexing and ranking. Here are some of the most common cases where a site might have indexing issues:

Base HTML too different from the final HTML with JavaScript. Google frowns upon this because in the past it was used to show a more optimized version of the page to the bot and a different version to the user (a practice known as cloaking).

Massive use of JavaScript. When Google first crawls the site, it does not render it, so elements present only in JavaScript and not in the HTML are not taken into consideration by the search engine. Not only is the site "received" incompletely; there are also disadvantages in crawl frequency and therefore in indexing times.

JavaScript navigation menus, links, and metadata. If the site's core elements are rendered only in the second wave of indexing, there is a high probability that Google will not be able to read them.

Try disabling JavaScript in your browser and browsing the site to see whether it still displays properly. You can use one of the many extensions available for Chrome, such as "Web Developer".
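The same check can be approximated offline: parse the server-sent HTML and confirm that the navigation links are present as plain `<a href>` tags rather than attached by script at runtime. The `BASE_HTML` snippet here is invented for illustration:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href found on a plain <a> tag in the base HTML."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Hypothetical server-sent navigation markup
BASE_HTML = '<nav><a href="/shoes">Shoes</a><a href="/bags">Bags</a></nav>'

c = LinkCollector()
c.feed(BASE_HTML)
print(c.links)  # ['/shoes', '/bags'] -- crawlable without JavaScript
```

If this list comes back empty for your real pages while the rendered site shows a menu, the navigation depends on JavaScript and risks being read late, or not at all.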
