Post by account_disabled on Feb 27, 2024 9:28:06 GMT
Partition those pages into different XML sitemaps to test those hypotheses. You can do several at once; there's nothing wrong with having a URL exist in multiple sitemaps. You might start with theories like:

- Pages that don't have a product image aren't getting indexed.
- Pages with too few words of unique description aren't getting indexed.
- Pages that don't have comments/reviews aren't getting indexed.

Create an XML sitemap with a meaningful number of pages that fall into each of those categories. It doesn't need to be every page in that category, just enough that the sample size makes it reasonable to draw a conclusion.
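As a rough illustration, here is how one of those test sitemaps might be generated. The URLs and the "no product image" grouping are made-up examples, not anything from the original post:

```python
# Build a test XML sitemap for one hypothesis group, e.g. product
# pages that have no product image. All URLs below are hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string for the given list of page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# A sample of pages that fall into the "no product image" bucket.
no_image_pages = [
    "https://example.com/product/1001",
    "https://example.com/product/1002",
]
xml_out = build_sitemap(no_image_pages)
```

You would generate one such file per hypothesis, submit each in Search Console, and watch the indexation numbers separately.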
Base the conclusion on the indexation rate of each sitemap; you might put the same number of pages in each, for instance. Your goal here is to use the overall percent indexation of any given sitemap to identify attributes of pages that are causing them to get indexed or not get indexed. Once you know what the problem is, you can either modify the page content, modify the links to the pages, or noindex the pages. For example, you might have a chunk of your product pages where the product description falls below your word-count threshold. If these aren't big-traffic terms and you're getting the descriptions from a manufacturer's feed, it's probably not worth your while to try and manually
write additional words of description for each of those pages. You might as well set meta robots to "noindex,follow" for all pages below that description-length threshold, since Google isn't going to index them anyway and they're just bringing down your overall site quality rating. And don't forget to remove those from your XML sitemap.

Dynamic XML sitemaps

Now you're thinking, "OK, great, Michael. But now I've got to manually keep my XML sitemap in sync with my meta robots on all of my pages, and that's not likely to happen." But there's no need to do this manually. XML sitemaps don't have to be static files. In fact, they don't even need to have an .xml extension.
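One way to keep the sitemap and the meta robots tag from ever drifting apart is to drive both from a single rule and generate the sitemap dynamically. A minimal sketch of that idea; the 50-word threshold and the page fields are assumptions for illustration, not values from the post:

```python
# A single rule function drives both the meta robots tag and sitemap
# membership, so the two can never fall out of sync.
MIN_DESC_WORDS = 50  # hypothetical threshold, tune to your own data

def is_index_worthy(page):
    """Decide once whether a page should be indexed."""
    return len(page["description"].split()) >= MIN_DESC_WORDS

def meta_robots(page):
    """Tag value to render in the page's <head>."""
    return "index,follow" if is_index_worthy(page) else "noindex,follow"

def sitemap_urls(pages):
    """Serve the sitemap dynamically: only index-worthy pages appear."""
    return [p["url"] for p in pages if is_index_worthy(p)]

# Hypothetical pages: one with a long description, one thin page.
pages = [
    {"url": "https://example.com/widget", "description": "word " * 80},
    {"url": "https://example.com/thin-page", "description": "too short"},
]
```

The thin page gets "noindex,follow" and is automatically absent from the generated sitemap, with no manual bookkeeping.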
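The measurement step from earlier, comparing percent indexation across the hypothesis sitemaps, might look like this. The counts are made up, standing in for whatever Search Console would actually report for your sitemaps:

```python
# Per hypothesis sitemap: URLs submitted vs. URLs Google reports as
# indexed (illustrative figures only, not real data).
groups = {
    "no_product_image": {"submitted": 100, "indexed": 22},
    "thin_description": {"submitted": 100, "indexed": 35},
    "no_reviews":       {"submitted": 100, "indexed": 81},
}

def indexation_rates(groups):
    """Fraction of submitted URLs indexed, per hypothesis group."""
    return {name: g["indexed"] / g["submitted"] for name, g in groups.items()}

rates = indexation_rates(groups)
# The group with the lowest rate points at the attribute most likely
# to be suppressing indexation.
worst = min(rates, key=rates.get)
```

With numbers like these, the "no product image" group would be the first attribute to investigate or noindex.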