macB
New Member
My site is over 12 months old, my sitemap has grown steadily over this period to 1500 URLs but Search Console reports 1000 excluded under the type "Crawled - not currently indexed".
The help says "These are pages that are intentionally not included" and "The pages won't appear in Google, but that was probably intentional". Errr, NO!
I've googled quite a few threads and found different answers, ranging from "you can't do anything" to "you just have to wait".
I've tried analysing the excluded vs. the valid URLs and there is nothing I can see: the excluded ones are not duplicates, they are not new, and they contain as much "rich" content as the pages Google considers valid.
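For anyone wanting to repeat that comparison, here's a minimal sketch of how I diffed the two lists. It assumes you've exported the "Valid" and "Crawled - currently not indexed" rows from the coverage report as CSVs with a "URL" column (the file names are mine):

```python
import csv

def load_urls(path):
    # Each Search Console export is assumed to have a "URL" column.
    with open(path, newline="") as f:
        return {row["URL"] for row in csv.DictReader(f)}

valid = load_urls("valid.csv")
excluded = load_urls("crawled_not_indexed.csv")

# Sanity checks: exact overlap, plus trailing-slash variants that can
# make near-duplicates easy to miss when eyeballing the lists.
print("in both lists:", len(valid & excluded))
print("slash variants:", len({u.rstrip("/") for u in valid}
                             & {u.rstrip("/") for u in excluded}))
```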
I've even used "Request indexing" on individual URLs; Google does crawl them (you can see this from the last crawl date) but still considers them excluded.
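Checking the crawl date one URL at a time gets tedious, so here's a sketch of pulling the coverage state and last crawl time in bulk via the Search Console URL Inspection API. It assumes a service-account key file ("key.json") that has access to the property; the site URL and the URL list are placeholders:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE = "https://example.com/"  # your verified property
urls = ["https://example.com/some-excluded-page/"]  # pages to check

creds = service_account.Credentials.from_service_account_file(
    "key.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

for url in urls:
    body = {"inspectionUrl": url, "siteUrl": SITE}
    result = service.urlInspection().index().inspect(body=body).execute()
    status = result["inspectionResult"]["indexStatusResult"]
    # coverageState should read e.g. "Crawled - currently not indexed".
    print(url, status.get("coverageState"), status.get("lastCrawlTime"))
```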
Strangely, the Index Coverage report (i.e. not the sitemap view) shows 11.5K valid and 47.5K excluded, and those are URLs that the Googlebot spider has discovered itself!
I'm confused... and stuck. Any pointers, anyone?