Reena
New Member
Hi,
I need help with a robots.txt file. I am working on Collegesearch.in and found that some tools are reporting that I have blocked my site for all crawlers.
The current robots.txt is here: https://www.collegesearch.in/robots.txt
The robots.txt I am planning to create is:
Sitemap: https://www.collegesearch.in/sitemaps_index.xml
User-agent: Googlebot
User-agent: Googlebot-Mobile
User-agent: Mediapartners-Google
User-agent: AdsBot-Google
User-agent: Alexabot
User-agent: Mail.Ru bot
User-agent: bingbot
User-agent: YandexBot
User-agent: Yahoo!
User-agent: Baiduspider
User-agent: archive.org_bot
User-agent: MSNBot
User-agent: ShopWiki
User-agent: ia_archiver
User-agent: netEstate Crawler
User-agent: bitlybot
User-agent: linkdexbot
User-agent: AhrefsBot
User-agent: SearchmetricsBot
User-agent: FacebookExternalHit
User-agent: SEOkicks-Robot
User-agent: Blekkobot
User-agent: HubSpot Connect
User-agent: BingPreview
User-agent: 360Spider
User-agent: yacybot
User-agent: Leikibot
User-agent: AboutUsBot
User-agent: OpenWebSpider
User-agent: BUbiNG
User-agent: backlink-check.de
User-agent: PayPal IPN
User-agent: Feedly
User-agent: WebCookies
User-agent: LinkedInBot
User-agent: BacklinkCrawler
User-agent: MetaGeneratorCrawler
User-agent: alexa site audit
User-agent: webmastercoffee
User-agent: MetaHeadersBot
User-agent: Open Web Analytics Bot
User-agent: Automattic Analytics Crawler
User-agent: YoudaoBot
User-agent: YodaoBot
User-agent: facebookplatform
User-agent: Whoismindbot
User-agent: gonzo
User-agent: KeywordDensityRobot
User-agent: pmoz.info ODP link checker
User-agent: YRSpider
User-agent: Twiceler
User-agent: XML Sitemaps Generator
User-agent: FeedFinder/bloggz.se
User-agent: DNS-Digger-Explorer
User-agent: RSSMicro.com RSS/Atom Feed Robot
Disallow: /ajax/
Disallow: /images/
Disallow: /upload/
Is it fine, or does it need some changes?
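For reference, this is how I have been checking what the live file actually allows. It is a minimal sketch using Python's standard urllib.robotparser; the user agents and test paths below are just examples I picked, not a full audit.

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt
rp = RobotFileParser("https://www.collegesearch.in/robots.txt")
rp.read()

# Check a few user agents against a few sample URLs
# (/ajax/ is one of the directories disallowed above;
#  the exact path "/ajax/test" is just a placeholder)
for agent in ("Googlebot", "bingbot", "*"):
    for url in ("https://www.collegesearch.in/",
                "https://www.collegesearch.in/ajax/test"):
        print(agent, url, rp.can_fetch(agent, url))

If can_fetch returns False for Googlebot on the homepage, that would match what the tools are reporting.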