#1
My site was suspended yesterday by my shared hosting provider for using too many resources, so I am trying to stop all robots from indexing the site, at least temporarily, while I work on this. Right now I have 4 members and about 53 visitors, many of which are the Google Spider, Yahoo! Slurp Spider, Turnitin.com Spider, and MSNBot Spider.

Just to test this out, I put up the most restrictive robots.txt file for the forums that I could think of:

User-agent: *
Disallow: /forum/

The forum folder is at the root level of my public_html folder, i.e. www.mysite.com/forum, and the robots.txt file is also at the root of public_html. Yet I still have all of these robots crawling around the forums! How do I get them to stop? This file seems to have no effect whatsoever. Thanks.
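One way to rule out a syntax problem is to test the rules locally with Python's standard `urllib.robotparser` module, which implements the same exclusion logic a compliant crawler uses. This is a minimal sketch; the www.mysite.com host and index.php paths are placeholders based on the post, not real URLs. Note that even a correct robots.txt only takes effect after each crawler re-fetches it, which crawlers typically do only periodically, so some lag is normal.

```python
import urllib.robotparser

# The exact rules from the post: block every user agent from /forum/.
rules = """\
User-agent: *
Disallow: /forum/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A compliant crawler must not fetch anything under /forum/ ...
print(rp.can_fetch("*", "http://www.mysite.com/forum/index.php"))  # False

# ... but the rest of the site stays crawlable.
print(rp.can_fetch("*", "http://www.mysite.com/index.php"))  # True
```

If both checks come back as expected, the file itself is fine and the remaining traffic is either crawlers that have not yet re-read robots.txt or bots that ignore it entirely; those can only be blocked server-side (e.g. by user agent or IP).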