What is robots.txt? Many people say that robots.txt can increase a site's visibility in search engine results (SERPs). Why robots.txt? These questions are asked by many bloggers who want better traffic and rankings. With so many conflicting explanations across the internet, bloggers end up frustrated, not knowing what to do or whom to trust. (I did too, lol).
Here I share what I have learned about robots.txt, now that I have explored websites and blogs with valid, trusted sources.
robots.txt is JUST another way to TELL a search engine's spider what to do.
When a spider crawls a page on a site and finds the robots.txt file, it reads the file and decides what to do next. Every search engine works this way: Google, Bing, Yahoo, Ask, AOL, etc. Without a robots.txt, the spider crawls every page and section, processes them, and displays them in search results when visitors enter a query (keywords).
A spider is a piece of software that explores (crawls) the sites and blogs of the world
using its "net" and stores what it catches in its stomach.
Therefore, the robots.txt file is very useful. Because robots.txt lives in the root of the host, on hosted platforms such as Blogspot/Blogger it used to be impossible for bloggers to customize it. Now, however, blogspot.com has finally listened to its fans' pleas, and we are able to customize it at last. Besides, you can place a robots meta tag in your page markup, e.g. <meta name="robots" content="index,follow"/>
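As a minimal illustration of where that meta tag goes, it sits inside the page's <head>; the noindex,nofollow variant shown in the comment is the opposite instruction:

```
<head>
  <!-- Ask compliant spiders to index this page and follow its links. -->
  <meta name="robots" content="index,follow"/>
  <!-- The opposite: keep the page out of the index and do not follow its links. -->
  <!-- <meta name="robots" content="noindex,nofollow"/> -->
</head>
```

Unlike robots.txt, which covers the whole site from the root, a meta tag applies only to the single page that contains it.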
There are several items in robots.txt to decide and customize in order to raise a blog's visibility in search engine results. All of them come down to two main directives: Allow and Disallow.
Allow means spiders may crawl the page whose path is listed after it (the path begins with the root sign "/"), and Disallow means spiders must not crawl the page listed after it. This is simple enough: it is just "do" or "do not".
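To see Allow and Disallow from the spider's point of view, here is a small sketch using Python's standard urllib.robotparser module. The rules and the example.blogspot.com URLs are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only.
rules = """\
User-agent: *
Disallow: /search
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A compliant spider checks each URL against the rules before crawling it.
print(parser.can_fetch("*", "https://example.blogspot.com/search?q=test"))       # False
print(parser.can_fetch("*", "https://example.blogspot.com/2015/01/post.html"))   # True
```

This is exactly what a well-behaved crawler does: fetch robots.txt once, then test every candidate URL against it and skip the disallowed ones.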
Here is how the items in robots.txt fit together. The line user-agent: * means the rules that follow apply to all spiders. To block a specific page, put its path after Disallow, for instance:
Disallow: /p/about.html
To allow every page, write Allow: / (the root sign "/" covers everything).
For better results, you can customize the rules under user-agent: * as follows:
Allow all normal pages, and include your sitemap.
Disallow low-value pages, such as blog archives.
Add the noodp robots header tag so search engines (e.g. Googlebot) do not take your site's description from the DMOZ directory (dmoz.org).
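Putting those recommendations together, a robots.txt for a Blogger blog might look like the sketch below. The blog address is a placeholder, and /search is the path Blogger uses for archive and label pages:

```
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml
```

Note that noodp is not a robots.txt directive; it is set through the /Custom robots header tags/ option (or a robots meta tag), as described in the steps below.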
How to Customize robots.txt in Blogger
1. Log in to your Blogger account.
2. Go to [Settings].
3. Click the sub-setting [Search preferences].
4. Find /Custom robots.txt/, then click [Edit].
5. Write your own robots.txt rules inside the box.
Next, below /Custom robots.txt/, find /Custom robots header tags/ and click [Edit] to edit the robots header tags. Pick the options you wish, based on your needs, as explained above. See the illustration here.
*) This page will be discussed in detail later.
Herman Bin Nasarudin on Google+