In my last article, you read about robots.txt and sitemap.xml files and how you can add them to your blog or website. Now it is very important to keep adding new content to your blog or website; otherwise, search engine crawlers will start ignoring it. If they don't find any new content, they will lower your site's priority and crawl it less frequently. So it is very important that you regularly add fresh, quality material. This will also keep your visitors interested, so they return again and again. Here are a few guidelines to keep in mind while adding content to your blog or website.
In my last article, you read about How to Use Google Analytics to Check My Blog Users and Visitors. That means you have already added the Google Analytics script to your blog or website and have started tracking user activity on your site. The next step is instructing search engine robots and crawlers on which of your website/blog page links to index, since there may be a few pages you don't want to show publicly. Search engine crawlers revisit your website or blog every few days to check whether there are new pages or content to index. They do this to keep their databases up to date, so they can show current information when someone searches the internet. The robots.txt and sitemap.xml files help search engine robots and crawlers navigate your website or blog easily and get your content indexed.
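As a rough sketch, these two files might look like the following. Both the `/private/` directory and the example.com URLs are hypothetical placeholders; you would substitute your own domain and the paths you want to hide or list.

```
# robots.txt — placed at the site root (https://example.com/robots.txt)
User-agent: *          # these rules apply to all crawlers
Disallow: /private/    # hypothetical directory you don't want indexed
Sitemap: https://example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml — lists the pages you want crawlers to discover -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-04-01</lastmod>
  </url>
</urlset>
```

Crawlers that honor the robots exclusion standard will skip the disallowed paths and use the sitemap to find the pages you do want indexed.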