
Stop search engines from listing web pages in results

When search engines like Google visit your website, they add your web pages to their search results listings. Occasionally you may have a page on your website that you do not want search engines to display in their results, for example a password-protected page. Search results can also include files such as PDF or Word documents. To ask search engines not to list a page, you can add an instruction file called a 'robots.txt' file to your website.

If you have Takeaway Website Pro...

Log in to Takeaway Website and, from the Main Options, click 'Extras', followed by 'SiteBoost', then 'Keywords & Tags' and 'Meta Tag Boost'.

Click 'Next' three times. You will now see all of your website pages listed with different options next to them. Where it states 'What should search engines do with the page?' there is a drop-down box, which by default says 'Index & follow links'. For pages that you don't want Google to list, change this setting to 'Don't index page (noindex)'. Remember to re-publish your website through the Main Options for the changes to go live on the internet.
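For reference, a 'noindex' setting like this normally works by adding a robots meta tag to the page's head section. A minimal sketch of what the published page would likely contain (the exact markup SiteBoost generates may differ):

```html
<head>
  <!-- Tells all search engine crawlers not to list this page,
       while still allowing them to follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```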

Please note: If the pages you have just blocked are already listed on the search engines, you may need to wait several weeks for them to be re-indexed and removed from the listings.

If you have Takeaway Website Beginner or Standard...

You will not have access to Pro's SiteBoost tools, so you will need to create a 'robots.txt' file yourself and upload it to your website using FTP ('File Transfer Protocol', a standard way of transferring files over the internet).

Below is an example of a 'robots.txt' file - you can edit it to include any of your website's pages, files and directories that you don't want search engines to display in their search results. Edit the file in Notepad, NOT Word, which can add hidden formatting that will stop the file from working properly:


User-agent: *
Disallow: /USERIMAGES/
Disallow: /page6.htm
Disallow: /USERIMAGES/*.pdf

'User-agent: *' means that the rules apply to ALL search engines - every crawler that reads the 'robots.txt' file should obey the lines that follow and keep the blocked content out of its results listings.

'Disallow: /USERIMAGES/' means that search engines cannot index any files contained within the /USERIMAGES/ folder. This includes anything you upload to your site, such as images or documents.

'Disallow: /page6.htm' means that search engines will not list page6.htm in their results (this is the line to edit - replace the page name with whichever page you want hidden).

'Disallow: /USERIMAGES/*.pdf' means that search engines will not list your PDF document(s).
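If you want to sanity-check rules like these before uploading, Python's built-in urllib.robotparser module can read them. Note that it implements the original robots.txt standard, so the '*.pdf' wildcard line is not treated as a pattern - but the /USERIMAGES/ folder rule already blocks everything inside that folder anyway. The www.example.com domain below is just a placeholder:

```python
import urllib.robotparser

# The same rules as the example robots.txt above.
robots_txt = """\
User-agent: *
Disallow: /USERIMAGES/
Disallow: /page6.htm
Disallow: /USERIMAGES/*.pdf
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# page6.htm is disallowed for every crawler ('*')
print(rp.can_fetch("*", "http://www.example.com/page6.htm"))            # False
# the home page is still allowed
print(rp.can_fetch("*", "http://www.example.com/index.htm"))            # True
# anything under /USERIMAGES/ is blocked by the folder rule
print(rp.can_fetch("*", "http://www.example.com/USERIMAGES/menu.pdf"))  # False
```

A 'False' result means the crawler should not fetch or list that address.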

To upload your 'robots.txt' file via FTP, follow the steps below:

When FTPing we recommend using FileZilla, which is free to download from http://filezilla-project.org/

FTP SERVER: www.yourdomain.com (replace 'yourdomain.com' with your website name)

USER NAME: yourdomain.com (replace 'yourdomain.com' with your website name)

PASSWORD: your original Takeaway Website activation code in UPPER case

PORT: 21

PLEASE NOTE: If you have a 'Passive Transfer' setting in your FTP software, make sure it is set to 'disabled'. To do this, take the following steps:

 - Open Filezilla

 - Click 'Edit' then 'Settings...'

 - Under 'Connection' click 'FTP'

 - Change the transfer mode to 'Active' and click 'OK'

Click on the 'Quick Connect' button to log in to your FTP account. Once you are logged in, go into the 'wwwroot' folder.

On the left-hand side of the window you should see your computer's drives, for example C:\ and D:\. Go into the folder where you saved your 'robots.txt' file, then drag the file to the right-hand side to upload it.
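The upload steps above can also be scripted with Python's built-in ftplib module. This is only a sketch: the host name, user name and password shown are placeholders, and it assumes the same plain-FTP, active-mode, 'wwwroot' setup described above:

```python
from ftplib import FTP

def upload_robots_txt(host, user, password,
                      local_path="robots.txt", remote_dir="wwwroot",
                      ftp_factory=FTP):
    """Upload a local robots.txt to the web root over plain FTP."""
    ftp = ftp_factory(host)            # connects on the default port, 21
    ftp.login(user, password)
    ftp.set_pasv(False)                # 'Passive Transfer' disabled = active mode
    ftp.cwd(remote_dir)                # the site's files live in 'wwwroot'
    with open(local_path, "rb") as f:
        ftp.storbinary("STOR robots.txt", f)
    ftp.quit()

# Example call (placeholder credentials - replace with your own):
# upload_robots_txt("www.yourdomain.com", "yourdomain.com", "ACTIVATIONCODE")
```

The ftp_factory parameter is only there so the function can be exercised without a live server; in normal use you would call it with the defaults.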

If you then visit http://www.yourdomainname.com/robots.txt (replace 'yourdomainname.com' with your website name) you should be able to see your file. Search engines will read this file and follow its instructions the next time they visit your website to index and list your pages.

Please note: If the pages you have just blocked are already listed on the search engines, you may need to wait several weeks for them to be re-indexed and removed from the listings.