Archive for the ‘LPT730 – LAB3’ Category

Robot Exclusion

September 23, 2008


Robots are programs that traverse many pages automatically, recursively retrieving linked pages. They are also known as WWW Robots, Spiders, or Crawlers.

They were useful in the past, when dial-up calls were expensive: a cheaper solution was to download all the texts you wanted to read (newspapers, books, etc.) to your computer and then hang up the phone line, saving you some money on the phone bill.

One popular program I used in those days was webmirror.

Around 1993 and 1994 there were occasions where robots visited web servers where they weren’t welcome, for various reasons: some robots swamped servers with rapid-fire requests, retrieved the same files repeatedly, or descended very deep into virtual trees.

These incidents showed the need for an established mechanism by which web servers could indicate to robots which parts of the server should not be accessed.

The solution for excluding robots from accessing sensitive information on a server was to create a file on the server that specifies an access policy for robots. This file must be accessible via HTTP at the local URL “/robots.txt”.
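A minimal robots.txt policy might look like this (the paths and user-agent name below are made up for illustration):

```
User-agent: *
Disallow: /private/
Disallow: /tmp/

User-agent: BadBot
Disallow: /
```

Each record names a user-agent (or “*” for all robots) followed by the URL prefixes that agent should not fetch; “Disallow: /” excludes an agent from the whole site.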

This approach was chosen because it can be easily implemented on any existing WWW server, and a robot can find the access policy with only a single document retrieval. Note, however, that this control is implemented in the robot itself and can be deactivated, so it is a cooperative convention rather than an enforced restriction.
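A well-behaved robot checks the policy before fetching a page. A minimal sketch of that check, using Python’s standard robotparser and an assumed robots.txt policy (the paths and crawler name are invented for illustration):

```python
from urllib import robotparser

# Assumed policy a robot would have fetched from "/robots.txt"
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A cooperative crawler consults the policy before each retrieval
print(rp.can_fetch("MyCrawler", "/index.html"))    # allowed by the policy
print(rp.can_fetch("MyCrawler", "/private/data"))  # disallowed by the policy
```

In a real crawler the policy would be retrieved once per site with RobotFileParser.set_url() and read(), which is exactly the single-document-retrieval property described above.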

Phishing – Protect yourself before you lose your information

September 22, 2008

Following the definition from the website, phishing is the act of sending an email to a user, falsely claiming to be an established legitimate enterprise, in an attempt to scam the user into surrendering private information that will be used for identity theft.

The e-mail directs the user to visit a web site where they are asked to update personal information, such as passwords and credit card, social security, and bank account numbers, that the legitimate organization already has. The web site, however, is bogus and set up only to steal the user’s information.

To protect yourself against phishing, follow these recommendations:

  • Never open an email from an unknown sender
  • Be careful about which kinds of attachments you open
  • Use Firefox instead of Internet Explorer
  • Keep your web browser up to date
  • Use a good antivirus and update it daily
  • Never click links to get to a website; always prefer to type the address yourself
  • Pay attention to the padlock that appears in the browser when entering a protected site

In the blog, the author describes, step by step, a phishing attempt against his blog.

Another web site has a huge collection of screenshots of real phishing attacks. They also offer free anti-phishing software under the GNU GPL.