|dc.description.abstract||The use of web robots has exploded on today's World Wide Web (WWW). Web robots are used for various nefarious activities including click fraud, spamming, email scraping, and gaining an unfair advantage. Click fraud and unfair advantage give bot writers a way to gain a monetary advantage. Click fraud web bots allow their authors to trick advertisement (ad) networks into paying for clicks on ads that do not come from human beings. This costs ad networks and advertisers money they intended to spend on advertisements for human consumption. It also damages the reputation of ad networks in the eyes of advertisers.
These problems make combating web robots an important and necessary task on the WWW. Web robots are combated by various methods that distinguish bot visits from human visits. These methods involve analyzing web server logs or modifying the pages on a website.
When log analysis cannot detect sophisticated web robots, web page modification methods like Decoy Link Design Adaptation (DLDA) and Transient Links come into play. These methods dynamically modify web pages such that web robots cannot navigate them, while still providing the same user experience to humans in a web browser. The two methods differ in the type of web robot they defend against: DLDA works against crawling web bots, which parse the web page source, while Transient Links works against replay web bots, which make requests with links and data previously given to them. DLDA detects crawling bots by hiding real links among decoy (fake) links. The web bot is forced to choose a link at random and risks detection by picking a decoy link. Transient Links detects replay bots by creating single-use URLs. When a replay bot reuses one of these single-use links, Transient Links can tell the link has been used before and is being replayed.||
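The two mechanisms described above can be sketched in a few lines of Python. This is a minimal illustration under assumed names (`TransientLinkServer`, `dlda_detection_probability` are hypothetical, not from the thesis): single-use tokens model Transient Links, and a uniform random choice among one real link and n decoys models DLDA's detection chance of n/(n+1).

```python
import secrets

class TransientLinkServer:
    """Sketch of Transient Links: each issued URL embeds a single-use token."""

    def __init__(self):
        self.valid_tokens = set()

    def issue_link(self, path):
        # Attach a fresh random token to the URL (hypothetical parameter name "t").
        token = secrets.token_urlsafe(16)
        self.valid_tokens.add(token)
        return f"{path}?t={token}"

    def handle_request(self, token):
        if token in self.valid_tokens:
            self.valid_tokens.discard(token)  # consume: the link is single-use
            return "ok"
        # Token absent: either never issued or already consumed -> replay.
        return "replay-detected"

def dlda_detection_probability(num_decoys):
    """A bot choosing uniformly among 1 real link and num_decoys decoys
    picks a decoy (and is detected) with probability num_decoys / (num_decoys + 1)."""
    return num_decoys / (num_decoys + 1)
```

For example, a page with three decoy links per real link catches a uniformly guessing bot three times out of four, and any second request carrying an already-consumed token is flagged as a replay.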