Archive of UserLand's first discussion group, started October 5, 1998.

Deep Linking

Author:Jacob Levy
Posted:8/9/1999; 3:20:06 PM
Topic:Deep Linking
Msg #:9274
Prev/Next:9273 / 9275

Dave's recent DaveNet piece proposes to require each web site to have a deepLink.txt file to denote whether and how it is willing to be deep-linked. IMHO that won't work. Unlike robots.txt, which was intended to be read by programs and has a form easily parseable by programs, the deepLink.txt file is for *users* -- most deep linking is done by users, not by programs. There is no concept of a "well written" user, and it's too easy and tempting to take what's there for the taking.

IMHO the only way is enforcement through server-side action on the part of the webmaster who owns the site. Since there is a technical solution to this kind of problem, I can't see any merit in Universal's lawsuit. As with copyright and patent infringement, deep linking should be prevented by active and vigilant enforcement if that's the owner's intent. I expect the lawsuit to be thrown out on the basis that Universal did not take the elementary steps to protect itself adequately.
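One common form the server-side enforcement described above can take is a Referer check: serve "deep" pages only when the request arrived via the site's own front door. The sketch below is not from the post; the host names, paths, and function are hypothetical, and real Referer headers can be absent or spoofed, so this is a blunt instrument, not a complete defense.

```python
# Minimal sketch of referer-based deep-link blocking.
# OWN_HOSTS and ENTRY_PATHS are hypothetical example values.
from urllib.parse import urlparse

OWN_HOSTS = {"example.com", "www.example.com"}  # the site's own hosts (assumed)
ENTRY_PATHS = {"/", "/index.html"}              # pages anyone may land on directly

def allow_request(path, referer):
    """Return True if the request should be served.

    Entry pages are always served; any other page is served only when
    the Referer header points back at one of the site's own hosts.
    Note: browsers may omit Referer, and it is trivially forgeable.
    """
    if path in ENTRY_PATHS:
        return True
    if not referer:
        return False  # no Referer: treat as a deep link from outside
    return urlparse(referer).hostname in OWN_HOSTS

# A deep link from another site is refused; internal navigation is served.
print(allow_request("/stories/page2.html", "http://other.example/links.html"))  # False
print(allow_request("/stories/page2.html", "http://www.example.com/index.html"))  # True
```

In practice a webmaster would put this logic in the server itself (e.g. rewrite rules keyed on the Referer header) rather than in application code, but the decision being made is the same.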

There are responses to this message:

This page was archived on 6/13/2001; 4:51:47 PM.

© Copyright 1998-2001 UserLand Software, Inc.