Archive of UserLand's first discussion group, started October 5, 1998.
Author: Jacob Levy Posted: 8/9/1999; 3:20:06 PM Topic: Deep Linking Msg #: 9274 Prev/Next: 9273 / 9275
Dave's recent DaveNet piece proposes requiring each web site to have a deepLink.txt file declaring whether and how it is willing to be deep-linked. IMHO that won't work. Unlike robots.txt, which was intended to be read by programs and had a form easily parseable by programs, the deepLink.txt file is for *users* -- most deep linking is done by users, not by programs. There is no concept of a "well written" user, and it's too easy and tempting to take what's there for the taking.
IMHO the only way is enforcement through server-side action on the part of the webmaster who owns the site. Since there is a technical solution to this kind of problem, I can't see any merit in Universal's lawsuit. As with copyright and patent violations, deep linking should be prevented by active and vigilant enforcement if that's the intent of the owner. I expect the lawsuit to be thrown out on the basis that Universal did not take the elementary steps to protect itself adequately.
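The server-side enforcement described above is typically done by checking the HTTP Referer header. A minimal sketch, assuming a hypothetical site whose own hosts are listed in ALLOWED_HOSTS (all names here are illustrative, not from the original message):

```python
from urllib.parse import urlparse

# Hypothetical list of hostnames that count as "our own site".
ALLOWED_HOSTS = {"www.example.com", "example.com"}

def allow_deep_link(referer_header):
    """Return True if a request for an inner page should be served.

    Requests arriving with no Referer are allowed, because many
    browsers and proxies strip the header; blocking them would shut
    out legitimate readers. Requests referred from a foreign host
    are the deep links a webmaster might choose to refuse (e.g. by
    redirecting them to the front page instead).
    """
    if not referer_header:
        return True
    host = urlparse(referer_header).hostname
    return host in ALLOWED_HOSTS
```

A web server or CGI script would call this per request and, when it returns False, redirect the visitor to the home page rather than serving the deep-linked page directly. Note that Referer checking is advisory at best: the header is trivially forged or suppressed, which is why it deters casual deep linking rather than preventing it outright.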
There are responses to this message:
- Re: Deep Linking, email@example.com, 8/9/1999; 3:35:23 PM
- Re: Deep Linking, Lawrence Lee, 8/9/1999; 3:35:28 PM
- Re: Deep Linking, firstname.lastname@example.org, 8/9/1999; 5:04:51 PM
- Re: Deep Linking, Dave Winer, 8/9/1999; 5:12:03 PM
- Re: Deep Linking, Christoph Pingel, 8/10/1999; 6:25:30 AM
- Re: Deep Linking, Jim Goodman, 8/10/1999; 7:30:42 AM
- Oh good, another corporate information entitlement, TQ White II, 8/10/1999; 8:17:47 AM
- Re: Deep Linking, Andrew Wooldridge, 8/10/1999; 9:19:39 AM
- Don't throw the baby out with the bathwater, Jeremy Bowers, 8/10/1999; 10:28:35 AM
- Re: Deep Linking, email@example.com, 8/10/1999; 1:27:14 PM
This page was archived on 6/13/2001; 4:51:47 PM.
© Copyright 1998-2001 UserLand Software, Inc.