On 23 Oct 2002, Frank Knobbe wrote:

> I think it was Jose who used the example of a rogue broker accessing
> websites in a certain order. While valid traffic, shouldn't it be
> possible to detect that behavior?

One more thing: look at your web logs. Quite frankly, I'd qualify many of my human visitors as covert channel agents or the like. People reload pages for no reason, often visit them several times in a row, click the same links a number of times, click on quite random links; some pause for an hour, others read just the first few words and pause for a few seconds; they list directories in every possible sort order, mistype URLs five times in a row, and even manage to append to the URLs in their address bar...

It's really next to impossible to profile this behavior without a model loose enough to miss even a lousy hidden covert channel. If your network has more than 100 users, I think the only thing you'd achieve is flagging a number of normal users as suspicious software while completely missing low-profile covert channel agents ;-)

--
------------------------- bash$ :(){ :|:&};: --
 Michal Zalewski * [http://lcamtuf.coredump.cx]
    Did you know that clones never use mirrors?
--------------------------- 2002-10-23 18:51 --
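[Editor's note: the profiling dilemma above can be sketched in code. This is a minimal illustration, not anyone's actual detector; the log record format, the feature names, and both thresholds are hypothetical. The point it shows: thresholds loose enough to tolerate the human quirks Zalewski lists are also loose enough to pass a steady, low-profile covert channel.]

```python
from collections import defaultdict

# Hypothetical access-log records: (user, unix_time, url).
# A real log (e.g. Apache Combined Log Format) would need parsing first.
records = [
    ("alice", 0, "/index"), ("alice", 1, "/index"),    # instant reloads
    ("alice", 2, "/index"), ("alice", 3600, "/news"),  # one-hour pause
    ("bob",   0, "/a"),     ("bob",   5, "/b"),
    ("bob",  10, "/c"),     ("bob",  15, "/d"),        # steady 5 s cadence
]

def profile(recs):
    """Per-user features: ratio of repeated requests and inter-request delays."""
    by_user = defaultdict(list)
    for user, t, url in recs:
        by_user[user].append((t, url))
    feats = {}
    for user, hits in by_user.items():
        hits.sort()
        repeats = sum(1 for a, b in zip(hits, hits[1:]) if a[1] == b[1])
        delays = [b[0] - a[0] for a, b in zip(hits, hits[1:])]
        feats[user] = {"repeat_ratio": repeats / max(len(hits) - 1, 1),
                       "delays": delays}
    return feats

def suspicious(f, max_repeat=0.9, max_delay=7200):
    # Thresholds must be this loose to avoid flagging alice's very human
    # reload-and-wander pattern -- and at this looseness, bob's machine-like
    # 5-second cadence (a plausible covert channel) sails through too.
    return f["repeat_ratio"] > max_repeat or any(d > max_delay for d in f["delays"])

feats = profile(records)
for user, f in sorted(feats.items()):
    print(user, "suspicious" if suspicious(f) else "normal")
```

Running this marks both users "normal": tightening either threshold enough to catch bob starts catching alice first, which is exactly the false-positive trap described above.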
This archive was generated by hypermail 2b30 : Wed Oct 23 2002 - 16:10:37 PDT