[logs] MY ACCESS LOG FILES ARE FILLED WITH "SEARCH /\x90\x04H\x04H\x04H\x04H\x04H\x04H\x04H\..................................."

From: MUKESH KUMAR SINGH (mksingh13@private)
Date: Tue Dec 06 2005 - 20:45:28 PST


Dear friends,

My access log files are filled with strings like

SEARCH
/\x90\x04H\x04H\x04H\x04H\x04H\x04H\x04H\...................................

Can anyone please help me and tell me exactly what this means?

Thanks and regards,

MAK
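
For readers who want to see how much of this traffic they are getting, here is a minimal sketch for pulling those SEARCH requests out of an Apache-style access log and counting them per client address; the combined log format and the file name access.log are assumptions, not details from the message above:

    #!/usr/bin/env python
    # Count SEARCH requests carrying the \x90... payload, per client address.
    # Assumes the Apache combined log format (client IP first, request line
    # quoted) and a hypothetical file name "access.log"; adjust as needed.
    import re
    from collections import Counter

    pattern = re.compile(r'^(\S+) .*"SEARCH /\\x90')  # \x90 appears escaped in the log
    hits = Counter()

    with open("access.log") as log:
        for line in log:
            match = pattern.match(line)
            if match:
                hits[match.group(1)] += 1

    for ip, count in hits.most_common():
        print("%6d  %s" % (count, ip))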



On 12/7/05, loganalysis-request@private <
loganalysis-request@private> wrote:
>
> Send LogAnalysis mailing list submissions to
>         loganalysis@private
>
> To subscribe or unsubscribe via the World Wide Web, visit
>         http://lists.shmoo.com/mailman/listinfo/loganalysis
> or, via email, send a message with subject or body 'help' to
>         loganalysis-request@private
>
> You can reach the person managing the list at
>         loganalysis-owner@private
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of LogAnalysis digest..."
>
>
> Today's Topics:
>
>    1. Re: regex-less parsing of messages (Solomon, Frank)
>    2. Re: regex-less parsing of messages (Adrian Grigorof)
>    3. Re: regex-less parsing of messages (Moehrke, John (GE Healthcare))
>    4. Re: regex-less parsing of messages (todd.glassey@private)
>    5. Re: regex-less parsing of messages (Christina Noren)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Tue, 6 Dec 2005 08:45:34 -0500
> From: "Solomon, Frank" <frank@private>
> Subject: [logs] Re: regex-less parsing of messages
> To: <LogAnalysis@private>
> Message-ID: <E8D6504F8196F64E82053613BCA8159860B28D@private>
> Content-Type: text/plain;       charset="us-ascii"
>
> Jason, your example certainly struck a chord.  We haven't even begun to
> put our mail logs into our central log server because of the technical
> challenges that would pose.  And yet, we get asked the same sort of
> questions which require a highly trained person to probe through the
> heterogeneous mail log files and trace the path of some errant envelope
> that may or may not actually exist.  It is not pretty; part of the price
> we pay for having to accommodate multiple mail systems, vendors and
> standards.
>
> Our standing joke is:  "That's the nice thing about standards, there are
> so many to choose from and everyone can have their own."  So, "sendmail"
> has its "standard" log format and "Exchange" has its "standard" log
> format, and "Novell" has its "standard" log format, etc.  I saw an
> article recently describing the new "logging standard" that Microsoft
> was about to introduce in their latest OS.  Well that will certainly
> clear things up!  I'm sure all their competitors will rush to implement
> compatible systems.  Don't get me wrong, I laud Microsoft's attempt to
> enforce programmer discipline.
>
> In case you're interested in the MS stuff:
> http://msdn.microsoft.com/library/default.asp?url=/library/en-us/wes/wes
> /about_the_windows_event_log.asp
>
> <dreaming>
> Certainly, the first challenge in being able to analyze data is getting
> it into a common format with a common symbolic representation of the
> underlying information.  Since we cannot count on the energy and
> discipline of the programmers that write the log-generating programs,
> that energy must be invested in and discipline must be enforced by the
> log collection mechanism.  It's becoming obvious to me that the blanket
> approach of collecting everything on the off chance that some auditor or
> forensic specialist in the future might be able to make sense of it, is
> a waste of resources.  That implies that the requirements for what needs
> to be logged could be set at the collecting end and that somehow those
> requirements need to be communicated to the source of the messages to
> make sure that the required messages exist and are coded appropriately
> (which they won't be).
> </dreaming>
>
> I know, I'm dreaming: there's no choice but to continue to collect tons
> of ore and hope to glean an ounce of silver from it every once in a
> while.  And besides, those old log CDs make nifty tree ornaments.
>
> John Moehrke mentioned that his organization was making the attempt to
> define the standards for the events at the beginning.  To quote:  "We
> thus will be sending the experts in log analysis an already manageable
> format."  That's a great idea, but it suffers from the same standards
> problem I've mentioned:  everybody's likely to have their own (maybe
> someday the only industry will be healthcare, but not yet).  And after
> looking at the RFC, I can't imagine that good things will come of the
> burden this will place on the infrastructure if the logging rate is very
> high.  Can you imagine the "sendmail" guys wrapping xml around the mail
> logs?  Or, all the mail system vendors agreeing on a common xml schema
> for their mail logs?  Yeah, it might happen.
>
> Personally, I'm glad that syslog uses udp.
>
> Sorry, I've rambled entirely too long, I'll go back to merely listening.
>
> Frank Solomon
> University of Kentucky
> Lead Systems Programmer, Enterprise Systems
> http://www.franksolomon.net
> "If you give someone a program, you will frustrate them for a day; if
> you teach them how to program, you will frustrate them for a lifetime."
> --Anonymous
>
>
> -----Original Message-----
> [mailto:loganalysis-bounces+sysfrank=uky.edu@private] On Behalf
> Of Jason Haar
> Sent: Monday, December 05, 2005 3:15 PM
>
> . . .snip. . .
>
> Boring, everyday example:  These days (due to the horrors of antispam
> systems) internal users routinely ring the helpdesk and ask "Customer YY
> sent me an email and I never got it. What happened?". To figure that out
> involves converting what you can learn about customer YY into DNS
> records and IP addresses, then tracking any related connections as they
> hit the edge of our Internet link, where they first meet our RBL checks,
> then flow through AV and antispam systems, then through a couple more
> internal mail relays before hitting our end mail servers. We have logs
> all merged together from all those systems, but frankly, I am still the
> only one who can link all those events together. And my attempts at
> turning that eyeballing into a program have failed so far. And that's
> only one example.
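
A minimal sketch of the first step of that eyeballing, assuming a merged syslog-style mail log named mail.log and a placeholder customer domain; it just resolves the domain and pulls the log lines that mention it or its addresses, which is roughly where the manual correlation starts:

    #!/usr/bin/env python
    # Rough first pass at the workflow above: turn a customer domain into IP
    # addresses, then pull every merged mail-log line that mentions either.
    # "example.com" and "mail.log" are placeholders, not values from the thread.
    import socket

    domain = "example.com"
    addresses = set()

    try:
        # A fuller version would also walk the domain's MX records.
        for info in socket.getaddrinfo(domain, None):
            addresses.add(info[4][0])
    except socket.gaierror:
        pass

    with open("mail.log") as log:
        for line in log:
            if domain in line or any(addr in line for addr in addresses):
                print(line.rstrip())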
>
> . . .
>
>
> ------------------------------
>
> Message: 2
> Date: Tue, 6 Dec 2005 14:59:37 -0500
> From: "Adrian Grigorof" <adi@private>
> Subject: [logs] Re: regex-less parsing of messages
> To: <LogAnalysis@private>
> Cc: Anton Chuvakin <anton@private>
> Message-ID: <0c5401c5fa9f$981e6500$4600a8c0@private>
> Content-Type: text/plain;       charset="iso-8859-1"
>
> This is literally the million-dollar question of the log analysis
> industry, and as a log analyzer developer it's on my mind every day. I
> guess everyone agrees that log analysis should be the job of AIs, and
> given current technology there are just a few potential approaches:
>
> 1. Expert systems - a collection of empirical data and decision
> algorithms compiled by developers - most log analysis solutions
> (including ours) implement this type of AI.
> 2. Hidden Markov models - since they are used in natural language and
> speech processing, they might be applicable to log entries (which are,
> after all, a type of "natural speech").
> 3. Neural nets - once built, the neural net would be trained by
> experienced teachers (log analysis gurus).
> 4. Genetic algorithms - the trick would be to 1. define the right
> requirements (for example, determine the least number of message types
> without discarding significant data) and 2. define the genetic codes
> for the solution organisms. Maybe GAs are a bit far-fetched, but I
> wouldn't exclude them.
>
> The problem is that most developers can only program some sort of
> expert system and add rules to it by brute force. The other three
> methods (if really applicable to log parsing) require (very) advanced
> mathematical skills and expensive hardware - this is the realm of
> Ph.D.s and research labs. The way I see it, we'll be stuck with
> "expert systems" for a while - the market for log analysis software is
> not rich enough to justify the kind of investment required to keep a
> couple of Ph.D.s on your payroll.
>
> Anton mentioned Bayes, but personally I would see Bayesian logic used
> in analyzing the results of the log analysis rather than in the actual
> parsing of the log entries. The analyzer would continually learn
> patterns from the daily traffic, so its ability to raise real alarms
> and discard false positives would increase with every log analyzed.
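
A minimal sketch of that last idea, a toy naive Bayes scorer trained on log lines already labeled alarm-worthy or routine; the training lines are invented for illustration and this is not anything Adrian describes shipping:

    #!/usr/bin/env python
    # Toy naive Bayes scorer for whole log lines: train on lines already
    # labeled "alarm" or "routine", then score new ones. The training lines
    # below are invented purely for illustration.
    import math
    from collections import defaultdict

    class LineScorer:
        def __init__(self):
            self.words = {"alarm": defaultdict(int), "routine": defaultdict(int)}

        def train(self, label, line):
            for word in line.lower().split():
                self.words[label][word] += 1

        def score(self, line):
            # Log-likelihood of "alarm" minus "routine", add-one smoothing;
            # class priors are ignored to keep the sketch short.
            result = 0.0
            for label, sign in (("alarm", 1), ("routine", -1)):
                total = sum(self.words[label].values()) or 1
                vocab = len(self.words[label]) or 1
                for word in line.lower().split():
                    p = (self.words[label][word] + 1.0) / (total + vocab)
                    result += sign * math.log(p)
            return result  # > 0: looks more like past alarms

    scorer = LineScorer()
    scorer.train("routine", "connection from 10.0.0.5 accepted")
    scorer.train("alarm", "authentication failure for root from 203.0.113.9")
    print(scorer.score("authentication failure for admin from 198.51.100.7"))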
>
> Regards,
>
> Adrian Grigorof
> www.firegen.com
>
>
> ----- Original Message -----
> From: "Anton Chuvakin" <anton@private>
> To: <LogAnalysis@private>
> Sent: Sunday, December 04, 2005 18:58
> Subject: [logs] regex-less parsing of messages
>
>
> > All,
> >
> > It's time for me to come out of lurking again :-) Here is the thing:
> > when people want to analyze logs, the first stage is often to tokenize
> > (or "parse" as some say) the logs into some manageable format (XML
> > anyone?) for analysis or RDBMS storage.
> >
> > However, if logs are very diverse and lack a format in the first
> > place, the above becomes a mammoth task, since one has to write a lot
> > of ugly regular expressions. In addition, if a message with a new
> > format comes out of the woodwork, a new regex needs to be created. Or,
> > a silly generic regex is used (such as the one that only tokenizes the
> > date and the device name from a Unix syslog message).
> >
> > What are the possible ways around it? From what I know, none of the
> > easy or fun ones. One might try to use clustering (such as 'slct') to
> > try to identify the variable and stable parts of messages from a bulk
> > of them, but that still does not make them tokenized. Or, one can try
> > to create a "brute-forcing parser" that will try to guess, for
> > example, that a part of a message containing four dot-separated groups
> > of 1 to 3 digits is really an IP address. However, it will likely fail
> > more often than not, and it is kinda hard :-) for it to tell a
> > username from a password (both are strings). Or, one can do analysis
> > without tokenizing the logs into a common format, such as with Bayes,
> > by treating them as pretty much English text...
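
A minimal sketch of the two ideas above, assuming classic "MMM dd HH:MM:SS host rest" syslog lines: the silly generic regex that tokenizes only the date and device name, plus the brute-force guess that any dotted quad of 1-3 digit numbers is an IP address (which, as noted, will misfire on things like version numbers):

    #!/usr/bin/env python
    # A "silly generic regex" that only pulls the timestamp and device name
    # out of a classic Unix syslog line, plus a brute-force guess that any
    # dotted quad of 1-3 digit numbers in the free text is an IP address.
    import re

    SYSLOG = re.compile(r"^(?P<date>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) "
                        r"(?P<host>\S+) (?P<rest>.*)$")
    IPISH = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

    def tokenize(line):
        m = SYSLOG.match(line)
        if not m:
            return None
        return {
            "date": m.group("date"),
            "host": m.group("host"),
            "ips": IPISH.findall(m.group("rest")),   # will sometimes be wrong,
            "rest": m.group("rest"),                 # e.g. on version numbers
        }

    print(tokenize("Dec  6 20:45:28 gw sshd[421]: "
                   "Failed password for root from 10.1.2.3 port 4022 ssh2"))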
> >
> > So, any more ideas from the group on handling it?
> >
> > Best,
> > --
> > Anton Chuvakin, Ph.D., GCIA, GCIH, GCFA
> >          http://www.chuvakin.org
> >     http://www.securitywarrior.com
> > _______________________________________________
> > LogAnalysis mailing list
> > LogAnalysis@private
> > http://lists.shmoo.com/mailman/listinfo/loganalysis
> >
> >
>
>
>
> ------------------------------
>
> Message: 3
> Date: Tue, 6 Dec 2005 08:56:44 -0600
> From: "Moehrke, John \(GE Healthcare\)" <John.Moehrke@private>
> Subject: [logs] Re: regex-less parsing of messages
> To: "Solomon, Frank" <frank@private>,
>         <LogAnalysis@private>
> Message-ID:
>         <45A5295FFA1CBE4D9BF44E8534D2686C09E0AB8B@privatem
> >
> Content-Type: text/plain;       charset="US-ASCII"
>
> Great point. RFC 3881 was written by healthcare people, but it wasn't
> intended to be exclusive to healthcare. We would welcome others to
> develop it further for non-healthcare purposes.
>
> I attended a presentation at Catalyst by Mary Ann Davidson from Oracle.
> Her presentation was on the subject of creating a common XML schema to
> describe security audit events. She indicated that NIST had shown some
> interest in driving this standardization. She gave me the name Elizabeth
> Chew <echew@private>. I have not gotten responses back from Elizabeth,
> so I don't know the current status.
>
> John
>
> > -----Original Message-----
> > From:
> > loganalysis-bounces+john.moehrke=med.ge.com@private
> > [mailto:loganalysis-bounces+john.moehrke=med.ge.com@private
> > o.com] On Behalf Of Solomon, Frank
> > Sent: Tuesday, December 06, 2005 7:46 AM
> > To: LogAnalysis@private
> > Subject: [logs] Re: regex-less parsing of messages
> >
> > . . .snip. . .
> >
>
>
> ------------------------------
>
> Message: 4
> Date: Tue, 06 Dec 2005 16:13:01 +0000
> From: todd.glassey@private
> Subject: [logs] Re: regex-less parsing of messages
> To: "Solomon, Frank" <frank@private>,
>         <LogAnalysis@private>
> Message-ID:
>         <
> 120620051613.5476.4395B88C000BD73A000015642160375964970A9C9C0E0409D20B0B019B@pr
> >
>
>
> We use SPLUNK for exactly this.
>
> Todd
> -------------- Original message ----------------------
> From: "Solomon, Frank" <frank@private>
> > . . .snip. . .
>
>
>
>
> ------------------------------
>
> Message: 5
> Date: Tue, 6 Dec 2005 18:26:49 -0800
> From: Christina Noren <cfrln@private>
> Subject: [logs] Re: regex-less parsing of messages
> To: todd.glassey@private, LogAnalysis@private
> Message-ID: <8B41EDCD-2DE3-4CD2-81FE-BFC4DEF28217@private>
> Content-Type: text/plain; charset=US-ASCII; delsp=yes; format=flowed
>
> Speaking from Splunk...
>
> This problem of needing to build and maintain a big library of
> regexes to analyze logs centrally is one we're trying to end run, so
> thanks Todd for bringing us into the conversation.
>
> We agree with Frank that getting common XML standards is pretty
> unlikely across the broad range of log sources people need to correlate.
>
> We've instead built a series of universal processors that find and
> normalize timestamps in any format, then tokenize everything in each
> event, and classify new sources and events based on patterns and
> grammatical structure in the event. We put off all of the semantics
> till search time so we don't need to worry about mapping "deny",
> "reject", and other variants of the same action to a common value. I'm
> oversimplifying a more complex set of algorithms for the sake of a
> short message.
>
> Users are able to put in log sources we've never seen before and have
> them handled by the same algorithms as everything else.
>
> Then, instead of a structured relational db, we put everything into a
> rich, dense search index behind a simple search interface that
> provides results to most searches in seconds. This has the nice side
> effect of making ad hoc access to the logs a lot easier than needing
> to form a SQL style query.
>
> This works pretty well for use cases like tracing an email message
> through different sendmail, antispam and other events and other
> investigative/troubleshooting scenarios. There's really no reason to
> write a regex to parse sendmail's different message formats into a
> structured schema if you're going to search for an email address and
> time, then follow that event based on message id and other content of
> that event. We have some interesting accelerators for following the
> correlation, like a "related" feature that looks for the connections
> based on time and value.
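
A minimal sketch of the schema-less approach being described, leaving out the timestamp normalization, event classification, and everything else; it just tokenizes whole events into an inverted index and answers ad hoc searches from it. This is an assumption-laden toy, not Splunk's implementation:

    #!/usr/bin/env python
    # Toy "index everything, parse nothing" store: every event is tokenized
    # and every token maps to the events containing it, so ad hoc searches
    # like an email address plus a host name need no per-format regex.
    import re
    from collections import defaultdict

    TOKEN = re.compile(r"[\w.@:/-]+")

    class EventIndex:
        def __init__(self):
            self.events = []
            self.postings = defaultdict(set)   # token -> set of event ids

        def add(self, raw_event):
            eid = len(self.events)
            self.events.append(raw_event)
            for tok in TOKEN.findall(raw_event.lower()):
                self.postings[tok].add(eid)

        def search(self, *terms):
            # Intersect posting lists; every term must appear in the event.
            ids = None
            for term in terms:
                hits = self.postings.get(term.lower(), set())
                ids = hits if ids is None else ids & hits
            return [self.events[i] for i in sorted(ids or [])]

    idx = EventIndex()
    idx.add("Dec  6 14:59:37 relay1 sendmail[88]: from=<bob@example.com> accepted")
    idx.add("Dec  6 14:59:39 spamfilter: bob@example.com scored 0.2, delivered")
    for event in idx.search("bob@example.com"):
        print(event)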
>
> - Christina
>
> p.s. you can download Splunk free at www.splunk.com
>
>
>
> On Dec 6, 2005, at 8:13 AM, todd.glassey@private wrote:
>
> > We use SPLUNK for exactly this.
> >
> > Todd
> >  -------------- Original message ----------------------
> > From: "Solomon, Frank" <frank@private>
> >
> >> . . .snip. . .
> >
> >
> > _______________________________________________
> > LogAnalysis mailing list
> > LogAnalysis@private
> > http://lists.shmoo.com/mailman/listinfo/loganalysis
> >
>
>
>
> ------------------------------
>
> _______________________________________________
> LogAnalysis mailing list
> LogAnalysis@private
> http://lists.shmoo.com/mailman/listinfo/loganalysis
>
>
> End of LogAnalysis Digest, Vol 31, Issue 2
> ******************************************
>



_______________________________________________
LogAnalysis mailing list
LogAnalysis@private
http://lists.shmoo.com/mailman/listinfo/loganalysis



This archive was generated by hypermail 2.1.3 : Tue Dec 06 2005 - 20:48:21 PST