On Wed, Sep 19, 2001 at 09:48:22PM +0200, Nicolai Rasmussen wrote:
> We run some websites that generate more than 5 GB of logs per day across
> approx. 50 different sites, and we would like to put them into a database
> so we can do some data mining on them.
>
> Does anyone have any ideas, input, thoughts or anything on how we should
> do this?
>
> We thought about making an optimized table definition and then dumping
> each line into the database. From there we would generate some summary
> reports.

This is the way I implemented such a solution for an ISP. We piped around
4 GB of data into a DB2 database (on Linux) and used MS Access (*duck*) to
connect to the database and generate traffic bills.

-- 
With best regards

Hans-Joachim Picht <hansat_private>
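A minimal sketch of the load step described above, assuming Apache
common/combined-format log lines and a hypothetical access_log table with
made-up column names. The posts above used DB2 and MS Access; Python's
standard-library sqlite3 stands in here only to keep the example
self-contained, since the parse-and-batch-insert pattern is the same for
any SQL database:

import re
import sqlite3

# Matches the common-log-format prefix of each line (combined format adds
# referer and user-agent, which re.match simply leaves unconsumed).
# All field and table names here are hypothetical.
LINE_RE = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\d+|-)'
)

SCHEMA = """
CREATE TABLE IF NOT EXISTS access_log (
    host    TEXT,
    user    TEXT,
    time    TEXT,
    request TEXT,
    status  INTEGER,
    bytes   INTEGER
)
"""

def load_log(db_path, log_path, batch_size=10000):
    """Parse one log file and bulk-insert its lines in batches."""
    conn = sqlite3.connect(db_path)
    conn.execute(SCHEMA)
    batch = []
    with open(log_path, encoding="latin-1") as fh:
        for line in fh:
            m = LINE_RE.match(line)
            if m is None:
                continue  # skip malformed lines rather than abort the load
            d = m.groupdict()
            batch.append((
                d["host"], d["user"], d["time"], d["request"],
                int(d["status"]),
                0 if d["bytes"] == "-" else int(d["bytes"]),
            ))
            if len(batch) >= batch_size:
                conn.executemany(
                    "INSERT INTO access_log VALUES (?,?,?,?,?,?)", batch)
                batch.clear()
    if batch:
        conn.executemany(
            "INSERT INTO access_log VALUES (?,?,?,?,?,?)", batch)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    load_log("weblogs.db", "access.log")  # hypothetical file names

Batching with executemany matters at 5 GB/day; inserting one row per
statement would dominate the load time.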
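From there, the summary reports and traffic bills mentioned above reduce
to aggregate queries. A hypothetical example against the same access_log
table, totalling bytes served per client host:

import sqlite3

# Hypothetical billing query against the table loaded above.
conn = sqlite3.connect("weblogs.db")
for host, total in conn.execute(
    "SELECT host, SUM(bytes) AS total_bytes "
    "FROM access_log GROUP BY host ORDER BY total_bytes DESC LIMIT 20"
):
    print(f"{host}\t{total}")
conn.close()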