RE: recommendations for URL filtering

From: O'Shea, Dave (dave.osheaat_private)
Date: Tue Jan 25 2000 - 19:13:36 PST

  • Next message: Riley, Steven: "RE: Bypassing firewall"

    Way Back When, I wrote a hack to the CERN httpd proxy that searched for
    various keywords within a URL and blocked any matching requests. I seem to
    recall that it didn't take a whole lot of work, just a bit of jiggering
    with the httpd.conf file. 
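    The CERN httpd approach described above would have been driven by its
    rule directives, roughly along these lines (syntax from memory; the
    keywords shown are purely illustrative):

    ```
    # Deny any proxied request whose URL contains a banned keyword.
    # Rules are tried in order and the first match wins, so the Fail
    # rules must come before the catch-all Pass.
    Fail  http://*games*
    Fail  http://*chat*
    # Everything else is proxied normally.
    Pass  http://*
    ```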
     
    
     -----Original Message-----
    From: 	Antonomasia [mailto:antat_private] 
    Sent:	Sunday, January 23, 2000 6:22 PM
    To:	firewall-wizardsat_private
    Subject:	recommendations for URL filtering
    
    
    I have been asked to look at ways of filtering URLs to people browsing
    from a business site.  (FW-1 is used there, but they seem to have
    discounted that option.  I see from a post on this list from 1997 that
    one reader was using this with partial success.  I know Raptor has scope
    for this, but is not used here.)
    
    To my mind this looks like an easy extension to squid, particularly if
    you are using an external redirector program.  The intended form of blocking
    has not yet been described to me - whether it is to be based on a whitelist
    of domain names, filename extensions, MIME types or what.
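    A squid external redirector along those lines can be sketched in a few
    lines. The keyword list and block-page URL below are made-up placeholders,
    not a recommendation; squid hands the redirector one request per line on
    stdin ("URL client/fqdn ident method") and expects either a replacement
    URL or a blank line back:

    ```python
    import sys

    # Hypothetical keyword blacklist -- substitute the site's real policy.
    BLOCKED_KEYWORDS = ("games", "chat")
    # Assumed local landing page shown for blocked requests.
    BLOCK_PAGE = "http://proxy.example.com/blocked.html"

    def filter_url(url):
        """Return the block page if the URL contains a banned keyword,
        otherwise the empty string (squid's 'no rewrite' answer)."""
        lowered = url.lower()
        if any(word in lowered for word in BLOCKED_KEYWORDS):
            return BLOCK_PAGE
        return ""

    def main():
        # Squid feeds one request per line: "URL client/fqdn ident method".
        for line in sys.stdin:
            parts = line.split()
            if not parts:
                continue
            sys.stdout.write(filter_url(parts[0]) + "\n")
            sys.stdout.flush()  # squid expects one unbuffered reply per request

    if __name__ == "__main__":
        main()
    ```

    The program would be registered in squid.conf with the redirect_program
    directive (later squid releases renamed it url_rewrite_program). A
    whitelist policy is the same loop with the test inverted.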
    
    It is also said there will be requirements for logging and for identifying
    human browsers on NT.  What does NT have as rough "ident"/"rusers"
    equivalents?  Or means of authenticating users in advance to a proxy?
    
    What can people suggest?  I'll summarise any useful stuff sent off list.
    
    --
    ##############################################################
    # Antonomasia   antat_private                      #
    # See http://www.notatla.demon.co.uk/                        #
    ##############################################################
    
    This archive was generated by hypermail 2b30 : Fri Apr 13 2001 - 13:59:00 PDT