Google lists vulnerable sites.

From: silencedscreamat_private
Date: Fri Jul 05 2002 - 12:01:14 PDT


    
    Let me first say that I do not know if this issue has been brought to 
    light before or in what detail it might have been discussed.  On to the 
    show...
    
    The problem I have found is that Google may be archiving too much 
    information on sites.  By carefully crafting search strings you can 
    reliably return sites whose root, cgi-bin, bin, admin, etc. directories 
    are exposed and unprotected.  The first thing you must do is select the 
    name of a commonly protected directory (I will use admin in this 
    example).  The second is to think of a filetype that only the 
    administrator, and not the average web surfer, would have access to.  
    Extensions like bin, txt, or htm are no good because they are commonly 
    made available in other directories for legitimate reasons.  For this 
    example I chose to go with .db.  Now to create the search string.
    
    inurl:admin filetype:db
    The above gives us,
    http://www.google.com/search?sourceid=navclient&q=inurl%3Aadmin+filetype%3Adb
    
    The above search sets the requirements that admin must be in the URL and 
    that only sites containing a file of type .db are returned.
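    
    As a minimal sketch of the same query construction in Python (standard 
    library only; the directory names and filetype are just the examples from 
    above, not an exhaustive list):
    
    from urllib.parse import urlencode
    
    # Candidate directory names and filetypes taken from the examples above.
    directories = ["admin", "cgi-bin", "bin"]
    filetypes = ["db"]
    
    for directory in directories:
        for filetype in filetypes:
            # Build the "inurl:DIR filetype:EXT" query and percent-encode it
            # the same way it appears in the search URL above.
            query = "inurl:%s filetype:%s" % (directory, filetype)
            print("http://www.google.com/search?" + urlencode({"q": query}))
    
    For admin and db this prints the same q= parameter shown in the URL 
    above, q=inurl%3Aadmin+filetype%3Adb.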
    
    Now most of the links you click on will take you to some meaningless URL 
    or email database, but if for example you had
    
    www.somesite.org/admin/cgi-bin/url.db
    
    and you removed the url.db from the link, you are now free to traverse 
    their directories and files.  By using carefully selected search terms 
    like the ones above, I see roughly a 90-95% rate of vulnerable sites 
    among the results.  The trick is finding the right directory and 
    filetypes to use in the search.
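    
    A rough illustration of that last step, as a sketch that assumes an 
    exposed directory serves an Apache-style "Index of" auto-index page (the 
    helper names here are hypothetical):
    
    from urllib.parse import urlsplit, urlunsplit
    from urllib.request import urlopen
    
    def parent_directory(url):
        # Drop the trailing filename (e.g. url.db) so only the directory
        # path remains.
        parts = urlsplit(url)
        path = parts.path.rsplit("/", 1)[0] + "/"
        return urlunsplit((parts.scheme, parts.netloc, path, "", ""))
    
    def looks_exposed(url):
        # Fetch the parent directory and look for a typical auto-index page.
        try:
            page = urlopen(parent_directory(url), timeout=10).read()
        except OSError:
            return False
        return b"Index of" in page
    
    print(parent_directory("http://www.somesite.org/admin/cgi-bin/url.db"))
    # -> http://www.somesite.org/admin/cgi-bin/
    
    If the directory listing comes back, you can browse its files exactly as 
    described above.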
    


