Date: Mon, 04 May 1998 09:46:13 +1000 (EST)
From: Peter Jeremy <peter.jeremyat_private>

> I'm currently engaged in an internal NT vs Unix debate and trying to
> insert some facts into it. One point that has come up is along the
> lines of `most Internet sites that have been hacked have been running
> Unix, therefore Unix is insecure'. Can anyone point me to some figures
> showing what sorts of sites have been broken into and what they were
> running, compared to the Internet as a whole?
>
> Note: I don't want to start a flamewar here. I'm just after some
> defensible figures in place of FUD.

I don't have the numbers, but look at some of the recent CERT advisories
regarding the widespread use of teardrop-related attacks against large
numbers of sites across the Internet.

..And in the editorial dept.: I've had to set up both Unix and NT machines
as shared hosting platforms, and I am currently quite stymied by several
issues under NT, especially the high cost of keeping up to date with the
MANY hotfixes (use NTbugtraq.com to help with this! -- Thanks, Russ! ;-).
Also, I find it difficult to add any additional layers of protection --
basically, you have to count on the software not having bugs. Finally, if
you think MS's software is bad, wait until you try to configure other
products, such as NetObjects Fusion, Cold Fusion, etc., for shared use.
It is practically impossible to keep one client from having access to
another's stuff, and equally difficult to separate program data from
transient data. Again, you simply must rely on the software to work right.

Does anyone have some suggestions for solving these problems? This is
off-topic, so responses should probably not go to the list....

--
                    -- Bill Van Emburg
                       Quadrix Solutions, Inc.
Phone: 732-235-2335    (bveat_private)
Fax:   732-235-2336    (http://quadrix.com)
        "You do what you want, and if you didn't, you don't"
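[Editorial aside: on the Unix side, the per-client separation the post asks
about can at least be approximated with ordinary ownership and permission
bits. The sketch below is not from the original message; the client names
(`acme', `widgetco'), the /srv/hosting path, and the provision_client helper
are all invented for illustration, and it assumes one Unix account and a
matching private group per client. It simply carves out an owner-only tree
per client, with program data and transient data in separate subdirectories.]

#!/usr/bin/env python3
"""Minimal sketch of per-client filesystem isolation on a shared Unix host.

Assumptions (all hypothetical, not from the original post):
  - each hosting client already has a Unix account and private group
  - the hosting tree lives under /srv/hosting
  - the script runs as root (chown requires it)
"""

import os
import pwd
import grp

HOSTING_ROOT = "/srv/hosting"   # hypothetical base directory


def provision_client(name: str) -> None:
    """Create an isolated subtree for one hosting client.

    Layout (one subtree per client, nothing shared):
        /srv/hosting/<name>/app    client's own programs/site code
        /srv/hosting/<name>/data   persistent data
        /srv/hosting/<name>/tmp    transient/scratch data
    """
    home = os.path.join(HOSTING_ROOT, name)
    uid = pwd.getpwnam(name).pw_uid   # assumes a Unix account per client
    gid = grp.getgrnam(name).gr_gid   # and a matching private group

    for sub in ("app", "data", "tmp"):
        path = os.path.join(home, sub)
        os.makedirs(path, exist_ok=True)
        os.chown(path, uid, gid)
        # 0o700: owner may read/write/search; other clients get nothing.
        os.chmod(path, 0o700)

    # Top-level client directory: same owner, same owner-only access.
    os.chown(home, uid, gid)
    os.chmod(home, 0o700)


if __name__ == "__main__":
    # Example: provision two clients; neither can read the other's tree.
    for client in ("acme", "widgetco"):
        provision_client(client)

[Whether the hosted applications actually honor such a layout is, of course,
exactly the problem the post complains about.]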