dumb idea(s) (tm)

From: Doru Petrescu (pdoruat_private)
Date: Thu Jun 28 2001 - 23:09:21 PDT


    in regard to "Re: Antivirus scanner DoS with zip archives" by "Aycan
    Irican <aycanat_private>"
    
    let's analyze this idea some guy had ... he wanted to create a large file
    without using disk space to store it ...
    
    $> perl -e 'print "A"x1000000000' | gzip -f | dd of=file1
            Out of memory!
            0+1 records in
            0+1 records out
    
    
    first, consider removing 'dd' ... try it like this:
    
    $> .... |gzip >outputfile.gz
    
    this mechanism is called OUTPUT REDIRECTION. it works even on windows. :-)
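    a tiny sanity check (hypothetical file names) showing that '>' redirection
    writes exactly the same bytes that 'dd of=...' would:

```shell
# write the same data once via shell redirection and once via dd,
# then compare the two files byte-for-byte
echo "test data" > redirected.txt
echo "test data" | dd of=dd_written.txt 2>/dev/null
cmp -s redirected.txt dd_written.txt && echo "identical"
rm -f redirected.txt dd_written.txt
```

    so 'dd' buys you nothing here except two extra lines of noise on stderr.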
    
    then, let's analyze that strange 'Out of memory!' ....
    as many of you may have already guessed, PERL is the one that complains ...
    and, why not ... it was asked to create a string 1GB long. it first has to
    allocate memory for the whole thing, so when perl tries something like
    malloc(1000000000) it will fail...
    
    in perl, a construct like "a" x N_TIMES produces a single string; it is not
    an iterator, a 'for' loop or anything like that. it will not 'execute' N
    times to produce the result piece by piece. the interpreter creates the
    string by duplicating the smaller string N times, so it still uses the
    memory. it just saves you (the programmer) from writing a 1GB+ program to
    print a 1GB-long string containing 1000000000 'a's ...
    
    the right way is to use a loop, like this:
    
    foreach (1..1000000000) { print "a"; }
    
    so, we can try something like:
    
    $> perl -e 'foreach (1..1000000000) { print "a"; }' | gzip >outfile.gziped
    
    you can even add more ZEROs ...
    
    
    but, there is a small problem ... this is going to take a long time,
    because print is quite an expensive function, and it still has to run
    1000000000 times ...
    
    so we can do:
    
    foreach (1..1000) { print "a"x1000000; }
    
    this will allocate a 1MB-long string (but that's ok, we are no longer
    running on Z80 processors) and print it 1000 times. it works _A_ LOT
    FASTER.
    
    $> perl -e 'foreach (1..1000) { print "a"x1000000; }' | gzip >outfile.gziped
    
    
    
    
    -------------------------------------------------------------------
    so,
    
    what is the POINT of this email ?
    
    
    1. when you write a program, try NOT TO eat all your memory.
    (the point of any exploit is to exploit OTHERS :-)
    
    2. when you try to break something, think a little more about the
    subject; let's do a better job than the one who wrote the original
    program. we don't want an exploit that works even worse than the program.
    
    3. everybody, before releasing a theory, FIRST see if you're right.
    test your idea first, then test your code. read some papers about the
    subject.
    
    
    4. there is no such thing as "So I must find the maximum byte size that I
    can pipe to gzip."
       you can pipe as much as you want, as long as you actually DO pipe
    something. in your case the source program (perl) crashed with 'out of
    memory' BEFORE it was able to send anything to gzip.
       for all you new guys: a PIPE doesn't have a size limit; you can keep
    writing as long as someone keeps reading from the other end.
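    a small demonstration (the size is just illustrative): 'yes' would write
    forever, yet the pipe happily carries exactly as much as the reader takes:

```shell
# 'head' reads 1000000 bytes and exits; 'yes' then gets SIGPIPE and
# stops. no pipe "size limit" is ever hit along the way.
yes a | head -c 1000000 | wc -c
```

    this counts 1000000 bytes; add as many zeros as you like, the pipe keeps
    flowing as long as 'head' keeps reading.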
    
    
    
    When I see something called an EXPLOIT, I expect it to exploit the VICTIM,
    not the attacker.
    and, when you report a vulnerability, make sure it is the target program
    that is vulnerable, not your own code, and make sure your exploit is not
    yet another vulnerable program. there is nothing worse than an exploitable
    exploit ... hahahaha
    
    
    
    
    Best regards,
    Doru Petrescu
    



    This archive was generated by hypermail 2b30 : Fri Jun 29 2001 - 10:32:18 PDT