RE: [ISN] Insecurity through obscurity

From: InfoSec News (isn@private)
Date: Mon Jun 13 2005 - 01:03:16 PDT


Forwarded from: Phil Hollows <phollows@private>

I can't claim to be familiar with Kerckhoffs' law - although it seems
like common sense when planning to defend a security system - but I
think the author here is stretching the point when it comes to
security through obscurity.

 
> Lessons learned
> 
> First of all, ApplyYourself.com's method of hiding the admission
> status from the applicants was a great example of security through
> obscurity. In order to obtain the status early, the users took
> information that was readily available to them, modified the URL in
> their browsers and got access to their own admission status.

This isn't security through obscurity; it's just thoughtless
application design.  "Hiding" the session key in a hidden field is
merely a way to track state without it getting in the way of the user.
The fact that the fields / parameters that enabled users to get at
(somewhat) privileged information were easily guessed is the
*opposite* of security through obscurity.  An obscure value would not
have been guessable by the student.  The writer confuses hiding
application data for usability purposes with hiding it for security
purposes.  There was little that was secure about this app.
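
To be concrete about what "easily guessed" means, here is a rough
sketch of the sort of thing that reportedly happened.  The field
name, URL and parameter below are my own inventions for the sake of
illustration, not ApplyYourself.com's actual implementation; the
point is simply that every ingredient is sitting in plain view.

    # Hypothetical reconstruction, not the vendor's real code.
    # The page served to the applicant carries an identifier in a
    # hidden form field, something like:
    #
    #     <input type="hidden" name="applicant_id" value="123456">
    #
    # Anyone who views the page source can read that value, and
    # anyone who has seen (or been told) the shape of the status URL
    # can put the two together:

    def build_status_url(applicant_id):
        # Nothing here is a secret: the ID is visible in the page
        # source, and the URL pattern is known to every past
        # applicant.
        return ("https://apply.example.com/status?applicant_id="
                + applicant_id)

    print(build_status_url("123456"))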

In fact, my recollection is that the user could only look at their
own data.  The app itself seems to have been secure enough to prevent
a visitor from using this technique to look at someone else's data,
which would have been much more serious.  The article doesn't mention
that, of course, because it would spoil the premise.
 
> There are at least two major mistakes here. First, ApplyYourself.com
> hid an ID field that users were not supposed to see in the Web page
> source. This ID was then used to construct the URL that would give
> the user the admission status.

A session or user variable.  What were they thinking?  Would the
author prefer it to have been a cookie to make it a teensy bit harder?  
How would he like them to track state?
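
For what it's worth, here is a minimal sketch of the saner
alternative: an opaque token for tracking state plus an explicit
server-side authorization check.  This is purely my own illustration
(names and all), not anything the vendor or the article describes;
the point is that the fix is access control, not better hiding.

    import os

    # Hypothetical sketch of opaque, server-side state tracking.
    sessions = {}  # token -> applicant account name

    def create_session(applicant):
        # The token is random, not derived from any user-visible
        # data, so it cannot be reconstructed from the page source.
        token = os.urandom(32).hex()
        sessions[token] = applicant
        return token

    def get_status(token, applicant, decisions_released):
        # Whether a decision is visible is an explicit server-side
        # rule, not something hidden in the page and hoped for.
        if sessions.get(token) != applicant:
            raise PermissionError("not your record")
        if not decisions_released:
            raise PermissionError("decisions not released yet")
        return "admission status for " + applicant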
 
> Second, ApplyYourself.com assumed that users would not have
> knowledge of the URL that would provide the status. However, anyone
> who applied to these schools through ApplyYourself.com would have
> seen the URL, and would have known what the URL looked like, as well
> as the parameters required to construct the URL. Given that this URL
> was provided to previous applicants, current applicants could easily
> obtain it by simply asking.

The assumption here seems to be the author second-guessing the
developers' intent (and yes, that is also an assumption, so I'm
equally guilty of the same sin).  I think it's a fair one, though,
because the developers of the app aren't actually quoted anywhere in
this "analysis."

And if anyone can guess the URL, it's hardly obscure, is it?  I
mean, if you want to write an article about using obscurity or
camouflage as a technique, fine.  It's just that this case isn't an
example of either.
 
> These two grave mistakes left ApplyYourself.com scrambling to patch
> the security holes.

Their assumption was that nobody would guess the URLs.  Yup, that's a
security hole.  Poor application design, careless QA.  A problem with
security through obscurity?  Nope.
 
> Another good example of security through obscurity was demonstrated
> when hackers compromised Cisco Systems Inc.'s corporate network and
> stole more than 800MB of source code (see story)[3]. This incident
> caused quite a stir in the IT community, since Cisco's routers are
> responsible for managing a majority of the Internet traffic. Any
> security issues in the source code could become public. The
> publication of these security vulnerabilities -- still a possibility
> -- has the definite potential of causing major havoc on the
> Internet, possibly bringing it down on its knees.

And the reason why this is a security through obscurity problem
is ... errm ... well, we'll get back to you on that one, since the
author of this piece doesn't actually tell us.  I guess I'll trust
him on that one.  (Not.)
 
> Microsoft Corp. has also experienced similar embarrassing incidents.
> In February 2004, portions of the source code for the Microsoft
> Windows NT and Windows 2000 operating systems were leaked (see
> story)[4].  The leaked source code could potentially allow hackers
> to identify security vulnerabilities in the Windows operating
> systems. Given the popularity of Windows in both consumer and
> corporate environments, this leak could be devastating to the whole
> Internet community.

Oh, wait. Now I get it.  Having the source code for the products is a
de facto failure of security through obscurity.  Of course!  Obvious,
really.  We should all stop compiling and linking code and use
interpreters instead, because going to machine code is clearly a poor
attempt at security through obscurity.  Then we'll be more secure.

No mention of how the source code repositories were actually
protected, nor of which systems, processes or policies failed and
allowed the source code to be accessed.  If there were
security-through-obscurity failures here, that would have been an
interesting topic.
 
> All these examples demonstrate the danger of the
> security-through-obscurity premise.

No, they don't.  There are no demonstrations of security through
obscurity in this article, successful or otherwise.

> There are many articles, books and seminars on this topic. Companies
> and software developers need to start with Kerckhoffs' law, assume
> that the algorithm and design of the software are known, and design
> security into the products and software in the beginning instead of
> retrofitting or patching security holes later.

True.  What a shame this conclusion has nothing to do with the
paragraphs above it, nor with any debate about security through
obscurity's valid place (or otherwise) in a comprehensive layered
defense architecture, nor with the challenges of designing, writing
and testing secure code.
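
For what it's worth, the principle the author finally lands on is
easy to show with a toy example.  The snippet below uses
standard-library HMAC, which is my own choice of illustration: the
algorithm is completely public, and the security rests on the key
alone.

    import hashlib
    import hmac
    import os

    # Kerckhoffs' principle in miniature: HMAC-SHA256 is a
    # published, fully public algorithm.  The only secret is the key.
    key = os.urandom(32)
    message = b"decision: admitted"

    tag = hmac.new(key, message, hashlib.sha256).hexdigest()

    # Someone who knows the algorithm, the message format and the
    # tag still cannot forge a tag for another message without the
    # key.  That is the design stance worth arguing for.
    print(tag)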

Phil
www.openservice.com






