http://www.informationweek.com/news/showArticle.jhtml?articleID=191000063

By Sharon Gaudin
InformationWeek
July 21, 2006

The computer sabotage trial of a systems administrator who was found guilty of attacking the network he had been hired to protect at UBS PaineWebber is sending out a sobering message, and one that can't be stressed enough: No matter what network security you have in place, it may not be enough to protect you from one of your own.

It's almost a cliché, but one that many companies still do not take seriously. That, say security analysts and members of the government prosecution team, was the case for UBS, whose network was hit by a logic bomb in March of 2004. A jury this week found Roger Duronio of Bogota, N.J., guilty of two crimes: computer sabotage, for building, planting and distributing the malicious code that brought down nearly 2,000 servers on the company's nationwide trading network; and securities fraud. Duronio had worked at UBS as a systems administrator for about three years, and had become disgruntled when he found out that his annual bonus was going to come in smaller than he'd expected. Duronio is set to be sentenced on Oct. 30.

The UBS/Duronio case is a perfect example of the damage that can be caused by a knowledgeable insider with high-level access and an axe to grind. But that didn't stop the defense in this case from laying the blame squarely at the company's feet.

During the seven-week trial, Chris Adams, Duronio's defense attorney, painted an ugly picture of UBS' security infrastructure and practices. He repeatedly hammered on the fact that all the root users on the Unix-based system had the same password, and that UBS logs were unable to track which user was giving commands on the system as 'root'. He also focused on a back door found on a server in the main data center the year before the attack was launched. The defense attorney went so far as to say UBS security was so riddled with holes as to make it impossible to tell what was happening on the system or who could have "masqueraded" as his client and planted the bomb.

But the forensics investigator who spent more than three years analyzing backup tapes, logs and source code from UBS' network says the company actually had a solid security setup. "In my opinion, it was strong," says Keith Jones, the government's star witness and director of computer forensics and incident response at Mandiant, an information security company based in Alexandria, Va. "Not only that, but they knew where their weaknesses were and they were trying to address them. UBS did a lot of things right. The defense raised the few issues that they could about UBS in order to make their network sound like the Wild West." (See Jones' Top Five lists for what UBS did right, and what the company could have done better.)

Alan Paller, director of research at the SANS Institute, says it's easy to take a few problems and make them look like a complete security fiasco. But that doesn't make it an accurate representation. "You can do 5,000 things right, and only one thing wrong, and that's what they'll rake you over the coals with," says Paller. He adds that financial services companies, in general, tend to have better security than the average company, and UBS is known to be solid security-wise.

Was UBS' security bulletproof? No. But whose is? The company could have put in place better monitoring and auditing of administrators' actions and system commands.
There could have been fewer full-fledged root users, and there could have been separate passwords for each one of them. But even if any or all of these issues had been addressed before the attack, security analysts say a corporate IT professional with a good-size chip on his shoulder could still wreak a frightening amount of havoc and high-cost damage.

Beware the Insider

The real issue here, says Paller, is the insider. A company employee is already inside the perimeter, where the vast majority of the protective technologies (firewalls, intrusion detection systems, etc.) sit. That same employee also knows what information is most vital to the company's ability to make money and sustain itself. He has knowledge of passwords, and he also probably knows what kind of machines and operating systems the company is running.

Now, imagine that employee works in IT. An IT professional has all this information, plus he has access to the inner workings of the infrastructure. He has high-level privileges that allow him access to key servers and databases, and possibly even root-level access, which would give him all-encompassing power over the system. Now, imagine that IT worker is angry at the company.

Companies need to be watchful about employees who have that much power over the health and well-being of the network, says Assistant U.S. Attorney V. Grady O'Malley, who helped prosecute the Duronio case, as well as the Tim Lloyd computer sabotage trial in 2000. Lloyd, a former network administrator at Omega Engineering Corp.'s Bridgeport, N.J., manufacturing plant, was convicted of launching a very similar attack on that company.

"Unfortunately, the message is still the same," says O'Malley. "You have to be incredibly vigilant when you're talking about trusting a system to people. You better be a lot more aggressive in [employee] analysis. Who is the person working on our network? Has he exhibited problems we should be worried about? Is he in a position to do damage if he wants to do damage?"

It's simply not fair to look at UBS and say their system was flawed so they're at fault, says O'Malley. "Whether their system was flawed or not, they still had the Duronio factor," he adds. "He was a guy with a significant position. Regardless of the security measures you have in place, if the guy you're tasking to make sure the system is protected wants to hit you, then it doesn't make any difference what you've done."

Paller agrees, adding that all too often corporate executives forget they're basically at the mercy of their IT workers. Paller tells the story of the time he was visiting a company back in 1969. He was walking with an IT manager who ran into a systems programmer and delicately asked him to go home and change into more professional clothes before he went to an important meeting. "This guy was still wearing pizza from last night's dinner," remembers Paller. "I said to him, 'Wow. Does that guy work for you? Why did you treat him with such kid gloves?' And he said to me, 'No. He doesn't work for me. He's a systems programmer. He owns me.'"

It's a lesson more executives should have had, says Paller, adding that IT workers wield a tremendous amount of power over an enterprise, and there are very few protections keeping them from using that power maliciously.

Building in Protections

So, if IT professionals need high-level system access, tools and privileges to do their jobs, how do companies restrict their ability to hurt the company while still empowering them to help the company?
There are technologies and processes that can be put in place to help, says Ken van Wyk, principal consultant with KRvW Associates, LLC of Alexandria, Va. That doesn't mean the company will be fully protected from an attack launched by an insider. It does mean that it won't be leaving its soft underbelly wide open to simple assaults.

Van Wyk says it's helpful to put checks and balances in place. One administrator can build code, but a second administrator would have to look it over and approve it before it could go live on the system. The problem with this, adds van Wyk, is that there are big costs associated with it. And in a time when most companies are looking to cut costs, this may not be a welcome suggestion in the boardroom. Companies need to find a middle ground where they can put processes in place, but not break the bank doing it.

"I remember reading that UBS had a large number of root users on the system," says van Wyk. "Reduce that number, and put in place role-based privileges. That means that one administrator may be able to work with backup tapes but they aren't allowed to add code to the system. And have the event log monitored by someone [else]."

Andi Mann, a senior analyst with Enterprise Management Associates, which is based in Boulder, Colo., says high-level access should be limited to as few people as possible, and everyone should have their own user ID and unique password to help keep a granular log of which users are making changes and issuing commands on the network. (See the sketch at the end of this article for one rough picture of how such role-based privileges and per-user logging might fit together.)

Paller recommends that companies basically instill a healthy fear in their employees. "You need granular logging and log monitoring that gives people the feeling that somebody omniscient is out there watching them all the time," says Paller. "And you have to demonstrate that omniscience a few times. Somebody visits a porn site, and you walk in and say, 'Do you really want to visit those sites from work?' Somebody else downloads something to a thumb drive, and you ask them where the thumb drive is. Let these stories spread through the system."

But Paller also says that while it's important to carry a big stick, it's equally important to wave a carrot in front of employees at the same time. "You let them know you're watching them, and then you listen to them and treat them well," he adds.

Wolfe, the government prosecutor, says ultimately the majority of the burden falls on the company when it comes to going after an employee gone bad. The victim company has to call in law enforcement. The company has to bear the expense of providing evidence, like backup tapes for servers scattered across the country. And the company has to take it on the chin when the defense stands in open court and blames the victim, often using the company's own security reports against it.

The payoff of putting safeguards in place to help disarm IT workers is that there will be fewer security weaknesses for a defense attorney to trot out in front of a jury and the media. But both Wolfe and O'Malley say it's critical for companies to suck it up and call in law enforcement if they're attacked. Anyone thinking of launching their own assault needs to realize that companies will call in the feds, who have the ability to put them behind bars.

And O'Malley also says executives need to step it up when it comes to keeping an eye on employees who are full of complaints, or are on a bad streak with the company.

"Sure it will happen again," he says.
"And in all likelihood it will happen because of an insider They always say, 'Oh, he was a trusted insider.' Bingo! That's the problem. He was a trusted insider." _________________________________ Attend the Black Hat Briefings and Training, Las Vegas July 29 - August 3 2,500+ international security experts from 40 nations, 10 tracks, no vendor pitches. www.blackhat.com