"To this day, I regret not taking that stuff to the FBI," says Bryan.
It happened six years ago, when Bryan, who asked that his last name not be published, was IT director for the U.S. division of a £250 million multinational corporation based in Germany.

The company's Internet usage policy, which Bryan helped develop with input from senior management, specifically prohibited the use of company computers to access pornographic or adult-content Web sites. One of Bryan's duties was to monitor employee Web surfing using SurfControl and report any violations to management.
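
Monitoring tools like SurfControl automate essentially this kind of check: match logged URLs against a list of prohibited categories and summarise hits per user. A minimal sketch in Python (the log format, user names and domain list here are invented for illustration; this is not SurfControl's actual mechanism):

```python
# Toy illustration of web-usage monitoring: scan proxy log lines for visits
# to domains on a blocked list and count violations per user.
# Log format and domains are invented for the example.

from collections import Counter
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"adult-example.com", "porn-example.net"}  # illustrative only

def find_violations(log_lines):
    """Each line: '<user> <url>'. Returns per-user counts of blocked visits."""
    violations = Counter()
    for line in log_lines:
        user, url = line.split(maxsplit=1)
        host = urlparse(url.strip()).hostname or ""
        # match the listed domain itself or any subdomain of it
        if any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS):
            violations[user] += 1
    return violations

log = [
    "exec1 http://adult-example.com/page",
    "exec1 http://www.porn-example.net/x",
    "user2 http://news-example.org/story",
]
print(find_violations(log))  # exec1 flagged twice, user2 clean
```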
Bryan also knew that one executive, a level above him in another department, was popular both within the U.S. division and at the German parent company.

So when SurfControl turned up dozens of pornographic Web sites visited by the exec's computer, Bryan figured "my best course of action was to follow the policy."

"That's what it's there for," he reasoned. "I wasn't going to get into trouble for following the policy." He went to his manager with copies of the Web logs in question (which he still has in his possession and made available to Computerworld for verification).

Power and prowess

Bryan's case may be extreme, but it's a good example of the ethical dilemmas that IT workers encounter on the job. IT employees have privileged access to digital information, both personal and professional, throughout the company, and they have the technical prowess to manipulate that information.
That gives them both power and responsibility -- to monitor and report employees who break company rules; to sneak a look at salary information or read personal e-mails that reveal love affairs; or to uncover evidence that a co-worker is embezzling funds from the company.

But ethics professionals, technology industry watchers and IT workers say there's no consensus on how to wield that power or fulfill that responsibility, at least not officially. And that often puts IT people in uncomfortable positions.

In Bryan's case, he didn't get into trouble, but neither did the porn-viewing executive, who beat Bryan to the human resources director with "a pretty outlandish explanation," says Bryan. The executive claimed that his ex-wife was publishing pictures of their kids on the Internet, and he had been trying to find out where. "He said he thought this might show up in a report on him, and he just wanted them to know that he was not going to be doing that anymore."

The company accepted the explanation and tabled the incident, despite Bryan's documentation, which he showed to his direct superior and to human resources and which he insisted be placed into the man's personnel file. Bryan considered going to the FBI, but the Internet bubble had just burst and jobs were hard to come by. "It was a tough choice," he says. "[But] I had a family to feed."

In theory, ethical behavior is governed by federal and state laws, corporate policy, professional ethics and personal judgment. But as Bryan now realises, and other tech workers discover all the time, navigating those muddy waters can be one of the most daunting challenges in an IT professional's career.

To give just one example of how confusing ethical transgressions can be: Child pornography is illegal in the U.S., but the law does not necessarily require individuals to report it.

A handful of states, including Arkansas, Missouri, Oklahoma, South Carolina and South Dakota, have laws requiring IT workers to report child pornography, and others are considering similar measures. But they're still in the minority.

Otherwise, "as a general rule, there's no obligation for any citizen to report any violation of law to police or any other government agency," says Linn Hynds, a senior partner and past chair of the labor and employment law department at Honigman Miller Schwartz and Cohn LLP. (He notes that there are also other exceptions, such as teachers with knowledge of child abuse.)

Ideally, that's where corporate policy takes over, governing ethics in the workplace and clearing up gray areas. A good policy removes personal judgment from the equation as much as possible. "If you haven't published a corporate code of ethics, if you don't set out your policy and your guidelines, if you don't make sure that people know what they are and understand them, you're in no position to hold [individual workers] accountable," says John Reece, former CIO and deputy commissioner for modernisation at the IRS and former CIO at Time Warner Inc.

Having clear ethical guidelines also lets employees off the hook emotionally if the person they discover breaking the policy is a friend, a direct report or a supervisor. "So it's not that I'm squealing on Joe, but that I have to be accountable to my employer, that part of my job is to enforce company policy," says Reece, who is now chairman and CEO of his own consultancy, John C. Reece and Associates LLC.

What to do ... and when

That policy also should warn employees that their PCs are company property and thus any information on them is fair game for investigation, says Art Crane, principal of Capstone Services Inc., a human resources consulting company. It should provide clear instructions on what to do when employees encounter a violation of the policy, including how to bring it up the chain of command, and include whistle-blower provisions that protect employees from retaliation.

But most corporate policies aren't ideal. In many companies, they are ill-defined, or at least not communicated strongly and in detail to the IT department. There are several reasons for this.
First, ethics policies are typically defined by an organisation's lawyers or compliance people, says Larry Ponemon, founder and chairman of the Ponemon Institute LLC, a research company that specialises in privacy and data protection. "In my experience, these folks may not fully understand or respect the complexities that IT-related ethical issues create, such as privacy and data protection," he notes.

Second, policy-makers often assume that IT workers -- indeed, all employees -- have an inherent responsibility not to misuse information or technology and to report any issues or problems, says Crane. But they still need to spell out that assumption in writing to be completely clear with their employees.
And third, ethical challenges evolve as technology advances, an evolution that can be hard for policies to anticipate. "I'd bet that 10 years ago, very few companies had a policy on e-mail usage," says Crane. "These days, there are very few that don't."

In fact, technology sometimes creates an illusion that a particular behavior is not wrong, notes Ramon Barquin, president of the nonprofit Computer Ethics Institute, which studies ethical issues that arise from technology. "Technology has a way of putting distance between an individual and the consequences of their actions," notes Barquin, who is also CEO of Barquin International, a business intelligence consultancy.

As a result, ethics decisions are often left to the individual. "What's happening right now is it's very much all over the place," says Barquin. When the policy is inadequate, out of date, unenforced or nonexistent, people may do what they feel is right or try to duck the issue. "Looking back, they often say, 'Well, what was I supposed to do?'" says Barquin.

It's as if employers are in denial about the power that IT people have. "When an IT professional comes across some [sensitive] information -- well, that shouldn't be a surprise to anyone," says Albert Erisman, executive director of the Institute for Business, Technology and Ethics. The company should anticipate such situations and tell IT what's expected. But "in my opinion, most companies try to put these questions off to IT people," he says.

Even when companies have policies, they vary widely depending on a company's size, culture, management, and whether it is public or private. And even the most detailed corporate ethics policy can't cover every situation, and may not be well known in all areas of the company.

Troubles, past and future

Some policies focus on areas where the company may have had past troubles or emphasise whatever the organisation is most worried about.

When John Reece was at the IRS, for example, the biggest emphasis was on protecting the confidentiality of taxpayer information, he says. At other government agencies, such as the U.S. Department of Defense, policies usually emphasise procurement rules, notes Stephen Northcutt, president of the SANS Technology Institute and author of IT Ethics Handbook: Right and Wrong for IT Professionals (Syngress, 2004).

"It's quite often the case that if [a transgression] hasn't happened yet, no one thinks of it or wants to focus on it," says Leslie Ann Skillen, a partner at law firm Getnick & Getnick and an expert on fraud and corruption in business and government. "That's probably more common than you might think."

Indeed, companies tend to be reactive rather than proactive, agrees Crane. "It often takes a smack upside the head to get companies to address exposures and behaviors," he says.

Further muddying the waters, an organisation that employs highly creative or highly skilled workers might be more lenient in certain areas. When Northcutt worked in IT security at the Naval Surface Warfare Center in Virginia, it was a rarefied atmosphere of highly sought-after Ph.D.s. "I was told pretty clearly that if I made a whole lot of Ph.D.s very unhappy so that they left, the organisation wouldn't need me anymore," says Northcutt.

Of course, that wasn't written in any policy manual, so Northcutt had to read between the lines. "The way I interpreted it was: child pornography, turn that in," he says. "But if the leading mathematician wants to download some pictures of naked girls, they didn't want to hear from me." Northcutt says he did find child porn on two occasions, and that both events led to prosecution.

As for the naked photos that he encountered, Northcutt pointed out to his superiors that they might be a legal liability, citing a Supreme Court decision that found similar pictures at a military installation indicated a pervasive atmosphere of sexual harassment. The center then changed its policy. "Once they saw that law was involved, they were more willing to change culture and policy."

When policies aren't clear, ethical decisions are left to the personal judgment of IT employees, but that judgment varies dramatically from person to person. Where Northcutt pushed to get the policy changed, for example, another manager might have accepted the status quo.

Often the decision depends on the type, frequency and severity of the incident, as well as the possible consequences of action or inaction, observers say. Even people who insist on a high personal standard of ethics for one type of behavior may bend the rules in other areas.

For example, Gary, a director of technology at a nonprofit organisation in the Midwest, flat-out refused when his assistant CEO wanted to use a mailing list that a new employee had stolen from her former employer. Not only was it illegal, but "it's morally reprehensible and I wasn't about to participate in anything that would damage the reputation of the organisation," says Gary, who asked that his last name not be used.

However, when his boss installed unlicensed software on PCs for a short time, Gary acknowledges he was willing to look the other way. He told his boss it wasn't ethical and he refused to do it himself. But he didn't stop it. "The question is, how much was it really going to hurt anybody? We were still going to have 99.5% compliant software. I was OK with that." He says he uninstalled it, with his boss's approval, as soon as he could -- about a week later.

Higher standards?

These instances aren't surprising, says Northcutt, because there are no professional mechanisms in place to hold IT workers to any higher standard. He believes that the IT profession should have two things that other professions, such as law or accounting, have had for years: a code of ethics and standards of practice. That way, when company policy is nonexistent or unclear, IT professionals still have standards to fall back upon. (See "Does IT need a code of ethics?" below.)

That development could be a very welcome thing, because there's evidence that, left to their own devices, not all IT workers adhere all of the time to the highest of ethical standards.

In the spring of 2007, security vendor Cyber-Ark Software Ltd. conducted a survey in which one-third of 200 IT employees who responded admitted using their admin passwords to snoop through company systems and peek at confidential information such as salary data. The company quoted one anonymous respondent as saying, "Why does it surprise you that so many of us snoop around your files? Wouldn't you if you had secret access to anything you can get your hands on?"

In addition, more than one-third of respondents said they still had access to their former employers' networks, even after they'd left the company.

Meanwhile, new data from the Ponemon Institute reveals how commonly IT employees bend the rules. In a June 2007 poll of more than 16,000 U.S. IT practitioners, 62% said they had accessed another person's computer without permission, and 50% said they had read confidential or sensitive information without a legitimate reason. In addition, 42% said they had knowingly violated their company's privacy, security or IT policies.
And these weren't newbies or outliers. The average experience level was 8.4 years, and about 32% of respondents were at or above the manager level. Over 81% worked at companies with more than 5,000 full-time employees.

That's in keeping with the ethical standards of Tim, a systems administrator who works at a Fortune 500 agricultural business. When Tim, who asked that his last name not be published, happened across an unencrypted spreadsheet of salary information on a manager's PC, he copied it. He didn't share the information with anyone or use it to his advantage. "I didn't take it for nefarious reasons -- I just took it to prove that I could," he says.

Does IT need a code of ethics?

Lawyers have them. Doctors have them. Is it time, in this age of unfettered digital access, for IT to have a code of ethics? At least some prominent players in the industry think so.

"We need a real call to arms for a code for IT," says Larry Ponemon, founder and chairman of the Ponemon Institute.

Some groups, including the Association for Computing Machinery and the Association of Information Technology Professionals, have adopted generalised ethics codes. The IEEE (Institute of Electrical and Electronics Engineers) has both a general code of ethics and a software engineering code of ethics.

Five certification groups are also taking a stab at the effort. Global Information Assurance Certification, Information Systems Audit and Control Association, International Information Systems Security Certifications Consortium Inc., Information Systems Security Association and ASIS International are attempting to work together to draw up a code of ethics for IT security professionals, according to Stephen Northcutt, president of the SANS Technology Institute.

But "getting five organisations to dance together is proving quite challenging," Northcutt observes. Indeed, their greatest challenge may well be deciding what does, and does not, go into such a code.
"I think it would be really hard to reach consensus," says Ponemon. "You talk to five people and you get a hundred ideas."

If and when a universal code is ever adopted, the next step would be standards of practice that would serve as teeth behind the code -- a sort of American Bar Association for IT. If an IT worker violated the standards, in theory he might be "disbarred" from the profession.

Copying the spreadsheet was an impulsive act, Tim admits, one that stemmed from frustration with his employer. Feeling that his needs were being ignored by central IT, he saw taking the file as a way to assert some power. "Information is power," he says.

Tim's actions point to a disturbing trend: IT professionals justifying ethically questionable behavior. That path can and sometimes does end in clearly criminal behavior, says private investigator Chuck Martell.
"We started seeing a few cases about seven or eight years ago," says Martell, who is managing director of investigative services at Veritas Global, where he investigates corporate fraud. "Now we're [investigating] a tremendous amount of them."

IT as the bad guy

One of the most common crimes is embezzlement. The typical modus operandi, says Martell, is to set up one or more shell companies in a different city or state and then send invoices for purported purchases of products or services by the employer. "That's a common theme," says Martell. "We know exactly how they do this. We know where to look." Martell's company recently cracked such a case and recovered $360,000 plus stolen hardware.
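
One red flag investigators use for the shell-company scheme Martell describes is a vendor whose records overlap suspiciously with an employee's, such as a shared remittance address. A toy illustration (all records and field names are invented; real investigations match on many more signals):

```python
# Toy illustration of one shell-company red flag: a vendor whose
# remittance address matches an employee's home address.
# All records and field names here are invented for the example.

employees = [
    {"name": "J. Doe", "address": "12 Elm St, Springfield"},
]
vendors = [
    {"name": "Acme IT Services LLC", "address": "12 Elm St, Springfield"},
    {"name": "Legit Supplies Inc.", "address": "400 Industrial Pkwy, Dayton"},
]

def normalise(addr):
    # crude normalisation: lowercase, drop commas, collapse whitespace
    return " ".join(addr.lower().replace(",", " ").split())

def flag_suspect_vendors(employees, vendors):
    employee_addrs = {normalise(e["address"]) for e in employees}
    return [v["name"] for v in vendors
            if normalise(v["address"]) in employee_addrs]

print(flag_suspect_vendors(employees, vendors))  # ['Acme IT Services LLC']
```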

The good news is that in half of the cases, the perpetrator was turned in by a fellow IT worker, says Martell.
In the meantime, organisations that want to limit how much private information IT workers can see should consider better control systems or encryption, suggests Ponemon. "So much of what happens in the IT environment is transmitted in clear text," he points out. "Encryption is getting so convenient, why not encrypt [data] till it gets to its final destination?"
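
Ponemon's point can be sketched in a few lines of Python using the third-party `cryptography` package. This is only an illustration of the idea, not a complete scheme; in practice the key would live in an access-controlled service, and the record contents here are invented:

```python
# Sketch of keeping sensitive records encrypted so that an administrator
# with raw file access cannot read them in clear text.
# Uses the third-party `cryptography` package; key management is omitted.

from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, held by a key-management service
cipher = Fernet(key)

record = b"employee=jdoe;salary=120000"   # invented example data
token = cipher.encrypt(record)            # what actually lands on disk / the wire

assert token != record                    # opaque without the key
assert cipher.decrypt(token) == record    # readable only with the key
```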

Whichever side of the line they're on, IT workers will continue to muddle through ethical dilemmas on their own and wrestle with their consciences afterward.

Perhaps it will ease the conscience of Bryan, who never reported the child pornography he found to law enforcement, to hear that he did just what labor attorney Hynds would've advised in his case.

"Let the company handle it," he says. "Make sure you report violations to the right person in your company, and show them the evidence. After that, leave it to the people who are supposed to be making that decision."

Tam Harbert is a Washington-based freelance journalist specialising in technology, business and public policy.