


http://www.securitymanagement.com/archive/library/000762

Business Continuity

Inside the Mind of the Insider

By Eric D. Shaw, Jerrold M. Post, and Keven G. Ruby


Information technology specialists are uniquely positioned to wreak havoc on corporate systems, so the shrewd employer conducts personality profiles and monitors staff behavior.


A highly paid systems administrator under contract at a large financial institution had overseen the selection, installation, and operation of a new accounting system. In the process he had acquired unique expertise with what he came to regard as "his" system. When a new supervisor took over the IT department, he was shocked at the contractor's independence, salary, and lack of responsiveness to his requests to train backup personnel and give others root access to the system. In response to the attempts to rein him in, the systems administrator asked to be relieved of his duties.

One month before his expected departure date, several systems crashed, stumping the members of the IT department. The systems administrator, however, brought the systems back up in minutes. After that incident, IT operations went smoothly until his last day on the job, when both accounting servers went down and no backup tapes could be located. The damage to the servers turned out to be so systemic and thorough that they had to be replaced at extraordinary expense.

Had the IT department referred this employee to security personnel, who could have requested a psychological evaluation or an analysis of his behavior, signs of impending trouble might have been detected before he had a chance to wipe out corporate systems. Unfortunately, many companies worry about rogue outside hackers when vengeful insiders, such as employees, consultants, temps, and partners, pose far more of a threat. A 1998 survey conducted jointly by the Computer Security Institute and the FBI found that the average cost of successful computer attacks by outside hackers was $56,000. By contrast, the average cost of malicious acts by insiders was an astounding $2.7 million.

Concerned about the increasing number of insider violations of government systems, the Department of Defense sponsored a project in 1997 that was conducted by the authors of this article. It was designed to construct psychological profiles of perpetrators of insider computer crime. From a pool of more than 100 cases provided by computer crime investigators, prosecutors, and security specialists over the last two years, we have been analyzing 46 recent cases for which sufficient detail could be obtained.

This profiling effort was initiated to help government, law enforcement, and industry examine the adequacy of their personnel security policies for deterring, detecting, and investigating malicious acts committed by insiders, including sabotage, espionage, fraud/theft, and extortion.

Through the analysis of past cases and other research, we compiled a list of psychological characteristics typical of IT specialists, and in particular the traits of those who attacked their employer or former employer. We then classified the perpetrators by type. Finally, we reflected on the lessons and the implications of these incidents.

Characteristics.
Every psychological assessment of programmers, computer scientists, computer science graduate students, and systems administrators has found one common trait: introversion. Introverts are more comfortable in their own mental world than they are in the more emotional and unpredictable social world. They are more sensitive to external stresses than extroverts are, and they tend to have less sophisticated social skills.

These traits have specific implications for information technology departments. Introverted IT professionals may be more sensitive to work stress and emotional conflicts, less likely to deal with stress in an overt, constructive manner, and less likely to seek direct assistance from supervisors, employee assistance programs, or other sources. According to our discussions with computer crime investigators and human resource professionals, these individuals are more likely to express their frustrations and concerns to peers via e-mail than in person.

The subjects in our study who had committed computer offenses shared several traits, many of which pose significant management challenges. It should be emphasized, however, that many IT professionals and other individuals who possess these traits are honest and law-abiding. Only when subjected to high levels of perceived personal or professional stress did this at-risk population, unlike the many other IT workers who share its traits, actually commit malicious acts.

Six personal characteristics with direct implications for risk were identified: a history of personal and social frustrations, computer dependency, ethical "flexibility," reduced loyalty, a sense of entitlement, and a lack of empathy.

Frustrations. Many of the subjects had a history of significant family problems, difficulties in school and at work, and various social frustrations, which left them with negative attitudes toward authority. These preliminary findings are consistent with the research of Professor R. Caldwell, a computer scientist, who conducted separate studies in 1990 and 1993. He found high levels of disappointment and conflict among a subgroup of computer science students who reported preferring the predictability and structure of computers to the vagaries of personal relationships. Caldwell also identified a subgroup whose members were angry, alienated from authority, less socially skilled and more isolated than their peers, and poised to strike out at the system. Caldwell described these individuals as suffering from "revenge syndrome."

Computer dependency. Online activity significantly interfered with, or replaced, direct social and professional interactions for many of the subjects in our study. The appearance of computer dependency in the subjects was not surprising given recent research. According to psychologists, computer-addicted individuals are more likely than nonaddicted users to be aggressive loners who make poor team players. They report their primary interests as exploring networks, breaking security codes, hacking into computer systems, and challenging and outfoxing security professionals.

In addition, computer dependents tend to be deeply involved in online relationships, to the extent that they prefer their online personas and contacts to their real-world selves and relationships. The research also indicates that computer-dependent people often act aggressively to compensate for feelings of inadequacy.

One concern for security professionals is the vulnerability of these individuals to online manipulation by those targeting disgruntled insiders for financial gain or espionage. These vulnerabilities can be easily exploited through friendships struck up anonymously in online chat rooms. During an investigation of a specific disgruntled employee, it may be advisable, within the confines of the law, to keep track of what that person is saying, and to whom, both during online contacts on company systems and in chat rooms.
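Where the law and company policy permit such a review, even a simple script can help an investigator keep track of whom a subject is writing to on company systems. The following is a minimal sketch, not a description of any particular tool; it assumes the subject's outbound mail has been exported to a standard mbox file, and the file name, company domain, and watchword list are all hypothetical.

```python
import mailbox

# Hypothetical inputs: an mbox export of the subject's outbound company
# mail, the company's own mail domain, and a short watchword list.
MBOX_PATH = "subject_outbound.mbox"
INTERNAL_DOMAIN = "@example-corp.com"
WATCHWORDS = {"fire me", "quit", "root access", "password", "logic bomb"}

def flag_messages(path):
    """Yield (date, recipients, keyword hits, external addresses) for
    messages that mention a watchword or go outside the company."""
    for msg in mailbox.mbox(path):
        recipients = msg.get("To", "")
        body = msg.get_payload()
        if not isinstance(body, str):   # skip multipart mail in this sketch
            continue
        hits = sorted(w for w in WATCHWORDS if w in body.lower())
        external = [r.strip() for r in recipients.split(",")
                    if r.strip() and INTERNAL_DOMAIN not in r]
        if hits or external:
            yield msg.get("Date", "unknown"), recipients, hits, external

if __name__ == "__main__":
    for date, to, hits, external in flag_messages(MBOX_PATH):
        print(f"{date} | to: {to} | keywords: {hits} | external: {external}")
```

A real deployment would also need to handle multipart messages and, more importantly, the legal and policy review emphasized above.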

Ethical flexibility. Many of the subjects whose cases we studied reportedly did not view their violations as unethical; some even viewed them as justified under the circumstances. These subjects appeared to lack the moral inhibitions that prevent others, even when similarly provoked, from committing such acts.

This finding is consistent with earlier research on ethical boundaries within the "information culture" conducted by S. Harrington and published in 1995. Harrington's findings indicate that approximately 7 percent of computer professionals do not object to cracking, espionage, or sabotage. Their rationale is that an electronic asset is fair game for attack if it has not been sufficiently secured by the company. Other social phenomena have been cited as contributing to such ethical flexibility, including a lack of training in computer ethics, a lack of specific policies and regulations on privacy and security, a lack of legal penalties for abuses, and the lack of face-to-face interaction in cyberspace.

Reduced loyalty. The subjects in our study appeared to identify more with their profession or computer specialty than with their employer. This finding is reminiscent of a study of computer fraud conducted by the U.S. Department of Health and Human Services in 1986, which found that computer programmers who committed fraud felt more loyalty to their profession than to their employer.

Entitlement. Entitlement, the sense that one is special and owed corresponding recognition, privilege, or exceptions, was a key characteristic of many of the attackers. In addition, the study found that this trait was frequently reinforced by the employer. When combined with a preexisting anger toward authority figures, as it often was, this sense of entitlement fueled a desire for revenge in reaction to perceived slights or setbacks.

In one case, a computer programmer who was promised an advanced technical position was instead assigned to a help desk. Having in his view been intentionally demeaned or negligently overlooked by his employer, the man became increasingly disgruntled and eventually took down the servers of his employer, a major publishing company.

Lack of empathy. An employee's disregard for the impact of his or her action on others, or inability to appreciate this impact, has been noted consistently by investigators. Likewise, many of the subjects in our study lacked empathy. This characteristic is magnified by the nature of cyberspace, where the effect of events is muted by the lack of immediate apparent consequences.

Typology.
Our review of insider computer crimes by employees, contractors, consultants, and others with trusted access to a corporate or government computer system indicates that perpetrators fall into several motivational categories. In our typology, they are referred to as explorers, good Samaritans, hackers, Machiavellians, exceptions, avengers, career thieves, and moles.

Explorers. Explorers tend to be motivated by curiosity, wandering into poorly designated or relatively unprotected areas of the corporate network. They rarely cause any damage purposefully and are, therefore, rarely punished. In most cases, their forays simply reveal the organization's lack of adequate policies and safeguards and expose the potential consequences of unauthorized access.

Good Samaritans. These employees believe that their violations resulted from efforts to perform legitimate duties more effectively and efficiently. Good Samaritans often claim they were unaware that their activities violated a rule. They also sometimes argue that their need to resolve an emergency outweighed any minor violation of procedures. These individuals like to "save the day" or show off their abilities. Good Samaritans should not be confused with other perpetrators who disingenuously claim they were just "testing security" when caught hacking the system.

A recurring example of the good Samaritan is the person who discovers a system problem outside of his or her range of responsibilities and violates procedures to make a fix, often endangering the system or triggering alarms in the process. In one case, a military technician discovered a system failure in a network identical to his own but located at another facility. Violating protocol, he hacked into the computer system to make what he felt were essential emergency repairs, setting off the network's intrusion detection system. Although he was reprimanded for this breach of protocol, he was not prosecuted because of his putative benevolent intent.

Hackers. Hacker is a widely used term with a broad array of meanings, but for our purposes hackers are those who need to violate access boundaries to bolster their self-esteem. The ego boost they receive from challenges to authority, peer approval, and technical prowess assuages the wounds inflicted by previous personal, social, and professional setbacks. Typically, those we characterized as hackers in our study had invaded systems before they were hired and continued to hack into other systems while on the job.

Hackers are particularly dangerous when allied with groups of like-minded computer experts. In the desire to demonstrate their accomplishments, these persons may provide outsiders with unauthorized access to company systems, or they may divulge system safeguards to win peer approval. Often their group affiliation leads to escalation of illegal activities and competition for attention, increasing the risk of damage to the corporate network.

Many of the hackers in our database did not commit intentional destruction unless they became disgruntled or were fired by the company or threatened with termination. However, they tended to operate by a flexible set of ethical guidelines, which could be summarized as "if it isn't tied down, it's mine to play with."

A signal corps specialist, for example, compulsively violated procedures just to demonstrate that he could. He used a sensitive military system to access an Internet service provider. He also gave outside hacker friends demonstrations of his prowess and access to military systems.

Within the hacker category, some cases demonstrated a specific pattern warranting a subtype, which we designated "the golden parachuters." These people have records of past violations that they do not disclose to new employers, but they plan for their eventual discovery. They insert logic bombs, for example, or other system booby traps, which they are uniquely qualified to defuse, in exchange for a generous consulting fee or severance package. Such extortion tactics are rarely reported, and it is often more cost-effective for the employer to pay off the employee than to press charges.

Machiavellians. Machiavellians use corporate systems ruthlessly to advance their personal and career goals. In our study, Machiavellians used logic bombs to establish consulting careers or to use as insurance against being fired; they framed bosses to advance their careers, stole intellectual property to jump-start their next position, and created disruptions that only they could fix to promote their advancement or to arrange special travel (for example, the IT worker would be sent to a remote site in a foreign country to repair the system). Some also damaged the equipment and products of rivals. Attacks by this personality type may involve some element of disgruntlement, but they are more likely to be calculated efforts at advancement than reactions to perceived setbacks.

In one case, an information systems specialist left his employer to establish a competitive firm. Shortly afterwards he convinced another employee of his old company to leave and pilfer sensitive proprietary data, including customer information. Another former coworker was then convinced to destroy the office's database and to defect to the new firm. The new business slipped smoothly into place as it supplanted the now critically disabled original employer.

Exceptions. Exceptions view themselves as special and deserving of extraordinary recognition. They also consider themselves above the rules that apply to other employees. They are sensitive to slights and become disgruntled easily, even when treated normally. They often deflect blame to others, and beneath their grandiose view of their own importance lies a fragile self-esteem.

One important subset of exceptions is the proprietor. Proprietors feel that they own their systems. In most of the cases studied, these feelings of entitlement were unwittingly fomented by the employer. In the case that opened this article, for example, the employer had provided the worker with a unique pay structure and unusual independence, and his supervisor tolerated the employee's insubordination and direct refusal to obey directives.

In another case, an engineer at an energy processing plant, depressed and agitated over his wife's fatal illness, blamed the company when its health insurance would not cover the recommended treatment. He stewed further when his older, technically experienced supervisor was replaced by a younger, nontechnical manager brought in to tighten control of plant operations.

The engineer was subsequently put on probation for bringing a weapon to work, yelling at vendors, and refusing to comply with the new manager's orders. Engineering staff soon discovered that he had made several idiosyncratic changes to computerized plant controls and safety systems. Confronted with these changes, the engineer refused to divulge the system password, in effect hijacking the plant and threatening plant safety.

In that case, the subject had previously committed dozens of violations of personnel and computer policies. The former supervisor had tolerated the behavior because he believed that the engineer was irreplaceable and he was intimidated by the engineer's system knowledge and unique control over computer resources. Our study shows that when supervisors overlook violations in these situations, the subject's sense of entitlement and expectations of special treatment grow and undesirable behavior is more likely to escalate to problematic levels.

Avengers. Those known as avengers are motivated to attack in reaction to specific perceived setbacks, disappointments, and frustrations rather than by general disgruntlement. In the cases we examined, setbacks included termination, transfer, demotion, or failure to receive an expected raise, reward, or other form of recognition. The critical issue was the employee's perception of mistreatment, not any objective standards or the assessment of others familiar with the circumstances. The bitterness of avengers can manifest itself in a range of malicious acts, including sabotage, espionage, theft, fraud, and extortion.

A sample case involved a system administrator at a large healthcare facility who heard a rumor that she was going to be terminated as part of a downsizing effort. She encrypted the facility's patient files and offered to fix the problem in exchange for a small severance package, including cash and a no-prosecution agreement. Fearing severe disruption of patient care and negative publicity, the hospital quickly agreed to her terms. Prosecutors reviewing the case concluded that the deal struck by hospital administrators precluded them from pressing charges against the employee.

It should be noted that revenge is a motive among system attackers in other categories as well. In fact, as we learn more about the pathway each perpetrator type follows to disgruntlement, we expect that many cases now in the avenger category may be found to fit more appropriately in one of the other perpetrator categories.

Career thieves. Career thieves enter the organization with a predetermined plan to use the computer as a tool for embezzlement, theft, fraud, or other illegal moneymaking schemes. For these individuals the computer is simply a means of acquiring funds. Theirs are cold, calculated, and unprovoked schemes, with no necessary relationship to perceived mistreatment by the company.

Moles. Similar to the career thief is the mole, who joins an organization with the intent to commit espionage for the benefit of another company or a foreign government. Career thieves, by contrast, work purely to benefit themselves. Moles can also be distinguished from disgruntled employees or avengers who, out of anger or resentment, commit espionage for revenge.

Lessons learned.
Our analysis revealed that organizations victimized by IT employees often were vulnerable to these attacks not only because of poor computer security but also because of poor management practices. These problems include underreporting of incidents, poor employee screening, incomplete termination procedures, missed warning signs, and unmonitored online communications.

Underreporting. The problem of insider attacks remains highly underreported. According to the authors' interviews with computer crime investigators, most public and private organizations prefer to deal with these events quickly and quietly to avoid publicity. The relative leniency of criminal penalties connected with these crimes and the technical difficulties involved in successfully prosecuting such cases give companies little incentive for reporting incidents to police. Consequently, companies that have yet to be victimized underestimate the risk and are hampered in their efforts to take precautionary steps by a lack of knowledge of previous incidents or perpetrators.

Employee screening. The lack of basic screening of employees with access to vital systems is also a problem. While screening can't eliminate the risk that some employees will become disgruntled on the job, it can at least weed out applicants with criminal hacking histories. These convicted criminals routinely go from one victimized company to the next without being challenged.

One person in our database, for example, sought and gained employment in the same industry and in the same town the day after his indictment. Another person arrested for hacking, who had prior criminal convictions and a record of membership in an infamous hacker organization, was offered a choice between jail and military service. Not surprisingly, he chose the latter. He again came to the attention of authorities when he tried to recontact the hacker group and was caught hacking into the local telephone system from his base.

The Treasury Department has played a pioneering role in ensuring that computer system contractors vet their employees. As of October 29, 1997, contractors designing, managing, or seeking access to Treasury information systems must, before they can win a contract, certify and provide evidence that their employees have undergone clearance procedures.

The Department of Defense is expected to follow suit with a similar rule. There is evidence that such a program can have a profound impact on U.S. industry beyond defense contracting. During the 1980s, Defense targeted substance abuse by requiring employee assistance program services as a condition of any contract. This mandate sparked an explosion of these services both within and outside of government. We predict a similar pattern with security screening; screening programs will proliferate, and organizations able to prove that their employees are thoroughly vetted will gain an edge in contract competition.

Our review of the behavioral patterns of information technology insiders who commit malicious acts suggests that security clearance programs will have to be tailored to these unique employees. Traditional concerns about foreign contacts should be augmented with specific questions concerning online contacts, particularly with the hacker community. Similarly, expanded background questions, both professional and personal, tailored to the unique characteristics of this special population should draw on perpetrator studies.

Personnel changes. Many of the subjects in our study had carried out their attacks after hearing that they were going to be demoted, terminated, or reassigned. Businesses have paid insufficient attention to the dangers of personnel changes. Such changes are likely to be the catalyst for malicious acts in persons predisposed to those behaviors. But termination is not the only stimulus for these acts. In one case we reviewed, a project leader was demoted to team member. In anger, he e-mailed project design plans to several competitors. Thus companies should consider how to reduce the motivation and remove the opportunity for carrying out those acts.

For example, companies should allocate more security and psychological resources to supervisors and human resource departments when IT personnel at risk are being transferred from a job or fired. Providing job assistance in the case of termination, for example, may alleviate some of an employee's resentment.

In our experience, termination initiates a grief process in the employee, one of the first stages of which is anger. Psychological debriefing focuses on discussions with the employee designed to move him or her through the grief process, specifically through this predictable angry reaction. Such intervention has frequently proved its worth, though standard debriefing approaches must often be modified for the specialized information systems worker population.

In the case of the engineer at the energy processing plant who brought a weapon to work, for example, the company moved in quickly to counsel the employee. During the course of several debriefing sessions, the employee made the transition from anger to acceptance and was helped to find useful employment as a union representative, where he could express his distrust of the company through a sanctioned pathway.

The person's access to critical systems should also be immediately revoked if possible. (If the person has not been terminated, but rather transferred or demoted, some access may still be required, making security more difficult and monitoring more important.) But even where termination makes total revocation of access appropriate, physical removal or password deletions alone may not be sufficient; the employee may have unique or specialized system knowledge or ongoing social contacts with employees who have continuing access to the corporate system.
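One way to keep those extra steps from being forgotten is to drive termination from an explicit checklist rather than from memory. The sketch below is a minimal illustration of that idea, not a description of any particular product; the task names and employee identifier are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Offboarding:
    """Track revocation tasks for a departing IT employee. Deleting the
    person's own passwords is only one item; shared or root credentials
    the employee knew must be rotated, and physical access revoked too."""
    employee: str
    tasks: dict = field(default_factory=dict)

    def add(self, task: str) -> None:
        self.tasks[task] = False

    def complete(self, task: str) -> None:
        self.tasks[task] = True

    def outstanding(self) -> list:
        return [t for t, done in self.tasks.items() if not done]

if __name__ == "__main__":
    ob = Offboarding("j.doe")
    for task in (
        "disable personal accounts and passwords",
        "revoke remote and VPN access",
        "rotate shared root and service credentials",  # knowledge outlives removal
        "collect badge, keys, and equipment",
        "audit systems only this employee administered",
    ):
        ob.add(task)
    ob.complete("disable personal accounts and passwords")
    print(f"Open items for {ob.employee}: {ob.outstanding()}")
```

The point of the design is the audit trail: any item still in `outstanding()` is a residual avenue of access, whether technical, social, or physical.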

Warning signs. While many cases first appear to involve disgruntled employees who execute a single massive attack after receiving dramatic news, perceiving a slight, or suffering a setback, closer examination of the case histories reveals that many employees demonstrated clear signs of disgruntlement and committed less serious violations leading up to the ultimate act. This finding suggests that the most devastating attacks could have been prevented if the early indications of dissatisfaction had been recognized or taken more seriously.

In many of our cases, supervisors were aware of violations but did not appreciate their significance. This finding indicates a strong need for supervisor training, improved communications with security, and independent channels by which security can monitor risk situations, other than via supervisor notification.

Online communications. The signals that presage attacks by IT professionals are more likely to appear online, in internal office e-mail or other electronic formats, than face to face. But managers are either unaware of these communications or tend not to react to them the way they would respond to a troubling face-to-face encounter.

Consider the case mentioned at the beginning of this article concerning the system administrator crashing the accounting servers. Three months before destroying the servers, the administrator had sent the following e-mail message to his supervisor: "Until you fire me or I quit, I have to take orders from you. ... Unless she [a proposed backup for the employee] is trained, I won't give her access. ... If you order me to give her root [access], then you have to relieve me of my duties on that machine. I won't be a garbage cleaner if she screws up."

The tone of his e-mail messages abruptly shifted a month before termination, when the subject appeared to be the model of a good team player: "Whether or not you continue me after next month, you can always count on me for quick response to any questions, concerns, or production problems with the system. As always, you'll get my most cost-effective, and productive solution from me."

It is clear that in the latter e-mails the subject feigned a pleasant demeanor. In fact, he was planning to sabotage the servers on his last day at work. The supervisor presumably noticed the unpleasant tone of the early messages but failed to bring them to the attention of security. He was subsequently deceived by the subject's charm and bonhomie.

Recognizing the importance of electronic communications, the Securities and Exchange Commission (SEC) has ordered regulated institutions (primarily brokerage houses) to monitor the work-related online communications of their employees for signs of trouble, particularly insider trading. This does not, however, involve psychological profiling.

Other private and governmental organizations are considering similar moves, particularly given the potential liability of failing to protect these important assets. In the future, certification requirements for contractors may include provisions for automated monitoring of known linguistic patterns and psychological precursors of insider attacks. The authors' firm has developed psycholinguistic measures sensitive to changes in an employee's psychological state indicative of increased risk. In the case of the employee who abruptly changed his tone in his e-mail messages, post hoc use of these measures detected both the employee's initial disgruntlement and the contrast between his overt and covert attitudes. Had these automated measures been monitored by security, this incident might have been prevented.
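The authors do not describe their measures, so the following is only a crude, hypothetical stand-in for the idea: score each message against small hostile and conciliatory wordlists, then flag abrupt swings in tone between consecutive messages, the overt/covert contrast seen in the e-mails quoted above. The wordlists and threshold are illustrative inventions, not validated instruments.

```python
# Crude, hypothetical stand-in for a psycholinguistic tone monitor:
# wordlist scoring plus a jump detector for abrupt changes in tone.
HOSTILE = {"fire", "quit", "won't", "refuse", "garbage", "screws"}
CONCILIATORY = {"always", "count", "quick", "productive", "cost-effective"}

def tone(text: str) -> int:
    """Positive = conciliatory, negative = hostile."""
    words = [w.strip(".,!?\"'") for w in text.lower().split()]
    return sum(w in CONCILIATORY for w in words) - sum(w in HOSTILE for w in words)

def flag_shifts(messages, threshold=3):
    """Return tone scores and the indices where tone jumps sharply."""
    scores = [tone(m) for m in messages]
    shifts = [i for i in range(1, len(scores))
              if abs(scores[i] - scores[i - 1]) >= threshold]
    return scores, shifts

if __name__ == "__main__":
    history = [
        "Until you fire me or I quit, I have to take orders from you.",
        "I won't give her access. I won't be a garbage cleaner if she screws up.",
        "You can always count on me for quick response. You'll get my most "
        "cost-effective, and productive solution from me.",
    ]
    scores, shifts = flag_shifts(history)
    print("tone per message:", scores, "| abrupt shift before message:", shifts)
```

On the quoted e-mails this toy scorer yields tone scores of -2, -4, and +5, flagging the sudden swing before the final message; a production system would need validated measures rather than a wordlist.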

Implications.
Our research has several implications for the management of the insider threat, and we are currently working on several projects related to these findings. The findings point to the need for improved preemployment screening practices and for improved management tools, including specific policies governing system violations and misuse, strict enforcement of those policies, and new forms of supervisor training, with an emphasis on threat assessment and management, for handling these challenging employees.

Also necessary are innovative online approaches to employee management, threat detection, and services. Examples include online monitoring to spot threatening changes in employee attitudes and psychological states, online ethical help desks, online employee assistance points of contact more likely to be used by these employees, online employee bulletin boards for anonymous communications, and other forms of online services tailored to the preferred communications means of information technology employees.

The information revolution has transformed the workplace. The special class of employees that is now playing a leading role in the electronic workplace has unique psychological features that require specialized security practices and management techniques. While most IT professionals are honest and valued business partners, security must learn to understand and counter the threat of those who are not.


Eric D. Shaw is a clinical psychologist, a former intelligence officer, and an adjunct associate professor of political psychology at George Washington University's Elliott School of International Affairs. He is director of research at Political Psychology Associates, Ltd. (PPA), a Bethesda, Maryland-based behavioral science firm providing organizational research and consulting services to industry and government, including profiling, threat analysis, employee screening, human factors security audits, and consultations on workplace violence prevention. Jerrold M. Post, a psychiatrist by training, is founder and president of PPA and director of the political psychology program at George Washington University. During his 21 years in the CIA, he founded and directed the Political Psychology Center. He is a member of ASIS. Keven G. Ruby is a research analyst at PPA, where he specializes in terrorist group dynamics, weapons proliferation, and hacker groups.

This article is based on research supported by the Office of Information Operations of the Assistant Secretary of Defense for Command, Control, and Communications, Department of Defense. All case materials were recorded with anonymity for the subject, his or her organization, and the source involved. The authors hope to make this database available soon to researchers and practitioners. They are still collecting cases and would be interested in talking to anyone (with the promise of anonymity) with knowledge of cases or subjects. If interested, call Dr. Shaw at 202/686-9150 or e-mail him at eshaw@pol-psych.com. Comments on the article itself should be directed to sharowitz@asisonline.org.

