
Social engineering: Don't be fooled

Written by Patrick Gray, Contributor

It is a hard threat to protect against, as attackers prey on the kindness of strangers, but there are some tips to prevent your company from falling victim to social engineering ploys.
Ever been conned? Would you know if you had been? Social engineers practice a subtle art. Their techniques, when applied properly, leave victims none the wiser as to how they may have let an attacker in.
Sophisticated social engineers take advantage of the security vulnerabilities in human nature, and not software, in order to penetrate otherwise well-protected networks.
The thief
One such master of deception is US-based Kevin Mitnick, who has been imprisoned three times for computer crime. After his release from a five-year prison stint, Mitnick hung up his criminal spurs and now runs a specialist consultancy, Defensive Thinking, dedicated to protecting unsuspecting employees from the whims of the social engineer.

As one of the United States' most prolific cyber-criminals, Mitnick pulled off some pretty impressive con jobs. He'd trick people into disclosing all sorts of things: passwords, modem numbers, and general technical information. We asked Mitnick what the social engineer may be looking for when they ring an unsuspecting employee at an organisation they're targeting.

"Its largely... calling someone and tricking them out of their password," he says. "But there are a lot more sophisticated attacks where people just need bits and pieces of information."

Say you wanted to target a software company (which Mitnick has been known to do, siphoning off source code from DEC in the '80s and later from Nokia, Sun Microsystems, Motorola, and NEC): you wouldn't just ring up the administrator and say "give me your password".

An attacker with skill would target the low-hanging fruit, perhaps a workstation on the company's LAN, using a run-of-the-mill technical vulnerability. Social engineering could then be used to find out which machine on the network contained what the attacker was after, saving them countless hours of fumbling about on the LAN while running the risk of tripping some sort of alarm or raising suspicions.

How can these attacks be foiled?
Start by training employees to evaluate "requests" in isolation, not the circumstances under which they are made. Mitnick says social engineers use flattery, as in "you're the only one smart enough to do this for me, please run the attachment I'm about to send you", and intimidation, as in "if you don't give me your password so I can log in to get my mail you'll lose your job".
If you separate the circumstance from the request, you have half the battle won, he says.
"They key is... to train staff to determine what is a legitimate and what is an illegitimate request," he says.
There are also some easy rules and policies that can help. Almost all the time a social engineer will refuse to give a call-back number. "They'll come up with an excuse... like 'my cell-phone battery is dying'," Mitnick explains. By putting in a policy that "states if someone is making a request of a sensitive nature... and you don't personally know this person, then you have to call them back," around seven out of 10 social engineering attacks will be foiled.

One smart policy designed to crack down on social engineers is to call anyone requesting a password reset back at their desk before acting.
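
A minimal sketch of how that call-back rule might be encoded in a helpdesk tool, written in Python; the directory entries and function names here are hypothetical, not drawn from any product:

```python
# A minimal sketch of the call-back rule for sensitive requests.
# The staff directory is a stand-in dictionary, not a real product's API.

STAFF_DIRECTORY = {
    "john smith": "+61 7 5550 2231",    # hypothetical entries
    "jane citizen": "+61 7 5550 1144",
}

def handle_sensitive_request(caller_name, supplied_callback=None):
    """Decide how to act on a sensitive request such as a password reset."""
    directory_number = STAFF_DIRECTORY.get(caller_name.strip().lower())
    if directory_number is None:
        return "Decline and escalate: the caller is not in the staff directory."
    if supplied_callback and supplied_callback != directory_number:
        # A caller steering you to a different number is itself a warning sign.
        return "Warning: the supplied call-back number does not match the directory."
    return f"Call the requester back on {directory_number} before acting."

if __name__ == "__main__":
    print(handle_sensitive_request("John Smith"))
    print(handle_sensitive_request("Mallory", supplied_callback="+61 400 000 000"))
```

The point of the sketch is simply that the number you ring back comes from your own records, never from the caller.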

Mitnick isn't an IT security generalist; social engineering is his bag. He has studied the psychology behind his success as a criminal attacker, and built a business around his understanding of it.

"Social psychology tells us people are in two modes of thinking: systematic mode and heuristic mode," Mitnick explains. When a person is operating in systematic mode, they have the wherewithal to think. When operating in heuristic mode, however, "we're simply idling by. We're distracted, we're thinking of something else. We're there 90 percent of the time".

It's at this time that we're most likely to cooperate with an attacker. The social engineer can bend the will of their victim without giving them reason enough to think twice. They can do this by asking for a favour, or by making a request that isn't something the target would normally do as part of their job. Currying favour is also high on the hit-list.

"If you are talking to somebody and you notice that coincidentally they grew up in the same state, or are interested in the same types of hobbies or sports... the attacker wants to mirror you because psychologically you're more likely to like someone like yourself," Mitnick says. "And when you like someone you're more likely to comply with their request."

"If there are too many coincidences it may be a yellow flag," he says.

Red and yellow flags
Mitnick advocates the introduction of red and yellow "flags" to help staff determine when they're being suckered in. "If you're talking to someone and they're over-flattering you... they tell you you're the only one smart enough to do what they say, that may be a red flag," he explains.
It's the human tendency to automatically trust rather than distrust strangers that gives social engineers their influence. How many times have you held a door open for a stranger, letting them into the building without swiping their card? People want to be liked, even by strangers, so they do favours for them. And of course it's polite to return a favour where possible. That human tendency is a big loophole for sophisticated attackers, Mitnick says. Exploiting this trait was one of his most successful techniques.
"When somebody does you a favour it is the rule of every human society that you reply in like kind. That's ingrained in every society, especially in the US," he says. "An attacker will purport to be helping you to solve a problem. Or they'll make up a problem and pretend they're helping you."

An attacker may ring a helpdesk and ask the staff about trouble tickets, pretending they're doing a management survey. Once they have the details of a trouble ticket, the social engineer can pose as a helpdesk operator and call the user experiencing problems and help them solve it. Why? So when the attacker calls them back a few hours later and says: "Hi, this is Bob from the helpdesk, I helped you with your e-mail before. If I send you a diagnostic tool by e-mail could you run it for me? It would really help me out a lot," the user will naturally cooperate. It's a fairly simple ruse, but one that could sucker in many intelligent, but unsuspecting, users.

Another psychological technique used by social engineers is called "reciprocal concession". "This is where the attacker asks for a favour that will take too much time or is too sensitive... they'll ask for an unreasonable request," Mitnick explains. Then the attacker will move on to an "if you can't help me do this, can you help me with this?" type of approach. The victim "will feel it necessary to compromise," Mitnick says.

The master social engineer says tricking someone into disclosing information or performing an action on behalf of the attacker is similar to good salesmanship.

"It's using sales and marketing tactics and applying them in a negative way," he says. "You want to have a set of red flags or yellow flags that are indicative of people using these tactics."

Train your people, Mitnick says, and audit them. The risks can't be eliminated, but they can be minimised. The proof? The master himself was recently taken in. A reporter told Mitnick his publisher had authorised him to discuss details of his new book. Mitnick believed him without verifying the request. "I was socially engineered," he admits.

The banker
Your bank account is the one user account you don't want to give a social engineer access to. We asked the Bank of Queensland's IT security manager, Karl Hanmore, about what he has put in place to reduce the effectiveness of social engineering.
"All Bank of Queensland staff receive IT security training as part of their corporate induction, which is then followed up by periodic bulletins," Hanmore says. "This is the first line of defence in social engineering: trying to prevent the attack from occurring in the first place."
The bank doesn't allow password resets "for electronic channels" over the telephone. Where an Internet banking password is reset, the new credential is sent via post to the customer's listed postal address.

Bank staff are monitored, too. "There are a number of checks surrounding the quality of staff performance, including ensuring they properly identify the customer. We supplement the standard processes with externally provided audits of the quality of the [call centre] responses."

Educating staff and assessing risk are the keys to reducing exposure. "A focus on staff training, documented processes and known escalation paths is critical... a risk assessment of the services you provide where authentication is especially prone to social engineering, like call centres, is highly recommended," Hanmore says.

The ex-cop
If you ask former Australian Federal Police officer Neil Campbell to socially engineer your company for the purposes of an audit, he will refuse.

"This is a tough thing for me. When I started out in IT security I did do the testing, but the negative impact of the testing outstrips the benefits," says Campbell, who now runs Dimension Data's security practice. "No matter how much an organisation is prepared to have its physical security tested, it's a very painful process to go through. You embarrass individuals by succeeding. It's difficult as a provider to engage in that exercise and remain friends afterwards."

"If you don't get in you've failed, and if you do get in you've embarrassed them and from a relationship perspective you've failed," he adds.

Despite the risks, Australian companies aren't spending big on providing their staff with specialist training offered by companies such as Mitnick's consultancy, Campbell says. "If you're cynical about it... how can I say to my boss 'Hey, I spent $50,000 on awareness' and justify it?" he asks. "I think it's one of the critical security problems that we have to deal with. There are services for it, but people don't buy them."

The vigilante
Social engineering can take on some unsophisticated forms. The "I love you" Internet worm was a terrific example of mass social engineering. Worm writers and fraudsters alike have engaged in social engineering to peddle their viruses and fraud. Whether it's tricking someone into divulging their Internet banking passwords (in what's been termed a "phishing" attack) or convincing the recipient of an e-mail to run a viral attachment, social engineering is now "broadcast capable" when combined with spamming techniques.
Canberra-based Daniel McNamara doesn't appreciate that kind of deception, so he decided to do something about it. He set up a Web site, named Code Fish Spam Watch, designed to combat the scourge. McNamara monitors phishing scams and spam Trojans in the hope of exposing them. That doesn't stop him, however, from describing the victims of some social engineering attacks in less than flattering terms.
"The social engineering attempts are aimed at the more greedy and gullible section of society. I'm afraid to say there is a fairly large percentage out there," McNamara says.
However, the lengths to which social engineers will go are astonishing. They will prey on the fears of the intended victim, posing as a news bulletin containing details of a terrorist attack on Australia, an e-mail from the bank claiming to have deducted money from the victim's account, or even a warning that the recipient is under investigation by the authorities for involvement in child pornography.
These examples are designed to rattle the victim. Instead of thinking "should I run the attachment," they're thinking "I wonder if my child died in the terrorist attack".

Not surprisingly, McNamara hasn't exactly earned the admiration of the online fraud community. Peeved phishers recently spammed messages with McNamara's e-mail address as the reply address. Because the spam involved a child pornography theme, McNamara's inbox was flooded with hate mail. The recipients of the spam had been socially engineered into attacking McNamara himself. He's taken it in his stride as a learning experience -- making some interesting observations about auto-responders -- which allow mail users to automatically reply to all e-mail, even spam, if they're out of the office.

"One of things I learnt during the recent attack... is that people put some very personal things in automated responses. People usually set up auto-responders to tell people about when they are away from work, that their off sick, and so on," McNamara says. "This is all fine and good but most people fail to realise that it might be possible that people they don't know end up reading these messages and that some of the information they're letting out shouldn't be."

"During the attack against my site we received approx 2500-3000 auto-responses. Most of these were fairly mundane but some we're very particular about where and when they wouldn't be in the office, what sort of sickness they had -- for example a fractured upper arm bone -- and a lot of US companies seemed to relish setting up auto-responders for ex-employees; one even stating that the person had been fired for misconduct," he adds. "This information is very useful to someone who is trying to target a business."

Social Engineering 101: The password reset
Security consultant Daniel Lewkovitz was commissioned to conduct a social engineering audit of a large company.
"The aim of the exercise was to gain remote access to the network and access certain files as part of a full security audit. The original plan was to war dial to find dial-in servers. An assumption was made that IT staff and senior management would have this access. The former were easily identified: pagers, tee-shirts etc," Lewkovitz explains.
"It turned out that the company allowed all staff to log in via their Web page. As such, anyone's details would have worked. After the password was reset, access was allowed to all manner of commercial-in-confidence material."
To begin, Lewkovitz obtained the name of an employee, and rang the company help desk impersonating the stolen identity. The help desk staff cheerily told him he would need his staff number and log-in ID in order to perform a password reset. Lewkovitz simply hung up and went about getting the information.

The login name was easy enough to come up with, and staff numbers were displayed prominently on badges hung around employees' necks. Lewkovitz simply meandered into the company's favourite local lunch spot and wrote down a few names and badge numbers.

The following is a direct transcript of the call that took place once Lewkovitz had obtained the necessary information (names have been changed):

Telephone Operator [TO]: Help desk, good morning.

Daniel Lewkovitz [DL]: Good afternoon actually.

TO: Whoops, sorry. Good afternoon [laughs] how can I help you?

DL: [Laughs] It's John Smith speaking. My computer says I can't log in.

TO: Umm . . . When you say you can't log in, what exactly is the message?

DL: Sorry, I don't recall exactly. It was after I type in my password.

TO: Oh, okay. Did the message start with "The password is incorrect, please retype your..."

DL: [Interrupts] Yeah, that was it. I got that a few times and now it won't let me get in at all!

TO: I see. Windows XP locks you out if you get the password wrong more than twice. I can reset it for you if you'd like.

DL: Thanks! I'll type it in properly this time.

TO: Actually we have to change your password to a new one.

DL: That's okay, I'll write it down this time [laughing].

TO: [Laughing] I've seen a lot of people around here do that. Sorry, I missed your name?

DL: John Smith

TO: Thanks. I need your badge number as well.

DL: No problem, it's 2231.

TO: Thanks. Your password has been reset to Tuesday. You'll have to change it on your first login and it will also take about two minutes for that to be updated in Lotus and the Intranet. Was there anything else?

DL: No thanks. That was all, you've been great. Thanks again.

TO: Okay, goodbye.

Lewkovitz offers the following analysis:

Note the operator volunteered information about the company's operating system, which could be useful in an attack. He also volunteered details of the password policy that may assist in cracking attempts; namely that the passwords would be reset to the day of the week, and that staff write down their passwords.

The operator also asked a closed question that could be answered yes or no based solely on information embedded in the question -- the equivalent of asking "are you using Windows?" rather than "what operating system are you using?"

The jovial manner of the call put the target at ease -- he kept trying to be helpful, polite, and friendly despite this being used against him. Security has to take precedence over rank or ego. You can still be polite while guarding security.

Badge IDs should have been considered "sensitive" as they could be used to reset passwords. Despite this, the information on them was readily obtainable because staff displayed them prominently. The company concerned has since allowed staff to nominate their own private questions, for example their child's favourite doll, rather than mandating dates of birth or badge numbers. There are a lot of companies that still use easily-obtained information as private identifiers. Even someone's date of birth is usually easy to ascertain.
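
One concrete fix for the predictable reset password Lewkovitz describes, sketched here as an assumption rather than the company's actual remedy: generate the temporary credential from a cryptographically strong random source, and still force a change at first login.

```python
# Minimal sketch: issue an unpredictable one-time password instead of a
# guessable value such as the day of the week. Length and alphabet are
# arbitrary choices for illustration.

import secrets
import string

def temporary_password(length=12):
    alphabet = string.ascii_letters + string.digits
    # secrets (rather than random) gives cryptographically unpredictable output.
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    print("Temporary password:", temporary_password())
    # The account should still be flagged to force a change at first login.
```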

Social Engineering 102: Code execution
Security consultant Adam Pointon was hired to conduct a penetration test against a relatively small, low-profile organisation that holds very commercially sensitive information. Company management dictated the audit must include a physical and social engineering component. Due to the small size of the organisation, Pointon decided a password reset would be too risky -- all the staff knew each other. So, he took a different tack. He would send a staff member a customised "proof of concept" Trojan he had written in C++ which would give him access to the corporate LAN. When executed, it would connect back to one of his systems.

"After the PABX audit which was conducted out of hours, I was able to determine the names of people in key positions within the company, including the full name and contact number of the communications manager. I got full read-and-write access to his voicemail, too, but that's another story," Pointon says.

"The next day I called and asked to be put through to the communications manager, without mentioning his name, I was forwarded through."

The following is a transcript of the call Pointon made. He uses his prior knowledge of the communications manager's last name, Smith, to assist his social engineering attempt. Names of both staff and companies have been changed.

Communications Manager [CM]: Hello James speaking.

Adam Pointon [AP]: Hi Mr Smith, it's Matt South from Acme Oz Telecommunications here, I'm calling to inform you that we are installing a new tail of our fibre loop at the rear of 28 Blah St, and I need your cooperation on the matter.

CM: [Cuts in] You're what? You're installing a fibre tail and you need my cooperation?!

AP: Yes. We are adding a new section to our fibre loop which will require us to access the rear of your premises. I need to find out if anyone will be there on Saturday from 9am until 4pm to assist us with access.

CM: OK hang on a minute. Where exactly is this fibre tail going to be laid?

AP: At the rear of 28 Blah St. There is an Acme Oz Telecommunications pit that currently provides lines to your building. We need to open the pit and feed through some cable.

CM: OK, well, I'm not sure if anyone will be here this weekend. Are you going to allow us to connect to this new fibre loop? We have asked you guys numerous times if we can get a faster feed in here but no one has ever got back to me about it.

AP: Well we must have taken that into consideration because on this document I have here it mentions your company as being a possible future user of the service.

CM: What's the document?

AP: I will send it to you if you like.

CM: OK, well send it through. I'll speak to my colleagues and see if anyone will be here this weekend.

AP: OK, one thing to note: the document is a protected Acme document in self-extracting .EXE format. It will be coming from my e-mail address matt.south@acmedomain.com

CM: OK, send it through [said impatiently].

AP: Give me five minutes.

CM: Right, bye.

The victim's desire for a faster Internet connection into the premises appears to be at the crux of his downfall, Pointon says. Being able to register a free e-mail address with the telecommunications company also helped: if the victim were to send a response to Pointon, he would have received it, which would not have been possible if he had forged the mail from a fictitious address.

The attachment was executed within two minutes of Pointon sending it.

Learn to recognise attacks
Here are some warning signs Kevin Mitnick advises looking out for:
Someone making a request:

  • Refuses to give you their contact information
  • Makes an out-of-the-ordinary request
  • Rushes what they claim is an "urgent" request
  • Mirrors your interests and background
  • Lays on too much flattery
  • Intimidates by using authoritative commands from management
  • Offers help with an unknown problem
  • Claims the request has been approved by management

How to build resistance to manipulation:
  • Demonstrate personal vulnerability (role-play to demonstrate social engineering techniques)
  • Train employees to focus on the nature of requests, not the context in which they are made
  • Verify the identity of those making requests and their authority to do so
  • Modify enterprise politeness norms
  • Change attitudes toward information: protecting versus sharing
  • Educate personnel why security protocols are critical
  • Provide employees with stress management and assertiveness training

How to respond to incidents:
  • Know when you've had one!
  • Train employees to properly document suspicious events
  • Issue security alerts when suspicious activity is noticed

Thwarting social engineering attacks:
  • Define security policies and procedures

    - Classify data and handling practices
    - Implement a clean-desk policy, otherwise people leave too much information lying around in plain view!
  • Conduct security awareness training
  • Get a social engineering penetration test
  • Do some periodic dumpster diving

This article was first published in Technology & Business magazine.
