Social engineering (security)

In the context of information security, social engineering is the psychological manipulation of people into performing actions or divulging confidential information. This differs from social engineering within the social sciences, which does not concern the divulging of confidential information. A type of confidence trick for the purpose of information gathering, fraud, or system access, it differs from a traditional "con" in that it is often one of many steps in a more complex fraud scheme.[1]


It has also been defined as "any act that influences a person to take an action that may or may not be in their best interests."[2]

An example of social engineering is the use of the "forgot password" function on most websites that require a login. An improperly secured password-recovery system can be used to grant a malicious attacker full access to a user's account, while the original user loses access to it.
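
The weakness usually lies in how the reset token is generated and handled. A minimal sketch of the difference, assuming a hypothetical web backend (the function names and sizes are illustrative, not taken from any particular site):

    import hashlib, secrets, time

    def weak_reset_token(email):
        # Predictable: derived from public data plus the current time, so an attacker who
        # knows the e-mail address can enumerate a small set of candidate tokens.
        return hashlib.md5(f"{email}{int(time.time())}".encode()).hexdigest()

    def strong_reset_token():
        # Unguessable: 256 random bits from a CSPRNG; in practice it should also be stored
        # hashed, bound to a single account, short-lived, and invalidated after first use.
        return secrets.token_urlsafe(32)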

Information security culture

Employee behaviour can have a big impact on information security in organizations. Cultural norms can help different segments of an organization work effectively towards information security, or work against it. "Exploring the Relationship between Organizational Culture and Information Security Culture" provides the following definition of information security culture: "ISC is the totality of patterns of behavior in an organization that contribute to the protection of information of all kinds."[3]

Andersson and Reimers (2014) found that employees often do not see themselves as part of the organization's information security "effort" and often take actions that ignore organizational information security best interests.[4] Research shows that information security culture needs to be improved continuously. In "Information Security Culture from Analysis to Change," the authors commented that "it's a never ending process, a cycle of evaluation and change or maintenance." They suggest that to manage information security culture, five steps should be taken: pre-evaluation, strategic planning, operative planning, implementation, and post-evaluation.[5]

  • Pre-Evaluation: to identify the awareness of information security among employees and to analyse the current security policy.
  • Strategic Planning: to come up with a better awareness program, clear targets need to be set. Clustering people into groups helps to achieve this.
  • Operative Planning: to establish a good security culture based on internal communication, management buy-in, and a security awareness and training program.[5]
  • Implementation: four stages should be used to implement the information security culture: commitment of the management, communication with organizational members, courses for all organizational members, and commitment of the employees.[5]
  • Post-Evaluation: to assess the effectiveness of the preceding steps and feed the results back into the next cycle of evaluation and change.[5]

Techniques and terms

All social engineering techniques are based on specific attributes of human decision-making known as cognitive biases.[6][7] These biases, sometimes called "bugs in the human hardware", are exploited in various combinations to create attack techniques, some of which are listed below. Social engineering attacks can be used to steal employees' confidential information. The most common type of social engineering happens over the phone. Other examples of social engineering attacks include criminals posing as exterminators, fire marshals, and technicians to go unnoticed as they steal company secrets.

One example of social engineering is an individual who walks into a building and posts an official-looking announcement on the company bulletin board saying that the number for the help desk has changed. When employees call for help, the individual asks them for their passwords and IDs, thereby gaining the ability to access the company's private information. Another example of social engineering would be a hacker who contacts the target on a social networking site and starts a conversation with them. Gradually the hacker gains the target's trust and then uses it to get access to sensitive information such as passwords or bank account details.[8]

Social engineering relies heavily on the six principles of influence established by Robert Cialdini: reciprocity, commitment and consistency, social proof, authority, liking, and scarcity.

Authority

In social engineering, the attacker may pose as an authority figure to increase the likelihood that the victim will comply.

Intimidation

The attacker (potentially disguised) informs or implies that there will be negative consequences if certain actions are not performed. Consequences can range from subtle intimidation, such as "I'll tell your manager", to much worse.

Consensus/Social proof

People will do things that they see other people are doing. For example, in one experiment, one or more confederates would look up into the sky; bystanders would then look up into the sky to see what they were missing. At one point this experiment was aborted, as so many people were looking up that they stopped traffic. See conformity, and the Asch conformity experiments.

Scarcity

Perceived scarcity will generate demand. The common advertising phrase "while supplies last" capitalizes on a sense of scarcity.

Urgency

Linked to scarcity, attackers use urgency as a time-based psychological principle of social engineering. For example, saying offers are available for a "limited time only" encourages sales through a sense of urgency.

Familiarity / Liking

People are easily persuaded by other people whom they like. Cialdini cites the marketing of Tupperware in what might now be called viral marketing: people were more likely to buy if they liked the person selling it to them. Many of the biases favoring more attractive people also play a role here; see physical attractiveness stereotype.

Vishing

Vishing, otherwise known as "voice phishing", is the criminal practice of using social engineering over a telephone system to gain access to private personal and financial information from the public for the purpose of financial reward.[9] It is also employed by attackers for reconnaissance purposes to gather more detailed intelligence on a target organization.

Phishing

Phishing is a technique of fraudulently obtaining private information. Typically, the phisher sends an e-mail that appears to come from a legitimate business—a bank, or credit card company—requesting "verification" of information and warning of some dire consequence if it is not provided. The e-mail usually contains a link to a fraudulent web page that seems legitimate—with company logos and content—and has a form requesting everything from a home address to an ATM card's PIN or a credit card number. For example, in 2003, there was a phishing scam in which users received emails supposedly from eBay claiming that the user's account was about to be suspended unless a link provided was clicked to update a credit card (information that the genuine eBay already had).[10] By mimicking a legitimate organization's HTML code and logos, it is relatively simple to make a fake website look authentic. The scam tricked some people into thinking that eBay was requiring them to update their account information by clicking on the link provided. By indiscriminately spamming extremely large groups of people, the "phisher" counted on gaining sensitive financial information from the small percentage (yet large number) of recipients who already had eBay accounts and also fell prey to the scam.
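
A key giveaway in such e-mails is that the text a link displays does not match where the link actually points. A minimal heuristic sketch of that check (the regular expressions and the sample anchor are illustrative only, not taken from the 2003 scam):

    import re
    from urllib.parse import urlparse

    def mismatched_links(html):
        # Flag anchors whose visible text names one domain while the href points elsewhere.
        suspicious = []
        anchor = r'<a[^>]+href=["\']([^"\']+)["\'][^>]*>(.*?)</a>'
        for href, text in re.findall(anchor, html, re.I | re.S):
            shown = re.search(r'[\w.-]+\.[a-z]{2,}', text, re.I)   # domain-looking display text
            host = urlparse(href).hostname or ""
            if shown and shown.group(0).lower() not in host.lower():
                suspicious.append((text.strip(), href))
        return suspicious

    print(mismatched_links('<a href="http://203.0.113.7/update">www.ebay.com</a>'))
    # [('www.ebay.com', 'http://203.0.113.7/update')]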

Smishing

Smishing is the act of using SMS text messaging to lure victims into a specific course of action.[11] Like phishing, it can involve clicking on a malicious link or divulging information. Examples include text messages that claim to be from a common carrier (like FedEx) stating that a package is in transit, with a link provided.

Impersonation

Impersonation is pretending to be another person with the goal of gaining physical access to a system or building. It is used, for example, in the "SIM swap scam" fraud.

Other concepts

Pretexting

Pretexting (adj. pretextual) is the act of creating and using an invented scenario (the pretext) to engage a targeted victim in a manner that increases the chance the victim will divulge information or perform actions that would be unlikely in ordinary circumstances.[12] An elaborate lie, it most often involves some prior research or setup and the use of this information for impersonation (e.g., date of birth, Social Security number, last bill amount) to establish legitimacy in the mind of the target.[13] Pretexting can be seen as the earliest form of social engineering, and it has continued to develop as social engineering has incorporated newer technologies; current and past examples of pretexting demonstrate this development.

This technique can be used to fool a business into disclosing customer information as well as by private investigators to obtain telephone records, utility records, banking records and other information directly from company service representatives.[14] The information can then be used to establish even greater legitimacy under tougher questioning with a manager, e.g., to make account changes, get specific balances, etc.

Pretexting can also be used to impersonate co-workers, police, bank and tax authorities, clergy, insurance investigators, or any other individual who could have perceived authority or a right-to-know in the mind of the targeted victim. The pretexter must simply prepare answers to questions that might be asked by the victim. In some cases, all that is needed is a voice that sounds authoritative, an earnest tone, and an ability to think on one's feet to create a pretextual scenario.

Vishing

Phone phishing (or "vishing") uses a rogue interactive voice response (IVR) system to recreate a legitimate-sounding copy of a bank or other institution's IVR system. The victim is prompted (typically via a phishing e-mail) to call in to the "bank" via a number (ideally toll-free) provided in order to "verify" information. A typical "vishing" system will reject log-ins continually, ensuring the victim enters PINs or passwords multiple times, often disclosing several different passwords. More advanced systems transfer the victim to the attacker/defrauder, who poses as a customer service agent or security expert for further questioning.

Spear phishing

Although similar to "phishing", spear phishing is a technique that fraudulently obtains private information by sending highly customized emails to a few end users. This is the main difference from phishing campaigns, which focus on sending out high volumes of generalized emails in the expectation that only a few people will respond. Spear-phishing emails, by contrast, require the attacker to perform additional research on their targets in order to "trick" end users into performing the requested activities. The success rate of spear-phishing attacks is considerably higher than that of phishing attacks: roughly 70% of spear-phishing emails are opened, compared with roughly 3% of generic phishing emails, and around 50% of opened spear-phishing emails lead to the link or attachment being clicked, compared with a relatively modest 5% for phishing.[15]
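
A rough worked comparison using the rates quoted above (the campaign sizes are arbitrary and chosen only to illustrate the arithmetic):

    # 1,000 generic phishing e-mails: ~3% opened, ~5% of those clicked
    generic_clicks = 1000 * 0.03 * 0.05   # about 1.5 expected clicks
    # 10 researched spear-phishing e-mails: ~70% opened, ~50% of those clicked
    spear_clicks = 10 * 0.70 * 0.50       # about 3.5 expected clicks
    print(generic_clicks, spear_clicks)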

Spear-phishing success is heavily dependent on the amount and quality of OSINT (open-source intelligence) that the attacker can obtain. Social media account activity is one example of a source of OSINT.

Water holing

Water holing is a targeted social engineering strategy that capitalizes on the trust users have in websites they regularly visit. The victim feels safe to do things they would not do in a different situation. A wary person might, for example, purposefully avoid clicking a link in an unsolicited email, but the same person would not hesitate to follow a link on a website they often visit. So, the attacker prepares a trap for the unwary prey at a favored watering hole. This strategy has been successfully used to gain access to some (supposedly) very secure systems.[16]

The attacker may set out by identifying a group or individuals to target. The preparation involves gathering information about websites the targets often visit from the secure system. The information gathering confirms that the targets visit the websites and that the system allows such visits. The attacker then tests these websites for vulnerabilities to inject code that may infect a visitor's system with malware. The injected code trap and malware may be tailored to the specific target group and the specific systems they use. In time, one or more members of the target group will get infected and the attacker can gain access to the secure system.
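
From the defender's side, one simple (and far from sufficient) way to notice the kind of injected code described above is to periodically compare the script sources on a frequently visited page against an expected set of hosts. This is only a sketch under that assumption; the allow-list and URL are hypothetical:

    import re
    import urllib.request
    from urllib.parse import urlparse

    EXPECTED_HOSTS = {"www.example.org", "cdn.example.org"}   # hypothetical allow-list

    def unexpected_script_hosts(url):
        # Fetch the page and return external script hosts not on the allow-list.
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        hosts = {urlparse(src).hostname
                 for src in re.findall(r'<script[^>]+src=["\']([^"\']+)', html, re.I)}
        return {h for h in hosts if h and h not in EXPECTED_HOSTS}

    # unexpected_script_hosts("https://www.example.org/") -> set of unexpected hosts, if any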

Baiting

Baiting is like the real-world Trojan horse that uses physical media and relies on the curiosity or greed of the victim.[17] In this attack, attackers leave malware-infected floppy disks, CD-ROMs, or USB flash drives in locations people will find them (bathrooms, elevators, sidewalks, parking lots, etc.), give them legitimate and curiosity-piquing labels, and wait for victims.

For example, an attacker may create a disk featuring a corporate logo, available from the target's website, and label it "Executive Salary Summary Q2 2012". The attacker then leaves the disk on the floor of an elevator or somewhere in the lobby of the target company. An unknowing employee may find it and insert the disk into a computer to satisfy their curiosity, or a good Samaritan may find it and return it to the company. In any case, just inserting the disk into a computer installs malware, giving attackers access to the victim's PC and, perhaps, the target company's internal computer network.

Unless computer controls block the infection, insertion compromises PCs that "auto-run" inserted media. Hostile devices can also be used.[18] For instance, a "lucky winner" is sent a free digital audio player that compromises any computer it is plugged into. A "road apple" (the colloquial term for horse manure, suggesting the device's undesirable nature) is any removable media with malicious software left in opportunistic or conspicuous places. It may be a CD, DVD, or USB flash drive, among other media. Curious people take it and plug it into a computer, infecting the host and any attached networks. Again, hackers may give them enticing labels, such as "Employee Salaries" or "Confidential".[19]
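
The "auto-running" referred to above worked through an autorun.inf file in the root of the media; the snippet below is a generic illustration (the file names are invented for the example), and current Windows versions no longer honor it for USB drives:

    [autorun]
    open=salaries.exe
    icon=salaries.exe,0
    label=Executive Salary Summary

Here open= names the program launched automatically when AutoRun is enabled, while icon= and label= make the drive look like a legitimate document store.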

One study done in 2016 had researchers drop 297 USB drives around the campus of the University of Illinois. The drives contained files that linked to webpages owned by the researchers. The researchers were able to see how many of the drives had files opened, but not how many were inserted into a computer without a file being opened. Of the 297 drives that were dropped, 290 (98%) were picked up and 135 (45%) "called home".[20]
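
"Calling home" in a study like this generally means that the planted file loads a resource from the researchers' server when it is opened, so each open shows up in the server logs. A hypothetical example of such a beacon file (the host and identifier are placeholders, not the study's actual files):

    <!-- hypothetical planted HTML file: opening it fetches a tiny image from the researchers'
         server, so each open is recorded in the server logs -->
    <html><body>
      <img src="https://study.example.edu/beacon?drive=42" width="1" height="1" alt="">
    </body></html>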

Quid pro quo

Quid pro quo means something for something:

  • An attacker calls random numbers at a company, claiming to be calling back from technical support. Eventually this person will hit someone with a legitimate problem, grateful that someone is calling back to help them. The attacker will "help" solve the problem and, in the process, have the user type commands that give the attacker access or launch malware.
  • In a 2003 information security survey, 91% of office workers gave researchers what they claimed was their password in answer to a survey question in exchange for a cheap pen.[21] Similar surveys in later years obtained similar results using chocolates and other cheap lures, although they made no attempt to validate the passwords.[22]

Tailgating

An attacker seeking entry to a restricted area secured by unattended, electronic access control, e.g. by RFID card, simply walks in behind a person who has legitimate access. Following common courtesy, the legitimate person will usually hold the door open for the attacker, or the attacker may ask the employee to hold it open for them. The attacker will often pretend to be on a mobile phone call to prevent questioning by an employee. The legitimate person may fail to ask for identification for any of several reasons, or may accept an assertion that the attacker has forgotten or lost the appropriate identity token. The attacker may also fake the action of presenting an identity token.

Other types

Common confidence tricksters or fraudsters also could be considered "social engineers" in the wider sense, in that they deliberately deceive and manipulate people, exploiting human weaknesses to obtain personal benefit. They may, for example, use social engineering techniques as part of an IT fraud.

Since the early 2000s, another type of social engineering technique has been the spoofing or hacking of IDs of people who have popular e-mail accounts such as Yahoo!, Gmail, or Hotmail. Additionally, some spoofing attempts included emails from major online service providers, like PayPal.[23] This led to the "proposed standard" Sender Policy Framework (RFC 7208, April 2014), used in combination with DMARC, as a means to combat spoofing (see the example records after the list below). Among the many motivations for this deception are:

  • Phishing credit-card account numbers and their passwords.
  • Cracking private e-mails and chat histories, and manipulating them by using common editing techniques before using them to extort money and create distrust among individuals.
  • Cracking websites of companies or organizations and destroying their reputation.
  • Computer virus hoaxes
  • Convincing users to run malicious code within the web browser via a self-XSS attack to allow access to their web accounts.
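
The Sender Policy Framework and DMARC mentioned above are published by a domain's owner as DNS TXT records that receiving mail servers check before accepting a message. A minimal, hypothetical pair of records for the reserved domain example.com might look like:

    example.com.          IN TXT  "v=spf1 ip4:192.0.2.0/24 include:_spf.example.com -all"
    _dmarc.example.com.   IN TXT  "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.com"

The SPF record lists the hosts allowed to send mail for the domain ("-all" tells receivers to fail everything else), and the DMARC record instructs receivers to reject messages that fail alignment and says where to send aggregate reports.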

Another type is shoulder surfing: reading sensitive information from unshielded or unprotected displays and input devices.

Countermeasures

Organizations reduce their security risks by:

Training Employees: Training employees in security protocols relevant to their position (e.g., in situations such as tailgating, if a person's identity cannot be verified, then employees must be trained to politely refuse).

Standard Framework: Establishing frameworks of trust on an employee/personnel level (i.e., specify and train personnel when/where/why/how sensitive information should be handled)

Scrutinizing Information: Identifying which information is sensitive and evaluating its exposure to social engineering and breakdowns in security systems (building, computer system, etc.)

Security Protocols: Establishing security protocols, policies, and procedures for handling sensitive information.

Event Test: Performing unannounced, periodic tests of the security framework.

Inoculation: Preventing social engineering and other fraudulent tricks or traps by instilling a resistance to persuasion attempts through exposure to similar or related attempts.[24]

Review: Reviewing the above steps regularly: no solutions to information integrity are perfect.[25]

Waste Management: Using a waste management service that has dumpsters with locks on them, with keys to them limited only to the waste management company and the cleaning staff. Locating the dumpster either in view of employees so that trying to access it carries a risk of being seen or caught, or behind a locked gate or fence where the person must trespass before they can attempt to access the dumpster.[26]

The lifecycle of social engineering

  1. Information gathering: Information gathering is the first and foremost step of the lifecycle. It requires much patience and careful observation of the victim's habits, and involves gathering data about the victim's interests and personal information. This step determines the success rate of the overall attack.
  2. Engaging with the victim: After gathering the required amount of information, the attacker opens a conversation with the victim smoothly, without the victim finding anything inappropriate.
  3. Attacking: This step generally occurs after a long period of engaging with the target, during which information is retrieved from the target by using social engineering. In this phase, the attacker gets the results from the target.
  4. Closing interaction: This is the last step, which includes slowly shutting down the communication by the attacker without arousing any suspicion in the victim. In this way, the motive is fulfilled and the victim rarely realizes the attack even happened.[27]

Notable social engineers

Frank Abagnale Jr.

Frank Abagnale Jr. is an American security consultant known for his background as a former con man, check forger, and impostor while he was between the ages of 15 and 21. He became one of the most notorious impostors,[28] claiming to have assumed no fewer than eight identities, including an airline pilot, a physician, a U.S. Bureau of Prisons agent, and a lawyer. Abagnale escaped from police custody twice (once from a taxiing airliner and once from a U.S. federal penitentiary) before turning 22 years old.[29] The popular Steven Spielberg movie Catch Me If You Can is based on his life.

Kevin Mitnick

Kevin Mitnick is an American computer security consultant, author and hacker, best known for his high-profile 1995 arrest and subsequent five-year prison sentence for various computer and communications-related crimes.[30]

Susan Headley

Susan Headley was an American hacker active during the late 1970s and early 1980s, widely respected for her expertise in social engineering, pretexting, and psychological subversion.[31] She was known for her specialty in breaking into military computer systems, which often involved going to bed with military personnel and going through their clothes for usernames and passwords while they slept.[32] She became heavily involved in phreaking with Kevin Mitnick and Lewis de Payne in Los Angeles, but later framed them for erasing the system files at US Leasing after a falling out, leading to Mitnick's first conviction. She retired to professional poker.[33]

James Linton

James Linton is a British hacker and social engineer who in 2017 used OSINT and spear-phishing techniques to trick a variety of targets over email, including the CEOs of major banks and members of the Trump White House administration. He then went to work in email security, where he socially engineered BEC (business email compromise) threat actors to collect specific threat intelligence.

Mike Ridpath

Mike Ridpath is a security consultant, published author, speaker, and previous member of w00w00. He emphasizes techniques and tactics for social engineering cold calling, and became notable after talks in which he played recorded calls and explained his thought process about what he was doing to get passwords over the phone, as well as for his live demonstrations.[34][35][36][37][38] As a child, Ridpath was connected with the Badir Brothers and was widely known within the phreaking and hacking community for his articles in popular underground ezines, such as Phrack, B4B0 and 9x, on modifying Oki 900s, blueboxing, satellite hacking and RCMAC.[39][40]

Badir Brothers

Brothers Ramy, Muzher, and Shadde Badir—all of whom were blind from birth—managed to set up an extensive phone and computer fraud scheme in Israel in the 1990s using social engineering, voice impersonation, and Braille-display computers.[41][42]

Christopher J. Hadnagy

Christopher J. Hadnagy is an American social engineer and information technology security consultant. He is best known as the author of four books on social engineering and cyber security[43][44][45][46] and as the founder of the Innocent Lives Foundation, an organization that helps track and identify child trafficking by seeking the assistance of information security specialists, utilizing data from open-source intelligence (OSINT), and collaborating with law enforcement.[47][48]

Law

In common law, pretexting is an invasion of privacy tort of appropriation.[49]

Pretexting of telephone records

In December 2006, the United States Congress approved a Senate-sponsored bill making the pretexting of telephone records a federal felony with fines of up to $250,000 and ten years in prison for individuals (or fines of up to $500,000 for companies). It was signed by President George W. Bush on 12 January 2007.[50]

Federal legislation

The Gramm-Leach-Bliley Act of 1999 ("GLBA") is a U.S. federal law that specifically addresses pretexting of banking records as an illegal act punishable under federal statutes. When a business entity such as a private investigator, SIU insurance investigator, or an adjuster conducts any type of deception, it falls under the authority of the Federal Trade Commission (FTC). This federal agency has the obligation and authority to ensure that consumers are not subjected to any unfair or deceptive business practices. Section 5 of the US Federal Trade Commission Act (FTCA) states, in part: "Whenever the Commission shall have reason to believe that any such person, partnership, or corporation has been or is using any unfair method of competition or unfair or deceptive act or practice in or affecting commerce, and if it shall appear to the Commission that a proceeding by it in respect thereof would be to the interest of the public, it shall issue and serve upon such person, partnership, or corporation a complaint stating its charges in that respect."

The statute states that when someone obtains any personal, non-public information from a financial institution or the consumer, their action is subject to the statute. It relates to the consumer's relationship with the financial institution. For example, a pretexter using false pretenses either to get a consumer's address from the consumer's bank, or to get a consumer to disclose the name of their bank, would be covered. The determining principle is that pretexting only occurs when information is obtained through false pretenses.

While the sale of cell telephone records has gained significant media attention, and telecommunications records are the focus of the two bills currently before the United States Senate, many other types of private records are being bought and sold in the public market. Alongside many advertisements for cell phone records, wireline records and the records associated with calling cards are advertised. As individuals shift to VoIP telephones, it is safe to assume that those records will be offered for sale as well. Currently, it is legal to sell telephone records, but illegal to obtain them.[51]

1st Source Information Specialists

U.S. Rep. Fred Upton (R-Kalamazoo, Michigan), chairman of the Energy and Commerce Subcommittee on Telecommunications and the Internet, expressed concern over the easy access to personal mobile phone records on the Internet during a House Energy & Commerce Committee hearing on "Phone Records For Sale: Why Aren't Phone Records Safe From Pretexting?" Illinois became the first state to sue an online records broker when Attorney General Lisa Madigan sued 1st Source Information Specialists, Inc., a spokeswoman for Madigan's office said. The Florida-based company operates several Web sites that sell mobile telephone records, according to a copy of the suit. The attorneys general of Florida and Missouri quickly followed Madigan's lead, filing suits, respectively, against 1st Source Information Specialists and, in Missouri's case, one other records broker – First Data Solutions, Inc.

Several wireless providers, including T-Mobile, Verizon, and Cingular, filed earlier lawsuits against records brokers, with Cingular winning an injunction against First Data Solutions and 1st Source Information Specialists. U.S. Senator Charles Schumer (D-New York) introduced legislation in February 2006 aimed at curbing the practice. The Consumer Telephone Records Protection Act of 2006 would create felony criminal penalties for stealing and selling the records of mobile phone, landline, and Voice over Internet Protocol (VoIP) subscribers.

Hewlett Packard

Patricia Dunn, former chairwoman of Hewlett Packard, reported that the HP board hired a private investigation company to delve into who was responsible for leaks within the board. Dunn acknowledged that the company used the practice of pretexting to solicit the telephone records of board members and journalists. Dunn later apologized for this act and offered to step down from the board if it was desired by board members.[52] Unlike federal law, California law specifically forbids such pretexting. The four felony charges brought against Dunn were dismissed.[53]

Preventive measures

Taking some precautions reduces the risk of becoming a victim of social engineering fraud. Precautions that can be taken include the following:

  • Be aware of offers that seem "too good to be true".
  • Use multifactor authentication.
  • Avoid clicking on attachments from unknown sources.
  • Do not give out personal or financial information (such as credit card information, Social Security numbers, or bank account information) to anyone via email, phone, or text messages.
  • Use spam filter software.
  • Avoid befriending people that you do not know in real life.
  • Teach kids to contact a trusted adult in case they are being bullied over the internet (cyberbullying) or feel threatened by anything online.[54]
  • Don't make instant decisions; when possible, take five minutes to evaluate the information presented.

See also

  • Certified Social Engineering Prevention Specialist (CSEPS)
  • Code Shikara – Computer worm
  • Confidence trick – Attempt to defraud a person or group after first gaining their confidence
  • Countermeasure (computer) – Process to reduce a security threat
  • Cyber-HUMINT – Set of skills used by cyberspace hackers
  • Cyberheist
  • Inoculation theory – How people's attitudes can resist change through weak counterargument exposures
  • Internet Security Awareness Training
  • IT risk – Any risk related to information technology
  • Media pranks, which often use similar tactics (though usually not for criminal purposes)
  • Penetration test – Method of evaluating computer and network security by simulating a cyber attack
  • Phishing – Attempt to trick a person into revealing information
  • Physical information security – Common ground of physical and information security
  • Piggybacking (security)
  • SMS phishing
  • Threat (computer)
  • Voice phishing – Phishing attack via telephony
  • Vulnerability (computing) – Exploitable weakness in a computer system
  • Cyber security awareness

References

  1. Anderson, Ross J. (2008). Security engineering: a guide to building dependable distributed systems (2nd ed.). Indianapolis, IN: Wiley. p. 1040. ISBN 978-0-470-06852-6. Chapter 2, page 17
  2. "Social Engineering Defined". Security Through Education. Retrieved 3 October 2021.
  3. Lim, Joo S., et al. "Exploring the Relationship between Organizational Culture and Information Security Culture." Australian Information Security Management Conference.
  4. Andersson, D.; Reimers, K.; Barretto, C. (11 March 2014). "Post-Secondary Education Network Security: Results of Addressing the End-User Challenge". INTED2014 (International Technology, Education, and Development Conference).
  5. Schlienger, Thomas; Teufel, Stephanie (2003). "Information security culture-from analysis to change". South African Computer Journal. 31: 46–52.
  6. Jaco, K: "CSEPS Course Workbook" (2004), unit 3, Jaco Security Publishing.
  7. Kirdemir, Baris (2019). "Hostile Influence and Emerging Cognitive Threats in Cyberspace". Centre for Economics and Foreign Policy Studies.
  8. Hatfield, Joseph M (June 2019). "Virtuous human hacking: The ethics of social engineering in penetration-testing". Computers & Security. 83: 354–366. doi:10.1016/j.cose.2019.02.012. S2CID 86565713.
  9. Choi, Kwan; Lee, Ju-lak; Chun, Yong-tae (1 May 2017). "Voice phishing fraud and its modus operandi". Security Journal. 30 (2): 454–466. doi:10.1057/sj.2014.49. ISSN 0955-1662. S2CID 154080668.
  10. Austen, Ian (7 March 2005). "On EBay, E-Mail Phishers Find a Well-Stocked Pond". The New York Times. ISSN 0362-4331. Retrieved 1 May 2021.
  11. Steinmetz, Kevin F.; Holt, Thomas J. (5 August 2022). "Falling for Social Engineering: A Qualitative Analysis of Social Engineering Policy Recommendations". Social Science Computer Review: 089443932211175. doi:10.1177/08944393221117501. ISSN 0894-4393. S2CID 251420893.
  12. The story of HP pretexting scandal with discussion is available at Davani, Faraz (14 August 2011). "HP Pretexting Scandal by Faraz Davani". Retrieved 15 August 2011 via Scribd.
  13. "Pretexting: Your Personal Information Revealed", Federal Trade Commission
  14. Fagone, Jason (24 November 2015). "The Serial Swatter". The New York Times. Retrieved 25 November 2015.
  15. "The Real Dangers of Spear-Phishing Attacks". FireEye. 2016. Retrieved 9 October 2016.
  16. "Chinese Espionage Campaign Compromises Forbes.com to Target US Defense, Financial Services Companies in Watering Hole Style Attack". invincea.com. 10 February 2015. Retrieved 23 February 2017.
  17. "Social Engineering, the USB Way". Light Reading Inc. 7 June 2006. Archived from the original on 13 July 2006. Retrieved 23 April 2014.
  18. "Archived copy" (PDF). Archived from the original (PDF) on 11 October 2007. Retrieved 2 March 2012.{{cite web}}: CS1 maint: archived copy as title (link)
  19. Conklin, Wm. Arthur; White, Greg; Cothren, Chuck; Davis, Roger; Williams, Dwayne (2015). Principles of Computer Security, Fourth Edition (Official Comptia Guide). New York: McGraw-Hill Education. pp. 193–194. ISBN 978-0071835978.
  20. Raywood, Dan (4 August 2016). "#BHUSA Dropped USB Experiment Detailed". info security. Retrieved 28 July 2017.
  21. Leyden, John (18 April 2003). "Office workers give away passwords". The Register. Retrieved 11 April 2012.
  22. "Passwords revealed by sweet deal". BBC News. 20 April 2004. Retrieved 11 April 2012.
  23. "Email Spoofing – What it Is, How it Works & More - Proofpoint US". www.proofpoint.com. 26 February 2021. Retrieved 11 October 2021.
  24. Treglia, J., & Delia, M. (2017). Cyber Security Inoculation. Presented at NYS Cyber Security Conference, Empire State Plaza Convention Center, Albany, NY, 3–4 June.
  25. Mitnick, K., & Simon, W. (2005). "The Art of Intrusion". Indianapolis, IN: Wiley Publishing.
  26. Allsopp, William. Unauthorised access: Physical penetration testing for it security teams. Hoboken, NJ: Wiley, 2009. 240–241.
  27. "social engineering – GW Information Security Blog". blogs.gwu.edu. Retrieved 18 February 2020.
  28. Salinger, Lawrence M. (2005). Encyclopedia of White-Collar & Corporate Crime. SAGE. ISBN 978-0-7619-3004-4.
  29. "How Frank Abagnale Would Swindle You". U.S. News. 17 December 2019. Archived from the original on 28 April 2013. Retrieved 17 December 2019.
  30. "Kevin Mitnick sentenced to nearly four years in prison; computer hacker ordered to pay restitution to victim companies whose systems were compromised" (Press release). United States Attorney's Office, Central District of California. 9 August 1999. Archived from the original on 13 June 2013.
  31. "DEF CON III Archives – Susan Thunder Keynote". DEF CON. Retrieved 12 August 2017.
  32. "CDNE Chapter 14 - Female Hackers?". Archived from the original on 17 April 2001. Retrieved 6 January 2007.
  33. Hafner, Katie (August 1995). "Kevin Mitnick, unplugged". Esquire. 124 (2): 80(9).
  34. Social Engineering: Manipulating the human. Scorpio Net Security Services. 16 May 2013. ISBN 9789351261827. Retrieved 11 April 2012.
  35. Niekerk, Brett van. "Mobile Devices and the Military: useful Tool or Significant Threat". Proceedings of the 4Th Workshop on Ict Uses in Warfare and the Safeguarding of Peace 2012 (Iwsp 2012) and Journal of Information Warfare. academia.edu. Retrieved 11 May 2013.
  36. "Social Engineering: Manipulating the human". YouTube. Retrieved 11 April 2012.
  37. "BsidesPDX Track 1 10/07/11 02:52PM, BsidesPDX Track 1 10/07/11 02:52PM BsidesPDX on USTREAM. Conference". Ustream.tv. 7 October 2011. Archived from the original on 4 August 2012. Retrieved 11 April 2012.
  38. "Automated Social Engineering". BrightTALK. 29 September 2011. Retrieved 11 April 2012.
  39. "Social Engineering a General Approach" (PDF). Informatica Economica journal. Retrieved 11 January 2015.
  40. "Cyber Crime". Hays. Retrieved 11 January 2020.
  41. "Wired 12.02: Three Blind Phreaks". Wired. 14 June 1999. Retrieved 11 April 2012.
  42. "Social Engineering A Young Hacker's Tale" (PDF). 15 February 2013. Retrieved 13 January 2020. {{cite journal}}: Cite journal requires |journal= (help)
  43. "43 Best Social Engineering Books of All Time". BookAuthority. Retrieved 22 January 2020.
  44. \ (31 August 2018). "Bens Book of the Month Review of Social Engineering The Science of Human Hacking". RSA Conference. Retrieved 22 January 2020.{{cite web}}: CS1 maint: numeric names: authors list (link)
  45. "Book Review: Social Engineering: The Science of Human Hacking". The Ethical Hacker Network. 26 July 2018. Retrieved 22 January 2020.
  46. Hadnagy, Christopher; Fincher, Michele (22 January 2020). "Phishing Dark Waters: The Offensive and Defensive Sides of Malicious E-mails". ISACA. Retrieved 22 January 2020.
  47. "WTVR:"Protect Your Kids from Online Threats"
  48. Larson, Selena (14 August 2017). "Hacker creates organization to unmask child predators". CNN. Retrieved 14 November 2019.
  49. Restatement 2d of Torts § 652C.
  50. "Congress outlaws pretexting". 109th Congress (2005–2006) H.R.4709 – Telephone Records and Privacy Protection Act of 2006. 2007.
  51. Mitnick, K (2002): "The Art of Deception", p. 103 Wiley Publishing Ltd: Indianapolis, Indiana; United States of America. ISBN 0-471-23712-4
  52. Shankland, Stephen (8 September 2006). "HP chairman: Use of pretexting 'embarrassing'". CNET News.com.
  53. "Calif. court drops charges against Dunn". CNET. 14 March 2007. Retrieved 11 April 2012.
  54. "What is Social Engineering | Attack Techniques & Prevention Methods | Imperva". Learning Center. Retrieved 18 February 2020.

Further reading

  • Boyington, Gregory. (1990). 'Baa Baa Black Sheep' Published by Gregory Boyington ISBN 0-553-26350-1
  • Harley, David. 1998 Re-Floating the Titanic: Dealing with Social Engineering Attacks EICAR Conference.
  • Laribee, Lena. June 2006 Development of methodical social engineering taxonomy project Master's Thesis, Naval Postgraduate School.
  • Leyden, John. 18 April 2003. Office workers give away passwords for a cheap pen. The Register. Retrieved 2004-09-09.
  • Long, Johnny. (2008). No Tech Hacking – A Guide to Social Engineering, Dumpster Diving, and Shoulder Surfing Published by Syngress Publishing Inc. ISBN 978-1-59749-215-7
  • Mann, Ian. (2008). Hacking the Human: Social Engineering Techniques and Security Countermeasures Published by Gower Publishing Ltd. ISBN 0-566-08773-1 or ISBN 978-0-566-08773-8
  • Mitnick, Kevin, Kasperavičius, Alexis. (2004). CSEPS Course Workbook. Mitnick Security Publishing.
  • Mitnick, Kevin, Simon, William L., Wozniak, Steve. (2002). The Art of Deception: Controlling the Human Element of Security Published by Wiley. ISBN 0-471-23712-4 or ISBN 0-7645-4280-X
  • Hadnagy, Christopher, (2011) Social Engineering: The Art of Human Hacking Published by Wiley. ISBN 0-470-63953-9
  • N.J. Evans. (2009). "Information Technology Social Engineering: An Academic Definition and Study of Social Engineering-Analyzing the Human Firewall." Graduate Theses and Dissertations. 10709. https://lib.dr.iastate.edu/etd/10709
  • Z. Wang, L. Sun and H. Zhu. (2020) "Defining Social Engineering in Cybersecurity," in IEEE Access, vol. 8, pp. 85094-85115, doi: 10.1109/ACCESS.2020.2992807.