Thought leadership

Why do people put data at risk? Exploring the psychology behind your data breaches

by Egress
Published on 7th Aug 2020

Despite all the changes in technology over the years, understanding human behaviour remains key to controlling data breaches within your organisation. Mistakes can happen in any situation, so risk will always be present – especially as humans are prone to acting unpredictably at the best of times.

Lisa Forte, Social Engineering & Insider Threat Expert, Cyber Security Speaker and Vlogger, sat down to share her insights with Egress’ CEO Tony Pepper in a recent webinar. Together, they broke down some of the common reasons why employees cause data breaches and explored strategies to tackle them.

Here’s our round-up of the highlights from that discussion:

Moderator: Our subject today is insider data breaches, and I’m joined by Lisa Forte, Social Engineering & Insider Threat Expert, Cyber Security Speaker and Vlogger, and Egress CEO Tony Pepper. We often tend to look at insider data breaches and simply chalk them up to somebody rushing or sending an email to the wrong person – but actually, the real cause lies far deeper within an individual’s psychology and their behaviour at the time.

This is a topic Lisa has spent time analysing and advising on – so Lisa, can you start by giving us an overview of insider data breaches, please?

Lisa Forte: Thank you for the introduction. I think the first thing to understand is that, as a society, we all tend to misunderstand insider threats a little bit. When we look at what the scientific community has discovered from a psychological perspective and what we know from the legal community in terms of culpability, we can identify four real categories of insider data breach. First, you have purely accidental mistakes, where there’s no culpability. Then we have negligence, where there’s more culpability; recklessness, where there’s a bit more again; and finally intent, where a person deliberately sets out to attack your organisation.

The key difference between these categories is the level of culpability, and we can bring them to life with four characters. The first is ‘Keen Katherine’ – a character I feel I might’ve been described as during my career. Perhaps they’re new to the company and focused on doing well, or maybe they want a promotion. With that comes a strong focus on productivity, so they might make a lot of security mistakes despite meaning well. Next is someone I’m sure we can all relate to: ‘Tired Tim’. Now, Tired Tim is very busy – not just at work but maybe with things going on at home too. He’s one of those people who gets behind because he always has a million things to do, so when he sees something like a 20-minute training session come up, he misses it – and then two days later falls victim to a phishing email.

Next we have ‘Risky Raj’. This is someone who doesn’t set out with the intention to harm the company, but is still a risk-taker. Now, we’ve all met people at work who cut corners or ignore the rules to get work done more quickly.

The final case, and perhaps the one we’d usually associate with the word ‘insider’, is intent. I’ve done a lot of work on intentional insiders and the issues they cause organisations. For this case we’ll look at ‘Sneaky Sarah’ – an employee who’s planning on moving companies, so she’s thinking of forwarding the client list to herself in case it comes in useful in her new role. Now, obviously, at this point she knows she’s not supposed to be doing that, so she’s committing an intentional data leak.

So those are the characters, but we also need to understand the psychology of why this happens. One of the things we need to look at is called the Dunning-Kruger effect, which tells us that people who don’t know very much about something tend to totally overestimate how much they do know. Now, the problem with this is that even if you do just a little bit of training with your staff, they might then think that they know absolutely everything about cybersecurity.

We also need to understand cognitive dissonance, which is when we know we’re supposed to do one thing, but we’d like to do another. So, for instance, we might know that we’re supposed to encrypt an email, but we also know that it will make our jobs faster and easier to not encrypt it, so we make excuses.

One more thing to think about within the UK is that the default legal position is that everything you do during work hours is owned by your employer. It’s not owned by you, so it needs to stay with your employer when you leave. I actually read a statistic that said in the UK we have the worst rate of this type of data theft in the world. I think this is partly because we all move jobs so frequently, so it’s definitely something to be aware of and reiterate to your staff. 

Moderator: Brilliant, thank you Lisa. It’s really interesting to understand why people cause data breaches as a first step to mitigating insider risk. I’m going to pass over to Tony now to look at the role of technology in preventing insider data breaches.

Tony Pepper: Thanks Lisa for that insight into the minds of people we all interact with on a daily basis. It’s only when you actually get inside the minds of the people in a business that you can understand where the problems really are. What I’m going to touch on is where technology can play a vital role in addressing this risk.

Technology can play a very powerful role in augmenting training. But unlike training, which is typically carried out periodically or in discrete sessions, we feel that technology works constantly. It’s a layer of protection wrapped around people, helping them to avoid mistakes and guiding them towards better decisions when they’re about to do something they’re not supposed to.

The world is a completely different place today. The technology we now have to deliver this kind of proactive protection just wasn’t available 10 years ago – or even five years ago! Advances in technology such as machine learning and AI have given companies like Egress the ability to do these things – and that’s why technology is in such an exciting place right now.

Next, let me touch on a few of the business problems technology like this can actually solve today. Firstly, as Lisa quite rightly said, in today’s environment insider threat takes many different forms. We feel that email is the most exposed channel for insider threat, simply because it’s a ‘tunnel’ directly into your business. As a result, it’s the riskiest component for your business, because it’s so easy for employees to communicate, both within the organisation and with external partners and suppliers. It’s easy to use, but it’s also so easy to make a mistake!

I think it’s interesting that over the last 10 years, email has gone from being an enabling technology for communication to one that many companies are now trying to secure and control. This could be through boundary technology, DLP, or controls delivered at the desktop to try to manage information that’s leaving the business.

The sad reality is that the legacy technologies, like DLP, that address this risk just don’t work. That’s the place we find ourselves today: this tunnel into our business is ultimately a productivity tool, but it’s now also our biggest threat. At Egress, we’ve focused our entire attention on overlaying email environments with intelligent security to tackle these problems.

Moderator: Brilliant, thank you Tony. As we mentioned at the start of the webinar, we’ve got some time now for questions. One person has asked: “When companies impose excessive security, that goes above and beyond what’s required, and ultimately stops people doing their jobs effectively, does that encourage recklessness?”

Lisa Forte: Yeah, you’re right. There’s something we call decision latitude, which is essentially how much freedom you give your employees to decide how to do their job. Now, if you give someone a lot of decision latitude, you’ve given them a lot of freedom to maybe use their own Google Docs or whatever app they want, which obviously isn’t great for security because it becomes a bit of a free-for-all. However, if you allow a very low level of decision latitude – i.e. if you’re overly restrictive – you also get that corner-cutting, that recklessness, because you’ve made it so hard for employees to be productive. So it’s really important to remember that security first and foremost has to support business functions – it’s not there to restrict how people work, it’s there to help you do your job properly. If you’re implementing anything that doesn’t do that, you’re making a really serious mistake.

Tony Pepper: I’d agree with that, Lisa, and I’ll actually share a memory of mine. I’m going to take you back to 2002, 2003 – long enough ago, hopefully, that I won’t get into too much trouble for sharing it! In a previous business, we used to work extensively with the UK MoD, and there was obviously an appetite to ensure a very high degree of protection. In this case it related to logging into laptops, where users had to enter two randomly generated passwords just to log in. So I visited a base, and what did I see under every laptop? Post-it notes with the passwords written down, because nobody could remember them!

I think, thankfully, those days are behind us. If you look at government agencies’ approach to risk and security, we have come a long, long way, and there’s much more sensitivity around the balance of risk and productivity. It’s all about appropriate levels of security – appropriate to the content, and appropriate to the risk. It’s not a one-size-fits-all solution.

Watch the full webinar to hear all of Tony and Lisa’s insights into insider threats and how the panel responded to the audience’s questions.