In recent days, there’s been a lot of commentary (in news stories, by pundits, on social media, and in online forums) about how appalling it is that MPs are sharing passwords with their staff.
Amid all that chatter, on a forum that I follow, John Custy (@ITSMNinja) made an interesting observation:
“… I've seen enough bureaucracy to know that sometimes people give up trying to do things the right way, and need to take the path of least resistance to get their job done … sometimes we need to understand why people behave the way they do, not just apply our experiences…”
This really struck a chord with me. It’s easy for IT consultants and experts to judge others who are less technically savvy. This is NOT a process issue. This is a classic example of trying to impose top-down control without responding to the patterns of behaviour that emerge as a result.
Later that day, I happened to be speaking to some in-house RESILIA™ experts, and they summed up the challenge in the following four points:
- When IT is not designed around its users, workarounds and poor habits form
- IT security is as much about people and their behaviours as any technology
- People can be your strongest defence, if trained properly
- Culture plays a huge part in the overall resilience of a business
To be clear, I’m not excusing bad behaviour. Information Security is everyone’s responsibility! At the same time, I have sympathy for people trying to work with clearly inadequate tools and training.
I don’t have any inside knowledge of how Parliament’s IT department designed and implemented its security controls, but I would frame my investigations and analysis by taking a cue from ITIL® Practitioner’s nine guiding principles:
- Design for experience: clearly, the experience of complying with IT security controls was less than satisfactory. What can be done to make IT systems and controls provide a better user experience?
- Observe directly: did anyone from the IT department observe how MPs and their staff work, and design controls with minimal friction? Did anyone observe the practice of sharing passwords (and I’m willing to bet it was an open secret), and if so, what corrective action was taken to reinforce existing controls or to design better ones?
- Progress iteratively: were these controls designed in isolation and rolled out en masse? Could they have been applied iteratively, so their impact on people’s behaviours and productivity could be observed and corrections made along the way?
- Collaborate: did the IT department work with non-IT staff to find a balance of controls and usability?
- Keep it simple: to IT experts, the controls we put in place might appear simple, but that might not be the case for non-IT folk, who feel the controls get in their way
- Work holistically: did the IT department think of the entire system of behaviours and business processes when designing its controls? Or did it apply point solutions in silos?
- Focus on value: value might have been provided to the information security and compliance officers, but what value was provided to end users?
- Be transparent: did end users understand why IT services and systems were designed the way they were? Were visibility, feedback, and learning (from using the system or from security incidents) encouraged?
- Start where you are: we now have an IT crisis on our hands. The IT department needs to begin correcting the issues found in its controls, and to work more closely with its end users to ensure that similar issues do not recur. There’s a wonderful assertion in Jeff Patton’s book User Story Mapping: ‘Shared documentation is not the same as shared understanding’. There’s no point reverting to a thick binder of previously documented controls – they haven’t worked!
I would also encourage members of Parliament’s IT department to read this 2007 Harvard Business Review article. The department seems to be in a situation similar to that faced by the Chicago Police Department as described in the article.
All the best!