You Can’t Patch People

December 29, 2012 at 5:06 pm | Posted in Data Protection, Endpoint Security, Human Factors in Security | Leave a comment

In a recent blog post, Bruce Schneier highlighted how a commercially available, low-cost (around £200) forensics tool can crack passwords for common commercial whole disk encryption products.

As I mentioned in a previous post, use of PGP Desktop to encrypt all laptop disks is compulsory at IBM and is enforced through our end-user computing standards.

The default power management configuration for laptops often just suspends the machine when the lid is closed or the ‘sleep’ button is pressed.  Unless the user selects ‘hibernate’ instead, the encryption keys remain in memory and the disk contents are effectively unprotected.  Our standards dictate that laptop configuration should be changed to hibernate in these circumstances, but how many users actually make the necessary changes?
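As a hedged illustration (Windows-specific, and worth verifying against your own power scheme), the built-in `powercfg` utility can remap the lid switch and sleep button to hibernate:

```shell
# Illustrative sketch: set the lid-close and sleep-button actions to
# Hibernate (value 2) for both mains (AC) and battery (DC) power,
# then apply the updated scheme.
powercfg /setacvalueindex SCHEME_CURRENT SUB_BUTTONS LIDACTION 2
powercfg /setdcvalueindex SCHEME_CURRENT SUB_BUTTONS LIDACTION 2
powercfg /setacvalueindex SCHEME_CURRENT SUB_BUTTONS SBUTTONACTION 2
powercfg /setdcvalueindex SCHEME_CURRENT SUB_BUTTONS SBUTTONACTION 2
powercfg /setactive SCHEME_CURRENT
```

Pushing a change like this through central configuration management, rather than relying on each user, is exactly the point of the standards mentioned above.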

The comprehensive help documents provided by IBM for configuring the whole disk encryption software step the user through making a ‘rescue disk’ to allow recovery in the event of a lost encryption password. So, how many users take any precautions to protect that?

Going back to the potential attack against whole disk encryption: it relies on the attacker being able to recover the encryption key from memory dumps or hibernation files captured while the disk is unlocked.  Of course, if the laptop is always left safe (i.e. powered down or at least hibernating) then that attack vector isn’t available.  However, how many users leave their laptop unattended and logged in when they believe the environment is ‘safe’?  And how many leave their laptop unattended before the hibernation process has completed?

The common thread through all of this is that careless users can inadvertently cancel out any benefit from technical countermeasures.  It’s simple enough to describe the exact behaviour that will prevent this.  In UK public sector security, we call this Security Operating Procedures, or SyOPs for short.

It’s usual to define the IT security risk management process as starting with risk assessment to select the right security controls, followed by incident management to deal with residual risk, invoking crisis management and BCP when required to recover from the most severe incidents.  I strongly believe that SyOP production and security awareness training for end users must form part of that process, and must be in place before a service is activated, to ensure that the security controls operate as designed and to defend against the sort of attack described here.

As I said in the title, users are the one part of the system that can’t be patched to remove vulnerabilities.  It’s vital to explain why what we ask them to do matters, and then to reinforce that through adherence to mandatory written instructions, in order to establish the ‘habit’.

Do I know you?

May 15, 2011 at 12:10 am | Posted in Human Factors in Security | 2 Comments

“What the middle-aged Tory minister said to the young blonde Labour MP in a lift”

The Times 13 May 2011


It’s been a pretty hectic time for me work-wise recently – you may have noticed from the tumbleweed blowing through this blog in recent weeks!  But, after a concerted push to get some deliverables out, I finally found myself working from home today, with a little less pressure than normal.  So, I decided to set myself up for the day with an early morning trip to my favourite coffee shop, for a cappuccino (skinny, of course!) and a chance to read the newspaper in peace.

So it was that I found myself reading in the Times about a minor spat between two Members of Parliament.  In a nutshell, a senior (male) MP challenged a young woman he encountered in a restricted area, on the basis that “Well, I thought you looked too young to be an MP”.  He challenged her to produce her pass, which she did.  Awkward.  Now, I don’t intend to defend the MP’s possibly boorish manner (after all, it seems he has form when it comes to acerbic remarks).  Equally, it seems at least possible that the younger (newly elected) MP might have been less than cooperative when challenged.  So, all in all, a storm in a teacup, but it reminded me of a serious point.

Must we wear photo passes?

Regular readers will know that I work for IBM where, in common with all technology-based organisations and many large organisations of all types, it’s mandatory for all staff to have a pass to gain access to and move around the company sites.  These access passes form a key component of physical access control systems and even, in more advanced deployments, provide strong authentication for access to computer systems.  They also generally display a photo of the holder and their name.  The idea is that the most basic element of physical security is for those in a restricted area to be aware of who should be present and who shouldn’t.

In modern organisations, staff often visit their “home” office only infrequently.  Equally, the number of staff in any one location is often very large.  As I wrote in a previous post, Dunbar’s Number suggests that we have difficulty keeping track of a circle of acquaintances numbering more than (say) 150.  This is, in large part, the reason behind photographic ID.  I’m sure IBM is not alone in insisting that these badges are worn in plain sight by all staff at all times.

They also help in avoiding embarrassing situations like the newspaper story with which I opened.  “Tailgating” is frowned upon at card-operated doors, and clearly visible photo ID makes it easier for security staff to detect.  It’s everyone’s responsibility – and should be drummed into new staff through security awareness training – to be aware of who is in the area and to confirm their right to be there.  We also have to be prepared to challenge anyone not displaying the correct pass, though hopefully showing a little more tact than the Tory MP.


Protecting Data Outside the Office

September 4, 2010 at 11:58 pm | Posted in Data Protection, Human Factors in Security | 1 Comment

A recent article in the Times caught my eye.  It was discussing the notion of “extreme jobs”.  I think most of us can agree with the idea that there’s been an inexorable increase in the pressure on us to always be available, working longer and longer hours and still prepared to answer the mobile phone to a customer or the boss late into the night, at weekends and even on holiday.

Coupled with the ready availability of increasingly sophisticated mobile technology, it’s inevitable that many of us will take work home with us, or at least, outside the safety of the office environment.  For many of us, that means we’re taking with us sensitive information and the consequences of the loss of that data could be catastrophic.

One of my current tasks is preparing security awareness training for colleagues working on a large Public Sector bid.  We’ll be delivering this training to highly skilled and very experienced IT professionals, but looking around, I’m reminded that what is obvious and necessary to a security specialist is often at best an annoying distraction to others.  We all have to remember that mishandling sensitive information can have grave contractual and even legal consequences both for an individual and for their employer.  So, take a look at these 5 simple precautions, to make sure it’s not you that makes the headlines.

1:  Pay attention to the physical security of your laptop while travelling

Any attempt to work outside the office almost inevitably means taking a laptop, loaded with project data (including sensitive commercial and even personal data), with you while you’re travelling.  No matter how you travel, it’s bound to present plenty of opportunities for your laptop to be lost or stolen.  It’s fair to assume that, generally, the motive for theft is to sell the laptop on, rather than a concerted attempt to obtain any data stored on it.  However, you should take reasonable care not to advertise that you might be a valuable target.  Don’t, for example, wear your company pass outside the building.  The risk is greatest when you have to leave the laptop unattended:

  • While driving, keep the laptop out of sight, in the boot of your car.
  • When staying in a hotel, keep the laptop in a safe, if one is provided in your room.
  • When using the laptop in a public place, secure the laptop with a Kensington lock.

2:  Use whole disk encryption to protect your data

If your laptop is lost or stolen, the cost of replacing the hardware is relatively minor – and it’s insured anyway, isn’t it?  The real cost of the incident is the loss or disclosure of sensitive information stored on the laptop.  To protect against this, you should install whole disk encryption software.  This ensures that all the data on the laptop’s disk is encrypted when the laptop is shut down.  Only when the laptop is powered up and the authorised user completes pre-boot authentication is the disk data decrypted and available for use.  Commercial software is available from a number of well-known vendors, including PGP and DESlock.  You should bear in mind that, unless care is taken, even the authorised user may be unable to decrypt the data on the disk.  You should make sure that:

  • You run the operating system’s disk maintenance utilities to defragment the disk and to check and mark any bad areas;
  • You make a full backup of the disk volume(s) before installing the encryption software;
  • When the install process offers to create Emergency Recovery Information, you write that ERI to a CD or other removable medium and store it somewhere safe;
  • Most importantly, you remember that the encryption software only takes effect when the laptop is shut down or hibernated.  You should never travel with your laptop in standby.

3:  Protect yourself against eavesdropping when working in public places

One of my favourite tech commentators is Peter Cochrane, who writes a regular column for Silicon.com.  Earlier this year, Peter reported on how easy it was to collect sensitive information from fellow travellers on the train.  Anyone who travels regularly on commuter train services will be familiar with indiscreet conversations and (even worse) one-sided mobile phone conversations that give away far more sensitive information than they should.  Do resist the temptation to discuss sensitive matters in public places and try to curtail calls to your mobile until you can find somewhere more private.

Back to Peter Cochrane again.  During his frequent air travel, he noticed people using mobile phones to photograph – or even video – the screens of other people’s laptops.  His blog shows how it’s possible (given enough patience and a bit of experimenting) to get a reasonable picture of someone’s laptop screen.  This situation is easily fixed for a modest outlay, through the use of a privacy screen.  These clip over the laptop screen and make it impossible to read the screen unless you’re directly in front of it.  They work along the same lines as polarising sunglasses – do make sure they’re fitted the right way round.

4:  If you must use removable media, take extra care

It’s almost an immutable law of nature that, if you copy sensitive data to removable media, eventually, that media is going to get lost.  The simplest remedy of course is not to use removable media.  My current employer bans the use of these devices on Public Sector projects and, at one time, at least one UK government department filled the USB ports of laptops with superglue, to be absolutely sure.  Of course, a blanket ban isn’t always practicable, so, if you do need to use a memory stick, removable drive or similar, here are a few suggestions:

  • Don’t ever allow the use of personal removable devices – you have no idea how or where they’ve been used before or will be next
  • Have a pool of memory sticks for your project, clearly marked and with some sort of unique identifier.  Make team members check them in and out (with a signature) when they need them and make sure that missing or overdue devices are always followed up immediately.
  • Always encrypt the device.  As we discussed earlier in this article, the use of whole disk encryption when dealing with sensitive information is absolutely vital.  So, if all your team members have the capability, it’s crazy not to use it for removable devices as well.
  • It’s well worth the effort to select only the minimum amount of data for copying onto the removable media.  It might be quicker to export the whole contents of a database, but you must do everything in your power to limit the potential loss.
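The check-in/check-out idea above can be sketched in a few lines of code.  This is purely illustrative – the device IDs, ledger file name and class are my own invention, not any real tracking system:

```python
# A minimal sketch of a removable-media pool with a signed-out ledger.
# Names and identifiers here are hypothetical examples.
import csv
from datetime import datetime

LEDGER = "usb_ledger.csv"  # hypothetical audit trail file

class DevicePool:
    def __init__(self, device_ids):
        # Map each uniquely identified device to its current holder (or None).
        self.devices = {d: None for d in device_ids}

    def check_out(self, device_id, holder):
        if self.devices.get(device_id) is not None:
            raise ValueError(f"{device_id} is already out with {self.devices[device_id]}")
        self.devices[device_id] = holder
        self._log("OUT", device_id, holder)

    def check_in(self, device_id):
        holder = self.devices.get(device_id)
        if holder is None:
            raise ValueError(f"{device_id} is not checked out")
        self.devices[device_id] = None
        self._log("IN", device_id, holder)

    def overdue(self):
        # Devices still signed out: these are the ones to chase up immediately.
        return [d for d, h in self.devices.items() if h is not None]

    def _log(self, action, device_id, holder):
        # Append a timestamped entry to the ledger file.
        with open(LEDGER, "a", newline="") as f:
            csv.writer(f).writerow([datetime.now().isoformat(), action, device_id, holder])

pool = DevicePool(["USB-001", "USB-002", "USB-003"])
pool.check_out("USB-001", "A. Smith")
print(pool.overdue())  # -> ['USB-001']
```

Even a trivial register like this makes the "always followed up immediately" rule enforceable, because the list of outstanding devices is available at a glance rather than reconstructed from memory.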

5:  Always use a secure connection over public networks

Finally, when you’re out of the office and you need to work, be careful to secure your communications.  Assume that all networks (in hotels or other public spaces, in customer sites and even at home) are hostile.  Always use a Virtual Private Network (VPN) connection to encrypt all your traffic when connecting to your organisation’s intranet from outside and never use a public computer or your home computer to connect to the intranet.
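As a minimal sketch, assuming an OpenVPN-based service (the profile name is a placeholder, and your organisation may well use a different VPN product entirely):

```shell
# Hypothetical example: bring up the tunnel first, then connect to the intranet.
# 'client.ovpn' stands in for whatever connection profile you are issued.
sudo openvpn --config client.ovpn --auth-user-pass
```

The important habit is the ordering: no intranet traffic until the tunnel is up, regardless of which client you use.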

So, to summarise, a mixture of sensible procedural precautions, together with a few simple and inexpensive technical additions can do much to control the risks of taking sensitive information outside the normal office environment.  These measures might be a little inconvenient, but they will go a long way to ensuring that you’re not the one found responsible for a data loss, resulting in massive reputational damage, the loss of contracts and potentially huge fines for your employer.

Tell the Truth …

March 12, 2010 at 8:31 am | Posted in Human Factors in Security, Incident Response | 2 Comments

… and shame the Devil, as I was often told as a child.  Sound advice you’d think, but in the world of IT Security such honesty could cost you your job.  I was alerted on Twitter by Kai Wittenburg to the story of Pennsylvania’s CISO Robert Maley.  According to the story on Computerworld’s web site, Maley was fired by his employer, apparently after commenting on a security incident during the RSA show.  The reason given for his dismissal was that he failed to get the proper approvals before making his comments.  The incident in question appears to have been a vulnerability in a scheduling system at the Department of Transport.  The Department denies that any hacking or breach was involved in the incident, but details have been handed over to the State Police for investigation.  This furore is taking place against a backdrop of cuts of 38% in IT security budgets and 40% in staffing.

Chances are, Maley’s employer does insist on rigid prior approval for this sort of thing.  It’s all part of the culture of secrecy around security incidents that’s endemic in large organisations.  The immediate effect is to make it more difficult for all of us to get budgets approved for security programmes.  Faced with yet another capital expenditure request for an IT security programme, the CEO will say “…but, if this threat is real, why don’t I ever read about it in the Press?”  Answer: because far too many organisations follow the lead of the Commonwealth of Pennsylvania and deny everything.

And there’s another consequence of not discussing these incidents – we don’t learn from them.  In his book “Managing the Human Factor in Information Security”, David Lacey describes how the aviation industry has systematically and ruthlessly pursued safety through a combination of mandatory incident reporting and thorough investigation of “near misses”.  Any major incident is the result of a series of cascading failures.  If any one element holds up under pressure, then the disaster is averted.  However, there are still a whole load of individual failures to be investigated and lessons to be learned.  Next time, you might not be so lucky.

As our world becomes ever more dependent upon on-line systems, so the impact of security incidents will become ever greater.  Unless we allow – even encourage – IT security professionals to follow Maley’s example and openly discuss these incidents, how can we ever hope to improve?


Identity Economics: No Tech Required – yet!

January 20, 2010 at 3:09 pm | Posted in Human Factors in Security | 2 Comments

From the age of 16, for the next 15 years, I served in the Royal Navy.  Like all uniformed military organisations, the Navy makes learning the etiquette attached to membership a vital part of the induction process.  I don’t just mean the rules necessary for large and (at that time) wholly male groups to live and work in extremely close proximity, away from their families for long periods.  Nor do I just mean the discipline on which lives can depend in a fighting force.  Finally, I don’t just mean the quaint and unique traditions that come from 500 years of history.  What I mean is the way in which servicemen (and women) are expected to dress (both in and out of uniform) and to behave (whether on duty or not), particularly when in the view of the general public.

The pressure to conform to these standards (which generally far exceed the norms for society) is immense and is imposed by one’s peers, not through the hierarchy.  Having said that though, the lessons a 16 year-old learns from a Gunnery Instructor tend to stay learned for life!  A good example is the practice of saluting.  Saluting is always a mark of respect to the Monarch.  So, we face the mast and salute at morning Colours and at evening Sunset, and we face the ensign and salute as we board the ship or go ashore.  And we salute officers, because they hold the Queen’s Commission and that’s what we’re acknowledging, not the individual.  To illustrate that point: from their inception in November 1917, the Women’s Royal Naval Service (WRNS) were not formally part of the Royal Navy, having their own rules and organisation.  WRNS officers did not hold a commission and thus Royal Naval personnel were not required to salute them.  This all changed on 1 July 1977, when the WRNS became subject to the Naval Discipline Act.

Why am I telling this long-winded story?  Well, although I left the Navy nearly 30 years ago, MrsV1951 and I still live in a naval town, so seeing uniformed RN personnel in the town centre is a common occurrence.  A few days ago, in search of sanctuary and free wi-fi, I was headed to a local coffee shop and I happened to be following a naval officer, in uniform.  Coming in the opposite direction were two naval ratings, also in uniform.  They passed without even acknowledging each other’s presence, much less saluting.  I was incensed, not just by this, but by the fact that the ratings were wearing their blue denim working uniforms (never, ever worn ashore in my day) and the officer was drinking Cola from a McDonalds cup as he walked!  Why was I so annoyed?  Maybe I’m just becoming a curmudgeon (I’m certainly old enough to qualify).

And then, today, an article in the Times by Daniel Finkelstein shed some light on my disquiet.  Finkelstein was discussing how group identity has an impact on how we behave.  This phenomenon has attracted the attention of the Nobel Prize-winning economist George Akerlof.  Together with Rachel Kranton, he developed the idea of Identity Economics.  The central concept is that we adopt an identity to fit in with our peer group and that preserving that identity is one of our major economic drivers.  In their book “Identity Economics: How Our Identities Shape Our Work, Wages, and Well-Being” (to be published next month), they describe how the Armed Forces successfully exploit this behaviour to make service personnel adopt the identity of the service to build team spirit and morale – all the attributes that make every serviceman and woman determined to do their best for their colleagues every time.  And they know that their colleagues will do the same – essential in the face of extreme danger (I served much of my time in submarines, where extreme danger was always close by, though rarely due to hostile action).  So, maybe that explains my annoyance.  What I saw was members of a peer group of which I am (subconsciously?) still a member not obeying what I think are the norms of group behaviour.  If Akerlof is right, then I see that (subconsciously?) as a threat to my identity.

So, finally, what’s all this got to do with Identity Management?  Well, it seems to me that some of the more perceptive commentators in the security industry, including David Lacey and Bruce Schneier, are saying that the real challenge for security professionals is to address the behaviour of the humans in the system.  And, if Akerlof is right, then those humans have a composite identity, where each segment represents a peer group with which they identify and carries with it a set of behavioural norms.

It seems to me that this is reflected in the different behaviour people exhibit in revealing personal information on sites such as Facebook and LinkedIn.  They expect to be able to portray an appropriate “face” to their peers in these different environments, without them interacting.  And this, allowing a user to control who can see which parts of their identity profile and under what circumstances, is where we’re going to need some technology.
