Managing Credentials on the Web

January 19, 2011 at 11:19 pm | Posted in Cyber Security, Identity Management | 1 Comment

I enjoyed reading a good-natured rant about the vagaries of managing your identity online on the Des Res blog the other week.  If, like me, you work for a large organisation, you’ll probably be obliged to follow strict rules on selecting a password for access to corporate systems.  If, again like me, you use a lot of websites that require you to select credentials for logging in, you may struggle to manage a large (and constantly growing) set of strong passwords without writing them down.  In these circumstances, it’s very tempting to re-use the strong password from your work systems for other purposes.

Identity 2.0

Identity 2.0, or digital identity, has long promised to solve these problems in a world where a user can have a single online identity, with a pre-certified proof that is submitted when required for authentication.  This model is represented by Microsoft’s CardSpace and the open source Higgins project, but has been slow to gain momentum.  However, in recent years, a number of the larger IAM vendors, starting with CA Technologies, have added support for these technologies to their Web Access Management products.

Multiple Identities Online

Of course, being able to use a single identity and set of credentials for all your online activities is a real “good news/bad news” story.  The convenience of managing a single set of credentials comes at a price: it’s quite conceivable that your visits to different websites could be aggregated and correlated, building a far more comprehensive (and revealing) picture of your online activity than you might feel comfortable with.  It’s also true that not all the websites we visit (and register for) justify the same strength of authentication.  For example:

  • Online Banking: There’s so much at stake if your banking credentials become compromised that it’s obvious to all but the hard of thinking that those credentials should never be used elsewhere.  In a previous post, I described how my bank warns me if I try to re-use my internet banking credentials on another site, by providing me with a free copy of Trusteer Rapport.  This protection can easily be extended to other high-risk sites.
  • Social Media: As I’ve described on these pages before, I use a wide range of social media applications (in the widest sense of the term) to maintain my contact list, collect and collate information and publicise this blog.  Each site requires a separate set of credentials, but increasingly I’m offered the chance to sign in to one application using the credentials from another (very often, either Twitter or Facebook).  This makes use of the OAuth protocol, an open standard for delegated authorisation.  OAuth allows the user to authenticate with their chosen service to generate a token.  The token can then be used to allow another application to access resources for a given period of time.  So, for example, when configuring Tweetdeck, I authenticate in turn to Twitter, Facebook, LinkedIn and Google Buzz and authorise Tweetdeck to use the OAuth tokens to retrieve data from those applications until I revoke that access.  A minimal sketch of this pattern follows this list.
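
To make the token idea concrete, here’s a minimal sketch of the delegated-access pattern in Python, using the requests-oauthlib library (OAuth 1.0a, as Twitter used at the time of writing).  All of the keys and tokens are placeholders; a real client obtains them through the provider’s authorisation flow.

```python
# Sketch of OAuth 1.0a delegated access using requests-oauthlib.
# All keys and tokens below are placeholders, not real credentials.
from requests_oauthlib import OAuth1Session

# Issued to the application (e.g. Tweetdeck) when it registers with the service.
CONSUMER_KEY = "app-consumer-key"
CONSUMER_SECRET = "app-consumer-secret"

# Issued when the user authenticates with the service and authorises the
# application; the application never sees the user's password.
ACCESS_TOKEN = "user-access-token"
ACCESS_TOKEN_SECRET = "user-access-token-secret"

session = OAuth1Session(
    CONSUMER_KEY,
    client_secret=CONSUMER_SECRET,
    resource_owner_key=ACCESS_TOKEN,
    resource_owner_secret=ACCESS_TOKEN_SECRET,
)

# The token, not the password, authorises this request for the user's data,
# and it keeps working only until the user revokes it.
response = session.get("https://api.twitter.com/1.1/statuses/home_timeline.json")
print(response.status_code)
```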

Single Sign On

This still leaves a wide range of different sites that require a login.  I use a wide range of cloud services, including Dropbox (of which, more in a moment), Windows Live Mesh, MindMeister (for collaborating on mind maps), MobileNoter (for sharing and synchronising Microsoft OneNote) and, of course, Google Docs.  These (or at least the data I entrust to them) are important enough to me to warrant good quality credentials and together they make a good case for Single Sign On.  With more than 10 years’ experience in Identity Management projects, I’ve always viewed SSO as primarily a user productivity tool, with some incidental security benefits.  However, I came across a story on Mashable describing tools for managing web passwords and quickly realised that I could (the sketch after this list illustrates the underlying principle):

  • Store all my credentials in a single location;
  • Secure them with a single strong password, which never leaves my machine;
  • Synchronise that credential store across multiple computers by locating it on Dropbox;
  • Use the same, synchronised solution on my iPhone.
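
This is not how 1Password actually implements its data format, but the underlying principle (a credential store encrypted with a key derived from one master password, so that only ciphertext ever reaches Dropbox) can be sketched in a few lines of Python with the cryptography library:

```python
# Illustrative sketch only, not 1Password's real format: encrypt a
# credential store with a key derived from a single master password,
# so that only ciphertext is ever synchronised via Dropbox.
import base64
import json
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_key(master_password: bytes, salt: bytes) -> bytes:
    """Stretch the master password into a 32-byte encryption key."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(master_password))

# The salt is random but not secret; it is stored alongside the ciphertext.
salt = os.urandom(16)
key = derive_key(b"correct horse battery staple", salt)

credentials = {"example.com": {"user": "alice", "password": "s3cret"}}
ciphertext = Fernet(key).encrypt(json.dumps(credentials).encode())

# Only salt + ciphertext land in the synchronised folder; the master
# password itself never leaves the machine.
with open("credstore.bin", "wb") as f:
    f.write(salt + ciphertext)
```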

So, armed with these requirements and the Mashable product reviews, I eventually settled on 1Password.  As well as a management app, which sits in the system tray, 1Password installs a plug-in for all the modern browsers (I’m using it with IE and Firefox), which detects when you’re completing a registration or login form and prompts you to save the credentials.  Next time you visit the site, you just press the 1Password button to log in.  Incidentally, the Mashable article mentions that 1Password is primarily a Mac product, with a Windows version in beta.  The Windows version is in fact now generally available as a paid-for product.

Summing Up

So, in conclusion, it’s possible to put together a strategy that at least simplifies sign-on and credential management for a wide range of websites and applications, each with differing needs for strength and protection.  By and large, the tools to do this are available for free, and even the commercial components I chose cost only a modest fee.  All in all, the benefits far outweigh the modest outlay of time and cash.


21st Century Typing Pool

August 8, 2010 at 5:43 pm | Posted in Collaboration | Leave a comment

I’ve written before in this blog about the difficulties of managing information across multiple computers and other devices when you’re an independent consultant looking to stretch your budget using (mostly) free tools.  In those posts, I’ve speculated that at some point I would need to solve the problem of how to collaborate in real time with colleagues.  As it happens, it was after my recent return to the corporate world that the first real need came up.

I accepted an assignment to write a short document for an important customer.  The document was to be co-authored by me and a colleague, with other members of our team making contributions or acting as reviewers.  The problem was that we had a very short period of time to produce a first draft and it was unlikely that we’d be able to find much time working together in the same office – a clear case for online collaboration.

The nice thing about my current employer is that staff are actively encouraged to experiment with social media, collaboration and other tools.  So, in casting around for a solution, there was no shortage of suggestions.  Keep in mind that:

  • We didn’t have the time to be very formal in our approach;
  • There was no clear demarcation on who should write each section – we anticipated that we’d all contribute to all of it;
  • It was to be a short document (no more than 20 pages).

Given who we work for, the logical first step was to try out Lotus Quickr.  This web-based system allows real-time collaboration for teams and can work both inside and outside the corporate firewall.  It was useful for building a library of the reference material we needed for our task, particularly with connectors allowing us to drag and drop files into the library from the Windows desktop and to use it from within email (Lotus Notes) and IM (Lotus Sametime).  However, while it has all the facilities for managing collaboration on a document, they proved too formal for our requirements.  Documents must be checked out for editing and then checked back in for review.  That was just too slow (and single-user!) for our purposes.

Our next attempt was to use a wiki.  This allowed us to work on our document collaboratively, either in a simple markup language or using a WYSIWYG editor from a web browser.  So far, so good.  The problem came when we tried to edit the document simultaneously.  Wikis are designed to be open for anyone to edit; the philosophy is that incorrect information, bad grammar or typos will be quickly corrected by someone else.  This is fine if you have the time to break your document into a series of hyperlinked pages.  For us, though, when we were both working simultaneously, the last one to save changes was confronted with either overwriting his co-author’s changes or discarding his own.

Finally, my co-author (Identity and Access Management specialist Matt Kowalski) persuaded me that we should try Google Docs.  We both use a number of Google services already (in my case, Buzz and Wave, as well as Calendar), so it was a simple matter to set up an account, import our existing draft from Microsoft Word and get started.  Google Docs is like using the 50% of functionality in Word that everyone uses, without being slowed down by the other 50% that no-one uses.  Even the toolbars are familiar enough to start working straightaway.  You of course have control over who can collaborate and who can view, but within those boundaries, everyone can work simultaneously.  This can be a little unnerving at first, seeing changes happen elsewhere on the page, as you’re typing.

Google Docs provides some collaboration features beyond document editing itself: an online chat window appears when collaborators are editing or viewing the document at the same time.  However, it occurred to me that the whole idea of Google Wave is to provide more sophisticated collaboration tools.  The downside of Wave, of course, is that you can’t create, edit or share documents.  However, you can work around that by integrating the two services, using the Google Wave iFrame gadget.  I know that Google Wave will be shut down at the end of this year, but for now, it seems worth taking the time to experiment.  To me, it seems to work well, albeit in somewhat limited screen real estate.

Of course, if I’m going to consider using such a combination for real work, I need to consider security – that is, after all, my speciality.  The first consideration is being able to back up and restore anything I commit to Google Docs.  For this, I turned again to Backupify.  Sure enough, their free service includes backup of a single Google Docs account.  I configured it and, by next morning, I’d received an email confirming the first successful backup.  To be sure, I logged in to Backupify, located my document in the archive and opened it, without any drama at all.
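
Backupify is a hosted service, but the underlying idea is simple enough to sketch by hand.  Here’s a rough, illustrative Python version of a do-it-yourself equivalent using Google’s Drive v3 API; to be clear, this is my own assumption of how you might do it, not Backupify’s mechanism, and it assumes creds already holds valid OAuth credentials with read access to the account.

```python
# DIY backup sketch: export every native Google Docs file to a local
# .docx copy via the Google Drive v3 API. Illustrative only.
import io
import os

from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

def export_docs(creds, out_dir="backup"):
    os.makedirs(out_dir, exist_ok=True)
    service = build("drive", "v3", credentials=creds)
    # Find the native Google Docs documents in the account.
    result = service.files().list(
        q="mimeType='application/vnd.google-apps.document'",
        fields="files(id, name)",
    ).execute()
    for f in result.get("files", []):
        # Export each document to an offline-readable Word format.
        request = service.files().export_media(
            fileId=f["id"],
            mimeType="application/vnd.openxmlformats-officedocument"
                     ".wordprocessingml.document",
        )
        buf = io.BytesIO()
        downloader = MediaIoBaseDownload(buf, request)
        done = False
        while not done:
            _, done = downloader.next_chunk()
        with open(os.path.join(out_dir, f["name"] + ".docx"), "wb") as out:
            out.write(buf.getvalue())
```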

For a real commercial solution using Google Docs, it would be necessary to add further security.  CA Technologies recently announced new cloud-based capabilities for its Identity and Access Management (IAM) suite, allowing customers to provision users with credentials in Google Apps (including Google Docs) and also to enforce access through CA Siteminder and, for business partners, through CA Federation Manager.  No doubt other vendors either have or are developing equivalent capabilities.

By way of a conclusion, we found a solution to our dilemma – a multi-user, real-time collaboration system with which to edit and then publish a document.  In practice, it was easy to use, and the necessary security can be easily (and to some extent freely) added.  Give it a try yourself – if you want to try it in Wave, you’ll have to be quick.

News from the RSA Show: CA Provisions to Salesforce.com App

March 3, 2010 at 8:23 am | Posted in Cloud Security, Identity Management | 1 Comment

You may have noticed that I published an article the other day on how user provisioning products have evolved into the sophisticated Identity Management offerings we see today from the major vendors. In that article, I ended by commenting that the next challenge is to be able to extend Identity Management beyond the enterprise, to cater for the whole raft of new application delivery platforms.

According to a Network World article today, CA is expected to announce at the RSA Show that CA Identity Manager will allow organisations to provision their users to Salesforce.com Sales Cloud 2.  This new addition is expected to be made available at no cost to existing customers.

CA itself is a Salesforce.com customer, with access to the applications made available to its sales and pre-sales teams.  CA Siteminder is already integrated into the Salesforce.com offering, to provide single sign on.

It will be interesting to see to what extent CA can incorporate this cloud-based provisioning into their role life-cycle management story.


Danger in the Cloud?

October 13, 2009 at 9:37 am | Posted in Systems Management | Leave a comment

10 years ago, I was interviewed for a position within the newly formed eTrust security practice at Computer Associates (now CA).  The Consulting Director who interviewed me asked how much I knew about the eTrust product set.  I reeled off the list of products (I know how to research!) and explained which of them I had firsthand experience with.  I concluded by saying, “Oh, and we use Arcserve for all our backups.”  The Consulting Director pointed out that Arcserve (CA had recently acquired Cheyenne) is a storage product, not a security product.  My response: “It is where I come from!”  I got the job anyway.

The point of this anecdote is that security is based on the well-known Confidentiality-Integrity-Availability triad.  In fact, Dorothy Denning makes a compelling argument for expressing both confidentiality and integrity in terms of availability.  So, of course, backup and recovery – the first line of defence for availability – are part of security.

More recently, as I was setting up Identigrate UK, my desktop PC suffered a catastrophic failure.  Things rapidly deteriorated until I couldn’t even start the machine in safe mode.  However, as a long-time paranoid security specialist (even paranoids have real enemies, right?), I had set up regular backups to an external eSATA drive (stored in a fireproof, waterproof safe).  I had also arranged to back up critical documents (business plan, budget spreadsheets …) as they changed, using BT’s Digital Vault service.  Finally, the PC manufacturer had had the good sense to configure a recovery disk, based on the excellent Norton Ghost.  So, after half a day of hard work, my PC was restored, all applications re-installed and virtually all data recovered.  It reminded me of a (somewhat cynical) definition of backup as “something you start doing immediately after your first hard disk failure”.

On 10 October, after a week of escalating outages, T-Mobile was forced to announce to its Sidekick users that their data had been lost and that recovery was extremely unlikely.  For those that (like me) haven’t come across the Sidekick before, it’s a smartphone, manufactured by Danger Inc.  Microsoft acquired Danger Inc in February 2008.  The important thing is that the Sidekick doesn’t store data (contacts, calendars, to-do lists, photos) locally, but rather stores it “in the cloud” or, more accurately, on Danger’s servers.

It’s still not clear what actually happened, but there’s speculation about a bodged SAN upgrade.  However it happened, how can you possibly run any enterprise IT setup and not have fully functioning – and tested – backup and recovery processes?

Now, I use an iPhone, so could the same disaster befall me?  Well, no.  My iPhone stores most of its data locally on the device.  When I connect the iPhone to my PC, it makes a backup on the PC (which is then backed up to the external disk).  I do use cloud services with my iPhone – MobileNoter, Google Calendar and so forth – but these are just synchronising data between my iPhone and my desktop/laptop.  So, the cloud data is not the only copy.

I suppose the moral of this story is that people are carrying ever more sophisticated computing devices in their pocket and they’re using them in conjunction with ever more complex cloud services.  For many people, this is all new and bewildering, but that’s going to change.  As Larry Dignan comments on his blog, “As we rely on the cloud more there will become a day when everyone will have some basic knowledge of IT management. Rest assured, Sidekick customers will know you’re supposed to back up your servers better. Gmail customers may learn a bit about scalability. And TD Bank customers certainly know that you can’t merge systems without a fallback plan if things go awry.”

The Provenance of Provisioning

September 11, 2009 at 10:01 am | Posted in Identity Management | Leave a comment

I was reading Dave Kearns’ article on directories in Network World’s Identity Management Alert (more on that in a later blog) the other day and I spotted a reference to an article from 10 years ago (the newsletter was then called “Fusion Focus on Directory Services”) on the beginnings of the provisioning sector.  Aberdeen had christened this new breed of “office productivity” applications e-provisioning in their Technology Viewpoint of September 1999.  Dave recounts how he came across a startup at NetWorld+Interop 99, noting that this startup, Business Layers, was the only vendor active in the new space.

At around that time (well, OK, in early 2000), I had moved from infrastructure and security management at a UK defence contractor to the newly formed security practice at a Top 5 software vendor.  While I’m loath to dispute Dave’s account, it’s not quite as I remember it.  I chatted with colleagues from that time and confirmed that, for example, CA had released its first provisioning solution in 1997.  The solution was designed as an extension to CA’s flagship Unicenter networks and systems management family, and released under the name Unicenter Directory Management Option (DMO).  Following CA’s acquisition of Platinum, DMO was relaunched as a standalone product under the name eTrust Admin in 2000.  It’s maybe not all that surprising, though, that this went largely unnoticed.  A friend (who was the eTrust Admin development manager at the time) recalls how one of the major industry analyst firms contacted CA Analyst Relations to ask if CA had a tool for provisioning, only to be told “No”.

It seems to me that the earliest provisioning vendors were top-tier network and systems management vendors (BMC, CA, IBM Tivoli).  They started with important advantages.  First, their presence in the mainframe market exposed them to the effective and mature (though largely manual) processes for user administration widely found in mainframe shops, built around RACF, ACF2 or Top Secret.  Second, their experience in building network and systems management solutions meant expertise in developing agent technology and reliable (store-and-forward) messaging, the vital “plumbing” for a provisioning engine.  These first attempts placed the emphasis on centralised, consistent manipulation of credentials on target systems.
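
Store-and-forward is easy to sketch in outline.  The following is purely illustrative (no relation to any vendor’s actual plumbing): each provisioning request is persisted before delivery is attempted, so an unreachable target system delays a change rather than losing it.

```python
# Illustrative store-and-forward skeleton for a provisioning engine.
import json
import sqlite3

class StoreAndForwardQueue:
    def __init__(self, path="outbox.db"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS outbox "
            "(id INTEGER PRIMARY KEY, payload TEXT, delivered INTEGER DEFAULT 0)"
        )

    def enqueue(self, request: dict) -> None:
        # Persist first: the message survives a crash or network outage.
        self.db.execute("INSERT INTO outbox (payload) VALUES (?)",
                        (json.dumps(request),))
        self.db.commit()

    def drain(self, deliver) -> None:
        # Retry undelivered messages in order; mark each only on success.
        rows = self.db.execute(
            "SELECT id, payload FROM outbox WHERE delivered = 0 ORDER BY id"
        ).fetchall()
        for row_id, payload in rows:
            try:
                deliver(json.loads(payload))  # e.g. hand off to a target-system agent
            except ConnectionError:
                break  # target unreachable; leave the rest queued for next time
            self.db.execute("UPDATE outbox SET delivered = 1 WHERE id = ?",
                            (row_id,))
            self.db.commit()

queue = StoreAndForwardQueue()
queue.enqueue({"op": "createUser", "target": "RACF", "user": "jsmith"})
```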

The second wave of provisioning products came from niche vendors (Business Layers, Access 360, Waveset, Thor) and was characterised by the use of web technology and the adoption of configurable, workflow-based approval processes.  These products also initially had limited coverage for connectors (and some connectors had limited capabilities).  At the time of the CA acquisition of Netegrity in 2005, Identity Minder eProvision (formerly the Business Layers Day One product) was still licensed to use the connectors from BMC’s Control-SA product.

In late 2000, at the height of the DotCom boom, I was lead security architect for a proposed chain of high-security hosting centres around the world, to be implemented by a consortium that included CA, Sun and Oracle.  Business Layers demonstrated their product, showing me a workflow process updating its status in real time, displayed on a Unicenter Worldview map.  I was impressed – it was better than the integration between the Unicenter components!

These new capabilities, however, proved to be prerequisites for delegated administration and user self-service.  This led to a rash of acquisitions, with Netegrity joining CA, Access 360 joining IBM, Thor joining Oracle and Waveset joining Sun.  Netegrity brought two distinct offerings to the party: Identity Minder (web-based administration for Siteminder deployments) and eProvision (the former Business Layers product).  The second-generation CA product was built by integrating Netegrity’s Identity Minder with CA’s eTrust Admin.  The eProvision developers left CA to form a new company, IDFocus, which developed add-ons for Identity Manager implementing the best features of eProvision that were still missing from the CA product.  CA eventually acquired IDFocus in late 2008 and merged the two development teams.  BMC acquired a directory management product (Calendra) in 2005 to add the missing elements of workflow and graphical interfaces.

The current race for the Identity Management vendors is to integrate role mining and role management capabilities into their solutions.  First, Oracle acquired Bridgestream; then Sun acquired VAAU, with its RBACx product.  Finally, in late 2008, CA acquired Eurekify.  Meanwhile, IBM have decided to build their capability in-house.

So, where next?  It goes without saying that all the major vendors still have much to do to improve integration and remove duplication between the multiple components from which their products are built.  Beyond that, I think there’s a growing realisation that real-world deployments of identity management will have to be built from multi-vendor solutions.  Mergers, acquisitions and divestments will see to that.  The cost, time and risk of replacing one vendor’s IdM products with another’s will prove completely unacceptable to the business.  So, vendors are going to have to address interoperability seriously.  Perhaps this will be the catalyst for renewed interest in open standards, such as SPML and DSML.  In his article on directories, Dave Kearns noted that, as directories matured from the hype of directory-centric networks to unglamorous (but still vital) low-level infrastructure, DSML never really took off, despite being adopted by OASIS in 2002.  Interoperability is aided when directories (the single source of truth for an IdM system) are able to exchange updated information autonomously.
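
As a flavour of that interoperability, here’s a sketch of what a DSMLv2 update might look like on the wire: one directory (or IdM engine) telling another to replace an attribute.  The DN and values are invented for illustration.

```python
# Build a DSMLv2 modify request (OASIS DSML version 2 core namespace).
# The DN and attribute values are invented for illustration.
import xml.etree.ElementTree as ET

DSML_NS = "urn:oasis:names:tc:DSML:2:0:core"
ET.register_namespace("dsml", DSML_NS)

batch = ET.Element(f"{{{DSML_NS}}}batchRequest")
modify = ET.SubElement(
    batch, f"{{{DSML_NS}}}modifyRequest",
    dn="cn=Jane Smith,ou=People,dc=example,dc=com",
)
change = ET.SubElement(
    modify, f"{{{DSML_NS}}}modification",
    name="telephoneNumber", operation="replace",
)
ET.SubElement(change, f"{{{DSML_NS}}}value").text = "+44 20 7946 0000"

print(ET.tostring(batch, encoding="unicode"))
```
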
Which brings us back to where we started.
