Tags: backup, Backupify, CA, cloud, Google Docs, IBM, Identity Management, Lotus Quick Rooms, wiki
I’ve written before in this blog about the difficulties of managing information across multiple computers and other devices when you’re an independent consultant looking to stretch your budget using (mostly) free tools. In those posts, I speculated that at some point I would need to solve the problem of how to collaborate in real time with colleagues. As it happens, it was after my recent return to the corporate world that the first real need came up.
I accepted an assignment to write a short document for an important customer. The document was to be co-authored by me and a colleague, with other members of our team making contributions or acting as reviewers. The problem was that we had a very short period of time to produce a first draft and it was unlikely that we’d be able to find much time working together in the same office – a clear case for online collaboration.
The nice thing about my current employer is that staff are actively encouraged to experiment with social media, collaboration and other tools. So in casting around for a solution, there was no shortage of suggestions. Keep in mind that:
- We didn’t have the time to be very formal in our approach;
- There was no clear demarcation on who should write each section – we anticipated that we’d all contribute to all of it;
- It was to be only a short (no more than 20 page) document.
Given who we work for, the logical first step was to try out Lotus Quickr. This web-based system allows real-time collaboration for teams and can work both inside and outside the corporate firewall. It was useful for building a library for the reference material we needed for our task, particularly with connectors allowing us to drag and drop files into the library from the Windows desktop and to use it from within email (Lotus Notes) and IM (Lotus Sametime). However, while it has all the facilities for managing collaboration on a document, they proved too formal for our requirements. Documents must be checked out for editing and then checked back in for review. That was just too slow (and single-user!) for our purposes.
Our next attempt was to use a wiki. This allowed us to work on our document collaboratively, either in a simple markup language or using a WYSIWYG editor in a web browser. So far, so good. The problem came when we tried to edit the document simultaneously. Wikis are designed to be open for anyone to edit; the philosophy is that incorrect information, bad grammar or typos will be quickly corrected by someone else. This works well if you have the time to break your document into a series of hyperlinked pages, so that editors rarely touch the same page at once. For us though, when we were both working simultaneously, the last one to save changes was confronted with a choice: overwrite his co-author’s changes or discard his own.
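The clash we kept hitting is the classic “last write wins” problem. A minimal sketch of the alternative that most wikis later adopted – optimistic locking, where a save based on a stale revision is rejected rather than silently applied (the `Document` class here is hypothetical, not any real wiki’s API):

```python
class ConflictError(Exception):
    """Raised when a save is based on a stale revision."""

class Document:
    """Toy document store illustrating optimistic concurrency control."""
    def __init__(self, text=""):
        self.text = text
        self.revision = 0  # incremented on every successful save

    def load(self):
        # Each editor records the revision they started from.
        return self.revision, self.text

    def save(self, base_revision, new_text):
        # Reject the save if someone else saved since we loaded:
        # without this check, the last writer silently wins.
        if base_revision != self.revision:
            raise ConflictError("document changed since you loaded it")
        self.text = new_text
        self.revision += 1

doc = Document("draft v1")
rev_a, _ = doc.load()   # author A starts editing
rev_b, _ = doc.load()   # author B starts editing at the same time
doc.save(rev_a, "A's changes")      # A saves first - accepted
try:
    doc.save(rev_b, "B's changes")  # B's save is based on a stale copy
except ConflictError:
    print("conflict detected; merge required")
```

Detecting the conflict at least lets the second author merge deliberately, rather than discovering afterwards that an hour’s edits have vanished.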
Finally, my co-author (Identity and Access Management specialist Matt Kowalski) persuaded me that we should try Google Docs. We both use a number of Google services already (in my case, Buzz and Wave, as well as Calendar), so it was a simple matter to set up an account, import our existing draft from Microsoft Word and get started. Google Docs is like using the 50% of functionality in Word that everyone uses, without being slowed down by the other 50% that no-one uses. Even the toolbars are familiar enough to start working straightaway. You of course have control over who can collaborate and who can view, but within those boundaries, everyone can work simultaneously. This can be a little unnerving at first, seeing changes happen elsewhere on the page, as you’re typing.
Google Docs allows some collaboration apart from document editing. It provides an online chat window when collaborators are editing or viewing the document at the same time. However, it occurred to me that the whole idea of Google Wave is to provide more sophisticated collaboration tools. The downside of Wave of course is that you can’t create, edit or share documents. However, you can work around that by integrating the two services, using the Google Wave iFrame gadget. I know that Google Wave will be shut down at the end of this year, but for now, it seems worth taking the time to experiment. To me, it seems to work well, albeit in somewhat limited screen real estate.
Of course, if I’m going to consider using such a combination for real work, I need to consider security – that is after all my speciality. The first consideration is to be able to back up and restore anything I commit to Google Docs. For this, I turned again to Backupify. Sure enough, their free service includes backup of a single Google Docs account. I configured it and by next morning, I’d received an email confirming the first successful backup. To be sure, I accessed the archive at Backupify. I opened the archive, located my document and opened it, without any drama at all.
For a real commercial solution using Google Docs, it would be necessary to add further security. CA Technologies recently announced new cloud-based capabilities for its Identity and Access Management (IAM) suite, allowing customers to provision users with credentials in Google Apps (including Google Docs) and also to enforce access through CA Siteminder and, for business partners, through CA Federation Manager. No doubt other vendors either have or are developing equivalent capabilities.
By way of a conclusion, we found a solution to our dilemma – a multiuser, real-time collaboration system, to edit and then publish a document. In practice, it was easy to use and the necessary security can be easily (and to some extent for free) added. Give it a try yourself – if you want to try it in Wave, then you’ll have to be quick.
Tags: CA, DSML, IBM, Identity Management, Oracle, provisioning, role engineering, role mining, SPML, Sun
I was reading Dave Kearns’ article on directories in Network World’s Identity Management Alert (more on that in a later blog) the other day and I spotted a reference to an article from 10 years ago (the newsletter was then called “Fusion Focus on Directory Services”) on the beginnings of the provisioning sector. Aberdeen had christened this new breed of “office productivity” applications e-provisioning in their Technology Viewpoint of September 1999. Dave recounts how he came across a startup at NetWorld+Interop 99 and noted that this startup, Business Layers, was the only vendor active in the new space.
At around that time (well, OK, in early 2000) I had moved from infrastructure and security management at a UK defence contractor to the newly formed security practice at a Top 5 software vendor. While I’m loath to dispute Dave’s account, it’s not quite as I remember it. I chatted with colleagues from that time and confirmed that, for example, CA had released their first provisioning solution in 1997. The solution was designed as an extension to CA’s flagship Unicenter network and systems management family, and released under the name Unicenter Directory Management Option (DMO). Following CA’s acquisition of Platinum, DMO was relaunched as a standalone product under the name eTrust Admin in 2000. It’s maybe not all that surprising, though, that this went largely unnoticed. A friend (who was the eTrust Admin development manager at the time) recalls how one of the major industry analyst firms contacted CA Analyst Relations to ask if they had a tool for provisioning, only to be told “No”.
It seems to me that the earliest provisioning vendors were top-tier network and systems management vendors (BMC, CA, IBM Tivoli). They started with important advantages. First, their presence in the mainframe market exposed them to the effective and mature (though largely manual) processes for user administration widely found in mainframe shops built around RACF, ACF2 or Top Secret. Secondly, their experience in building network and systems management solutions meant expertise in developing agent technology and reliable (store-and-forward) messaging, the vital “plumbing” for a provisioning engine. These first attempts placed the emphasis on centralised, consistent manipulation of credentials on target systems.
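“Store-and-forward” here simply means that a provisioning request is persisted locally and retried until the target system acknowledges it, so a down endpoint or flaky network never loses an update. A minimal sketch of the idea (the channel and its `send` callback are hypothetical, not any vendor’s API):

```python
from collections import deque

class StoreAndForwardChannel:
    """Holds provisioning requests until the target agent acknowledges
    receipt, so an unreachable endpoint never causes a lost update."""
    def __init__(self, send):
        self._send = send        # callable: returns True on acknowledgement
        self._pending = deque()  # durable queue (in-memory for this sketch)

    def submit(self, request):
        self._pending.append(request)  # store first ...
        self.flush()                   # ... then try to forward

    def flush(self):
        while self._pending:
            request = self._pending[0]
            if not self._send(request):  # no ack: keep it queued, stop
                break
            self._pending.popleft()      # acknowledged: safe to discard

# Simulate a target system that is down for the first delivery attempt.
attempts = []
def flaky_send(request):
    attempts.append(request)
    return len(attempts) > 1  # fails once, then succeeds

channel = StoreAndForwardChannel(flaky_send)
channel.submit({"op": "createUser", "uid": "jsmith"})  # send fails; kept
channel.flush()                                        # retry succeeds
```

In a real engine the queue would be written to disk (or a message broker) so requests survive a restart, and delivery order per target would be preserved.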
The second wave of provisioning products came from niche vendors (Business Layers, Access 360, Waveset, Thor) and were characterised by their use of web technology and the adoption of configurable workflow-based approval processes. They also initially had limited coverage for connectors (and some connectors had limited capabilities). At the time of the CA acquisition of Netegrity in 2005, Identity Minder eProvision (formerly the Business Layers Day One product) was still licenced to use the connectors from BMC’s Control-SA product.
In late 2000, at the height of the DotCom boom, I was lead security architect for a proposed chain of high security hosting centres around the world, to be implemented by a consortium that included CA, Sun and Oracle. Business Layers demonstrated their product, showing me a workflow process, updating its status in real time, displayed on a Unicenter Worldview map. I was impressed – it was better than the integration between the Unicenter components!
These new capabilities, however, proved to be prerequisites for delegated administration and user self-service. This then led to a rash of acquisitions, with Netegrity joining CA, Access 360 joining IBM, Thor joining Oracle and Waveset joining Sun. Netegrity brought two distinct offerings to the party, in Identity Minder (web-based administration for Siteminder deployments) and eProvision (the former Business Layers product). The second-generation CA product was built by integrating Netegrity’s Identity Minder with CA’s eTrust Admin. The eProvision developers left CA to form a new company, IDFocus, which developed add-ons for Identity Manager implementing the best features of eProvision that were still missing from the CA product. CA eventually acquired IDFocus in late 2008 and merged the two development teams. BMC acquired a directory management product (Calendra) in 2005 to add the missing elements of workflow and graphical interfaces.
The current race for the Identity Management vendors is to integrate role mining and role management capabilities into their solutions. First, Oracle acquired Bridgestream, then Sun acquired VAAU with their RBACx product. Finally in late 2008, CA acquired Eurekify. Meanwhile IBM have decided to build their capability in-house.
So, where next? It goes without saying that all the major vendors still have much to do to improve integration and remove duplication between the multiple components from which their products are built. Beyond that, I think there’s a growing realisation that real-world deployments of identity management will have to be built from multi-vendor solutions. Mergers, acquisitions and divestments will see to that. The cost, time and risk of replacing one vendor’s IdM products with another’s will prove to be completely unacceptable to the business. So, vendors are going to have to address interoperability seriously. Perhaps this will be the catalyst for renewed interest in open standards, such as SPML and DSML. In his article on directories, Dave Kearns noted that as directories matured from the hype of directory-centric networks to unglamorous (but still vital) low-level infrastructure, DSML never really took off, despite being adopted by OASIS in 2002. Interoperability is aided when directories (the single source of truth for an IdM system) are able to exchange updated information autonomously.
Which brings us back to where we started.
Tags: compliance, governance, IBM, roles
A few days ago, I was invited to IBM South Bank for a workshop on Identity and Access Management (IAM) Governance. The workshop was timed to coincide with the launch of the latest release of Tivoli Identity Manager (v5.1). IBM’s press release describes the new features in TIM v5.1, but I’ll summarise them here:
- Role management capabilities
The latest version of TIM allows the definition of (optionally nested) roles. Roles do not grant user access to resources directly; rather, they provide a structure through which access can be managed more efficiently.
- Separation of duty capabilities
Separation of duty is a policy-driven feature to manage potential or existing role conflicts. A separation of duty policy is a logical container of separation rules that define mutually exclusive relationships among roles. Separation of duty policies are defined by one or more business rules that exclude users from membership in multiple roles that might present a business conflict.
- User recertification
TIM provides a process to periodically certify and validate a user’s access to IT resources, combining recertification of a user’s accounts, the group memberships of those accounts, and role memberships into a single activity.
- Group management capabilities
TIM now allows the creation of groups of users, to help with automation of identity management processes.
- Tivoli Common Reporting
TIM’s reporting capabilities have been migrated to IBM Tivoli Common Reporting. This component is based on the Eclipse Business Intelligence Reporting Tool and provides custom report authoring, report distribution, report scheduling capabilities, and the ability to run and manage reports from multiple IBM Tivoli products.
- New APIs
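To make the separation-of-duty feature above concrete, here is a minimal sketch of checking a user’s role memberships against mutually exclusive role policies. The data model is hypothetical, illustrating the concept rather than TIM’s actual API:

```python
# A separation-of-duty policy names a set of mutually exclusive roles and
# the maximum number of them any one user may hold (1 = fully exclusive).
SOD_POLICIES = [
    {"name": "payments", "roles": {"payment_initiator", "payment_approver"}, "max": 1},
    {"name": "audit",    "roles": {"sysadmin", "security_auditor"},          "max": 1},
]

def sod_violations(user_roles):
    """Return the policies that a user's role memberships would violate."""
    violations = []
    for policy in SOD_POLICIES:
        held = user_roles & policy["roles"]  # roles the user holds in this policy
        if len(held) > policy["max"]:
            violations.append((policy["name"], sorted(held)))
    return violations

# A user who can both initiate and approve payments trips the first policy.
print(sod_violations({"payment_initiator", "payment_approver", "hr_viewer"}))
print(sod_violations({"payment_initiator", "hr_viewer"}))  # no conflict
```

In practice the check would run both at request time (blocking a conflicting role assignment, or routing it for exception approval) and periodically during recertification, to catch conflicts that arrive via nested role membership.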
The theme for the day was IAM Governance and in IBM’s view “Tivoli Identity Manager delivers important functionality for identity and access management governance”. The new features support governance, by enforcing compliance through product policies (as opposed to technical policies – see Earl Perkins’ blog for more details) and by allowing reconciliation between the policy-based view of user entitlements, stored in TIM’s directory and the reality, defined on the managed platforms and applications. While regulatory mandates don’t demand the use of roles (though corporate policy might) they do offer a simplified abstraction, through which access can be governed. At the risk of being pedantic, I’d call this compliance, rather than governance, but it’s all down to your own definition.
Uniquely among the major IAM vendors, IBM chose not to acquire a niche role management vendor to add this capability, and instead developed it in-house as an integral part of their identity management platform. This has the positive effect of avoiding the inevitable difficulties of bringing together two distinct (and often conflicting) technology platforms and development teams. Sun, Oracle and CA are all currently working through these issues, following their acquisitions of VAAU, Bridgestream and Eurekify respectively. On the negative side, it means that role management in TIM is a “work in progress”. However, I’m assured that IBM plan to release further functionality in this area during the second half of 2009.
What would I like to see added to the role management capability? I think that a function to help with the discovery and mining of roles from existing entitlement data would speed up the creation and deployment of an initial enterprise role structure. I have to declare an interest here. As a consultant who specialises in the organisational change required for IAM programmes, I strongly favour the ability to run the role mining and discovery effort without the need to deploy the identity management infrastructure and connectors. Once an initial enterprise model is complete (and agreed), it can be imported into the identity management system, where it should become subject to life cycle management, with TIM providing recertification and approval for changes to role definitions. This approach is elegantly illustrated by CA’s deployment architecture for Eurekify. So, if I had a vote, I’d say integrate role life cycle management into TIM and leave role mining as a stand-alone tool.
My final thought relates to Governance, Risk and Compliance. The objective must be to take the results from computerised controls (such as TIM) and use them to update an overall picture of the organisation’s risk exposure. This is the job of a GRC Management platform. In the final session of the South Bank workshop, IBM showed how TIM can be used in conjunction with Tivoli Compliance Insight Manager. This closed-loop integration between security event management and identity and access management allows administrators to compare real user behaviour with desired behaviour, exactly as an auditor would. TCIM can provide a graphical representation of the information, along the lines of a heat map. IBM partner with niche vendors, such as SailPoint and Aveksa, to deliver a complete IAM Governance solution. Personally, I’d love to see the TIM and TCIM products integrated with (for example) the excellent STREAM integrated risk and assurance management platform from Acuity.
By way of a conclusion, this latest release of TIM continues to address the use cases needed by IAM professionals and does so with the benefit of a simple, consistent user interface and a trouble-free install process. If there’s a downside, it’s that TIM is a monolithic application, limiting the ability of an organisation to pick just the parts they need to start with. Having said that, organisations can readily deploy the application and initially utilise (say) reconciliation, recertification or compliance reporting, without needing to design and implement the heavyweight user provisioning and role management functions.
Readers may notice from my profile that I’m currently employed as a Senior Managing Consultant in IBM’s Global Business Services. However, at the time of writing this article, I was an independent consultant, with no commercial relationship with IBM.