An interesting discussion yesterday with Brian Kelly at UKOLN, University of Bath in the UK. Together with a couple of colleagues from the OASIS Transformational Government TC, we were talking about common misconceptions about open standards and the difficulty that many public authorities seem to face in establishing policies regarding the role that standards play in interoperability.
The debate about standards and interoperability, when it is engaged at all, too often (and too soon) descends into detail about which technical standards should be used to address particular problems, rather than into fully understanding the problem itself. It could be seen as laziness or a lack of intellectual rigour that public officials (myself included, in my time) quickly reach for off-the-shelf standards or objective criteria to justify a particular decision, whether it concerns a choice of platform or software or a longer-term strategic choice. A more charitable interpretation is that busy officials want their decisions and recommendations to have objective foundations that can be justified as part of broader public sector strategy.
Whatever might be the reality in different administrations, our discussion yesterday focussed on the need for greater attention to policy and less to technical implementation concerns. Remember the sage advice: “be wary of answers before you have properly understood the question”. Whilst it is tempting, in response to a particular policy consideration, to immediately declare “there’s a standard for that!” and assume the problem is thus solved, it is a temptation that ought to be resisted.
Let’s take an example: the often heated debate about document formats. Many public administrations are rightly concerned about maintaining vendor independence and ensuring long-term preservation of their documents. The key concerns should therefore be about what high-level policy decisions need to be taken before (and possibly instead of) more technology- and solution-oriented decisions. For example, storing any document in a proprietary, binary format is widely accepted as “probably a very bad idea”. A clear policy commitment could therefore be that any technology used should ensure that this trap is avoided.
But to jump from that to mandating that content be stored in a specific format, however open it might be, seems less obvious. Technologies that manage, in this example, documents and information content are far more mature today, and the process of importing and exporting content between different formats has become trivial and a boon for the end user. Import a Word document to OpenOffice? No problem. Export from Word to .odf? No problem. Generate a .pdf from either? No problem. The core technologies – whether proprietary, open source, or a mix of both – have largely solved the technical interoperability problems of yesteryear. Policy makers need only keep sight of their original high-level objective – avoid lock-in to binary, proprietary formats – and move further up the value chain to address interoperability issues where they really still need to be addressed – which tend to be at the organisational, legal and “political” levels.
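As a small illustration of why XML-based formats reduce the lock-in risk discussed above: an OpenDocument (.odf/.odt) file is simply a ZIP archive of plain XML parts, so its content can be inspected with entirely standard tools, no vendor software required. The sketch below builds a minimal ODF-like archive in memory and reads it back using only Python’s standard library; the XML payload here is an illustrative fragment, not a complete OpenDocument file.

```python
import io
import zipfile

# Build a minimal ODF-like package in memory. A real .odt produced by any
# office suite has the same shape: a "mimetype" entry plus XML parts.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("mimetype", "application/vnd.oasis.opendocument.text")
    z.writestr("content.xml",
               '<?xml version="1.0"?><office:document-content/>')

# Read it back with nothing but the standard library: the "document"
# is just a ZIP of human-readable XML, not an opaque binary blob.
with zipfile.ZipFile(buf) as z:
    names = z.namelist()
    mimetype = z.read("mimetype").decode()
    content = z.read("content.xml").decode()

print(names)     # ['mimetype', 'content.xml']
print(mimetype)  # application/vnd.oasis.opendocument.text
```

The point of the exercise is the policy one made above: because the container and its parts are openly specified, any tool – proprietary or open source – can read and write them, which is precisely what keeps the binary lock-in trap closed.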
UKOLN’s approach, while taking a view on some core criteria for assessing open standards, emphasises this with its “Risk and Opportunities Framework” – it allows policy makers to move away from the heat of battle along ideological (often sterile) lines and binary yes/no choices, and towards the light of more considered reflection on what is fit for purpose and value for money. “Use RAND”; “Use Open Source”; “RF only”; whatever the statement, it allows only one approach, one view, and assumes there is a single, ideal truth or way – a binary bind. (There is also an implication, when the debate is set up in this manner, that there is a default ‘right’ choice – but I’ll return to that debate in another post.)
Imagine instead a policy statement along the lines of: “Favour solutions that are not encumbered by licensing or patent restrictions”. These are not weasel words, nor is the statement a cop-out of responsibility; it is a far more powerful and flexible policy statement, and it actually forces a debate about what “encumbered” means. Rather than presenting a binary choice, it forces a debate about how appropriate a particular IP model is to a particular situation. This is surely a good thing. It is a recognition that the domain is very complex and that policy makers need to take responsibility for assessing what is indeed fit for purpose and value for money, rather than pushing that off to a curt reference to a standard – at least not before a considered examination of what specific needs are being addressed.
Working with standards and interoperability, as with any domain of expertise, requires a certain skill set. I believe that approaches such as that proposed by UKOLN will create a climate of more mature, evidence-based decision-making around technology choices – and far less ultimately vacuous flag-waving.