Nigel Shadbolt weighs up a timely look at a key digital challenge — interoperability.
Interop: The Promise and Perils of Highly Interconnected Systems
John Palfrey and Urs Gasser. Basic Books: 2012. 304 pp./256 pp. $28.99/£18.99
Digitization eases many aspects of life, but it has introduced a slew of frustrations. Most of us have struggled to display a presentation on a projector or play digital content across a range of devices — both aspects of the challenge that John Palfrey and Urs Gasser lay out in Interop. The interoperability of devices and systems is an ever more pressing issue in an era of increasing digital complexity.
The authors, both Internet law experts, trawl through case studies to outline why interoperability is so important, how it arises and the benefits and issues that flow from it.
Palfrey and Gasser have a record of taking up a concept early and writing about it accessibly and informatively. Their book Born Digital (Basic Books, 2008) charted how growing up surrounded from birth by the benefits of the Internet affects all aspects of life. So I was looking forward to this latest work. However, the result is a set of examples and case studies that highlights the challenges well, but offers no clear remedies.
Most readers will know much of the terrain. There is the appeal of Apple's products and iTunes content, offset by the frustration of their 'walled garden' — a closed ecosystem that won't interoperate with other systems. Companies such as Facebook, Google and Twitter all seek to understand users' friendships and contacts, likes, dislikes and intentions, while fighting hard to keep us from 'interoperating' with their competitors for fear that we might leave, taking our social networks with us.
In search of an overarching theory of interoperability, Palfrey and Gasser offer an analysis at various levels: data, technological, human and organizational. In each layer, systems can promote or frustrate interoperability. For example, spreadsheet applications can exchange data if a common format can be found. At the technological level, the open standards of the World Wide Web mean that content can be delivered to a browser on any operating system.
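The data-layer point can be made concrete with a minimal sketch (the two "applications" here are hypothetical stand-ins): so long as both sides agree on a common format such as CSV, a table written out by one tool survives the round trip into another.

```python
# A minimal sketch of data-layer interoperability: two hypothetical
# spreadsheet tools that share nothing but the CSV format.
import csv
import io

rows = [["country", "population"], ["Iceland", "376000"]]

# "Application A" exports its table as CSV text...
buffer = io.StringIO()
csv.writer(buffer).writerows(rows)
exported = buffer.getvalue()

# ...and "Application B", knowing only the common format, reads the
# table back intact.
imported = list(csv.reader(io.StringIO(exported, newline="")))
print(imported == rows)  # → True
```

The design point is that neither side needs to know anything about the other's internals; the agreed format alone carries the data across.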
Human interoperability is trickier. Language provides the appearance of agreement, but in situations with strict safety demands, the scope of language interpretation must be limited to avoid ambiguity. Enter the controlled natural languages of air traffic control (ATC). Palfrey and Gasser use ATC to show their multilayer analysis at work. It requires interoperable data and technology — but human and organizational interoperability are essential too, because without the conventions and regulations of global ATC, the air corridors would neither work nor be safe.
The authors argue that it is important to anticipate and predict the amount of interoperability needed at each level, while maintaining security, avoiding stagnation and promoting the long-term stability and viability of systems. The distinctions between the layers are recognizable, but too often Palfrey and Gasser's analysis lacks depth. Their take on the Web is a case in point: they say little about open standards or the effort to ensure that these are maintained and developed.
Yet the Web has been one of the pre-eminent successes of the paradigm. When computer scientist Tim Berners-Lee set up the first Web server and the source code for the abstraction layer that sits above the Internet, openness and interoperability were at the heart of his dream. The open hypertext transfer protocol (HTTP) enabled content to be placed on and taken from any computer on the Internet that ran the open-source server software. The open hypertext markup language (HTML) allowed us to write, and make links between, documents on any accessible server on the planet. Without these standards — which are kept open by the World Wide Web Consortium — the Web could have lapsed into walled gardens.
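That openness is easy to demonstrate: because HTTP is an open, published protocol, any client can fetch content from any server that speaks it, with no shared vendor code on either side. A minimal sketch using only the Python standard library (the server and page here are illustrative, not part of the review):

```python
# A minimal sketch of HTTP's openness: an independently written client
# and server interoperate purely because both follow the open protocol.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

HTML = b"<html><body><a href='/other'>a link</a></body></html>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the same open HTML to whatever client connects.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(HTML)

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/"
with urllib.request.urlopen(url) as resp:
    body = resp.read()
server.shutdown()
print(body == HTML)  # → True
```

The client and server above share no code, only the protocol — which is precisely the property that kept the Web from fragmenting into walled gardens.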
The power of open standards goes further. Open-source systems can be more robust and secure than closed-source software — something that the discussion of security in Interop fails to consider. When a group of us established http://data.gov.uk, a UK portal for open government data, we used a variety of open-source components of the type that power systems such as Wikipedia — a site that has been subject to attack by hackers over the years. Open-source components receive extensive penetration testing, and the community works hard to fix vulnerabilities.
Palfrey and Gasser are at their best when discussing how regulation and legislation can promote interoperability. They believe that digital rights management will have to evolve to support the increasing demands for interoperability, and they describe how Microsoft's approach to it has evolved in the face of legislative scrutiny. And they praise the efforts of European Union courts and Chinese government ministries where the results enhance interoperability.
This issue, the authors stress, is not about making systems the same, but about maintaining diversity while identifying key areas of contact: an important point well made. But the book that gives us blueprints for better interoperability has yet to be written. Theories must make predictions, not merely describe phenomena.
Shadbolt, N. Computer science: The great between. Nature 485, 580 (2012). https://doi.org/10.1038/485580a