5 Web Service Architecture Tenets
Use these guidelines to ensure you build successful Web service architectures.
by Jeff Schneider and Frank Martinez
October 19, 2005
Software construction and integration continue to be labor-intensive tasks. As the state of computer science progresses, we are continually finding new tools and techniques to reduce the effort and increase the return. The rise of a ubiquitous service-oriented computing theme has brought new opportunities for productivity and new challenges for architects and designers. You can organize the problems and solutions into five overarching groups: working with service units, focusing on loose coupling; working with networks, focusing on resource virtualization; networking services together, focusing on agile composition; doing the first three economically, focusing on scalability across several dimensions; and enabling new offerings based on the new technology, focusing on the future.
In this article, we'll guide you through all five categories and help you understand why using Web services in your architectures is important.
1. Consumers and Producers are Loosely Coupled
Loose coupling is a primary enabler of reuse and integration. Attempts to enable reuse through platform-specific, object-oriented design have failed, and most modern Web Service Architecture (WSA) programs spend extra effort to provide an environment where functionality can be reused.
To do this, you first need to provide functional encapsulation, which is the process of hiding the internal workings of a service from the outside world. This concept was heavily promoted in the object-oriented world and continues to be a primary consideration in designing a service.
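As a minimal sketch of functional encapsulation (the service and its rate table are hypothetical, not from the article), only the published operation is part of the contract; the internal data and rounding rules stay hidden:

```python
# Hypothetical currency-conversion service: consumers see only convert();
# the rate table and rounding policy are implementation details.
class CurrencyConversionService:
    def __init__(self):
        # Internal state -- not visible through the service contract.
        self._rates = {("USD", "EUR"): 0.83, ("EUR", "USD"): 1.20}

    def convert(self, amount, from_currency, to_currency):
        """The one published operation; everything behind it can change
        without breaking consumers."""
        rate = self._rates[(from_currency, to_currency)]
        return round(amount * rate, 2)
```

Because consumers depend only on `convert()`, the provider can later swap the static table for a live feed without disturbing any caller.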
Previously, one of the primary problems with reuse was that the remedies to non-functional concerns glued the components together. Common concerns were reliability, security, and integrity; the remedies were mechanisms such as atomic transactions, triple-DES encryption, and reliable messaging. Unfortunately, references to these remedies were often hard-coded into components, making them more difficult to extract and reuse.
The SOA model encourages the use of standardized policies that advertise the remedies to the non-functional concerns. Here, a consumer can read the policy and determine whether it meets its needs. If so, it can move forward with using the service; if not, it can query the network to determine whether a mediating intermediary is available to provide translation.
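The decision the consumer makes can be sketched as a simple policy match with a mediator fallback. The policy keys and mediator names below are illustrative assumptions, not part of any WS-Policy standard:

```python
# Hypothetical sketch: match a consumer's non-functional requirements
# against a producer's advertised policy, falling back to a mediator.
def policy_satisfies(policy, requirements):
    """True if the advertised policy meets every consumer requirement."""
    return all(policy.get(key) == value for key, value in requirements.items())

def choose_endpoint(producer_policy, requirements, mediators):
    """Use the producer directly if its policy matches; otherwise look
    for a mediating intermediary that can translate (e.g. add encryption)."""
    if policy_satisfies(producer_policy, requirements):
        return "direct"
    for name, policy in mediators.items():
        if policy_satisfies(policy, requirements):
            return name
    return None
```

For example, a consumer requiring `{"encryption": "3des"}` against a producer advertising no encryption would be routed through a mediating gateway that advertises the required policy.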
Also, you need to enforce ubiquity at the edge of the service. The Web services umbrella is a collection of standards, such as SOAP, WSDL, and the WS-* specifications, that take anything not functionally encapsulated and convert it to a standard form. The goal of these standards is to provide a ubiquitous programming model at the edge of the service. Introducing a new protocol or format that lives outside the service yet can't be recognized or consumed reduces the reusability of the service, forcing a tight coupling between the intended consumer and the actual producer.
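To make the "standard edge" concrete, here is a minimal sketch that wraps an arbitrary application payload in a SOAP 1.1 envelope using only the Python standard library. The `GetQuote` payload name is a hypothetical example:

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def wrap_in_soap_envelope(body_element):
    """Wrap an application payload in a standard SOAP envelope so any
    consumer that speaks SOAP can recognize the message at the edge,
    regardless of what the payload itself contains."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    body.append(body_element)
    return ET.tostring(envelope, encoding="unicode")
```

The payload stays application-specific, but the wrapper is ubiquitous: that split is exactly what keeps the edge of the service standard while the inside stays free to vary.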
2. Leverage Network Computing and Resource Virtualization
In the 1990s, a significant focus was placed on virtualizing the machine; this effort produced the "virtual machine," or VM. The Web service computing paradigm takes this concept one step further; now the emphasis is on virtualizing many machines across a network. This virtual network of machines is usually referred to as a grid. Although the Web services architecture doesn't mandate full resource virtualization, it does offer a stepping stone to this capability. This is attractive to organizations interested in implementing autonomic or on-demand operational models.
As services are created, their interfaces and policies are advertised using ubiquitous formats. This advertisement of capability becomes the software contract. As more and more contracts are created and placed on the network, you eventually create a service network. These services, along with intermediaries, become the basic building blocks for creating and integrating software. In-network computing uses a single programming style for both the local network (the local computer) and the remote network (the virtualization of the network itself).
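The advertise-and-discover cycle that builds a service network can be sketched as a tiny in-memory registry. The interface names, endpoints, and policy keys below are hypothetical illustrations, not a UDDI or WSDL API:

```python
# Hypothetical service registry: producers advertise contracts
# (interface plus policy) and consumers discover them by capability.
class ServiceRegistry:
    def __init__(self):
        self._contracts = []

    def advertise(self, interface, endpoint, policy):
        """A producer publishes its contract on the service network."""
        self._contracts.append(
            {"interface": interface, "endpoint": endpoint, "policy": policy})

    def discover(self, interface, required_policy=None):
        """A consumer finds endpoints matching the interface and,
        optionally, its non-functional requirements."""
        matches = []
        for contract in self._contracts:
            if contract["interface"] != interface:
                continue
            if required_policy and any(
                    contract["policy"].get(k) != v
                    for k, v in required_policy.items()):
                continue
            matches.append(contract["endpoint"])
        return matches
```

As more contracts accumulate in a registry like this, composition becomes a lookup problem rather than an integration project, which is the point of the service-network idea.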