SOA with MDA
Re-Engineering A Legacy Enterprise IT System
By: Chung-Yeung Pang
Mar. 28, 2007 05:15 PM
For the past six years I've been engaged in a project to re-engineer the legacy enterprise IT system of a large international bank. The company still had applications with host terminal emulation that were to be replaced with Windows- and Web-based client applications. The host system was composed of legacy applications that had evolved since the 1970s. Any new application had to interoperate with the existing legacy applications. A strategy was implemented to replace difficult-to-maintain legacy applications with new ones.
Within these constraints, Service Oriented Architecture (SOA) was introduced into the existing system. Dependencies on the interfaces, performance, and transaction scopes of the legacy system ruled out any change of programming language, or any mixing of languages. All host applications were still developed in COBOL, while client applications were developed in Java or Web-based languages. A framework of many COBOL modules was built to support the development of applications that adhere to SOA. It took three years to integrate the framework and architecture into the existing IT system and make them fully interoperable with existing applications.
Despite the new architecture and framework, applications were developed in the same ad hoc style as in the 1970s and 1980s. The result was undocumented, error-prone systems. Applications were inflexible when code had to be adapted to standards. Reusability was low. Development relied heavily on experienced programmers.
To overcome these problems, which are common to software development in general, it made sense to establish a component architecture with "plug and play" features for individual modules. A componentized development approach allows modules to be tested independently. A full set of modules was written to support the architecture.
With the introduction of SOA, application development was greatly simplified. However, using the framework was still a challenge for traditional programmers. To combat this, three years ago we introduced and refined techniques based on Model Driven Architecture (MDA) and have used these techniques in the development process. Since then, the techniques have been successfully applied to six IT projects for a variety of applications in treasury products, payments, credits, and securities. Following the principles of MDA, all systems are fully documented with models. As a consequence of this change in the process, architects and analysts could drive development. Productivity has increased by a factor of three. Programming errors are minimized. Applications are flexible when code must be adapted to standards. Reusability is very high. Through the use of models and code generation from the models, development is automated, effectively reducing the number of experienced programmers needed. The concept and design of the architecture, framework, and MDA techniques are presented below.
The software architecture for the host applications is illustrated in Figure 2. The interface is exposed using XML to transfer data across environment divides. A security controller governs access to the services and data. The service mediator acts as the interface that exposes the service to other applications. It also controls the transaction, retrieves the XML message, parses the XML, sets up the context container, resolves and instantiates the requested service, and returns the XML response message.
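The mediator's dispatch cycle can be sketched as follows. This is an illustrative Python stand-in for the COBOL mediator, under assumptions of mine: the service name `GetCustomer`, the field keys, and the registry mechanics are hypothetical, not the bank's actual interface.

```python
# Hypothetical sketch of the service mediator's dispatch cycle (Python,
# for illustration; the production modules are COBOL).
import xml.etree.ElementTree as ET

# Registry of available services; in the real system the mediator
# resolves and instantiates the requested service dynamically.
SERVICES = {}

def service(name):
    """Register a service handler under a service name."""
    def register(fn):
        SERVICES[name] = fn
        return fn
    return register

@service("GetCustomer")  # hypothetical example service
def get_customer(context):
    # A component reads its input from the context and writes output back.
    context["customer.name"] = "ACME AG" if context["customer.id"] == "42" else "Unknown"

def mediate(request_xml: str) -> str:
    """Parse the request, set up the context, invoke the service, return XML."""
    root = ET.fromstring(request_xml)
    name = root.get("service")
    context = {field.get("key"): field.text for field in root.findall("field")}
    SERVICES[name](context)           # resolve and call the requested service
    response = ET.Element("response", service=name)
    for key, value in context.items():
        ET.SubElement(response, "field", key=key).text = value
    return ET.tostring(response, encoding="unicode")

reply = mediate('<request service="GetCustomer">'
                '<field key="customer.id">42</field></request>')
print(reply)
```

The key design point survives translation: the mediator is generic, and only the service registry and the XML payload vary per request.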
The service mediator calls the service controller, which in turn calls the components making up the application. Components are completely decoupled: they communicate only via the context container. Each component reads its input data from the context container and writes its output data into the context container. Thus each component can be tested separately, as long as the data it requires is inserted into the context container.
The service mediator is a generic module that can handle different services based on the service request information in the incoming XML message. The context container provides a set of APIs for application components to extract and deposit data. The data in the context container has a tree structure, and a complete data structure can be added to a node or retrieved from a node.
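The tree-structured container described above can be sketched like this. The Python class and its slash-path API are my assumption for illustration; the real container is a COBOL module with its own API set.

```python
# Minimal sketch of a tree-structured context container (hypothetical
# Python API; the production container is a COBOL module).

class ContextContainer:
    def __init__(self):
        self._root = {}

    def put(self, path, value):
        """Deposit a value, or a whole nested structure, at a slash path."""
        *parents, leaf = path.split("/")
        node = self._root
        for part in parents:
            node = node.setdefault(part, {})
        node[leaf] = value

    def get(self, path):
        """Extract the value or complete subtree stored at a slash path."""
        node = self._root
        for part in path.split("/"):
            node = node[part]
        return node

ctx = ContextContainer()
# One component writes a complete structure to a node ...
ctx.put("customer/address", {"city": "Zurich", "zip": "8001"})
# ... and another, fully decoupled component reads only what it needs.
print(ctx.get("customer/address/city"))  # Zurich
```

This is also what makes isolated component testing cheap: a test simply `put`s the component's required input into a fresh container and inspects what the component `put`s back.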
The service controller is a COBOL module implemented as a finite state machine. It performs centralized control of the business-service process based on a process description of states, events, and transitions.
Meta-Information Generation via Models
The service controller also requires a description of the process flow that provides the state-event-transition information. This description is generated from a UML activity diagram, as shown in Figure 4. In both cases, all program code is generated. The control flow description specifies which condition triggers which module. As shown in Figure 4, the first state is an "Init" state. The module DEMOPF1 is called, which resolves the input. Calling this module returns an event variable in the service context, indicating either "Customer is Company" or "Customer is Private." Depending on this event, the process flow controller invokes the subsequent module.
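The generated description amounts to a transition table that the finite-state controller walks. The sketch below reproduces the Figure 4 flow in Python; DEMOPF1 is the module named in the article, while the state names, the two handler modules, and the table layout are my illustrative assumptions.

```python
# Sketch of a state/event/transition description driving a finite-state
# service controller (Python stand-in for the generated COBOL artifacts).

def demopf1(ctx):
    """Resolves the input and raises the resulting event in the context."""
    ctx["event"] = "Customer is Company" if ctx["type"] == "C" else "Customer is Private"

def handle_company(ctx):   # hypothetical follow-up module
    ctx["result"] = "company path"
    ctx["event"] = "Done"

def handle_private(ctx):   # hypothetical follow-up module
    ctx["result"] = "private path"
    ctx["event"] = "Done"

# (state, event) -> (module to call, next state); "End" terminates the flow.
# In the real system this table is generated from the UML activity diagram.
TRANSITIONS = {
    ("Init", "Start"): (demopf1, "Resolved"),
    ("Resolved", "Customer is Company"): (handle_company, "End"),
    ("Resolved", "Customer is Private"): (handle_private, "End"),
}

def run(ctx):
    state, ctx["event"] = "Init", "Start"
    while state != "End":
        module, state = TRANSITIONS[(state, ctx["event"])]
        module(ctx)   # each module updates ctx["event"] for the next step
    return ctx

print(run({"type": "C"})["result"])  # company path
```

Note that the controller itself never changes; adding or removing a module only adds or removes rows in the generated table, which is exactly the plug-and-play property described next.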
To add a module, one only needs to add a new action to the activity diagram and generate from the model again. Removing a module is just as simple. Hence a plug-and-play mechanism is provided. The only condition is that each module must adhere to the programming pattern, using the context container for data input and output, and must update the event in the service context in the context container.
Application Module Development Using Models
A COBOL data structure can be defined in a UML class model. Packages can be used to define the different sections of the structure required in COBOL, such as the working-storage and linkage sections. The logic of a COBOL module can be modeled in the form of a UML activity diagram. Each action of the activity can contain a COBOL code segment; alternatively, it can be bound to a code pattern with specific pattern parameters. A complete COBOL module can be generated from such an activity diagram together with the class model. Standard templates ensure that each module adheres to the programming pattern required by the framework, code standard, and platform. Actions are bound to code patterns, with values for the parameters, via links as shown in Figure 5.
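A minimal sketch of this pattern-binding step is shown below: each diagram action is bound to a code pattern plus parameter values, and the generator expands them into COBOL source. The pattern names, paragraph names, and data names here are hypothetical; only DEMOPF1 comes from the article, and the real generator works from the CASE-tool repository rather than inline strings.

```python
# Sketch of generating COBOL paragraphs from activity-diagram actions
# bound to parameterized code patterns (all names illustrative).

MOVE_PATTERN = "    MOVE {source} TO {target}."
CALL_PATTERN = "    CALL '{module}' USING CONTEXT-CONTAINER."

# Each action from the diagram: paragraph name, bound pattern, parameters.
actions = [
    ("INIT-INPUT",   MOVE_PATTERN, {"source": "WS-CUST-ID", "target": "LK-CUST-ID"}),
    ("CALL-RESOLVE", CALL_PATTERN, {"module": "DEMOPF1"}),
]

def generate(actions):
    """Expand each bound pattern with its parameter values into COBOL text."""
    lines = []
    for name, pattern, params in actions:
        lines.append(f"{name}.")
        lines.append(pattern.format(**params))
    return "\n".join(lines)

print(generate(actions))
```

Because every module is expanded through the same templates, conformance to the framework's programming pattern is a by-product of generation rather than a discipline imposed on programmers.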
By using UML models, the whole application can be developed in COBOL with almost no hand coding at all. With CASE tools like Artisan that have a central repository, code patterns and data items can be stored in a single repository accessible to all developers across the whole corporation. In addition, the interface definition of a component (the input and output data structures defined for the context container) can be extracted from the repository. This eases the analysis of component dependencies and improves the availability of component documentation.