Managing Enterprise Data Complexity Using Web Services: Part 1
Data services architecture
Jun. 28, 2005 10:00 AM
The crux of the proposed solution is the development of shared data services. The principle behind shared data services is the consolidation of common data, along with the development of interfaces for data dissemination using open standards. Web services are a viable vehicle here because of the ubiquitous nature of the Internet and, more importantly, because they impose no technology constraints on the consumer. Developing shared data services will help alleviate some of the problem areas by:
- Eliminating redundant data by gradual reduction in the scatter of core data
- Standardizing mechanisms for data access/update
- Standardizing formats for common data across the enterprise
- Modifying business applications to use the new services for all data
This will significantly reduce the costs incurred for data access across multiple lines of business. Specific activities on the path to adopting shared data services are as follows.
- Develop shared data services that can retrieve information for a set of related applications. Each client application uses the data service, which manages the relationships between the databases and related systems. The services will become the only channel for read/update of all data for all business applications.
- Rationalize schemas for common data across databases.
- Design service contracts based on the needs of individual LOBs/client applications.
- Provide information on demand (in response to service requests) by optimizing performance and caching heavily accessed data.
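The last activity above, caching heavily accessed data behind the service facade, can be sketched as follows. This is a minimal illustration under stated assumptions: the service name, the backend object, and its `fetch`/`store` methods are hypothetical stand-ins for a DAO over the system of record.

```python
import time

class CustomerDataService:
    """Hypothetical shared data service: the single read/update channel
    for customer data, with a simple TTL cache for hot records."""

    def __init__(self, backend, ttl_seconds=60):
        self._backend = backend          # assumed DAO over the system of record
        self._ttl = ttl_seconds
        self._cache = {}                 # customer_id -> (expires_at, record)

    def get_customer(self, customer_id):
        entry = self._cache.get(customer_id)
        if entry and entry[0] > time.monotonic():
            return entry[1]              # cache hit: backend is not touched
        record = self._backend.fetch(customer_id)
        self._cache[customer_id] = (time.monotonic() + self._ttl, record)
        return record

    def update_customer(self, customer_id, record):
        self._backend.store(customer_id, record)
        self._cache.pop(customer_id, None)   # invalidate so reads stay consistent
```

Because all reads and updates flow through the one service, cache invalidation on update is straightforward, which is much harder to guarantee when applications hit the databases directly.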
The specific features of the proposed DSA are as follows.
- A centralized database that maintains data common to multiple lines of business. This database will need to be created as discussed above, by analyzing schemas and formats of data required by current and future business needs. This database will become the system of record and the owner for the common data.
- A set of Web services that manage data access/update for all databases. These services will manage all data access/update and will become the de facto data access layer for all applications. The services will provide access to multiple data sources; it will be the responsibility of the consumer business applications to manage heterogeneous transactions appropriately based on the associated business process.
- The Web services will provide data at a granularity that is dictated by the business use cases. The Web services should not be designed based on existing queries or views; this is a critical factor in the successful adoption of the data services. Domain decomposition should be performed to determine the use cases and their specific data needs. One caveat in this regard is that most applications will require some modification to enable them to use the new data services. The specific mechanism to perform this must be a part of the detailed migration strategy that is developed for the adoption of the data services.
- The IVR (interactive voice response) and CSR (customer service representative) applications will interface with dedicated adapter applications that will communicate with the data services. This will ensure that the services become the only gateway to critical data. The service implementations will ensure that data is consistent at all times between the repositories.
- Line of business data continues to reside in an existing database. Depending on the specific line of business and the problems (or lack thereof) associated with the business data, this data will continue to remain in the existing databases. Physically the data may be migrated to one common platform if it makes sense from a strategic vendor management or licensing perspective. However, the important point to note is that data will be segregated on the basis of usage by lines of business with the shared data services layer simply providing a uniform mechanism of accessing the data.
- Note that there is an element of data synchronization present in the current architecture as well. This is due to certain elements that may be related to specific data items required by legacy applications in different lines of business. It may be more difficult for these applications to invoke a Web service for a specific attribute, than it would be to pick up the data through some synchronization routines. However, it needs to be pointed out that the synchronization routines developed here will be a part of the DSA and therefore should be implemented in a consistent manner along with the Web services.
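The use-case-driven granularity described above can be made concrete with a small sketch. The service and store names below are hypothetical; the point is that the operation returns exactly what the "view account summary" use case needs, assembled across stores, rather than mirroring any one schema, query, or view.

```python
class AccountSummaryService:
    """Hypothetical use-case-driven service operation. The contract is
    shaped by the 'view account summary' use case, not by the schemas
    of the underlying customer and billing stores."""

    def __init__(self, customer_store, billing_store):
        self._customers = customer_store   # assumed access object, one source
        self._billing = billing_store      # assumed access object, another source

    def get_account_summary(self, customer_id):
        customer = self._customers.fetch(customer_id)
        balance = self._billing.open_balance(customer_id)
        # Return only the fields the use case needs, in a source-neutral shape.
        return {
            "customerId": customer_id,
            "name": customer["name"],
            "openBalance": balance,
        }
```

A query-based design would instead expose the two stores separately and push the join and the field selection onto every consumer, which is exactly the coupling the DSA is meant to remove.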
Why Web Services?
It is important to point out the rationale for using Web services as the backbone for the data services architecture. There are other options for implementing this DSA such as an ETL tool, an EAI tool, or custom integration. The reasons why these are not suitable for this case are related to the factors mentioned in the Solution Tracks section above. These options would be viable for data integration, although in the case of the ETL tools or the EAI tools, there is the potential for vendor lock-in. Apart from this, it is critical to examine the issue of the heterogeneous consumers of data. It is imperative to provide data through an open channel that does not impose any constraints on the collaborating applications due to technology or client libraries. This is applicable to existing applications and also to future consumers about whom we have limited knowledge at the present time. Web services represent a viable choice to satisfy these requirements given their open standards and the ability of any technology platform - .NET-based, J2EE-based, or mainframe-based - to consume them.
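The open-channel argument can be illustrated with a plain XML exchange. The following minimal sketch (the service namespace and operation name are hypothetical) builds and parses a SOAP 1.1-style request using nothing but a standard XML library; any platform that can emit and read XML over HTTP can participate, with no shared client library required.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
DATA_NS = "urn:example:data-services"   # hypothetical service namespace

def build_request(customer_id):
    """Consumer side: serialize a getCustomer request as a SOAP envelope."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{DATA_NS}}}getCustomer")
    ET.SubElement(op, f"{{{DATA_NS}}}customerId").text = customer_id
    return ET.tostring(env, encoding="unicode")

def parse_request(xml_text):
    """Service side: extract the customer id from the incoming envelope."""
    root = ET.fromstring(xml_text)
    node = root.find(f".//{{{DATA_NS}}}customerId")
    return node.text
```

The wire format, not the implementation technology, is the contract: the producer here could just as well be a .NET, J2EE, or mainframe consumer emitting the same envelope.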
The path to Web services nirvana is not easy and the organization needs to make a firm commitment to this cause and be aware of certain critical factors described below.
- The DSA must be implemented in a top-down manner by focusing on the business processes and the role of data in the process fulfillment. A heavy investment must be made in upfront analysis to understand business processes and data usage.
- Service designs should not be implemented until the interfaces have the full support of all stakeholders; skipping this step leads to an undesirable proliferation of services later.
- An acceptable governance process must be implemented with clear ownership of services. The best approach is to create a dedicated data governance council that facilitates the entire process and works with the business as well as with technology groups on a continuous basis.
- Service granularity must be designed based on the "get only what you need" principle. In the interests of performance, there is limited value to be gained in designing "one size fits all" type services. As mentioned earlier, the interfaces must be designed keeping in mind the business processes associated with various business units. For example, service interfaces should include flexible data paging capability. This ensures that consumers can control the amount of data returned to them and that the workload of the services can be optimized.
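The data paging capability in the last point can be sketched as a contract shape. The function and the `orders_for` backend call below are hypothetical; what matters is that the consumer, not the service, decides how much data comes back on each call.

```python
def get_orders_page(order_store, customer_id, page_number, page_size):
    """Hypothetical paged service operation: the consumer controls the
    page size, so the service's per-request workload stays bounded."""
    if page_number < 1 or page_size < 1:
        raise ValueError("page_number and page_size must be >= 1")
    all_orders = order_store.orders_for(customer_id)   # assumed backend call
    start = (page_number - 1) * page_size
    page = all_orders[start:start + page_size]
    # Echo the paging parameters and the total so the consumer can
    # decide whether to request further pages.
    return {
        "pageNumber": page_number,
        "pageSize": page_size,
        "totalOrders": len(all_orders),
        "orders": page,
    }
```

In a production contract the paging would be pushed into the backend query rather than sliced in memory, but the interface shape, parameters in and totals out, is the point.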
In this article the case for shared data services has been analyzed. Shared data services can significantly increase reuse and developer productivity while providing consistent performance and highly available data across business units. There are technical and organizational challenges with respect to the migration to shared data services. A prudent approach would involve migration in stages, with new applications moving over to the new services before older applications are decoupled from existing data access methods. Readers can use this article to develop strategies for centralizing data management and moving away from piecemeal, "band-aid" type solutions. They can also develop a data architecture that reduces the complexity of managing data residing in multiple data sources and remains flexible toward the needs of heterogeneous data consumers. Finally, the implementation and rollout of a comprehensive data services infrastructure will require significant upfront investment in time and cost; however, the short- and long-term benefits should justify it. In a future article I will demonstrate the applicability of Web services in the development of enterprise-level dashboards for management reporting.