All Integration Methods Are Not Created Equal

Published by Bo McWilliams, Managing Director - General Manager, Managed Hosting
February 22nd, 2016

As a wealth manager, your team needs to be a devoted steward of your clients’ wealth. To do that successfully, your clients need to know:

  • How their investments are performing against their goals;
  • What balance of risk versus potential reward they’re taking on; and
  • How the performance of their investments compares to market benchmarks.

Getting to this information requires, among other things, easy access to data stored in multiple systems—such as Customer Relationship Management (CRM) systems, portfolio management systems, and trading systems—for more informed decision-making. Most importantly, it requires the ability to present consolidated reports to clients based on those data sources. But every minute spent by your team managing data and preparing reports is time that could instead be used focusing on clients’ objectives.

The problem is the technical toolset most wealth managers use to review client data. This article looks at the two options for data integration—and why one method is clearly superior.

ETL vs. API Processes for Data Integration

There are two general methods for data integration. The most prevalent in the financial services industry is the Extract, Transform and Load (ETL) process. In this three-step approach, data is first extracted from a “source” system, often as a Comma Separated Values (CSV) file, which resembles a spreadsheet. This extraction can be performed manually on demand, or as part of an automated routine such as a nightly batch process.

Before that file can be loaded into another database or application for analysis, it needs to be translated into a format that the “target” system will readily accept. This generally requires some manual manipulation of the file. For instance, the source system might store and export account numbers separated by dashes (123-456), whereas the target system might not use dashes (123456). Importing records with dashes into a system that doesn’t use them would cause all kinds of problems, so the dashes need to be removed. Once the files are organized in a way that matches what the target system expects, they can be loaded into it.
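The transform step described above can be sketched in a few lines. This is a minimal illustration using hypothetical sample data: the exported CSV stores account numbers with dashes, while the target system expects them without.

```python
import csv
import io

def transform_accounts(csv_text: str) -> list[dict]:
    """Strip dashes from the account_number column of an exported CSV."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        row["account_number"] = row["account_number"].replace("-", "")
        rows.append(row)
    return rows

# Hypothetical nightly export from the source system.
export = "account_number,balance\n123-456,50000\n789-012,125000\n"
cleaned = transform_accounts(export)
```

In a real ETL pipeline this cleaning is often done by hand in a spreadsheet, which is exactly the manual, error-prone work the rest of this article argues against.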

The alternative approach to data integration is an Application Programming Interface (API). An API is essentially a defined set of requests and acceptable responses, in a specific format, that two systems can exchange. It allows a software developer to write code that automates the same basic procedure as ETL: requesting certain data from another system, transforming it if necessary, and pulling it into the requesting system. But using APIs for data integration offers several advantages over ETL.

The 5 Critical Advantages of APIs

APIs save wealth managers time. Once established, the links between source and target systems persist. Firms no longer need to run manual exports and imports or manipulate data by hand.

APIs enable real-time data analysis. The data extracted through an ETL process is out of date the moment it’s extracted. APIs, in contrast, are real-time links between source and target systems. In today’s volatile financial market, having up-to-date performance data is critical.

APIs require minimal maintenance. System updates occur frequently. When they do, the manual integration process you relied upon can suddenly stop working. Integration code written against a stable API contract, by contrast, can continue to work even when the internals of the source or target systems change.

APIs don’t require a database. Many ETL processes involve consolidating information from multiple systems into a database (or data warehouse), against which reports can be run or analysis performed. For an advisory firm with very large amounts of data, this storage requirement can be extreme. In an API model, however, all the data does not need to be pulled into one system; rather, only the specific data required for an analysis can be pulled in when needed. It’s analogous to looking up the stock performance online for one particular stock over the last three months, versus pulling in the stock performance history of the entire stock market.
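The stock-lookup analogy above can be made concrete. In this hedged sketch, a hypothetical in-memory price history stands in for the source system's data store; the API-style query filters server-side to the last three months of one symbol, so the caller never has to warehouse the full history.

```python
from datetime import date, timedelta

# Hypothetical full price history held by the source system.
HISTORY = {
    date(2015, 1, 2): 100.0,
    date(2015, 11, 2): 104.0,
    date(2016, 1, 4): 108.0,
    date(2016, 2, 1): 110.0,
}

def price_history(since: date) -> dict:
    """Return only observations on or after `since` (the API does the filtering)."""
    return {d: p for d, p in HISTORY.items() if d >= since}

# Ask for roughly the last three months, nothing more.
recent = price_history(date(2016, 2, 22) - timedelta(days=90))
```

The ETL equivalent would be exporting the entire history into a warehouse first and filtering afterward, paying the storage cost up front.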

APIs avoid user error. ETL processes often involve manual tweaking of data sets, either at the translation step or when compiling data for reporting or analysis. That makes them prone to errors, which introduce a host of regulatory compliance concerns on top of serious client satisfaction issues.

CORE: Making Data Valuable

Moving data from one system to another is only a first step. The data itself isn’t what’s valuable—it’s the insights that can be gleaned from analyzing the data and displaying it in the form of Web-based portals and dashboards that can be shared with clients.

That’s where First Rate comes in. Our CORE solution does the heavy lifting of performance measurement and report generation. With CORE, measurement and reporting are fully automated. Data can be pulled in or accessed by an outside application through our open API architecture, while maintaining the integrity and security of your clients’ data. CORE frees you to spend less time compiling and analyzing data and more time engaging your clients and earning their trust.


About the Author: Hugh (Bo) McWilliams is General Manager of First Rate’s Managed Hosting division, and is responsible for First Rate’s data centers, infrastructure, data processing operations and security.
