
Want to Win Clients on the Strength of Client Reporting? Workflow is the Key


1. Introduction

In the Winter 2010 edition of the ISC Chronicle, we examined the ways in which client reporting is evolving with views on “The Shape of Things to Come”. In this article, we take a more detailed look at the challenges of client reporting workflow and seek to de-mystify the perception that it is a complex beast: by adhering to some simple dos and don’ts, implementing a sensible process for client report production does not have to be a daunting exercise.

2. An Inefficient History

The Asset Management Industry has historically struggled to implement strong software-based workflow solutions, being constrained by specific processes, structures and systems.

Workflow has instead been relegated to manual processes, as have responses to issues within the document production lifecycle. Non-automated email, phone and face-to-face contact have been the mechanisms by which many production processes have been assembled and conducted.

These traditional processes are inefficient, error prone and unable to scale without the addition of significant headcount and cost. Historical workflow processes also require substantial involvement from technical areas within Asset Management, hindering operations and report production teams through their over-reliance on IT helpdesk support; an area of substantial resource conflict within organisations.

Historical workflow processes all suffer from similar problems:

  • Workflow processes are manual, and there is a limit to the number of client reports that can realistically be managed at one time before errors enter the process or the urgency attached to the most important reports is diluted. The client reports that need to be produced first take priority, but the effort and coordination required to meet those initial reporting deadlines means little or no focus is placed on subsequent reports until the most urgent ones are out. The end result is an elongated reporting cycle
  • The possibility of user error. As deadlines approach and the need to multi-task increases, it is no surprise that a workflow step is missed, forgotten or performed incorrectly
  • Manual workflow solutions lack transparency. Manually defining and managing who needs to do what, when and how for different report types proves extremely difficult, not helped by the need to resolve issues, identify bottlenecks, handle data deltas and manage user holidays.

It is no surprise that manual workflow is a strong candidate for dramatic improvement or replacement.

Automation and management of client reporting goes far beyond the physical production of a document or report. The production process that is used to produce the final output involves many functions across the business and requires substantive definition, timing and tight control to make it possible for the process to work on a recurrent basis. Understand the end-to-end process, employ a prescriptive workflow and the process becomes robust and scalable!

Workflow should control the sourcing and receipt of data inputs, the construction of each report component and the scheduled delivery of the end-state information or report.
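
As a minimal sketch of what a prescriptive workflow can mean in practice (the step names, owners and deadlines below are illustrative assumptions, not taken from any particular system), the process can be modelled as ordered steps with owners, dependencies and working-day deadlines, so that what is ready to action is always visible:

```python
from dataclasses import dataclass, field

@dataclass
class WorkflowStep:
    name: str                      # e.g. "Load performance data"
    owner: str                     # team or role responsible for the step
    depends_on: list[str] = field(default_factory=list)
    deadline_working_day: int = 0  # working days after month end
    done: bool = False

def next_actionable(steps: list[WorkflowStep]) -> list[WorkflowStep]:
    """Return steps whose dependencies are all complete and which are not yet done."""
    completed = {s.name for s in steps if s.done}
    return [s for s in steps if not s.done and all(d in completed for d in s.depends_on)]

# Illustrative factsheet production workflow (all names and deadlines are assumptions)
steps = [
    WorkflowStep("Receive performance extract", "Data team", deadline_working_day=3),
    WorkflowStep("Collect manager commentary", "Investment writers", deadline_working_day=5),
    WorkflowStep("Build draft factsheet", "Reporting team",
                 depends_on=["Receive performance extract", "Collect manager commentary"],
                 deadline_working_day=6),
    WorkflowStep("Review and sign off", "Fund manager",
                 depends_on=["Build draft factsheet"], deadline_working_day=8),
    WorkflowStep("Distribute to clients", "Reporting team",
                 depends_on=["Review and sign off"], deadline_working_day=10),
]

steps[0].done = True
for step in next_actionable(steps):
    print(f"Ready: {step.name} (owner: {step.owner}, due working day {step.deadline_working_day})")
```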

3. Understanding the Process

Each Output Requires an Input

The “make-up” of typical recurrent documents is substantive and complex. At the output level the document requires production in different formats: PowerPoint, PDF, Excel or raw data files. Each output is the sum of many individual input parts, reaching across multiple systems and databases within an organisation. On the face of it, a “Fund Factsheet” appears to be a relatively simple document in a straightforward design with some simple charts, tables and commentary. However, in reality it is a complex document, containing up to 30 different data types drawn from 5 or more internal or external systems, with the data requiring calculation, delivery and presentation within a 5 to 10 day period from month end.

Each output part within a report can be described as a “component” or building block, and each has an input element: a data source and a data type. Data sources fall into two broad categories: structured and unstructured.

Structured Data is held within core investment management systems or third-party systems and is periodically available on a scheduled basis (e.g. performance history or portfolio analytics). Unstructured Data is held within non-core systems or databases and is available on an ad-hoc basis (e.g. commentary or images).
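
A simple way to picture this (the class and field names below are illustrative assumptions) is to model each report component together with its data source, its data type and whether that source is structured or unstructured:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class SourceKind(Enum):
    STRUCTURED = "structured"      # core or third-party systems, delivered on a schedule
    UNSTRUCTURED = "unstructured"  # ad-hoc inputs such as commentary or images

@dataclass
class DataSource:
    name: str                                  # e.g. "Performance system"
    kind: SourceKind
    available_working_day: Optional[int] = None  # None for ad-hoc sources

@dataclass
class ReportComponent:
    name: str        # e.g. "Cumulative performance table"
    source: DataSource
    data_type: str   # e.g. "performance history", "commentary"

# Two components of an illustrative factsheet
factsheet_components = [
    ReportComponent("Cumulative performance table",
                    DataSource("Performance system", SourceKind.STRUCTURED, 3),
                    "performance history"),
    ReportComponent("Manager commentary",
                    DataSource("Commentary database", SourceKind.UNSTRUCTURED),
                    "commentary"),
]
```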

The management of the data, whether structured or unstructured, has a significant impact on the success or otherwise of the workflow. Each piece of data requires the usual disciplines of data management to be applied to ensure the data is timely, accurate, complete, consistently sourced and that deltas are controlled.

Sourcing Data

When considering using upstream data as an input into a report production process, it is important to assess the availability, state and validity of that data.

While systems contain the data required for inclusion in client reports, a mechanism will not always be available for extracting that data. There are 5 options here (the second of which is sketched after the list):

  • Use an existing system extract that is not specific for client reporting, but contains the information required
  • Use an existing system extract that contains some, but not all of the information required for reporting purposes, and enrich with additional data
  • Write a new system extract to be used specifically for client reporting
  • Manually enter data to data files so it can be included in the reporting process
  • Use an internal data warehouse, into which all upstream system data is fed before being made available for client reporting; extracts can then be defined from the data warehouse to meet the needs of client reporting
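
As a sketch of the second option, assuming a hypothetical CSV system extract keyed on a fund identifier and a separate enrichment file holding the reporting-only fields (both file layouts are invented for the example), enrichment can be a simple keyed join:

```python
import csv

def enrich_extract(extract_path: str, enrichment_path: str, key: str = "fund_id") -> list[dict]:
    """Join an existing system extract with additional reporting fields, keyed on fund_id."""
    with open(enrichment_path, newline="") as f:
        extra = {row[key]: row for row in csv.DictReader(f)}
    enriched = []
    with open(extract_path, newline="") as f:
        for row in csv.DictReader(f):
            row.update(extra.get(row[key], {}))  # add reporting-only fields where available
            enriched.append(row)
    return enriched
```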

Manipulating Data Received

The state of the data refers to both its completeness and its form; is it raw or is it processed? The completeness of system data often varies depending on the type of data. Taking a performance system as an example, while this may contain performance data for 95% of the funds that need to be reported on, there may be specific fund types that use alternate systems to calculate performance. In this instance it is important to establish what the gaps in data are and how they can be plugged. This may mean that, for the same report component, data will need to be sourced from more than one system depending on the fund/portfolio.
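
A sketch of how such gaps might be routed (system names and fund types below are purely illustrative): a primary performance source covers most funds, while specific fund types are mapped to alternate systems for the same report component.

```python
# Illustrative routing of performance data: the main performance system covers most
# funds, but specific fund types fall back to alternate calculation systems.
PRIMARY_SYSTEM = "performance_system"
ALTERNATE_SYSTEMS = {
    "private_equity": "pe_valuation_system",
    "property": "property_system",
}

def performance_source_for(fund_type: str) -> str:
    """Return the system a fund's performance figures should be sourced from."""
    return ALTERNATE_SYSTEMS.get(fund_type, PRIMARY_SYSTEM)

assert performance_source_for("equity") == "performance_system"
assert performance_source_for("property") == "property_system"
```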

It is also important to establish whether the data can be extracted from a system as raw data, or as calculated values. Raw data tends to be at a stock level and after being extracted from the system, passes through a series of rules and calculations to produce the information required for a client report. If data is extracted in this form, then the reporting process will need to have intelligence built in to handle the rules and calculations required.

Processed data has already been passed through the rules and calculations required for client reporting. This could happen as part of the system extract process, or raw data could be extracted and then fed into another tool that applies the intelligence required.

While it may seem to make sense to handle all rules and calculations before data leaves the source system, this is not always possible due to system limitations. Additionally, if data can be extracted at its lowest level (before any rules or calculations are applied), data from the same extract is more flexible and can potentially be used for other functions, whether client reporting or not.
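
To illustrate the kind of intelligence a reporting process needs when data arrives raw (the holdings and field names are invented for the example), the sketch below aggregates stock-level market values into the sector weights a report table would show:

```python
from collections import defaultdict

# Raw stock-level holdings as they might arrive from a source system (illustrative)
holdings = [
    {"stock": "ABC plc", "sector": "Financials", "market_value": 1_200_000},
    {"stock": "DEF plc", "sector": "Industrials", "market_value": 800_000},
    {"stock": "GHI plc", "sector": "Financials", "market_value": 500_000},
]

def sector_weights(rows: list[dict]) -> dict[str, float]:
    """Aggregate stock-level market values into percentage sector weights for a report table."""
    total = sum(r["market_value"] for r in rows)
    by_sector: dict[str, float] = defaultdict(float)
    for r in rows:
        by_sector[r["sector"]] += r["market_value"]
    return {sector: round(100 * mv / total, 1) for sector, mv in by_sector.items()}

print(sector_weights(holdings))  # {'Financials': 68.0, 'Industrials': 32.0}
```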

Ensuring the Validity of Data

The validity of data refers to the association of a specific set of data with a particular reporting period. This ensures that client reports are produced using the correct data and that, for example, December performance returns do not show in a March report. For this to be managed, processes need to be defined that control data delivery and, once data is delivered, make relational associations across all data for a specific reporting period. Further to this, consideration also needs to be given to handling deltas (corrections to data). This could lead to the same extract being run many times for the same reporting period, ensuring that corrections to data in upstream systems filter down to client reports.
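
One sketch of how period validity and deltas could be handled (the structures below are illustrative, not a reference design): key every delivery by reporting period and a delivery sequence, so that a re-run supersedes earlier data for its own period without ever leaking into another.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Delivery:
    period: str     # reporting period, e.g. "2011-03"
    data_type: str  # e.g. "performance"
    sequence: int   # incremented each time the extract is re-run for the period
    payload: dict

def latest_for_period(deliveries: list[Delivery], period: str, data_type: str) -> Optional[Delivery]:
    """Pick the most recent delivery (highest sequence) for a period, ignoring other periods."""
    candidates = [d for d in deliveries if d.period == period and d.data_type == data_type]
    return max(candidates, key=lambda d: d.sequence, default=None)

deliveries = [
    Delivery("2010-12", "performance", 1, {"return": 2.1}),
    Delivery("2011-03", "performance", 1, {"return": 1.4}),
    Delivery("2011-03", "performance", 2, {"return": 1.5}),  # delta: corrected figures
]
print(latest_for_period(deliveries, "2011-03", "performance").payload)  # {'return': 1.5}
```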

Including Commentary

A large part of client reports consists of commentary sections that give information on the performance and composition of the portfolio, and general comments about the relevant market sector the fund invests in. As the number of portfolios included in a production run increases, so does the amount of text that must be written. This needs careful definition and planning. Working practices for this vary between Asset Management organisations: there may be a team of Investment Writers who produce this text, or it may fall to the Fund Managers themselves to provide this input. Regardless of the author, the process needs to define the text components, any possible reuse across different funds, the review and validation steps, and how commentary is included in the report.
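
A minimal sketch of how commentary could be defined as components (the names and fields are illustrative assumptions), capturing the author, which funds reuse the text and whether it has been signed off:

```python
from dataclasses import dataclass, field

@dataclass
class CommentaryComponent:
    name: str                   # e.g. "UK equity market review"
    author: str                 # investment writing team or fund manager
    applies_to: list[str] = field(default_factory=list)  # funds that reuse this text
    approved: bool = False
    text: str = ""

components = [
    CommentaryComponent("UK equity market review", "Investment writing team",
                        applies_to=["UK Growth Fund", "UK Income Fund"]),
    CommentaryComponent("UK Growth Fund performance review", "Fund manager",
                        applies_to=["UK Growth Fund"]),
]

# Simple visibility: which commentary is still awaiting review and sign-off
outstanding = [c.name for c in components if not c.approved]
print("Awaiting sign-off:", outstanding)
```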

Including Static Data and Images

Static data refers to data that is related to a portfolio but is not impacted by the performance or composition of the fund. Typical examples include fund manager name; launch date; benchmark name; sector name; ex-div date; contact details, etc. This data does not change, or changes only very infrequently. Images are similarly static by nature, and a client reporting production run needs to consider all static data: the master source, how change is notified and how updates are made.

Managing the Scheduled Delivery of Data

After all the report data inputs have been defined, they must be given a delivery schedule and mechanism. The delivery schedule will define when the data will be fed into the production process, and the mechanism defines how the data will be delivered. A data delivery schedule needs to consider the following factors:

  • What working day is audited data available?
  • Is all data available at the same time or in stages?
  • What is the working day deadline for individual client reports?

To define a delivery schedule, the client reporting function should maintain a matrix that defines all data sources, what business day those sources will be available, whether all data of the same type will be available at the same time or staggered and whether there are any exceptions. This matrix should then be referenced during the production process to ensure data delivery timeframes are being met.
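
A minimal sketch of such a matrix (the sources, working days and exceptions below are illustrative assumptions), together with a simple check of which sources put a given report's working-day deadline at risk:

```python
# Illustrative data delivery matrix: source -> availability and exceptions
delivery_matrix = {
    "Performance system":  {"available_wd": 3, "staggered": False, "exceptions": []},
    "Portfolio analytics": {"available_wd": 4, "staggered": True,  "exceptions": ["Hedge funds: WD 6"]},
    "Commentary database": {"available_wd": 5, "staggered": False, "exceptions": []},
}

def sources_at_risk(matrix: dict, report_deadline_wd: int) -> list[str]:
    """Flag sources whose data arrives on or after the report's working-day deadline."""
    return [name for name, m in matrix.items() if m["available_wd"] >= report_deadline_wd]

print(sources_at_risk(delivery_matrix, report_deadline_wd=5))  # ['Commentary database']
```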

The delivery mechanism can vary depending on the type of data being handled, but as with the schedule, needs to be defined in advance for the production process to run smoothly.

Distribution

The output must be capable of being distributed intelligently if Asset Managers are to extract value from a proper workflow and document production system. At a simple level, distribution means publishing a document to an internal “library” or directory of completed documents that is searchable and viewable by business users who do not normally have access to the production system. At a more complex level, distribution must enable Asset Managers to automate the sending of documents and information to external users such as print fulfilment houses (with job tickets), clients or intermediaries such as Financial Advisers or Consultants. A client reporting document typically goes to several members within a pension fund, the consultant and, additionally, internal sales/relationship managers, in differing output formats.
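
As an illustrative sketch only (the recipient types, channels and formats are assumptions for the example), a distribution list can be expressed as rules mapping one report to its recipients, delivery channels and output formats:

```python
from dataclasses import dataclass

@dataclass
class DistributionRule:
    recipient: str      # pension fund trustee, consultant, relationship manager, ...
    channel: str        # "extranet", "email", "print_fulfilment", "internal_library"
    output_format: str  # "pdf", "ppt", "xls", "raw_data"

# Illustrative distribution list for one quarterly pension fund report
distribution = [
    DistributionRule("Trustee board",         "print_fulfilment", "pdf"),
    DistributionRule("Investment consultant", "email",            "xls"),
    DistributionRule("Relationship manager",  "internal_library", "ppt"),
    DistributionRule("Client extranet users", "extranet",         "pdf"),
]

for rule in distribution:
    print(f"Send {rule.output_format.upper()} via {rule.channel} to {rule.recipient}")
```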

Additionally, the end ‘clients’ should have the ability to view their reports online through a secure extranet, and to select and compile reports “on demand” ahead of the structured reporting cycle in order to gain insight into their fund performance on a more regular basis. Asset Managers should have information on client usage and requests delivered automatically to their chosen CRM systems in order to manage their customers effectively.

4. Conclusion

The inputs and their sources are understood. These inputs are delivered to match the requirements of each and every report component. Data is controlled robustly. Commentary is included efficiently, and that commentary is accurate and reusable. Delivery is scheduled based on knowledge of the availability of the data used in each component. The delivery matches the SLA. The mechanism used for distribution is appropriate and the end-client can access data in line with their expectations. Does this all sound utopian? Integrating a visible, transparent and constructive workflow across the end-to-end client reporting process will deliver a scalable and robust reporting function that becomes a service differentiator!
