The Case for Fundamental Change
By Doug French, Alex Korogodsky and Rob Frasca
The frameworks for change in financial reporting and management are all predicated on real-time measurement leading to real-time reaction. It’s the only way companies will survive in the 21st century.
It all seems so simple now, but it wasn’t at the time. The reserves absolutely, positively had to be booked by Jan. 23 at 4:30 p.m., and not a minute later. That meant that the calculations had to be done at 4:00 p.m., signed off by the chief actuary so the journal voucher could be prepared in accounting, the reserves posted, the books closed, and the statements printed so that the policyholders could once again rest peacefully, confident that the Secure Feeling Mutual Life Insurance Co. had enjoyed its 83rd consecutive year of growing surplus, increased dividends, and unmitigated policyholder confidence. Twenty-three days and only 18 people to get all those policy files, make all those calculations, and check all those reports—to be sure the statutory reserves that were reported at 4:30 p.m. on Jan. 23 were complete, accurate, and on time. And to think we’d have to go through the whole process again next year!
Fast-forward 23 years, and a lot has changed. That Jan. 23 deadline has become Jan. 5, and the staff of 18—well, it’s now a team of four. The “Mutual” was dropped from the company name after the demutualization, as was the “Secure Feeling” after the first merger, all of which means that statutory reporting plays second fiddle to GAAP and, increasingly, to embedded value. But they all have to be reported on time. Do it all again next year? Try next month.
We’ve become progressively dependent on the technology systems that have made this leap in productivity possible. We’ve accomplished a streamlining of processes thought impossible just a decade ago. But is our success in acceleration enough? In a world where success is predicated on reporting, analyzing, and reacting to the continuous flow of risk and information, will a timely snapshot suffice? How well do two-dimensional systems reflect a three-dimensional world? Can we wait for the one-hour photo when the competition is streaming video?
Financial reporting systems today are a legacy of the financial reporting and financial management needs that gave rise to them. U.S. statutory and early GAAP reporting had historically been linear processes characterized by sequential steps. Assumptions were prescribed and formulas pre-defined, so the valuation process was routine and formulaic. Data were collected, formulas applied, and results assembled in a self-contained system. Challenges were relatively few: Make sure the data are complete and accurate; make sure the calculation engine works as desired; make sure all the data records are processed; make sure the final number gets reported. The design of the systems and controls supporting the processes was a function of these needs.
Today, this same design dominates the structure of financial reporting systems, even as the needs of insurance companies have rapidly evolved. Mainstream financial reporting systems do a great job of handling the mainstream financial reporting functions, but these functions are no longer sufficient in the opening years of the 21st century. The drivers of the need for a fundamental overhaul of financial reporting systems have been building for many years, but developments in the past few years have accelerated the case for change. Companies that ignore these signals do so at the risk of their own survival.
Changing the Rules
The first and perhaps most obvious driver of the need for fundamental changes in financial systems is the rapid change in financial reporting frameworks. Signs of these changes have been building for nearly two decades in the United States. In the mid-1980s, cash-flow testing brought the notion of multiple scenarios into U.S. statutory requirements for the first time.
In the late 1980s, FASB Statement 97 introduced the concept of continuous unlocking into U.S. GAAP—the establishment of actuarial balances that are calculated with constantly changing assumptions. The existing systems structures accommodated these changes, but begrudgingly, as an add-on outside the main linear processing framework.
As financial reporting rules have continued to change, the ability to accommodate has come under increased strain. FASB Statement 133, adopted in the late 1990s, introduced the need to determine the fair value of contract features, using option pricing or stochastic valuation techniques that fell well outside the capabilities of the established financial reporting structure.
Product innovation and the necessity for increasingly complex and sophisticated products in a competitive world are at least partially to blame. The multi-scenario valuation process required under the recently adopted American Institute of Certified Public Accountants’ Statement of Position 03-1 would never have been required were it not for the various accoutrements added to variable annuity products over the past several years. Add to these the anticipated changes in statutory requirements, including the C-3 Phase 2 requirements under the proposed risk-based capital formulas, the proposed stochastic valuation requirements for guaranteed minimum death benefits, and similar market-based guarantees. Once again, the existing financial reporting systems are ill-equipped to cope.
The second impetus for change comes from the increased pressures of oversight, from both a corporate governance perspective and a regulatory perspective. Stung by recent high-profile failures, boards of directors are demanding timely, comprehensive pictures of the financial health of their companies. These include various measures of success and profitability, risks and exposures—elements that were historically outside the mainstream financial reporting egg basket.
Separately, the requirements of Sec. 404 of the Sarbanes-Oxley Act have placed intense scrutiny on the key processes in organizations and the controls in place to protect them. Implementation of Sec. 404 is proving to be an eye-opener for companies that had introduced adjunct processes for activities that fell outside of the main financial reporting channels.
For example, many companies are dismayed to discover how ill-defined and uncontrolled a process as fundamental as the setting of actuarial assumptions is. Assumptions were traditionally set once and locked in forever, so the systems and controls around setting them never fully adapted to a world where, more often than not, assumptions are expected to be reassessed at each reporting cycle, with potentially enormous impact on reported earnings when they’re modified. Compliance with Sec. 404 will require a new assessment and systemization of processes like these, long relegated outside the mainstream, controlled environment even as their importance increased over time.
The third and undoubtedly most critical driver necessitating change in financial systems is the current fundamental shift in how financial management takes place. Previously, the same linear thinking that characterized reserve processes could be used to describe the entire financial management process as well. Financial statements would be prepared on a periodic basis, generally quarterly or annually.
After the books were closed, a team of accountants and actuaries would scurry feverishly to make sure the numbers were right. Only after the financial statements had been prepared would any real analysis take place—a retrospective assessment of what the financial statements for the preceding period were saying.
Financial planning would follow. It consisted of a static projection based on sales and economic assumptions. Sensitivity analyses on the key assumptions would supplement the projections as a sort of rudimentary risk analysis. Business plans would be set by altering sales and expenses, the only levers that could move the results in such a deterministically modeled world. All other elements of financial management—risk analysis, hedging activities, etc.—would take place outside the linear production cycle as one-off projects.
The apparent improvements in financial systems implemented in the past few years have all been directed at enhancing this traditional financial management process. Improved work flows and enhanced computing capacities have cut down the time between when the books close and when the numbers get reported, but the underlying philosophy is unchanged. Numbers that used to get reported annually, then quarterly, are now routinely reported on a monthly basis.
Yet the information derived from the reporting process—the information on which the business decisions are made—is still largely an accumulation of after-the-fact metrics made meaningful only in the context of historical trends observed over multiple reporting periods. The financial system doesn’t support decision making on real-time information, in large part because the financial reporting basis (U.S. GAAP in particular) isn’t meaningful in a real-time context.
But financial management in insurance companies is rapidly moving away from a singular focus on U.S. GAAP financials to real-time, economically based measures of financial performance and risk. Embedded value, fair value, embedded value at risk, and similar measures of value are the new metrics, capturing the enterprise’s success in selling, pricing, and risk management in a single point-in-time measure.
These new measurement systems are supporting the new financial management philosophy whereby value and risk are measured instantaneously relative to the market. Risk mitigation and value-management techniques are predicated on instantaneous action based on instantaneous information.
Point-in-time access to information, potentially down to the policy level, has conceptual meaning in value-based measurement frameworks, unlike in the U.S. GAAP and statutory frameworks that preceded them. A fundamental shift in financial systems is needed to support this fundamental shift in financial management.
Getting Past Illusion
Obviously, the challenges to recreating financial systems to accommodate an entirely revamped process of financial management are myriad. Adoption of a real-time management basis—such as embedded value, fair value, or one of their many hybrids—and institutionalizing it within the corporate culture is the first of many hurdles. This is an absolute prerequisite. Without a real-time measurement basis, real-time financial management is pure illusion. But other steps are needed as well.
First, companies will need to incorporate within their financial systems all elements that are fundamental to the valuation process. This includes previously ad hoc but vital processes such as the setting of assumptions. With financial valuation in a real-time world so heavily dependent on accurate, current assumptions, any financial system developed must ensure that this is an integral, mainstream process and that it’s protected by the appropriate controls.
This is not just a “nice to have”; it’s a regulatory requirement. Sec. 404 requirements mandate the documentation and testing of controls over all key financial reporting processes. No longer can critical but hard-to-standardize processes be left outside the mainstream system simply because they don’t neatly fit in.
Second, stochastic and option pricing valuation techniques must become part of the mainstream reporting and planning processes. This represents a significant change from the world where cash-flow testing and stochastic analysis were performed external to the linear financial reporting flow. U.S. GAAP and statutory financial reporting requirements are increasingly incorporating stochastic valuation methods for policy guarantees, so the need to incorporate multiscenario processing as a standard flow rather than a one-off test is with us already.
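The kind of calculation involved can be illustrated with a toy Monte Carlo sketch of a minimum-benefit guarantee. This is an illustrative example only, not the methodology prescribed by SOP 03-1 or any statutory standard; the fund dynamics, parameters, and discounting are all hypothetical and deliberately simplified.

```python
import math
import random

def value_guarantee(premium=100.0, guarantee=100.0, mu=0.06, sigma=0.18,
                    years=10, rate=0.04, n_scenarios=5000, seed=1):
    """Monte Carlo estimate of a minimum-benefit guarantee: the average
    discounted shortfall of the account below the guaranteed amount at
    the end of the projection, across stochastic equity scenarios."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_scenarios):
        account = premium
        for _ in range(years):
            # apply one lognormal annual equity return
            account *= math.exp(mu - 0.5 * sigma ** 2 + sigma * rng.gauss(0, 1))
        total += max(guarantee - account, 0.0) * math.exp(-rate * years)
    return total / n_scenarios

print(round(value_guarantee(), 2))
```

Even in this stripped-down form, the structural point is visible: the valuation is a loop over thousands of scenarios rather than a single formula applied to a policy record, which is precisely what the linear reporting flow was never built to accommodate.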
Stochastic methods add an entirely new dimension to systems design with an attendant multiplication in complexity. Companies will need to design controls built around the nonlinear processes demanded by stochastic techniques. Techniques for the analysis of results need to be built. Incorporating projections of stochastically based balance sheet items within planning models that are themselves stochastically based poses particular challenges that must be addressed in enhanced system design.
Finally, risk management techniques will need to be supported within an integrated financial systems architecture. Hedge programs, for example, which require constant monitoring and continuous maintenance, share a natural affinity to financial measurement frameworks that provide point-in-time measures of value. This relationship argues for consolidating the risk management and financial reporting processes into a single, integrated platform.
Companies will find increasingly that managing with one eye to U.S. GAAP results and another to economic risk is a losing game and that consistency and reconciliation between management and measurement of the business is the only route to long-term success. System design must accommodate the needs of these two functions—financial reporting and financial risk management—simultaneously in this new environment.
The Customer Service Model
To understand how comprehensive a change is required in the financial systems of life insurance companies, it’s useful to consider a recent parallel revolution in the systems that support customer services.
Not long ago, customer service representatives relied on a combination of disjointed systems to provide information to policyholders. A caller with multiple questions might end up talking to multiple people to get all the information requested. Address information might be obtained from one area, recent deposit activity from a second, and benefit information from a third. Account balances would be available as of recent reporting cycles, but they certainly wouldn’t be current. Exotic benefits, such as guaranteed-minimum death benefits, might actually be calculated by the customer service representative by hand. Want to talk about a second policy with the same company? Call back later.
Today, the customer service model is entirely different. Access to customer information is real time. When a policyholder calls with a question, a customer service representative has available instantly on a computer screen all of the pertinent information about the policyholder and his relationship with the company. The information is integrated and accurate, current and relevant. It enables both the customer and the representative to make immediate, actionable decisions.
Getting to this level of service capability is by no means painless, and certainly not all companies are there yet. It requires an entirely new discipline known as customer relationship management (CRM). Now, it’s the normative model; no company believes it can long survive without customer service capabilities at this level.
A similar overhaul is needed for financial systems. The same instantaneous access to integrated information upon which customer service depends is the lifeblood of the new financial management paradigm. This is the threshold for survival that financial systems will need to provide. Modern financial measurement frameworks, like embedded value, fair value, and the fair value-like precepts of International Financial Reporting Standards provide the conceptual bases that make real-time snapshots of financial performance meaningful and actionable.
Risk measurement and associated risk mitigation techniques all depend upon being able to obtain continuous real-time quantifications of risk exposures to enable continuous, real-time reactions. Just as the customer service representative needs instantaneous access to current policyholder information, so too do the financial managers of the company need instantaneous access to the financial situation of the company to manage it properly.
The Right Tools
The traditional emphasis of corporate information technology (IT) groups on back-end administrative systems, together with the recent focus on CRM solutions, data warehousing, and Web applications, has left actuarial and risk management areas out of the game. The complexity of the actuarial domain is intimidating when it comes to communicating with other functional groups within the company. This is especially true with IT, which is known for its rigid systems development protocols, its demand for clarity around business requirements, its insistence on well-controlled production environments, and its constant resource contention issues.
Add to the mix the traditional tendency of actuarial areas to develop silo-based, self-service approaches to technology utilizing every desktop tool ever invented, and we get a pretty good picture of an actuarial shop of… yesterday. But as modern financial requirements are gaining in complexity, actuaries and risk managers are looking outside their traditional, insulated realms to use the best technologies available on the market. Tools that support the concept of financial transformation are entering the mainstream of actuarial practice.
These are not “bleeding edge” technologies. They have been tested by technology architects and developer groups, they have been accepted by the CIOs and CTOs, and they have become integral components of mainstream IT implementations.
Take, for example, data integration. Data integration is the simple concept that data residing in different back-end systems and used for different purposes should possess an underlying consistency and quality. This is critical if financial information and the resulting metrics derived from it are to be credible enough on a real-time basis to enable instantaneous action without the delay of after-the-fact validation and manual editing.
Extract-transform-load (ETL) tools are revolutionizing this field. They offer powerful yet cost-effective solutions for extracting data from the source, transforming and standardizing it according to pre-defined business rules, and moving it to a single location so it can be used by actuarial and financial models. These tools require little knowledge of programming beyond simple formula-like expressions; they’re completely user-driven, drag-and-drop, yet they’re as capable, scalable, and well designed as the best-in-class tools that IT has been using for years.
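The extract-transform-load pattern itself is simple enough to sketch in a few lines. The sketch below is purely illustrative: the field names, business rules, and in-memory "warehouse" are hypothetical stand-ins for what a commercial ETL tool would configure graphically.

```python
import csv
import io

# Hypothetical raw policy extract from an admin system; field names
# and values are illustrative, not from any particular vendor.
RAW = """policy_id,issue_date,face_amount
P001,1999-03-15, 250000
p002,2001-07-02,100000
P003,,50000
"""

def extract(text):
    """Extract: read the source feed into records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: standardize per simple business rules -- uppercase IDs,
    numeric face amounts, and drop rows missing an issue date."""
    clean = []
    for r in rows:
        if not r["issue_date"].strip():
            continue  # a real pipeline would route this to an exception report
        clean.append({
            "policy_id": r["policy_id"].upper(),
            "issue_date": r["issue_date"].strip(),
            "face_amount": float(r["face_amount"]),
        })
    return clean

def load(rows, target):
    """Load: move the standardized records to the single target location."""
    target.extend(rows)

warehouse = []
load(transform(extract(RAW)), warehouse)
print(len(warehouse))  # 2 rows survive the transform; 1 is rejected
```

The value a real ETL tool adds is not the transformation logic, which is trivial, but the controlled, auditable, repeatable execution of it across many feeds.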
In addition to ETL tools, there are solutions that address the perennial concern of data quality. These tools can periodically profile data, cleanse it instantaneously, and augment it as appropriate. This is essential in verifying that the set of controls embedded in the ETL process really works on a seriatim data feed that can easily exceed 1 million records.
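Profiling and cleansing are distinct passes over the same feed, and the distinction is worth making concrete. In the sketch below, the records, field names, and standardization rules are all hypothetical; a commercial data-quality tool would apply the same two ideas at scale.

```python
from collections import Counter

# Illustrative seriatim records; fields and values are hypothetical.
records = [
    {"policy_id": "P001", "gender": "M", "birth_year": 1960},
    {"policy_id": "P002", "gender": "male", "birth_year": 1955},
    {"policy_id": "P002", "gender": "M", "birth_year": 1955},   # duplicate key
    {"policy_id": "P003", "gender": "F", "birth_year": None},   # missing value
]

def profile(rows):
    """Periodic profiling pass: count rows, duplicate keys, missing values."""
    ids = Counter(r["policy_id"] for r in rows)
    return {
        "rows": len(rows),
        "duplicate_ids": [k for k, n in ids.items() if n > 1],
        "missing_birth_year": sum(1 for r in rows if r["birth_year"] is None),
    }

def cleanse(rows):
    """Inline cleansing pass: standardize codes, de-duplicate on policy_id."""
    codes = {"male": "M", "female": "F"}
    seen, out = set(), []
    for r in rows:
        if r["policy_id"] in seen:
            continue
        seen.add(r["policy_id"])
        r = dict(r, gender=codes.get(r["gender"].lower(), r["gender"].upper()))
        out.append(r)
    return out

print(profile(records))
print(len(cleanse(records)))
```

Profiling tells you what is wrong with the feed; cleansing fixes what the business rules allow you to fix automatically, leaving the rest for exception handling.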
In the area of financial modeling, progress is being made to institutionalize stochastic analysis as a standard actuarial approach. In addition to the exponential increases in sheer computing power that we witness year after year, parallel processing techniques have also developed. Distributed computing, high-performance computing, and grid computing have served to reduce run times, making it practical to run thousands of stochastically generated economic scenarios in less than a day (provided a company has 20 or so computers available to do the work).
Even the seemingly intractable problem of embedding stochastic valuation techniques within projection models that are themselves stochastic (the so-called stochastics-on-stochastics dilemma) is being addressed through clever, computationally efficient techniques.
Rather than continuing to run thousands of scenarios on a single mega PC, companies are introducing flexible computer architectures that take advantage of existing multiprocessor servers, advanced networking, and multithreading techniques that enable simultaneous execution of model steps (imagine being able to execute not one, but two instructions at a time on the same processor!) in order to expedite the computation process. This enables virtually real-time modeling of financial results even when components of the projection model are stochastic in nature.
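The reason scenarios parallelize so well is that each one is independent of the others. The sketch below shows the pattern with a deliberately toy projection model and hypothetical parameters; worker threads stand in for the processes or grid nodes a real CPU-bound production run would use.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run_scenario(seed, years=30):
    """One stochastic interest-rate path: a random-walk rate with a floor
    at zero, discounting a level $100 annual cash flow. This is a toy
    stand-in for a real projection engine."""
    rng = random.Random(seed)
    rate, pv = 0.05, 0.0
    for t in range(1, years + 1):
        rate = max(rate + rng.gauss(0.0, 0.01), 0.0)
        pv += 100.0 / (1.0 + rate) ** t
    return pv

def run_batch(n_scenarios=1000, workers=8):
    """Farm independent scenarios out across workers, as a grid would
    across machines, and average the resulting present values. Threads
    are used here for brevity; CPU-bound Python work would use a
    process pool or a distributed grid instead."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(run_scenario, range(n_scenarios)))
    return sum(results) / len(results)

print(round(run_batch(), 2))
```

Because no scenario depends on another's output, doubling the number of workers roughly halves the wall-clock time, which is exactly the property grid and distributed computing exploit.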
Finally, revolutionary progress is being made in the area of decision support, bringing the information on which business decisions are based to the decision maker on a real-time basis. Performance dashboards (once solely populated by sales statistics, in-force data, and operational metrics) are increasingly incorporating financial metrics as well. Embedded value measures, perfectly suited for the dashboard “snapshot” perspective, are provided in the current state-of-the-art systems. Risk measures, such as embedded value at risk and hedge exposures, are only an iteration away as the next generation of “must haves.”
Financial managers need accurate, reliable, and timely forward-looking information to manage the business. Applications that facilitate decision support have migrated from the spreadsheets to powerful business intelligence (BI) platforms. These tools enable the presentation of information in multidimensional formats so that decision makers can see and act on relationships that aren’t apparent in static, two-dimensional report book pages. They offer an opportunity to visualize the data and to establish proactive alerts that inform the decision maker instantaneously when pre-defined patterns are observed.
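A proactive alert of the kind these platforms offer is, at bottom, a band check against pre-defined limits evaluated each time the feed updates. In the sketch below, the metric names, units, and thresholds are all hypothetical, chosen only to illustrate the mechanism.

```python
# Hypothetical dashboard metrics and alert bands; values are illustrative.
thresholds = {
    "embedded_value_change_pct": (-2.0, 2.0),   # alert outside +/- 2%
    "hedge_exposure_musd": (0.0, 50.0),         # alert above $50 million
}

def check_alerts(metrics, limits):
    """Return the metrics that breach their pre-defined bands, so the
    dashboard can flag them the moment the feed updates."""
    breaches = []
    for name, value in metrics.items():
        lo, hi = limits.get(name, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            breaches.append(name)
    return breaches

latest = {"embedded_value_change_pct": -3.1, "hedge_exposure_musd": 42.0}
print(check_alerts(latest, thresholds))  # flags the embedded-value drop
```

The sophistication of a real BI platform lies in defining the bands, wiring them to live data, and routing the breach to the right decision maker, but the trigger itself is no more than this.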
Drill-down capabilities enable managers to review information in the arrangements and in the level of detail they see as most meaningful, enhancing their ability to understand not only what is happening but why it’s happening as well.
These BI tools, which have existed for years primarily in the IT domain, are now available to nontechnical users. They require no programming, yet are capable of anything from ad hoc querying and multidimensional data exploration to building multifunctional applications such as state-of-the-art dashboards and scorecards.
Evidently, the trend in financial, actuarial, and risk management disciplines is toward deploying tools traditionally associated with heavy-duty enterprise-class IT in the context of traditional business areas. They’re attainable for business groups at insurance companies because of their user-friendly click-and-drag interfaces that make the sophisticated functionality intuitive. And they’re robust enough to fit with corporate IT architecture standards. Rapid adoption in the industry will very likely make these tools prevalent within the next couple of years.
Leadership From the Top
Despite the availability of the tools and the concepts, meaningful overhauling of financial systems to meet the needs of the new millennium remains on the horizon at many companies. Inability to recognize how fundamentally the financial reporting and financial management environments are changing is at the root of much of this inertia. But even where the need for change is recognized, the quandary of how to start remains.
For many organizations, the parallel drawn earlier between customer service and financial reporting systems may provide some guidance. When CRM first began to coalesce out of the distinct applications of contact management, sales force automation, and call center software, vendors were eager to deliver an all-in-one, integrated solution. Those who succeeded understood that there are four critical success factors.
First, leadership from the most senior levels on down must be committed to change. This means more than the perfunctory “executive buy-in.” Leadership must be willing to take a stand and drive financial transformation. The changing regulatory environment provides a convenient platform for propelling such initiatives.
Second, leaders must deal with project visibility. While the transformations needed are potentially massive, the organization must avoid the “runaway project” approach. A highly visible initiative that delivers tangible results quickly—within, say, six to nine months—is a wise choice as a starting point because it helps to establish trust with executives and inspires teams to succeed.
Third, transformation of financial systems requires effective and integrated project management. Project management expertise is inherently embedded in IT groups, but it’s critical that the project be co-managed by the financial and IT organizations as a partnership of equals. Remember, financial transformation is more than just buying software and implementing tools.
Finally, the initiative should be carried out by multidisciplinary and cross-functional teams, or “competency centers,” that are embedded directly in the business group. In achieving fundamental overhauls of customer service systems, companies recognized early on the need for organizational realignments to put computer systems, process design, and business unit professionals into organized teams with identical objectives in the engineering task.
This lesson needs to be learned by actuarial and accounting professionals as well if meaningful progress is to be made in rebuilding financial systems. The same computer systems and process design professionals essential to a successful redesign project need to team with accounting and actuarial professionals in a unified organizational unit. Piecemeal approaches to fundamental change are doomed to fail.
Late to the Party
Financial reporting and financial management in the early 21st century are destined to converge on value-based economic frameworks not contemplated when existing financial systems were designed. These frameworks are all predicated on real-time measurement leading to real-time reaction. System overhauls to accommodate this fundamental shift in the way companies are managed and measured are inevitable.
The ability to recognize financial risks and opportunities instantaneously as they arise will be a prerequisite to compete in an environment where the market leaders have the same immediate access to information. Those who think that the current systems and processes are good enough will be left one step behind in a competitive environment where one step is the difference between success and failure.
The tools to fundamentally transform financial management systems are available today, waiting to be adapted to the specialized needs of actuaries, risk managers, and the other financial architects of insurance companies. Those companies that recognize the inevitability of the need to begin this transformation may shorten the process and gain a decided edge on the competition by embarking on it now. Those that do not run the risk of being late to the party or, more dramatically, not making it to the party at all.
Doug French is global director of and Alex Korogodsky is a manager in Ernst & Young’s insurance and actuarial advisory services practice in New York City. Rob Frasca is a senior manager in Ernst & Young’s insurance and actuarial advisory services practice in Boston.
Contingencies (ISSN 1048-9851) is published by the American Academy of Actuaries, 1100 17th St. NW, 7th floor, Washington, DC 20036. The basic annual subscription rate is included in Academy dues. The nonmember rate is $24. Periodicals postage paid at Washington, DC, and at additional mailing offices. BPA circulation audited.
This article may not be reproduced in whole or in part without written permission of the publisher. Opinions expressed in signed articles are those of the author and do not necessarily reflect official policy of the American Academy of Actuaries.