Once in a while a technology comes along that fundamentally changes how people conduct business. Some, like mobile phones or the Web, were obvious game changers from the start. Others, like in-memory analytics, may be overlooked at first because, as underlying technologies, their potential impact is less evident to businesspeople.
But I believe that emerging in-memory systems will fundamentally change the way finance organizations handle planning, budgeting, forecasting and analysis. They can help finance become more agile, shifting its focus from the traditional “rear-view mirror” orientation of historical accounting data to a forward-looking approach that emphasizes analytics-based contingency planning to identify a company’s best course of action.
How in-memory computing can transform finance
As the name suggests, in-memory analytics systems keep the data to be analyzed primarily in main memory (RAM). That’s in contrast with systems that store some or all of that data on disk. The in-memory approach can be considerably faster because it eliminates physical disk reads when querying the data. The difference between the two approaches is insignificant for small data sets and simple programs. But when companies are working with large data sets and complex algorithms and models, the gap in response times can stretch to several minutes or even hours.
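To make the contrast concrete, here is a minimal sketch using Python’s standard-library sqlite3 module, which can hold an entire database either in a disk file or in RAM via the special `:memory:` name. The table name, columns and generated data are invented for illustration; real in-memory analytics platforms work at far larger scale, but the principle — querying data that never touches the disk — is the same.

```python
import random
import sqlite3

random.seed(0)  # reproducible sample data


def build(conn, rows=10_000):
    """Populate a hypothetical SKU-level sales table."""
    conn.execute("CREATE TABLE sales (sku INTEGER, region TEXT, revenue REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?, ?)",
        [
            (random.randrange(1000), random.choice("NSEW"), random.random() * 100)
            for _ in range(rows)
        ],
    )
    conn.commit()


def revenue_by_region(conn):
    """An analytical query; against ':memory:' it reads no disk blocks."""
    return dict(
        conn.execute("SELECT region, SUM(revenue) FROM sales GROUP BY region")
    )


# The whole database lives in RAM; replacing ":memory:" with a file path
# would give the disk-based equivalent with identical SQL.
mem = sqlite3.connect(":memory:")
build(mem)
totals = revenue_by_region(mem)
```

At toy scale the two variants perform alike, which mirrors the article’s point: the payoff of in-memory storage appears only once data volumes and query complexity grow.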
When using disk-based systems, long response times effectively force companies to choose one of three basic approaches. They can forget about analytics and just use gut feeling. They can work interactively with small data sets, less complex models or both. Or if they must work at a granular level of detail that involves large data sets and complex models, they have to tolerate an analysis and decision-making process that stretches out over hours, days or even weeks.
Typically, companies do some combination of the first two. In a weekly or monthly performance review session, if executives and managers are unable to weigh the implications of various courses of action in response to some variance, the group either has to guess at the best approach (inviting self-serving assessments) or delay deciding what to do. None of this promotes agility or fact-based decision-making. Worse, it doesn’t foster an organization with an action-oriented mindset.
In-memory computing can provide an alternative to these inferior options. Users can work interactively with much larger data sets in more complex models. Say, for example, a finance organization needs to work with detailed projections for unit sales and revenues at the stock-keeping unit (SKU) level. In-memory analytics systems allow it to interactively explore the financial and volume implications of price changes in specific geographic markets or channels. Those working on the projections are able to determine almost instantly the impact on profitability, inventories and cash flow of specific actions. An in-memory analytics system can run several different sets of assumptions or scenarios in a matter of minutes, compare results and enable those involved to knowledgeably discuss the best approach.
Another example is the monthly budget review. After drilling down to identify the reasons for an issue or unexpected opportunity, users can immediately explore the detailed impacts of different options. If a company expects certain higher costs to persist, it can use detailed models to quickly determine the best option to adapt to the new situation.
Survey shows need for agile planning technology
Unfortunately, because of technology limitations, few finance organizations possess this ability, not to mention the business agility it can provide. Our recent benchmark research finds that just 13% of midsize or larger companies (those with 100 or more employees) can explore every relevant scenario and examine the implications of each to any degree of detail across the entire company. Another 37% can assess the full implications for a limited number of scenarios, while the remaining half cannot gauge the impact of possible actions to any significant degree.
Narrowing scenario planning down to a few preset alternatives is better than nothing, but it’s far from ideal. Planners who can do ad hoc what-if analysis, rather than being confined to preordained scenarios, are freer to think outside the box. Moreover, because the analysis can be done collaboratively in real time, more people can participate in the process, making it easier to surface potential opportunities, threats and a wider range of responses. Taking a comprehensive, enterprise-wide approach is important because decisions that are optimized at a local or business-unit level frequently turn out to be much less optimal for the corporation as a whole.
Transforming the finance organization so that it becomes more of a strategic asset is once again on the minds of senior executives. Technology in general -- and in-memory analytics in particular -- will be an important element in this evolution. It can remove many of the constraints that currently limit finance organizations’ capabilities. It can provide the catalyst to shift the focus of analysis and communications within a company away from backward-looking accounting data toward the implications of potential actions.
At the moment, in-memory computing appears to be of interest mostly to the IT community. Finance organizations should learn more about it. Those that understand the full potential of in-memory computing will have a leg up in becoming more strategic and, ultimately, improving their company’s performance.
ABOUT THE AUTHOR
Robert Kugel, CFA, is senior vice president and research director for CFO and business research at Ventana Research, based in San Ramon, Calif.