EPM Implementations Explained: Part 1 – The Data Model
May 22nd, 2020

Welcome to the first in a series of articles that describe how we at inlumi approach projects and why this approach is important to us.

A source of continuing frustration to me is seeing finance professionals wrestling with data and systems, trying to deliver some insight that may help the business. Most finance professionals I have spoken to didn’t sign up to a life of data manipulation and reconciliation. They signed up to help drive the business forward by providing valuable and timely insight, enabling decisions.

I was the same. My first roles in finance involved me trying to make my job easier and more rewarding by doing things differently. I did whatever it took to eliminate the monotonous elements of my roles, enabling me to add more value and maybe even leave the office at a reasonable time! This is still something that drives me today, except now I am helping others to achieve these goals.

So how do we at inlumi do this?

If we start with the objective of enabling finance professionals to add value, then a modern, well designed EPM solution is an essential tool. In this and subsequent articles we will step through a few of the critical components for the design of an EPM solution that can deliver significant added value.

In this article we assume that the project has been mobilised and consultants are on-site (or remotely connected; either approach works for us).

When starting a new financial systems implementation, we always stress to the client that the definition of the data model is critical to the success of a project. If you get your data model right, so many other pieces of the puzzle become clearer and easier. It will therefore be no surprise to learn that one of the first elements we will kick off in any project is the design of the to-be data model.

So, what do we mean when we talk about the data model?

A simple Google search will give you this definition:

‘A data model is an abstract model that organises elements of data and standardises how they relate to one another’.

Let’s put that in context of the implementation of an EPM solution design.

  1. Abstract – the data model will initially be designed as a concept that can be tested against the client’s requirements. Only once the concept has been tested and approved can the abstract become something more real.
  2. Organises elements of data – The data model defines the information that needs to be stored in a structured manner; for example, customers, orders or products. To do this effectively, the data model needs to be organised into a structure that makes producing useful, relevant information to create insight as seamless as possible. What we are looking to avoid when creating our data model abstract is the duplication of data at different levels of granularity. We want to achieve a simple, easy-to-understand construct that lends itself to straightforward interrogation and analysis.
  3. Standardises how they relate to one another – The abstract needs to specify how the different types of data relate to each other, along with their naming and data conventions. For example, a trial balance pulled from a source GL will contain revenue, expense, asset, liability and equity data. The abstract will need to define the sign convention to be applied so that these different data types can be combined using a standard methodology. Alternatively, your data model may include ‘volumes of units sold’ and the ‘selling price per unit’, together with a standard definition calculating ‘revenue’ as volume * selling price.
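To make point 3 concrete, here is a minimal sketch in Python of the kind of conventions a data model abstract might standardise. The account types, the sign map and the function names are illustrative assumptions, not a prescribed design; your own model will define its own conventions.

```python
# Illustrative sign and calculation conventions for a data model abstract.
# Assumption: the source GL exports credit balances as negative, and the
# model's convention is to store revenue, liabilities and equity as positive.
SIGN_FLIP = {"Revenue", "Liabilities", "Equity"}

def normalise(account_type: str, gl_amount: float) -> float:
    """Apply the model's standard sign convention to a raw GL amount."""
    return -gl_amount if account_type in SIGN_FLIP else gl_amount

def revenue(units_sold: float, price_per_unit: float) -> float:
    """Standard derived measure: revenue = volume * selling price."""
    return units_sold * price_per_unit

print(normalise("Revenue", -1200.0))  # 1200.0
print(revenue(300, 4.0))              # 1200.0
```

Because every data type passes through the same convention, figures from different sources can be combined and compared without per-source adjustments.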

We describe the above as structuring your data in a simple model that enables you to use the data to drive insight into business performance, whilst enabling simple, consistent reporting to your stakeholders.

How then do you start to define what this simple model looks like?

We recommend a few simple steps which also start to inform the beginnings of a data integration strategy:

  1. Understand your reporting requirements. This could be your external financial reporting requirements, your internal business plan, budget or forecast. Whichever requirement you are looking to meet, you need to consider what you need to report, what you need to enable decision-making, and what data you need to make this happen. Answers to these questions enable you to start to design a conceptual model based on your required reporting views.
  2. Understand your data sources. Once you understand what data you need to enable your reporting and decision-making objectives, you need to understand the sources of this data. For a financial consolidation solution, this is likely to be multiple general ledger solutions supplemented by additional detail held perhaps in sub-ledgers, or maybe held in other financial systems. For a planning or forecasting solution, your data sources may be more operationally focused. Understanding the sources enables you to start designing an efficient data capture process.
  3. Plan your data integration strategy. Once all data sources have been identified, an assessment should be made of how easy it is to capture the data and which is the primary source for the data. For example, in a financial consolidation solution, all balance sheet accounts need to be analysed by movement to generate an accurate cash flow. However, detailed balance sheet movements might be difficult to source, whereas the closing balance for all balance sheet accounts will be immediately available in a general ledger trial balance report. In this case, the primary source for balance sheet information would be the GL trial balance report with detailed balance sheet movements then becoming a supplementary input.
  4. Pull it together. The final step, once the data and data sources are understood, is to pull this together into the abstract that defines a structure into which data can be loaded (i.e. the data model) in a simple, controlled and auditable manner (i.e. the data integration strategy). This enables the data to be viewed and analysed in an insightful manner or, in accounting speak, in a chart of accounts with appropriate dimensionality to deliver the reporting requirements.
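The four steps above can be sketched as a tiny conceptual model. This is a hypothetical illustration only: the dimension names, the sample data and the primary-source mapping are assumptions chosen to show the shape of the result, not a recommended chart of accounts.

```python
# A minimal sketch of the "pull it together" step: data points stored at an
# intersection of dimensions, plus a primary-source rule per data type.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataPoint:
    """One cell of the model: a value at an intersection of dimensions."""
    entity: str     # reporting unit, e.g. a legal entity
    account: str    # chart-of-accounts member
    period: str     # e.g. "2020-05"
    movement: str   # balance sheet movement type, e.g. "Closing"
    value: float

# Step 3: the integration strategy records the primary source per data type.
PRIMARY_SOURCE = {
    "closing_balance": "GL trial balance report",
    "bs_movement": "supplementary input",
}

points = [
    DataPoint("UK01", "Cash", "2020-05", "Closing", 5000.0),
    DataPoint("UK01", "Cash", "2020-05", "Opening", 4200.0),
]

# Simple interrogation: a net movement derived directly from the model.
closing = next(p.value for p in points if p.movement == "Closing")
opening = next(p.value for p in points if p.movement == "Opening")
print(closing - opening)  # 800.0
```

The point of the sketch is that once the dimensionality is agreed, analysis becomes a simple query against one structure rather than a reconciliation across many spreadsheets.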

Once defined, it is important to understand that this data model is a living, breathing construct; it will need to flex as your business changes or as the external environment changes.

If you think your data model is no longer supporting your business as well as it should, or if you are struggling to capture the right data in an efficient manner, please contact us. We would be happy to help.

Part 2 of this series will look at the functionality required in EPM solutions and the approach we take to building this functionality.


About the author

Paul Wilcock
Principal Consultant in Financial Close at inlumi
paul.wilcock@inlumi.com

Paul’s background within the finance function has given him a depth of experience in all things “consolidation”. He has drawn on this experience to take leading roles in the design and build of EPM solutions for clients over the last 15 years. Paul understands the challenges facing finance today and is passionate about finding solutions to these challenges.