Strategy

The hidden layer between your data and your decisions.

March 29, 2026 · By AshPoint Solutions · 10 min read

Your EPM platform knows your numbers. It can tell you that revenue came in at $18.2M against a plan of $20M, show you the variance by product line, by region, by customer segment, and do it quickly and accurately. Without deliberate design, it won’t tell you which strategic assumption broke, why it broke, or what response makes sense now. The system can report the miss. Inferring the decision logic is a different problem, and one the system cannot solve on its own.

That gap between what the platform reports and what leadership needs to decide is one reason planning investments disappoint. McKinsey has argued that even high-performing companies leave roughly 30% of a strategy’s potential unrealised because of operating-model shortcomings. [1] In planning terms, one version of that problem is failing to build the layer that connects financial outputs to operational signals and strategic decisions. The platform is rarely the issue. The connective tissue was never designed in.

In the previous article, I argued that scenario planning often defaults to sensitivity ranges because teams treat it as a technology problem instead of a design problem. This article looks at what the missing design layer actually consists of.

A composite example

Let me walk through what I mean using a scenario drawn from patterns I’ve seen across multiple engagements. The details are composite, but the dynamics are real.

Take Apex Industrial, a mid-market manufacturer that launched a two-year growth program. For the current fiscal year, the operating plan calls for 15% year-over-year revenue growth as the first major milestone. The board agreed that target in a strategy session based on market opportunity, investor expectations, and management’s view of what the business can realistically deliver.

In a typical EPM implementation, that 15% target gets handed to the finance team. They build a bottom-up plan: sales volumes by product line, pricing assumptions, new customer acquisition rates, expansion within existing accounts. The numbers roll up to something close to 15%, or they don’t, and a round of negotiation happens until the gap closes on paper.

The plan goes into the system. The implementation team builds reporting around it. Variance analysis compares actuals to plan every month. Six months into the fiscal year, with revenue tracking at 9% against a 15% full-year plan, the conversation is about the gap.

That is because nobody translated the board objective into named initiatives, expected contribution ranges, early warning indicators, and decision triggers. That layer was never built.

Building the connective tissue

Here is how that layer looks when it is built deliberately.

Apex’s 15% growth objective is not going to come from a single source. When you talk to leadership, which is the conversation most implementations skip, you learn that the growth is expected to come from three initiatives, each with a different risk profile and a different set of drivers.

The first is geographic expansion into a new region, where they have been piloting for six months and plan to scale. The second is a pricing optimisation effort across their core product lines, where they believe they have been underpricing relative to the value they deliver. The third is deepening penetration within their top twenty existing accounts through cross-selling and expanded service agreements.

Each initiative carries a different share of the incremental growth required to hit plan. Leadership expects roughly 40% from geographic expansion, 25% from pricing, and 35% from account deepening. Those proportions are working assumptions, not forecasts. Their value is in making the strategic bet explicit so it can be tested, and in reflecting priority and resource allocation rather than just financial arithmetic.
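To make the arithmetic concrete, here is a quick sketch applying those shares to the $60M revenue base used later in this example. The variable names are mine; the figures are the working assumptions above, not a forecast.

```python
# Illustrative only: the $60M base and the contribution shares come from
# the article's composite example; variable names are my own.
base_revenue = 60_000_000        # current revenue base
target_growth = 0.15             # board-level growth objective

# Working-assumption share of incremental growth per initiative
shares = {
    "geographic_expansion": 0.40,
    "pricing_optimisation": 0.25,
    "account_deepening": 0.35,
}

incremental_revenue = base_revenue * target_growth   # $9M of new revenue to find
contributions = {
    name: round(incremental_revenue * share)
    for name, share in shares.items()
}
# geographic_expansion -> $3.6M, pricing -> $2.25M, account_deepening -> $3.15M
```

Writing it out this way makes the size of each bet visible: a 40% share of a 15% objective is a $3.6M commitment with a name attached, not a line in a spreadsheet.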

At that point, the design should be expressible as a simple chain: objective, initiative, expected contribution, leading indicators, trigger thresholds, response options. For each initiative, name the owner, state the expected contribution, define three to five leading indicators, set thresholds, and decide in advance what escalation path follows if the signal turns.
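As a sketch of what that chain can look like once written down, here is one of Apex’s initiatives expressed as a small Python structure. The field names, the owner, and the threshold values are illustrative assumptions of mine, not a product schema; the point is that every element of the chain has somewhere to live.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str
    threshold: float   # level at which the signal "turns"
    direction: str     # "min" = breach if below, "max" = breach if above

@dataclass
class Initiative:
    name: str
    owner: str
    expected_contribution: float   # share of the incremental growth target
    indicators: list[Indicator] = field(default_factory=list)
    response_options: list[str] = field(default_factory=list)

# Illustrative entry for one of Apex's three initiatives
pricing = Initiative(
    name="Pricing optimisation",
    owner="VP Commercial",          # hypothetical owner
    expected_contribution=0.25,
    indicators=[
        Indicator("win_rate", threshold=0.30, direction="min"),
        Indicator("discount_override_rate", threshold=0.20, direction="max"),
    ],
    response_options=[
        "Hold pricing where win rates held",
        "Revert selectively in problem segments",
        "Revisit the pricing assumptions",
    ],
)
```

Whether this lives in the EPM platform, a governance document, or a spreadsheet matters less than the fact that it exists and has an owner.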

From initiatives to indicators

For each initiative, identify the indicators that tell you early whether it is working. Revenue is still an outcome measure; the more useful design question is which upstream signals move early enough to guide action. Effective FP&A design links lagging indicators, which measure what has already happened, to leading indicators, which predict what is about to happen, in a cause-and-effect chain.

For geographic expansion: pipeline development in the new region, conversion rates on early-stage opportunities, time-to-close compared to mature markets, and customer acquisition cost relative to the business case that justified the expansion. Stalled pipeline development in month three signals that the 40% growth contribution from this initiative is at risk. That signal arrives weeks before the miss surfaces in the quarterly numbers.

For pricing optimisation: win rate changes after price adjustments, customer churn in the repriced segments, average deal size trends, and the ratio of deals closed at the new price versus deals that required discounting back to the old rate. A rising discount rate three months into the pricing initiative tells you something specific: the market may not support the new pricing in certain segments, and leadership needs to decide whether to hold firm, adjust selectively, or revisit the assumptions.

For account deepening: cross-sell attachment rates, service agreement renewals and expansions, share-of-wallet estimates where available, and relationship health signals that indicate whether the account base supports expansion. Strong attachment rates alongside flat service renewals point to a different problem than both metrics declining, and the response options differ in each case.

The logic behind these indicators does not require sophisticated technology. It requires someone to ask, for each initiative: what would we need to see in the first 90 days to know it is working, what would tell us it is not, who owns that signal, and what review or escalation should follow if it turns. At scale, surfacing those indicators reliably usually does require technical investment. The design conversation has to happen first.
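To show how little technology this actually demands, here is a minimal sketch of the threshold check. The indicator names and values are invented for illustration; the hard part is agreeing on them, not coding them.

```python
# Hypothetical 90-day thresholds; names and values are illustrative only.
thresholds = {
    "geo_pipeline_growth":    {"min": 0.10},   # breach if below 10%
    "pricing_win_rate":       {"min": 0.30},
    "discount_override_rate": {"max": 0.20},   # breach if above 20%
}

def breached(actuals: dict) -> list:
    """Return the indicators whose actual value crossed its agreed threshold."""
    alerts = []
    for name, rule in thresholds.items():
        value = actuals.get(name)
        if value is None:
            continue   # no reading yet for this indicator
        if "min" in rule and value < rule["min"]:
            alerts.append(name)
        if "max" in rule and value > rule["max"]:
            alerts.append(name)
    return alerts

# Month-six readings consistent with the Apex narrative (illustrative values)
print(breached({
    "geo_pipeline_growth": 0.14,      # ahead of target: no alert
    "pricing_win_rate": 0.22,         # below threshold: alert
    "discount_override_rate": 0.40,   # above threshold: alert
}))
# -> ['pricing_win_rate', 'discount_override_rate']
```

Everything of value here sits in the `thresholds` dictionary, which is exactly the design conversation the paragraph above describes.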

What changes when the layer exists

Six months in, Apex is running at 9% year-over-year growth against a 15% full-year plan. On a $60M revenue base, that six-point gap implies roughly $3.6M versus plan if the run rate persists. That is enough to miss the investor target the board committed to at the start of the year. Without the decision layer, the finance team reports the variance, the executive team asks what happened, and the conversation defaults to generic explanations: sales are soft, the market is tougher than expected, we need to push harder in Q3.
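The dollar figure is straightforward run-rate arithmetic; a quick check using the numbers above (variable names are mine):

```python
base_revenue = 60_000_000   # $60M revenue base
planned_growth = 0.15
actual_growth = 0.09        # run rate at month six

# Annualised shortfall if the six-point gap in growth rate persists
gap = (planned_growth - actual_growth) * base_revenue
print(f"${gap / 1e6:.1f}M versus plan")   # -> $3.6M versus plan
```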

With the decision layer, the conversation is specific. Geographic expansion is performing well: pipeline and conversion rates are both ahead of target. Account deepening is on track but slower than planned, contributing its expected share with a longer ramp. Pricing optimisation is the problem. Win rates dropped by 12 percentage points in two product lines after the adjustment, and the discount override rate is running at 40%, which means the field is effectively reverting to old pricing to close deals.

Now leadership has something to work with. They can see that the 6% growth gap traces primarily to one initiative. They can evaluate specific responses: hold pricing where win rates held and revert in the two problem segments, invest more in geographic expansion since it is outperforming, accelerate account deepening by pulling forward a planned service offering. Each option has implications they can model against the scenarios built into the system.

Leadership has moved from explaining a variance to evaluating a response. McKinsey’s KPI architecture research describes this shift as value-based steering: using operational signals to guide decisions rather than simply reporting outcomes. [2]

This is implementation work, not a product feature

That missing layer is not a capability gap you can close by switching platforms. Leading EPM providers increasingly market AI-assisted analysis, predictive analytics, and decision-support capabilities, and Hackett’s 2024 research describes AI integration as a clear focus of the market. [3] That does not remove the harder design task. Someone still has to define which strategic initiatives matter, how much each is expected to contribute, and which operational signals provide an early read on whether each initiative is working.

That missing layer is one reason so many planning conversations stall at variance explanation. As Graham Kenny argues in Harvard Business Review, there is often an alarming disconnect between strategic plans and budgets. [4] In EPM terms, the problem is not just whether the numbers reconcile. It is whether the model captures the logic by which strategy is expected to produce them.

That is also why this work remains human-led. McKinsey’s February 2026 research on KPI architecture argues that generative AI can help with early hypotheses and first-level driver structures, but translating strategy into a coherent value architecture still requires economic judgement and business intuition, with finance and business leadership retaining ownership of the KPI architecture. [2] The system can support that thinking. The strategy still has to come from people who understand the business.

This layer also requires active ownership. Strategic priorities shift, initiatives are added or dropped, and the indicators that mattered last year may not be the ones that matter now. Without a named owner, a review cadence, and escalation rules, the model drifts back toward backward-looking reporting. [2]

Platform configuration matters, and getting it right is foundational. But configuration alone rarely produces the outcome most finance teams are actually paying for. They are paying for better decisions, and that requires the initiative-to-decision layer to be designed explicitly.

Where this leads

Everything described here is achievable without AI. It is a design discipline grounded in strategic understanding, domain expertise, and the right conversations during implementation. AI is starting to become useful in specific parts of the process: accelerating early hypothesis generation and helping structure driver trees. [2] Modern EPM platforms are also improving in predictive analytics, real-time decision making, and decision support. [3] Those benefits depend on clear metrics, explicit value drivers, and governance strong enough to keep the model current. Applied to a weak planning design, AI produces faster noise; applied to a sound one, it can materially improve the speed and quality of decision support.

The next article looks at where AI genuinely helps in this chain, and where it mostly accelerates the wrong work.


Robyn Halbot, MBA, BSc, PMI-ACP is Principal at AshPoint Solutions, with fifteen years of EPM implementation experience. She previously co-founded an ML-based forecasting startup and is currently building AI applications that connect financial planning to strategic objectives.

Whether you’re evaluating EPM platforms, rethinking how your current build supports decision-making, or curious about where AI fits in your planning process, I’m always happy to talk through it. Let’s connect.


References

[1] McKinsey & Company, “How the right operating model can close your performance gap,” July 30, 2025. https://www.mckinsey.com/capabilities/people-and-organizational-performance/our-insights/how-the-right-operating-model-can-close-your-performance-gap

[2] McKinsey & Company, “Selecting P&L-linked KPIs for industrial transformations,” February 27, 2026. https://www.mckinsey.com/capabilities/strategy-and-corporate-finance/our-insights/selecting-p-and-l-linked-kpis-for-industrial-transformations

[3] The Hackett Group, “The Hackett Group: New Enterprise Performance Management Software Provider Study Finds Modern Cloud-Based Software Delivers Far Greater Value Realization Than Legacy Systems,” February 28, 2024. https://www.thehackettgroup.com/the-hackett-group-new-enterprise-performance-management-software-provider-study-finds-modern-cloud-based-software-delivers-far-greater-value-realization-than-legacy-systems/

[4] Graham Kenny, “How to Sync Your Budget with a Strategic Plan,” Harvard Business Review, August 18, 2025. https://hbr.org/2025/08/how-to-sync-your-budget-with-a-strategic-plan
