Executive management often harbors misconceptions about effective cost optimization, a misunderstanding rooted in the conventional benchmarking process, which typically unfolds as follows:
- Business leaders engage a generic management consulting firm that often lacks practical implementation experience in benchmarking. Unfortunately, even specialized advisory firms tend to place benchmarking under the research function rather than advisory, which prevents meaningful context from being applied to the benchmarks.
- They enlist a traditional benchmarker that produces research- and survey-based data. This essentially provides access to a data catalog crafted by spreadsheet analysts far removed from transactional realities. The data comprises averages of averages (see the sketch below) and is unlikely to reflect any unique IT environment.
- The organization conducts a superficial rate exercise, focusing narrowly on unit rates while overlooking the broader factors that determine total cost-reduction potential.
- Executive management persists in treating benchmarking as a swift and tactical procurement exercise rather than a strategic endeavor guiding critical decisions such as employee retention, outsourcing strategy, and contract renewals.
This is why traditional benchmarking fails as a process.
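To see why "averages of averages" misleads, here is a minimal Python sketch with invented figures. When each surveyed company's mean rate counts equally, regardless of the size of its estate, the resulting "market rate" can land far from what the market actually pays:

```python
# Invented survey figures: (company, servers, mean monthly rate per server, USD)
surveys = [
    ("A", 200,  95.0),   # small estate, high unit rate
    ("B", 5000, 62.0),   # large estate, low unit rate
    ("C", 300,  90.0),
]

# Average of averages: every company counts equally, regardless of size.
avg_of_avgs = sum(rate for _, _, rate in surveys) / len(surveys)

# Volume-weighted average: what the surveyed market actually pays per server.
total_spend   = sum(n * rate for _, n, rate in surveys)
total_servers = sum(n for _, n, _ in surveys)
weighted_avg  = total_spend / total_servers

print(f"average of averages: ${avg_of_avgs:.2f}")  # $82.33
print(f"weighted average:    ${weighted_avg:.2f}")  # $64.73
```

A benchmark built on the first number would overstate the achievable market rate by more than 25 percent in this toy example.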
Data lacks meaning without proper analysis and context. Even with benchmark results in hand, the organization's underlying inefficiencies remain unclear. Moreover, executive committees may incorrectly perceive IT costs as excessive, leading to misguided efforts to cut costs that are actually appropriate for the environment. In some cases, maintaining the status quo would have been preferable to uninformed action.
Traditional benchmarking providers, typically situated within research entities, treat the process as a productized data catalog, aiming for frequent circulation and upselling.
With more salespeople than actual IT advisors on staff, these providers extrapolate data without strategy. Their researchers, who lack experience at the negotiation table and in hands-on cost assessment, rely on survey-based numbers rather than field research. At best, data is funneled over from the advisory side, risking loss in translation.
Is the data archived and normalized? Is it an apples-to-apples comparison? Likely not. True optimization potential remains unattainable when basic spreadsheet data is applied to organizational chaos.
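As an illustration of what normalization involves, the following sketch is purely hypothetical: the attributes and adjustment factors are invented, but they show why two identical raw rates are not an apples-to-apples pair until environmental differences are factored out.

```python
# Minimal sketch of normalizing rates before comparison -- all fields and
# adjustment factors here are hypothetical, for illustration only.

def normalize_rate(raw_rate: float, profile: dict) -> float:
    """Adjust a raw unit rate so environments can be compared
    apples-to-apples on a common baseline."""
    factor = 1.0
    if profile.get("regulated"):               # compliance overhead
        factor *= 0.90
    if profile.get("legacy_share", 0) > 0.5:   # mostly legacy systems
        factor *= 0.85
    if profile.get("region") == "high_cost":   # labor-market premium
        factor *= 0.80
    return raw_rate * factor

# Two environments with identical raw rates are not identical peers.
peer     = normalize_rate(100.0, {"regulated": False, "legacy_share": 0.2})
your_org = normalize_rate(100.0, {"regulated": True,  "legacy_share": 0.7,
                                  "region": "high_cost"})
print(peer, your_org)  # 100.0 vs. 61.2 -- the raw comparison misleads
```

A catalog benchmarker comparing the two raw rates directly would conclude the second organization overpays, when the normalized view says otherwise.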
Further complicating matters is the hype surrounding automation. Despite significant investment in global robotic automation, a market estimated to grow at a 60% CAGR to reach $6.5 billion by 2020, its efficacy is questionable given a widespread lack of understanding of how IT organizations actually operate.
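For a sense of scale, a 60% CAGR compounds quickly. A short sketch (the five-year horizon is an assumption for illustration, not from the source) backs out the base-year market size that the $6.5 billion 2020 figure would imply:

```python
def implied_base(end_value: float, cagr: float, years: int) -> float:
    """Back out the starting value implied by an ending value and a CAGR."""
    return end_value / (1 + cagr) ** years

# $6.5B by 2020 at a 60% CAGR over an assumed five-year horizon:
print(round(implied_base(6.5, 0.60, 5), 2))  # ~0.62, i.e. a $0.62B market in 2015
```

In other words, the forecast implies the market multiplying roughly tenfold in five years, which is exactly the kind of hype-driven figure that deserves scrutiny.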
The path to proper cost reduction requires a more holistic benchmarking methodology, one based on data derived from actual engagements and attentive to the organization's broader picture.