Last year the market saw $3.8 trillion spent on mergers and acquisitions (M&A), the highest annual total on record.
Not coincidentally, insurance agency M&A also hit a record high, with property and casualty agencies the largest category of sellers.
Low interest rates, resilient stock prices and solid employment all played a part, and that momentum is widely expected to continue as carriers look to fortify their competitive positions in the current market, expand their customer base and geographic reach, and enter new markets.
Mergers and acquisitions provide businesses with the opportunity to generate new value, but it often takes longer than expected to eliminate transitional service agreement costs, resolve clashes of culture, align priorities and reap the rewards of an integration.
Large-scale systems rationalization can be a daunting challenge, and it is certainly not for the squeamish: according to Harvard Business Review, M&A failure rates run as high as 90 percent.
The post M&A strategy has to focus on unlocking the promised business value as quickly as possible.
That's always been the goal.
Planning for IT integration
It's important to include IT and operations from an early stage of any merger or acquisition. They should be assessing technology and laying the groundwork long before the deal actually closes.
Consider that 50 to 60 percent of the initiatives intended to capture synergies are strongly related to IT, according to McKinsey & Company. In their experience, there are three things companies have to do right: adopt flexible, service-oriented architectures; include IT at the due diligence table; and carefully plan post-merger integration.
Finding the optimal way to bridge old and new systems is paramount. The faster you can access and encapsulate key functionality, the faster you can achieve additional savings and efficiencies.
The importance of setting goals and putting metrics in place to ensure that your IT integrations and conversions are going in the right direction cannot be overstated.

Integrating multiple legacy systems as part of a merger is hugely ambitious, and CEOs should beware of strategies that require substantial back-end integration.
Bypassing conventional centralization
Large scale migration, homogenization and processing of any and all data is incredibly time-consuming and expensive. The traditional, dominant centralization paradigm is fatally flawed. There's a great deal of risk inherent in committing to a huge integration project, the success of which can't be accurately assessed in the short-term. By the time you find out you're going in the wrong direction, it's too late to change course.
Trying to integrate a multitude of incompatible systems is hugely ambitious. McKinsey warns that "CEOs and CFOs should be wary of embarking on an M&A growth strategy that will require a lot of back-end integration if their corporate IT architectures are still fragmented: The risk of failure is too high."
For many organizations, the cost and complexity of full integration is too much to bear. It can't be achieved quickly enough and any rationalizations must consider not only the company as it is now, but the new data or businesses that may be mixed into the fold tomorrow.
Rolling in virtualization, distributed processing
You can break the interdependency of value and cost drivers with a virtualized structure and operating model. This approach extends the lives of your existing systems and provides enormous flexibility to pivot and improve.
Complexity is an inevitable by-product as businesses look to leverage the latest technology and tools alongside legacy systems that still hold enormous value. Establishing data and functions as a service is a neat way to sidestep huge upfront costs and accelerate interoperability.
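The "data and functions as a service" idea can be made concrete with a small sketch. Below, a facade exposes just the slice of a legacy record that the acquiring business needs, while the back end stays untouched. All of the names here (`LEGACY_POLICY_RECORDS`, `PolicyService`, the field layout) are hypothetical, standing in for whatever mainframe call or vendor API actually holds the data.

```python
from dataclasses import dataclass

# Hypothetical legacy back-end: in a real integration this would be a
# mainframe call, stored procedure, or vendor API left as-is.
LEGACY_POLICY_RECORDS = {
    "P-1001": {"holder": "A. Smith", "premium_cents": 125000, "state": "NY"},
}

@dataclass
class Policy:
    policy_id: str
    holder: str
    annual_premium: float  # normalized to dollars for consumers

class PolicyService:
    """Service facade: exposes only the subset of legacy data the
    combined business needs, without migrating the back end."""

    def get_policy(self, policy_id: str) -> Policy:
        raw = LEGACY_POLICY_RECORDS[policy_id]  # unchanged legacy lookup
        return Policy(
            policy_id=policy_id,
            holder=raw["holder"],
            annual_premium=raw["premium_cents"] / 100,
        )

service = PolicyService()
print(service.get_policy("P-1001").annual_premium)  # 1250.0
```

The point of the facade is that consumers on either side of the merger program against the `Policy` shape, so the legacy store can be rationalized, or replaced, later without breaking them.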
A parallel and distributed computing processing model provides a direct route to target existing and acquired systems, components, libraries and functions. You can incorporate all of the applications, platforms, and data you need without transforming the underlying systems. Focus on processing only the subsets of information that meet specific business needs and avoid the disruption and cost of major transformations.
Distributed processing can preserve your existing investments, offering a shortcut to interoperability. There is transparency and flexibility, so you can track efficacy and make the changes you need to improve. It can address the dual pressures of synergy realization and cost take-out, delivering rapid benefits without incurring high upfront costs.
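The distributed-processing approach described above can be sketched in a few lines: fan a single business question out to each system in parallel, pull back only the matching subset from each, and merge the partial results. The adapter functions and record IDs below are illustrative assumptions, not any particular vendor's API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical adapters for two incompatible systems; each returns only
# the subset of records relevant to the question being asked.
def query_acquired_system(state):
    data = [("P-2001", "TX"), ("P-2002", "NY")]
    return [pid for pid, st in data if st == state]

def query_incumbent_system(state):
    data = [("P-1001", "NY"), ("P-1002", "CA")]
    return [pid for pid, st in data if st == state]

def policies_in_state(state):
    """Fan out to each source system in parallel, merge the partial
    results, and never touch data outside the requested subset."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(query, state)
                   for query in (query_acquired_system, query_incumbent_system)]
        return sorted(pid for f in futures for pid in f.result())

print(policies_in_state("NY"))  # ['P-1001', 'P-2002']
```

Because each adapter encapsulates one system, adding the next acquisition's platform means writing one more adapter, not re-integrating the whole estate.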
Plan your strategy carefully and you can build a bridge for successful M&A integration.
Simon Moss (simon@pneuron.com) is chief executive officer for Pneuron Corporation, a business orchestration software provider.