Many years ago, enterprise software was written to run on mainframe computers. At the time, mainframes were the only machines with the requisite memory and processing power to run these applications, and so – despite their cost, inflexibility and operational complexity – they represented the optimal computing model for enterprise applications. Until a new computing model emerged. Based on powerful, plentiful and inexpensive blade servers and a set of new, standard software components, this “technology stack” became the new optimal computing model for running more and more of the enterprise. LAMP was the new 700/7000.
Not only was this new model less expensive, more robust and more resilient, it was much more adaptable. Further, the open standards encouraged a tremendous amount of innovation and experimentation which in turn fostered the development of a vast array of specialist but compatible variations. This enabled bespoke solutions for different applications and environments to be easily developed without the need to build a new platform each time. And as each component in the stack had a very specific role or purpose, its design could be optimised without compromise.
The traditional banking business model mirrors the mainframe: a vertically integrated, all-in-one solution with all the resources and tools needed to deliver banking products and services in one big (black) box. In the context of the 20th century competitive and technological landscape this worked fine. It was the optimal solution. But like the mainframe of the computing world, the all-in-one “big iron” approach to banking is no longer the optimal business model with which to efficiently and profitably serve the banking customers of today. A new approach, predicated on assembling specialist providers of the component elements required to deliver end products and services, will prove to be the new optimal business model for banking. Welcome to the (banking) stack.
Take for example the process of making a loan. This actually breaks down into a “process stack” that at a high level looks something like this:
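As an illustration only – the layer names below are assumptions drawn from the lending functions this piece discusses (distribution, origination, underwriting, processing, funding), not a definitive taxonomy – such a process stack might be sketched as an ordered list of layers, each of which could in principle be supplied by a different specialist:

```python
# Hypothetical sketch of a loan "process stack".
# Layer names and descriptions are illustrative assumptions, not a
# definitive decomposition of the lending value chain.
loan_stack = [
    ("distribution", "customer acquisition and channels"),
    ("origination",  "application capture and lead generation"),
    ("underwriting", "credit decisioning and risk pricing"),
    ("processing",   "documentation, servicing and collections"),
    ("funding",      "balance sheet and capital provision"),
]

# Each layer has its own value drivers; in the "stack" model each can
# be operated by whichever provider does it best.
for layer, role in loan_stack:
    print(f"{layer:>12}: {role}")
```

The point of the sketch is simply that the layers are separable: a bank might retain only the funding layer while partnering for the others.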
Each layer of this stack requires different skills and resources. The value drivers for each activity are different. Each requires a different mix of technology, design and talent, and the application of fundamentally different business models and capital resources. As such, trying to house them all in the same organisation means that some – or indeed all – of these activities are run in a sub-optimal fashion. Indeed, the stronger the culture and the better managed the bank (in the context of traditional, hierarchical models), the more acute this problem becomes.
That said, so long as margins remained high and competition muted – with competitors operating more or less efficient and skilfully executed versions of the same business model – sticking with the “mainframe” model was just about tenable. However, this is no longer the case. New entrants – unburdened by legacy technologies and mindsets – are emerging across the stack with business models natively adapted not only to leverage the technologies of today but also to address the changing expectations of customers in terms of pricing, design and user experience. In many parts of the stack, incumbent institutions will find it hard to compete as the best of these new entrants gain traction.
The best managed of today’s leading institutions will adapt to this changing landscape. How? By letting go of their traditional business models, opening up their value chains and making an honest assessment of where in the stack they have a sustainable competitive advantage and where they do not. This is not a trivial change for most traditional banks, and aside from the adjustments in technology and business model it will entail, perhaps the most challenging aspect of this transition will be changing the culture and mindset of institutions for whom open architectures and collaboration are often anathema.
But for those institutions that are able to make these changes, the rewards will be significant. By focusing their resources and talents on the areas of the stack where they have a true competitive advantage, exiting other areas where they are structurally uncompetitive and collaborating with (and investing in) companies with disruptive new and powerful value propositions in these areas, they will successfully navigate the transition to becoming an information age bank.
Taking the example above, it is already becoming clear that the traditional models for originating, underwriting and processing loans are no longer competitive. New models from companies like FundingOptions, Zopa, OnDeck Capital, Kabbage and many others are proving to be much more effective and economical. Traditional banks should be lining up to partner with companies like these – in particular to act as lenders and to provide core transaction banking services, areas where they do have a real competitive advantage. They should also be leveraging their strong distribution channels to drive customers to these platforms in exchange for lead generation fees. Of course, for the managers and employees responsible for these functions within traditional banks, the transition will be painful; ultimately their jobs will disappear. But this outcome is inevitable in any case, as their value proposition and competitive position become ever more compromised.
By embracing change and working with the grain of this new paradigm, incumbent banks can do much to ensure their future success and survival, and will find it much easier to rebuild trust – with customers, regulators and their communities – mitigating the short-term pain and setting themselves on a path to sustainable profitability. The alternative is to keep doing the same thing and slowly but surely rust away. The best banking executives of tomorrow will need to be as familiar with APIs and SDKs as they are with APRs and RAROC.
- Platforms, Markets and Bytes (parkparadigm.com)