By Jean-Francois Tapprest, April 2023.
In this article, based on my first-hand experience at an incumbent financial institution, I reflect on how end-to-end processes can be reinvented to improve the operational efficiency of corporate banking.
The push to streamline and digitise key processes needs convincing
Corporate banking depends on a few key processes: those related to compliance, including customer life-cycle management and the AML (Anti-Money Laundering) procedure; the end-to-end lending value chain, which also covers the credit process; and those related to the offering and delivery of transaction banking products (cash management, trade finance, working capital management, forex, etc.).
When arguing for the importance of improving these processes, some get more attention than others. Anything related to compliance, for instance, attracts immediate consideration from upper management, whilst other processes seem to predate the internet with no one seeming to care. But in all cases, moving from incremental improvements that provide immediate benefits to longer-term process redesign is seldom a priority. As a result, it takes quite a bit of effort to convince users as well as decision makers that the advantages are plentiful.
The most immediate benefit of streamlining and digitalising end-to-end processes is the resulting improvement in internal efficiency: a faster process, quicker decision making and ultimately lower internal costs. All of this comes with consistent quality, more predictable execution times and better auditability, the latter being key in a highly regulated industry. Another benefit is that it creates a modern digital communication channel with customers, one that is not just a digital user interface with manual work behind it to handle each customer request, but a truly end-to-end digitised process.
Someone needs to take the lead
Change does not happen by itself, especially in incumbent corporate banks where staff are used to doing things in a certain way and where Standard Operating Procedures (SOPs) are too often missing. In addition, end-to-end processes involve, by definition, a number of units, including the front-line sales team, the product units (cash management, trade finance, forex, etc.) and the support units (compliance, credit, middle and back office, etc.). Each one can try to improve its own way of working, i.e. its contribution to the full process, but if no one takes responsibility for the whole value chain, little will happen.
In my opinion, this role should fall to the customer responsible unit since, at the end of the day, it is the one accountable for customer satisfaction, for the profitability that customers bring (including internal costs) and for the risks that they generate (including credit and compliance risks). It is also the one best placed to initiate a cascading effect: starting with the core process, then demanding the digitalisation of connected ones, and insisting on clear internal SLAs (Service Level Agreements) and on full auditability and visibility of all supporting processes.
The issue is that a customer responsible unit focuses on what it is best at, i.e. developing customer relationships and originating and closing deals. I spent over ten years in such a unit and faced at times the “it’s all about the relationship, stupid!” kind of attitude. In other words, please make our life easier if you can, but for the rest, don’t try to fix what is not broken.
This means that, unless it is led by a visionary manager, a customer responsible unit will seldom initiate the streamlining of end-to-end processes, even though it would be the one to benefit most. It has neither the right mindset nor the project management resources. In addition, when it does try, it risks pushback from product and support units that resist someone else stepping onto their turf.
The change has to come from the top
Individual units are primarily focused on their KPIs (Key Performance Indicators), which are usually short-term goals set in a “business as usual” mindset. It is therefore the role of upper management to take the balcony view, set longer-term goals, allocate the resources and follow up on the achievements.
Especially in the current market conditions, it is the institutions that manage to look beyond short-term business challenges and have the guts to invest in long-term changes, involving data, technology and new operating models, that will ultimately emerge as the most competitive industry players.
As I see it, it is also only upper management that can break the mental barriers to change. People tend to love the status quo, and the hurdle to changing how things have always been done is really high. Yes, initiatives will regularly be started from the ground up by enthusiastic employees, but unless they get support that goes beyond moral encouragement, they will tire and return to their business-as-usual mindset, or leave.
Let’s use our own brains before calling in a robot or an AI
Streamlining end-to-end processes in a profound way requires the widespread use of technology (digital platforms, robots, workflow tools, iBPMS, etc.) but also the application of common sense. My experience is that some initiatives require substantial development resources and IT budget, but the bulk of them are more about clarifying roles and responsibilities and challenging internal guidelines which, in many cases, have become outdated.
A method used by my former employer really made a difference. We created ChangeLabs, which gathered all the participants in a specific value chain around a facilitator for a time-boxed “sprint” lasting a few weeks. This brought structure as well as a proven framework for analysing pain points, prioritising issues and looking for concrete, primarily short-term, solutions. It was a unique way to bring people together and openly discuss problems that often had never been raised before.
It is also worth noting that, ironically, organising such ChangeLabs was hardest for the processes that needed the most revamping, simply because the participants in those inefficient processes were so underwater in their daily work that they had no time to spend on improving them.
Towards modular processes
However, in the longer run, and in order to become really efficient, processes need to be rethought entirely. It is not about improving bits and pieces, or even making them more digital or automated. Instead, it is about redesigning the end-to-end process from scratch in the best possible way, such that it may look substantially different afterwards.
In addition, when processes are redesigned, the goal should be to move towards modularisation: each process should be cut into a series of modules or microservices which plug into each other through APIs. This brings flexibility and the ability to easily change or upgrade a given microservice. For instance, a modular lending process is made up of microservices integrated via APIs to deliver, among other things, the credit rating of the customer, the list of all the bank’s exposures to that customer, or the industry sector analysis. When this is achieved, data can flow between systems in real time, eliminating rekeying, manual imports and inconsistent formatting. In such a set-up, a good governance model is really important, with each microservice having an owner, i.e. someone responsible for its performance and improvement.
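To make the idea concrete, here is a minimal sketch in TypeScript of what such a composition could look like. The service names, endpoints and field names are hypothetical assumptions for illustration only, not any actual bank’s or vendor’s API.

```typescript
// Hypothetical API contracts for a modular lending process.
// Service names, endpoints and fields are illustrative assumptions only.

interface CreditRating {
  customerId: string;
  rating: string;     // e.g. an internal rating grade
  ratedAt: string;    // ISO-8601 timestamp
}

interface Exposure {
  customerId: string;
  product: string;    // e.g. "term loan", "trade finance", "forex"
  amount: number;
  currency: string;
}

interface SectorAnalysis {
  sectorCode: string;
  outlook: "positive" | "stable" | "negative";
  summary: string;
}

// Each microservice is owned by one team and exposed through a versioned API.
const RATING_API = "https://api.bank.example/rating/v1";
const EXPOSURE_API = "https://api.bank.example/exposure/v1";
const SECTOR_API = "https://api.bank.example/sector/v1";

// The credit-memo step composes the three services; data flows without rekeying.
async function assembleCreditMemoInputs(customerId: string, sectorCode: string) {
  const [rating, exposures, sector] = await Promise.all([
    fetch(`${RATING_API}/customers/${customerId}`).then(r => r.json() as Promise<CreditRating>),
    fetch(`${EXPOSURE_API}/customers/${customerId}`).then(r => r.json() as Promise<Exposure[]>),
    fetch(`${SECTOR_API}/sectors/${sectorCode}`).then(r => r.json() as Promise<SectorAnalysis>),
  ]);
  return { rating, exposures, sector };
}
```

The point is simply that the credit-memo step composes independently owned services over versioned APIs, so any one of them can be upgraded or replaced without redesigning the whole process.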
It is all about data
Banking deals in intangibles, and an operational process in a bank is primarily about managing data: how it is sourced, created, refined and exchanged.
But banks are notoriously bad at mastering all the data they have and produce. Most don’t have a single view of their customers and still have to deal with data duplication across the range of products and services that they offer. Lending operations are a good illustration of this, with the data required to complete a credit memo being siloed across multiple systems, spreadsheets, emails and even paper notes, carrying the risk that it is mismatched and out of date.
The goal of creating a “data lake” where the relevant data can be fetched in real time will, for many, remain a dream for years to come. Which is a pity, because if banks could master that data, enrich it with external information and use AI to create “what-if” scenarios, they would gain superpowers, as I described in an earlier article co-written with Bernardo Antunes.
In any case, banks are increasingly reliant on external software providers to solve their systems’ deficiencies and need to reflect on the dilemma of build versus partner or buy when it comes to process improvement.
Build versus buy or partner
The IT department’s initial reaction when something new is proposed is generally to build it in-house, simply because that is the traditional way of doing things. Whilst in certain cases this still makes sense, other options should be considered first, for several reasons.
One is the lack of IT development money. Without an allocated budget covering the full development costs, no one will start to work on your initiative. In comparison, an external software provider will charge a recurring fee which is accounted for as a running operational cost for the unit using it, so no large upfront development budget is needed.
Another is related to the cost of software ownership, including development and maintenance. Besides being difficult to predict, this cost is also frequently underestimated, especially given new security threats, dependency management and regulatory change. In the same way as cloud providers have changed the way IT infrastructure is being built, SaaS (Software as a Service) providers are offering an alternative where the cost is more predictable and transparent.
In any case, why reinvent the wheel? Since the products and services offered to corporate clients are basically the same, irrespective of the bank providing them, the optimised underlying processes should not differ much either. This means that sharing the cost of a more generic process with a software supplier or partner is likely to be the better option. In my experience, genuinely unique use cases are quite rare.
One or several software suppliers for a given end-to-end process?
I argued above that new, reinvented processes should be modularised, i.e. made up of internal microservices plugged into them. In addition, an end-to-end process can be cut into pieces, with external software providers selected to support one part only. For instance, in the lending process, one piece is the loan origination based on the customer interface, another is the credit memo preparation, a third is the credit decision and a fourth is the loan administration. As long as software vendors use open APIs, what they provide remains compatible. To illustrate, the overall lending process can be built on a BPM (Business Process Management) platform handling the overall orchestration, with other components, such as credit decision making or loan administration, provided by other software vendors. Ideally, an open ecosystem made up of plug-and-play components can be achieved, with data flowing freely between them. This approach allows corporate banks to mix the best providers, combining parts they build themselves, parts they buy on a generic basis and parts they tailor-make with a partner.
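As an illustration of the orchestration idea, the sketch below shows how a thin orchestration layer could chain four independently sourced components through their open APIs. The vendor endpoints and payload shapes are invented for this example and are not real products.

```typescript
// Hypothetical orchestration of a lending process across four vendor components.
// Endpoints and payload shapes are assumptions for illustration, not real products.

type LoanApplication = { customerId: string; amount: number; currency: string };

const ORIGINATION_API = "https://vendor-a.example/origination/v1";
const CREDIT_MEMO_API = "https://vendor-b.example/credit-memo/v1";
const DECISION_API = "https://vendor-c.example/decision/v1";
const LOAN_ADMIN_API = "https://vendor-d.example/loan-admin/v1";

// Minimal helper: POST a JSON body to a component's open API and parse the reply.
async function post<T>(url: string, body: unknown): Promise<T> {
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`${url} failed with status ${res.status}`);
  return res.json() as Promise<T>;
}

// The orchestration layer only sequences steps and passes data between them;
// each step is a swappable component behind its own API.
async function runLendingProcess(application: LoanApplication) {
  const caseFile = await post<{ caseId: string }>(`${ORIGINATION_API}/applications`, application);
  const memo = await post<{ memoId: string }>(`${CREDIT_MEMO_API}/memos`, caseFile);
  const decision = await post<{ approved: boolean }>(`${DECISION_API}/decisions`, memo);
  if (decision.approved) {
    await post(`${LOAN_ADMIN_API}/loans`, { ...application, ...memo });
  }
  return decision;
}
```

In practice a BPM platform would add state management, human task handling and monitoring on top of this sequencing, but the principle of swappable, API-connected components remains the same.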
“The two most powerful warriors are patience and time” — Leo Tolstoy
So, yes, ideally, an end-to-end process needs to be modular and supported by the best SaaS providers for each part of it. However, for someone who has worked on this topic in an incumbent bank, this starts to sound like science fiction, or at least like wishful thinking. It is said that digital change is hard, but it is hardest for those promoting it, and you often feel as if you are pushing on a string. So be patient and persistent; the time will come …