Do you have the tech skills to deliver the services your customers want?

The last few years have witnessed great change in the banking sector.

What will happen to critical mainframes when COBOL skills are in short supply?

Customer demand continues to move online at an accelerating pace; the further decline of bank branches post-Covid leaves banks struggling to deliver effective virtual customer service; and challenger banks, unconstrained by legacy tech, are demonstrating colossal growth.

For established banks, the drive to digital is proving critical in the fight to attract and retain customers.

Outside of banking, the financial services sector has also seen a huge rise in competition from fintech companies focused on disrupting the sector. With this increasingly competitive marketplace comes additional pressure on core computing to maintain and enhance service delivery.

But how are banks responding to the shift towards digital transformation and customer-first services when many banking systems and processes still run on mainframe technology? These legacy systems, many of them monolithic, have been in operation for years. They deliver consistency and continuity to banks that don’t necessarily want to risk moving to flexible (but often feared) cloud-based platforms. Yet staying with a monolithic system means banks can’t unlock the value of the data they already hold – and that is exactly what’s needed to deliver a customer-first approach.

According to a fourth-quarter update in 2020 by mainframe manufacturer IBM, 45 of the top 50 banks (listed in the Fortune 100) still rely on mainframe technology to deliver services. IBM went on to add that MIPS usage (millions of instructions per second, a measure of mainframe processing workload) has jumped by 350% in the past decade.

What’s driving this reliance on mainframes? Simply put, it’s a combination of processing power and security. Banks and financial services institutions generate masses of data that mainframes manage with ease – around 2.5 billion transactions a day in some cases. That compute performance lets banks run analytics in real time, helping them to manage risk and spot fraud as it happens.
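To make the real-time point concrete, here is a minimal sketch of the kind of transaction screening described above. It is illustrative only: the fields, thresholds and rules are assumptions made for the example, not any bank’s actual fraud logic, which in practice runs at far greater scale and sophistication.

```python
# Illustrative sketch of real-time transaction screening.
# Field names and thresholds are assumptions, not real bank rules.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Transaction:
    account_id: str
    amount: float
    country: str
    timestamp: datetime


class FraudScreen:
    """Flags transactions that deviate sharply from an account's recent history."""

    def __init__(self, window: timedelta = timedelta(hours=24),
                 amount_multiplier: float = 5.0):
        self.window = window
        self.amount_multiplier = amount_multiplier
        self.history: dict[str, list[Transaction]] = {}

    def check(self, tx: Transaction) -> bool:
        """Return True if the transaction looks suspicious."""
        recent = [t for t in self.history.get(tx.account_id, [])
                  if tx.timestamp - t.timestamp <= self.window]
        self.history.setdefault(tx.account_id, []).append(tx)
        if not recent:
            return False  # no baseline to compare against yet
        avg_amount = sum(t.amount for t in recent) / len(recent)
        unfamiliar_country = tx.country not in {t.country for t in recent}
        return tx.amount > self.amount_multiplier * avg_amount or unfamiliar_country


screen = FraudScreen()
screen.check(Transaction("acc-1", 40.0, "GB", datetime(2022, 5, 1, 9, 0)))
flag = screen.check(Transaction("acc-1", 900.0, "BR", datetime(2022, 5, 1, 9, 30)))
print(flag)  # True: unusually large amount from an unfamiliar country
```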

So, what’s the problem? The real concern is how the sector will keep its critical mainframes running when COBOL skills are in short supply. The stats speak for themselves:

  • The Institute for Employment Studies estimates there are 1.1 million fewer workers in the labour market than there would have been if pre-pandemic trends had continued.
  • ‘The Great Resignation’ has seen workers reframe how they want to work – work to live not live to work.
  • The ‘Great Retirement’ is following suit, with baby boomers leaving the workforce. According to the Pew Research Center, nearly 30 million baby boomers in the US retired in the last part of 2020. The research showed that Covid heavily contributed to the number of older workers retiring, with a large percentage concluding that they would be more fulfilled by leaving their jobs.

With mainframes introduced 50 to 60 years ago, the people with the skills required to maintain and adapt them are now leaving the workplace, resulting in a huge shortage of mainframe skills. Exacerbating the issue, few computer science or engineering degrees cover COBOL at all. And even where courses exist, current graduates simply aren’t interested in learning COBOL: they view it as old and dull in comparison to languages such as Elm, Kotlin and even Python.

So where does this leave the banks? Clearly there is a need to retain mainframe support skills – even as that talent pool shrinks, and even if banks have to pay over the odds for them. Yes, we advocate keeping the lights on and keeping data where it is, but banks need to reframe their approach to building ‘customer-first’ services – and indeed to digital transformation. The key to rolling out new competitive services effectively is to layer apps over the top of existing systems.

Tech teams will already know how costly and resource-intensive it is likely to be to replace, or even maintain, mainframe architecture (especially when it is delivering on current needs). BFSI organisations can (and should) build apps that integrate with their existing platforms, pulling data and sub-processes from existing sources. Low-code platforms, for example, reduce the need for highly skilled developers and can help organisations overcome the integration issues often found with out-of-the-box software. Low-code can also help organisations tackle the request backlog coming in from the business. The sketch below shows the layering idea in its simplest form.
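As a minimal sketch of that layering approach, the following shows a thin service that reads from an existing system of record and reshapes the data for a new customer-facing app, without touching the underlying records. The gateway URL, endpoint and response fields are hypothetical placeholders; a real integration would go through whatever API facade or middleware sits in front of the mainframe.

```python
# Sketch of layering a new service over an existing core-banking source.
# The gateway URL and response fields below are hypothetical placeholders.
import json
from urllib.request import urlopen

CORE_BANKING_GATEWAY = "https://mainframe-gateway.example.internal"  # assumed facade


def get_account_summary(account_id: str) -> dict:
    """Fetch raw account data from the existing system of record."""
    with urlopen(f"{CORE_BANKING_GATEWAY}/accounts/{account_id}") as resp:
        return json.load(resp)


def account_overview(account_id: str) -> dict:
    """Reshape core data into the customer-first view a new app needs,
    leaving the mainframe records exactly where they are."""
    raw = get_account_summary(account_id)
    return {
        "accountId": account_id,
        "displayBalance": f"{raw['balance'] / 100:.2f} {raw['currency']}",
        "recentTransactions": raw.get("transactions", [])[:5],
    }
```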

Creating bespoke apps in this way simplifies the process of integration – so much so that banks and financial services companies can maintain their reliance on mainframes while at the same time accelerating the introduction of new customer services and, ultimately, their digital transformation. It also ensures that banks can rapidly scale those services: because enterprise low-code platforms can hook into foundational systems, organisations can compose new services from almost any combination of existing applications.

Away from the technology itself, banks will also benefit internally from increased collaboration, as delivering this future brings the business and IT together. Low-code is intuitive: it’s easy to understand, which makes it useful to the business and not just the tech team. The business understands how a process needs to be automated; IT knows how to build it.

While it may sound like a contradiction, digital transformation doesn’t mean ripping everything out and starting again. It’s about using technology to your own and your customers’ best advantage. Legacy and new digital technologies, such as low-code, can co-exist – and even benefit from integration – to deliver the services that customers want, and that banks and financial institutions need to offer to stay competitive and relevant in such a dynamic marketplace.

