Banks' battle for innovation and market share needs huge live data crunching

By WANdisco
Jun 05, 2017

From self-service chatbots to faster loan decisions, next-generation banking needs big data to be crunched in real time or it won’t deliver the level of service customers expect.

When there isn’t much else to choose between brands, customer service becomes an important differentiator, and in financial services the situation has become acute. As regulators continue to make it easier for customers to switch providers, financial institutions must spend as much time keeping existing account holders happy as they do wooing new ones. Issuing apps and making it easier for customers to bank and source products online is a good start, but account holders will soon notice and defect if such moves are really a thinly disguised attempt to reduce costs and close branches.

A new EY report has found that nearly two-thirds of consumers perceive little or no differentiation of products and services across the overall banking sector. The last major technical innovation in banking was the ATM after all. Bain has demonstrated a clear correlation between a bank’s Net Promoter Score (the willingness of customers to recommend a company’s products or services to others) and how customers rank simple things such as the ease of bill-paying. By improving routine interactions, the average bank could increase its score by 10 to 15 percentage points, Bain says.

Institutions are well aware of this: it is no coincidence that removing friction from the customer journey has been identified as the most significant trend by financial service providers globally. Improved use of data and advanced analytics ranked similarly highly, suggesting the two are closely connected. Yet without advanced use of technology and data to support innovation, costs will soar as established players try to play catch-up with challenger banks.

Smarter services need smarter data processes

A number of leading organizations, from Bank of America to Barclays Africa, are currently trialing artificial intelligence to improve the customer experience, by accelerating the resolution of queries and completion of tasks – with experts on hand to help with anything tricky. Beyond banking, insurance companies are embracing technology to simplify the claims process and offer lower premiums, as long as customers agree to connect their cars, boilers and home alarm systems so they can be monitored online.

Behind many of these service innovations is a wealth of data which needs to be managed and mined efficiently: hence the intense interest in big data analytics. IDC estimates that, as well as being the industry with the largest investment in big data and business analytics solutions (nearly $17 billion in 2016), banking will see the fastest growth in spending on associated technology of all the main enterprise sectors between now and 2020.

The big challenge is that many of these emerging applications and big ideas rely on the ability to process huge volumes of data in real time, so that the next action can happen. This is not an after-hours number-crunching exercise.

And this isn’t something that can be achieved within a private data center – even one owned by a large bank. The scale and performance required would make this far too expensive, by up to a factor of 20 compared with using the cloud, if indeed it were possible to build out the infrastructure in a reasonable timeframe. (It isn’t.)

But nor is it something that can be easily moved offsite – or at least not without special provision. That’s because the data, being live, needs to exist in more than one place at the same time: in core systems within the bank’s main data center, and in the public cloud where the rapid analysis really needs to happen.

This leads to the very practical issue of how to keep everything in sync, when data is being updated every split second.
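To make the problem concrete, here is a minimal sketch in Python of why naive dual writes drift apart the moment one side fails or lags. The on_prem and cloud stores are simple stand-ins invented for this illustration, not a description of any particular product or bank system.

    import time

    # Minimal illustration of the sync problem: every update must land in two
    # places, and any failure or delay leaves the copies out of step.
    # Both stores below are hypothetical stand-ins for the real systems.

    on_prem = {}   # stands in for the bank's core system of record
    cloud = {}     # stands in for the analytics copy in the public cloud

    def apply_update(account_id, balance, cloud_available=True):
        """Naive dual write: update both copies for every change."""
        on_prem[account_id] = {"balance": balance, "ts": time.time()}
        if cloud_available:
            cloud[account_id] = {"balance": balance, "ts": time.time()}
        # If the cloud write fails (or simply lags), the two copies diverge
        # and any real-time analysis now runs against stale data.

    apply_update("acct-001", 250.00)
    apply_update("acct-001", 120.00, cloud_available=False)  # simulated outage

    print(on_prem["acct-001"]["balance"])  # 120.0
    print(cloud["acct-001"]["balance"])    # 250.0 -- data disparity

The moment the two copies disagree, the analytics running in the cloud is working from a version of reality the core system has already moved past.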

Multi-tasking with live data

Without this capability – of keeping each location tightly aligned – financial institutions will hit a wall with real-time analytics. Unless they can move continuously changing data sets between locations without risk of data disparity, they can’t realistically hope to distil instantly actionable insights to amaze customers or support new business models.

Routine processes need this facility too, if financial service providers are going to stay on their game. Quick and reliable fraud checking, and faster approvals for loans – both just as important to improving customer service as new service innovation – depend on accurate analytics happening at speed. To minimize delays for customers, the outcomes of these risk and credit checks must be looped straight back into the current workflow, too, so transactions can progress.
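As a rough illustration of that feedback loop, a real-time check might feed its verdict straight back into the transaction flow along these lines. The scoring rule, thresholds and field names are invented for the example and are not any bank's actual model.

    # Hypothetical scoring function and thresholds, for illustration only.
    def fraud_score(txn):
        """Toy scoring rule: large transfers to new payees look riskier."""
        score = 0.0
        if txn["amount"] > 5000:
            score += 0.5
        if txn["new_payee"]:
            score += 0.4
        return score

    def process_transaction(txn, block_threshold=0.8, review_threshold=0.4):
        """Feed the risk check result straight back into the transaction flow."""
        score = fraud_score(txn)
        if score >= block_threshold:
            return "blocked"          # transaction stopped immediately
        if score >= review_threshold:
            return "manual review"    # routed to an analyst before it proceeds
        return "approved"             # proceeds without delay

    print(process_transaction({"amount": 8000, "new_payee": True}))   # blocked
    print(process_transaction({"amount": 120, "new_payee": False}))   # approved

The point of the sketch is the turnaround: the check only helps the customer if its outcome arrives fast enough to decide what happens to the transaction in flight.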

For those on the front line of customer service coming up with fresh new ideas, then, the requirement is that those ideas are matched by innovation in the back office, and in particular in the way institutions manage live data at epic scale across distance.

That’s where we come in.
