
MASTER OR SERVANT? WHY THE EXPLOSION IN DATA DEMANDS A FRESH APPROACH IN ORDER TO CREATE TRUE COMMERCIAL ADVANTAGE

Michael Upchurch, COO, Fuzzy Logix.

The financial industry has always relied on data – from the earliest moneylenders mentally calculating interest on their loans, through to today’s traders moving vast sums around electronically, supported by complex algorithms. But we’re at a tipping point: a point at which the throughput and consumption of financial data is happening at a speed that far outstrips the rate at which humans can reasonably interact with it. Machines 1, humans 0?

If we assume the data explosion isn’t going to reverse any time soon (and I think we’d all agree that the world is only going to create more data rather than less!), what steps can be taken to ensure that we manage this vast throughput of data and enjoy the commercial benefits that should follow?

If we look at the challenge, we know that consumer and corporate financial transactions happen on a global basis across complex, interconnected networks through which data must pass, driven by defined rules and processes. This activity is supported by some fundamental capabilities of modern banking technology, which include:

  • Real-time competencies
  • Transactional throughput capabilities
  • Deep analytics intelligence
  • Effective data workload management control
  • An intuitive user interface and dashboard presentation layer

The problem is that conventional approaches to even the most contemporary banking frameworks and their architectural development fail to engineer in analytics velocity from the outset. Decisions about where data sits and how it is operated on have failed to provide a platform for analytics at the speed and accuracy needed.

For the financial sector, real-time big data analytics is subject to complex constraints: global exchange rates, privacy requirements and the increasing use of time-series data tracking. But here’s the rub: if we can manage these complexities in situ, working at the coalface where the data is stored, then we can start to engineer time and cost savings that would never otherwise have been achievable and make previously impossible tasks possible. The complexities themselves have built up through year after year of silo-centric IT programmes being pushed forward with little appreciation for the future and the possibility of real-time processing and analytics.

Now that we have the opportunity to architect our approach to data with finer-grained and more powerful control mechanisms, banks and financial institutions of every kind will be able to make informed commercial decisions more quickly than their competitors. As a consequence, they will be able to seize market opportunities faster and meet the demands of their customers sooner. Equally importantly, these same institutions will be able to hone their analytics to help identify and reduce theft, corruption, security breaches and all manner of malicious activity that follows wherever money is located.

The time has come, then, for a new approach to data analytics that delivers results not from a different perspective, but from a different point of applied data processing and application logic. This technology inflexion point is created by in-database analytics, which allow us to leverage analytical insight on demand, the second the data is available. How is this possible? Because in-database analytics run directly inside your database, using the full power of the platform. Unlike traditional analytics products that require you to move data from the data warehouse to another environment for processing, in-database analytics let you process the data without moving it. This has many benefits. First, as data gets bigger, the price paid to move it also gets bigger: up to 80% of the processing time for analytics solutions can be consumed simply by moving data. Second, modern data warehouses provide a very powerful engine for performing analytics, provided the models are built to optimize performance. Research shows that coding models to take advantage of data and process parallelism can result in models that run 10 to 100 times faster than non-optimized models.
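To make that contrast concrete, here is a minimal sketch in SQL, assuming a purely illustrative transactions table: the conventional route pulls every raw row out of the warehouse for external processing, while the in-database route pushes the computation to the data so that only compact summaries ever move.

    -- Conventional route (illustrative): export every raw row and compute elsewhere
    -- SELECT * FROM transactions;            -- millions of rows leave the warehouse

    -- In-database route (illustrative schema): only per-account summaries are returned
    SELECT account_id,
           COUNT(*)            AS txn_count,
           AVG(amount)         AS avg_amount,
           STDDEV_SAMP(amount) AS amount_stddev
    FROM   transactions
    WHERE  txn_date >= DATE '2016-01-01'
    GROUP  BY account_id;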

In-database analytics are also easy to use. It takes about an hour to install over 700 data mining, machine learning and financial models into your database. No additional hardware or storage is needed, and the models inherit the security layer already in place, so there’s no user setup to manage. Once installed, the models become part of the database and appear as native functions. You run them using SQL, the most popular data language on earth.
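As a hypothetical illustration of what that looks like in practice (the function, table and column names below are invented for the example rather than taken from any real product), scoring today’s transactions with an installed model is just another SQL query:

    -- Hypothetical example: an installed in-database model appears as a native
    -- SQL function and is called like any other (all names here are invented)
    SELECT t.transaction_id,
           udf_fraud_score(t.amount, t.merchant_category, t.hour_of_day) AS fraud_score
    FROM   transactions t
    WHERE  t.txn_date = CURRENT_DATE
    ORDER  BY fraud_score DESC
    FETCH  FIRST 100 ROWS ONLY;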

Let’s take the example of calculating VaR (Value at Risk). Using a traditional approach, users move data from their database to another analytics environment and run all the calculations included in VaR modelling. This can typically take 2-6 hours, depending on the environment. Using in-database analytics, we performed 10,000 simulations for a portfolio of 500 stocks over 252 trading days, which created 1.26 billion simulated values (10,000 × 500 × 252). We then calculated P&L for 30,000 positions at discrete intervals (1-5 days, 1-4 weeks, 1-3 months, etc.) with 10,000 simulations, which involved 1.5 billion P&L calculations. Finally, we performed aggregation and VaR calculations for each discrete interval. In total, 12.6 billion simulations were performed in less than 2 minutes, and the entire VaR and P&L process can be completed in less than 5 minutes.
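The final aggregation step gives a feel for why this work sits so naturally inside the database. As a sketch, assuming the simulated P&L values land in a table (the schema below is illustrative, not the actual setup described above), a 99% VaR per horizon is simply a percentile of the simulated distribution, usually quoted as the magnitude of that loss:

    -- Sketch of the aggregation step: the 99% VaR for each horizon is the
    -- 1st percentile of the simulated P&L distribution (illustrative schema)
    SELECT horizon_days,
           PERCENTILE_CONT(0.01) WITHIN GROUP (ORDER BY simulated_pnl) AS var_99
    FROM   pnl_simulations        -- e.g. 10,000 simulated P&L values per horizon
    GROUP  BY horizon_days
    ORDER  BY horizon_days;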

Other use cases for exploiting the power of in-database analytics are numerous and significant: from tackling money laundering and driving better customer service, through to minimising the need for ALLL (Allowance for Loan and Lease Losses) and reducing the burden of CCAR (Comprehensive Capital Analysis & Review). In-database analytics has evolved in direct response to the need for analytics techniques that can be applied to exactly these kinds of use cases. For ultimate scalability and performance, why move the data to the analytics when you can move the analytics to the data?

It is clear that the opportunity for banks and financial institutions to bring in-database analytics, and a completely new approach to data mining, into their operational strategies is significant. The move to in-database analytics is an evolutionary step, and the level of competitive advantage gained is commensurate with the level of adoption. So consider the ways in which in-database analytics can help you take back real control of your data, and start reaping the benefits immediately.

Michael’s biography is as follows:

Michael is responsible for operations and for the healthcare and financial services business units. Before Fuzzy Logix, he worked at Bank of America, where he developed the strategy and operations for telephone-based mortgage lending, growing sales from $11B to $22B in four years.

He also worked in the Consumer Innovation Team in the Global Corporate and Investment Bank, where he developed financial products for consumers that leveraged capital market instruments. Earlier, Michael worked at The Hunter Group, implementing ERP systems for Global Fortune 500 companies. After multiple engagements, Michael joined the management team and developed the company’s market offerings across 9 lines of business, spanning 12 countries and leveraging the strengths of 13 companies acquired during a three-year period.
