We need to talk about legacy IT architectures

By Cliff Moyce, DataArt Global Head of Financial Services Practice

Unlike their challenger bank siblings and fintech cousins, incumbent banks have one particular problem to solve if they are to remain viable and competitive. That problem is high cost-to-income ratios of (typically) around 60%. Compare that figure to fintech firms engaged in ‘unbundling the bank’, which even when fully operational run at ratios as low as 20%.

A large part of the difference in costs is the cost of operating and supporting legacy system architectures. Other factors include the cost of branch networks and the (over)staffing implications of functionally divided organisations. High IT infrastructure costs in large banks arise from:

  • significant duplication and hidden redundancy;
  • poor integration;
  • high complexity;
  • poor systems documentation and knowledge;
  • a lack of agility, flexibility and adaptability;
  • old-fashioned interfaces and reporting capabilities;
  • difficulty integrating with newer models such as cloud computing and mobile devices;
  • difficulty in monitoring, control and recovery; and
  • susceptibility to security problems.

Getting old and new applications, systems and data sources to work seamlessly can be difficult, verging on impossible. This lack of agility means that legacy systems in their existing configuration can be barriers to improved customer service, satisfaction and retention.  In regulated sectors they can also be a barrier to achieving statutory compliance.  Pressure to replace these systems can be intensified by new competitors who are able to deploy more modern technologies from day one.

One radical approach to solving the infrastructure issue is to design and implement a new, more modern architecture using a clean-slate or blueprint-driven approach. Amusing analogies have often been used to encourage audiences to take this route, including that of legacy infrastructures resembling an unplanned house that has been extended many times. But how easy is it to design and implement a new IT architecture in a large, mature organisation with an extensive IT systems estate?

Rather than the unplanned house analogy, a better analogy might be a ship at sea involved in a battle. Imagine if you were the captain of such a ship and someone came onto the bridge to suggest that everyone stop taking action to evade the enemy and instead draw up a new design for the ship that would make evasion easier once implemented. You might be forced to be uncharacteristically impolite for a moment before getting back to the job at hand.

The temptation to start again is enormous, but big-bang approaches to legacy IT systems replacement can be naive, expensive and fraught with risk. At some point, many large organisations have attempted the enterprise-wide re-design approach to resolving their legacy systems problems, yet so many initiatives have been abandoned when the scale of the challenge, or the impossibility of delivering against a moving target, became clear. Time has a nasty habit of refusing to stand still while you draw up your new blueprint. Re-designing an entire architecture is not a trivial undertaking, and building or buying and implementing replacement systems will take a long time. Long before a new architecture could ever be implemented, the organisation will have launched new products and services; changed existing business processes; experienced changes to regulations; witnessed the birth of a disruptive technology; encountered new competitors; and exited one business sector and entered others.

All of these things conspire to make the redesign invalid even before it’s live. If you are lucky, you may realise the futility of the approach before too much money has been spent. Furthermore, the major projects required to achieve the transformation are the sorts of projects that suffer notoriously high failure rates. A 2005 KPMG report showed that in just a twelve-month period 49% of organisations had suffered a recent project failure, with IBM later reporting in 2008 that only 40% of projects met their schedule, budget and quality goals. And as recently as 2012, a McKinsey and Company report identified that 17% of large IT projects fail so badly as to threaten the very existence of the company.

So if wholesale blueprinting and re-engineering is impractical, what options are left to solve the problem? Luckily there are some practical and cost-effective approaches that can mitigate many of the problems with legacy systems while obviating the immediate need to replace them (though eventual systems replacement should remain an objective). Two viable alternative approaches are service-oriented architecture (SOA) and web services. Used in combination, they offer an effective solution to the legacy systems problem.

SOA refers to an architectural pattern in which application components talk to each other via interfaces. Rather than replacing multiple legacy systems, it provides a messaging layer between components that allows them to co-operate at the level you would expect if everything had been designed at the same time and was running on much newer technologies. These components include not only applications and databases, but also the different layers of applications. For example, multiple presentation layers talk to the SOA layer and the SOA layer talks to multiple business logic layers, so an individual presentation layer that previously could not talk easily (if at all) to the business logic layer of another application can now do so.
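To make the pattern concrete, here is a minimal sketch in Python (ours, not from the article); the names AccountService, LegacyLedger and LegacyLedgerAdapter are hypothetical, and the adapter stands in for the layer that translates between old and new components:

```python
# Sketch of the SOA idea: a shared interface lets modern components
# co-operate with a legacy system they could never talk to directly.
# All names here (AccountService, LegacyLedger, ...) are hypothetical.
from abc import ABC, abstractmethod


class AccountService(ABC):
    """The contract every component agrees on."""

    @abstractmethod
    def get_balance(self, account_id: str) -> float:
        ...


class LegacyLedger:
    """Stand-in for an old system with its own awkward calling conventions."""

    def FETCH_BAL(self, acct_no: str) -> str:
        return "1025050"  # balance in pence, as a string


class LegacyLedgerAdapter(AccountService):
    """Wraps the legacy system so it satisfies the shared interface."""

    def __init__(self, ledger: LegacyLedger) -> None:
        self._ledger = ledger

    def get_balance(self, account_id: str) -> float:
        # Translate between the legacy format and the common contract.
        return int(self._ledger.FETCH_BAL(account_id)) / 100.0


# A web front end or mobile API now depends only on AccountService,
# not on the legacy system hiding behind it.
service: AccountService = LegacyLedgerAdapter(LegacyLedger())
print(service.get_balance("12-34-56"))  # 10250.5
```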

Web services aim to deliver everything over web protocols so that every service can talk to every other service using standard web communications (WSDL, XML, SOAP, etc.). Rather than relying on proprietary APIs to allow architectural components to communicate, SOA achieved through web services provides a truly open, interoperable environment for co-operation between components.
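As an illustration of the web-services half of the story, the same balance request can be expressed as a SOAP envelope that any XML-speaking component can produce or consume. This is a hedged sketch: the service namespace and operation names are invented, and only Python’s standard library is used:

```python
# Sketch of the web-services counterpart: the same balance request as a
# SOAP envelope. The soapenv namespace is the real SOAP 1.1 namespace;
# the service namespace and operation name are invented for illustration.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/accounts"  # hypothetical service namespace

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
request = ET.SubElement(body, f"{{{SVC_NS}}}GetBalance")
ET.SubElement(request, f"{{{SVC_NS}}}AccountId").text = "12-34-56"

# Any component that can parse XML over HTTP can now consume this call,
# regardless of the technology it was originally built on.
print(ET.tostring(envelope, encoding="unicode"))
```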

The improvements that can be achieved in an existing legacy systems architecture using SOA through web services can be immense, and there is no need for major, high-risk replacement projects and significant re-engineering. Instead, organisations can focus on improving cost efficiency by removing duplication and redundancy through a process of continuous improvement, knowing that their major operations and support issues have been addressed by SOA and web services. Another benefit is that the operations of the organisation can start to be viewed as a collection of components that can be configured quickly to provide new services, even though the components were not built with the new service in mind. This principle is known as the composable enterprise.
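A toy example of the composable-enterprise principle: below, a hypothetical ‘travel money quote’ service is assembled from two existing components (a balance lookup and an FX rate feed) that were never designed with it in mind. Both component functions are invented stand-ins:

```python
# Sketch of the composable enterprise: a new service assembled from two
# existing components that were never designed together. Both component
# functions are hypothetical stand-ins for services exposed via SOA.
from typing import Callable

get_balance: Callable[[str], float] = lambda account_id: 10000.0  # existing
get_fx_rate: Callable[[str], float] = lambda currency: 1.17       # existing


def travel_money_quote(account_id: str, currency: str) -> float:
    """New offering: how much foreign currency could this customer buy?
    Built entirely by composition; neither component was modified."""
    return round(get_balance(account_id) * get_fx_rate(currency), 2)


print(travel_money_quote("12-34-56", "EUR"))  # 11700.0
```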

But addressing the issue of legacy systems in a way that makes good sense is not just an IT issue; it is also a people issue. It requires people to resist their natural inclination to get rid of old things and build new things on the mistaken assumption that new is always better than old. It requires people to resist the temptation to launch ‘big deal projects’, for all of the reasons that people launch big deal projects, from a genuine belief that they are required (or the only way) to their being a vehicle for self-promotion, and everything in between. It requires people to take a genuinely objective view of the business case for change while operating in a subjective environment. It requires people to prioritise customer service over the compulsion to tidy up internally. And it requires the default method of change to be continuous improvement rather than step-change projects, which can be counter-intuitive in cultures where many employees have the words ‘project’ or ‘programme’ in their job titles.

So, to summarise: of course legacy enterprise IT architectures can feel like barriers to efficiency, agility and customer satisfaction, and making even the smallest change can often feel like it takes too long and costs too much money. The overwhelming temptation to throw the legacy architecture away and start again is understandable, but succumbing to that temptation can be a mistake. Luckily, we now have technical tools and approaches available to effect radical improvements without having to incur the expense, effort and risk of major replacement projects. But using these tools requires a change of mindset and approach that may be counter-cultural in some organisations. It can mean a move away from step-change and ‘long-march’ projects, and a move towards continuous improvement. Education and engagement will be one of the keys to making it happen.

*Previously published in Issue 3

Why technology is key to the future of auditing

By Piers Wilson, Head of Product Management at Huntsman Security

The Financial Reporting Council (FRC), which is responsible for corporate governance, reporting and auditing in the UK, has been consulting on the role of technology in audit processes. This highlights growing recognition that technology can assist audits by automating data gathering and assessment to increase quality, remove subjectivity and make the process more trustworthy and consistent. Both the Brydon review and the latest AQR thematic suggest a link between enhanced audit quality and the increasing use of technology. This goes beyond efficiency gains from process automation and relates, in part, to the larger volume of data and evidence that can be extracted from an audited entity and the sophistication of the tools available to interrogate it.

As one example, the PCAOB in the US has for a while advocated that audit evidence and reports be provided in a timely way (which implies computerisation and automation) to assure that risks are being managed, and that the extent of human interaction with evidence or source data be recorded so that its influence can be minimised (the more that can be achieved programmatically and objectively, the better).

However, technology may obscure the nature of analysis and decision making and create a barrier to fully transparent audits compared to more manual (yet labour-intensive) processes. There is also a competition aspect between larger firms and smaller ones as regards access to technology:

Brydon raised concerns about the ability of challenger firms to keep pace with the Big Four firms in the deployment of innovative new technology.

The FRC consultation paper covers issues, and asks questions, in a number of areas. Examples include:

  • The use of AI and machine learning to collect or analyse evidence: because such systems learn continually, their criteria for assessment may be difficult to establish and could change over time.
  • The data issues around greater access to networks and systems putting information at risk (e.g. under GDPR), or the reluctance of audited companies to allow audit firms to connect or install software/technologies in their live environments.
  • The nature of the technology, which may make it harder for auditors to understand or establish how data is collected and analysed and how decisions are made.
  • The ongoing need to train auditors on technologies that might be introduced, so they can use them in a way that generates trusted outputs.

Clearly these are real issues. For a process that aims to provide trustworthy, objective, transparent and repeatable outputs, any use of technology to speed up or improve the process must maintain these standards.

Audit technology solutions in cyber security

The cyber security realm has quickly grown into a major area of risk and hence a focus for boards, technologists and auditors alike. The highly technical nature of threats and the adversarial nature of cyber attackers (who will actively try to find and exploit control failures) mean that technology solutions that identify weaknesses and report on specific or overall vulnerabilities are becoming more entrenched in the assurance process within this discipline.

While the audit consultations and reports mentioned above cover the wider audit spectrum, similar challenges relate to cyber security as an inherently technology-focussed area of operation.

Benefits of speed

The gains from using technology to conduct data gathering, analysis and reporting are obvious – removing the need for human questionnaires, interviews, inspections and manual number crunching. Increasing the speed of the process has a number of benefits:

  • You can cover larger scopes or bigger samples (even avoid sampling altogether)
  • You can conduct audit/assurance activities more often (weekly instead of annually)
  • You can scale your approach beyond one part of the business to encompass multiple business units or even third parties
  • You get answers more quickly – which for things that change continually (like patching status) means same day awareness rather than 3 weeks later

Benefits of flexibility

The ability to conduct audits across different sites or scopes, to specify different thresholds of risk for different domains, and the ease of conducting audits at remote locations or on suppliers’ networks (especially during periods of restricted travel) are all factors that can make technology a useful tool for the auditor.

Benefits of transparency

One part of the FRC’s perceived problem space is transparency: you can ask a human how they derived a result, and they can probably tell you, or at least show you the audit trail of correspondence, meeting notes or spreadsheet calculations. But can you do this with software or technology?

Certainly, the use of AI and machine learning makes this hard: the continual learning and the often black-box calculations are not easy to understand, to recalculate in a repeatable way, or to document. The system learns, so it is always changing, and hence the rationale for a decision might not always be the same.

In technologies that are geared towards delivering audit outcomes this is easier. First, if you collect and retain data and provide an easy interface from results back to the underlying cases in the source data, it is possible to take a score, rating or risk and reveal the specifics of what led to it. Second, it is vital that the calculations are transparent, i.e. that the methods of calculating risks and the way results are scored are decipherable.
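As a sketch of what such transparency can look like in practice (our illustration, with invented data and names, not a description of any particular product), a score can be returned together with the retained findings that produced it:

```python
# Sketch of a transparent audit score: the rating is computed from
# retained evidence records, so every result can be traced back to the
# specific findings behind it. All data and names here are invented.
from dataclasses import dataclass


@dataclass
class Finding:
    host: str
    check: str
    passed: bool


def risk_score(findings: list[Finding]) -> tuple[float, list[Finding]]:
    """Score = share of failed checks, returned with the failures
    themselves so the number is never detached from its evidence."""
    failures = [f for f in findings if not f.passed]
    return len(failures) / len(findings), failures


evidence = [
    Finding("srv-01", "patch-level", True),
    Finding("srv-02", "patch-level", False),
    Finding("srv-02", "admin-mfa", False),
    Finding("srv-03", "admin-mfa", True),
]
score, drill_down = risk_score(evidence)
print(score)       # 0.5
print(drill_down)  # the two failed checks that explain the 0.5
```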

Benefits of consistency

This is one obvious gain from technology: the logic is pre-programmed in. If you take two auditors and give them the same data sets or evidence case files, they might draw different conclusions (possibly for valid reasons, or because they have different skill areas or experience), but the same algorithm operating on the same data will produce the same result every time.

Manual evidence gathering suffers from a number of drawbacks: it relies on written notes, records of verbal conversations, email trails, spreadsheets, or questionnaire responses in different formats. Retaining all this in a coherent way is difficult, and going back through it is even harder.

Using a consistent toolset and consistent data format means that if you need to go back to a data source from a particular network domain three months ago, the information will be readily available and readable. And, as stated above, if the source data and evidence are re-examined using a consistent solution, you will get the same calculations, decisions and results.

Benefits of systematically generated KPIs, cyber maturity measures and issues

The outputs of any audit process need to provide details of the issues found so that the specific or general cases of the failures can be investigated and resolved.  But for managers, operational teams and businesses, having a view of the KPIs for the security operations process is extremely useful.

Of course, following the “lines of defence” model, an internal or external “formal” audit might simply want the results and a level of trust in how they were calculated; for operational management and ongoing continuous visibility, however, the ability to derive performance statistics comes into its own.

It is worth noting that there are two dimensions to KPIs: the assessment of the strength or configuration of a control or policy (how good is the control?) and the extent or level of coverage (how widely is it enforced?).

To give a view of the technical maturity of a defence you really need to combine these two factors. A weak control that is widely implemented and a strong control that provides only partial coverage are both causes for concern.
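One simple way to combine the two dimensions, offered here as an illustrative assumption rather than a formula from the article, is to multiply a normalised strength score by a normalised coverage score, so that a control only rates highly when it is both good and widely enforced:

```python
# Sketch of combining the two KPI dimensions: control strength (how good
# the control is) and coverage (how widely it is enforced). Multiplying
# the two is an illustrative assumption, not a prescribed method.
def control_maturity(strength: float, coverage: float) -> float:
    """Both inputs are normalised to [0, 1]; a control only scores
    highly when it is both strong and widely implemented."""
    return strength * coverage


# A weak control everywhere vs. a strong control on half the estate:
print(control_maturity(strength=0.4, coverage=1.0))  # 0.4
print(control_maturity(strength=0.9, coverage=0.5))  # 0.45 -- both worrying
```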

Benefits of separation of process stages

The final area where technology can help is in allowing the separation and distribution of the data gathering, analysis and reporting processes. It is hard to take the data, evidence and meeting notes from someone else and analyse them; for one thing, are they trustworthy and reliable (in the case of third-party assurance questionnaires, perhaps)? It is then also hard to draw high-level conclusions from that analysis.

If technology allows the data gathering to be performed in a distributed way, say by local site administrators, third-party IT staff or non-expert users, but in a trustworthy way, then the overhead of the audit process is much reduced. Instead of a team having to conduct multiple visits, interviews or data collection activities, the toolset can be provided to the people nearest the point of collection.

This allows the data analysis and interpretation to be performed centrally by the experts in a particular field or control area. So giving a non-expert user a way to collect and provide relevant and trustworthy audit evidence takes a large bite out of the resource overhead of conducting the audit, for both auditor and auditee.

It also means that a target organisation doesn’t have to manage the issue of allowing auditors to have access to networks, sites, data, accounts and systems to gather the audit evidence as this can be undertaken by existing administrators in the environment.

Making the right choice

Technology solutions in the audit process can clearly deliver benefits; however, if they are too simplistic or aim to be too clever, they can simply move the problem of providing high levels of audit quality. A rapidly generated AI-based risk score is useful, but if it is not possible to understand the calculation, it is hard either to correct the control issues or to troubleshoot the underlying process.

Where technology can assist the audit process, speed up data gathering and analysis, and streamline the generation of high- and low-level outputs it can be a boon.

Technology allows organisations to put trustworthy assurance into the hands of operations teams and managers, consultants and auditors alike to provide flexible, rapid and frequent views of control data and understanding of risk posture. If this can be done in a way that is cognisant of the risks and challenges as we have shown, then auditors and regulators such as the FRC can be satisfied.

The Future Growth of AI and ML

By Rachel Roumeliotis, VP of Data and AI at O’Reilly

We’ve all come to terms with the fact that artificial intelligence (AI) is transforming how businesses operate and how much it can help a business in the long term. Over the past few years, this understanding has driven a spike in companies experimenting with and evaluating AI technologies, and many are now using them in production deployments.

Of course, when organisations adopt new technologies such as AI and machine learning (ML), they gradually start to consider how other areas could be affected by them. This can range across multiple sectors, including production and logistics, manufacturing, IT and customer service. Once the use of AI and ML techniques becomes ingrained in how businesses function, and in the different ways they can be applied, organisations will gain new knowledge that helps them adapt to evolving needs.

By delving into O’Reilly’s learning platform, tech and business leaders can discover a variety of information about the trends and topics they need to know, allowing them to better understand their jobs and ensure their businesses continue to thrive. Over the last few months, we have analysed the platform’s usage data and identified the most popular and most-searched topics in AI and ML. Below we explore some of the most important findings, which give a wider picture of where AI and ML stand and, ultimately, where they are headed.

AI outpacing growth in ML

First and foremost, our analysis shone a light on how interest in AI is continuing to grow. Comparing 2018 to 2019, engagement in AI increased by 58%, far outpacing growth in the much larger machine learning topic, which increased by only 5% in 2019. Aggregated, AI and ML topics account for nearly 5% of all usage activity on the platform. While this is just slightly less than high-level, well-established topics like data engineering (8% of usage activity) and data science (5% of usage activity), interest in AI and ML topics grew 50% faster than in data science. Data engineering usage actually decreased by about 8% over the same period due to declines in engagement with data management topics.

We also discovered early signs that organisations are experimenting with advanced tools and methods. Of our findings, engagement with unsupervised learning content is probably one of the most interesting. In unsupervised learning, an algorithm is trained to look for previously undetected patterns in a data set with no pre-existing labels or classifications and minimal human supervision or guidance. Usage of unsupervised learning topics grew by 53% in 2018 and by 172% in 2019.
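For readers unfamiliar with the technique, here is a minimal, self-contained illustration (ours, not O’Reilly’s) of unsupervised learning: k-means clustering discovers groups in unlabelled data without being told what the classes are. It assumes scikit-learn is installed, and the data points are invented:

```python
# Minimal unsupervised-learning illustration: k-means finds structure in
# unlabelled data with no human-provided classes. Requires scikit-learn;
# the observations are invented for the example.
import numpy as np
from sklearn.cluster import KMeans

# Six unlabelled observations that happen to form two natural groups.
X = np.array([[1.0, 2.0], [1.2, 1.8], [0.8, 2.1],
              [8.0, 9.0], [8.3, 8.7], [7.9, 9.2]])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(model.labels_)  # e.g. [1 1 1 0 0 0] -- groups discovered, not given
```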

But what’s driving this growth? While the names of its methods (clustering and association) and its applications (neural networks) are familiar, unsupervised learning isn’t as well understood as its supervised counterpart, which serves as the default ML strategy for most people and most use cases. This surge in unsupervised learning activity is likely driven by a lack of familiarity with the term itself, its uses, benefits and requirements, as well as by more sophisticated users faced with use cases not easily addressed with supervised methods. It is also likely that the visible success of unsupervised learning in neural networks and deep learning has helped fuel interest, as has the diversity of open source tools, libraries and tutorials that support unsupervised learning.

A Deep Learning Resurrection

While deep learning cooled slightly in 2019, it still accounted for 22% of all AI and ML usage. We also suspect that its success has helped spur the resurrection of a number of other disused or neglected ideas. The biggest example of this is reinforcement learning. This topic experienced exponential growth, growing over 1,500% since 2017.

Even with engagement rates dropping by 10% in 2019, deep learning itself is one of the most popular ML methods among companies that are evaluating AI, with many companies choosing the technique to support production use cases. It might be that engagement with deep learning topics has plateaued because most people are already actively engaging with the technology, meaning growth could slow down.

Natural language processing is another topic that has shown consistent growth. While its growth rate isn’t huge (15% in 2018 and 9% in 2019), natural language processing accounts for about 12% of all AI and ML usage on our platform. This is around 6x the share of unsupervised learning and 5x the share of reinforcement learning usage, despite the significant growth those two topics have experienced over the last two years.

Not all AI/ML methods are treated equally, however. For example, interest in chatbots seems to be waning, with engagement decreasing by 17% in 2018 and by 34% in 2019. Chatbots were one of the first applications of AI, and the decline is probably a reflection of the relative maturity of the application.

The growing engagement in unsupervised learning and reinforcement learning demonstrates that organisations are experimenting with advanced analytics tools and methods. These tools and techniques open up new use cases for businesses to experiment and benefit from, including decision support, interactive games, and real-time retail recommendation engines. We can only imagine that organisations will continue to use AI and ML to solve problems, increase productivity, accelerate processes, and deliver new products and services.

As organisations adopt analytic technologies, they’re discovering more about themselves and their worlds. Adoption of ML, in particular, prompts people at all levels of an organisation to start asking questions that challenge what an organisation thinks it knows about itself. With ML and AI, we’re training machines to surface new objects of knowledge that help us as we learn to ask new, different, and sometimes difficult questions about ourselves. By all indications, we seem to be having some success with this. Who knows what the future holds, but as technologies become smarter, there is no doubt that we will become more dependent on them.

Artificial Intelligence and Speech Analytics are crucial to Financial Organisations’ future

By Richard Stevenson, CEO, Red Box

At the beginning of 2020, when the world was still largely unaware of the looming pandemic that was set to alter so many aspects of our lives and business operations, enterprises across all sectors, from finance to retail, already felt the clock was quickly ticking for them to embark on a radical technological change.

With Industry 4.0 in full swing, Artificial Intelligence (AI) and Speech Analytics are two key technologies that promise to future-proof the financial sector. The benefits of adopting such technologies include the streamlining of entire business processes, and if unlocking the value of voice data was a key goal for banks and financial organisations in the past, 2020 and the coronavirus pandemic have only served to fast-track those plans.

The Data Speaks for Itself

We asked 500 CEOs, directors and middle managers across enterprises of varying sizes to relay their thoughts on the importance of AI, Speech Analytics and voice data to their business operations. AI and speech technology are already making waves in the financial sector, with banks using voice data to detect and combat fraud at a greater scale and speed than previously possible, and insurance companies fast-tracking their claims processing and underwriting through AI, so some of the research results come as no surprise. Notably, 91% of those surveyed in banking, insurance and finance already believe that voice data is, or will be, a strategic asset in the near future. This is a huge majority.

Living in such unprecedented times, businesses will be trying their best to leverage every competitive advantage they can, and the adoption of new technology is clearly high on that list. With customer experience being key to retaining business in times of crisis, having the right technology to support customers has proven to be a must.

To take the high street bank as an example: customers have, for decades, been accustomed to visiting their local branch. In March, many bank branches across the UK and the world closed for months on end or had their opening hours greatly reduced during the peak of lockdown. With cashiers and advisers unable to talk to customers or provide guidance on sometimes complex in-house machine operation, a whole new way of banking emerged. For those already familiar with modern banking methods (online banking, chatbots and mobile apps) this wasn’t so daunting. But contact centres found themselves dealing with a massive uptick in customer numbers as people were unable to access their traditional banking methods or were worried about their financial situation. Such a huge surge in calls, from customers worried about their mortgage payments or how they were going to pay their next gas bill, put added stress on contact centre staff who were adjusting, in many cases, to working remotely.

Introducing the right AI and Speech Analytics tools and replacing antiquated practices is enabling those in the finance industry to look ahead to a post-pandemic future. With voice data set to unlock major new insights into the customer journey, give organisations newfound agility, and improve both the customer and employee experience, all whilst cutting costs and enhancing productivity, the financial organisations of the future are looking to change how humans can be used more effectively.

Making Informed Decisions on AI & Speech Analytics

Adopting AI and Speech Analytics, and maximising the use of the voice data generated, can create a plethora of benefits for an organisation; however, only 7% of the financial sector currently sees speech analytics as a strategic asset. To stay ahead of the competition, CTOs and CIOs will often be pressured into making a decision quickly when going to market, and when presented with endless choices, picking the right software is not only important, it can be game-changing.

Without the proper foundations in place, or the knowledge of how to maximise the value of corporate purchases, organisations shopping for new tools need to put the data they’re currently generating under the microscope. Despite such promise, nearly two-thirds of businesses (62%) are still failing to use transcribed voice data to fuel their AI engines. Organisations interested in adopting this technology must remember that AI and analytics tools are fuelled by high-quality data, i.e. data that is extracted, processed, stored and analysed in the most optimal way. That is where the journey to extract value from AI and Speech Technology tools begins.

Unlocking the Full Potential of Voice Data

Captured voice data is the richest and most human source of insight, and most organisations in the financial services sector are already generating it at incredible volume for compliance reasons. The pandemic has made C-level executives, directors and managers increasingly aware of the strategic importance this data source can have when fed through AI solutions.

Now that we are being pushed towards digitisation faster than ever before, and entire processes once dealt with in person are being transferred to the call centre, organisations have never processed this much voice data. A single person, or team, can only get through such vast data sets with a helping hand from technology, making AI the next logical step in streamlining the customer and employee experience, and indeed the business as a whole. Correctly adopting AI and feeding it with high-quality data sets will help steer organisations into a technology-enabled future. To remain relevant and competitive during and after this global pandemic, one thing is certain: companies must act now to better leverage what is effectively one of their most valuable and strategic assets.
