
Embrace automation – and maximise ROI across the lab infrastructure

…says Ralph Joseph, Business Development Manager for Automation at Spirent. He is championing an automation strategy called iTO that brings ROI to the forefront.
The demands of quality, customer retention and corporate reputation mean that goods and services must be tested. That much is clear. But what is the cost? Does testing remain a necessary overhead, like security, or can it deliver real return on investment?

As systems grow more complex, competition mounts and customers grow harder to please, the test team is under pressure. Test integrations grow brittle in the face of new features. Money is invested in expert scripters, developing tests that no one else can replicate or even understand, and test engineers are forced to spend increasing time re-wiring and re-configuring the lab for every scenario, release or project.

Test automation offers a solution, and the industry is moving in that direction. Without a holistic approach, however, automation can involve significant cost without delivering the optimised data-flow and ROI to justify further investment into a smarter test strategy, more capable tools or a lab that is streamlined, accessible, efficient and, most importantly, pays for itself.

The secret is to ensure that you optimise all interactions and all workflows across all aspects of your test strategy. Failure to optimise across the test infrastructure will result in time, money and resource leakage that could impact a test lab for many years. But get it right, and you can look forward to future investment in smarter strategies and even better lab equipment.

The savings and benefits from optimised automation
Here is an example: a leading manufacturer of low-cost Ethernet switches and data-center management solutions, supplying global corporations, large telecommunications companies, service providers, carriers, enterprises and government organisations alike. Growing demand for network virtualization meant that the testing cycle had grown seven-fold, demanding extra staff and overseas outsourcing to keep up with the increase in scale and complexity.

The SVP of Business Development recalled that “The growth rate of QA was becoming faster than development”.
Automation was an obvious necessity, and the company had already invested in and developed a home-grown, script-based system that followed a proven testing workflow, yet it was proving inefficient and a huge drain on resources. They needed a commercial solution that would enable engineers of all skill levels to contribute to the automation process and help bring a quality product to market in a cost-effective manner.

Only one year after adopting an optimised automated test solution, they had created approximately 3,700 test cases spanning 13 different products and 65% of the original test plan was now fully automated. The resulting standardisation meant that their engineers could create and share portable automation assets, including tests, reports, topologies and documentation.

“We can avoid the inefficiency of having one engineer create a test and give it to the automation team to make into an automation test suite,” the VP for Software Engineering went on. “The automation team can simply leverage the original test plan, moving automation up earlier in the process and saving us a lot of time and cycles.”
Substantial time savings were noted elsewhere in the QA workflow, including regression testing, product delivery and maintenance release cycles. Developers and test engineers streamlined communication and reduced time to resolution by sharing automation assets. Developers were easily able to reproduce bugs at their desktops, saving both groups time and allowing them to focus on further quality testing and development. In summary, the ROI analysis study noted:

  • Scripting costs reduced by more than 40%
  • 12-fold increase in productivity
  • 50% reduction in test cycle
  • 14-fold reduction in bug reproduction time
  • Over 10-fold reduction in time for maintenance releases

As a consequence, the company was able to reshape its whole delivery approach. According to the company’s Director of Software Quality Assurance: “The performance gains allowed us to save about $500,000 in resources and equipment. It enabled us to leverage existing resources more efficiently so that we can meet our quality objectives while adhering to a more aggressive schedule.”
A year after adopting this approach, the company was purchased by a global technology and consulting corporation, primarily on the strength of its improved performance, demonstrable ROI and re-positioning as a market sector leader.
This level of productivity gain and ROI does not apply only to network equipment manufacturers – service providers, carriers and enterprise organisations across the world have shown similar benefits. The following service provider example shows what can be achieved when optimising across the entire test environment.


The number of automated test cases this service provider could run, versus manually driven tests, increased by 50%, from a 40/60 ratio to 60/40. Test-bed setup time, viewed as an overall average across the project, fell from 60% of project time to 20% – three times faster than before. This had a significant impact on actual tester productivity: a 12-fold increase, measured as the number of executed tests per person. Defect resolution time was reduced by 86%, which facilitated a restructuring of the test organisation around a common communication approach. A 50% reduction in the test cycle was enough for executive management to mandate that all test organisation project and programme managers take a closer look at automation and assess its viability on all projects moving forward. The reduction in time to market was taken very seriously as meeting a core business objective: the service provider had seen years of additional cost from delayed product releases and the subsequent impact on project management. With the average test cycle dropping despite a massively increased level of quality, the improvements brought by automation were welcomed across the wider organisation and lauded by senior management and the executive team.

These figures are clearly impressive, but what might they mean in terms of monetary ROI? The following table shows actual savings made by a small product qualification group on one typical release test schedule, based on time saved. Note that this result does not include further savings made possible by, for example, multi-threaded stress tests – there is no before-and-after comparison for these, because their previous scripted environment was simply not capable of automating tests of that kind.



Scaling the saving for one release up to a year of six regular releases and six point or maintenance releases, total first-year savings came to $674,588. The subsequent impact on time to market, the increase in product quality, customer satisfaction and the overall satisfaction of the test team were observed as key knock-on effects of such striking results.
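A back-of-envelope derivation shows what this annual figure implies per release. The per-release breakdown comes from a table not reproduced here, so this only derives the implied average from the quoted total:

```python
# Back-of-envelope: the quoted first-year saving spread over the release schedule.
# Only the annual total is quoted in the article; the per-release split is assumed even.
first_year_savings = 674_588   # USD, total first-year saving from the ROI study
releases_per_year = 6 + 6      # six regular + six point/maintenance releases

avg_saving_per_release = first_year_savings / releases_per_year
print(f"Implied average saving per release: ${avg_saving_per_release:,.0f}")
```

In other words, the schedule implies an average saving in the region of $56,000 per release, before counting the stress tests that the old environment could not run at all.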

How this is achieved – a radical approach to test automation
The testing and quality industry can no longer afford to focus on functional, performance or regression testing alone – instead it must integrate the individual elements of the test workflow and optimise across the entire test lab infrastructure to ensure significant ROI in the lab.

A sure way to achieve this is to adopt a strategy known as Infrastructure Test Optimization (iTO). The strategy was proposed back in 2010 by Voke, a leading US technology industry analyst firm, which determined that, to be successful with automation in the network test arena, organisations had to look beyond the functional flow of a test case and optimise all elements of the test lifecycle: from requirements traceability, through tighter element integration, right down to optimised test execution.

Organisations that apply iTO are rewarded with better visibility, deeper traceability and greater realism in their test infrastructures, while also observing increased collaboration and productivity. Ultimately this leads to greater business value, as business objectives can be more closely aligned and synchronised with the activities the iTO method promotes. Innovations and best practices that arise from iTO can be embedded in the organisation for years to come.

For “the proof of the pudding”, my organisation first applied iTO to its own automation software product development processes. The results were so impressive that senior engineering management then adopted the principle across the wider engineering organisation as a standard for high achievement.

More significantly, iTO is now embedded into all the company’s pre-sales strategies with the result that, within 6 months of deployment, customers find they can start to develop and improve the way projects are planned and executed – realistically cutting project times by half.

The essence of iTO
iTO encompasses five primary practice areas:

  • Emulation & Analysis – Ensuring that real world traffic generation and input data is optimised and provides a realistic picture of the network complexities, subscriber activity and application demand.
  • Test & Topology Automation – Automation development environments to build test cases, lab utilities and system workflows, and to automate the switching of the physical lab via media cross-connects. The provision of automation for all skill levels, based on industry standards and established test engineering best practices. Tools to establish test setup, test configuration and test teardown. Tools to analyse real-time incoming data and to handle variance and scale. Tools to execute tests, provide in-depth reporting and facilitate lights-out regression testing.
  • Manual & Developer Testing – Support for industry-standard, open software engineering architectures such as Eclipse promotes the development of tighter integrations and enables software development teams to innovate across the testing workflow. The industry relies on the availability of strong, expansive tool APIs – tool vendors appear to be listening, and this is at the heart of iTO.
  • Quality Management – Ensure reliable and efficient implementation of the corporate-wide quality strategy by easing the workflow to improve the level of traceability. iTO promotes integrations with industry standard quality management platforms which are easy to manage, seamless to the test community and fully automated. From Requirements to Design to Implementation to Execution – iTO can ensure that traceability is optimised.
  • Lifecycle Virtualization – Embracing virtual computing technology across the entire test infrastructure to provide the required level of scalability whilst avoiding compromise on system efficiency and test performance.

These five practice areas are encouraged to work together in a holistic manner, promoting the use of shareable and reusable test assets. Collaboration is supported by improved, common communication between adjacent teams in the workflow, which allows for the organisation-wide adoption of best practices and conventions.

The industry is testing. That is clear. But at what cost? The resources needed to maintain a disparate testing workflow make it hard to achieve core business objectives. On the other hand, optimising a test strategy right across the lab infrastructure, from the people involved through to tighter element integration, can significantly raise productivity, ROI and product quality.

Testing is not just a necessary overhead. An optimal test strategy can put you on the path to market leadership.