Cisco Systems, Inc. is an American multinational digital communications technology conglomerate, best known for developing and manufacturing networking hardware, software, telecommunications equipment and other high-technology services and products.
Many of its hardware products had to be redesigned during the COVID-19 pandemic because of component shortages. The challenge put a spotlight on the company’s existing testing operations, which threatened to become a bottleneck that, given the shortage of materials, could prove extremely disruptive. Compounding the problem, Cisco was forced to source components from suppliers it had never used before, making testing more critical than ever. Yet the existing testing operations delivered only a “best guess” figure for the proportion of finished products that needed to be tested – sometimes as high as 100%, which was not practical. Furthermore, increased production volumes seemed to point to a need for more testing chambers, which are extremely expensive.
Cisco needed a way to tailor its testing operations so it could determine what proportion of finished hardware actually needed testing.
Building an Internal Team
The company put together an internal team, consisting of Mike Ruddick, director of hardware engineering; Senan Khairie, business architecture director, quality engineering; Shenba Sudalaiyandi, business operations manager, quality engineering; and Tom Cooper, business operations manager, test development engineering. Together, they took a close look at the existing protocols and why they were delivering unsatisfactory results.
They found that the existing process accounted only weakly for product maturity and the associated testing in the factories, and that there had historically been no real-time tuning of Cisco’s test processes. While the tools at hand produced pockets of success, they never achieved widespread adoption or deployment, and their use was often hindered by the lack of an associated business process for evaluating the needs, feasibility, and effectiveness of the toolset.
The Cisco team noticed that where product teams were more engaged, deployments of the existing tools did allow the quality-guarantee algorithms to do their job. However, those results were identified mostly through manual collection of field data.
The more obvious method of expanding capacity to absorb the non-linear swings in factory demand, by buying and installing additional equipment, was unlikely to solve the problem, given equipment availability and lead times.
Developing a New Toolset
Building on Cisco’s Adaptive Test toolset, which earned a Gartner Chainnovator award in 2016, the team developed a robust business process that delivered a substantial boost to Cisco’s revenue output without putting product quality at undue risk.
To drive scale, new business processes now assess a product’s quality maturity and use that information to formally configure the Adaptive Test automated engines with an Average Outgoing Quality Limit (AOQL) setting.
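To make the AOQL idea concrete, here is a minimal sketch of how an average outgoing quality limit can be derived for a single hypothetical acceptance-sampling plan. The lot size, sample size and acceptance number are invented for illustration, and this is not Cisco’s implementation; it simply shows that an AOQL setting bounds the worst-case outgoing defect rate when rejected lots are fully screened.

```python
# Minimal sketch: deriving the AOQL of one acceptance-sampling plan.
# Hypothetical parameters; not Cisco's Adaptive Test implementation.
from math import comb

def acceptance_probability(p: float, n: int, c: int) -> float:
    """P(accept lot): probability of at most c defects in a sample of n."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

def average_outgoing_quality(p: float, N: int, n: int, c: int) -> float:
    """AOQ(p): outgoing defect rate when rejected lots are 100% screened."""
    return p * acceptance_probability(p, n, c) * (N - n) / N

def aoql(N: int, n: int, c: int, steps: int = 1000) -> float:
    """AOQL: worst-case AOQ over all possible incoming defect rates."""
    return max(average_outgoing_quality(i / steps, N, n, c) for i in range(steps + 1))

if __name__ == "__main__":
    # Example plan: lots of 1,000 units, sample 80, accept if <= 2 defects found.
    print(f"AOQL ≈ {aoql(N=1000, n=80, c=2):.4%}")
```

Whatever the incoming quality level turns out to be, the outgoing defect rate under such a plan never exceeds the computed AOQL, which is what makes it a usable formal setting for an automated engine.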
Now a new “Automated Field Monitoring Process” measures products running Adaptive Test against their quality goals, while also generating alerts for any excursions or near-threshold performance. This has improved the risk management profile of the process.
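As a rough illustration of this kind of threshold logic, the sketch below classifies a product’s field defect rate against its AOQL target; the record structure, warning band and numbers are assumptions for the example, not Cisco’s actual monitoring rules.

```python
# Minimal sketch of excursion / near-threshold alerting against an AOQL target.
# Field names, warning band and numbers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class FieldSnapshot:
    product: str
    units_shipped: int
    field_defects: int
    aoql_target: float  # configured AOQL for this product, e.g. 0.004 = 0.4%

def classify(snapshot: FieldSnapshot, warn_fraction: float = 0.8) -> str:
    """Return 'excursion', 'near-threshold', or 'ok' for one product snapshot."""
    rate = snapshot.field_defects / snapshot.units_shipped
    if rate >= snapshot.aoql_target:
        return "excursion"
    if rate >= warn_fraction * snapshot.aoql_target:
        return "near-threshold"
    return "ok"

if __name__ == "__main__":
    s = FieldSnapshot("example-product", units_shipped=50_000,
                      field_defects=160, aoql_target=0.004)
    print(s.product, classify(s))  # 0.32% vs a 0.4% target -> near-threshold
```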
First, the team developed a common business process that ingests a product’s current factory and field performance to assess the feasibility of using the Adaptive Test methods and tools. The process enables multiple parties within Cisco to evaluate products against common criteria and thresholds, ensuring proper use of the toolset. Previously, individual product teams were allowed more discretion; teams often wanted to understand how other teams were operating before committing, and too often the result was not proceeding at all.
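As a rough illustration of evaluating against common criteria, the sketch below applies the same hypothetical maturity and quality thresholds to every product; the specific metrics and cut-off values are assumptions for the example, not Cisco’s actual gates.

```python
# Minimal sketch of a common feasibility gate for Adaptive Test adoption.
# Metrics and threshold values are illustrative assumptions.
def adaptive_test_feasible(factory_yield: float,
                           field_defect_rate: float,
                           months_in_production: int,
                           min_yield: float = 0.98,
                           max_field_rate: float = 0.005,
                           min_maturity_months: int = 6) -> bool:
    """True when a product meets the shared maturity and quality criteria."""
    return (factory_yield >= min_yield
            and field_defect_rate <= max_field_rate
            and months_in_production >= min_maturity_months)

if __name__ == "__main__":
    print(adaptive_test_feasible(0.992, 0.002, 9))   # True: mature, low defect rate
    print(adaptive_test_feasible(0.950, 0.002, 9))   # False: factory yield too low
```

Because every product team runs the same check, the decision to adopt Adaptive Test no longer depends on how much discretion an individual team chooses to exercise.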
Second came automated monitoring of products enabled for Adaptive Test. Previously, quality engineers monitored field effects in a relatively ad hoc manner, unsure of how to properly assess whether Adaptive Tested products were performing equal to, better than, or worse than peer products that were not using Adaptive Test. The process innovation was to create a standard for evaluating Adaptive Tested products against non-Adaptive Tested products, using the AOQL setting as a guide, along with common methods for attributing any field effects back to Adaptive Test itself.
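One conventional way to make that comparison is a two-proportion test of field defect rates between the two cohorts. The sketch below is illustrative statistics rather than Cisco’s documented method, and the cohort figures are invented.

```python
# Minimal sketch: comparing field defect rates of Adaptive Tested vs. peer units.
# Illustrative statistics only; cohort sizes and defect counts are invented.
from math import sqrt, erfc

def two_proportion_test(defects_a: int, units_a: int,
                        defects_b: int, units_b: int) -> tuple[float, float]:
    """Return (z statistic, two-sided p-value) for the difference in defect rates."""
    rate_a, rate_b = defects_a / units_a, defects_b / units_b
    pooled = (defects_a + defects_b) / (units_a + units_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / units_a + 1 / units_b))
    z = (rate_a - rate_b) / std_err
    return z, erfc(abs(z) / sqrt(2))

if __name__ == "__main__":
    # Cohort A: Adaptive Tested units; cohort B: fully tested peer units.
    z, p = two_proportion_test(defects_a=40, units_a=20_000,
                               defects_b=45, units_b=21_000)
    print(f"z = {z:.2f}, p = {p:.3f}")  # a large p-value suggests parity in the field
```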
These two primary changes to the testing process have enabled teams to address the constantly shifting needs of product, geography, and material position to maximize Cisco’s ability to ship revenue-producing units without sacrificing product quality.
The consequent automation of quality monitoring tools has resulted in a ten-fold increase in capacity, right when Cisco needed it. The company also managed to reduce, rather than increase, the number of expensive testing chambers required, which came with the added ESG benefit of consuming less electricity and refrigerant.
Resource Link: Cisco, www.cisco.com