Key to quality
A data warehouse appliance empowers quality-control efforts at Western Digital
Western Digital has always had a strong quality conscience. A brief lapse in quality control more than 10 years ago forced a concerted, costly effort to recall faulty hard drives from customers. Under the banner of the Quality Information System (QIS), the company started a long-term initiative to link all of the data sources required to track the genealogy of components and assemblies, along with the thousands of test parameters collected at each stage of manufacturing.
Director of Business Intelligence and Data Warehouse Ross Gough says QIS integrates data from Western Digital suppliers across components for the read/write heads, disk recording media and printed circuit boards, along with thousands of test parameters collected from each drive, head, radius and zone. Data from the call center on customer problems is also collected. When a drive fails, QIS collects a failure analysis consisting of several hundred counters, such as total operating hours and the number of spin-ups and spin-downs.
QIS has become pervasive across the company. “And all this information is available to everyone within the four walls of Western Digital to query,” Gough says.
Time to upgrade
When daily data loads began requiring 18 hours, the QIS team knew it was running up against a wall. So in January 2008, Gough’s group started procurement, considering options from Teradata, Sybase and Netezza. The group ultimately selected the Teradata Data Warehouse Appliance. “No one else came close to the price/performance or to the maturity,” he says. The decision was made easier when “Teradata made us an attractive offer,” Gough notes. “Because of the absence of conversion costs, Teradata would probably still have been cheaper at the end of the day.”
The Teradata Data Warehouse Appliance was selected, instead of the larger Teradata Active Enterprise Data Warehouse, because Western Digital did not need “all the bells and whistles of an enterprise-level platform,” such as mixed workload management.
Whether it is labeled an “appliance” does not make a difference to Western Digital. “Teradata has always been an appliance to us,” explains Gough. “It is a bundle of hardware, software and services … one stack delivered and maintained by one vendor.”
The purchase was approved in June 2008, and Teradata delivered the system in one week. “The box was on site by Friday; all the pieces were in place by Thursday and in production by the following Tuesday,” Gough recalls. His team was surprised that it took less than three weeks from purchase to production. The old system was phased out by mid-July, and the conversion involved only two database administrators (DBAs).
Since implementation, Gough and Senior Data Warehouse Architect Keehan Mallon say, the system has performed well. The upgrades went smoothly, with only eight minor regressions, a surprisingly low total given that everything on the system was new.
With the new system, typically 20 users are executing 15 to 30 active queries at any one time, and the Teradata Data Warehouse Appliance has increased performance four- to fivefold. Gough notes, “Users saw their queries go from 30 to 40 minutes down to five to 10 minutes, allowing them to get on and off the box so much faster.”
Mallon describes the daily manufacturing data volumes: “Two sets of data measurements are taken on the shop floor, each of which involves 60 to 100 values from the head, radius and zone. And we are building 500,000 drives per day.” The total amount of manufacturing data is around 60GB to 100GB per day, plus additional data flowing in from other sources.
Even with the additional data, the daily load process was reduced from 18 hours to between four and six hours. Previously, recovering from the loss of a daily load cycle caused by instability in a data source would take seven to 10 days. It now takes one or two days, a major improvement.
The initial return on investment (ROI) was quickly realized by eliminating “significant inconvenience and downtime for the customer” from recalls of faulty drives, Gough notes. “That was pretty quick and easy. QIS helped Western Digital deliver better quality. We now make warranty predictions with SAS analytics so that we do not reserve too much on the balance sheet.”
Western Digital is also starting to predict which drives will fail in the field. The company has had some success and is honing its predictive capability with high-end analytics, including neural network and regression modeling. And it is adding more media test data, which is helping fine-tune the manufacturing process for media.
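The article does not disclose the models Western Digital uses, but the regression-modeling idea it mentions, fitting a failure-probability model to per-drive counters such as operating hours and spin-up counts, can be sketched briefly. The following is a minimal logistic-regression sketch on synthetic data; the counters, ranges and failure rule are hypothetical stand-ins for the QIS failure-analysis counters, not actual Western Digital figures.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    # Numerically stable logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def features(hours, spinups):
    # Crude scaling keeps gradient descent stable; 50k hours and
    # 20k spin-ups are hypothetical upper bounds, not real QIS limits.
    return hours / 50.0, spinups / 20.0

def make_sample():
    # Synthetic drive: power-on hours and spin-ups in thousands, with a
    # made-up ground-truth rule that heavily used drives fail more often.
    hours = random.uniform(0.0, 50.0)
    spinups = random.uniform(0.0, 20.0)
    p_fail = sigmoid(0.08 * hours + 0.15 * spinups - 4.0)
    return features(hours, spinups), 1 if random.random() < p_fail else 0

data = [make_sample() for _ in range(1000)]

# Plain logistic regression fit by batch gradient descent (no libraries).
w1 = w2 = b = 0.0
lr = 0.5
for _ in range(500):
    g1 = g2 = gb = 0.0
    for (x1, x2), y in data:
        err = sigmoid(w1 * x1 + w2 * x2 + b) - y
        g1 += err * x1
        g2 += err * x2
        gb += err
    n = len(data)
    w1 -= lr * g1 / n
    w2 -= lr * g2 / n
    b -= lr * gb / n

def predict_failure_prob(hours, spinups):
    x1, x2 = features(hours, spinups)
    return sigmoid(w1 * x1 + w2 * x2 + b)

# A lightly used drive should score lower than a heavily used one.
low = predict_failure_prob(5.0, 2.0)
high = predict_failure_prob(45.0, 18.0)
```

A production system would of course train on real failure-analysis counters pulled from the warehouse and validate against held-out drives, and tools such as the SAS analytics or neural-network modeling mentioned above would replace this hand-rolled fit.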
Gough summarizes: “Our project is governed by the 80:20 rule. We are now focusing on the 20 percent because the 80 percent has been a success.”
This is a classic success story for a focused data warehousing project. First, the team had a clear, tangible business justification based on improvements in the quality of a high-volume manufacturing operation. The Western Digital culture of continuous quality improvement further reinforced that justification for the data warehousing efforts.
Second, the company retained its legacy infrastructure from a vendor that provided a viable migration path, thus avoiding lengthy conversion efforts. Third, it exploited the business value of disparate data by integrating this data and applying advanced analytics to surface actionable insights. The project gained considerable synergy from integrating across traditional functional silos.
The data warehouse team took a calculated risk on new technology and succeeded in delivering several key advantages, including reduced time to value and a noticeable performance boost for users.