The History of Predictive Engineering

Happy Holden, I-Connect007 | 11-01-2017

It all started in 1983, at HP, when I complained to our group's vice president that our W. Edwards Deming and Total Quality Management (TQM) Six Sigma training was being concentrated in PCB manufacturing. We had eliminated final inspection and instead placed quality in the hands of the operators with a final electrical test. The electrical test was governed by what we learned from Deming.

Our customers’ problems with on-time delivery and rejects had gone away, but now we had to put all our engineering at the front end of the process, inspecting incoming CAD designs for producibility, because the PCB design groups had no concept of a “quality design.” To them, the customer was the electronic designer and the schedule. To meet the schedules, they would throw a design “over the wall” even when they knew it had problems.

“Since you understand the problem, I am going to promote you to PCB design services manager,” said the VP, agreeing with me. “In integrated circuits, we design the physical chips and then fabricate them from the specs and circuits the EE gives us. I have never understood why the PCB fab group did not also do PCB layout on the IC model!”

Not the answer I wanted, but at least he gave me the budget and the designers’ salary flexibility I needed to recruit and start a PCB layout group as part of PCB manufacturing. So, DFM became the central focus for our design strategy.

DFM/A was just starting out at HP. HP had taken a license from Hitachi and GE to use the Boothroyd and Dewhurst DFM/A methodology. This came after an extensive benchmarking review of past HP product designs and visits to other large manufacturing companies such as John Deere, Ford, Hitachi, Caterpillar, and Western Electric. The benchmarking showed clearly that just hiring the smartest engineers and giving them the best EDA tools did not guarantee a superior product. The electronic circuits may have been superior, but the physical products were far too overdesigned and complex, which added a great deal of unnecessary cost.

The B&D DFM/A methodology was used concurrently with the physical design to point out complexities and provide feedback on how to simplify. It was also used to benchmark competitors’ products. The methodology was focused on “doing it right the first time.” It was not a design rule checker, and it was far more useful than “best practices,” although those are useful tools as well. This is where I get my definition of DFM, or DFX.
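As a rough illustration of the kind of feedback such a methodology gives, the sketch below computes the published B&D DFA design-efficiency index, which compares an ideal assembly time (the theoretical minimum part count at a nominal three seconds per part) against the estimated total assembly time. The part list, times, and the 0.60 threshold here are hypothetical examples, not HP's or B&D's actual numbers.

```python
# Sketch of a Boothroyd-Dewhurst-style DFA design-efficiency index.
# The index compares an ideal assembly time (theoretical minimum part
# count x ~3 s per part) against the estimated actual assembly time.
# Part data and the 0.60 "good design" threshold are hypothetical.

IDEAL_SECONDS_PER_PART = 3.0  # B&D nominal per-part handling/insertion time

def dfa_efficiency(parts):
    """parts: list of (name, assembly_seconds, theoretically_essential)."""
    n_min = sum(1 for _, _, essential in parts if essential)
    total_time = sum(seconds for _, seconds, _ in parts)
    return (n_min * IDEAL_SECONDS_PER_PART) / total_time

parts = [
    ("housing",   4.0, True),
    ("pcb",       6.5, True),
    ("screw x4", 10.0, False),  # fasteners are candidates for elimination
    ("cover",     5.0, True),
    ("label",     2.5, False),
]

eff = dfa_efficiency(parts)
print(f"DFA efficiency index: {eff:.2f}")  # ~0.32 for this example
if eff < 0.60:
    print("Low efficiency: look for parts to combine or eliminate.")
```

A low index is exactly the kind of concurrent feedback described above: it points the designer at the fasteners and labels that could be combined or eliminated before the design is released.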

CAM software, including design rule checks (DRCs), covers activities that take place after the design is finished. We found one EDA supplier, Zuken, that had “design advisor” software for signal integrity, which it gained when it acquired Racal-Redac. The advisor resided on the screen and provided real-time feedback on SI as the designer routed traces, pulling variables from the database and running a real-time simulation.
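To give a feel for what such real-time feedback involves, here is a minimal sketch of one check a design advisor might run while a trace is being routed: estimating characteristic impedance from the trace geometry with the well-known IPC-2141 surface-microstrip approximation and flagging deviations from target. This is an illustrative stand-in, not Zuken's actual algorithm; the geometry values and tolerance are made up.

```python
import math

def microstrip_z0(h_mm, w_mm, t_mm, er):
    """IPC-2141 surface-microstrip approximation (valid roughly for
    0.1 < w/h < 2.0 and er < 15)."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(
        (5.98 * h_mm) / (0.8 * w_mm + t_mm)
    )

def si_advisor(h_mm, w_mm, t_mm, er, target_ohms=50.0, tol_ohms=5.0):
    """Return an on-screen-style warning if the routed trace's estimated
    impedance drifts outside target +/- tolerance. Values hypothetical."""
    z0 = microstrip_z0(h_mm, w_mm, t_mm, er)
    if abs(z0 - target_ohms) > tol_ohms:
        return f"WARN: Z0 = {z0:.1f} ohm, target {target_ohms:.0f} ohm"
    return f"OK: Z0 = {z0:.1f} ohm"

# Example: 0.2 mm dielectric, 0.25 mm trace, 35 um copper, FR-4 (er ~ 4.3)
print(si_advisor(h_mm=0.2, w_mm=0.25, t_mm=0.035, er=4.3))
# -> WARN: Z0 = 59.2 ohm, target 50 ohm
```

A production advisor runs a full field-solver or behavioral simulation against the board database; the point here is simply that the check is cheap enough to run on every routed segment, which is what makes the feedback real-time rather than after-the-fact.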

This article originally appeared in the September 2017 issue of The PCB Design Magazine.