Happy’s Tech Talk #43: Engineering Statistics Training With Free Software
In over 50 years as a PCB process engineer, the one skill I acquired in college that has been most beneficial is engineering statistics. Basic statistics was part of my engineering fundamentals classes, but I petitioned the dean to let me take the engineering statistics graduate course because I was creating a senior thesis for my honors focus and needed more training on Design of Experiments (DOE).
I was recruited by Hewlett-Packard for its new semiconductor fabrication facility in Palo Alto (CA) specifically because of my undergraduate degree in chemical engineering and my master’s degree in electrical engineering with an emphasis on control engineering. After working in the new facility and as the only chemical engineer in the company, the semiconductor VP asked if I would be open to “going down the hill” to help out the struggling PCB multilayer plant with its process problems.
I took him up on his offer even though I had no knowledge or experience in printed circuit manufacturing. However, I had the tools and knowledge to solve the process problems.
My first impression: Making a PC multilayer was convoluted. It involved materials engineering, photolithography, complicated etching, lamination technology, CNC drilling and routing, screen printing, chemical deposition, electroplating, and electrical testing.
To learn, I read the only book on PCBs I could find, Printed Circuits Handbook (1967), by Clyde F. Coombs, and then I contacted all the suppliers involved with some of these problems.
Thanks to my DOE training, I whittled down the long list of variables and discovered the root cause of the production problems: The MEs, EEs, and chemists in the PCB engineering group had made the usual mistake of trying to solve the problems by holding everything constant and varying one variable at a time. They also focused on the quantitative variables, disregarding the qualitative variables. I didn’t do that.
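The advantage of a designed experiment over one-variable-at-a-time testing can be sketched in a few lines. A 2³ full-factorial design covers every combination of three two-level factors in eight runs, so interactions between factors become visible; the factor names below are invented for illustration, not the actual plant variables.

```python
# Minimal sketch of a 2^3 full-factorial design: every combination of
# three two-level factors, including a qualitative one. Factor names
# and levels are illustrative only.
from itertools import product

factors = {
    "etch_time": ("low", "high"),
    "bath_temp": ("low", "high"),
    "resist_type": ("A", "B"),   # a qualitative factor, deliberately included
}

# One run per combination of levels: 2 * 2 * 2 = 8 runs in total.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(i, run)
```

One-variable-at-a-time testing would hold two factors fixed while varying the third, which can never reveal that, say, resist type matters only at the high bath temperature.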
The Need for Statistical Tools
Any discussion of quality and customer satisfaction shows how important process variables are to PCBs. Any yield loss goes straight to the bottom line. So, what tools can help improve process yields? Process control comes to mind.
We use four criteria to select the correct statistical tool for problem analysis (Figure 1). Chemical processes have always been difficult to control in printed circuit manufacturing, and their uncontrolled factors can creep into our processes.
All process control is a feedback loop of sorts. Nevertheless, the element I want to focus on is the control block, or more precisely, the human decisions that make up process control.
Problem Solving and Process Control
The first link in process control is the human. The high-level objectives are to:
- Reduce variations
- Increase first-pass yields
- Reduce repair and rework
- Improve quality and reliability
- Improve workmanship
- Reduce waste
I have already listed the process control tools and methods that a person may work with. Statistical tools are of the utmost importance for engineers (Figure 1). They have traditionally been cumbersome and not easy to learn, but I have good news.
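One of the simplest of those statistical tools is the control chart: flag any measurement outside the mean ± 3 sigma limits. A rough sketch follows; the measurement values are made up for demonstration.

```python
# Individuals control chart limits (mean +/- 3 sigma) for a measured
# process variable. The data are invented for illustration.
import statistics

measurements = [49.8, 50.1, 50.0, 49.7, 50.3, 50.2, 49.9, 50.1, 49.8, 50.0]

mean = statistics.mean(measurements)
sigma = statistics.stdev(measurements)   # sample standard deviation
ucl = mean + 3 * sigma                   # upper control limit
lcl = mean - 3 * sigma                   # lower control limit

# Any point outside the limits signals an out-of-control process.
out_of_control = [x for x in measurements if x > ucl or x < lcl]
print(f"mean={mean:.2f} UCL={ucl:.2f} LCL={lcl:.2f} flagged={out_of_control}")
```

Real SPC practice adds run rules and rational subgrouping, but this is the core arithmetic behind the chart.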
You can now get excellent statistics training online for free. The NIST/SEMATECH e-Handbook of Statistical Methods and its Dataplot software provide the necessary tools to get this training. Even if your company has statistical tools like Minitab or DFSS, run the same experiments with Dataplot. This will give you Dataplot experience and a permanent record. Having an excellent statistical package at hand allows you to work on complex ideas over a long period.
NIST/SEMATECH e-Handbook of Statistical Methods
While looking for Weibull reliability plot information, I found this book from NIST (also called the Engineering Statistics e-Handbook) (Figure 2). The organization of the handbook follows the statistical tools that a process engineer needs to solve a problem (detailed in Table 1) with a breakdown of chapters as follows:
- 1–2: Measure the extent of the problem
- 3–4: Look for root causes and uncover how many factors are involved in the problem
- 5: Postulate a solution
- 6: Verify the solution
- 7: Monitor the process to ensure the problem is gone and does not reappear
- 8: Did the solution create any reliability problems?
Handbook Chapters
The e-Handbook of Statistical Methods is over 3,300 pages (Table 1). The table of contents presents the eight chapter headings and the highest-level section headings within each chapter. The usefulness of this handbook comes from its experimental data sets (case studies). As you read, you are encouraged to try the statistical tools yourself, because the handbook is paired with a complete statistical software program. When used with the supplied data sets, Dataplot coaches you through the interpretation of the results. You can also substitute your own data. This system of running a demonstration first, then running personal data, is an effective way to coach someone through the statistical tools. NIST has prepared a version of the Dataplot software for nearly all computer operating systems: DOS, Windows, NT, UNIX, macOS, Linux, etc. It is large, but the download time is worth it.
Handbook Integrated with the Software
Most of the sample output, graphics, and case studies were generated with Dataplot. Viewing them requires neither a Dataplot installation nor any knowledge of the program; any statistical package could have generated the same outputs and graphics.
The case studies contain a “work this example yourself” section using Dataplot with a pre-existing macro. Since Dataplot runs in a separate window, you can view the handbook pages and the Dataplot output simultaneously.
You can also generate your own commands in addition to running the handbook-generated macros. This integration requires the installation of Dataplot on your local system.
Dataplot can access the handbook as an online help system. This complements its built-in help because the handbook provides descriptions of the statistical techniques, while the online help focuses on how to implement each technique.
Correlation Plots and Curve Fitting
Among the Six Sigma (TQC) statistical tools are correlation plots and curve fitting/regression. CurveExpert Basic 2.0[1] (Figure 3) is an excellent, low-cost, comprehensive curve-fitting software system for Windows. A toolbox of over 35 built-in models can model XY data, including:
- Linear regression models
- Linear, quadratic, and polynomials to the 16th order
- Nonlinear regression models
- Exponential, modified exponential, logarithmic, reciprocal logarithm, vapor pressure, power, modified power, shifted power, geometric, modified geometric, root, Hoerl, modified Hoerl, reciprocal, reciprocal quadratic, Bleasdale, Harris, exponential association, saturation growth, Gompertz, logistic, Richards, MMF, Weibull, sinusoidal, Gaussian, hyperbolic, heat capacity, and rational function
- Interpolation (spline)
Figure 3 shows the model used to simulate the PCB first-pass yield (FPY).[2]
You can also build custom regression models with the 15 additional models provided:
Bacon-Watts, bent hyperbola, BET, beta distribution, Cauchy, Chapman-Richards, Freundlich, gamma, generalized hyperbola, Gunary, inverse, Langmuir, log normal, Lorentz equation, sum of exponentials, truncated Fourier series, and the two-parameter bell.
Moreover, you can compare the fit of various models, or let the software pick the best for you. You can download it from several websites featuring shareware software.
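The core of what such packages do, an ordinary least-squares fit plus a correlation coefficient, can be sketched in plain Python. The straight-line model and the data below are invented for demonstration; CurveExpert and Dataplot handle the nonlinear models in the list above as well.

```python
# Least-squares fit of y = a + b*x plus Pearson's r, from the closed-form
# formulas. Data values are made up for illustration (roughly y = 2x).
import math

x = [1, 2, 3, 4, 5]
y = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(xi * xi for xi in x)
syy = sum(yi * yi for yi in y)
sxy = sum(xi * yi for xi, yi in zip(x, y))

b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
a = (sy - b * sx) / n                           # intercept
r = (n * sxy - sx * sy) / math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy))

print(f"y = {a:.3f} + {b:.3f}x, r = {r:.4f}")
```

An r close to 1 indicates a strong positive linear correlation; comparing r (or residuals) across candidate models is how the software "picks the best fit for you."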
Summary
Per the Six Sigma tutorial,[3] “Always keep in mind that Engineering Statistics (ES) should be purpose-driven. Be clear on the organization’s vision for the future and stay focused on it. ES can be a powerful technique for unleashing employee creativity and potential, reducing bureaucracy and costs, and improving service to clients and the community. There is no single theoretical formalization of total quality, but Deming, Juran, and Ishikawa provide the core assumptions, as a ‘…discipline and philosophy of management which institutionalizes planned and continuous… improvement … and assumes that quality is the outcome of all activities that take place within an organization; that all functions and all employees have to participate in the improvement process; that organizations need both quality systems and a quality culture.’”
Engineering statistics may become the most useful tool in your tool belt. I learned it during my chemical engineering coursework, and it helped me solve problems and earn promotions. Later, Hewlett-Packard’s Japanese division earned the prestigious Deming Prize, and engineering statistics became a major course taught internally to all engineers and techs.
References
1. CurveExpert Basic 2.0 Released, CurveExpert.
2. “Calculating your PCB Complexity and First-pass Yields,” by Happy Holden, PCB007 Magazine, November 2024.
3. “Why Quality Circles Failed but Total Quality Management Might Succeed,” by Stephen Hill, British Journal of Industrial Relations, 1991.
This column originally appeared in the September 2025 issue of PCB007 Magazine.