A 21st Century Perspective on Data, Analysis, and TQM
August 8, 2023 | Nolan Johnson, I-Connect007
Chris Chapman is a Deming management method practitioner and consultant who publishes “The Digestible Deming” blog on Substack. Chris has been a student of Deming’s agile, Lean, and related methods since 2007. With his software and data background, he brings something of a 21st century perspective to quality. In this conversation, we explore how data and AI might be changing how we approach quality.
Nolan Johnson: Chris, as we talk about TQM, I’d like to ask you to consider the amount of data that we now have available. Our ability to collect data, implement sensors, get real-time data from the manufacturing floor, analyze it, and make it available to upper management has increased. What would be Deming’s perspective on this amount of data? Back in his day, surely, they didn’t have this much data to roll up.
Christopher Chapman: There's a fantastic little quip in Dr. Joyce Orsini's book, The Essential Deming. Joyce was one of Deming's PhD students and worked with him right up until the end. In one chapter, he comments on the data that was available to management back then. He says, "Tons of figures, no knowledge." The idea was that, even 30 to 40 years ago, you could have voluminous amounts of information that would overwhelm management—never mind what we can access today—without any real methodology to interpret what you're actually seeing: to distinguish signals from noise and gain insight into what the data is telling you about how a system or process is working.
Here's a real-world example from a particular customer of mine in telecom, where they wanted to track the activation of Apple watches as the first versions rolled out. How many people were going onto the network? A senior executive who was very hot to trot on understanding this said, “Gather all the data that tells me the activation and deactivation rates of these watches nationwide.”
It took seven or eight hours a day for people to pull that data together from all the systems and roll it up onto a dashboard as a single number. That number was green when it went up and red when it went down. You can imagine the behavior that encouraged.
What I suggested to the managers I was working with was to present the data in context over time on a run chart with process limits, so that leadership could see, say, a quarter's worth of activations. That way you can take the temperature of the process and establish the estimated three-sigma limits for normal activation. You don't have to explain the math or draw the limits by hand; just calculate them and report what the picture of normal looks like. We want to economize our interventions, acting only when we see very overt signals above or below the limits in the activation/deactivation cycle. Only then do we look at the contributing system indicators.
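The approach Chapman describes—estimating three-sigma limits from the data itself and flagging only the points that fall outside them—can be sketched in a few lines. This is a minimal illustration using the common XmR (individuals) method, where sigma is estimated from the average moving range; the activation figures and function names here are hypothetical, not from the interview.

```python
import statistics

def process_limits(values, sigma=3):
    """Estimate process limits for a run chart using the XmR method:
    sigma is estimated as (mean moving range / 1.128), which is less
    distorted by outliers than the overall standard deviation."""
    mean = statistics.fmean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    est_sigma = statistics.fmean(moving_ranges) / 1.128
    return mean, mean - sigma * est_sigma, mean + sigma * est_sigma

def signals(values, lower, upper):
    """Flag only the overt signals: points outside the process limits."""
    return [(i, v) for i, v in enumerate(values) if v < lower or v > upper]

# Hypothetical daily watch activations over two weeks
activations = [120, 132, 127, 118, 125, 130, 122, 128,
               124, 131, 119, 126, 210, 123]
mean, lo, hi = process_limits(activations)
flagged = signals(activations, lo, hi)  # only the day-13 spike stands out
```

The point of the design is exactly what Chapman argues: routine variation between the limits draws no intervention, and only the flagged points trigger a look at the underlying system.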
To read this entire conversation, which appeared in the July 2023 issue of PCB007 Magazine, click here.