
This article series by Ray Harkins explores the tools essential for quality or reliability engineers and managers. Topics include statistical process control, reliability engineering, root cause analysis, and business finance.

In manufacturing, variation is the enemy of quality. Left unchecked, it erodes reliability, increases waste, and drives up costs. One of the simplest but most effective tools for controlling variation is Layered Process Auditing (LPA).

Of the many constructive soft skills, active listening is surely one of the most helpful in the design thinking process. More than just listening to collect information, active listening is a mindset that fosters empathy for the speaker. In business and engineering, it’s our customer that’s speaking. Active listening seeks to understand not only the other person’s words, but also their attitudes and motivations.
For many people, even the idea of “being a better listener” seems ludicrous. Once you’re listening, how can you listen better? It’s like swallowing better or breathing better. But the truth is, you can. Even the best listeners periodically fall into a handful of traps that obstruct their understanding of their employers’, customers’, and loved ones’ words.

Early in my quality management career, while working at a small extrusion and fabrication company, I learned something important: bosses pay attention to the money. And if I focused on cost savings projects, I could stay on their good side.
Most of my cost savings efforts at that time focused on eliminating specific types of defects. After all, even a low-frequency defect—especially one that reaches a customer—can drive substantial savings once resolved. Other projects looked inward, targeting inefficiencies in our systems and practices. Lab procedures, control plans, and audit schedules tend to drift out of sync with the products and processes they’re supposed to control. So every now and then, a little system hygiene—an organized cleanup—can free up resources and allow you to reallocate attention to where it’s needed most.
It was during one of those hygiene projects that I stumbled into something I’ve since come to call The Paradox of Invisible Discipline.

Manufacturing professionals who regularly apply Statistical Process Control (SPC) methods, such as X̄ and R charts, know the power of SPC to maintain control of both quality and cost. The standard practice emphasizes attention to variation, with operators trained to take corrective action whenever a process signals an out-of-control condition. Typically, this means adjusting a process input, verifying the correction, and resuming production.
However, not every out-of-control signal requires intervention, especially in processes where inherent variation is extremely small and product quality is consistently high. In some cases, such as tooling wear or small material variations, process center shifts may occur that lie beyond the operator’s ability to correct. Responding to these shifts with standard SPC rules can lead to overcorrection, lost productivity, and increased variation.
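To ground the terminology, here is a minimal sketch of the X̄ and R chart calculations for subgroup data, assuming a subgroup size of five and the standard control chart constants for that size; the process values are simulated purely for illustration.

```python
import numpy as np

# Simulated data: 20 subgroups of 5 measurements from an assumed stable process
rng = np.random.default_rng(seed=1)
subgroups = rng.normal(loc=10.0, scale=0.02, size=(20, 5))

xbars = subgroups.mean(axis=1)                          # subgroup averages
ranges = subgroups.max(axis=1) - subgroups.min(axis=1)  # subgroup ranges

xbar_bar = xbars.mean()   # grand average (center line of the X-bar chart)
r_bar = ranges.mean()     # average range (center line of the R chart)

# Standard control chart constants for subgroup size n = 5
A2, D3, D4 = 0.577, 0.0, 2.114

ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar  # X-bar chart limits
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar                        # R chart limits

# Flag subgroups whose average falls outside the X-bar limits
signals = np.flatnonzero((xbars > ucl_x) | (xbars < lcl_x))
print(f"X-bar chart: LCL={lcl_x:.4f}, CL={xbar_bar:.4f}, UCL={ucl_x:.4f}")
print(f"R chart:     LCL={lcl_r:.4f}, CL={r_bar:.4f}, UCL={ucl_r:.4f}")
print("Subgroups signaling out of control:", signals)
```

Whether a signal flagged this way actually warrants an adjustment is exactly the judgment call at issue when the underlying shift is something the operator cannot influence.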

In manufacturing and quality assurance, we rarely have the time—or budget—to inspect every single part in a batch. That’s where acceptance sampling comes in. It’s a statistical tool that helps us decide whether to accept or reject a lot of material based on a random sample. But more than just a shortcut, acceptance sampling is a structured decision-making process that balances quality expectations with efficiency and risk.
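To make the mechanics concrete, here is a minimal sketch of a hypothetical single sampling plan (the plan parameters n = 80 and c = 2 are assumed for illustration) that computes the probability of accepting a lot at several incoming defect levels using the binomial model:

```python
from math import comb

def prob_accept(n: int, c: int, p: float) -> float:
    """Probability of accepting the lot: P(no more than c defectives in a
    sample of n), using the binomial model (reasonable when the lot is large
    relative to the sample)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Hypothetical single sampling plan: inspect n = 80 parts, accept if <= 2 defectives
n, c = 80, 2
for p in (0.005, 0.01, 0.02, 0.05):  # assumed lot fraction-defective levels
    print(f"Lot fraction defective {p:.3f}: P(accept) = {prob_accept(n, c, p):.3f}")
```

Sweeping the defect rate across a range of values and plotting the acceptance probability produces the plan's operating characteristic (OC) curve, which is how the balance between quality expectations and risk is usually visualized.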
If you work in manufacturing, quality, or engineering, you’ve probably heard the term “Design of Experiments,” or DOE. Maybe you’ve even helped collect data for one without fully understanding what was going on behind the scenes. That’s okay. DOE is a sophisticated statistical method, but even if you’re not the one crunching the numbers, understanding the big ideas behind it can make you a more effective technician, engineer, or manager.
At its core, Design of Experiments is a structured statistical approach to testing. It helps us understand how different inputs (or factors) affect an outcome (or response). Whether you’re adjusting a process to improve tensile strength, dialing in machine settings to reduce defects, or figuring out which supplier provides the most consistent material, DOE helps answer one big question: What settings or conditions will give me the result I want?
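As a minimal illustration of those ideas, the sketch below works through a two-factor, two-level full factorial; the factor names and response values are invented solely to show how main effects and the interaction are estimated from coded runs.

```python
import numpy as np

# Coded factor levels (-1 = low, +1 = high) for a 2x2 full factorial
# Hypothetical factors: A = temperature, B = pressure
A = np.array([-1, +1, -1, +1])
B = np.array([-1, -1, +1, +1])

# Hypothetical measured responses (e.g., tensile strength) for the four runs
y = np.array([52.0, 61.0, 55.0, 70.0])

# Effect = (average response at the high level) - (average response at the low level)
effect_A = y[A == +1].mean() - y[A == -1].mean()
effect_B = y[B == +1].mean() - y[B == -1].mean()
effect_AB = y[A * B == +1].mean() - y[A * B == -1].mean()  # interaction effect

print(f"Main effect of A:  {effect_A:+.1f}")
print(f"Main effect of B:  {effect_B:+.1f}")
print(f"A x B interaction: {effect_AB:+.1f}")
```

Real studies add replication, randomization, and significance testing on top of this arithmetic, but the core logic of comparing averages across factor levels is the same.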

Brainstorming may sound like a casual conversation technique, but when applied properly, it becomes a critical tool for continuous improvement across a range of creative and technical disciplines.
In a structured environment, brainstorming enables organizations to capture a wide range of ideas, perspectives, and solutions — ideas that might never surface in a traditional problem-solving meeting.
Let’s take a step back and revisit the basics of effective brainstorming, especially its role in driving meaningful change.

In the world of quality, reliability, product design, and manufacturing, improvement is a necessity, not a luxury. Few models have provided a stronger foundation for improvement than the Deming Cycle, commonly referred to as PDCA: Plan-Do-Check-Act.
Although simple in structure, PDCA represents a deep and disciplined approach to learning, problem-solving, and continuous improvement. Whether you are optimizing a production process, refining a laboratory method, or developing a new product, understanding PDCA is essential.

Continuous improvement doesn’t happen in a vacuum. To raise the quality, reliability, and performance of their products and processes, organizations must look beyond their own four walls.
Benchmarking — the practice of studying others to improve your own performance — is a foundational tool for competitive advantage.
Whether comparing products, processes, or management systems, effective benchmarking helps organizations discover new ideas, set realistic goals, and accelerate improvement.
Let’s take a step back and explore the basics of benchmarking: what it is, how it works, and why it matters.

Across industries and disciplines, one challenge remains constant: how to prioritize the right improvement projects to drive meaningful, measurable progress. Whether you work in continuous improvement, reliability, quality, or manufacturing engineering, you have more ideas and opportunities than time or resources to pursue them. This is where the Theory of Constraints (TOC) shines. Developed by Dr. Eliyahu Goldratt, TOC offers a focused methodology for identifying the best opportunities for improvement, allocating resources wisely, and sustaining continuous advancement.

H.L. Mencken, the sharp-witted satirist and critic of early 20th-century American life, once wrote, “For every complex problem, there is an answer that is clear, simple, and wrong.”1 Mencken, born in Baltimore in 1880, was known for his incisive critiques of societal norms and his skepticism of simplistic solutions to complex issues.
Mencken’s wisdom applies to both everyday life and technical fields. His insights are particularly relevant in the engineering, manufacturing, and reliability disciplines, where the temptation to seek easy answers can lead to costly errors. We’ll start with a relatable everyday example before exploring documented cases in engineering and manufacturing that demonstrate the pitfalls of oversimplified solutions.
One of the most persistent points of confusion in quality engineering is the difference between traditional statistical process capability analysis and the Six Sigma approach. Specifically, why does Six Sigma define a “six sigma” process as having 3.4 defective parts per million (DPPM), when a straightforward application of statistical tables suggests that six standard deviations from the mean should correspond to a far lower defect rate—about 2 parts per billion? The answer lies in what Six Sigma practitioners call the 1.5 sigma shift.
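The arithmetic behind both figures is straightforward to reproduce. A short sketch using the standard normal distribution (via scipy here) shows the two-sided tail area at ±6σ for a perfectly centered process, and the dominant one-sided tail at 4.5σ once the assumed 1.5 sigma drift is applied:

```python
from scipy.stats import norm

# Defect rate for a perfectly centered process with +/- 6 sigma specification limits
centered_ppb = 2 * norm.sf(6.0) * 1e9    # about 2 defects per billion (two-sided)
print(f"Centered process: {centered_ppb:.1f} defects per billion")

# Six Sigma convention: the mean is assumed to drift 1.5 sigma toward one limit,
# leaving 4.5 sigma of margin on that side; the opposite tail becomes negligible
shifted_dppm = norm.sf(6.0 - 1.5) * 1e6  # about 3.4 defects per million
print(f"With a 1.5 sigma shift: {shifted_dppm:.1f} defects per million")
```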

For engineering, quality, and manufacturing professionals, the accuracy and precision of measurement systems are essential. Gage Repeatability and Reproducibility (Gage R&R) studies provide a formal method for evaluating measurement system variation. And when the results of a study indicate that the gage is unacceptable, it’s a signal that something needs to change. But how should you approach solving the problem? This article provides a detailed guide to systematically troubleshoot and improve your measurement system when your Gage R&R results fall short of expectations.
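For orientation, acceptability is usually judged by comparing a %GRR figure against the widely cited 10% and 30% guidelines. The sketch below assumes the variance components have already been estimated from a study (the values shown are illustrative, not from any real gage) and shows how that comparison is typically made:

```python
import math

# Illustrative variance components, as would be estimated from a crossed Gage R&R study
var_repeatability = 0.0016    # equipment variation (within-appraiser)
var_reproducibility = 0.0009  # appraiser variation (between-appraiser)
var_part = 0.0400             # part-to-part variation

var_grr = var_repeatability + var_reproducibility
var_total = var_grr + var_part

# %GRR expressed against total observed variation
pct_grr = 100 * math.sqrt(var_grr / var_total)
print(f"%GRR = {pct_grr:.1f}%")

if pct_grr < 10:
    print("Generally considered acceptable.")
elif pct_grr <= 30:
    print("Marginal - may be acceptable depending on the application and cost.")
else:
    print("Unacceptable - the measurement system needs improvement.")
```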

The term Measurement Systems Analysis refers to a collection of experimental and statistical methods designed to evaluate the error introduced by a measurement system and the resulting usefulness of that system for a particular application.
Measurement systems range from the simplest of gages like steel rulers to the most complex, multi-sensor measurement systems. Yet regardless of their sophistication, all gages are flawed and fail to deliver a perfectly accurate result to their users. This idea is best expressed by an equation fundamental to measurement science.
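In its simplest and most commonly quoted form, that relationship states that every reading is the sum of the quantity being measured and the error contributed by the measurement system:

Observed Value = True Value + Measurement Error

However capable the gage, the error term is never exactly zero; the practical question MSA answers is whether it is small enough for the application at hand.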

Our work as quality and reliability engineers, or in countless other technical roles across every industry, relies heavily on the instrumentation we use. Torque meters, tensile testers, micrometers, spectrometers, and coordinate measuring machines provide critical data about the variation within the processes we design and maintain.
But these tools execute measurement processes which, like all processes, introduce variation into the results they generate. This fact – that every gage contributes variation to the values it reports – is the basis for Measurement Systems Analysis (MSA), a collection of statistical tools and approaches designed to isolate and quantify sources of measurement error.
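Assuming the individual sources of variation are independent, that partitioning is typically written in terms of variances:

σ²(total observed) = σ²(part-to-part) + σ²(measurement system)

with the measurement-system term often split further into repeatability (equipment variation) and reproducibility (appraiser variation).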