Deming's Funnel Simulator: Learn the Principles of Variation and Process Management

Posted on March 8, 2012




[Photo: "I've got my head caught in a funnel!" (photo credit: hyperboreal)]

Dr. W. Edwards Deming was an advocate of achieving Quality through Process Management. In his book 'Out of the Crisis', he described the adverse effects of tampering with a stable process, and he demonstrated through the famous Funnel Experiment how adjusting a stable process can increase its variation.

Symphony Technologies hosts a Deming's Funnel Simulator on its website. The simulator is free to use.

The Deming's Funnel Simulator simulates the four rules of adjustment described by Deming and calculates the process variation associated with each rule. The simulator has strong graphics and explains the four types of process adjustment in detail.

Trainers can use it to teach the Principles of Variation and Process Management, and students can use it to explore Deming's principles of process adjustment and overadjustment.

From Quality America

Tampering occurs when adjustments are made to a process that is in statistical control. Adjusting a controlled process will always increase process variability, an obviously undesirable result. The best means of diagnosing tampering is to conduct a process capability study (see IV.H.4) and to use a control chart to provide guidelines for adjusting the process.

Perhaps the best analysis of the effects of tampering is from Deming (1986). Deming describes four common types of tampering by drawing the analogy of aiming a funnel to hit a desired target. These “funnel rules” are described by Deming (1986, p. 328):

1. “Leave the funnel fixed, aimed at the target, no adjustment.
2. “At drop k (k = 1, 2, 3, …) the marble will come to rest at point zk, measured from the target. (In other words, zk is the error at drop k.) Move the funnel the distance -zk from the last position. Memory 1.
3. “Set the funnel at each drop right over the spot zk, measured from the target. No memory.
4. “Set the funnel at each drop right over the spot (zk) where it last came to rest. No memory.”

Rule #1 is the best rule for stable processes. By following this rule, the process average will remain stable and the variance will be minimized. Rule #2 produces a stable output but one with twice the variance of rule #1. Rule #3 results in a system that “explodes”, i.e., a symmetrical pattern will appear with a variance that increases without bound. Rule #4 creates a pattern that steadily moves away from the target, without limit.
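These outcomes are easy to check with a short simulation. The sketch below is a minimal one-dimensional version with Gaussian scatter (the function and parameter names are illustrative; this is not the Symphony Technologies simulator):

```python
import random
import statistics

def simulate(rule, drops=10000, sigma=1.0, seed=42):
    """Return the errors z_k (resting positions, measured from the target)
    for one of Deming's four funnel rules, in one dimension."""
    rng = random.Random(seed)
    funnel = 0.0              # funnel position, measured from the target
    errors = []
    for _ in range(drops):
        z = funnel + rng.gauss(0.0, sigma)   # marble rests at funnel + scatter
        errors.append(z)
        if rule == 2:
            funnel -= z       # move the funnel the distance -z from its last position
        elif rule == 3:
            funnel = -z       # compensate relative to the target
        elif rule == 4:
            funnel = z        # set the funnel over the last resting spot
        # rule 1: leave the funnel fixed, no adjustment
    return errors

for rule in (1, 2, 3, 4):
    var = statistics.pvariance(simulate(rule))
    print(f"rule {rule}: variance = {var:.1f}")
```

Under these assumptions, rule #1 gives a variance near sigma^2 = 1, rule #2 near 2·sigma^2 = 2, and rules #3 and #4 give variances that keep growing as the number of drops increases.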

At first glance, one might wonder about the relevance of such apparently abstract rules. However, upon more careful consideration, one finds many practical situations where these rules apply.

Rule #1 is the ideal situation and it can be approximated by using control charts to guide decision-making. If process adjustments are made only when special causes are indicated and identified, a pattern similar to that produced by rule #1 will result.

Rule #2 has intuitive appeal for many people. It is commonly encountered in such activities as gage calibration (check the standard once and adjust the gage accordingly) or in some automated equipment (using an automatic gage, check the size of the last feature produced and make a compensating adjustment). Since the system produces a stable result, this situation can go unnoticed indefinitely. However, as shown by Taguchi, increased variance translates to poorer quality and higher cost.

The rationale that leads to rule #3 goes something like this: “A measurement was taken and it was found to be 10 units above the desired target. This happened because the process was set 10 units too high. I want the average to equal the target. To accomplish this I must try to get the next unit to be 10 units too low.” This might be used, for example, in preparing a chemical solution. While reasonable on its face, the result of this approach is a wildly oscillating system.

A common example of rule #4 is the “train-the-trainer” method. A master spends a short time training a group of “experts,” who then train others, who train others, et cetera. An example is on-the-job training. Another is creating a setup by using a piece from the last job. Yet another is a gage calibration system where standards are used to create other standards, which are used to create still others, and so on. Just how far the final result will be from the ideal depends on how many levels deep the scheme has progressed.
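That compounding can be sketched numerically. The calibration chain below is hypothetical, and the transfer-error size is an assumption chosen purely for illustration:

```python
import random
import statistics

def chain_offset(levels, sigma=0.1, seed=0):
    """Offset of the final standard from the master after `levels` transfers.

    Rule #4 in miniature: each new standard is set from the previous
    standard rather than from the master, so every transfer adds an
    independent error and the offset performs a random walk.
    """
    rng = random.Random(seed)
    offset = 0.0
    for _ in range(levels):
        offset += rng.gauss(0.0, sigma)   # error introduced by this transfer
    return offset

# The spread of the final offset across many independent chains grows
# like sigma * sqrt(levels): the deeper the scheme, the further from ideal.
spread_1 = statistics.pstdev(chain_offset(1, seed=s) for s in range(500))
spread_10 = statistics.pstdev(chain_offset(10, seed=s) for s in range(500))
print(f"spread after 1 transfer:   {spread_1:.3f}")
print(f"spread after 10 transfers: {spread_10:.3f}")
```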

Posted in: Quality, Statistics