
# Stability for Six Sigma Process

Six Sigma is a data-driven problem-solving approach that aims to increase profits by reducing defects (improving effectiveness) in a process. Defect reduction happens in two ways: first, by reducing the variation in the process, and second, by moving the actual process mean closer to the target mean.

Process Stability refers to the consistency of a process in staying within its control limits. If the process distribution remains consistent over time, i.e. the outputs fall within the expected range (the process width), the process is said to be stable or in control. If the outputs spread outside the limits, the process is unstable or out of control.

The Nelson Rules are a set of rules for detecting the presence of special causes, i.e. non-random behaviour, in a process. They are commonly used to check for process stability in Statistical Process Control (SPC) and were first published by Lloyd S. Nelson in 1984.

Applause for all the respondents - Suresh Kumar Gupta, Anuj Bhatnagar, Himanshu Sharma.

## Question

Q 526. A process is operating at Six Sigma Level. While checking for stability of this process, should one check for all Nelson Rules or only the first one? Support your answers with clear reasoning.

Note for website visitors - Two questions are asked every week on this platform. One on Tuesday and the other on Friday.

## Recommended Posts


Generally, a process is said to be stable if no data points lie outside the control limits on a control chart. However, Lloyd Nelson pointed out that even when all the points are within the control limits, there can be situations indicating the existence of special causes. He published 8 rules that help identify special causes even in a seemingly stable process.

Below are the rules applicable to both variable and attribute data for detecting special causes of variation in the process.

1. One point more than 3 standard deviations from the mean.

2. Six points in a row, all increasing or all decreasing.

3. Fourteen points in a row, alternating up and down.

4. Nine points in a row on the same side of the mean.

The remaining four rules apply only to variable data:

5. Two out of three consecutive points more than 2 standard deviations from the center line, on the same side.

6. Four out of five consecutive points more than 1 standard deviation from the center line, on the same side.

7. Fifteen points in a row, all within 1 standard deviation of the center line (on either side).

8. Eight points in a row, all more than 1 standard deviation from the center line (on either side, with none within 1 standard deviation).
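As a rough illustration, the eight rules above (numbered as in this post, which differs slightly from Nelson's original ordering) could be checked programmatically. This is my own sketch, not code from any standard SPC library; the function name and return structure are assumptions.

```python
def nelson_violations(x, mu, sigma):
    """Return {rule_number: [index of last point of each violating run]},
    using the rule numbering of this post."""
    z = [(v - mu) / sigma for v in x]  # distance from mean, in sigmas
    hits = {r: [] for r in range(1, 9)}
    n = len(x)
    for i in range(n):
        # Rule 1: one point more than 3 sigma from the mean
        if abs(z[i]) > 3:
            hits[1].append(i)
        # Rule 2: six points in a row, all increasing or all decreasing
        if i >= 5:
            w = x[i-5:i+1]
            if all(w[j] < w[j+1] for j in range(5)) or \
               all(w[j] > w[j+1] for j in range(5)):
                hits[2].append(i)
        # Rule 3: fourteen points in a row, alternating up and down
        if i >= 13:
            d = [x[j+1] - x[j] for j in range(i-13, i)]
            if all(d[j] * d[j+1] < 0 for j in range(len(d) - 1)):
                hits[3].append(i)
        # Rule 4: nine points in a row on the same side of the mean
        if i >= 8:
            w = z[i-8:i+1]
            if all(v > 0 for v in w) or all(v < 0 for v in w):
                hits[4].append(i)
        # Rule 5: two of three points beyond 2 sigma, same side
        if i >= 2:
            w = z[i-2:i+1]
            if sum(v > 2 for v in w) >= 2 or sum(v < -2 for v in w) >= 2:
                hits[5].append(i)
        # Rule 6: four of five points beyond 1 sigma, same side
        if i >= 4:
            w = z[i-4:i+1]
            if sum(v > 1 for v in w) >= 4 or sum(v < -1 for v in w) >= 4:
                hits[6].append(i)
        # Rule 7: fifteen points in a row, all within 1 sigma of the mean
        if i >= 14 and all(abs(v) <= 1 for v in z[i-14:i+1]):
            hits[7].append(i)
        # Rule 8: eight points in a row, all beyond 1 sigma (either side)
        if i >= 7 and all(abs(v) > 1 for v in z[i-7:i+1]):
            hits[8].append(i)
    return {r: idx for r, idx in hits.items() if idx}
```

For example, with mean 0 and standard deviation 1, a series containing a single point at 4 triggers rule 1, while nine consecutive points at 0.5 trigger rule 4 and nothing else.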

While multiple rules are available, it is important to decide which rule to apply when. Rule 1 is the fundamental one, so we can start with it in any situation. Rules 2 to 4 form a good set that catches many commonly occurring special causes for both variable and attribute data. For variable data, rules 5 and 6 increase sensitivity to changes in the process average, while rules 7 and 8 help identify problems related to sampling, viz. stratification and mixtures.

So, while checking the stability of a process, one should refer to all eight rules for variable data and only the first four for attribute data.


The Nelson rules are a framework in process control that helps determine whether a variable is consistent or out of control. Rules for detecting out-of-control or non-random conditions were originally given by Walter A. Shewhart in the 1920s. In October 1984, Lloyd S. Nelson published the Nelson rules in the Journal of Quality Technology.

The method involves plotting the variable on a control chart. The eight rules are based on the mean and standard deviation of the samples.

To properly monitor a process operating at the defined level, all 8 rules should be applied, as each rule signals a particular behaviour.

The first 4 rules are applicable to both variable and attribute data, while the last 4 apply only to variable data.



Walter A. Shewhart introduced control charts in 1924. Control charts are a very important tool for detecting abnormality in a process. On a control chart, the UCL and LCL represent ±3σ limits based on historical data collected from the process. Any point falling outside the UCL or LCL is therefore suspected to be abnormal and prompts us to analyse assignable causes.

Conversely, if all points fall within the UCL and LCL, the process is in statistical control and there is no immediate reason to suspect abnormality.

In 1984, Lloyd S. Nelson published in the Journal of Quality Technology the observation that there are other situations, with low probability of occurrence, that indicate the presence of special causes even when all the points are within the control limits. His 8 rules help identify such special causes; Rule 1 is the original case of one point outside the UCL or LCL.

Rule 1 is sufficient for most cases; indeed, Rule 1 alone will often detect more signals than you have time to investigate. So why do the other rules exist, and why should we be careful about using them? Whenever you add a detection rule, it has the effect of moving the power function curve to the left. As you add more rules, these incremental improvements in power become smaller, so the additional power gained by an extra detection rule is never as great as it looks. As the power function approaches its limit, the only way a new detection rule can move the power curve further left is by shifting the starting point of the curve upward.

In other words, each extra detection rule increases the chance of a false alarm while adding less and less detection power, so extra rules should be applied sparingly.
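The trade-off described above can be illustrated with a quick simulation. This is my own sketch under assumed conditions (an in-control standard normal process and 25-point charts, both arbitrary choices), comparing the false alarm rate of Rule 1 alone against Rule 1 combined with the nine-points-on-one-side runs rule.

```python
import random

random.seed(42)

def false_alarm(n_points, use_runs_rule, trials=20000):
    """Estimate the per-chart false alarm probability for an
    in-control standard normal process."""
    alarms = 0
    for _ in range(trials):
        x = [random.gauss(0, 1) for _ in range(n_points)]
        fired = any(abs(v) > 3 for v in x)  # Rule 1
        if not fired and use_runs_rule:     # nine points, same side
            fired = any(
                all(v > 0 for v in x[i-8:i+1]) or
                all(v < 0 for v in x[i-8:i+1])
                for i in range(8, n_points)
            )
        alarms += fired
    return alarms / trials

p1 = false_alarm(25, use_runs_rule=False)
p2 = false_alarm(25, use_runs_rule=True)
print(f"Rule 1 only: {p1:.3f}, Rule 1 + runs rule: {p2:.3f}")
```

The combined rule set flags noticeably more in-control charts than Rule 1 alone, which is exactly the false-alarm cost described above.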


It is a little sad that none of the respondents even remotely tried to answer the question.

The question specifically asked about the applicability of the Nelson rules for a Six Sigma process. The correct answer is that if a process is operating at the Six Sigma level, then only the first rule is sufficient.

I leave the reasoning up to the readers.


A control chart monitors a process variable over time. Two control limits are calculated: an Upper Control Limit (UCL) and a Lower Control Limit (LCL). The UCL is the largest value you would expect from a process with only common-cause variation present; the LCL is the smallest such value. As long as all the points are within the limits and there are no patterns, only common causes of variation are present and the process is said to be "in control".
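A minimal sketch of how such limits might be computed for an individuals chart, using made-up sample values (not data from the original post). For simplicity it uses the plain sample standard deviation rather than the moving-range estimate typically used in practice.

```python
from statistics import mean, stdev

# Assumed example measurements of a process variable
samples = [10.2, 9.8, 10.1, 10.0, 9.9, 10.3, 9.7, 10.1, 10.0, 9.9]

center = mean(samples)
sigma = stdev(samples)
ucl = center + 3 * sigma  # Upper Control Limit
lcl = center - 3 * sigma  # Lower Control Limit

print(f"CL = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")

# Points outside the limits would be flagged for assignable causes
out_of_control = [v for v in samples if v > ucl or v < lcl]
print("Points outside limits:", out_of_control)
```

For this example all points fall within the limits, so by the basic limit check alone the process would be considered in control.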

Nelson's first rule states that if one point is more than 3 standard deviations from the mean, then a special cause exists in the process; if the process is operating at the Six Sigma level, this rule is sufficient to identify special causes. The rule applies to both attribute and continuous data.
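As a quick supporting calculation (my own illustration): under an in-control normal process, the chance that any single point falls more than 3 standard deviations from the mean is about 0.27%, which is why a Rule 1 signal is such a strong indicator of a special cause.

```python
from math import erf, sqrt

def normal_cdf(z):
    # Standard normal CDF expressed via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

# Probability that a point lies beyond +/- 3 sigma when in control
p_beyond_3sigma = 2 * (1 - normal_cdf(3))
print(f"P(|z| > 3) = {p_beyond_3sigma:.5f}")  # about 0.0027
```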
