Chaudhary Anshu

Excellence Ambassador

Community Reputation

2 Average

About Chaudhary Anshu

  • Rank
    Active Member

Profile Information

  • Name
    Anshu Chaudhary


  1. Efficiency is about using fewer resources to deliver the output; effectiveness is about delivering the meaningful, required output. A small example is a bulb that provides the required lighting but converts only about 2% of its energy into light and the other 98% into heat: it is effective but not efficient.
  2. The producer's risk, or type I error, occurs when a true null hypothesis is rejected, whereas the consumer's risk, or type II error, occurs when a false null hypothesis is not rejected. Decreasing one type of error increases the other. Controlling both together depends on the sample size, which is sometimes difficult to increase or decrease in a given situation.
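The trade-off above can be made concrete with a stdlib-only sketch: for a one-sided z-test, the consumer's risk (beta) at a fixed producer's risk (alpha) shrinks only when the sample size grows. The effect size, sigma, and sample sizes here are illustrative assumptions, not values from the post.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def normal_ppf(p, lo=-10.0, hi=10.0):
    """Inverse of the standard normal CDF by bisection (stdlib-only)."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def type2_risk(alpha, effect, sigma, n):
    """Beta (consumer's risk) for a one-sided z-test of H0: mu = mu0,
    when the true mean is shifted by `effect`, with known sigma and size n."""
    z_crit = normal_ppf(1 - alpha)          # rejection threshold under H0
    shift = effect * math.sqrt(n) / sigma   # standardized true shift
    return normal_cdf(z_crit - shift)       # P(fail to reject | H0 false)

for n in (10, 30, 100):
    print(n, round(type2_risk(alpha=0.05, effect=0.5, sigma=1.0, n=n), 3))
```

Note how tightening alpha (say, from 0.05 to 0.01) at a fixed n raises beta, while raising n lowers beta at any fixed alpha.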
  3. At first glance, zero defects looks impractical. We cannot point to examples from the industries we know, or see around us, where evidence of zero defects can be presented, so one may wonder why the concept originated at all. But nothing is without reason, and zero defects does have its evidence in the aviation and space industries. Any miss from zero defects in such industries leads to damages that are categorically called critical or major in the world of quality, since human life is at stake. And when we have industries achieving zero defects, then surely it can be applied in other industries as well, provided we allow a small margin of tolerance.
  4. Measures of central tendency (the centre of the data) and of spread (variability) are the parts of descriptive statistics that provide the key to understanding data and the process that generated them. Measures of central tendency capture what might variously be termed the typical, normal, expected, or average value of a data set; three ways to quantify the central tendency of a data set are the mean, median, and mode. The spread of the data can be examined in a table or in graphical form; three ways to quantify spread are the range, variance, and standard deviation. A graph makes clear any symmetry (or lack of it) in the spread of the data, whether there are obviously atypical values (outliers), and whether the data is skewed in one direction or the other (a tendency for more values to fall in the upper or lower tail of the distribution). When assessing data with extreme outliers, measures of central tendency such as the mean have less relevance.
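A minimal stdlib sketch of these measures, on a made-up data set with one extreme outlier, shows why the mean loses relevance while the median resists:

```python
import statistics

data = [12, 14, 14, 15, 16, 17, 200]  # hypothetical data; 200 is an extreme outlier

mean = statistics.mean(data)      # pulled strongly toward the outlier
median = statistics.median(data)  # resistant to the outlier
mode = statistics.mode(data)      # most frequent value

rng = max(data) - min(data)       # range
var = statistics.variance(data)   # sample variance
sd = statistics.stdev(data)       # sample standard deviation

print(f"mean={mean:.1f} median={median} mode={mode}")
print(f"range={rng} variance={var:.1f} stdev={sd:.1f}")
```

Here the single outlier drags the mean above every other observation, while the median stays in the middle of the bulk of the data.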
  5. Every process has variation. The sources of process variation can be divided into two categories: special and common. Common cause variability is inherent in the process and generally is not controllable by process operators. Examples of common causes include variation in raw materials and variations in ambient temperature and humidity. In the case of service processes, common causes typically include such things as variation in input data, variations in customer load, and variation in computer operations. Some authors refer to common cause variation as natural variation. Special causes of variation are unusual events that, when detected, can usually be removed or adjusted. Examples include tool wear, gross changes in raw materials, and broken equipment. Special causes are sometimes called assignable causes. A principal problem in process management is the separation of special and common causes. If the process operator tries to adjust a process in response to common cause variation, the result is usually more variation rather than less; this is sometimes called overadjustment or overcontrol. If a process operator fails to respond to the presence of a special cause of variation, that cause is likely to produce additional process variation; this is referred to as underadjustment or undercontrol. The principal purpose of control charts is to help the process operator recognise the presence of special causes so that appropriate action can be taken.
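Separating special from common causes via control limits can be sketched with stdlib tools. The data and the three-sigma limits below are illustrative; a real individuals chart would estimate sigma from the moving range rather than the sample standard deviation.

```python
import statistics

def control_limits(baseline):
    """Shewhart-style limits: mean +/- 3 standard deviations.
    (A sketch; real I-charts estimate sigma from the moving range.)"""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - 3 * sigma, mu + 3 * sigma

def special_cause_points(baseline, new_points):
    """Flag points beyond the limits as likely special-cause signals."""
    lcl, ucl = control_limits(baseline)
    return [x for x in new_points if x < lcl or x > ucl]

# Hypothetical baseline data showing only common-cause noise
baseline = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.9, 10.1]

# 10.0 and 9.9 stay inside the limits; 12.5 signals a special cause
print(special_cause_points(baseline, [10.0, 9.9, 12.5]))
```

Points inside the limits are treated as common cause variation and left alone (avoiding overadjustment); points outside trigger investigation (avoiding underadjustment).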
  6. Autonomation is automation with a human touch; but what does that really mean and where did it come from? Sakichi Toyoda invented a loom in 1896 that not only operated automatically but also stopped when any of the threads broke. This simple idea revolutionized the industry: instead of an operator having to sit beside each and every machine, waiting and searching for a problem, one operator could now watch several machines and take action only when a problem occurred, increasing both productivity and quality. It is said that the later sale of this technology and its patent to a UK textile company provided the cash the Toyoda family needed to start their new automobile business, Toyota. Autonomation therefore is not full-scale automation; it automates the tasks that operators would find boring, repetitive, or unsafe, but retains human beings to look after the process, often loading the machines and monitoring for abnormalities highlighted by the machines. Autonomation improves productivity: it is the strategy Toyota uses for its machines. Rather than investing in huge monolithic machines that can do everything but take forever to set up and require huge batches, they invest in small machines that do specific tasks humans would find difficult or repetitive, and use autonomation principles to ensure that the operator only has to interrupt the cycle if something goes wrong. This increases productivity and reduces costs considerably, as an operator can now monitor several machines on an exception basis and only has to act when something goes wrong. In addition to autonomation, they also developed the idea of mistake proofing, known as Poka-Yoke, which seeks either to prevent the possibility of creating a defect or to highlight when one has been created.
Autonomation is part of jidoka, a simple set of rules inspired by Toyoda's first loom: discover an abnormality; stop; fix the immediate problem; investigate and correct the root cause. Jidoka covers the whole process as well as individual machines, and requires that operators who spot an abnormality stop the process in just the same way that autonomation has the machine stop when something is incorrect. The important thing, however, is not just to stop: autonomation without the follow-through of the remaining jidoka principles just results in machines that keep stopping; we have to fix the problem and remove the root cause. This requires operators to be trained in simple problem-solving techniques and to be empowered to solve problems along with their team leaders and supervisors, ensuring that we continually improve our processes to remove quality problems, improving both product quality and productivity. As an example, consider a simple coil feeder that provides a continuous supply of steel sheet to an automated press stamping out components. Without any form of autonomation sensor, an operator would have to watch it to ensure that the tension was correct and that the steel had not run out. Simple sensors will alert the operator if any problem occurs and stop the press to prevent defects being produced, or even damage to the press. This frees the operator to conduct other work and improves both productivity and quality. The stamping press feeds components via a small slide to load the next machine in the process; if that next machine stops for some reason, a sensor on the slide will register the build-up of additional components and stop the stamping press, preventing overproduction of parts that would overflow the slide and potentially cause jams and expensive damage.
Some devices are also known as Poka-Yoke, or mistake-proofing, devices; these are simple ideas that prevent the creation of defects and are very much part of autonomation. Examples include sensors that register when all holding clamps on a fixture are fully closed, so you know all components are loaded correctly; shaped fixtures that accept only the correct orientation of components; and pins in fixtures that mate with holes in components, preventing you from fitting the wrong component. Other examples are simple devices that count the fasteners tightened and measure the torque applied; if the correct torque is not reached, or not enough fasteners are tightened, you cannot proceed to the next process, which highlights the defect. The use of autonomation can automate mundane tasks while keeping human oversight, reducing errors and the cost of shipping returns.
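The jidoka rules can be sketched as a simple loop; the sensor readings, limit, and helper names below are entirely hypothetical, purely to show the detect, stop, fix, investigate order:

```python
def jidoka_cycle(readings, limit):
    """Sketch of the jidoka rules on a stream of hypothetical sensor
    readings: discover an abnormality, stop, fix the immediate problem,
    then investigate and correct the root cause."""
    for i, value in enumerate(readings):
        if value > limit:                   # 1. discover an abnormality
            print(f"STOP at reading {i}")   # 2. stop the machine
            contain(value)                  # 3. fix the immediate problem
            investigate_root_cause(i)       # 4. correct the root cause
            return i
    return None  # no abnormality: the operator never had to intervene

def contain(value):
    """Hypothetical containment step for the defective output."""
    print(f"containing defect with value {value}")

def investigate_root_cause(index):
    """Hypothetical root-cause step, e.g. a 5-why analysis."""
    print(f"root-cause analysis starting from reading {index}")

jidoka_cycle([0.2, 0.3, 1.7, 0.2], limit=1.0)
```

The point of the sketch is the return on the abnormal reading: the cycle stops rather than continuing to produce, and resumes only after containment and root-cause work.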
  7. Hypothesis testing is a tool used in inferential statistics. A hypothesis test tells us whether there is a statistically significant difference between data sets, enough to consider that they represent different distributions. A null hypothesis and an alternative hypothesis are stated. The null hypothesis is a statement about the value of a population parameter, such as the mean, and must contain the condition of equality. The alternative hypothesis is a statement that must be true if the null hypothesis is false. A null hypothesis can only be rejected or fail to be rejected; it cannot be accepted merely because of a lack of evidence to reject it. The hypothesis test is used in the Analyze phase of the Six Sigma methodology; the practical problem was framed in earlier phases, and the test reviews the families of variation statistically to determine the significant contributors to the output. The steps involved are: state the null hypothesis (Ho) and alternative hypothesis (Ha); choose the level of significance (alpha); determine the rejection region for the statistic of interest; calculate the test statistic; decide whether the null hypothesis should be rejected; and state the conclusion in terms of the original problem.
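The steps listed above can be walked through in code for a one-sample z-test. This is a sketch that assumes sigma is known; the sample mean, hypothesized mean, and sample size are made up for illustration.

```python
import math

def one_sample_z_test(sample_mean, mu0, sigma, n, alpha=0.05):
    """Two-sided one-sample z-test sketch (sigma assumed known).
    Ho: mu = mu0 (contains the condition of equality); Ha: mu != mu0.
    Returns the test statistic, p-value, and whether Ho is rejected."""
    # calculate the test statistic
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    # two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    # decide whether Ho should be rejected at level alpha
    return z, p, p < alpha

z, p, reject = one_sample_z_test(sample_mean=10.5, mu0=10.0, sigma=1.0, n=25)
print(f"z={z:.2f} p={p:.4f} reject Ho: {reject}")
```

Stating the conclusion in terms of the original problem is the analyst's job: here a rejection would read "the process mean has shifted away from 10.0", not merely "p < alpha".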
  8. The central limit theorem tells us that as the sample size tends to infinity, the distribution of sample means approaches the normal distribution. It is about the shape of the distribution: since the normal distribution is bell-shaped, the distribution of sample means begins to look bell-shaped as the sample size increases. The law of large numbers tells us where the centre of the bell is located: as the sample size approaches infinity, the centre of the distribution of sample means becomes very close to the population mean. The CLT requires the data to be independent and identically distributed, and it holds under suitable assumptions of short tails or finite moments. Under serial dependence the CLT deteriorates, and it fails for certain long-memory processes.
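A quick stdlib simulation illustrates both claims, drawing hypothetical samples from Uniform(0, 1), whose population mean is 0.5: the sample means cluster around 0.5, and their spread shrinks as the sample size grows.

```python
import random
import statistics

random.seed(42)  # fixed seed so the sketch is reproducible

def sample_means(n, trials=2000):
    """Means of `trials` independent samples of size n from Uniform(0, 1)."""
    return [statistics.mean(random.random() for _ in range(n))
            for _ in range(trials)]

# Uniform(0, 1): mu = 0.5, sigma = sqrt(1/12) ~= 0.2887
for n in (2, 30):
    means = sample_means(n)
    print(n, round(statistics.mean(means), 3), round(statistics.stdev(means), 3))
```

The standard deviation of the sample means falls roughly as sigma divided by the square root of n, which is why the bell narrows around the population mean as n increases.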
  9. Fault tree analysis (FTA) is a type of tree diagram. Tree diagrams are useful in situations where we want to discover or define a hierarchical relationship between events, desirable or undesirable. A fault tree is constructed to relate an undesirable "top event", or failure, to the sequence of events that led to it. In other words, the fault tree depicts logical pathways from a set of basic causal events to a single undesirable result, the top event. We typically use logical operators, such as AND and OR gates, to connect lower-level events with higher-level events. Once the logic has been described, quantification can take place and the risk level can be assessed. The steps involved are: 1. identify the top event; 2. identify the next-level events; 3. develop the logical relationships between the top and next-level events; 4. identify and link lower-level events; 5. quantify the fault tree. A fault tree does not contain all possible failure modes or all possible fault events that could cause system failure. However, a fault tree is capable of considering human error, hardware and software failures, and acts of nature. It has widespread use in the fields of reliability, safety, and risk analysis. The fault tree is a more focused tool than the FMEA, and FTA works well for independent events.
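Step 5, quantifying the fault tree, can be sketched for independent events. The gate formulas below are the standard ones for independent probabilities; the example tree and its probabilities are hypothetical.

```python
def or_gate(*probs):
    """P(higher event) when any one lower-level event triggers it,
    assuming independence: 1 - product of (1 - p_i)."""
    p = 1.0
    for q in probs:
        p *= (1 - q)
    return 1 - p

def and_gate(*probs):
    """P(higher event) when all lower-level events must occur together,
    assuming independence: product of p_i."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical fault tree for the top event "no water delivered":
#   top = power loss OR (primary pump fails AND backup pump fails)
p_power = 0.01
p_primary = 0.05
p_backup = 0.05
p_top = or_gate(p_power, and_gate(p_primary, p_backup))
print(round(p_top, 6))
```

The AND gate shows why redundancy helps: two 5% pump failures combine to 0.25%, so the top-event probability is dominated by the single-point power loss.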