
Prashanth Datta

Lean Six Sigma Green Belt
  • Content Count

    27
  • Joined

  • Last visited

  • Days Won

    4

Prashanth Datta last won the day on March 23 2019

Prashanth Datta had the most liked content!

Community Reputation

4 Average

1 Follower

About Prashanth Datta

  • Rank
    Active Member

Profile Information

  • Name
    PRASHANTH D
  • Company
    NA
  • Designation
    Project Program Management Senior Advisor

Recent Profile Visitors

666 profile views
  1. The Pareto Principle, or the 80/20 rule, essentially means that 80% of the output comes from 20% of the input. This is, no doubt, a very important tool in the Six Sigma Analyze phase to identify those critical root causes that can make significant changes to my output Y. Given that most of our Six Sigma projects are time-bound, we need to work with the approach of "not boiling the ocean". As per the Pareto Principle, we identify the top 20% of the inputs which, when addressed, can bring about the desired changes for my output Y, i.e. either shift the mean or reduce the variation. We then try to d
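The "vital few" selection described above can be sketched in a few lines of Python. The defect categories and counts below are purely illustrative, not from any real project; the idea is simply to rank causes by contribution and stop once roughly 80% of the output is explained:

```python
# Hypothetical defect counts by cause (illustrative data only)
defects = {
    "solder_bridge": 120,
    "missing_part": 45,
    "misalignment": 20,
    "scratch": 10,
    "label_error": 5,
}

total = sum(defects.values())
# Rank causes by contribution, largest first
ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)

cumulative = 0
vital_few = []
for cause, count in ranked:
    cumulative += count
    vital_few.append(cause)
    if cumulative / total >= 0.8:  # stop once ~80% of defects are explained
        break

print(vital_few)  # the "vital few" causes explaining ~80% of defects
```

Here two of the five causes account for over 80% of all defects, which is exactly the kind of prioritization a Pareto chart makes visible.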
  2. Six Sigma summarizes its problem statement as a mathematical equation of the form Y = f(X1, X2, ..., Xn). The Y here represents the dependent variable, output, or effect, and the X's are the independent variables, inputs, or causes. To improve the outcome Y, which could mean either shifting the mean or reducing the variation in the process, the systematic approach is to control your X's, which are your causes. Now the question is: will all X's make an impact on Y? Let's look at an example. I am a 2-wheeler manufacturing company and see my sales are gradually declining quarter on
  3. Simply stated, the One-Sample t-test compares the mean of our sample data to a known value. For example, if we want to measure the Intelligence Quotient [IQ] for a group of selected people in India, we compute their IQ using a set of predefined tests (mapped to global standards). With the results, we get the average IQ for the selected group as well as their individual IQ scores. This average IQ score of the group can then be compared to a known value of 82, which is the average IQ of Indians [already computed by accredited testing organizations]. Further, an average score of <
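A minimal sketch of the one-sample t-test, using only the standard library. The IQ scores below are made up for illustration, and the critical value 2.262 is the standard two-tailed t value for 9 degrees of freedom at alpha = 0.05:

```python
import math
from statistics import mean, stdev

# Hypothetical IQ scores for a sample group (illustrative only)
iq_scores = [85, 90, 78, 88, 95, 80, 84, 91, 76, 89]
mu0 = 82  # known reference mean to compare against

n = len(iq_scores)
x_bar = mean(iq_scores)
s = stdev(iq_scores)  # sample standard deviation (n-1 in the denominator)

# t statistic: (sample mean - known mean) / standard error
t_stat = (x_bar - mu0) / (s / math.sqrt(n))

# Two-tailed critical value for df = 9, alpha = 0.05 is ~2.262
reject_null = abs(t_stat) > 2.262
print(round(t_stat, 3), reject_null)
```

With this sample the t statistic is about 1.86, below the critical value, so at the 5% level we would not conclude that the group mean differs from 82. In practice the same test is one call to `scipy.stats.ttest_1samp`.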
  4. Before analyzing the COPQ calculator, let us quickly understand what COPQ is all about. As an organization, when we commit to delivering a product or service to our customers, it is expected that the quality of this product or service meets and/or exceeds customer expectations. Hence, right from the Design or Planning phase itself, it is important to place a lot of emphasis on the identified quality parameters associated with the product or service. When it comes to the financial planning part of your project, it is extremely important to assess the budget that needs to be allocated
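At its core, a COPQ calculator just totals the failure costs and expresses them against revenue. A minimal sketch, with all figures hypothetical, taking COPQ in its common sense of internal plus external failure costs:

```python
# Minimal COPQ sketch (all figures hypothetical): COPQ is commonly taken
# as the sum of internal and external failure costs
failure_costs = {
    "scrap": 30_000,
    "rework": 20_000,            # internal failures, caught before shipping
    "warranty_claims": 60_000,
    "returns": 20_000,           # external failures, reaching the customer
}

total_copq = sum(failure_costs.values())
revenue = 2_000_000
copq_pct = 100 * total_copq / revenue  # COPQ as a percentage of revenue
print(total_copq, round(copq_pct, 2))
```

A full cost-of-quality model would also track appraisal and prevention spend alongside these failure buckets.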
  5. As we have already seen and understood, Root Cause Analysis focuses on identifying all those independent variables X (further narrowed to critical X's), deemed as inputs, which have an impact on the dependent output variable Y. In other words, it is the identification of all causes that influence the effect. In a Sensitivity Analysis, also referred to as "What-if" or "Simulation" Analysis, we determine how the dependent output variable (Y, or effect) varies when each of the independent variables (X, or causes) is varied under a predefined set of assumptions. Simply stated, how different
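The one-at-a-time variation described above can be sketched directly. The model `f` below is an arbitrary stand-in for any process equation Y = f(X1, X2, X3), and the +10% perturbation is an assumed "what-if" scenario:

```python
# One-at-a-time sensitivity sketch for a hypothetical model Y = f(x1, x2, x3)
def f(x1, x2, x3):
    # Illustrative model only; stands in for any process equation
    return 3 * x1 + 0.5 * x2 ** 2 - 2 * x3

baseline = {"x1": 10, "x2": 4, "x3": 5}
y0 = f(**baseline)  # baseline output

sensitivity = {}
for name in baseline:
    perturbed = dict(baseline)
    perturbed[name] *= 1.10          # vary one input by +10%, hold others fixed
    sensitivity[name] = f(**perturbed) - y0  # change in Y attributable to this X

print(y0, sensitivity)
```

Ranking the X's by the magnitude of their effect on Y is then a direct way to separate the influential causes from the negligible ones.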
  6. Attribute Agreement Analysis typically involves binary decision making around one or more attributes of the item being inspected, i.e. accept or reject, good or bad, pass or fail, etc. So, in this context, we are referring to a discrete data type. The key call-out here is the human factor (the appraisers) who typically make these assessments. With this condition in place, it is all-important that the appraisers are consistent with themselves (repeatability), with other appraisers (reproducibility), and with known standards (team accuracy). Poor repeatability implies th
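The repeatability part of the analysis reduces to a simple within-appraiser agreement percentage. A minimal sketch with invented ratings, where each appraiser rates the same five items twice:

```python
# Minimal repeatability check (hypothetical data): each appraiser rates the
# same 5 items twice; repeatability = % of items where the two trials agree
ratings = {
    "appraiser_A": (["pass", "fail", "pass", "pass", "fail"],
                    ["pass", "fail", "pass", "fail", "fail"]),
    "appraiser_B": (["pass", "fail", "pass", "pass", "fail"],
                    ["pass", "fail", "pass", "pass", "fail"]),
}

repeatability = {}
for name, (trial1, trial2) in ratings.items():
    agreements = sum(a == b for a, b in zip(trial1, trial2))
    repeatability[name] = 100 * agreements / len(trial1)

print(repeatability)
```

Reproducibility and team accuracy extend the same counting idea across appraisers and against the known standard, which is what tools like Minitab's Attribute Agreement Analysis automate.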
  7. My two cents on this topic... Artificial intelligence gains its significance in areas where repetitive tasks, or the scope for self-help, are abundant. While it helps business from a cost-optimization standpoint, for customers it means quicker time to resolution. The need to check the intelligence of machines is very high where we have sensitive data. For example, a banking system's AI has to be smart enough to interpret the complete query and present answers, as the output in such cases could be personal finance data. If the machine picks up only keywords and presents output, the risk of pr
  8. As suggested, Lean focuses on continuous improvement in processes by eliminating the non-value-adds. The Lean tools that enable this activity have proven their capability in such a way that some of them have gone beyond industry restrictions and hence play a critical role even in the Six Sigma approach, especially during the Analyze and Improve phases. I will list 5 such tools which I use and believe in for their effectiveness, irrespective of the process improvement project's domain. 1. Value Stream Mapping 2. Kaizen 3. Cause-and-Effect Diagram (Fishbone Analysis) 4. 5 Whys 5. Poka-Yoke
  9. What keeps your business engine running amidst tough competition in today's environment? I strongly believe in 4 things that act as differentiators. a. Customer-Centric Approach - Keep the customer in focus in whatever you do. Historical studies have consistently shown it costs more to acquire a new customer, and effective management of existing customers assures you of continuous business, which in turn lets you invest in better ways to gain new customers. b. Productivity and Efficiency - The key mantra is to do more with less within committed timelines, with a focus on better OpEx manage
  10. In order to answer the above question, let me break it down into two parts... 1. What is Hypothesis testing? 2. What is the most desired "output" from each phase of a DMAIC project? While we understand that there are a few tools and techniques that can be applied across phases, we need to exercise caution not to force-fit tools, as that can result in experimentation and possibly delay the project timelines. Now let's look at what Hypothesis testing is all about. Hypothesis testing is a part of Inferential Statistics, which is used to predict the behavior
  11. Six Sigma gains its edge over other Quality Management Systems as it uses a data-driven approach to problem solving. Statistics forms an integral part of the Six Sigma methodology, as many of its tools rely on statistics for logical conclusions. We essentially have two branches of Statistics - Descriptive and Inferential. Descriptive Statistics helps with collecting, analyzing, and presenting information as mean, standard deviation, variation, percentage, proportion, etc. While Descriptive Statistics helps with the description of data, it will not manifest itself with any infere
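The descriptive side of the split is exactly what Python's standard `statistics` module provides. A tiny sketch on a hypothetical sample of process cycle times:

```python
from statistics import mean, stdev

# Descriptive statistics on a hypothetical sample of cycle times (minutes)
cycle_times = [12, 15, 11, 14, 13, 16, 12, 14]

# Summarize the sample: central tendency and spread
print(mean(cycle_times), round(stdev(cycle_times), 2))
```

Inferential statistics then goes one step further, using such sample summaries to draw conclusions about the whole population, e.g. via confidence intervals or hypothesis tests.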
  12. Dear Vishnu, While I am not a health care expert, out of my academic interest and based on my understanding of Emergency Services, I am putting forth a few pointers for your perusal. You may treat these as "Voice of Customer", as some of the inputs presented are based on my observations and experiences of a desired better "To-Be" state. While we see huge expenditure associated with Emergency Services, be it the Capital Expenditure to set up the services or the Operating Expenditure to run them, which in itself provides opportunities for cost-saving projects, we shall treat that a
  13. The Control Phase is that critical stage of Business Process Improvement where the selected solution(s) implemented to achieve the desired output [Y] are monitored for their effectiveness. In other words, the performance of the process, which needs to be maintained at the desired level as per the Voice of Business or Voice of Customer, has to be sustained through the actions implemented. An effective Control Plan: a. Acts as a risk assessment mechanism for the newly improved process. b. Ensures the process stays within control and helps identify any out-of-control variations due to any spec
  14. Before delving into the tools used for Design Risk Analysis, let us try to break down this question further to understand what "Design Risk Analysis" means, what "Risk" is, and the common tools used for Design Risk Analysis. What is "Design Risk Analysis"? As we are aware, we have two methodologies in Six Sigma: 1. DMAIC - Define, Measure, Analyze, Improve and Control - typically used for improving existing processes or products. 2. DMADV - Define, Measure, Analyze, Design and Validate/Verify - typically used for developing or
  15. With increasing demands from customers for high-quality and reliable products or services, vendors (or service providers) face additional challenges in accomplishing this through a more scientific approach and reliable modeling, especially in the early phase of design or planning, to ensure the outcome maps to customer requirements by the time the final deliverables are ready. Failure Modes and Effects Analysis (FMEA) is a tool for evaluating possible reliability issues at the early stages of the process cycle, where it is simpler to take actions to overcome these issues, th
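The numeric heart of an FMEA is the Risk Priority Number, RPN = Severity x Occurrence x Detection, each rated on a 1-10 scale. A minimal sketch with hypothetical failure modes, ranked so the riskiest gets attention first:

```python
# Minimal FMEA sketch (hypothetical failure modes and ratings):
# RPN = Severity x Occurrence x Detection, each rated 1-10
failure_modes = [
    {"mode": "seal_leak",      "severity": 8, "occurrence": 3, "detection": 4},
    {"mode": "wrong_label",    "severity": 5, "occurrence": 6, "detection": 2},
    {"mode": "motor_overheat", "severity": 9, "occurrence": 2, "detection": 7},
]

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Highest RPN first: these failure modes get mitigation priority
ranked = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)
print([(fm["mode"], fm["rpn"]) for fm in ranked])
```

Note how a rare failure mode can still top the list when it is severe and hard to detect, which is exactly why FMEA looks at all three factors rather than frequency alone.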