
Sreyash Sangam

Lean Six Sigma Green Belt

Everything posted by Sreyash Sangam

  1. Bullwhip effect (Brief Concept): The bullwhip effect is a phenomenon, particularly in supply chain management, in which forecast accuracy degrades the further a process or step is removed from the voice of the end customer. Seen graphically, the peak of demand variation is small at the initial (customer-facing) point and keeps getting larger as we move away from that initial position. Why it happens: it arises from the following key factors: gaps in communication, ineffective inventory management, and weak process management syst…
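The amplification described above can be sketched with a toy simulation (all numbers are hypothetical): each echelon overreacts to period-to-period changes in the orders it receives, so order variance grows as we move upstream from the customer.

```python
import random
import statistics

random.seed(42)

def amplify(orders, overreaction=0.5):
    """One echelon reacts to incoming orders: it orders what it sees
    plus an overreaction term on the period-to-period change."""
    placed = [orders[0]]
    for prev, cur in zip(orders, orders[1:]):
        placed.append(max(0.0, cur + overreaction * (cur - prev)))
    return placed

# Hypothetical end-customer demand: stable around 100 units with small noise.
demand = [100 + random.uniform(-5, 5) for _ in range(52)]

orders = demand
for name in ["retailer", "wholesaler", "distributor", "factory"]:
    orders = amplify(orders)
    print(f"{name:12s} order variance: {statistics.variance(orders):8.1f}")
```

Each printed variance is larger than the one before it, which is exactly the widening graph the post describes.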
  2. Quality Assurance and Quality Control are both important components of a robust Quality Management System. Quality Control (QC) is, in general, a reactive concept that deals with the intermediate quality parameters of products and processes once the activity has been performed. E.g., after the assembly line in an auto plant has passed a certain number of vehicles, the QC team, based on an agreed and standard set of parameters, inspects them to check whether there are any deviations. In fact, there is a concept in the industrial world of TEST - MAKE - INSPECT. So inspection…
  3. The median is a statistical measure of central tendency. In cases where the mean, which is the average of all the data, does not represent the true picture of the data, the median is the better indicator: it is the central value of the data set and makes a clear distinction between the first half and the second half of the data. EXAMPLE 01: The sales performance for 9 units of an apparel business showed above-average performance for the last 3 quarters. So if the team goes by the mean as the indicator for statistical performance monitoring, management w…
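A small numeric sketch of the point above (sales figures are hypothetical): one outlier unit drags the mean far above what most units actually achieve, while the median still reflects the typical unit.

```python
# Hypothetical quarterly sales for 9 apparel units; one unit is an outlier.
sales = [12, 13, 13, 14, 15, 15, 16, 17, 95]

mean = sum(sales) / len(sales)        # pulled up by the outlier
ordered = sorted(sales)
median = ordered[len(ordered) // 2]   # middle of 9 values = 5th value

print(f"mean = {mean:.1f}, median = {median}")  # mean ≈ 23.3, median = 15
```

Management steering by the mean (≈23) would overstate typical performance; the median (15) is the honest central value here.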
  4. 1. All machines, processes and instruments are connected wirelessly, with clear communication, in an integrated manner. 2. Works towards clear exchange of data and inter-dependency across processes, products and systems. 3. Cyber-physical systems create a kind of simulated environment of real-world situations. 4. It works in sync with DDA, i.e., Digital and Data Analysis. 5. More decentralised decision making and business-strategy making. 6. Completely data-based decision making rather than experiential decision making. 7. Based on clear visualisation of tasks…
  5. Net Promoter Score, or NPS as we call it for short, is an analytical tool to measure the customer satisfaction index, or the loyalty of customers to a particular manufacturer or service provider. It is an important metric to gauge the performance of producers against their stated brand or value proposition. The metric asks customers to rate on a 0-10 scale, and the resulting score runs from -100 to +100. The list of questions varies widely, from basic features of the product/service, to the customer's likelihood of using the same product/service again, to their probability of referring the brand/product/service to other po…
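The standard NPS calculation behind the metric can be sketched as follows (the survey responses are hypothetical): promoters rate 9-10, detractors 0-6, and the score is the percentage of promoters minus the percentage of detractors.

```python
def nps(scores):
    """Net Promoter Score from 0-10 ratings:
    % promoters (9-10) minus % detractors (0-6)."""
    n = len(scores)
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / n

# Hypothetical survey: 50 promoters, 30 passives, 20 detractors
responses = [10] * 50 + [8] * 30 + [5] * 20
print(nps(responses))  # 30.0
```

Passives (7-8) count in the denominator but neither add nor subtract, which is why the score can swing sharply when a passive becomes a promoter or a detractor.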
  6. Below are a few examples of scenarios wherein organisations have deployed AI and digital algorithms to improve business performance. Example 01: The financial transaction company PayPal collaborated with RapidMiner to gauge the intentions of top customers and monitor their complaints. RapidMiner's digital analytics team helped PayPal manage and analyse a huge amount of data. RapidMiner's AI and data science engineers used SENTIMENT ANALYSIS in different languages, based on various social media posts and messages. RapidMiner als…
  7. Drum-Buffer-Rope (DBR) is a supply chain technique that defines the flow of processes based on a pull system of production. The Drum is the constraint (bottleneck) of the system: it sets the pace of flow, or pace of manufacturing, which the rest of the processes then follow. This is in line with the Theory of Constraints, which states that business performance or supply chain effectiveness is controlled by one or a few critical factors that are the actual bottlenecks in the entire pipeline. By identifying that area of constraint, effort is made to improve it so that throughput an…
  8. Bessel's correction: When we intend to perform statistical analysis of data and draw valuable inferences from it, there are various statistics we compute, including the mean, standard deviation, variance, etc. While carrying out these analyses, we have either a sample drawn from a large population or the population itself. To answer the above question, let us consider the standard deviation as an example for better clarity. If the data in question is the whole population, then the value we obtain is normally accurate. Th…
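The correction itself is just the choice of divisor: n for a population, n − 1 for a sample (to offset the fact that deviations are measured from the sample mean, which underestimates spread). A minimal sketch with made-up data:

```python
def variance(data, sample=True):
    """Variance with (sample, n-1) or without (population, n) Bessel's correction."""
    n = len(data)
    mean = sum(data) / n
    ss = sum((x - mean) ** 2 for x in data)  # sum of squared deviations
    return ss / (n - 1) if sample else ss / n

data = [4, 8, 6, 5, 3, 7]
# mean = 5.5, sum of squared deviations = 17.5
print(variance(data, sample=False))  # population: 17.5 / 6 ≈ 2.917
print(variance(data, sample=True))   # sample:     17.5 / 5 = 3.5
```

The sample estimate is always a bit larger; the gap matters most for small n and vanishes as n grows.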
  9. Bayes' theorem is highly applicable in business scenarios wherever we want to find the probability of occurrence of an event when we have certain clues and guides regarding the processes impacting its outcome. Bayes' theorem is closely associated with prior and posterior probability, in which the evidence and data associated with the occurrence of an event are known in advance and are used to calculate the probability of the event. One example is associated with the manufacturing of textile machinery, wherein the…
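Since the original example is cut off, here is a generic manufacturing sketch of the prior-to-posterior update (all rates are hypothetical): given which machine a defective part is more likely to have come from.

```python
def bayes(prior_a, p_b_given_a, p_b_given_not_a):
    """Posterior P(A|B) = P(B|A) P(A) / [P(B|A) P(A) + P(B|~A) P(~A)]."""
    num = p_b_given_a * prior_a
    den = num + p_b_given_not_a * (1 - prior_a)
    return num / den

# Hypothetical: machine M makes 30% of output (prior) with a 5% defect rate;
# all other machines make 70% of output with a 1% defect rate.
# A randomly picked part is defective -- what is P(made on M)?
p = bayes(0.30, 0.05, 0.01)
print(f"P(made on M | defective) = {p:.3f}")  # 0.682
```

The defect evidence lifts the probability from the 30% prior to about 68% posterior, which is exactly the prior-to-posterior reasoning the post describes.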
  10. Heuristic methods are non-conventional methods of problem solving, through which we arrive at the best optimised solution to a problem for which a solution through a statistical approach is either not possible or not feasible. This is very useful in cases where general problem-solving techniques are unable to get to the root cause and validated countermeasures. Heuristic methods employ various tools, techniques, processes and procedures to come to the best agreed optimised solution. Some heuristic tools include trial and error, best among m…
  11. The box plot is one of the most effective and efficient ways of representing data; it is a standardised representation of the data in the form of quartiles. The box plot helps distinctly in the following important ways: 1. Helps in understanding the outliers in the group. 2. Gives a clear visual representation and standardisation of the data, hence facilitating the decision-making process. 3. Helps leaders understand the pattern and behaviour of the data. 4. Minimum, maximum and median can be understood clearly and unambiguously. 5. The spread of the data can b…
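The quantities a box plot draws can be computed directly. A sketch using the common median-split convention and Tukey's 1.5×IQR fences (quartile conventions vary slightly between tools; the data values are hypothetical):

```python
def quartiles(data):
    """Q1, median, Q3 by the median-split method (no interpolation)."""
    s = sorted(data)
    n = len(s)
    def median(xs):
        m = len(xs)
        mid = m // 2
        return xs[mid] if m % 2 else (xs[mid - 1] + xs[mid]) / 2
    lower = s[: n // 2]          # below the overall median
    upper = s[(n + 1) // 2 :]    # above the overall median
    return median(lower), median(s), median(upper)

data = [7, 15, 36, 39, 40, 41, 42, 43, 47, 49, 74]
q1, q2, q3 = quartiles(data)
iqr = q3 - q1
low_fence, high_fence = q1 - 1.5 * iqr, q3 + 1.5 * iqr
outliers = [x for x in data if x < low_fence or x > high_fence]
print(q1, q2, q3, outliers)  # 36 41 47 [7, 15, 74]
```

Everything outside the fences is flagged as an outlier, which is point 1 in the list above; the box (Q1-Q3) and whiskers give points 4 and 5.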
  12. Basically, data can be categorised into quantitative and qualitative. Qualitative data can be further classified into nominal, ordinal and binary, and each type has its own ways of drawing inference. Kendall's coefficient is designed for ordinal attribute data, where the categories have a natural order; the Kappa value treats all categories as unordered and every disagreement as equally bad. Kappa therefore becomes highly ineffective once it is used to draw inference from a set of ordinal attribute data. Ordinal data is data which has some scale or range built into it. For example: 1 being least, 2…
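For the ordinal case, the relevant statistic is Kendall's coefficient of concordance (W), which measures how consistently multiple appraisers rank the same items. A minimal sketch for the tie-free case (the rankings are hypothetical):

```python
def kendalls_w(rankings):
    """Kendall's W for m raters ranking n items (no ties).
    W = 1 means perfect agreement; W = 0 means no agreement."""
    m = len(rankings)        # raters
    n = len(rankings[0])     # items
    rank_sums = [sum(r[i] for r in rankings) for i in range(n)]
    mean_sum = sum(rank_sums) / n
    s = sum((rs - mean_sum) ** 2 for rs in rank_sums)
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Hypothetical: 3 appraisers rank 4 samples from best (1) to worst (4)
ranks = [[1, 2, 3, 4],
         [1, 3, 2, 4],
         [2, 1, 3, 4]]
print(round(kendalls_w(ranks), 3))  # 0.778
```

Because W works on rank sums, a near-miss (ranking an item 2nd instead of 1st) costs less than a gross miss, which is precisely the ordering information Kappa throws away.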
  13. Simpson's paradox is a situation in the world of statistics in which a particular pattern or trend is witnessed in the subsets of data from two or more groups, but as soon as we combine all the subgroups, the trend reverses dramatically. Understanding it is essential in order to establish a logical and rational causal relationship between cause and effect. Simpson's paradox can be understood well with a few examples: a. In a pharmaceutical company, the turnaround times of products A, B, C, ..., n will each show a particular type of trend; once these are combined in a common gro…
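A compact numeric illustration of the reversal (the yield figures are hypothetical, patterned on the classic textbook example): line A wins within each product type yet loses on the pooled data, because A handled far more of the difficult product.

```python
# (good units, total units) per line and product type -- hypothetical data
data = {
    "A": {"simple": (81, 87),  "complex": (192, 263)},
    "B": {"simple": (234, 270), "complex": (55, 80)},
}

def rate(good, total):
    return good / total

for product in ("simple", "complex"):
    a = rate(*data["A"][product])
    b = rate(*data["B"][product])
    print(f"{product:8s}: A={a:.1%}  B={b:.1%}  -> A better: {a > b}")

a_all = rate(*map(sum, zip(*data["A"].values())))
b_all = rate(*map(sum, zip(*data["B"].values())))
print(f"combined: A={a_all:.1%}  B={b_all:.1%}  -> A better: {a_all > b_all}")
```

The lurking variable (product mix) drives the reversal, which is why causal conclusions need the subgroup view, not just the pooled totals.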
  14. Process cycle efficiency (PCE) is the proportion of the total lead time that a process or activity spends on value-added work; it is therefore sometimes also referred to as the value-added ratio. All the activities within a process can be categorised into value-added and non-value-added activities. In Lean terms, the value-added activities are those the customer is willing to pay for. Non-value-added activities, hence, are considered to be "WASTE" for the process and the organisation. These wastes can be summarised as TIMWOODS, i.e., Transportation, Inventory, Motion, Waiting, Overproduction, Over-processing, Defects and Skills (underutilised)…
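The ratio itself is simple to compute once each step is classified. A sketch with a hypothetical order-to-ship process (times in hours):

```python
def process_cycle_efficiency(value_added_time, total_lead_time):
    """PCE = value-added time / total lead time, as a percentage."""
    return 100 * value_added_time / total_lead_time

# Hypothetical step times: only 6 of 48 hours actually add value.
steps = {
    "order entry":        (0.5,  "VA"),
    "queue before build": (20.0, "NVA"),
    "assembly":           (5.0,  "VA"),
    "inspection rework":  (4.0,  "NVA"),
    "warehouse wait":     (18.0, "NVA"),
    "packing":            (0.5,  "VA"),
}
va = sum(t for t, kind in steps.values() if kind == "VA")
total = sum(t for t, _ in steps.values())
print(f"PCE = {process_cycle_efficiency(va, total):.1f}%")  # 12.5%
```

Low single-digit or low-teens PCE values like this are common in untouched processes; the Lean work then targets the large NVA blocks (queues and waits) rather than speeding up the VA steps.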
  15. The guide to hypothesis testing can be improved in the following manner: 1) Clear examples to demonstrate the concepts for newbies. 2) Use case studies from different sectors for carrying out hypothesis tests. 3) Structure the framework in reverse so that the understanding is clearer.
  16. Net Present Value (NPV) refers to the difference between the present value of the cash inflows achieved and the quantum of cash invested in a project/activity. In most situations, organisational leaders are required to deliberate on decisions about project profitability based on cost-benefit analysis, and NPV helps in processing that decision. After the NPV calculation, if it comes out POSITIVE, the project or activity is said to be PROFITABLE; if the NPV comes out NEGATIVE, the activity/project is said to be NOT PROFITABLE.
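A minimal sketch of the calculation (discount rate and cash flows are hypothetical): each future cash flow is discounted back to today before the invested amount is netted off.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (usually negative) initial outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical: invest 1000 now, receive 400/year for 3 years, 10% discount rate
value = npv(0.10, [-1000, 400, 400, 400])
print(round(value, 2))  # -5.26 -> NOT PROFITABLE at 10%
```

Note that the undiscounted total (1200 in, 1000 out) looks profitable; it is the discounting that flips the verdict, which is why NPV beats a naive cash comparison for this decision.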
  17. DPMO and DPU are both used to measure process performance from the available data sets, and both apply to attribute (count) data, where each check is classified in binary terms, i.e., yes/no, good/bad, satisfied/not satisfied. DPU (Defects Per Unit) is simply the total number of defects divided by the number of units inspected; a single unit can carry more than one defect. DPMO (Defects Per Million Opportunities) is used when each unit has multiple opportunities for a defect: the defect count is normalised by units times opportunities per unit and scaled to one million, which makes processes of different complexity comparable. Examples include conducting surveys of employee satisfaction, or judging the quality attributes of an item in te…
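The two metrics differ only in the denominator. A sketch with hypothetical audit figures:

```python
def dpu(defects, units):
    """Defects per unit: total defects / units inspected."""
    return defects / units

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects * 1_000_000 / (units * opportunities_per_unit)

# Hypothetical: 500 invoices audited, 8 fields checked per invoice, 60 defects
print(dpu(60, 500))      # 0.12 defects per invoice
print(dpmo(60, 500, 8))  # 15000.0 DPMO
```

The same 60 defects give a DPU of 0.12 but only 15,000 DPMO, because DPMO credits the invoice for the 8 chances it had to go wrong; that normalisation is what lets a simple form and a complex assembly be compared on one scale.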
  18. Process FMEA and DMAIC are complementary to each other, as both serve the end goal of process improvement. Even though FMEA was initiated by industry well before the DMAIC methodology came into the picture, it is currently widely used in most DMAIC phases, particularly Measure, Analyze, Improve and Control. Primarily, Process FMEA is used to identify the critical X's by calculating a Risk Priority Number (RPN) against each X. This helps Six Sigma professionals identify and focus on those critical few factors which are contr…
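The RPN calculation mentioned above is just the product of the three worksheet scores. A sketch with hypothetical failure modes (Severity, Occurrence and Detection each rated on the usual 1-10 scales):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: S x O x D, each on a 1-10 scale."""
    return severity * occurrence * detection

# Hypothetical failure modes from a process FMEA worksheet: (S, O, D)
failure_modes = {
    "wrong torque applied": (8, 4, 3),
    "label misprint":       (3, 6, 2),
    "seal missing":         (9, 2, 7),
}
ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, sod in ranked:
    print(f"{name:22s} RPN = {rpn(*sod)}")
```

Sorting by RPN is how the "critical few" X's surface: here the rarely occurring but severe and hard-to-detect seal issue (RPN 126) outranks the more frequent torque issue (RPN 96).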
  19. As problem solvers, when we attempt to investigate and understand the root cause of a problem, there are several tried and tested methods experienced across industries. One of the best techniques is fishbone, or Ishikawa, analysis, where we brainstorm and map various probable causes from different dimensions, experiences, horizons, facts and perspectives. While identifying the root cause, we validate each and every probable cause through several validation techniques, the most successful of which is GENCHI GENBUTSU. This involves making a Gemba round for each of the…
  20. Statistical significance between X and Y is important from a data analysis point of view. However, whether the relationship between X and Y matters from a business standpoint is what we commonly refer to as practical significance. One of the best examples of this distinction is a water purification system, which is an important factor for many industries, including pharma. Chemical dosing is one X which impacts the Y, i.e., the hardness of the water. However, beyond a certain level, X cannot be altered because of other considerations. Hence, at a certain point, changing X does not warrant a change…
  21. Activity-based costing is an extremely helpful and essential costing technique for effectuating top-down decision making in strategic and operational cases. By offering an opportunity to allocate costs in direct proportion to the activities being performed, it helps develop powerful cost centres in an organisation in a layered format: facility-sustaining activities, product-sustaining activities, batch-level activities and unit-level activities. With the above advantages, the technique is highly beneficial in industries where a high number of transactions happen at diffe…
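The "costs in direct proportion to activities" idea can be sketched numerically (all cost pools, driver volumes and product usages below are hypothetical): each activity gets a cost-driver rate, and products absorb overhead according to how much of each driver they consume.

```python
# Hypothetical activity cost pools: activity -> (total cost, total driver volume)
activities = {
    "machine setups": (20_000, 100),     # cost per setup
    "quality checks": (9_000, 300),      # cost per inspection
    "machine hours":  (50_000, 2_500),   # cost per machine hour
}
# Hypothetical driver consumption per product
product_usage = {
    "standard": {"machine setups": 20, "quality checks": 100, "machine hours": 2_000},
    "custom":   {"machine setups": 80, "quality checks": 200, "machine hours": 500},
}

rates = {a: cost / volume for a, (cost, volume) in activities.items()}
for product, usage in product_usage.items():
    allocated = sum(rates[a] * qty for a, qty in usage.items())
    print(f"{product:8s} overhead = {allocated:,.0f}")
```

A single volume-based rate would smear the 79,000 of overhead roughly by machine hours; ABC instead shows the setup-heavy custom product absorbing a much larger share of the setup pool, which is the kind of layered visibility the post describes.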