SandhyaKamath

Excellence Ambassador
Profile Information

  • Name
    Sandhya Kamath
  • Company
    Nelito Systems Ltd
  • Designation
    AGM

  1. The five metrics for the performance of after-sales on-site service for electronic goods, ranked in order of importance:
     1. NPS (Net Promoter Score) – loyalty and referral check.
     2. CES (Customer Effort Score) – assessment of the effort customers expend in getting work done or issues resolved.
     3. C-SAT (Customer Satisfaction Index) – satisfaction attained from the use of the product or service.
     4. Churn (Customer Churn Rate) – assessment of customer loss.
     5. CAC (Customer Acquisition Cost) – all the costs spent on acquiring customers (marketing expenses) divided by the number of customers acquired in the period the money was spent.
     The first three offer great insight into customer experience and loyalty; when all three are used together, they yield the most actionable feedback. Churn enables organizations to identify customer-retention strategies for improving the customer base, whereas CAC is the cost the business must allocate in order to acquire a customer.
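As a minimal sketch with hypothetical figures, the churn rate and CAC definitions above can be computed directly:

```python
# Hypothetical figures for illustration only.
customers_at_start = 1200
customers_lost = 60            # customers who left during the period
marketing_spend = 50_000.0     # total acquisition spend in the period
customers_acquired = 200

# Churn rate: fraction of the starting customer base lost in the period.
churn_rate = customers_lost / customers_at_start

# CAC: total acquisition spend divided by customers acquired in that period.
cac = marketing_spend / customers_acquired

print(f"Churn rate: {churn_rate:.1%}")   # 5.0%
print(f"CAC: {cac:.2f} per customer")    # 250.00
```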
  2. Let me begin by explaining the terms ARMI and RACI.
ARMI is a project management tool used to analyze the level of support of project stakeholders. The acronym stands for the four levels of support that can be assigned to the various stakeholders:
- Approver – person whose sign-off is required to move ahead in the project
- Resource – expert whose skills are required for the project
- Member – individuals making up the project team
- Interested party – people who need to be kept up to date on the project
The RACI model, on the other hand, is a tool for identifying roles and responsibilities and categorizing the level of participation. The acronym RACI stands for:
- Responsible: the person responsible for getting the work done or the decision made, for example a business analyst, application developer or technical architect.
- Accountable: the person accountable for the correct and thorough completion of the task, for example the project executive or project sponsor.
- Consulted: people who provide information for the project and with whom there is two-way communication, for example subject matter experts.
- Informed: people kept informed of progress, with one-way communication. They are affected by the outcome of the tasks, so they need to be kept up to date.
I would go with the RACI model, as it provides clear-cut delegation of tasks to the people involved in the project. Tasks are listed out clearly, and the people involved are listed along with their roles and responsibilities at the beginning of the project itself; every process is clearly defined. Tasks are properly divided among members, balancing the workload appropriately. RACI lets every member know exactly what is expected of him or her, and helps eliminate misunderstanding, conflict and duplication of effort.
It is a good communication tool and supports sound decision making. It also builds team bonding: the cross-functional view lets people across divisions or departments appreciate the support given by every team member. The outcome of all this is timely completion of the project, with quality, and within budget.
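A RACI assignment can be represented as a simple matrix. The sketch below uses hypothetical tasks and role names, and checks the usual rule that each task has exactly one Accountable party:

```python
# A minimal RACI matrix sketch (hypothetical tasks and roles).
# R = Responsible, A = Accountable, C = Consulted, I = Informed
raci = {
    "Gather requirements": {"Business Analyst": "R", "Sponsor": "A",
                            "SME": "C", "Dev Team": "I"},
    "Build application":   {"Developer": "R", "Project Executive": "A",
                            "Architect": "C", "Sponsor": "I"},
}

# Sanity check: every task must have exactly one Accountable role.
for task, roles in raci.items():
    accountable = [r for r, code in roles.items() if code == "A"]
    assert len(accountable) == 1, f"{task} must have exactly one 'A'"
    print(f"{task}: accountable = {accountable[0]}")
```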
  3. Lead time is the total time from receiving an order to delivering the item; it measures the production process from the customer's perspective. Cycle time is the time taken to complete one unit from start to finish, i.e. from when actual work begins on the unit to when it is ready for delivery. It is the average time taken to finish one unit. In other words, cycle time measures the completion rate and lead time measures the arrival rate. If lead time is much higher than cycle time, there are a lot of units sitting in inventory. For example, when manufacturing a small gadget, cycle time is the time it takes to manufacture the gadget: it tells how efficiently processes and resources can produce a single unit, and improving internal processes reduces it. Lead time, on the other hand, is how long the customer has to wait to get the gadget; it includes cycle time plus the delay time between processes. In short, cycle time is a metric focused on improving production time, while lead time reflects the overall capacity to produce.
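The relationship between the two can be sketched with Little's Law (average lead time = work in process / throughput), using hypothetical figures:

```python
# Little's Law sketch: average lead time = WIP / throughput.
# All figures are hypothetical, for illustration only.
wip = 30                      # units in the system (queued + in process)
throughput = 10               # units completed per day

cycle_time = 1 / throughput   # avg time spent producing one unit (days)
lead_time = wip / throughput  # avg days a customer waits for a unit

print(f"Cycle time: {cycle_time:.2f} days per unit")
print(f"Lead time:  {lead_time:.1f} days")
# With 30 units in the system, lead time (3 days) far exceeds cycle
# time (0.1 days): most of the wait is queueing, not actual work.
```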
  4. Zero Defects, an idea conceived by Philip Crosby in the early 1960s, was a very successful approach until the 1970s, when a whole new range of programs such as Total Quality Management, SPC, the Malcolm Baldrige framework and Six Sigma caused Zero Defects to fade out. Companies preferred the user-friendly Six Sigma to the slightly impractical Zero Defects. But let me explain what Zero Defects is and how it works, and will continue to work, alongside statistical approaches.
Zero Defects focuses on defect prevention and the removal of wastage in a project. It requires no specialized training and can be managed by existing employees. It sets a very high performance standard: no defect is accepted at all. It is a quest for perfection in order to improve quality, using the concept of "doing it right the first time" to avoid costly and time-consuming rectifications. This does not mean that errors won't be made during development, but they can be corrected before the product is released to the customer. Zero Defects focuses on quality output delivered in a timely fashion, with proper utilization of resources, within budget. It also ensures higher customer satisfaction and increased sales and profits. Zero Defects is a standard: a measure against which any outcome or action can be analysed.
Options A, E and F fit best in my opinion. Unless management commits to zero defects and integrates it into the corporate culture, it will simply not be possible to achieve. Zero Defects requires a top-down approach and should be taken as the way things are done throughout the whole organization. Unless top management insists on quality improvement teams, zero defects cannot be achieved. Quality needs to be built into performance expectations.
Top management should ensure that teams are motivated on a continual basis, given the proper tools to achieve zero defects, and rewarded for zero-defects achievements. Top management's involvement is also required for continuous improvement in the production of goods and services, so that inefficient processes or applications are cut and the business expanded. It should ensure that customer requirements are met, a high-quality product is delivered, and enhanced customer satisfaction is achieved. When zero defects is the goal set by top management, every aspect of the business, be it marketing, finance, design, engineering, production or customer service, from the top level to the bottom, is subject to scrutiny and is checked against the organization's quality objectives.
  5. When utilizing a team approach to problem solving, there are often many opinions as to the problem's root cause. One way to capture these different ideas and stimulate the team's brainstorming on root causes is the cause-and-effect diagram, commonly called a fishbone. A fishbone diagram, also called an Ishikawa diagram, is a visualization tool for categorizing the potential causes of a problem in order to identify its root causes. Drawing one requires asking "Why?" as many times as necessary to reach the roots of a problem, and it is used to analyze problems systematically. In theory this makes great sense, but in practice many of the obvious causes have already been addressed, so the 5 Whys technique often does not quite hit the mark. It helps in understanding problems, not in setting goals or reasoning about them. In psychotherapy, "why" is seen as a dangerous question: from a purely human perspective it can be confrontational and come across as judgmental. One should ask very specific questions based on context and encourage introspection in a safe, well-judged manner, or the exercise may backfire.
After the group has brainstormed all the possible causes for a problem, the facilitator helps the group rate the potential causes by importance and diagram a hierarchy, but this too may be misinterpreted. Here is where two conditions come into play:
- Necessary condition: a condition is necessary if its absence guarantees the absence of the effect.
- Sufficient condition: a condition is sufficient if its presence guarantees the presence of the effect.
The real question is whether a cause's being necessary and/or sufficient is really important, and what its exact role in problem solving is. The effective use of the fishbone or why-why analysis thus depends on the kind of causes we identify, or settle for. For complex problems this tool needs to be combined with other techniques such as DMAIC.
Just as starting right is important, it is also important to know when to end the cause analysis. Most of it depends on asking the right questions. Necessary and sufficient conditions form an integral part of this analysis.
  6. Segmentation means dividing the marketplace into parts, or segments, that are definable, accessible, actionable and profitable, with growth potential. Marketers have used it to get a close-up view of the market, and it takes on great significance in today's cluttered marketplace of thousands of products, media proliferation and ad fatigue. Rightly segmenting the marketplace can make the difference between success and shutdown for a company. For problem solving, the purpose of segmentation is to cut down the number of process steps within a system involved in creating the problem. Segmentation allows a company to closely tailor products to the needs, desires, uses and paying ability of customers with specific characteristics. Collecting data is time-consuming and tedious; by applying the principles of segmentation, deep insight about the problem at hand and potential solutions can be explored.
Root cause analysis (RCA) is a method of problem solving used for identifying the root causes of faults or problems. A factor is considered a root cause if its removal prevents the undesirable outcome from recurring. RCA should be performed as soon as possible after the error or variance occurs; otherwise, important details may be missed. Nine segmentation variables in three categories help perform quick and effective root cause analysis:
1. Industry: to uncover trends about sales cycles, size and substitutes in similar businesses, such as telecommunications or financial services.
2. Company: to uncover patterns of purchasing, processes and people.
3. Individual: to uncover personal disposition, demeanour and demographics.
Evaluating different segments provides quick insight to executives and senior management. A good way to start is to look at the top and bottom 20, or the top and bottom 10%. This yields a finite, workable set of information in which emerging patterns or gaps are easy to identify.
Since the goal is to find issues quickly and effectively, the best approach is to evaluate the company-related variables first, and then carry the analysis up to the industry level or down to the individual level for detailed insight. These techniques should be used proactively to address existing gaps and launch a product that meets business goals. Segmentation by external factors helps find the drivers of variation. For example, in loan disbursement, categorization by size (small, medium, large) is segmentation. If segments have the same mean but different variances, RCA can be done directly. If segments have different means and the same variance, segment the data, find the causes of variation specific to each segment, compare the processes and try to equate the means. But if both the means and the variances differ, the segments have to be defined and the segment driving the variation identified.
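The mean/variance comparison above can be sketched with the standard library; the loan-size segments and disbursement times below are hypothetical:

```python
# Sketch: compare mean and variance across hypothetical loan-size
# segments to decide how to segment before root cause analysis.
from statistics import mean, pvariance

segments = {
    "small":  [10, 12, 11, 9, 13],   # hypothetical disbursement times (days)
    "medium": [10, 15, 8, 14, 9],
    "large":  [20, 22, 21, 19, 23],
}

for name, times in segments.items():
    print(f"{name:6s} mean={mean(times):5.1f} variance={pvariance(times):5.2f}")
# A segment whose mean or variance differs markedly (e.g. "large")
# should be analysed separately to find segment-specific causes.
```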
  7. Cost of poor quality (COPQ) is the cost associated with providing poor-quality products or services. It does not mean the use of expensive or very high-quality materials to manufacture a product; it refers to the cost incurred to prevent, detect and remove defects from products. Cost of quality is a methodology and an important communication tool that enables an organization to determine the extent to which its resources are used for activities that prevent poor quality, to determine the savings to be gained by implementing process improvements, and to raise awareness of the importance of quality. These costs must be carefully managed so that the long-term effect of quality on the organization is a desirable one. They are divided into four categories:
- Prevention costs: incurred to prevent or avoid quality problems. They are planned and incurred before actual operation and could include training, quality engineering (planning and assurance) and statistical process control.
- Appraisal costs: associated with measuring and monitoring quality-related activities and conformance to specifications; they could include verification, quality audits and supplier ratings.
- Internal failure costs: incurred to remedy defects discovered before the product or service is delivered to the customer; they could include waste, scrap, rework and rectification.
- External failure costs: incurred to rectify defects discovered by customers; they could include repairs and servicing, warranty claims, complaints and returns.
To discuss the equilibrium of these costs, take the example of a retailer who sells merchandise. He invests in store "atmospherics" like lighting, merchandising, pleasant music and attractive salespeople, and even disperses fragrance in the air, to put consumers in a good mood during the shopping process. He also uses eye-catching ads to get consumers positively disposed towards his products. None of this changes the utility the consumer will get from the product: pleasant music or merchandising in a retail store should not directly affect the quality of a dress, for example. These activities involve costs, so the question arises why retailers incur them rather than offering a lower price to close a sale. It is because the customer is unable to separate the affect from true product quality; for him the two are intertwined and together influence his decision to choose a particular product.
The primary focus of any organization is customer satisfaction. Customers are known to pay a higher price for products they value; however, there is no substitute for quality. It is wise, therefore, to create a few good systems and processes, keep higher management and the project team informed, and educate the team on the importance of quality. Quality is an important aspect of the project, which makes it essential for the project manager to stay alert throughout its lifecycle; it is the team's responsibility to maintain it. Poor quality should be avoided by planning quality policies effectively; otherwise it may result in loss of the project as a whole, and consequently loss of business and reputation in the market. Going green also reduces operating costs and energy usage: replace regular light bulbs with compact fluorescent lighting, reduce heating and cooling costs by improving insulation and windows, and cut back on physical waste. Encourage employees to communicate via email or other electronic means and ask vendors to do the same; this can drastically decrease the monthly office supply bill.
Once the systemic issues with the cost of quality and the significant opportunity for improvement have been identified, the best approach is to apply Six Sigma's DMAIC methodology to a specific process issue that is significantly impacting the cost of quality.
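The four cost categories above can be rolled up into a simple cost-of-quality summary; the monthly figures and line items below are hypothetical:

```python
# Sketch: rolling up cost of quality from its four categories
# (hypothetical monthly figures, in currency units).
prevention = {"training": 5_000, "quality planning": 3_000, "SPC": 2_000}
appraisal = {"verification": 4_000, "quality audits": 2_500,
             "supplier ratings": 1_500}
internal_failure = {"scrap": 8_000, "rework": 6_000}
external_failure = {"warranty claims": 10_000, "returns": 4_000}

# Prevention + appraisal are the cost of conformance ("good quality");
# internal + external failure make up the cost of poor quality (COPQ).
cost_of_good_quality = sum(prevention.values()) + sum(appraisal.values())
cost_of_poor_quality = (sum(internal_failure.values())
                        + sum(external_failure.values()))
total_cost_of_quality = cost_of_good_quality + cost_of_poor_quality

print(f"Cost of good quality:        {cost_of_good_quality}")
print(f"Cost of poor quality (COPQ): {cost_of_poor_quality}")
print(f"Total cost of quality:       {total_cost_of_quality}")
```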
  8. Process control is not possible without some sort of reliable system for identifying and understanding variation in business processes. A control chart helps identify these variations and shows how to use them for future improvement. It is one of the seven basic tools of quality control. Control charts are graphs used to study how a process changes over time, with data plotted in time order. A control chart always has a central line for the average, an upper line for the upper control limit and a lower line for the lower control limit; these lines are determined from historical data. Attribute control charts, such as the u- and c-charts discussed below, plot count-type quality data. When a point falls outside the limits established for a given control chart, those responsible for the process are expected to determine whether a special cause has occurred, and whether the results with the special cause are better or worse than the results from common causes alone. If worse, that cause should be eliminated if possible; if better, the special cause should be retained within the system.
A statistically based control chart is a device intended to be used:
- at the point of operation,
- by the operator of that process,
- to assess the current situation,
- by taking a sample and plotting the sample result,
so as to enable the operator to make a decision about the process. A control chart is like a traffic signal whose operation is based on evidence from samples taken at random intervals:
- Green: the process may continue without adjustment, as only common causes are present.
- Yellow: wait and watch; trouble is possible. Be careful and seek more information.
- Red: the process has wandered. Investigate, adjust and take corrective action, or defective items will be produced. There is practically no doubt a special cause has crept into the system.
The u- and c-charts use the Poisson distribution to model the results. A u-chart is an attribute control chart built from data collected in subgroups of varying sizes; it shows how the process, measured by the number of nonconformities per item or group of items, changes over time. Nonconformities are defects or occurrences found in the sampled subgroup. A u-chart is particularly useful when the item is too complex to be judged simply conforming or nonconforming; for example, a car could have hundreds of possible defects yet still not be considered defective. u-charts are used to assess the system's stability, analyse the results of process improvements, and support standardization. c-charts are used to examine variation in count-type attribute data with a constant subgroup size; the opportunities for a defect to occur must be large, while the number of defects that actually occur must be small.
A control chart views the process in real time, at intervals as the process progresses, and helps keep the cost of production to a minimum. By enabling corrective actions to be taken at the earliest possible moment and avoiding unnecessary corrections, the charts help ensure the manufacture of uniform products, or the provision of consistent services, that comply with the specification.
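For a c-chart, the centre line is the average defect count and the control limits sit three Poisson standard deviations away (UCL = c̄ + 3√c̄, LCL floored at zero). A minimal sketch with hypothetical defect counts:

```python
# Sketch: c-chart centre line and control limits from defect counts
# per constant-size subgroup (hypothetical data, Poisson model).
import math

defect_counts = [4, 2, 6, 3, 5, 4, 3, 5, 2, 6]  # defects per subgroup

c_bar = sum(defect_counts) / len(defect_counts)  # centre line
ucl = c_bar + 3 * math.sqrt(c_bar)               # upper control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))     # lower limit, floored at 0

print(f"CL = {c_bar:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")

# Flag any subgroup signalling a possible special cause.
out_of_control = [c for c in defect_counts if c > ucl or c < lcl]
print("Out-of-control points:", out_of_control)
```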
  9. The objective of a Six Sigma process improvement project is to reduce defects. A defect is a failure to meet customer standards for quality. Measuring process sigma for a business process provides a means of assessing how well the business meets its customer requirements. Process sigma (also referred to as sigma level) is a measure of process capability: the higher the process sigma, the more capable the process. Sigma is a statistical term measuring how much a process varies from perfection, based on the number of defects per million units; it is a measure of process variability or spread, and sigma is the symbol for standard deviation. A Six Sigma process has a short-term process sigma of 6 and a long-term process sigma of 4.5. Determining the sigma levels of processes (one sigma, six sigma, etc.) allows process performance to be compared throughout an entire organization, because the measure is independent of the process. The theoretical defect rates for sigma levels 1 to 6 are:
- One Sigma = 690,000 DPM
- Two Sigma = 308,000 DPM
- Three Sigma = 66,800 DPM
- Four Sigma = 6,210 DPM
- Five Sigma = 230 DPM
- Six Sigma = 3.4 DPM
To put it very simply, the process sigma indicates how many sigmas fit inside the gap between the process average and the nearest specification limit; any value beyond the specification limit is a defect or unacceptable result. Where both lower and upper specification limits exist and the process is not capable on either side of the distribution, the process sigma can be calculated by adding the theoretical DPM levels on each side of the distribution (using the sigma conversion chart) and then finding the process sigma corresponding to the combined DPM level. The aim is to reduce variation and the associated defects, waste and risk in any process. Sigma level is an attribute capability measure.
The maturity of a process is described by a sigma rating indicating the percentage of defect-free products it creates. If the defect data are collected over a short period, the z-value is the sigma level; if collected over a long period, add 1.5 sigma to the z-value before reporting the sigma level. Thus long-term capability is the short-term z-value plus 1.5 sigma. This is why a Six Sigma process has 3.4 DPM, a 0.0000034 probability of a defect, which corresponds to a 4.5 z-score: the assumption is that a short-term capability for a single opportunity equal to a z-score of 6 would, by the end of a year, be closer to a z-score of 4.5. A more common measure of process capability is Cpk, which equals the process sigma divided by 3, so a Six Sigma process has a Cpk of 2.0. In a stable process, the mean naturally shifts as much as 1.5 sigma in the long term on either side of its short-term value; a Six Sigma process is therefore actually 4.5 sigma in the long term, and a long-term process rated at 4.5 sigma is considered to have a short-term sigma score of 6. The real motivation for implementing a Six Sigma program is to make more money: Six Sigma strives to improve customer satisfaction, increase sales and reduce defects, with the ultimate goal of increasing profitability.
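The DPM-to-sigma conversion above (long-term z-value plus the 1.5-sigma shift) can be sketched with the standard library's normal quantile function:

```python
# Sketch: convert a long-term defect rate (defects per million) to a
# short-term sigma level via the normal quantile plus the 1.5-sigma shift.
from statistics import NormalDist

def sigma_level(dpm: float, shift: float = 1.5) -> float:
    """Short-term sigma level for a long-term defect rate in DPM."""
    p_defect = dpm / 1_000_000
    z_long_term = NormalDist().inv_cdf(1 - p_defect)
    return z_long_term + shift

# Reproduce the conversion table: 308,000 DPM -> ~2 sigma, ...,
# 3.4 DPM -> ~6 sigma.
for dpm in (308_000, 66_800, 6_210, 3.4):
    print(f"{dpm:>9} DPM -> sigma level {sigma_level(dpm):.2f}")
```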
  10. One of the biggest challenges in data analysis is dealing with outliers. Detecting outliers and understanding them can lead to interesting findings. Outliers are data points exceptionally far from the mainstream of the data, or observations that appear to deviate markedly from the other observations in the sample. Determining whether a point is an outlier is partly subjective, as there is no single mathematical definition of what constitutes one. Several approaches are available for detecting outliers:
- Extreme value analysis: the most basic form of outlier detection, good only for one-dimensional data. Values that are too large or too small are assumed to be outliers. The z-test and Student's t-test are examples of these statistical methods. They can also serve as final steps for interpreting the outputs of other outlier detection methods.
- Probabilistic and statistical models: these assume specific distributions for the data, use expectation-maximization (EM) methods to estimate the model parameters, and calculate each data point's probability of membership. Points with a low probability of membership are marked as outliers.
- Linear models: these use the distance of each data point to the plane that fits the subspace to find outliers. PCA (Principal Component Analysis) is an example of a linear model for anomaly detection.
- Proximity-based models: outliers are treated as points isolated from the rest of the observations. Cluster analysis, density-based analysis and nearest-neighbour methods are the main approaches of this kind; LOF (Local Outlier Factor) is one such method.
- Information-theoretic models: these rely on the fact that outliers increase the minimum code length needed to describe a data set.
- High-dimensional outlier detection: specific methods for handling high-dimensional sparse data, in which the nearest neighbourhood, the density of each cluster and finally an outlier score for each data point are calculated.
The identification of potential outliers is important because:
1. An outlier may indicate bad data: data coded incorrectly, or an experiment that did not run correctly. If it is proved that the data are wrong, they should be deleted from the analysis.
2. In some cases the outlier is not bad data: it may be due to random variation or may indicate something scientifically interesting.
The issues with regard to outliers are outlier labelling, outlier accommodation and outlier identification. It is recommended to use tests such as the normal probability plot; box plots and histograms can also help in checking the normality assumption and identifying potential outliers. The Grubbs test, the Tietjen-Moore test or the generalized ESD (Extreme Studentized Deviate) test can be used where data are normally distributed. In the case of a lognormal distribution, the data can be converted to a normal distribution before applying these tests.
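A simple, widely used screening step (the box-plot 1.5×IQR rule, a form of extreme value analysis) can be sketched as follows, with hypothetical data:

```python
# Sketch: flag potential outliers with the common 1.5*IQR box-plot rule
# (hypothetical one-dimensional sample).
from statistics import quantiles

data = [12, 13, 12, 14, 13, 12, 15, 13, 14, 42]  # 42 looks suspect

q1, _, q3 = quantiles(data, n=4)   # quartiles (default exclusive method)
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = [x for x in data if x < lower or x > upper]
print(f"Q1={q1}, Q3={q3}, fences=({lower}, {upper})")
print("Potential outliers:", outliers)
# Whether a flagged point is bad data or genuinely interesting still
# requires judgment, as discussed above.
```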
  11. Management that follows no standards has no means to assess the maturity of its business processes: no method to assess the risk immature processes pose to projects, nor to identify the root causes of weaknesses in process workflows which, if and when addressed, can reduce cost and increase operating efficiency. A process may be defined as "a system of operation or series of actions, changes, or functions that bring about an end or result, including the transition criteria for moving from one stage or process step to the next." A process is thus a predetermined course of action and a standing plan for people to follow in carrying out repetitive administrative tasks in a systematic way. Processes play a prominent role in coordinating activities between individuals and departments, and they establish formal lines of communication and cooperation between organizational units.
A framework (ideally a top-down structure from planning to action) should be created for the business processes to guide the organization as it moves from immature, inconsistent processes to mature, disciplined ones, and to provide a roadmap for continuous process improvement. This helps identify process deficiencies in the organization and guide improvements in logical, incremental steps. The framework should emphasize the service level agreement (SLA) and the metric definition for each business process. In other words, the organization needs to focus on improving the maturity level of its key business processes. This is a must for quality improvement, cost reduction and shorter delivery times, and it strengthens the organization's commitment to quality. With this in mind, companies pursue certifications such as ISO, CMM/CMMI and PCMM.
The maturity of a process or activity can be placed at one of five levels, from Level 1 (least mature) to Level 5 (most mature). The ground level is Level 0, where no process exists for the activity; processes at this level are informal and ad hoc, and performance is unpredictable. Mature processes, on the other hand, are:
- Defined: practices are defined and integrated.
- Repeatable and predictable: a project management system is in place and performance is repeatable.
- Controlled, measured and monitored: products and processes are quantitatively controlled, with detailed measurement.
- Optimised for improvement: process improvement includes change management and defect prevention.
As an organization grows, the maturity of its processes usually improves, but this may not hold for all of its business processes; some may remain at lower levels, showing that the organization lacks all-round process maturity. Maturity levels may even drop if processes are not monitored or documents are not revised in line with changes in the business. So an established business may contain processes at different levels. Improving all business processes at once would cost a great deal of money, effort and time; to avoid this, the organization needs to prioritize its business processes and enhance their maturity levels gradually. Process maturity is thus an indication of how close a developing process is to being complete and capable of continual improvement through qualitative measures and feedback. A mature process is one that is complete in its usefulness, automated, reliable in its information and continuously improving. Six Sigma, Kaizen, business excellence, total quality management and similar methodologies encourage a quality and continuous-process-improvement culture. The 'capability' of an organisation is measured by the maturity of its processes, and business-critical processes are the ones with the strongest impact on the organization's overall maturity.
For an all-round improvement of business processes, the Six Sigma DMAIC (Define, Measure, Analyze, Improve, Control) methodology can be used. Other well-known process models include the Waterfall model, the Spiral development model, Rapid Application Development and incremental refinement.
  12. Companies carry on business in order to earn profits, but this should not be done at the cost of customers. The same applies in reverse: customers want the best products at the lowest prices, which companies cannot provide without going bankrupt. So the VOC (Voice of Customer) and VOB (Voice of Business) should be looked at and matched so that they work in conjunction with each other, processes are mapped, and full value is delivered to customers. A comparison of the two concepts, parameter by parameter:
1. Definition
   - VOC: the needs, wants, expectations and preferences, both spoken and unspoken, of the business's customers, internal or external.
   - VOB: the needs, wants, expectations and preferences, both spoken and unspoken, of the people who run the business.
2. Examples
   - VOC: fast 24x7 service to customers, no downtime.
   - VOB: revenue, growth, market leadership.
3. Procurement
   - VOC: reactive or proactive methods, e.g. customer surveys, customer interviews, market research, release evaluations, feedback forms.
   - VOB: financial and market data analysis, competition analysis, employee surveys.
4. Tools usable
   - VOC: surveys, Kano analysis, CTQ.
   - VOB: KPIs such as ROI, % of income from returning customers, shareholder equity.
5. Usage of data
   - VOC: modify products; modify processes.
   - VOB: fine-tune business strategy; identify opportunities; shape the internal processes needed for support; ensure zero defects, zero waste and employee motivation.
6. Metrics
   - VOC: release evaluations, customer satisfaction scores, Net Promoter Score, product delivery times.
   - VOB: employee satisfaction, employee turnover, number of defects.
The manner in which one collects, analyzes and deploys the results of VOC and VOB depends largely on the application: how one wants to use the results, which business decisions one wants to drive, and what organizational outcomes one expects. Project management struggles with time, cost and resource challenges; with many projects under evaluation, it is difficult to isolate information and inputs from the various voices, and identifying which project will meet business objectives requires deep visibility of the key influencers. Project managers who focus on optimizing business outcomes enable the business to improve its bottom line. Examining the gaps and overlaps between VOB and VOC, and identifying strategic direction and change that is rational and data-based, ensures positive ROI as a result of the changes. Staying tuned to the inputs received from customers and the business will go a long way towards achieving growth.
  13. Elimination of waste is one of the most effective ways to increase the profitability of any business. Process steps either add value or add waste to the production of a good or service; waste is anything that adds no value. Muda is the Japanese word for waste. Seven wastes have been identified in Lean, remembered by the mnemonic TIMWOOD (Transport, Inventory, Motion, Waiting, Overproduction, Over-Processing, Defects) or WORMPIT (Waiting, Over-Production, Rejects, Motion, Processing, Inventory, Transport). Two additional wastes have been identified which directly impact the bottom line: waste of talent and waste of resources.

To eliminate waste, it is important to understand exactly what waste is and where it exists. Wastes are included within the cost of a product, either inflating the price the customer pays or reducing the profit of the company. A company's profit is its selling price less the costs incurred in manufacturing the product. The selling price is largely dictated by the market, because charging too much or too little loses customers, so the only way to improve profit is to reduce costs, above all by removing waste from your processes. Beyond improving profits, waste has a major impact on customer satisfaction: customers want on-time delivery, perfect quality and the right price, none of which can be achieved if the seven wastes persist within your processes.

To eliminate the seven wastes we should implement Lean and the various Lean tools. The focus should be on improving the process with the satisfaction of the end customer in mind, thereby indirectly dissolving waste. Non-value-added processes become much better as a result, and we stop doing activities that do not benefit the customer. Customers will pay for value-added work, but never for waste. Defects are the area that should be looked into first.

Quality defects are a tremendous cost to an organization because they result in rework and scrap, along with associated costs such as quarantining inventory, re-inspecting, rescheduling and capacity loss. Through employee involvement and continuous process improvement, defects can be reduced at many facilities. Other areas to examine are under-utilization of employee skills, unsafe workplaces and environments, lack of information sharing, and equipment breakdowns. For each waste, there should be a strategy to reduce or eliminate its effect on the company, thereby improving overall performance and quality.
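The profit relationship described above (profit = selling price − cost, with the selling price fixed by the market) can be shown with a small sketch; all figures, including the assumed share of cost that is waste, are invented for illustration:

```python
# With the selling price fixed by the market, profit can only grow
# by cutting cost -- for example, by removing waste from the process.
selling_price = 100.0   # dictated by the market, cannot be raised
base_cost = 80.0        # current manufacturing cost, waste included
waste_share = 0.25      # assume 25% of that cost is pure waste (illustrative)

profit_before = selling_price - base_cost
profit_after = selling_price - base_cost * (1 - waste_share)

print(profit_before)  # 20.0
print(profit_after)   # 40.0 -- removing the waste doubles the profit
```

The point of the arithmetic is that a modest cut in cost produces a disproportionately large gain in profit, because the whole saving falls straight to the bottom line.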
  14. Two kinds of statistics are most often used to describe data: measures of central tendency and measures of dispersion (spread). Together they are called descriptive statistics because they help describe your data. Mean, median and mode are all measures of central tendency; they summarize and communicate a data set with a single, easily understood number. Range, variance and standard deviation are all measures of dispersion; they describe the variability in a sample or population, i.e. whether the scores are close together or far apart. For example, if we record the heights of the students in a class, a measure of central tendency such as the mean, median or mode tells us how tall the students typically are, while a measure of dispersion such as the range, variance or standard deviation tells us how much the heights vary around that typical value. When conducting research, generally only a random sample of the data is examined, because examining all the data would incur heavy cost and time. Whether to use the median, mean or mode depends on the type of data (such as nominal or continuous), whether the data has outliers and/or is skewed, and what is to be inferred from the data. The mean should not be used where the data is skewed; instead one should normally use the median or mode, with the median usually preferred. The first step in assessing the spread of data is to examine it in a table or in graphical form. In a graph one can clearly see symmetry (or the lack of it) in the spread, whether there are obvious atypical values (outliers), and whether the data is skewed in one direction or the other. It is extremely important to detect outliers within a distribution, because they can alter the results of the data analysis.
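The two families of descriptive statistics can be computed directly with Python's standard statistics module; the heights below are invented figures for illustration:

```python
import statistics

# Heights (in cm) of ten students -- invented figures
heights = [152, 155, 157, 158, 160, 160, 162, 165, 168, 190]

# Measures of central tendency: one number summarising the data
mean = statistics.mean(heights)      # 162.7
median = statistics.median(heights)  # 160.0
mode = statistics.mode(heights)      # 160

# Measures of dispersion: how spread out the data is
data_range = max(heights) - min(heights)   # 38
variance = statistics.variance(heights)
std_dev = statistics.stdev(heights)

# The single tall outlier (190 cm) pulls the mean above the median
# but barely moves the median -- which is why the median is
# preferred for skewed data, as noted above.
print(mean > median)  # True
```

Note how one outlier is enough to separate the mean from the median; checking that gap is a quick first test for skew before plotting the data.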
  The most important and useful distribution in statistical analysis is the normal distribution, characterized by a bell-shaped curve when interval data is represented as a histogram or line graph. When faced with a sample of non-normally distributed (skewed) data, there are two choices: accept the distribution as it is and use statistical methods that do not assume normality, or attempt to transform the data into a normal distribution. Common methods of transforming skewed data are logarithmic, square-root and reciprocal transformations. Two types of standard deviation can be used to describe the variability of the sample data: the population standard deviation and the sample standard deviation. For example, a researcher who recruits males aged 50 to 65 to investigate risk markers for heart disease (e.g., cholesterol) would use the sample standard deviation because, although not explicitly stated, he is not concerned only with the participants of the study; he wants to generalise the results to the whole population of males aged 50 to 65. A teacher who sets an exam for his students, on the other hand, would use the population standard deviation because he is interested only in the scores of those students and nobody else.
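The two standard deviations map directly onto Python's statistics module: pstdev divides by n (the teacher's case, where the data is the whole population), while stdev divides by n − 1 (the researcher's case, where the data is a sample), so the sample estimate is always slightly larger. The exam scores are invented for illustration:

```python
import statistics

# Exam scores for a whole class of six students (invented figures)
scores = [62, 70, 74, 78, 85, 91]

# Teacher's case: these scores ARE the entire population of interest
pop_sd = statistics.pstdev(scores)   # divides by n

# Researcher's case: the group is a sample of a larger population
samp_sd = statistics.stdev(scores)   # divides by n - 1 (Bessel's correction)

print(samp_sd > pop_sd)  # True -- the sample estimate is always larger
```

Dividing by n − 1 corrects for the fact that a sample tends to understate the spread of the population it was drawn from; with only six data points the difference between the two values is clearly visible.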
  15. Tribal knowledge is unwritten information that is known within a tribe or group of people but not commonly known to others within the organization, yet is required in order to produce a quality product or service. It is the collective knowledge and capability of people in the organization, and it can be converted into company property; in practice, however, management actions, organization structure and the processes used are often in direct conflict with this Six Sigma step because free information flow is discouraged. Knowledge held by the tribe is treated as factual even though there is no data or analysis to verify it, and because it is not documented it exists only in the minds of certain people. Organizations face challenges in the following ways:

· When employees leave, they take this tribal knowledge with them. When new employees are not correctly trained by long-term employees, this can create serious risks to the product, the service or the safety of employees, as certain software or equipment may not be used correctly.

· Tribal knowledge can be an excuse to avoid automation. Businesses or managers averse to the idea of automation rely solely on time-consuming manual processes.

· Employees can hoard information for job security. Employees sometimes, intentionally or unintentionally, put themselves in the position of being the only people who know how to fix a problem or perform a task, to ensure job security and remain indispensable.

Tribal knowledge can be captured in the following ways:

· Identify and utilize the most knowledgeable employees. Knowledgeable employees are invaluable, and the focus should be on capturing their knowledge.

· Identify the available tribal knowledge. Identify the knowledge available in different departments or teams and look at what is working and what isn't.

· Document the knowledge you want to keep. Document the knowledge that is to be retained so that you can successfully train your newest employees.

· Confront the knowledge gap. Train new employees to minimize the knowledge gap between the experienced and the newer employees, so that new employees become efficient and knowledgeable about their jobs.

Tribal knowledge is often created unintentionally and is common in most organizations. Companies must be diligent in capturing this information and making it readily available to all employees through proper training. It may be essential to the production of a product or the performance of a service, yet it may also run counter to the documented process. For example, if three chefs prepare the same dish, each with some degree of variability, and the hotel owner wants to recreate one chef's dish on a continual basis, he would have to rely totally on that chef unless the procedure for creating the dish is well documented. In another example, a product line was restarted after being down for two years, but the original operators had to be re-hired in order to produce a product that worked. A technology-based training system will go a long way towards ensuring that every employee has the same understanding, keeping the workplace unified and safe as it should be.