
Mohamed Asif

Fraternity Members
Mohamed Asif last won the day on March 3

Mohamed Asif had the most liked content!

Community Reputation

15 Good


About Mohamed Asif

  • Rank
    Advanced Member
  • Birthday 03/30/1985

Profile Information

  • Name
    Mohamed Asif Abdul Hameed
  • Company
    Allstate India
  • Designation
    Senior Lead Consultant - Operations Excellence

  1. 自働化 - Jidoka - Autonomation. Simply put, the process automatically halts when there are non-conformities/irregularities/abnormalities in the system; it can also act as an early-warning device in a manufacturing unit. The Andon light system is one of the vital components of Autonomation. In the reference picture below, Andon is used as a visual management tool to show the status of production.
     Legend reference:
     • Green - All good, normal operation > Proceed further
     • Yellow - Warning, issue identified, requires attention > CAPA required
     • Red - Production halted > Issue not identified; immediate supervisor inspection and RCA required
     Andon systems are commonly used in the following industries: automotive manufacturing, packaging and shipping, retail, textile, inventory management, and medical and healthcare.
     Example in healthcare: visual Andons include lights on Code Blue carts to indicate that they need to be checked for the day, and intrusive computer-screen alerts for pharmacists that indicate a patient allergy or contraindicated medication. Auditory Andons include the alarm on infusion pumps that signals the medication is nearly gone or that there is a defect in the tubing; the audible alarm indicates acute medical emergencies.
     The Andon system has a wide range of applications; a few are listed below:
     • Reduces downtime and thereby enhances OEE
     • Contributes significantly to quality improvement through shortened downtime
     • Improves communication between operators and engineers
     • Evaluating incidents from Andon events can lead to Poka-Yoke applications
     • Advanced (next-gen) Andon systems can display KPI metrics with real-time embedded data
     • Production systems can be managed remotely through data flowing into tablets and smartphones
     • Improved productivity, reduced lead time, lower equipment failure rate, improved customer service and lower costs
     However, there are a few challenges that need to be taken care of:
     • High implementation cost
     • Misunderstanding of automation leads to misapplication
     • Difficulty for small enterprises in hiring qualified staff
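The Green/Yellow/Red legend described above can be sketched as a tiny status mapping. The following Python sketch is illustrative only; the signal names `defect_detected` and `line_halted` are assumptions, not part of any specific Andon product:

```python
from enum import Enum

class AndonStatus(Enum):
    GREEN = "Normal operation - proceed"
    YELLOW = "Issue identified - CAPA required"
    RED = "Production halted - RCA required"

def andon_status(defect_detected: bool, line_halted: bool) -> AndonStatus:
    """Map line signals to the Green/Yellow/Red legend above."""
    if line_halted:          # line stopped: immediate inspection and RCA
        return AndonStatus.RED
    if defect_detected:      # abnormality found but line still running
        return AndonStatus.YELLOW
    return AndonStatus.GREEN
```

In a real system the inputs would come from sensors and fixed-position stops rather than booleans.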
  2. The Kubler-Ross Change Curve is a model for crossing the transition from change initiation to reaching the overall goal. It describes the emotional journey that people undergo when confronted with change and transition. The typical emotional changeovers are: Shock, Denial, Anger/Frustration, Depression, Experiment, Decision, Integration/Acceptance. This can be viewed in 4 broad stages:
     Stage   | State       | Reaction
     Stage 1 | Status Quo  | Shock, Denial
     Stage 2 | Disruption  | Anger, Fear, Blame, Confusion
     Stage 3 | Exploration | Searching, Acceptance
     Stage 4 | Rebuilding  | Commitment, Problem-solving
     People's reaction to the pandemic is no different and resembles the curve below. Critical learnings and takeaways from this transition that organizations should adopt:
     • Involve and engage people; put people first
     • Move faster; reimagine value creation
     • Communicate with transparency; work at the speed of the customer
     • Build an augmented workforce strategy; reskill resources - promote and facilitate digital learning
     • Make processes and services mobile and flexible; adopt an ecosystem mindset
     • Embrace new technologies and data-rich technology platforms
     • Empower leaders with judgement; turbocharge decision making
     • Treat talent as the scarcest resource; learn how to learn; change, adapt and innovate
     These should be adopted not only in a crisis but on an ongoing basis to evolve, grow and achieve organizational success; doing so could also shrink the change-transition timeline.
  3. In the Six Sigma DMAIC methodology, goals should be SMART (Specific, Measurable, Attainable, Relevant and Time-bound), and the goal of the project should be aligned to the issue/problem we are trying to solve. The A in SMART can refer to Attainable, Agreed upon, Achievable, Acceptable or Action-oriented. Attainable is an important element and needs to be considered thoughtfully: attaining a defect rate of less than 1% from 50% might not be possible within the course of the project without knowing the causes and the solution path. A few questions that can be asked while framing the goal:
     • Is 50% allowed? What is the accepted defect %?
     • What is the Cost of Poor Quality? The FTR %?
     • What control measures are in place?
     • What is the frequency of defects? What are the defect-detection methods?
     • What is the financial impact?
     • What is the gap: actual versus desired?
     • How do we ensure quality at source (quality by design, monitoring and control, self-check and verification)?
     Based on the responses, aggressive but reasonable goals are drafted. One should have a clear understanding of the current state (current performance, the baseline) and the future state (desirable performance, the target), and do a gap analysis to verify whether the goal is reasonable/possible while starting the Six Sigma project and framing the goal statement. There is no hard and fast rule when it comes to the goal statement: the entire project charter is a living document and can be revised/revisited as we gain more clarity while progressing through the phases and during iteration. To avoid being overwhelmed, goals should be fine-tuned and broken down into action steps; we should not try to take over the world in one night by setting unachievable goals. A few vital pointers with regard to the goal:
     • Achievable and Attainable - practical with the assigned target value
     • Realistic - with the existing resources and available time
     • Action-oriented - with a defined plan of action
     Specifically, the goal should be possible within the defined ability. Takeaway: achieving a goal is far more realistic when it is specific.
  4. Benford's law, also referred to as the first-digit law, states that the distribution of leading digits in naturally occurring collections of numbers is non-uniform: the digit 1 tends to occur with a probability of around 30%, much greater than the expected 11.1% (1 out of 9 possible leading digits). On the flip side, non-naturally occurring data has pre-defined numbers - Zip Codes or Universal Product Codes (barcode symbology), for instance - and computer-generated numbers using Rand() do not follow Benford's law. Mathematically, the law is based on the base-10 logarithm: the probability that the leading digit of a number will be n is log10(1 + 1/n).
     The law is used in business and accounting to detect fraud, and applies when dealing with: general ledgers, trial balance reports, income statements, balance sheets, invoice listings, inventory listings, depreciation schedules, investment statements, expense reimbursements, accounts payable and receivable reports, timesheet data, portfolios and expense reports.
     • Risk-based audits: the law can serve as an early indicator of abnormality in data patterns
     • Forensic audits: checking fraud, bypassing of threshold limits, improper payments
     • Financial statement audits: manipulation of checks, cash on hand
     • Corporate finance: examining cash-flow forecasts for profit centers
     It is widely used by income tax agencies, auditors and fraud examiners to detect abnormal patterns. It works when:
     • we have large data sets
     • the metrics considered have equal opportunity to occur
     • we don't have definitive proof
     • there is no built-in minimum or maximum value in the data set
     Insurance industry: in the US, the General Accounting Office has estimated that fraud accounts for up to 10% of annual health-care expenditure, or $100 billion. In the health insurance industry there is a large amount of claims data submitted by health-care providers. Benford's law can be used to analyze this data and detect abnormalities. We can use Z-statistics to determine the difference between the actual and expected proportions and check for significance. One can use this tool to detect possibly fraudulent or errant claims received on behalf of a health insurance company.
     Closure points: Benford's law is an excellent tool to predict the distribution of the first digit in a large population of data, provided the data has not been interfered with by a human touch.
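The log10(1 + 1/n) formula and the Z-statistic comparison described above can be sketched in a few lines of Python. This is a minimal illustration; the function names are my own and a real audit would add continuity corrections and larger samples:

```python
import math
from collections import Counter

def benford_expected(d: int) -> float:
    """Benford probability that the leading digit is d (1-9): log10(1 + 1/d)."""
    return math.log10(1 + 1 / d)

def leading_digit(x: float) -> int:
    """First significant digit of a nonzero number."""
    s = str(abs(x)).lstrip("0.")
    return int(s[0])

def first_digit_distribution(values):
    """Observed share of each leading digit 1-9 in a data set."""
    digits = [leading_digit(v) for v in values if v != 0]
    n = len(digits)
    counts = Counter(digits)
    return {d: counts.get(d, 0) / n for d in range(1, 10)}

def benford_z(p_obs: float, n: int, d: int) -> float:
    """Z-statistic for the gap between an observed proportion and Benford's expectation."""
    p_exp = benford_expected(d)
    return abs(p_obs - p_exp) / math.sqrt(p_exp * (1 - p_exp) / n)
```

For example, `benford_expected(1)` is about 0.301, matching the roughly 30% share of the digit 1 mentioned above.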
  5. Decision making can be a combination of logic and emotions. Analysis paralysis is overanalyzing/overthinking (drowning in data); extinct by instinct is taking decisions by gut reaction (disregarding the data). Overanalyzing or overthinking can hinder productivity:
     • It lowers performance on mentally demanding jobs
     • There is a high possibility that overthinking kills creativity
     • It drains the willpower of the individual
     • It makes people less happy
     Organizations carrying out excessive formal analysis have been labeled a "dialogue of the deaf," a "vicious circle," or a "decision vacuum." AI and ML tools can help in interpreting big data and can be wisely leveraged in this technology era rather than overanalyzing. Some tips to overcome overthinking:
     • Structure and plan the decisions that matter most
     • Deliberately limit information intake; set a target deadline for research
     • Focus on the primary objective and try not to deviate
     • Discuss, collaborate and be open to inputs from others
     • Approach problem solving with an iterative mindset and, finally, practice making decisions quickly
     At the other extreme, extinct by instinct - decisions made without assessment, purely by gut feeling - can be fatal. Fatal in all aspects, for example: stepping over the line in personal relationships, lack of spending control in finances, ignoring impacts on health, not using time wisely. Malcolm Gladwell argues in his book Blink that the quick decision, the "snap" judgment, is much maligned. Below is a reference grid indicating problem-solving and decision-making styles. Organizations should leverage technology and make decisions wisely: neither drowning in the data pool nor discounting the data.
     Quote: "If you spend too much time thinking about a thing, you'll never get it done" (Lee). At the same time: "Quick decisions are unsafe decisions" (Sophocles).
  6. #KeepItSimple. If you have a bright idea, you have to pitch it to make it go. "Elevator pitch" is the term for a presentation of under two minutes, the time it takes to go from the lobby to the office floor, used to catch the attention and interest of a sponsor. Points to remember:
     • Keep it simple, effortless and natural
     • There is no need to close the deal in the elevator pitch; it's all about grabbing interest
     • Keep it ready, set and prepared
     Below are 10 do's and don'ts that can keep your pitch SMART.
     Do's:
     • First and foremost, practice your presentation
     • Start the pitch with a vibrant, compelling hook
     • Be positive
     • Structure your USP
     • Be confident
     • Take it slowly
     • Have data/numbers to back your idea
     • Maintain eye contact
     • Have a quick takeaway point
     • Keep it to a minute or less
     Don'ts:
     • Don't be too fast
     • Don't use jargon/acronyms that might get you off track
     • Don't focus just on yourself
     • Don't hesitate to update your speech as the situation changes
     • Don't be robotic (monotone) or sound as if you have memorized it
     • Don't oversell or undersell
     • Don't sound too salesy
     • Don't restrict yourself to a single pitch; have different versions
     • Don't wrap up early if your listener is interested, and don't drag on if their eyes are glazing over
     • Finally, don't forget to say "Nice to meet you"
     End lines: referring to the quotes from a great physicist.
  7. With reference to the weight-of-the-ox experiment, Galton discovered that the average guess was remarkably close to the actual weight. Occasionally it makes sense to go with a guess from crowd intelligence, specifically when there is no scale, or when accurate measurement is expensive or time-consuming. A few considerations to factor in before relying on this method:
     • Diversity of opinion
     • Independence of opinion
     • Trusting the group to be fair in giving opinions; be cautious of groupthink
     • Consideration of activity time
     • Distribution of the expert panel, to avoid dominance and skewness
     • Diffusion of responsibility
     We still use similar versions of the Wisdom of Crowds in organizations.
     Ref 1: Story point estimation in Agile development. During development it is critical to plan the budget, time and resources required to complete the module in every sprint. The entire project is broken into different levels, and function-point analysis and story points are assigned to each user story. Story point estimation gives a rough estimate (more like a guess) for the product backlog items with relative sizing; it is a comparative analysis. Key estimators include developers, testers, the scrum master, the product owner and other related stakeholders. Some common estimating methods are:
     • T-shirt sizing - typical measures comprise XS, S, M, L, XL, XXL
     • Fibonacci series - measures include 1, 2, 3, 5, 8, 13, 21, 34, 55, etc.
     • Animal sizing - the team uses ant, bee, cat, dog, lion, horse, elephant, etc. (it's fun to use animals instead of numbers)
     Based on the effort required, points are assigned to the stories during planning: a quick and effective way of using crowd intelligence for story point estimation.
     Ref 2: The Delphi technique. This method relies on an expert panel's decision, gathering opinions from experts much like gathering crowd intelligence in the Wisdom of Crowds.
     Ref 3: Group Decision Support Systems (GDSS). Many organizations use this method for idea generation, selecting the optimal solution, voting and so on. Using this technique it is easy to arrive at solutions for complex problems.
     Other effective methods and applications include: brainstorming, nominal group technique, devil's advocacy, fish bowling and didactic interaction.
     Closing quotes to go with the Wisdom of Crowds.
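Galton's averaging idea is easy to demonstrate. The sketch below uses invented guesses (not Galton's actual data); the median is often preferred because it is robust to a few wild guesses:

```python
import statistics

# Hypothetical crowd guesses for the weight of an ox, in pounds
guesses = [1105, 1198, 1250, 1320, 1175, 1240, 1290, 1150, 1230, 1210]

crowd_mean = statistics.mean(guesses)      # simple average, as in Galton's study
crowd_median = statistics.median(guesses)  # robust to a few extreme guesses

print(crowd_mean, crowd_median)
```

The same one-liner works for story point estimates collected from a sprint-planning team before discussing outliers.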
  8. The Kaplan-Meier estimator (product-limit estimator) is used mostly in medical and pharmaceutical research. It estimates the survival function from recorded data to analyze post-treatment performance. It is also used in non-medical fields to measure time, performance and other metrics of interest after an event. Survival analysis deals with time durations: the time from an origin event to the occurrence of the event of interest (which can be improvement, death, etc.). The estimator is plotted over time, and the resulting curve is referred to as the Kaplan-Meier curve. Apart from the Kaplan-Meier estimator, the Cox model and the Cochran-Mantel-Haenszel test are also used for survival analysis.
     For the data considered, we could either have complete time data with the event of interest, or the observations could be censored; for censored observations we cannot accurately measure the total survival time of the patient. The analysis can be run for two groups of subjects and the statistical significance of the difference in survival estimated. Some assumptions of the test:
     • Censored patients have the same survival prospects as those who continue to be followed
     • Survival probabilities are the same for subjects recruited early and late in the study period
     • Events happened at the times specified
     The survival probability St is calculated as:
     St = (number of subjects living at the start - number of subjects who died) / (number of subjects living at the start)
     Working in Minitab: this is a non-parametric analysis and can be done with the following steps:
     1) Create a distribution overview plot [Stat > Reliability/Survival > Distribution Analysis (Right Censoring) > Distribution Overview Plot]; either Right Censoring or Arbitrary Censoring can be used based on the data type
     2) Select variables
     3) Specify the distribution
     4) Update the censoring column/value
     An example in R: let's walk through the Kaplan-Meier estimator in RStudio using the built-in lung data set. Input: weight loss between males and females post event; output: improved / not improved.
     • Load the survival library
     • Create a survival object (Surv())
     • Create data for the plot (survfit())
     • Perform the log-rank test (survdiff()) for the binary outcome
     • Select parameters [e.g., time, status, sex, weight loss, etc.]
     • Treat observations with no outcome/no event as censored, use Surv(time, event), and run the function to view the plot and interpret the analysis
     In the curve above we can see that weight loss is better for females than for males. With p-value = 0.00133, we can say there is a significant difference. A pitfall of the Kaplan-Meier estimator is that not all patients turn up for follow-up after treatment, so we must analyze and interpret the curve cautiously.
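The product-limit idea, multiplying at each event time the fraction of at-risk subjects who survive, can be implemented directly. Below is a minimal Python sketch (not Minitab or R); the toy data in the usage note is invented for illustration:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times:  observed duration for each subject
    events: 1 if the event of interest occurred, 0 if the subject was censored
    Returns a list of (event_time, survival_probability) pairs."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Tied times are adjacent after sorting; count deaths and all removals at t
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, _ in data if tt == t)
        if deaths:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
        i += removed
    return curve
```

For example, `kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 1])` steps down only at event times; the censored subject at t = 3 simply shrinks the at-risk count for later times.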
  9. Ensuring compliance and mitigating risk are vital in organizations, and fraud detection is an inevitable part of a risk management framework, because "undetected fraud encourages more fraud." In the banking environment, fraud detection and prevention are done more proactively than in other domains. Some of the methods used to detect fraud:
     • Intrusion detection systems - passive systems that monitor and notify the user
     • Transaction monitoring for suspicious activity and operating-procedure violations
     • Alerting the user and the risk-response team when unusual activity is detected (based on spending behavior and location)
     • Real-time monitoring of high-value transactions
     • Advanced firewalls that auto-detect and block traffic based on IP and port number
     Some frequently scheduled activities and best practices to prevent fraud:
     • Intrusion prevention systems - active systems that monitor and automatically deflect attacks
     • 2FA (two-factor or multi-factor authentication), an extra layer of protection to secure online transactions
     • Blocking debit/credit cards when a wrong PIN is entered repeatedly
     • OTP and secure-code authentication for online transactions
     • Limiting the transfer value to a new beneficiary for the first 24 hours, and limiting the number of beneficiaries that can be added within a 24-hour window
     • Auto-logoff when the user is idle and no activity is detected
     Commonly used security tools in financial institutions include:
     • Proxy piercing - helps trace a fraudster's true location
     • Device fingerprinting - captures the transaction pattern associated with a device and flags it
     • Blacklisting - blocks traffic initiated from a specific user/domain/location/country (dark-web monitoring)
     • Velocity checking - watches for repeat purchases from the same user and flags them
     Adopting multiple fraud-detection tools and methodologies is the only way to effectively fight online fraud. These tools can help with: payment fraud prevention, new-account fraud prevention, account takeover protection, payment authorization, dynamic checkout, charge-back guarantee, representment, content integrity and CNP fraud protection.
     In the insurance environment, especially during claims, organizations have traditionally relied on expert judgement, Special Investigation Teams and adjusters. However, organizations should leverage technology to mitigate, prevent and combat fraudulent activity, for instance:
     • Analytical techniques such as Artificial Neural Networks (ANN) to flag unusual claims
     • Data-mining methods such as clustering based on specific customer UIDs and segments
     • Pattern-recognition algorithms and models to identify patterns against historical records
     • Text mining and logistic regression techniques to identify claimant records
     Categorization can be done based on available data: Clean Claim (fast-track settlement), Standard Analysis (normal processing TAT) and Critical Investigation (potential fraudulent claim). Lemonade Insurance Company reports a claim paid in 3 seconds with no paperwork (source: Insurance Innovation Reporter). For companies like Lemonade, the fraud detection and prevention system must meet apex standards to maintain reputation and customer relationships.
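Of the tools above, velocity checking is the simplest to sketch: count a user's transactions in a sliding time window and flag when a limit is exceeded. A minimal Python illustration follows; the threshold and window are invented for the example, and real systems combine many more signals:

```python
from collections import deque
from datetime import datetime, timedelta

class VelocityChecker:
    """Flags a user whose transaction count in a sliding window exceeds a limit.
    Thresholds here are illustrative, not from any real fraud engine."""

    def __init__(self, max_txns=3, window_seconds=60):
        self.max_txns = max_txns
        self.window = timedelta(seconds=window_seconds)
        self.history = {}  # user_id -> deque of recent timestamps

    def record(self, user_id, ts):
        """Record one transaction; return True if the user should be flagged."""
        q = self.history.setdefault(user_id, deque())
        q.append(ts)
        # Drop timestamps that have fallen out of the sliding window
        while q and ts - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_txns
```

For example, a fourth transaction from the same user within a minute would be flagged for review, while one arriving an hour later would not.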
  10. 自働化 - Jidoka - Autonomation: the process automatically halts when there are non-conformities/irregularities/abnormalities in the system. The Andon light system is one of the vital components of Autonomation. In the reference picture below, Andon is used as a visual management tool to show the status of production.
     Legend reference:
     • Green - All good, normal operation > Proceed further
     • Yellow - Warning, issue identified, requires attention > CAPA required
     • Red - Production halted > Issue not identified; immediate supervisor inspection and RCA required
     Some of the planning aspects necessary to benefit from Jidoka:
     • Combine JIT and Jidoka: this avoids overproduction and minimizes poor quality while increasing productivity; under continuous flow it also avoids bottlenecks and idle time
     • Implement lean flow before Autonomation
     • Make effective use of systems and technology to make Andon lights interactive; this can improve communication between operators and engineers
     • Keep downtime minimal to magnify quality and Overall Equipment Effectiveness (OEE)
     • Have Rapid Issue Response (RIR) teams ready to address open and high-priority tickets
     • Integrate Andon boards, monitoring systems and alert systems for quick response
     • Train operators and engineers on Autonomation tools - Andon, Andon cord, fixed-position stop, Poka-Yoke, sensors - and appropriate lean tools
     • Empower the workforce in the pursuit of excellence
     • Corrective action is essential, but preventive action and Poka-Yoke should be emphasized for effective Jidoka benefits
     Moving to Jidoka: Minimize manual labor > Mechanize flow > Implement lean > Optimize > Automate > Autonomate
  11. I have summarized some methods to overcome and defeat groupthink during a brainstorming session. A few best practices:
     • Engage in open discussions
     • Allocate a "devil's advocate" in the team - red teaming
     • Structure the brainstorming session
     • Encourage wild ideas
     • Evaluate alternatives cautiously - use the "Six Thinking Hats" approach
     • Disrupt abundantly when required
     • Encourage conflict of ideas, so that the group doesn't end up with limited decisions
     • Give everyone a chance to speak during the session, perhaps in a timed round-robin fashion
     • Add new elements to brainstorming - introduce "reverse brainstorming" and "brainwriting"
     • Pay close attention to group dynamics
     • Encourage diversity
     • Occasionally invite a cross-functional team member as an external consultant
     • Remain impartial until the wrap-up
  12. By definition, a Nash equilibrium is a stable state of a system involving the interaction of different players in which no player can gain by an independent (unilateral) change of strategy, provided the strategies of the other players remain unchanged. Below is the payoff matrix for Company A and Company B for their decision on whether or not to diversify. In this scenario:
     • "Players" are the firms, Company A and Company B
     • "Moves" are the actions the firms can take: diversify or not diversify (something like Apple's strategic decision on entering the car business)
     • "Payoffs" are the profits the firms will earn (diversifying increases a firm's operational costs but can increase revenues in the long run)
     Here the equilibrium outcome is that both companies diversify. Even though both A and B would perform better if neither diversified, such a position is highly unstable, as each company gains the upper hand by diversifying (an extra +30) when its competitor does not. This result is called the "Nash equilibrium" - a win-win situation in which neither Company A nor Company B has anything to gain by changing its own decision unilaterally. Simply put, the Nash equilibrium is the most equitable solution (the most stable state), though not the most obvious one, when there is multi-party conflict.
     The Nash equilibrium is one of the fundamental concepts of game theory and provides a basis for rational decision making. It can be used to predict a company's response to a competitor's prices and decisions. In 2000, advice from economists raised £22.5 billion for the UK government in an auction of 3G bandwidth licenses for mobile phones (source: UKRI Economic and Social Research Council). In an oligopolistic market, if one organization reduces its service prices, the competitors must reduce their prices as well to retain customers.
     Classical Indian examples:
     • Bharti Infratel and Jio striking a Nash equilibrium on telecom infrastructure sharing
     • Shiv Sena's dilemma over whether to support BJP or leave the alliance while forming the government in Maharashtra
     Organizational decision making involves choosing between alternatives under uncertainty, complexity, interpersonal issues and high-risk consequences. Organizations can apply game theory by changing the payoffs; even though it is difficult to switch from a competitive to a cooperative strategy with any degree of success, cooperating with rivals/competitors can leave everyone better off. Applying the concept helps narrow down the number of possible outcomes, and therefore the strategic decisions to be taken.
     Takeaway: lose-win and win-lose situations in any kind of relationship are temporary, do not last, and can easily turn into lose-lose situations later. To make a strong, long-term relationship sustainable, we have to rely on a win-win situation.
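The best-response check that defines a Nash equilibrium can be verified mechanically. Here is a Python sketch of the diversify/hold game; the payoff numbers are illustrative (chosen so that deviating from hold/hold yields the extra +30 mentioned above), not figures from the original matrix:

```python
from itertools import product

# Illustrative payoffs (A's profit, B's profit); not the original post's figures
payoffs = {
    ("Diversify", "Diversify"): (80, 80),
    ("Diversify", "Hold"): (130, 70),
    ("Hold", "Diversify"): (70, 130),
    ("Hold", "Hold"): (100, 100),
}
strategies = ["Diversify", "Hold"]

def is_nash(a, b):
    """True if neither player gains by deviating unilaterally from (a, b)."""
    pa, pb = payoffs[(a, b)]
    if any(payoffs[(a2, b)][0] > pa for a2 in strategies):
        return False  # A has a profitable deviation
    if any(payoffs[(a, b2)][1] > pb for b2 in strategies):
        return False  # B has a profitable deviation
    return True

equilibria = [(a, b) for a, b in product(strategies, strategies) if is_nash(a, b)]
```

With these payoffs the only equilibrium is both firms diversifying: hold/hold pays more to both, but each firm can grab +30 by deviating, so it is unstable, exactly the dynamic described above.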
  13. The value of any measurement is the sum of the actual measurement and measurement error. Measurement system variation/error can arise from precision or from accuracy. The Gage R&R tool measures the amount of variation in the measurement system; variation could come from the device or from the people. In the diagram below, Ref 1.1 is a classical example of high precision and low accuracy: even though precision is high, the values/points are highly biased and inaccurate. In Ref 1.3, the values/points are both accurate and precise.
     Resolution is pivotal in measurement systems, as it discriminates between measurement values. After looking at resolution, it makes sense to look at accuracy, that is, the distance between the average value and the true value; moving from constant bias to zero bias is the next objective. Linearity is the consistency of the bias over the measurement range. Then comes stability: whether the measurement system produces the same value over time for the same sample. Finally we proceed to precision: repeatability and reproducibility. The primary objective is to find out whether there is variation (either process or appraiser) and then look at total measurement system variation. So the best order for checking variation would be:
     1. Resolution/discrimination against tolerance (the smallest unit of measure of the gage)
     2. Accuracy/bias (closeness of the data to the target value)
     3. Linearity (change in bias across the measurement range)
     4. Stability (change in bias over time)
     5. Precision - repeatability and reproducibility (closeness of values to each other)
     Another view on the order could be: 1. Resolution, 2. Accuracy, 3. Linearity, 4. Precision, 5. Stability.
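The accuracy (bias) and precision (repeatability) checks above can be illustrated with repeated readings of a reference part; the reference value and readings below are invented for the example:

```python
import statistics

reference = 10.00  # known true value of the master part (illustrative)
readings = [10.02, 10.01, 9.99, 10.03, 10.00, 10.01]  # one appraiser, one gage

# Accuracy: distance between the average reading and the true value
bias = statistics.mean(readings) - reference

# Precision (repeatability): spread of repeated readings, same gage and appraiser
repeatability = statistics.stdev(readings)
```

A full Gage R&R study would add several appraisers and parts and split the variance into repeatability and reproducibility components, but the two quantities above are the building blocks.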
  14. IoT (Internet of Things) means connecting things (devices, appliances, utilities, objects, machines, etc.) to the internet. A car gate/barrier opening automatically when you reach your home location, or an air conditioner, washing machine, geyser or TV switching on and off based on a pattern/specification, are examples of IoT. According to recent research, the share of multipurpose RATs (Remote Access Tools) affecting IoT has nearly doubled (6.5% in 2017 to 12.2% in 2018; source: Kaspersky Global Research and Analysis).
     Security concerns of IoT: as multiple devices are connected over the internet, information/data can fall into the wrong hands, resulting in misuse of data and rising security concerns such as:
     • Data privacy and home security
     • Network hacking and Distributed Denial of Service attacks
     • Deliberate radio-frequency jamming
     • Extortion losses and theft of financial information/money
     In short, losses can be physical, digital, economic, psychological, reputational or social. There is also a clear limitation of IoT security: we can't install antivirus software on most IoT devices (smart TVs, internet security cameras), as they do not have adequate computing power to run it. To address these concerns, we can follow some best practices:
     • Create a strong password for each connected device: encrypted, complex and not guessable (viz., not "admin" or "12345")
     • Reset/change passwords at regular intervals, and don't use the same password for all connected devices
     • Enable notifications for any intrusion/invasion, to take rapid action (intrusion prevention)
     • Monitor frequently for suspicious/unusual activity (anomaly detection)
     • Apply regular application updates from hardware vendors for improved security
     • Select devices with built-in security and embedded firmware for IoT connectivity
     IoT has great potential; doing due diligence before investing is wise.
  15. Both the scatter plot and the bubble plot examine the relationship between two variables (an X variable and a Y variable). In a bubble chart, however, the size of each bubble represents the value of a third variable; size can mean the area or the width of the bubble, depending on the input specification. A bubble chart is built upon a scatter plot as its base. The scatter plot and bubble plot below reference the same data points:
     • Scatter Plot 1 - examining the relationship between the Y variable and the X variable
     • Bubble Plot 1 - examining the relationship between the Y variable and the X variable, with bubble size representing the third variable
     Variants: based on the groups, we can have a simple bubble plot or one with groups (Bubble Plot 2 - with groups: 3 categories A, B, C).
     Limitations and misinterpretations:
     • The area/size of a bubble scales proportionally within the plot, relative to the other bubbles rather than to an absolute value, so there is a high chance of misreading a value from the bubble size. In Minitab, however, there is an option to edit the bubble size (Minitab can calculate the size, or we can use the actual value of the size variable)
     • It is more complex to read than a scatter plot
     • It becomes chaotic/confusing when there are many data points (the Bubble Plot 2 referred to above has 50 data points in 3 categories); it is not ideal for large data sets
     • Smaller bubbles are hard to identify (they may be covered/hidden), especially when close to or overlapped by a bigger bubble, so information is lost. Jitter can help reveal overlapping points, but it can confuse the reader, as jitter is generated by a random function (the point is not in the same place each time it is generated)
     • It can be difficult to determine the exact location of data points when bubbles are clustered
     • Without a clear legend, the reader can misinterpret the data points and the relationship
     • Negative size? Any negative/null value of the third variable would not be visible; after all, a shape cannot have negative area
     Data is valuable only if we know how to visualize it and give it context. It is better to select the chart based on the message we want to share with the audience, rather than just going with a chart type.