by Dianna Deeney

QDD 095 After the ‘Storm: Pareto Voting and Screening Methods


We’re in the 4th episode of our series about generating ideas with our team toward action. The first two episodes were all about idea generation. The 3rd was about grouping and exploring ideas.

We’re still assuming we’re just past brainstorming, at the point where we have many ideas and no clear next steps.

Let’s screen our ideas so we can move toward action. We explore these Quality Tools and how to use them after a brainstorming session or other idea-generating team activity:

  • 2×2 chart
  • Systematic list reduction
  • Multivoting or Pareto Voting

 


 


Reminders when evaluating ideas with a team

We need to Mind our Mindset

Recognize that it’s difficult to take ideas from a brainstorming activity and evaluate them into actions for next steps.

We’re handling ideas systematically with our team to get the maximum benefit from our creative phase.

We want to control our itch for a quick decision on the best idea; rushing to one would undermine our efforts toward creativity and innovative ideas.

We aren’t looking to eliminate ideas. We’re looking to develop them into the best solution we think there could be.

 

Yes, we approach these activities in the spirit of developing creative ideas. We say things like, “That’s a great idea, what can we do to make it work?” or “What is it about this idea we can use?”

No, we don’t want to just eliminate ideas. We try to avoid first jumping to say things like, “That’s a great idea, but here’s why it won’t work.”

Discussing Ideas

We’d like consensus on a clear option, which is that place where everyone supports the decision, even if it wasn’t their first choice.

We discuss to clarify ideas. If an idea isn’t clear, then let’s make sure that everyone understands the information about it.

We don’t need to pressure anyone to change votes, but we do need to ensure we’re all voting on the same idea, or the same understanding of an idea.


Screen Ideas

2×2 Chart

We can get a quick comparison with a 2×2 chart, using 2 criteria (see the sketch below this list):

  • Time vs. impact
  • Value vs. effort
  • Risk vs. reward

 

If you’re evaluating a user process, urgent vs. important also works (see our Urgent/Important Priorities Matrix episode).
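To make the sorting concrete, here’s a minimal Python sketch (not from the episode) that places scored ideas into the four windows of a value-vs-effort chart. The idea names, the 1-5 scores, and the midpoint cutoff are made-up placeholders; sticky notes on a whiteboard work just as well.

```python
# Minimal sketch: sort scored ideas into the four windows of a value-vs-effort
# 2x2 chart. Idea names, 1-5 scores, and the midpoint cutoff are placeholders.

ideas = {
    "Redesign the fixture": {"value": 4, "effort": 2},
    "Add an inspection step": {"value": 2, "effort": 1},
    "Switch suppliers": {"value": 5, "effort": 5},
    "Update the work instruction": {"value": 1, "effort": 4},
}

MIDPOINT = 3  # scores above this count as "high"

quadrants = {
    ("high value", "low effort"): [],
    ("high value", "high effort"): [],
    ("low value", "low effort"): [],
    ("low value", "high effort"): [],
}

for name, scores in ideas.items():
    value = "high value" if scores["value"] > MIDPOINT else "low value"
    effort = "high effort" if scores["effort"] > MIDPOINT else "low effort"
    quadrants[(value, effort)].append(name)

for window, members in quadrants.items():
    print(f"{window}: {members if members else '(none)'}")
```

Ideas landing in the high-value, low-effort window are the natural first candidates for the team to discuss.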

Multivoting or Pareto Voting

We’re judging ideas by ranking them instead of using “majority rules”.

How many votes does each member get? Try using the Pareto Principle to decide.
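For example, with 30 ideas, 20% of 30 is 6, so each member gets 6 votes. Here’s a tiny Python sketch of that rule of thumb; rounding up and the floor of one vote are our own assumptions, not from the episode.

```python
import math

def votes_per_member(idea_count: int) -> int:
    """Pareto-style rule of thumb: each member gets roughly 20% of the idea count."""
    # Rounding up, and a floor of one vote, are assumptions added for this sketch.
    return max(1, math.ceil(0.20 * idea_count))

print(votes_per_member(30))  # 6 votes each, matching the episode's example
print(votes_per_member(12))  # 3 votes each (2.4 rounded up)
```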

 


Other Quality during Design podcast episodes you might like:

Using the Pareto Principle and Avoiding Common Pitfalls


Visit asq.org for another overview of multivoting:

“What is Multivoting?” ASQ, asq.org/quality-resources/multivoting. Accessed 10 Jan 2023.


Previous episodes in our series about generating ideas with our team toward action:

Episode 1: Ways to Gather Ideas with a Team

Episode 2: Product Design with Brainstorming, with Emily Haidemenos (A Chat with Cross Functional Experts)

Episode 3: After the ‘Storm: Group and Explore Ideas


Episode Transcript

Hello. This is the fourth episode in our series about generating ideas with our team toward action. The first two episodes, which started at the beginning of January 2023, were all about idea generation. We’re now at the point where we have a lot of ideas and it’s messy and we can’t really define next steps, so now we’re going to take our ideas and start organizing and prioritizing them so we can move toward action. We’ll be talking about ways to screen and select ideas using quality tools. In the last episode, we talked about affinity diagrams, fishbones, and tree diagrams, and how we can use them to better understand and group our ideas. In this episode, let’s talk about something a little bit different, where we’re going to be screening and reducing our choices. More about that after the brief introduction.

Hello and welcome to Quality During Design, the place to use quality thinking to create products others love, for less. Each week we talk about ways to use quality during design, engineering, and product development. My name is Dianna Deeney. I’m a senior-level quality professional and engineer with over 20 years of experience in manufacturing and design. Listen in and then join us. Visit qualityduringdesign.com. Do you know what 12 things you should have before a design concept makes it to the engineering drawing board, where you’re setting specifications? I’ve got a free checklist for you, and you can do some assessments of your own. Where do you stack up with the checklist? You can log into a learning portal to access the checklist and an introduction to more information about how to get those 12 things. To get this free information, just sign up at qualityduringdesign.com. On the homepage, there’s a link in the middle of the page. Just click it and say, “I want it.”

There are many quality tools that are team-based and visual in nature, and this is a good thing because it aligns a team around common goals. It’s visual information, so it’s another way that we can take in information, analyze it, and regroup it for ourselves, and it also helps us get aligned with each other so that we can move toward action. We can see where it is we need to go, or get better ideas of where we need to end up, if it’s mapped out in front of us. Imagine this scenario: we’ve just finished a brainstorming session. We had done some quiet brainstorming and everyone has their stacks of post-it notes or ideas or virtual lists, and we may have even started grouping them using the affinity diagram team sorting method. The affinity diagram exercise helped us to group ideas into like ideas.

It probably also helped us to identify ideas that were duplicated, so that is one way that we can reduce our list of ideas. Now, we need to move from our quantity of ideas to quality ideas that we want to execute. We want to get to our top priorities. Generally, team members individually judge or arrange ideas. Then we discuss as a team and, if we need to, we can judge ideas again. What we’re really doing is using a team approach to finding and choosing solutions, or collecting and deciding on options. We might be looking at design features where we want to consider the different perspectives of all of our internal customers, our end users, our manufacturing friends, our suppliers, and others, or we may be working on a problem that we’re having trouble with and we need others to help us figure it out and look at it from different perspectives.

Those are just a couple examples of why we’re working through this as a team. When we’re evaluating ideas like this as a team, we need to recognize a couple things. We need to understand that evaluating ideas from a brainstorm and deciding which action to take is actually a pretty difficult task. We need to control any itches or desires to get to an idea quickly so we can start solving the problem. We’re not looking for a quick decision on the best idea. Instead, we’re handling ideas systematically to get the maximum benefit from our creative phase, which is the whole reason we’re getting together to do these exercises in the first place. It doesn’t have to take a long time, or it may; either way, we’re going to be intentional with our next steps. When we’re screening ideas, we’re looking for the good things about the ideas. We’re not looking to discard ideas.

Instead of saying, “That’s a great idea, but here’s why it won’t work,” approach it like, “That’s a great idea. What can we do to make it work?” or “What is it about the idea we can use?” It is more challenging to find ways to make an idea work than it is to give in to our original negative reaction about it. This might be something we need to remind ourselves about, and remind our team about, during this exercise. We want to adopt a mindset of “What can we do to make it work?” or “What is it about this idea we can use?” in order to get to the idea that we can use.

We need to reduce our list, and we can do a list reduction activity. It uses simple majority voting, with the option that anyone can keep an idea on the board. It’s systematic and very intentional, one idea at a time. When a majority of the team votes for an idea to be discarded, it’s not taken off the board immediately. It’s bracketed or otherwise put off to the side, and then everyone is asked if anyone wants to keep any of the discarded ideas. If anybody wants to keep a discarded idea, it goes back on the list and the team continues to discuss the ideas.
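To make the mechanics concrete, here’s a minimal Python sketch of how one list-reduction pass could be tallied. The idea names, team size, discard votes, and keep requests are placeholders; in a real session this happens live on the board, one idea at a time.

```python
# Minimal sketch of one list-reduction pass. Idea names, team size, discard
# votes, and keep requests are placeholders; in a real session this happens
# live on the board, one idea at a time.

ideas = ["Idea A", "Idea B", "Idea C"]
team_size = 5

discard_votes = {"Idea A": 1, "Idea B": 4, "Idea C": 3}  # votes to set an idea aside
keep_requests = {"Idea C"}                               # anyone may rescue a bracketed idea

active, bracketed = [], []
for idea in ideas:
    majority_to_discard = discard_votes.get(idea, 0) > team_size / 2
    if majority_to_discard and idea not in keep_requests:
        bracketed.append(idea)   # set aside, not deleted
    else:
        active.append(idea)      # stays on the board for more discussion

print("Still on the board:", active)    # Idea A (no majority), Idea C (rescued)
print("Bracketed for now:", bracketed)  # Idea B
```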

Team discussions are centered around everyone understanding what each idea is, because if the team isn’t on the same page about what the ideas are, we don’t have the same understanding and we’re not voting on the same idea. Remember: “What can we do to make it work, or what is it about this idea we can use?”

If we want a quick comparison of the ideas that we’ve generated, we can use a two-by-two chart. This is something that our expert, Emily, recommended. Choose two criteria against which to measure the ideas. It could be time versus impact, value versus effort, or risk versus reward, and if you’re evaluating a user process, it could also be urgent versus important. We cover that in a previous episode of this podcast and I’ll link to it. To do the two-by-two chart, we’re going to follow steps similar to an affinity diagram. Instead of grouping ideas, we’re placing them on a four-window chart based on high-low values.

If we are not getting to our final decision or selection and we need to narrow our list further into top priorities, then we can vote on ideas, and the technique is called multivoting. Each team member gets a certain number of votes to apply to ideas. In any team dynamic, there are going to be a couple personalities that are dominant in the team. Multivoting allows people to have equal footing on the decisions that are made coming out of our idea-generating activity; everyone has equal participation. Nobody is pressured to change their votes or rank issues a certain way based on what somebody else is saying.

Multivoting is systematic, and here’s how it works. We display our ideas on a board or a virtual board; just make sure everybody can see all the ideas. It’s best if we can see all the ideas together at once. Everyone individually judges the ideas by giving them a vote. Then the team looks at where everybody put their votes and discusses the results.
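As a rough illustration of the tally step, here’s a minimal Python sketch of counting one multivoting round. The member names, ideas, and ballots are placeholders; the point is simply that every member’s votes carry equal weight.

```python
from collections import Counter

# Minimal sketch of tallying one multivoting round. Member names, ideas, and
# ballots are placeholders; each ballot lists the ideas that member voted for.

ballots = {
    "Ana":   ["Idea A", "Idea C", "Idea D"],
    "Ben":   ["Idea A", "Idea B", "Idea C"],
    "Chloe": ["Idea B", "Idea C", "Idea E"],
    "Dev":   ["Idea A", "Idea C", "Idea E"],
}

tally = Counter(idea for votes in ballots.values() for idea in votes)

# Ideas near the top are candidates; ideas with split support are the ones
# to clarify as a team before any repeat round.
for idea, count in tally.most_common():
    print(f"{idea}: {count} of {len(ballots)} members")
```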

Where we’d like to get to in multivoting is consensus, which is that place where everyone supports the decision, even if it wasn’t their first choice. We’re looking for a clear option. If it’s not clear, then again, let’s make sure everyone understands the ideas and the information about some of the ideas. Look at ideas where the team’s votes were split, or those ideas that some ranked the highest and others didn’t. We don’t need to pressure anyone to change votes, but we do need to ensure we’re all voting on the same idea, or the same understanding of an idea. If we need to, we can repeat the multivoting process. We can individually judge the ideas again if we need to, and then discuss the results. If you’ve noticed, we’re sticking with that pattern of individual judging, then stepping back to see what we’ve done as a group, then individually judging again.

There are some judgment calls we need to make about multivoting. The first is that we need to intentionally plan to use multivoting and not pull it out as an emergency measure when the team is in disagreement. We don’t want to have a situation where we can’t move ahead with choosing an idea and we decide to angrily say, “You know what? Let’s just have a vote on this!” That’s not in the true spirit of multivoting, and we’re not reaching the consensus that we want.

The other thing about multivoting is the deciders. Sometimes teams are structured, or they’re working on a problem, where the ultimate decision of whether or not an idea is pursued is left to someone else. It could be a CEO if it’s a small company, or it could be a vice president or a project manager. There is a decider who decides whether or not the idea is pursued. If that’s the case, then they need to be involved. Sometimes they need to be involved in the idea generation because they have a broader viewpoint or different ideas about what this project needs to be able to do in the end, and we need to get that input from them.

The other reason is that we don’t want to spend our team’s time developing ideas that aren’t going to get approved and are going to go nowhere. So sometimes we need to have the deciders involved in our multivoting. Coming out of this multivoting, we want honest decisions. With the decider, we don’t want to let them cede their authority, meaning that during the multivoting team exercises, we don’t want them to say, “Oh, consensus rules, I’m going to go with what the group says,” because chances are they’ll change their mind later to their first pick anyway. A way to handle this, to have the decider involved, is to give them a super vote. In the multivoting procedure, the team is going to pursue the ideas that have super votes: either choosing features from those ideas, combining them, or working with the deciders again to come up with the ultimate idea. If they’re the deciders, it’s their responsibility to ultimately decide, so the team needs to address that and ask them to be the decider.

How many votes does everyone get? We could use the Pareto principle. We cover the Pareto principle in a previous episode. Let me share with you again a little bit about its history and what the Pareto principle is:

“The Pareto principle is named after an Italian economist, Vilfredo Federico Damaso Pareto. In the 1890s, he published a paper that showed about 80% of the land in Italy was owned by 20% of the population. Lore has it that he started seeing that ratio everywhere, even down to the pea plants in his garden. Other people noticed that it seemed to carry through in other things, too: this 80/20 rule. So the Pareto principle is generally that 80% of the output is caused by 20% of the input, or 80% of the consequences come from 20% of the causes, and that 20% of the causes is dubbed the vital few. Now, why would we want to apply this in our engineering and design processes? Well, it’s a tool to separate the vital few factors from the trivial many, or, in plain speak, we want to spend the least amount of effort we need in order to make the biggest effect. A Pareto chart helps us to identify if we’ve got a cause or a short list of causes that we can work hard to solve to fix most of the problem.”

Back to multivoting. Where we apply the Pareto principle is this: if we have a list of ideas, we’re going to count the number of ideas that we have, multiply that by 20%, and that’s the number of votes that everybody gets. For example, if we have a list of 30 ideas, 20% of 30 is six, so each team member gets six votes.

You don’t like the idea of using the Pareto principle to determine how many votes everyone gets? Then we don’t need to use it. There are lots of other alternatives. One is that everyone gets three votes and chooses their top three ideas: their first choice gets three points, their second choice gets two points, and their last choice gets one point. Then tally the numbers. If we don’t want to add numbers, then we can give everyone little round stickers. They can place stickers on their top picks, distributing the stickers the way they want. If you’re using a virtual whiteboard, you could add a little check mark or some other symbol. Team members can put the majority or all of their stickers on the ideas they like the most.
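Here’s a small Python sketch of that three-vote alternative, where first, second, and third choices earn 3, 2, and 1 points. The names and rankings are placeholders for illustration.

```python
from collections import defaultdict

# Minimal sketch of the three-vote alternative: each member ranks their top
# three ideas; first choice earns 3 points, second 2, third 1. Names and
# rankings are placeholders.

rankings = {
    "Ana":   ["Idea C", "Idea A", "Idea E"],  # ordered first -> third choice
    "Ben":   ["Idea A", "Idea C", "Idea B"],
    "Chloe": ["Idea C", "Idea B", "Idea A"],
}

points = defaultdict(int)
for top_three in rankings.values():
    for rank, idea in enumerate(top_three):
        points[idea] += 3 - rank              # 3, 2, then 1 point

for idea, score in sorted(points.items(), key=lambda item: item[1], reverse=True):
    print(f"{idea}: {score} points")
```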

What’s today’s insight to action?

When we’re going from a quantity of ideas to a list of quality ideas that we can maybe execute on, we recognize that evaluating ideas with a team and deciding which action to take is a difficult task, and that we’re not looking for a quick decision. We’re handling ideas systematically so that we can get the maximum benefit from our creative phase. We’re intentional with next steps, and these team approaches give everyone an equal voice. We approach these list reduction and multivoting activities from a mindset of, “What can we do to make this work? Or what is it about this idea we can use?”

We can use a two-by-two chart, with criteria chosen based on the goal of the project, to quickly compare ideas.

We talked about a systematic way to reduce our list of ideas, and we also talked about multivoting as a way to narrow the list of choices and further discuss options.

Now, just a note: after our brainstorming session, we could just head right into a two by two chart or multivoting. We can skip the affinity diagram if we don’t need it, and we can skip the list reduction exercise if we don’t need that either.

Next week, we’ll talk about more ways to make a final decision by using paired comparisons. Thanks for joining me today. I’ll see you then.

If you like this topic or the content in this episode, there’s much more on our website, including information about how to join our signature coaching program, the Quality during Design Journey. Consistency is important, so subscribe to the weekly newsletter. This has been a production of Deeney Enterprises. Thanks for listening.

 

Filed Under: Quality during Design




