by Dianna Deeney

QDD 079 Ways to Partner with Test Engineers


We have test results but realize that testing didn’t go as planned.

What can we do to help prevent this scenario?

We talk about ways to partner with test engineers and test technicians and the importance of still maintaining their independence.

 


Proactive things design engineers can do:

  • Partner with the test engineers and test technicians when developing the test plan. Get their insight.
  • Check in during testing: briefly make yourself available for questions, partner with your Quality Engineer or Reliability Engineer to check in, or let the test lab folks know that you’re available. Be available during the test while letting them maintain their independence.
  • After a round of tests (no matter where in the development process), host a mini lessons-learned with the test engineers and technicians. Prepare for the next round of tests with what you’ve learned this time (one way to track those items is sketched after this list).
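
The mini lessons-learned in the last bullet pays off when the open items actually carry forward into the next round's test planning. As a minimal sketch (purely illustrative; the class names, fields, and example entry below are assumptions, not anything prescribed in the episode), here is one way a team could log those items in Python so that unresolved ones surface at the next test-plan review:

from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class LessonLearned:
    raised_by: str        # e.g., "test technician" or "test engineer"
    description: str      # what didn't go as planned during the test
    action: str           # protocol update, new fixture, method re-validation, etc.
    resolved: bool = False

@dataclass
class TestRoundReview:
    round_name: str
    review_date: date
    lessons: List[LessonLearned] = field(default_factory=list)

    def open_items(self) -> List[LessonLearned]:
        # Items to address before the next round of testing
        return [item for item in self.lessons if not item.resolved]

# Hypothetical example: one finding from a benchtop trial run, carried into verification planning
review = TestRoundReview("Benchtop trial run", date(2023, 3, 1))
review.lessons.append(LessonLearned(
    raised_by="test technician",
    description="Standard clamp could not hold the part; improvised fixturing was used",
    action="Design and validate a dedicated fixture before verification testing",
))
for item in review.open_items():
    print(f"Carry forward: {item.description} -> {item.action}")

A shared spreadsheet works just as well; the point is that each item has an action and a status that gets reviewed before the next round of testing.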

 


Other Quality during Design podcast episodes you might like:

Design Input & Specs vs. Test & Measure Capability

Choosing a Confidence Level for Test using FMEA

The Way We Test Matters

How Many Do We Need To Test?

 

Episode Transcript

Things didn’t go as planned in the test lab. Now we have destroyed parts and results that aren’t as useful as we thought they’d be. We may need to do extra analysis or retest new parts. What could we do next time to prevent this outcome? Let’s talk about some options after this brief introduction.

Hello and welcome to Quality during Design, the place to use quality thinking to create products others love for less. My name is Dianna. I’m a senior-level quality professional and engineer with over 20 years of experience in manufacturing and design. Listen in, and then join the conversation at qualityduringdesign.com.

We sometimes develop complicated test plans in the name of reducing the number of parts we need to test or saving test time. We have parallel test tracks and sequential testing that is done on one part. We cut our parts apart and test individual components separately.

We submitted our parts to the test lab, our testing is done, and we now have the broken or damaged parts, which we expected. We also have test results. When we plot the data (because we will plot the data, we always do), it looks a little off, there are some curiosities in it, or it doesn’t look like what we thought it would, and we see that there are notes from the test technician about how the test went. So we visit the test area and make an appointment with our test technician and test engineers. We find out that there were problems during the test. They had to change fixturing to even be able to test the part, or they had to retest parts multiple times because, for whatever reason, the part was slipping or there was a hiccup in the test equipment.

Maybe a solution wasn’t mixed properly, but it was used anyway because of timing. Test equipment wasn’t available, so a substitution was made, or parts had to sit on a shelf between tests that we expected to be done back to back. These are all real issues that test engineers and test technicians have to work through. They have their own jobs, they need to meet deadlines, and they need to fit all the testing they’re getting from multiple projects into a schedule. All of these things can affect the results of our test, introducing variables that we hadn’t even considered when we were creating our complicated test plans. We want tests to be performed without a problem, but that’s rarely the case. We do want to avoid those exceptional conditions where we planned to do the test one way and the test ended up being performed in a different way.

Things are going to happen, but there are things we can do as design engineers to reduce these problems. One of them is to involve the test engineers and test technicians in our test planning. They will point out concerns and issues. They know their area, and they know where there could be a timeline crunch: “If you want this test equipment for your project, that’s going to be a problem because this other project takes priority. We’re going to need to make a substitution for your test.” That would be something good to know ahead of time. Their world is testing and test methods. They’ve tested other parts, they’ve been around for a while, and they’ve performed many different tests. They have a perspective about testing that could benefit your test plan and the test results of your product: “We’ve tested parts like this in this way before, and these are the problems that we had.

“We should really address that before we test your parts. Your parts aren’t going to fit into the typical test fixturing that we have. I would have to clamp them a special way, and that could cause a problem. You may need to create a test fixture just for your product.” These are the kinds of things that test engineers and test technicians will be able to tell you about your product. Setting up some time with them and approaching them with these questions shows the respect we have for what they do by involving the test people in the planning. We’re also not just another engineer piling work onto their schedule: “We’re Joe, working on Project Awesome and doing final design verifications.” Test engineers and test technicians are partners in evaluating our products in design. Now, they’re professionals at what they do. They’re not going to fudge numbers because they like you. If they do, then I would question the validity of any test results I’ve gotten from them before, and I would get someone else to test my parts. We want test engineers and technicians to independently evaluate and test our products, but we also need to be able to work with them to develop a test that is more likely to avoid those exceptional conditions.

Another thing we can do to avoid getting results that are iffy, or not what we really wanted, is to show up with curiosity while the test is happening. We can check in and ask: “How’s it going? Do you have any questions? Are you running the test per the protocol, or have you needed a workaround?” Many times I’ve checked in with a test technician and there was a question or a surprise happening in the lab, and this is despite the team’s best planning. It’s going to happen. Now, I don’t hang around. I check in once, let them know that I’m available, and then I go away. I let them do their job. If you’re uncomfortable doing this and you think you’re going to negatively affect their independence during the test, then partner with your quality engineer or your reliability engineer. Ask them to check in during testing to make sure that there aren’t any questions or any hiccups that need to be addressed. Another option is to let the test lab, the test technicians, and the test engineers know that you are available for questions during the test.

These are all proactive ways to shape how we design our test plan and how we can help manage the test as it’s happening without influencing the results. After the testing is all done, there’s something else we can do. At the end of a design development phase, or some place where it’s natural in the product development process, we can do a mini lessons-learned. We can meet with our team and the test engineers or test technicians and ask some questions. Were there ways that we tested that didn’t work out? Are there protocols we need to update? Are there new fixtures or tools we need to create? Do we need to redo any test method validations because we’re really testing the limits of the tests we’re doing today? Even if our testing was benchtop or trial runs, we can still do this mini lessons-learned. That’ll help us with the test plans we’re developing for later in the product development process, like verification testing. This will help set ourselves up for success the next time.

So what’s today’s insight to action? Work with your test engineer and test technician. They have insights into what will work and what won’t. Give them an opportunity to share them. Consider the test lab people partners in product design while respecting their need to be independent.

If you like the content in this episode, visit qualityduringdesign.com, where you can subscribe to the weekly newsletter to keep in touch. This has been a production of Deeney Enterprises. Thanks for listening!

 

Filed Under: Quality during Design
