Blog Archives

Want To Learn Proper Worst Case Analysis Basics? See Our Latest Article at How2Power.com

A new Design Master article, “Use Worst-Case Analysis Tool To Efficiently Validate Your Designs,” is now available in the latest issue of How2Power.com.

Bulletin: Design Master Analyzer Now Available

The Design Master™ Analyzer (DMA) is a quick and easy fill-in-the-blanks worst case analysis tool. DMA is based on expert templates, allowing less experienced engineers to quickly generate powerful results.

DMA is also designed to be easy to use on iPads and other compact devices.

The Design Master Analyzer is targeted at specific applications with a very simple and easy-to-use format. If you’re an engineering director or project manager, simply provide copies of specific DMA applications to your staff for quick and efficient “fill-in-the-blanks” analyses and receive design validation in minutes rather than weeks.

Although DMA files are usable as provided and are securely locked, a Professional Edition “master” owner can edit or create DMA templates. The DMA engine can also be used to convert any existing Design Master file into the DMA fill-in-the-blanks format. Please inquire about pricing for the DMA engine.

To order, please click here.

Reliability Prediction or Magic 8 Ball? You Decide

Many years ago I was contacted by someone who worked for a very large defense contractor. The gentleman (Mr. X) was responsible for helping ensure that the electronics modules used by his company met stringent reliability requirements, one of which was a minimum allowable Mean Time Between Failures (MTBF). He had read one of our DACI newsletters that mentioned such reliability predictions, and gave me a call.

“My problem,” he said (I paraphrase), “is that MTBF predictions per Military Handbook 217 don’t make any sense.” He subsequently provided detailed backup studies, including data collected from real fielded hardware, which showed that the predicted times to failure did not match the field experience. The predicted numbers were not just too low (as some folks claim for MIL-HDBK-217); they were also too high, or sometimes about right. In other words, they were pretty random, indicating that MIL-HDBK-217 had no more predictive value than you would get by using a Magic 8 Ball.

But that’s not all. “These reliability predictions,” he continued, “are worse than useless, because engineering managers are cramming in heavy heat sinks, or using other cooling techniques, just to drive up the predicted MTBF numbers. The result is a potential decrease in overall system reliability, as well as increased weight and cost, based on this MTBF nonsense.”

Until I heard from Mr. X, I had prepared numerous MTBF reports using MIL-HDBK-217, assuming (what a horrible word, I’ve learned) that the methodology was science-based. After reviewing the data, however, I agreed with Mr. X that MTBFs were indeed nonsense, and said so in the DACI newsletter. This sparked a minor controversy, including a threat from a representative of a reliability firm (one that did a lot of business with the government) that DACI would be “out of business” because of our stance on the issue.

Well, DACI survived. Sadly, though, my impression is that lots of folks today still use MIL-HDBK-217-type cookbook calculations for MTBFs. Such calculations are essentially a waste of money, apart from one important side benefit (which has nothing to do with MTBF predictions): examining components for potential overstress. But that task can be done as part of a good WCA, skipping all of the costly and misleading MTBF pseudoscience.
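For readers who haven’t seen one of these cookbook predictions, the sketch below shows what the MIL-HDBK-217 “parts count” arithmetic boils down to. The part list and failure rates are invented for illustration; real predictions pull base failure rates from handbook tables and scale them with environment and quality factors:

```python
# Minimal sketch of a MIL-HDBK-217-style "parts count" MTBF prediction.
# The part list and failure rates below are invented for illustration;
# real predictions pull base failure rates from handbook tables and
# scale them with environment and quality ("pi") factors.

parts = {
    # part type: (quantity, assumed failure rate in failures per 1e6 hours)
    "ceramic capacitor": (120, 0.0005),
    "film resistor":     (200, 0.0002),
    "op amp":            (12,  0.02),
    "power MOSFET":      (4,   0.05),
}

# Total failure rate is simply the sum of quantity * lambda over all parts...
total_lambda = sum(qty * lam for qty, lam in parts.values())  # per 1e6 hours

# ...and the predicted MTBF is its reciprocal.
mtbf_hours = 1e6 / total_lambda
print(f"Predicted MTBF: {mtbf_hours:,.0f} hours")
```

That’s the entire “science”: add up table lookups and take a reciprocal, which is why the results track field experience so poorly.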

Instead of trying to predict reliability, it’s better to ensure reliability by employing “physics of failure,” the scientific process of studying the chemistry, mechanics, and physics of specific materials and assemblies.
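As one concrete example from the physics-of-failure toolbox, thermally activated failure mechanisms are commonly characterized with the Arrhenius acceleration model. Here’s a minimal sketch; the activation energy and temperatures are illustrative assumptions, not measured values:

```python
import math

# Arrhenius acceleration factor: a common physics-of-failure building block
# for thermally activated failure mechanisms. The activation energy (Ea) and
# temperatures below are illustrative assumptions, not measured values.

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev: float, t_use_c: float, t_stress_c: float) -> float:
    """Acceleration factor between a stress temperature and a use temperature."""
    t_use = t_use_c + 273.15      # convert Celsius to kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# Example: Ea = 0.7 eV, 55 C use environment vs. 125 C accelerated test.
af = arrhenius_af(0.7, 55.0, 125.0)
print(f"Acceleration factor: {af:.1f}")
```

Unlike a handbook lookup, the model parameters here are tied to a specific, measurable failure mechanism in a specific material.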

Bottom line: Skip the handbook-style MTBF nonsense, and use those dollars instead to keep abreast of materials science, as applicable to your specific products. (If for some reason you absolutely must prepare an MTBF report, use a Magic 8 Ball: it will be much quicker and just as accurate.)

P.S. Prior to my education by Mr. X, I had been deeply involved with the electronics design for a very ambitious spacecraft project. Thinking MTBF to be an important metric, I asked the project manager what the preliminary MTBF was for the system. He smiled and asked me to meet him privately.

Later, alone in his office, I was furtively told that the MTBF calculations indicated that the system was doomed to failure, so it had been decreed that the project was not going to use MTBFs. Instead, each system component would be examined on a case-by-case basis to ensure that its materials and assembly were suitable for its intended task. In essence, this can be viewed as an early example of the physics of failure approach. And yes, the mission was a complete success.

-Ed Walker

Oh, No! We Forgot the Bozo Protection (and other Persistent Design Errors)

We’ve contributed to hundreds of electronics design projects wherein the circuitry was subjected to rigorous WCA+ (WCA+ is our advanced version of Worst Case Analysis; see “Four Costly Myths About WCA”). Our analyses invariably detected various design deficiencies, both stress and functional. Unfortunately, like an annoying relative who can’t get the hint to please not visit again, some common problems that we were finding decades ago are still regularly popping up in today’s new designs. These include:

  • Lack of protection from Bozo the Clown: inadequate ESD protection; connectors without reverse-polarity keying; identical connectors for all ports (you don’t expect Bozo to pay any attention to cable labels or connector colors, do you?); no spills/immersion protection (e.g., coffee, slurpees, beer, or even juice from a steak being thawed on top of a warm electronics unit (no kidding)).
  • Transient protection devices (TPDs) not present at circuit interfaces. Not just the AC power and load interfaces, but all the internal interfaces that are exposed to ESD or potentially unruly test equipment during testing, particularly for costly subassemblies. We’ve seen a hugely expensive and schedule-critical board blown up by a test instrument failure, a disaster that could have been prevented by a few bucks’ worth of TPDs.
  • Failure to account for dissimilar power supply voltages, causing interface overdrive and/or latchup; see the corner-check sketch after this list. (Sometimes this only occurs during transient conditions, making the deficiency hard to catch during testing. You will typically learn about it after you’ve shipped a few thousand units and your boss is frantically paging you to get back to work after you’ve had too many beers and the last thing you want is to work through the night and the weekend on warranty repairs while angry customers are screaming at you on the phone…but I digress…)
  • Inadequate ratings for AC mains rectifiers and other power components, particularly in switchmode supplies. Hint: Don’t completely rely on SPICE or other simulations to identify realistic worst case performance boundaries for these components. Or do, but then be sure not to provide a warranty with your product.
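To illustrate the dissimilar-supply pitfall above, here’s a minimal worst-case corner check for a 5 V output driving a 3.3 V part’s input. The supply tolerances and the “Vcc + 0.3 V” absolute-maximum rule are assumptions for illustration; use the actual numbers from your datasheets:

```python
# Worst-case corner check for a mixed-supply digital interface: a 5 V
# output driving a 3.3 V part's input. Tolerances and the "Vcc + 0.3 V"
# absolute-maximum rule are illustrative assumptions, not datasheet values.

V5_NOM, V5_TOL = 5.0, 0.05     # 5 V rail, +/-5%
V33_NOM, V33_TOL = 3.3, 0.05   # 3.3 V rail, +/-5%
ABS_MAX_MARGIN = 0.3           # assumed input abs-max rating: Vcc + 0.3 V

# Worst case for the receiver: driver rail at its HIGH corner,
# receiver rail at its LOW corner.
v_drive_max = V5_NOM * (1 + V5_TOL)                          # 5.25 V
v_input_abs_max = V33_NOM * (1 - V33_TOL) + ABS_MAX_MARGIN   # 3.435 V

if v_drive_max > v_input_abs_max:
    print(f"OVERSTRESS: {v_drive_max:.2f} V into a {v_input_abs_max:.2f} V abs-max input")
else:
    print("Interface OK at worst-case corners")
```

Note that the nominal case (5.0 V into a 3.6 V abs-max input) already fails here; the corner check simply quantifies how badly, before a single unit is built.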

For some more tips, see page 210 of The Design Analysis Handbook, still very relevant after all these years. (Note: We’re out of copies of the Revised Edition, but it’s still available from Amazon and Elsevier.)

P.S. We’re considering creating some low-cost mini-modules of our Design Master WCA+ software, configured for common design tasks such as proper TPD selection, op amp gain stage analysis, etc. (If you care to comment, your feedback will be appreciated and will help us make a decision. You can add a comment to this post, or email us at daci@daci-wca.com.)

Thanks.
-Ed Walker

Four Costly Myths About Worst Case Analysis

Myth #1: Worst Case Analysis (WCA) is a rigidly defined mathematical method of determining the limits of performance of a design.

There are actually a few different types of WCA, primarily:

  • Extreme Value Analysis (EVA)
  • Statistical Analysis (Monte Carlo)
  • WCA+

WCA+ is safer than Monte Carlo and more practical than EVA. Monte Carlo can miss small but important extreme values, and EVA can result in costly overdesign. WCA+ identifies extreme values that statistical methods can miss, and then estimates the probability that the extreme value will exceed specification limits, thereby providing the designer with a practical risk-assessment metric. WCA+ also generates normalized sensitivities and optimization data, which can be used for design centering. (Ref. http://daci-wca.com/products_005.htm)
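To make the EVA-versus-Monte-Carlo contrast concrete, here’s a toy comparison on a resistive divider; the component values and 1% tolerances are illustrative assumptions, not from any DACI example:

```python
import random

# Toy comparison of Extreme Value Analysis (EVA) and Monte Carlo on a
# resistive divider, Vout = Vin * R2 / (R1 + R2). Component values and
# the 1% tolerances are illustrative assumptions.

VIN = 10.0
R1_NOM, R2_NOM, TOL = 10_000.0, 10_000.0, 0.01

def vout(r1: float, r2: float) -> float:
    return VIN * r2 / (r1 + r2)

# EVA: evaluate every tolerance corner and take the extremes.
corners = [vout(R1_NOM * (1 + s1 * TOL), R2_NOM * (1 + s2 * TOL))
           for s1 in (-1, 1) for s2 in (-1, 1)]
print(f"EVA bounds: {min(corners):.4f} .. {max(corners):.4f} V")

# Monte Carlo: sample within tolerance. With a few thousand samples the
# observed extremes almost always fall short of the true EVA bounds --
# the "small but important extreme values" a sampling method can miss.
samples = [vout(random.uniform(R1_NOM * (1 - TOL), R1_NOM * (1 + TOL)),
                random.uniform(R2_NOM * (1 - TOL), R2_NOM * (1 + TOL)))
           for _ in range(5000)]
print(f"Monte Carlo observed: {min(samples):.4f} .. {max(samples):.4f} V")
```

Even on this two-component circuit the sampled extremes fall inside the true bounds; on a real design with hundreds of tolerances, the gap gets worse.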

Myth #2: Worst Case Analysis is optional if you do a lot of testing

To maintain happy customers and minimize liability exposure, the effects of environmental and component variances on performance must be thoroughly understood. Testing alone cannot achieve this understanding, because testing — for economic reasons — is usually performed on a very small number of samples. Also, since testing is typically performed on a short schedule, the effects of long-term aging will not be detected.

Myth #3: Worst Case Analysis is optional if we vary worst case parameters during testing

Initial tolerances typically play a substantial role in determining worst case performance. Such tolerances, however, are not affected by heating/cooling the samples, varying the supply voltages, varying the loads, etc.

For example, a design might have a dozen functional specs and a dozen stress specs (in practice these numbers are usually much higher). To expose worst case performance, some tolerances may need to be at their low values for some of the specs, but at their high or intermediate values for other specs. First, it’s not even likely that a tolerance will be at its worst case value for any single spec. Second, it’s impossible for the tolerance to simultaneously be at the different values required to expose worst case performance for all the specs. Therefore it’s not valid to expect a test sample to serve as a worst case performance predictor, regardless of the number of temperature cycles, voltage variations, etc. that are applied to the sample.
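Here’s a minimal sketch of the conflict, using the same toy divider as in Myth #1; the values and specs are invented for illustration:

```python
# Why one physical sample can't be worst case for every spec at once.
# Same toy divider as in Myth #1; values and specs are invented for
# illustration.

VIN = 10.0
R1_NOM, R2_NOM, TOL = 10_000.0, 10_000.0, 0.01

# Spec A (maximum output voltage) is worst when R1 is LOW and R2 is HIGH:
vout_max = VIN * (R2_NOM * (1 + TOL)) / (R1_NOM * (1 - TOL) + R2_NOM * (1 + TOL))

# Spec B (maximum divider current, a stress spec) is worst when BOTH are LOW:
i_max = VIN / (R1_NOM * (1 - TOL) + R2_NOM * (1 - TOL))

# R2 must sit at its HIGH corner for Spec A but its LOW corner for Spec B,
# so no single sample can expose worst case for both specs simultaneously.
print(f"Spec A worst case Vout: {vout_max:.4f} V (R1 low, R2 high)")
print(f"Spec B worst case current: {i_max * 1000:.4f} mA (R1 low, R2 low)")
```

With two specs and one shared tolerance the conflict is already unavoidable; with dozens of specs and hundreds of tolerances, a physical “worst case sample” simply cannot exist.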

Myth #4: Worst Case Analysis is best done by statistics experts

No, it is far better to have WCA performed — or at least supervised — by experts in the design being analyzed, using a practical tool like WCA+ that employs minimal statistical mumbo-jumbo. Analyses (particularly cookbook statistical ones), when applied by those without expertise in the design being analyzed, often yield hilariously incorrect results.

-Ed Walker