
Myron Tribus’ UCLA Statistical Thermodynamics class introduced me to entropy, -SUM[p(t)ln(p(t))], where p(t) is the probability of state t of a system. Professor Tribus later advocated maximum-entropy reliability estimation, because it “…best represents the current state of knowledge about a system…” [Principle of maximum entropy – Wikipedia] Caution! This article contains statistical neurohazards.
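The entropy formula above can be sketched in a few lines of Python. The distributions here are made up for illustration; the point is only that -SUM[p(t)ln(p(t))] is largest when probability is spread evenly across states, which is why maximum entropy represents the least-committed state of knowledge.

```python
import math

def entropy(p):
    """Entropy -SUM[p(t)ln(p(t))] of a discrete distribution, in nats."""
    return -sum(pt * math.log(pt) for pt in p if pt > 0)

# Among distributions over 4 states, the uniform one has maximum entropy.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]
print(entropy(uniform))  # ln(4) ≈ 1.386
print(entropy(skewed))   # ≈ 0.940
```

Using log base 2 instead of the natural log gives the same quantity in bits, Shannon’s unit of information.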
Claude Shannon showed that entropy (log base 2) measures information in bits, “…an absolute mathematical limit on how well data from the source can be losslessly compressed onto a perfectly noiseless channel.” [Beirlant et al.]
Maximum likelihood estimation is one way to estimate reliability from data. It maximizes the joint probability density of the observed data, PRODUCT[p(t)], e.g., for observed failures at ages t. That is equivalent to maximizing the log likelihood SUM[ln(p(t))]. Maximum-entropy reliability estimation maximizes the entropy -SUM[p(t)ln(p(t))], which is the expected value of the negative log likelihood -ln(p(t)). Fine, if you have life data: ages at failure t, censored or not. [Read more…]
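The maximum-likelihood step described above can be sketched with a simple life-data example. The failure ages below are hypothetical, and the exponential life distribution is an assumption made for illustration; its log likelihood SUM[ln(p(t))] has the well-known closed-form maximizer lam = n / SUM[t].

```python
import math

# Hypothetical failure ages (hours) -- illustrative data, no censoring.
ages = [120.0, 340.0, 95.0, 410.0, 230.0]

def log_likelihood(lam, data):
    """SUM[ln(p(t))] for the exponential density p(t) = lam*exp(-lam*t)."""
    return sum(math.log(lam) - lam * t for t in data)

# For the exponential model, the log likelihood is maximized in closed
# form at lam = n / SUM[t], the reciprocal of the mean age at failure.
lam_mle = len(ages) / sum(ages)

# The closed-form estimate beats nearby candidate failure rates.
for candidate in (0.5 * lam_mle, lam_mle, 2.0 * lam_mle):
    print(candidate, log_likelihood(candidate, ages))
```

With censored ages, the survivor function would replace the density for the unfailed units, but the principle of maximizing the (log) likelihood is unchanged.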

If you have invested the time to lay out the storeroom correctly and gather the right data, you are on the right track to a successful storeroom. However, if you don’t take the time to map the various processes in the storeroom, and hold staff to those processes, the work done so far will be wasted.
