Genichi Taguchi

Gen’ichi Taguchi (born January 1, 1924 in Tokamachi, Japan) is an engineer and statistician. From the 1950s onwards, Taguchi developed a methodology for applying statistics to improve the quality of manufactured goods. Taguchi methods have been controversial among some conventional Western statisticians but others have accepted many concepts as valid extensions to the body of knowledge.

Life
Taguchi was raised in the textile town of Tokamachi where he initially studied textile engineering with the intention of entering the family kimono business. However, with the escalation of World War II, in 1942, he was drafted into the Astronomical Department of the Navigation Institute of the Imperial Japanese Navy.
After the war, in 1948, he joined the Ministry of Public Health and Welfare, where he came under the influence of the eminent statistician Matosaburo Masuyama, who kindled his interest in the design of experiments. During this time he also worked at the Institute of Statistical Mathematics and supported experimental work on the production of penicillin at Morinaga Pharmaceuticals, a Morinaga Seika company.
In 1950, he joined the Electrical Communications Laboratory (ECL) of the Nippon Telegraph and Telephone Corporation, just as statistical quality control was beginning to become popular in Japan under the influence of W. Edwards Deming and the Japanese Union of Scientists and Engineers. ECL was engaged in a rivalry with Bell Labs to develop crossbar and telephone switching systems, and Taguchi spent twelve years there developing methods for enhancing quality and reliability. Even at this point, he was beginning to consult widely in Japanese industry, with Toyota being an early adopter of his ideas.
During the 1950s, he collaborated widely and in 1954-1955 was visiting professor at the Indian Statistical Institute where he worked with R. A. Fisher and Walter A. Shewhart.
On completing his doctorate at Kyushu University in 1962, he left ECL, though he maintained a consulting relationship. In the same year he visited Princeton University under the sponsorship of John Tukey, who arranged a spell at Bell Labs, his old ECL rivals. In 1964 he became professor of engineering at Aoyama Gakuin University, Tokyo. In 1966 he began a collaboration with Yuin Wu, who later emigrated to the U.S. and, in 1980, invited Taguchi to lecture there. During this visit, Taguchi himself financed a return to Bell Labs, where his initial teaching had made little enduring impact. This second visit began a collaboration with Madhav Phadke and growing enthusiasm for his methodology at Bell Labs and elsewhere, including Ford Motor Company, Xerox and ITT.
Since 1982, Genichi Taguchi has been an advisor to the Japanese Standards Institute and executive director of the American Supplier Institute, an international consulting organisation.

Contributions
Loss functions
Taguchi’s reaction to the classical design of experiments methodology of R. A. Fisher was that it was perfectly adapted in seeking to improve the mean outcome of a process. As Fisher’s work had been largely motivated by programmes to increase agricultural production, this was hardly surprising. However, Taguchi realised that in much industrial production, there is a need to produce an outcome on target, for example, to machine a hole to a specified diameter or to manufacture a cell to produce a given voltage. He also realised, as had Walter A. Shewhart and others before him, that excessive variation lay at the root of poor manufactured quality and that reacting to individual items inside and outside specification was counter-productive.
He, therefore, argued that quality engineering should start with an understanding of the cost of poor quality in various situations. In much conventional industrial engineering the cost of poor quality is simply represented by the number of items outside specification multiplied by the cost of rework or scrap. However, Taguchi insisted that manufacturers broaden their horizons to consider cost to society. Though the short-term costs may simply be those of non-conformance, any item manufactured away from nominal would result in some loss to the customer or the wider community through early wear-out; difficulties in interfacing with other parts, themselves probably wide of nominal; or the need to build-in safety margins. These losses are externalities and are usually ignored by manufacturers. In the wider economy the Coase Theorem predicts that they prevent markets from operating efficiently. Taguchi argued that such losses would inevitably find their way back to the originating corporation (in an effect similar to the tragedy of the commons) and that by working to minimise them, manufacturers would enhance brand reputation, win markets and generate profits.
Such losses are, of course, very small when an item is near to nominal. Donald J. Wheeler characterised the region within specification limits as that in which we "deny that losses exist". As we diverge from nominal, losses grow until the point where they are "too great to deny" and the specification limit is drawn. All these losses are, as W. Edwards Deming would describe them, "unknown and unknowable", but Taguchi wanted to find a useful way of representing them within statistics. Taguchi specified three situations:
1. Larger the better (for example, agricultural yield);
2. Smaller the better (for example, carbon dioxide emissions); and
3. On-target, minimum-variation (for example, a mating part in an assembly).
The first two cases are represented by simple monotonic loss-functions. In the third case, Taguchi adopted a squared-error loss function on the following grounds:
* It is the first symmetric term in the Taylor series expansion of any reasonable, real-life loss function, and so is a “first-order” approximation;
* Total loss is measured by the variance. As variance is additive, it is an attractive model of cost; and
* There was an established body of statistical theory around the use of the least-squares principle.
The squared-error loss function had been used by John von Neumann and Oskar Morgenstern in the 1930s.
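The on-target case above can be made concrete with a short sketch of the quadratic loss L(y) = k(y − m)², where m is the target value and k converts squared deviation into cost. The hole-machining numbers and the cost constant below are illustrative assumptions, not figures from Taguchi:

```python
def quadratic_loss(y, target, k):
    """Taguchi-style quadratic loss: cost of a characteristic y deviating from target."""
    return k * (y - target) ** 2

# Hypothetical example: a hole machined to a nominal 10.0 mm diameter.
# Suppose a 0.5 mm deviation triggers a rework cost of 2.00 currency units,
# which fixes k = 2.00 / 0.5**2 = 8.0 per mm^2.
k = 2.00 / 0.5 ** 2

print(quadratic_loss(10.0, 10.0, k))  # exactly on target: zero loss
print(quadratic_loss(10.1, 10.0, k))  # small deviation: small but non-zero loss
print(quadratic_loss(10.5, 10.0, k))  # at the traditional spec limit: full rework cost
```

Unlike the step-function view of cost (zero inside specification, full cost outside), the loss here grows smoothly from zero at nominal, so even in-specification items carry some loss.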
Though much of this thinking is endorsed by statisticians and economists in general, Taguchi extended the argument to insist that industrial experiments seek to maximise an appropriate signal-to-noise ratio, representing the magnitude of the mean of a process compared to its variation. Most statisticians believe Taguchi's signal-to-noise ratios to be effective over too narrow a range of applications, and they are generally deprecated.
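The three situations listed earlier each have a standard signal-to-noise ratio, expressed in decibels so that larger is always better. A sketch of the usual textbook forms follows; the sample data are invented for illustration:

```python
import math

def sn_larger_the_better(ys):
    """S/N for larger-the-better responses: -10 log10( mean of 1/y^2 )."""
    return -10 * math.log10(sum(1 / y ** 2 for y in ys) / len(ys))

def sn_smaller_the_better(ys):
    """S/N for smaller-the-better responses: -10 log10( mean of y^2 )."""
    return -10 * math.log10(sum(y ** 2 for y in ys) / len(ys))

def sn_nominal_the_best(ys):
    """S/N for on-target responses: 10 log10( mean^2 / sample variance )."""
    n = len(ys)
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    return 10 * math.log10(mean ** 2 / var)

yields = [52.1, 49.8, 51.5]      # e.g. agricultural yield (larger the better)
emissions = [0.12, 0.09, 0.11]   # e.g. CO2 emissions (smaller the better)
voltages = [1.49, 1.51, 1.50]    # e.g. cell voltage near a 1.5 V target
print(sn_larger_the_better(yields),
      sn_smaller_the_better(emissions),
      sn_nominal_the_best(voltages))
```

In Taguchi's scheme the experimenter would choose control-factor settings that maximise the appropriate ratio; the nominal-the-best form rewards a high mean relative to variation, which is exactly the property most statisticians argue holds only in special cases.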

Off-line quality control
Taguchi realised that the best opportunity to eliminate variation is during design of a product and its manufacturing process (Taguchi’s rule for manufacturing). Consequently, he developed a strategy for quality engineering that can be used in both contexts. The process has three stages:
1. System design;
2. Parameter design; and
3. Tolerance design.
System design
This is design at the conceptual level involving creativity and innovation.
Parameter design
Once the concept is established, the nominal values of the various dimensions and design parameters need to be set, the detail design phase of conventional engineering. Taguchi’s radical insight was that the exact choice of values required is under-specified by the performance requirements of the system. In many circumstances, this allows the parameters to be chosen so as to minimise the effects on performance arising from variation in manufacture, environment and cumulative damage. This is sometimes called Robustification.
Tolerance design
With a successfully completed parameter design, and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions (see Pareto principle).
Design of experiments
Taguchi developed much of his thinking in isolation from the school of R. A. Fisher, only coming into direct contact in 1954. His framework for design of experiments is idiosyncratic and often flawed but contains much that is of enormous value. He made a number of innovations.
Outer arrays
Unlike the design of experiments work of R. A. Fisher, Taguchi sought to understand the influence that parameters had on variation, not just on the mean. He contended, as had W. Edwards Deming in his discussion of analytic studies, that conventional sampling is inadequate here as there is no way of obtaining a random sample of future conditions. In R. A. Fisher’s work, variation between experimental replications is a nuisance that the experimenter would like to eliminate whereas, in Taguchi’s thinking, it is a central object of investigation.
Taguchi’s innovation was to replicate each experiment by means of an outer array, possibly an orthogonal array that seeks deliberately to emulate the sources of variation that a product would encounter in reality. This is an example of judgement sampling. Though statisticians following in the Shewhart-Deming tradition have embraced outer arrays, many academics are still skeptical.
Later innovations in outer arrays resulted in ‘compounded noise’: a few noise factors are combined to create just two levels in the outer array, one combining the noise settings that drive the output lower and one combining those that drive it higher. This still emulates the extremes of noise variation but requires fewer test samples.
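A minimal sketch of a crossed inner/outer array with compounded noise: every inner-array (control-factor) setting is exposed to the same two noise extremes, and the spread of the responses reveals which settings are robust. The two control factors, the noise model and the response function below are invented for illustration and are not from Taguchi's published designs:

```python
from itertools import product

# Inner array: all 2^2 settings of two hypothetical control factors a and b.
inner_array = list(product([0, 1], repeat=2))

# Outer array: compounded noise reduced to two extreme levels, low and high.
outer_array = [-1, +1]

def response(controls, noise):
    """Stand-in for a real measurement; b reduces sensitivity to noise."""
    a, b = controls
    return 10 + 2 * a + noise * (3 - 2 * b)

# For each control setting, summarise behaviour across the noise extremes.
results = {}
for controls in inner_array:
    ys = [response(controls, z) for z in outer_array]
    results[controls] = (sum(ys) / len(ys), max(ys) - min(ys))  # (mean, spread)

for controls, (mean, spread) in results.items():
    print(controls, mean, spread)
```

In this made-up model, settings with b = 1 show the same mean but a much smaller noise-induced spread, which is precisely the control-by-noise interaction that parameter design exploits: use the robust factor to shrink variation, then use a factor that affects only the mean to move the output onto target.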
Management of interactions
Many of the orthogonal arrays that Taguchi has advocated are saturated, allowing no scope for estimation of interactions. This is a continuing topic of controversy. However, this is only true for ‘control factors’ or factors in the ‘inner array’. By combining an inner array of control factors with an outer array of ‘noise factors’, Taguchi’s approach provides full information on control-by-noise interactions. His concept is that those are the interactions of most interest in achieving a design that is robust to noise factor variation. In this sense, the Taguchi approach provides more complete interaction information than typical fractional factorial experiments.
Followers of Taguchi argue that the designs offer rapid results and that interactions can be eliminated by proper choice of quality characteristic and by transforming the data. That notwithstanding, a confirmation experiment offers protection against any residual interactions. If the quality characteristic represents the energy transformation of the system, then the likelihood of control-factor-by-control-factor interactions is greatly reduced, since energy is additive.
Analysis of experiments
Taguchi introduced many methods for analysing experimental results including novel applications of the analysis of variance and minute analysis. Little of this work has been validated by Western statisticians.
Assessment
Genichi Taguchi has made seminal and valuable methodological innovations in statistics and engineering, within the Shewhart-Deming tradition. His emphasis on loss to society, his techniques for investigating variation in experiments, and his overall strategy of system, parameter and tolerance design have been massively influential in improving manufactured quality worldwide. Much of his work was carried out in isolation from the mainstream of Western statistics and, while this may have facilitated his creativity, much of the technical detail of Taguchi methods and their benefits to experimentation and research is only now being studied in the West.
Honours
* Indigo Ribbon from the Emperor of Japan
* Willard F. Rockwell Medal of the International Technology Institute
* Honorary member of the Japanese Society of Quality Control
* Shewhart Medal of the American Society for Quality, (1995)
* Honored as a Quality Guru by the British Department of Trade and Industry (1990)
