ASA Connect


Some thoughts on W. Edwards Deming in Statistics

  • 1.  Some thoughts on W. Edwards Deming in Statistics

    Posted 07-14-2019 14:19
    Edited by Jonathan Siegel 08-02-2019 16:59
    One of the sessions at JSM this year is a retrospective on Deming's statistical legacy. The ASA has various associations with Deming. Each year there is a Deming lecture. But I think we rarely talk about Deming. He is often thought of as a management consultant rather than as a statistician or a philosopher. To the extent he is perceived as a statistician at all, his contributions are often perceived as a few practical applications, things like the control chart. Many of his other practical ideas that have been adapted, like the plan-do-study-act cycle, are thought of as outside the field of statistics entirely, as having nothing to do with statistical thinking.

    I think Deming's most fundamental contribution was quite different: a core philosophical challenge to the foundations of statistics, one which I think recent decades have generally sustained and from which statistics as a profession hasn't really recovered. We haven't changed so much. As Winston Churchill put it, every now and then a man [sic] stumbles on the truth, but most of us manage to pick ourselves up and keep going anyway. But our field is starting to change, and I think Deming's real legacy is starting to have an impact, although still a very slight one. So I think it's worth thinking about the challenge.

    1. Science vs. math. I think Deming's first and most foundational challenge is to think of the field as a science of variation, and of how to acquire knowledge in the face of variation, rather than a branch of mathematics. Mathematics deduces consequences from assumed truths. Science attempts to generalize and predict from what we observe. I think the best explanation of the practical difference is Lawrence Summers' [sic] distinction between what he called the "smart people," econometricians and the like who had lots of fancy mathematics to back up their theories and were the obviously right people to follow when he was a grad student if one wanted to be thought smart and to advance one's career, and the "stupid people," sociologists and such, who had only a few empirical observations they were unable to fully explain and were therefore obviously wrong. As Dr. Summers recounted, our world today looks much more like what the stupid people predicted decades ago than what the smart people predicted. Deming would say this means the stupid people weren't so stupid. And maybe the smart people weren't so smart.

    2. Standard statistical assumptions often don't hold. Based on his background in physics, Deming recognized that the complex world of social and biological systems is highly interactive, non-linear, and heavy-tailed. Exchangeability assumptions in particular - assumptions that the order of observations doesn't matter - can be especially untenable. Thus the thinking that we now associate with the control chart was once an inquiry into establishing the conditions under which statistical assumptions and associated inferences approximately hold. This occurs if you can show - or intervene so as to make - the underlying process stable over time. This work is today most associated with achieving uniform product for quality purposes. But that's simply an application. The underlying work is epistemological, aimed at the question of under what conditions we can use statistical methods to obtain reliable knowledge. Deming's key insight is that these conditions are rarely the natural state of complex systems. Intervention is usually required to achieve them.
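
    To make the epistemological point concrete, here is a toy illustration of my own (a Python sketch, not anything from Deming or Shewhart): a textbook confidence interval computed from past data behaves roughly as advertised when the underlying process is stable, and quietly stops meaning what we think it means when the process drifts even slightly.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def coverage_of_future_level(drift, n_rep=2000, n=30):
        """How often does a textbook 95% CI, computed from the past n points,
        cover the true process level over the *next* n points?  With drift = 0
        the process is stable and coverage is close to nominal; with a small
        drift, the same arithmetic no longer delivers what it promises."""
        hits = 0
        for _ in range(n_rep):
            t = np.arange(2 * n)
            level = 10 + drift * t                 # true (possibly moving) process level
            x = level[:n] + rng.normal(0, 1, n)    # what we actually observe
            se = x.std(ddof=1) / np.sqrt(n)
            lo, hi = x.mean() - 1.96 * se, x.mean() + 1.96 * se
            hits += lo <= level[n:].mean() <= hi   # did we cover the future level?
        return hits / n_rep

    print("stable process:  ", coverage_of_future_level(drift=0.0))
    print("drifting process:", coverage_of_future_level(drift=0.05))
    ```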

    In this respect, Deming criticized statisticians, much as Dr. Summers criticized econometricians, for being too beholden to mathematics, being too quick to confuse elegant mathematics for intelligence or truth, too quick to assume that the world works to make the calculations easier. We have been too quick, in short, to equate beauty and truth. But Deming's warning, as Lao Tzu's was, is that the truth is often not beautiful. (And as George Box put it, statisticians and artists both suffer from being too easily in love with their models.)

    Today key societal mistakes can be traced to exactly the assumptions Deming criticized. Prior to the financial collapse, models were too quick to assume defaults would be independent, and too quick to extrapolate the recent past into the distant future. We have been too quick to assume that drugs wouldn't result in evolution of the organisms they target. Much of our difficulty lies in the fact that we simply assumed stable conditions with no basis for believing them, other than an assurance -- what we now recognize was a false assurance -- that that's how science works. We believed the world existed to make our jobs easier, that because we were at the top of the mathematical food chain and academic social hierarchy, our theories HAD to be the right ones. We now see that that was hubris.

    Today the study of complex systems, thinkers like Nassim Nicholas Taleb, and others are able to articulate and explain why we were wrong. But when Deming started, the underlying mathematics and concepts of complex systems hadn't been invented. Deming was virtually alone, describing things in simple, qualitative terms, with only people like Poincaré to rely on. Not having elegant mathematics to back him up, he was often classified as one of the stupid people and ignored, good for the sorts of lower-level statisticians who do applications like industrial statistics, but not someone anyone doing theory should take seriously.

    In my own JSM talk, on estimands, we will be talking about the idea that even in a randomized clinical trial, the traditional gold standard of statistical application, post-randomization dynamic events ("intercurrent events") can influence the results, sometimes severely, and cannot simply be assumed to be uninformative. Once the instant of randomization completes, patients become subject to processes occurring over time which can potentially be confounding. This is a revolution in thinking. Once upon a time, the objectors were ignored. No longer. And this, occurring just within the last decade or so, is a truly watershed change.

    And it's a change that, I would humbly suggest, is directly traceable to Deming's legacy. It is Deming who introduced the idea that we simply cannot avoid treating the phenomena we want to learn about as dynamic, interactive systems, systems that can at best approximate the conditions, often with significant and skilled management and intervention required, under which statistical inference becomes approximately reliable. It was Deming who distinguished between enumerative and analytic studies, emphasizing that most applied research seeks to make predictions about the future, not simply to extrapolate to an existing frame. And it is Deming who taught that to have any hope of using the past to help us understand the future, we need to begin by trying to understand the dynamics, the mechanisms, by which these systems work, conceived as systems, if we want to learn anything or get anywhere at all.





    ------------------------------
    Jonathan Siegel
    Deputy Director Clinical Statistics

    ------------------------------


  • 2.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-15-2019 09:03

    Jonathan

    Thank you for your post on Dr. Deming. I agree with your observations on Dr. Deming's legacy.

    He was quite capable of holding his own with any audience, academic or otherwise. His observations and statements rarely reflected those of his audience. 

    I had the chance to first meet Dr. Deming when I worked at Ford. I took several courses with him and was mentored by him for 13 years. Many individuals did not understand his experience and extensive formal education. He received his PhD in mathematical physics from Yale University. So, no stranger to higher math. While working for the Dept. of Agriculture, he was sent to England for a year, where he met and worked with R. A. Fisher and Pearson. I am sure he held his own with them as they worked to develop and apply statistics to agricultural studies.

    I frequently saw him being challenged on his ideas and his understanding of engineering theory (usually physics related). He didn't boast of his knowledge or display his extensive background or education. But, when directly challenged, you could see those blue eyes flash, and, like a light saber being drawn, he could deploy a dazzling argument against any challenge, quickly and effectively ending any further debate.

    His vision of the world of statistics certainly was influenced by his knowledge of physics. He was an intern at Western Electric's Hawthorne plant in Chicago, where he met Dr. Shewhart, his mentor. The focus on understanding variation drove him from that point into his work at the Dept. of Agriculture and then the Census Bureau.

    His legacy remains with those that interacted with him and learned to view the world with a unique lens, based on statistical concepts and knowledge of the interactions in the real world.

    He is greatly missed, and I often reflect on what he would be thinking of us now.

    Eileen Beachell

    Quality Disciplines



    ------------------------------
    Eileen Beachell
    ------------------------------



  • 3.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-15-2019 11:24
    Jonathan,

    Thanks for the contribution.

    One possibly small, picky point: Control charts came from Shewhart, correct?  Deming communicated their existence and use to others.  I have read of his statistical contributions in sample surveys and in Deming regression.
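
    (For anyone who hasn't run across it, here is a minimal sketch of the Deming regression fit; this is just the textbook errors-in-variables formula in Python, not anything specific to this thread.)

    ```python
    import numpy as np

    def deming_fit(x, y, delta=1.0):
        """Deming (errors-in-variables) regression.

        delta is the assumed ratio of the error variance in y to the error
        variance in x; delta = 1 gives orthogonal regression, and the slope
        approaches the ordinary least-squares slope as delta grows large.
        """
        x, y = np.asarray(x, float), np.asarray(y, float)
        sxx, syy = x.var(ddof=1), y.var(ddof=1)
        sxy = np.cov(x, y)[0, 1]
        slope = (syy - delta * sxx
                 + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
        intercept = y.mean() - slope * x.mean()
        return intercept, slope

    # Example with both variables measured with error.
    rng = np.random.default_rng(0)
    truth = rng.uniform(0, 10, 50)
    x_obs = truth + rng.normal(0, 0.5, 50)
    y_obs = 1.0 + 2.0 * truth + rng.normal(0, 0.5, 50)
    print(deming_fit(x_obs, y_obs))   # roughly (1.0, 2.0)
    ```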

    Bill

    ------------------------------
    Bill Harris
    Data & Analytics Consultant
    Snohomish County PUD
    ------------------------------



  • 4.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-15-2019 13:21
    You're right. And it's a very fair point. 

    Not only did Walter Shewhart develop the control chart, he did it as a foundational, epistemological inquiry, asking under what conditions we can ensure that statistical inference approximately holds, and then asking how we can create those conditions, if we can, when we start from a situation where they don't hold but we have the ability to intervene in the process. A great deal of what Deming did comes from him.

    ------------------------------
    Jonathan Siegel
    Associate Director Clinical Statistics
    Bayer HealthCare Pharmaceuticals Inc.
    ------------------------------



  • 5.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-16-2019 13:05
    I have always thought of Deming as more the practical implementer of quality-related statistics (and sampling - his real forte) than Shewhart, the theorist. For example, Deming changed from Shewhart's chance and assignable causes to the more realistic common and special cause designations. One of his rivals once remarked that "Deming was not a real statistician". That individual had obviously never looked into Deming's books "Some Theory of Sampling" or "Statistical Adjustment of Data", not to mention his many other papers on matters and problems involving statistics.

    S. Luko 
    Collins Aerospace


    ------------------------------
    Stephen Luko
    Statistician, Collins Aerospace
    ------------------------------



  • 6.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-17-2019 19:20
    The two Deming books S. Luko mentioned are probably still available.  I bought them both not that long ago.  They are quite interesting. 

    Chapter 12 in Some Theory of Sampling, "A Population Sample for Greece," is noted to be almost an exact copy of the 1947 article in JASA by Jessen, Blythe, Kempthorne, and Deming.  That article was brought to my attention by Ken Brewer some time ago as an example of a ratio estimator corresponding to what Ken showed to be the largest degree of heteroscedasticity generally reasonable.  (See "alternative ratios" in Sarndal, Swensson, and Wretman.)
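
    For readers outside survey sampling, the connection can be sketched roughly as follows (my paraphrase of a standard model-based result, not a claim about what the Greece paper itself does). Under the model

    $$ y_i = \beta x_i + \varepsilon_i, \qquad \operatorname{Var}(\varepsilon_i) = \sigma^2 x_i^{\gamma}, $$

    the weighted-least-squares estimator of the ratio is

    $$ \hat{\beta}_{\gamma} = \frac{\sum_i x_i^{1-\gamma} y_i}{\sum_i x_i^{2-\gamma}}, $$

    so $\gamma = 1$ (variance proportional to size) gives the classical ratio estimator $\sum_i y_i / \sum_i x_i$, while $\gamma = 2$ gives the mean-of-ratios form $n^{-1} \sum_i y_i / x_i$. The larger $\gamma$ is, the greater the degree of heteroscedasticity being assumed, which is the sense in which one "alternative ratio" corresponds to the largest degree of heteroscedasticity generally considered reasonable.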

    Ratios are a prominent part of that other Deming book as well.

    ------------------------------
    James Knaub
    Retired Lead Mathematical Statistician
    ------------------------------



  • 7.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-18-2019 06:46
    Deming was a super smart and almost unreal statistician. No wonder he made rivals when he questioned methods such as acceptance sampling and tried to make common and special causes understandable to the average worker.

    ------------------------------
    Lars Lyberg
    ------------------------------



  • 8.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-18-2019 12:24
    Deming's principles on management speak to building and sustaining excellent professional organizations of all types, including those providing statistical services. In them, most people have some type of leadership role, even if that is only leading one of the smaller components of a project. 

    I recommend Mary Walton's book on Deming as a management guru. She fleshes out his 14 points and 7 deadly diseases in a most readable and entertaining fashion. Good quotes and anecdotes.

    A sample:

    It is totally impossible for anybody or for any group to perform outside a stable system, below it or above it. If the system is unstable, anything can happen. Management's job, as we have seen, is to try to stabilize systems. An unstable system is a bad mark against management.
         -W. Edwards Deming

    What is a "stable system"? Are the various "systems" you lead and work within stable?

    Walton's book covers this and much more.

    (Thanks to Ron Randles for "suggesting" that I read it in 1992.)





    ------------------------------
    Ralph O'Brien
    Professor of Biostatistics (officially retired; still keenly active)
    Case Western Reserve University
    http://rfuncs.weebly.com/about-ralph-obrien.html
    ------------------------------



  • 9.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-23-2019 11:16
    Maybe it is worth recalling that Deming signed a new foreword to a book by Shewhart, "Statistical Method from the Viewpoint of Quality Control" (Dover, 1986), where one can find this phrase: "There is no true value of anything" (in the context of measurements...), which, considering the recent special and very important issue (Vol. 73, sup1, "Moving to a World Beyond p < 0.05"), makes Deming a sort of precursor...
    I cite this phrase in all my elementary courses in statistical methods, especially when the time comes for teaching elementary statistical inference.

    ------------------------------
    Marc Bourdeau
    École Polytechnique de Montréal
    ------------------------------



  • 10.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-23-2019 11:31
    Jonathan, you make an important point that I've wondered about. As I understand it, Shewhart's epistemological breakthrough came before Fisher's work on statistical significance, Neyman and Pearson's work on hypothesis testing, and Lindquist's original work on NHST (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4347431/), which makes his work all the more impressive. It also makes it old, which is an attribute and not necessarily an epithet: older is not necessarily worse. Indeed, as evidenced by the recent ASA statements on statistical significance and a post-0.05 world, and by Shewhart's or Deming's statements somewhere (or so I recall) that Shewhart was not doing statistical significance testing but rather making an economic decision (more like today's decision or utility theory, perhaps), coming before significance testing, Neyman-Pearson decision making, and NHST may have helped Shewhart's work have a more lasting utility.

    Still, the world has evolved, perhaps most dramatically in the current prevalence of Bayesian work, thanks in part to improved, general-purpose MCMC samplers.

    One of Shewhart's key contributions was creating a process that was theoretically sound and able to be carried out by the shop floor of the day, essential for eliminating engineering (I think that was his classification at the time of his memo) as the bottleneck in process improvement. I know the shop floor is occupied by an educated workforce in many places, and they have computing power that even Shewhart lacked at the time, but MCMC doesn't sound like a tool to teach to production workers: it's complex, and they have more important things to learn and know.

    So what's the current thinking in statistical process control or whatever you want to call it? Is there a Bayesian decision-theoretic approach that applies more current (and more valid??) ideas in a way that can still be applied by the shop floor? Have ERP systems become so pervasive that we rely on whatever tools they provide (!?!) except in the smallest organizations (and, if so, what do both the ERP shops and the smallest organizations do today)? Is Donald Wheeler's emphasis on XmR charting the best we have (I don't mean that critically; XmR seems to satisfy the usability criterion, but I am not sure if it's the best from a decision-theory standpoint... still, it may be good enough)?
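
    For concreteness, the XmR arithmetic I have in mind is just the following (a minimal sketch with the conventional constants; not an endorsement of any particular package):

    ```python
    import numpy as np

    def xmr_limits(x):
        """Limits for an XmR (individuals and moving range) chart,
        using the conventional constants 2.66 and 3.267."""
        x = np.asarray(x, dtype=float)
        mr = np.abs(np.diff(x))            # moving ranges of consecutive points
        x_bar, mr_bar = x.mean(), mr.mean()
        return {
            "X chart":  (x_bar - 2.66 * mr_bar, x_bar, x_bar + 2.66 * mr_bar),
            "mR chart": (0.0, mr_bar, 3.267 * mr_bar),
        }

    # Example: weekly counts of some process metric, with one suspicious week.
    weekly = [42, 38, 45, 41, 39, 44, 40, 43, 61, 42]
    for chart, (lcl, center, ucl) in xmr_limits(weekly).items():
        print(f"{chart}: LCL={lcl:.1f}, CL={center:.1f}, UCL={ucl:.1f}")
    ```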

    ------------------------------
    Bill Harris
    Data & Analytics Consultant
    Snohomish County PUD
    ------------------------------



  • 11.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-23-2019 11:37
    To clarify my point about Shewhart making an economic decision-theoretic tool: if that is what he did (and I think he says that), then isn't the nature of the tool affected by the economic conditions at the Hawthorne Works?  Might we need to update those guidelines to fit the economic realities of each of our settings--or at least to help convince management that we were helping them in a business sense rather than just running statistical analyses of some sort?  I don't recall seeing any articles showing how people should modify the rules for "in control" based on the cost of detecting a failure at a certain point in a certain factory.

    ------------------------------
    Bill Harris
    Data & Analytics Consultant
    Snohomish County PUD
    ------------------------------



  • 12.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-24-2019 12:37
    Greetings to all. 

    There have been quite a few Bayesian methods proposed for process monitoring, but they tend to be quite complicated and unused in practice in my experience. 

    The basic control charting methods are closely tied to hypothesis testing. Deming, by the way, favored control charting but saw no use whatsoever for significance testing. 

    Because control charting is tied to hypothesis testing, it shares its weaknesses. Statistically significant process changes may not be practically significant. Fred Faltin and I discuss these issues and one way to address them in a paper to appear in Quality Engineering, "Rethinking Control Chart Design and Evaluation".  I'm happy to share a copy. My email address is bwoodall@vt.edu

    Bill Woodall

    ------------------------------
    William Woodall
    Professor
    Virginia Tech
    ------------------------------



  • 13.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-15-2019 12:22
    Thanks Jonathan, really enjoyed the post. Are you aware of the 1942 paper that Deming wrote, "On a Classification of the Problems of Statistical Inference," Journal of the American Statistical Association, Vol. 37, No. 218 (Jun. 1942), pp. 173-185?

    Best regards,

    Cliff Norman
    API

    ------------------------------
    Clifford Norman
    Consultant/Partner
    Associates in Process Improvement
    ------------------------------



  • 14.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-15-2019 17:21
    I enjoyed Jonathan Siegel's comments on the legacy of W. Edwards Deming. I was privileged to meet with Deming a few times in the 1980's when our university was seeking to build a consulting and training relationship with a major automotive company. This was at a time when American auto firms were trying to catch up with the Japanese in terms of quality. Faculty members from mathematics, engineering, and business presented proposals for their academic units to become the university's main liaison with the auto company. Deming was engaged to advise the university on this matter. He was suspicious of business schools (which I represented), feeling that traditional business school teaching was part of the quality problem. He wanted to start at the top by retraining business managers (in what he somewhat grandiosely called "profound knowledge"). Avoid assigning blame, improve worker training, and so on. His 14 points were part of Deming's influence, but his unique (overpowering?) personality was also a factor.

    Deming ended up endorsing the math department's proposal for applied statistics training. I recall that Deming was particularly impressed by the math department's representative, a statistician who had studied under John Tukey. Tukey's focus on "letting the data speak" appealed to Deming's views. Happily, I was invited to attend a three-day Deming seminar and later to join a five-week class taught by automotive engineers trained in SQC and in Deming's philosophy. My teaching of business students was informed by that experience. Quality topics (including behavioral issues) were always prominent in my students' quantitative training. Most business statistics textbooks include a chapter on quality. Hopefully, modern managers are continuously improving themselves.

    ------------------------------
    David P. Doane, PStat
    Professor Emeritus
    Oakland University
    ------------------------------



  • 15.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-19-2019 13:53
    My main point in this comment is that some of us may have associated beauty too much with simplicity. I am responding primarily to your passage, Jonathan,
    "Deming criticized statisticians, just as Dr. Sommers criticized econometricians, for being too beholden to mathematics, being too quick to confuse elegant mathematics for intelligence or truth, too quick to assume that the world works to make the calculations easier. We have been too quick, in short, to equate beauty and truth. But Deming's warning, as Lao Tzu's was, is that the truth is often not beautiful."
    Out of many examples of the difference between beauty and simplicity, here is an extreme one: A picture that is all white or is all black is simple and is the stuff of jokes.

    To equate simplicity with beauty would be to ignore the latter's complexity, and it would be likewise to say that simplicity, or rather the lack of unnecessary complexity, is not part of beauty. Simplicity similarly is part of Occam's razor. Granted, the universe is not simple to us mortals, and our own complexity exceeds our comprehension. If only for this very reason, simplicity is essential to science. And many empirical observations necessarily are more or less simplifications. I think we agree that simplicity has some value, particularly since you credit Deming for his "describing things in simple, qualitative terms." Considering simplicity's roles in science and in beauty, I would view science as a system with simplicity as a component of it and of any beautiful system.

    ------------------------------
    Thomas M. "Tom" Davis
    ------------------------------



  • 16.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-22-2019 23:15
    I appreciate this.

    I suspect the relevant metaphor is not simplicity vs. complexity, but rather, exactness or precision vs. vagueness and ambiguity. In Professor Summers's example, the "smart people" had exact definitions, theories, models, and methods to explain the chain of causality they posited, while the "stupid people" generally lacked this. Often the more precise model is both more complex and more fragile, as is the case with overfitting. But exactness/vagueness and simplicity/complexity are different concepts, and examples exist in each cell of their 2x2 table.

    Even to say that our explanation is merely an approximation, with the underlying reality having an element not fully encompassed by it, is an improvement. Statistics of the Neyman-Pearson type tended to take it for granted that the statistic is what's real, and what identifies what one is actually measuring, with the distribution around it representing measurement error, something that isn't really there and can be discounted as noise. Today I think we are inclined to accept that variation is more what is fundamentally real, the distribution as a whole sometimes has to be addressed and sometimes contains the value, and the statistic is an artificial construct, a simplification that is useful for some purposes but not others.

    Taleb says mistaking the statistic for the distribution is the fundamental mistake that classically trained financial quants make. I think this was an important part of Deming's critique as well. A second part of that critique is acknowledgement that real biological and social phenomena seldom follow classical distributions with any exactness. A model of a distribution is also just a model, also just a construct, not what is fundamentally real. The reality is often complex, and ignoring the complexity can only get us so far.

    ------------------------------
    Jonathan Siegel
    Deputy Director Clinical Statistics
    ------------------------------



  • 17.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 07-21-2019 13:06
    As friends with Ed Deming during the 1960's (only corresponding later), I grew to admire his wisdom. His sound arguments couched in a modest manner would convince me of his case without my even realizing it. He passionately believed in using the art of statistics as well as the mathematics of statistics. He convinced me that the role of theory is to serve application, not the other way around.

    ------------------------------
    Bob Riffenburgh
    Retired (Mostly)
    ------------------------------



  • 18.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 08-06-2019 10:41
    Jonathan, in your point 2, you mention the epistemological basis of Shewhart's work.  I agree with what you wrote, and I might add (although you probably said it, as well) that that aspect doesn't seem to come to the fore as much as it might in discussions about this work.

    I'm curious if you see the implications of that statement as having evolved (or having gotten the ability to evolve) since Shewhart's time.  I seem to recall both Shewhart and Deming talking about predictability a lot: if a process is not predictable, then you don't know if the change you see is important or not.

    In today's world, we can do more exotic statistical modeling than Shewhart could. For a simple example, a process might use material from two suppliers, and some process metric might vary by supplier. A simplistic reading of Shewhart might suggest that one should work on reducing that variability; today we could try something like an HMM (hidden Markov model) to make the process output predictable even as it changes.
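
    As a rough illustration of what I mean (a sketch only, using the third-party hmmlearn package and made-up two-supplier data):

    ```python
    import numpy as np
    from hmmlearn.hmm import GaussianHMM   # third-party package, assumed available

    rng = np.random.default_rng(1)

    # Made-up process metric: alternating runs of material from two suppliers
    # with slightly different means, in an order the operators don't record.
    supplier = np.repeat([0, 1, 0, 1, 0], 40)
    y = np.where(supplier == 0,
                 rng.normal(10.0, 0.5, supplier.size),
                 rng.normal(11.5, 0.5, supplier.size))

    # Fit a two-state Gaussian HMM and recover the hidden supplier regime.
    model = GaussianHMM(n_components=2, n_iter=200, random_state=1)
    model.fit(y.reshape(-1, 1))
    regime = model.predict(y.reshape(-1, 1))

    # Within each recovered regime the stream looks far more predictable
    # than the pooled data do.
    print("pooled sd:", round(y.std(ddof=1), 2))
    for k in range(2):
        print(f"regime {k}: mean={y[regime == k].mean():.2f}, sd={y[regime == k].std(ddof=1):.2f}")
    ```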

    If that be true, then we get to another part of Shewhart's idea: we should make the math simple enough so that the production team can carry it all out, because forcing the math back to the "engineers" (either process engineers or industrial statisticians) creates an undesirable bottleneck.  In today's world, we get to decide a) is that still important in our own setting, b) should the math be automated inside the ERP or other process control system, and c) if a person should carry it out, who should that be?

    Does that make sense?

    ------------------------------
    Bill Harris
    Data & Analytics Consultant
    Snohomish County PUD
    ------------------------------



  • 19.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 08-09-2019 11:04
    Jonathan, you wrote "In my own JSM talk, on estimands, we will be talking about the idea that even in a randomized clinical trial, the traditional gold standard of statistical application, post-randomization dynamic events ("intercurrent events") can influence the results, sometimes severely, and cannot simply be assumed to be uninformative. Once the instant of randomization completes, patients become subject to processes occurring over time which can potentially be confounding. This is a revolution in thinking. Once upon a time, the objectors were ignored. No longer. And this, occurring just within the last decade or so, is a truly watershed change."

    From somewhat of an outsider's perspective, I agree. Have you seen "Six Reasons to Apply System Dynamics Modeling in Medical Research" or "Introduction to System Dynamics for Health Care Services"?

    System dynamics is, at its core, not much different from PBPK applied to systems that typically include human decision-making: our ideas, our decisions, and our actions are often subject to the same accumulation and feedback processes that, in the PBPK sense, govern the ADME of drugs and toxins that enter and then, at least sometimes, exit the body. I do think there is something to combining these ideas. It can help us make sense of the seeming confounding you describe. Apparent serial correlation and confounding can resolve into clear changes in underlying system performance.
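
    To make the analogy concrete, here is a toy stock-and-flow sketch of my own (not from either paper): the same first-order "accumulation minus elimination" structure describes a drug concentration under steady dosing and, just as well, a backlog of work under a steady inflow.

    ```python
    import numpy as np

    def simulate_stock(inflow, k_out, dt=0.1, t_end=50.0, x0=0.0):
        """Euler integration of dx/dt = inflow(t) - k_out * x.
        The negative feedback term -k_out * x is what makes the stock settle
        toward inflow / k_out instead of growing without bound."""
        steps = int(t_end / dt)
        x = np.empty(steps + 1)
        x[0] = x0
        for i in range(steps):
            x[i + 1] = x[i] + dt * (inflow(i * dt) - k_out * x[i])
        return x

    # A constant "dose" or workload arriving at rate 2 per unit time,
    # eliminated (or worked off) at 10% per unit time.
    trajectory = simulate_stock(inflow=lambda t: 2.0, k_out=0.1)
    print("stock after t=50:", round(float(trajectory[-1]), 2), "(steady state 2.0/0.1 = 20)")
    ```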

    ------------------------------
    Bill Harris
    Data & Analytics Consultant
    Snohomish County PUD
    ------------------------------



  • 20.  RE: Some thoughts on W. Edwards Deming in Statistics

    Posted 08-20-2019 13:28
    Hi Bill,

    I'll try to address both of your recent posts.

    Shewhart's and Deming's ideas are general and can be applied to a wide variety of contexts. A great deal of the application details depend on the context. Two contexts I've mentioned - financial markets (the background Taleb bases his ideas on) and medical clinical trials (my own current environment) - differ in many respects from industrial process control. One difference is that the key players in both contexts are more educated and able to deal with more sophisticated math, which changes what kind of solutions are possible from a human environment point of view. A second difference is that both financial analysts and clinical trialists generally have much less control over their subjects and environment than folks in industrial process control, requiring them to use methods that are more observational in character. The idea that randomization doesn't fully protect clinical trials, and that post-randomization events including patient choice can make them, in practice, something of a hybrid with an observational element, is a relatively new one and something the estimands framework emphasizes.

    Although Taleb presents many ideas Deming grappled with, there are some key differences. Taleb's world of financial instruments requires predicting events but treats them as external, outside our control. The analyst merely makes side bets on these uncontrollable events, bets which are regarded as zero-sum games, with one side losing what the other side gains. This is very different from managing people and processes. So it appears to me that while Taleb's writings encompass some of Deming's key ideas on theory of knowledge and variation, a great deal of Deming's thought about how our actions can influence (but not deterministically control) events, what we need to do to manage such influence successfully, the importance of cooperation, etc., doesn't tend to have much play in Taleb's thought.

    ------------------------------
    Jonathan Siegel
    Deputy Director Clinical Statistics
    Bayer HealthCare Pharmaceuticals Inc.
    ------------------------------