The subjunctive is scientific thinking built into the language.

The subjunctive draws a distinction between fact and possibility, between truths and wishes. The expression “if he were” (not “if he was”) is subjunctive; it intentionally sounds wrong (unless you’re used to it) to indicate that we’re talking about something hypothetical as opposed to something actual.
This is scientific thinking built into the language (coming from its Romance-language roots).

This is beautiful. Let’s hold onto it.

You are not disinterested.

Everyone: Stop saying ‘disinterested’. You apparently don’t know what it means. It doesn’t mean ‘uninterested’.

In fact, a disinterested person may be deeply interested. ‘Disinterested’ means impartial: you care so much about the situation that you want to treat it objectively. It is a scientific term describing the effort to rid a study of the effects of subconscious biases.

Also, please don’t say ‘substantive’ when all you mean is ‘substantial’. They’re not the same thing. Thanks. (‘Substantial’ is a good word. You’re making it feel abandoned.)

Microsoft: Fix your use of the word ‘both’.
When comparing only two files, Windows says something like “Would you like to compare both files?” As opposed to what, just comparing one, all by itself (like the sound of one hand clapping)?
The word ‘both’ is used when the default is not two things. It emphasizes the two-ness to show that the two-ness is special, unusual. But when the default is two, you say “the two” (as in “Would you like to compare the two files?”), not ‘both’, and DEFINITELY NOT ‘the both’. (It was cute when that one famous person said it once. It’s not cute anymore. Stop saying it.)
Back to ‘both’: A comparison has to involve two things, so ‘both’ (the special-case version of the word ‘two’) only makes sense if the two things are being compared to a third.
English is full of cool, meaningful nuances. I hope we stop getting rid of them.

Seriously, everyone: English is wonderful. Why are you destroying it?


PS: The same goes for “on the one hand”. We used to say “on one hand”, which makes sense: either one, any one, not a definite hand with a definite article.

Science-doing

There are (at least) two types of scientists: scientist-scientists and science-doers.

Both groups do essential, difficult, and demanding work that everyone, including the scientist-scientists, needs. The latter group (like the former) includes people who work in research hospitals, water-quality labs, soil-quality labs, linear accelerators, R&D labs of all kinds, and thousands of other places. They carry out the daily work of science with precision, care, and a great deal of effort. Yet, at the same time, in the process of doing the doing of science, they typically do not get the luxury of stepping back, moving away from the details, starting over, and discovering the less mechanical, less operational connections among the physical sciences, the social sciences, the humanities, technology, business, mathematics, and statistics… especially the humanities and statistics.

I am not a good scientist, and that has given me the opportunity to step back, start over, do some things right this time, and more importantly, through a series of delightful coincidences, learn more about the meaning of science than about the day-to-day doing of it.[1] This began to happen during my Ph.D., but only some of the components of this experience were due to my Ph.D. studies. The others just happened to be there for me to stumble upon.

The sources of these discoveries took the form of two electrical-engineering professors, three philosophy professors, one music professor, one computer-science professor, some linguistics graduate students, and numerous philosophy, math, pre-med, and other undergrads. All of these people exposed me to ideas, ways of thinking, ways of questioning, and ways of teaching that were new to me.

As a result of their collective influence, my studies, and all my academic jobs from that period, I have come to think of science not merely as the wearing of lab coats and the carrying out of mathematically, mechanically, or otherwise challenging tasks. I have come to think of science as the following of, for lack of a better expression, the scientific method, although by that I do not necessarily mean the grade-school inductive method with its half-dozen simple steps. I mean all the factors one has to take into account in order to investigate anything rigorously. These include double-blinding (whether clinical or otherwise, to deal with confounding variables, experimenter effects, and other biases), setting up idiot checks in experimental protocols, varying one unknown at a time (or varying all unknowns with a factorial design), not assuming unjustified but convenient probability distributions, using the right statistics and statistical tests for the problem and data types at hand, correctly interpreting results, tests, and statistics, not chasing significance, setting up power targets or determining sample sizes in advance, using randomization and blocking in setting up an experiment or the appropriate level of random or stratified sampling in collecting data [see Box, Hunter, and Hunter’s Statistics for Experimenters for easy-to-understand examples], and the principles of accuracy, objectivity, skepticism, open-mindedness, and critical thinking. The latter set of principles is given on p. 17 and p. 20 of Essentials of Psychology [third edition, Robert A. Baron and Michael J. Kalsher, Needham, MA: Allyn & Bacon, 2002].
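As a concrete illustration of just one item on that list, here is a minimal sketch (in Python, using the usual normal-approximation formula; the effect size, significance level, and power target are round numbers chosen only for the example, not anything from the sources above) of what “determining sample sizes in advance” can look like in practice.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate per-group sample size for a two-sample comparison of means.

    Uses the standard normal-approximation formula with Cohen's d as the
    effect size; a t-based calculation would add a subject or two per group.
    """
    z = NormalDist().inv_cdf  # standard-normal quantile function
    n = 2 * ((z(1 - alpha / 2) + z(power)) / effect_size) ** 2
    return ceil(n)

# The point is that this number is fixed *before* any data are collected:
# for a medium effect (d = 0.5), a 5% significance level, and 80% power,
# we commit to roughly 63 subjects per group instead of peeking at the data
# and stopping whenever p happens to dip below 0.05.
print(n_per_group(0.5))  # 63
```

The same discipline applies to randomization and blocking: the plan comes first, the data second.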

These two books, along with Hastie, Tibshirani, and Friedman’s The Elements of Statistical Learning and a few other sources, chiefly heavily cited papers on the misuse of statistics, have formed the basis of my view of science. This is why I think science-doing is not necessarily the same thing as being a scientist. In a section called ‘On being a scientist’ in a chapter titled ‘Methodology Wars’, the neuroscientist Fost explains how it is possible, although not necessarily common, to be on “scientific autopilot” (p. 209) because of the way undergraduate education favors science facts and methods[2] over scientific thinking and the way graduate training and faculty life emphasize administration, supervision, managerial oversight, grant-writing, and so on (pp. 208–9). All this leaves a brief graduate or post-doc period in most careers for deep thinking and direct, hands-on design of experiments before the mechanical execution and the overwhelming burdens of administration kick in. I am not writing this to criticize those who do what they have to do to further scientific inquiry but to celebrate those who, in the midst of all that, find the mental space to remain critical, skeptical questioners of methods, research questions, hypotheses, and experimental designs. (And there are many of those. It is just not as automatic as the public seems to think it is, i.e., something one becomes simply by getting a degree and putting on a white coat.)


Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, George E. P. Box, William G. Hunter, and J. Stuart Hunter, New York, NY: John Wiley & Sons, Inc., 1978 (0-471-09315-7)

Essentials of Psychology, third edition, Robert A. Baron and Michael J. Kalsher, Needham, MA: Allyn & Bacon, A Pearson Education Company, 2002

The Elements of Statistical Learning: Data Mining, Inference, and Prediction, second edition, Trevor Hastie, Robert Tibshirani, and Jerome Friedman, New York, NY: Springer-Verlag, 2009 (978-0-387-84858-7 and 978-0-387-84857-0)

If Not God, Then What?: Neuroscience, Aesthetics, and the Origins of the Transcendent, Joshua Fost, Clearhead Studios, Inc., 2007 (978-0-6151-6106-8)

[1] Granted, a better path would be the more typical one of working as a science-doer scientist for thirty years, accumulating a visceral set of insights, and moving into the fancier stuff on the strength of that experience and wisdom. However, as an educator, I did not have another thirty years to spend working on getting a gut feeling for why it is not such a good idea to (always) rely on a gut feeling. I paid a price, too. I realize I often fail to follow the unwritten rules of social and technical success in research when working on my own research, and I spend more time than I perhaps should on understanding what others have done. Still, I am also glad that I found so much meaning so early on.

[2] In one of my previous academic positions, I was on a very active subcommittee that designed critical-thinking assessments for science, math, and engineering classes with faculty from chemistry, biology, math, and engineering backgrounds. We talked often about the difference between teaching scientific facts and teaching scientific thinking. Among other things, we ended up having the university remove a medical-terminology class from the list of courses that counted as satisfying a science requirement in general studies.

Science, Clave, and Understanding

When Dr. Eben Alexander defended, in one of the major news magazines, his book (“proof[i]”) [1] about a spiritual, non-physical afterlife realm, part of his argument was that he is a surgeon, and therefore a scientist. Surgeons are highly trained, highly specialized people who perform a very difficult and critically important service. It would be absurd not to recognize their value. Their work is without a doubt science-based, but does that make it “science”? (There are, of course, surgeons who publish scholarly work (although, I’ve noticed, in some cases it is not about surgery but about fields as distant as music), and who thus function as scholars, and therefore scientists.)

A scientist is not simply anyone who works as a professional practitioner of a difficult, science-based field; a scientist is someone who sets up, tests, and evaluates (mostly via statistical data analysis) testable hypotheses (about anything, including the afterlife and spiritual realms, if necessary), and, more importantly, does so within the guidelines of rigor, accuracy, objectivity, skepticism, and open-mindedness [2][ii]. It is worrisome to imagine surgeons setting up double-blinded clinical trials of surgical practices as part of their work, choosing to apply a known-good technique on one patient and an as-yet-unsupported one on another. (In other words, I really hope surgeons do not act as scientists.) Maybe they do; I’d like to know, so please give me feedback on this question.

Assuming, though, that they do not endanger patients’ lives for the sake of science, as we tend not to do anymore, it seems safe to say that surgeons are highly trained specialists who practice state-of-the-art medicine. In this sense, they are not scientists. They use the findings of science in their practical, applied work (medicine). They must, then, fall somewhere between applied scientists and technologists (inclusive).

To say that someone who practices a specialty that is based on scientific findings is therefore a scientist is like saying a sandwich-shop employee is a farmer because they use bacon, lettuce, and tomato in their work. (The fact that surgery is far more specialized does not invalidate the argument.)

The professions that discover, invent, develop, and apply are all different. The roles can overlap: scientists do develop and build new equipment to perform their experiments, but that equipment is not mass-produced. Anything we can purchase repeatedly on Amazon or at Best Buy, say, was not made by scientists. It was designed, developed, tested, and manufactured by engineers, technologists, technicians, and other professionals, not by scientists, even if scientists were involved in the early stages. As for applied scientists, including those who work at laboratories, characterizing soil samples, say, or performing tests, they too are highly trained specialists with scientific backgrounds who are not doing science at that point. As one XKCD comic suggested [3], you can simply order a lab coat from a catalog; no one will check your publication record. Science is not solely about what you’re wearing or what degrees you have; it’s about what, exactly, you’re doing.

The public’s idea of what science is seems to be “mathy and difficult, preferably done in a lab coat while uttering multisyllabic words you don’t want to see in your cereal’s list of ingredients.” This may be a decent shortcut for pop-culture purposes, but it is not what science really is. I will not go into the inductive-method-vs-hypothetico-deductive-method-vs-what-have-you debate here because there are people who do that professionally, and do it very well. (I have been enjoying Salmon’s The Foundations of Scientific Inference [4] immensely.) What I do want to do is draw two parallels in succession: first from the preceding discussion to explanation and understanding, and then from those concepts to explanations and understanding of clave (in music).

The former has been done quite successfully in The Fabric of Reality [5], an earlier book by the physicist David Deutsch, a winner of the Paul Dirac Medal and Prize. I am not concerned here with the bulk and main point of his book, only with his opening argument about the role of science (explanation) and what it means to understand. Deutsch criticizes instrumentalism because of its emphasis on prediction at the cost of explanation (pp. 3–7). He gives rather good examples of situations in which no scientist (or layperson, for that matter) would be satisfied with good predictions without explanations (p. 6, for example). He does not deny the role and importance of predictions, but argues that “[t]o say that prediction is the purpose of scientific theory is [. . .] like saying that the purpose of a spaceship is to burn fuel” (similar to another author’s argument that the purpose of a car is not to make vrooom-vrooom noises; cars just happen to do that as part of their operation[iii]). Deutsch states that just as spaceships have to burn fuel to do what they are really meant to do, theories have to pass experimental tests in order “to achieve the real purpose of science, which is to explain the world.” (Think about it: Why did we all, as children, get excited about science? To understand the world!)

He then moves on to explain that theories with greater explanatory power than the ones they have replaced are not necessarily more difficult to understand, and certainly do not necessarily add to the list of theories one has to understand to be a scientist (or an enthusiast). Theories with better explanatory power can be simpler; his example is how the Copernican system superseded the Ptolemaic system and made astronomy simpler in the process (p. 9). Furthermore, not everything that could be learned and understood needs to be: see his example of multiplication with Roman numerals (pp. 9–10). It might be fun, and it might occasionally be necessary to have some source in which to look it up (for purposes of the history of mathematics, say), but it is not something anyone today needs a working knowledge of; it has been superseded.
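To make the Roman-numeral example concrete before moving on, here is a tiny sketch of my own (not anything from Deutsch’s book; the function name and sample numerals are just illustrative): the only working knowledge worth keeping is a way out of the notation, because the multiplication itself happens in ordinary positional arithmetic.

```python
# Conversion table for the seven Roman symbols.
ROMAN_VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(numeral: str) -> int:
    """Convert a Roman numeral to an integer, handling subtractive forms like IV."""
    total = 0
    for symbol, next_symbol in zip(numeral, numeral[1:] + " "):
        value = ROMAN_VALUES[symbol]
        # Subtract when a smaller symbol precedes a larger one (the I in IV).
        if next_symbol in ROMAN_VALUES and ROMAN_VALUES[next_symbol] > value:
            total -= value
        else:
            total += value
    return total

# "Multiplying Roman numerals" today means leaving the numeral system entirely.
print(roman_to_int("XIV") * roman_to_int("XXIII"))  # 14 * 23 = 322
```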

All of this is discussed in order to make the point that there is a distinction between “understanding and ‘mere’ knowing” (p. 10), which is where my interest in clave comes into play. Several “explanations” of clave (sometimes even with that word in the title) published in recent years have been of the “mere knowing” type, in which clave patterns are listed without any explanation as to how and why they indicate what other patterns are allowed or disallowed in the idiom. Telling someone that x..x..x...x.x... is 3-2, that ..x.x...x..x..x. is its opposite, so 2-3, and (essentially) “there you go, you now know clave” does nothing toward explaining why a certain piano pattern played over one is “sick” (good) and over the other, sickening (bad) within the idiom.
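To pin down what that “mere knowing” amounts to, here is a small sketch in my own shorthand (the pattern strings above on a 16-step grid; the function name is just illustrative). It verifies that the 2-3 form is nothing more than the 3-2 form rotated by half a cycle, which is precisely the kind of fact one can know without anything having been explained.

```python
# One clave cycle on a 16-step (sixteenth-note) grid: 'x' = stroke, '.' = rest.
SON_3_2 = "x..x..x...x.x..."
SON_2_3 = "..x.x...x..x..x."

def rotate(pattern: str, steps: int) -> str:
    """Rotate a step pattern cyclically by the given number of grid steps."""
    steps %= len(pattern)
    return pattern[-steps:] + pattern[:-steps]

# The 2-3 form is the 3-2 form started half a cycle (8 steps) later.
assert rotate(SON_3_2, 8) == SON_2_3
print("2-3 son clave = 3-2 son clave rotated by half a cycle")
```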

Imagine if the natural sciences went about education the way we musicians do with clave. A chapter in a high-school biology book would contain a diagram of the Krebs cycle, with all the inputs, outputs (sorry for the electrical-engineer language), and enzymes given by name and formula, followed by “and now you know biochemical pathways,” without any explanation as to how it has anything to do with an organism being alive. I’m flabbergasted that musicians and music scholars find mere listings of clave son, clave rumba, [and . . . you know, the other one that won’t be named[iv]] sufficient as so-called explanations[v].

All of this reminds me of an argument I once had with a very intelligent person. I had said, in my talk at Tuesday Talks, that science is concerned with ‘why’ and ‘how’, not just ‘how’. He disagreed, which I think is because he thought of a different type of ‘why’: the theological ‘why’. I, instead, had in mind Deutsch’s type of ‘why’: “about what must be so, rather than what merely happens to be so; about laws of nature rather than rules of thumb” (p. 11). I would add, about consistency (even given Goedel, because I’m Bayesian like that, and not so solipsistic), which Deutsch mentions immediately afterwards, calling it ‘coherence’.[vi]

I understand that Hume, Goedel, and others have shown us that our confidence in science, or even math, ought not to be infinite. It isn’t. Even in a book like The God Delusion, Richard Dawkins makes it clear that he is not absolutely certain. Scientific honesty requires that we not be absolutely certain. But we can examine degrees of (un)certainty, and, precisely because of the solipsists, we have to ignore them[vii] and be imperfect pursuers of an imperfect truth, improving our understanding, all the while knowing that it could all be wrong.

To that end, I continue to test my clave hypothesis across different genres. Even if it is wrong, it definitely is elegant.

[1] Alexander, E., M.D., Proof of Heaven: A Neurosurgeon’s Near-Death Experience and Journey into the Afterlife, Simon & Schuster, 2012.

[2] Baron, R. A., and Kalsher, M. J., Essentials of Psychology, Needham, MA: Allyn & Bacon, A Pearson Education Company, 2002.

[3] http://xkcd.com/699/ (last accessed 12/25/2015).

[4] Salmon, W. C., The Foundations of Scientific Inference, Pittsburgh, Pennsylvania: University of Pittsburgh Press, 1966.

[5] Deutsch, D., The Fabric of Reality: A leading scientist interweaves evolution, theoretical physics, and computer science to offer a new understanding of reality, New York: Penguin Books, 1997.

[i] Scientists do not speak of proof; they deal with evidence. Proofs are limited to the realm of mathematics. There are no scientific proofs; there are just statistically significant results, which are presented to laypersons as ‘proof’ because even scientists have quite a lot of difficulty interpreting measures of statistical significance, and the average person has no patience for or interest in the details of philosophy of science.

[ii] The authors of [2] give the following excellent definitions for these precise terms. Accuracy: “gathering and evaluating information in as careful, precise, and error-free a manner as possible”; objectivity: “obtaining and evaluating such information in a manner as free from bias as possible” [Ibid.] (‘bias’ here referring to the cognitive biases that are natural to human thinking and judgment, such as confirmation bias, the Hawthorne effect, selection bias, etc.); skepticism: the willingness to accept findings “only after they have been verified over and over”; and open-mindedness: not resisting changing one’s own views, even those that are strongly held, in the face of evidence that they are inaccurate [2]. To these we can add principles like transferability and falsifiability, and the key tools of double-blinding, randomization, blocking, and the like. Together, all these techniques and principles constitute science. Simply being trained in science and carrying out science-based work is not sufficient.

[iii] I think it was Philips in The Undercover Philosopher, but I’m not sure.

[iv] If you’ve read my post about running into cool people from SoundCloud at NIPS ’15, you’ll know what pattern I’m talking about: the English-horn-like-named pattern.

[v] Fortunately, we do have work from the likes of Mauleón and Lehmann that shows causal relationships between individual notes or phrases in different instrumental lines, but since their work and mine, the trend has reverted to listing three patterns and calling that an explanation.

[vi] Perhaps this paragraph needs its own blog post. . .

[vii] Because, according to them, they don’t exist.