Portmanteaus from Famous People’s Names

My pointless brain has been cooking up portmanteaus of famous people’s names. The following are what I’ve come up with so far.

George Duke Ellington

Buddy Miles Davis

Lil’ Kim Thayil

Ornette Coleman Hawkins

Nat “King” Cole Porter

João Gilberto Gil

Wynton Kelly Rowland

                Mark Kelly Rowland

Geddy Lee Sklar

                Alvin Lee Sklar

                                Alvin Lee Ritenour

                                                Geddy Lee Ritenour

Ben Folds Five For Fighting

John Oliver Sacks

Arundhati Roy Buchanan

Willie Nelson Mandela

Jennifer Lawrence Krauss

Anna Kendrick Lamar

Steve Martin Short

                Dean Martin Denny

                                Steve Martin Denny   

                                                Dean Martin Short

Clark Terry Gilliam

Elizabeth Warren Buffett

                Elizabeth Warren DeMartini

Charlie Hunter S. Thompson

Philip Catherine Zeta-Jones

Carl Jungkook

Michael Spiro Agnew

Woody Allen Ginsberg

                Tony Allen Ginsberg

George Michael Jackson

                George Michael Faraday

Michael Jackson Browne

Elton John Lennon

                Elton John Coltrane

                                Elton John McLaughlin

Jon Anderson .Paak

                Ian Anderson .Paak

James Taylor Hawkins

                James Taylor Swift

Roger Taylor Hawkins

                Roger Taylor Swift

Kim Gordon Lightfoot

Toby Keith Jarrett

                Toby Keith Urban

                                Toby Keith Emerson

                                                Toby Keith Richards

Keith Emerson Fittipaldi

Randy Travis Tritt

Dave Stewart Copeland

                Rod Stewart Copeland

Rick James Brown

                Etta James Brown

                                Rick James Taylor

                                                Etta James Taylor

Debbie Harry Connick, Jr.

                Debbie Harry Styles

Hank Marvin Gaye

                Marvin Gaye Su Akyol

Daniel Hope Sandoval

Paul Simon Le Bon

                Carly Simon Le Bon

Howard Blake Shelton

Elton John Deacon

                Elton John Martyn

Jack Bruce Springsteen

                Jack Bruce Dickinson

Bon Scott Walker

                Bon Scott “Not” Ian

Scott “Not” Ian MacKaye

                Scott “Not” Ian Paice

                                Scott “Not” Ian Gillan

                                                Scott “Not” Ian Anderson

Lester Young MC

                Neil Young MC

                                Paul Young MC

Lester Young Thug

                Neil Young Thug

                                Paul Young Thug

Young MC Solaar

                Young MC Lyte

                                Young MC Eiht

                                                Young MC Ren

                                                                Young MC Hammer

                                                                                Young MC Frontalot

                                                                                                Young MC Hawking

MC Ren & Stimpy

Vanilla Ice-T

                Vanilla Ice Cube

DJ Jazzy Jeff Beck & The Fresh Prince Rogers Nelson

                DJ Jazzy Jeff Buckley & The Fresh Prince Rogers Nelson

Mos Def Leppard

.

.

.

Nina Hagen-Dazs

John Abercrombie & [Figure it out.]

LAST BUT NOT LEAST: Natalie Portmanteau

No Absurdity Necessary

In a 2022 book about the fear of death and the desire for immortality, Dean Rickles refers to an example the Scottish philosopher Thomas Reid gave to demonstrate his belief that, in Rickles’ words, “It is often difficult to speak of an enduring self even in perfectly ordinary (mortal) scenarios.” (p. 12)

Reid’s example is about an 80-year-old general who can remember being 40 but cannot remember what his life was like when he was a child. (He seems to have no memories left of his childhood.) Reid evidently argued that we would consider the 80-year-old and the 40-year-old the same person (the former being the continuation of the latter), and that we would consider the 40-year-old man who could remember details about his childhood the same person as he was when a child, but that we should not consider an 80-year-old who has no recollection of his childhood the same continuing ‘self’ as that child.

If I haven’t done justice to Reid’s argument, it is because I’m anxious to present it in a symbolic way and then to demolish it using more modern ways of thinking that were not available to Reid. (And even though we have access to these ways of thinking, in my experience they are not anywhere close to widely appreciated or employed in the United States.)

The symbolic form of the argument is this:

Let the 80-year-old army general be A, his 40-year-old self be B, and his childhood self be C.

Let’s use the term ‘equals’ to mean “remembers and therefore can be considered a conscious continuation of…”

Reid says we have A equals B; B equals C; A does not equal C.

Rickles calls this “an absurdity.” (p. 97, end note 6 to chapter 2).

My response:

The idea that the 80-year-old general is the same person as the 40-year-old officer he remembers being, but not the child he does not remember being (whom the 40-year-old officer did remember), is only absurd if you subscribe to reductionism and classical logic. This is the question of whether one can step into the same stream twice, but with the memory stability of the stream in question rather than the memory stability of the stepper. And the question is no less pointless. It appears to have a point because in classical logic, if the general equals the officer and the officer equals the boy, then the general necessarily equals the boy. However, given that our cells degrade and are ejected or lost, and that all of the molecules that make up each of our bodies are entirely turned over (over some long period or another), in the traditional view we should never be able to be ourselves.

All these problems are solved with multivalued logic and with an understanding of emergence.

First, fuzzy logic, being a type of multivalued logic (perhaps a continuous-valued one), has no difficulty with the boy having zero membership in the-general-ness and the officer having some intermediate membership in the-general-ness. While two membership functions at opposite ends of a continuum can have no overlap, each can have finite overlap with some intermediate membership function (in this example, only one, which overlaps at each end with the membership functions at the two extremes; but that is a sufficient condition, not a necessary one).
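As a minimal sketch of this point (the triangular shapes and age ranges below are my own illustrative assumptions, not anything from Reid or Rickles), one could model the three ‘selves’ as fuzzy sets over age: the boy and the general never overlap, yet each overlaps the officer in the middle.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets over age for the three "selves" in Reid's story.
boy     = lambda age: tri(age, -1, 8, 30)
general = lambda age: tri(age, 35, 80, 110)
officer = lambda age: tri(age, 20, 40, 60)

assert general(8) == 0.0                    # the boy: zero general-ness
assert 0 < general(40) < 1                  # the officer: intermediate general-ness
assert boy(25) > 0 and officer(25) > 0      # boy and officer overlap
assert officer(55) > 0 and general(55) > 0  # officer and general overlap
# ...but the two extremes never overlap at any age:
assert all(min(boy(a), general(a)) == 0 for a in range(0, 111))
```

No contradiction arises: “A equals B, B equals C, A does not equal C” is perfectly expressible once ‘equals’ means graded overlap instead of classical identity.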

This answer alone may not be satisfactory. Emergence comes to the rescue.

NOTE: The fact that we don’t know the mechanism for something does not mean that something isn’t happening. Emergence seems to be a fact.

Emergence has to do with how a Gestalt is—somehow—constituted from minuscule components, not one of which has any of the high-level characteristics possessed by the Gestalt. The classic introductory examples include how ant colonies can behave in coordinated adaptive ways that no single ant (and no collection of ants on the order of only a few hundred, say) can manage. Similarly, the world’s economy, the immune systems of animals, the complex chemical interactions of many plants with their environment (soil and air, at a minimum), and the fact of human emotions and creativity are instances of complex and adaptive behaviors and capabilities emerging in ways we don’t yet understand out of the complex interactions (communication, feedback, etc.) of interconnected small components, none of which is capable of the same sophistication at any scale or to any degree.

Our neurons exchange pulse sequences of electric potential via various ions and neurotransmitters. As far as we know, there is/was no mini Tom Stoppard, Ginger Rogers, Prince, or Björk hidden in the synapses of those individuals, just as there is no homunculus at the control center of the brain that sees the colors we see and feels the emotions and sensations we experience. Pain, hunger, joy, love, fear, pleasure, colors, sounds, smells, and all other emotions and sensations are emergent experiences of the whole person, made from—but not made up of—interactions among complex human systems such as the nervous system, gut bacteria, the endocrine system, long- and short-term memory, and more, the effects of each of which are made from—but not made up of—interactions among the constituent subsystems and components. Your spinal cord is not happy, fearful, anxious, or euphoric—you, the whole emergent person, are.

Likewise, what makes you you is not only your experiences between ages 35 and 95 or between 0 and 15 or any interval, no matter how wide or narrow: Even without any memories of his life before age twenty and having completely replaced every molecule in his body (more than half of which were water and bacteria anyway), the 80-year-old general is still the person who resulted from his birth, his early life, his genetics, his society, and all the physical, emotional, social, and spiritual experiences his past bodies have been through. Strictly delineating where the person-who-was-once-a-boy ends and the person-who-is-a-general begins is an unnecessary exercise imposed on us by classical logic and the battle between reductionistic and holistic modes of understanding ourselves and the world around us. Systems thinking (which includes the concept of emergence) and fuzzy logic (which admits that some categories are loose in the real world) have removed the need to waste our time fitting things (experiences, people, music, success and failure, etc.) into all-or-nothing categories, about which we can only be completely right or completely wrong.

Some Pesky Subgenres

No one dared or wanted to admit it at the time, but Nü Metal and Grunge had quite a bit in common.

What’s my positional framework for this claim?

I’m making this claim from just the simple heavy-rock’n’roll perspective; I don’t even have to go to a global-southern perspective to say this.

In terms of the appearances, geographic origins, and the preferred foundational cultural elements of the bands associated with Grunge in the ’90s and with Nü Metal in the aughts (and late ’90s), most critics I was aware of, as well as I and the other music-obsessed people around me, saw these subgenres of heavy rock as utterly distinct. Perhaps this was because I had lived and continued to live in the PNW. It’s hard to imagine a Portlander in 2005 daring to say Korn and Gas Huffer were very alike in any way, musically or otherwise.

There was always SYSTEM OF A DOWN, whom no one could resist being in awe of, but they were seen as an exception. Here in the PNW, people tended to look down on Nü Metal. Perhaps we were still resentful that Melvins had not become a national phenomenon. (Perhaps most were glad Melvins didn’t inadvertently sell out, as all bands who happen to succeed and live long enough to enjoy it are said to have done.)

Imagine if Kurt hadn’t killed himself. But I digress.

I noticed this aural similarity most when a lesser-known band called The Union Underground became my obsession for a while in the mid-aughts. The artwork (the “aesthetic”) and the timing were pure Nü Metal. The sound, though, was not entirely so. If you ignored the release date and the artwork — few people can do this — it sounded like it came from the heyday of Grunge. What stood out to me musically was not drop-D tuning and a scattering of Hip Hop elements. This was much more an offspring of Pixies, Melvins, and Gruntruck than of Shootyz Groove, Follow For Now, and Ice-T.

I’m probably an aural purist. Many more people, in my experience, view music as a broader phenomenon. They see the outfits, hairstyles, cover art, and other forms of expression and identity, including ethnicity and lifestyle, as part of what makes an artist Punk and not Pop, Industrial and not Hip Hop. I’m thinking of Avril Lavigne and Consolidated, respectively. Starting with the latter, Consolidated has mostly been considered an Industrial act by fans and promoters. What they do in their music is to use samplers (I know this from meeting and talking to one of them outside the context of music) and to rap. It seems to me there’s some serious pigeonholing of Hip Hop by the industry (if not also by the fans) when Eminem is part of Hip Hop without a doubt, but white guys who rap about veganism, feminism, and immigration while criticizing bullying, homophobia, and misogyny over samples are labeled Industrial rockers, not Hip Hop MCs. If we were honest, we would at least, then, include PUBLIC ENEMY in the Industrial bucket. But no, it’s all about your race, not your lyrics, instruments, or style of music.

In the case of Avril Lavigne, I keep trying to find a song of hers that sounds like Punk Rock to me. It’s the same with the “punk idol” of my generation, Billy Idol. He sneers. He was in Generation X, who sang about drinking and stuff. He wears spikes and studs, maybe even a Mohawk. He must be punk, right?

Heck, I like several of his songs; I totally enjoy BLUE HIGHWAY, DAYTIME DRAMA, EYES WITHOUT A FACE, REBEL YELL, and CRADLE OF LOVE.

I just don’t think those songs have anything to do with Punk Rock the genre. And it’s not that I don’t count it if it’s not by Sex PisTOLS, X-Ray Spex, SUB HUM ANS, or CRASS. I’m happy to include The Stranglers, The Jam, and pre-Punk punky bands like The SONICS, DEATH, MC5, and New York Dolls (heck, Green Day, too) among what I consider “punk”… (though I draw the line at that blinky band).

And Punk Rock is an excellent example of how much lifestyle and philosophy matter. It could be a way to persuade me that I’m wrong. I realize there’s more to punk than distortion, speed, and some sort of a British working-class accent, whether real or fake. Punk is DIY. Punk is community. Punk is—no, briefly was—nonconformity. Videos of the very early days of punk reveal people in myriad creative DIY garb that reaches far beyond safety pins, fishnets, and dyed glued hair. You see sparkly dresses and garbage-bag dresses, both groups fully integrated with the earliest and less conspicuous users of safety pins—out of necessity, not fashion.

So… Did The Union Underground play any Grunge? I say they did, and to a notable extent. Did Billy Idol, Avril Lavigne, or Miley Cyrus make any Punk Rock? They probably did at some point, but I haven’t heard it yet. Are they punks? I don’t know. Since Fat Mike opened a punk museum in Vegas, it doesn’t even matter. Who would have thought LINKIN PARK would have the final word?

Anyway, what made me think of this stuff all over again was listening to VERMILION PT. 2 on SlipknoT’s awesome VOL. 3: (THE SUBLIMINAL VERSES). I don’t think that song counts as Metal of any kind except probably Grunge.

Look, it’s really simple…

― Dude, did you see all this stuff people are doing with CobraSpit 17.6?!!

― Yeah, I’ve been wanting to do that for a while, but I had some trouble installing it. I went to the CobraSpit download site and it said:

If you’re using [some OS], follow these instructions

If you’re using [another OS], follow these instructions.

If you’re using Windows, please switch to Linux. We can’t be bothered.

― So I went on a bunch of forums, you know, pancakexchange and stuff, and finally found someone who said: “If you have to use Windows, install the Windows versions of the following tools native to Linux. [list of tools]”

― So, I installed everything, but SmokeRrring 4.2 conflicted with Squeek and I found out that’s because it can only work with Squeek 3 which is completely different from Squeek 2. However Rootabaga only works with Squeek 2 unless you install PPa (pron. Papua), which I did. Now all I had to do was to uninstall Squeek 2, but then:

“If your computer ever had a Squeek 2 installation or has even been in the same ZIP code as another machine with Squeek 2, you should set fire to your machine and switch to Linux. If you insist on using Windows, you may be able to get around the Squeek-2-vs.-3 problem by using Screech 5. Screech 5 is for ubuntu only, but you can install CrumblyCake 3 to make it work in Windows.”

(An hour later) “CrumblyCake 3 only works with Windows 7. If you have Windows 10, install WhippedCream 8.1 and Whisk 9.2 running inside Battenberg 4 in your CatUnicornLizard-8R or oRCAsCORPIONhAWKiBEXtIGER environment.”

So you look these up and (after 25 minutes of reading puns on “ate” and “eight”) find out that CatUnicornLizard-8R [CUL8R] has been bought by a corporation and incorporated into their “solution” which costs $12,000/year (and more if you want support).

orCAscorPIONhaWKibEXtiGER is still open-source, but to run on Windows, it needs TurnTable 17.3 running on top of Onions 4. The website for Onions 4 comes up “404” and the forums suggest another dozen or so layers of stuff you can install in its place.

Still helpful and sympathetic, your friend says:

― Alternatively, you can start a Neanderthal 3 process within an Urrk channel running on top of your Dayvid stack in Aghast, and if you can talk to that with a Gennifur script—Hey! How about building a UNIX box for that from scratch? … And why were we doing this anyway?

― We were gonna use CobraSpit 17.6 to do some cool stuff really quickly.

― Oh, nobody uses that anymore. You should try PixieRust 8.0. It’s like Banana 3 but better.

― How does that work?

― Well, you need to install a SeaShell environment first, but that only works if you have Coral Wreath 9, so start by creating a Babylon 5.0 sandbox inside a BurritoCircus virtualizer running on Celery. You need Dwindle 3 for that though, which is really just an Ohmlette 5.9 instance in a fryingPeterPan shell, so it’s no big deal if you’re running ubuntu.

― Ah… I was on Windows, remember?

― Well, then why don’t you just install Screech 5?!

― Uhmm, that’s what we were trying to do in the first place.

 

 

[Look, it’s really simple… or, “Can we just go back to FORTRAN on VAX/VMS?”]

Teaching machine learning within different fields

Everyone is talking about machine learning (ML) these days. They usually call it “machine learning and artificial intelligence” and I keep wondering what exactly they mean by each term.

It seems the term “artificial intelligence” has shaken off its negative connotations from back when it meant top-down systems (as opposed to the superior bottom-up “computational intelligence” that most of today’s so-called AI actually uses) and has come to mean what cybernetics once was: robotics, machine learning, embedded systems, decision-making, visualization, control, etc., all in one.

Now that ML is important to so many industries, application areas, and fields, it is taught in many types of academic departments. We approach machine learning differently in ECE, in CS, in business schools, in mechanical engineering, and in math and statistics programs. The granularity of focus varies, with math and CS taking the most detailed view, followed by ECE and ME departments, followed by the highest-level applied version in business schools, and with Statistics covering both ends.

In management, students need to be able to understand the potential of machine learning and be able to use it toward management or business goals, but do not have to know how it works under the hood, how to implement it themselves, or how to prove the theorems behind it.

In computer science, students need to know the performance measures (and results) of different ways to implement end-to-end machine learning, and they need to be able to do so on their own with a thorough understanding of the technical infrastructure. (If what I have observed is generalizable, they also tend to be more interested in virtual and augmented reality, artificial life, and other visualization and user-experience aspects of AI.)

In math, students and graduates really need to understand what’s under the hood. They need to be able to prove the theorems and develop new ones. It is the theorems that lead to powerful new techniques.

In computer engineering, students also need to know how it all works under the hood, and have some experience implementing some of it, but don’t have to be able to develop the most efficient implementations unless they are targeting embedded systems. In either case, though, it is important to understand the concepts, the limitations, and the pros and cons, as well as to be able to carry out applications. Engineers have to understand why there is such a thing as PAC (probably approximately correct) learning, what the curse of dimensionality is and what it implies for how one does and does not approach a problem, what the NFL (no-free-lunch) theorem is and how that should condition one’s responses to claims of a single greatest algorithm, and what the history and background of this family of techniques are really like. These things matter because engineers should not expect to be plugging-and-playing cookie-cutter algorithms from ready-made libraries. That’s being an operator of an app, not being an engineer. The engineer should be able to see the trade-offs, plan for them, and take them into account when designing the optimal approach to solving each problem. That requires understanding parameters and structures, and again the history.
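The curse of dimensionality, for instance, can be seen in a few lines. The sketch below (dimension counts, sample size, and seed are my own arbitrary choices) shows distance concentration: as dimension grows, the distances of random points from the origin bunch together, so “nearest” and “farthest” neighbors become nearly indistinguishable, which is exactly what undermines naive distance-based methods.

```python
import math
import random

def spread_ratio(dim, n=200, seed=0):
    """Relative spread (max - min) / min of the distances from the origin
    for n random points in the unit hypercube of the given dimension."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    dists = [math.sqrt(sum(rng.random() ** 2 for _ in range(dim)))
             for _ in range(n)]
    return (max(dists) - min(dists)) / min(dists)

# In high dimensions the relative spread collapses: all points look
# roughly equidistant, so "nearest neighbor" loses its meaning.
assert spread_ratio(2) > spread_ratio(500)
```

An engineer who has internalized this knows why, say, a k-nearest-neighbors model that works beautifully on 5 features can fall apart on 5,000.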

Today, the field of ‘Neural Networks’ is popular and powerful. That was not always the case. It has been the case two other times in the past. Each time, perhaps like an overextended empire, the edifice of artificial neurons came down (though only to come up stronger some years later).

When I entered the field, with an almost religious belief in neural networks, they were quite uncool. The wisdom among graduate students seemed to be that neural nets were outdated, that we had SVMs now, and with the latter machine learning was solved forever. (This reminds me of the famous, and likely apocryphal, patent-office declaration of the late 1800s that everything that could be invented had been invented.) Fortunately, I have always benefited from doing whatever was unpopular, so I stuck to my neural nets, fuzzy systems, evolutionary algorithms, and an obsession with Bayes’ rule while others whizzed by on their SVM dissertations. (SVMs are still awesome, but the thing that has set the world on fire is neural nets again.)

One of the other debates raging, at least in my academic environment at the time, was about “ways of knowing.” I have since come to think that science is not a way of knowing. It never was, though societies thought so at first (and many still think so). Science is a way of incrementally increasing confidence in the face of uncertainty.

I bring this up because machine learning, likewise, never promised to have the right answer every time. Machine learning is all about uncertainty; it thrives on uncertainty. It’s built on the promise of PAC learning; i.e., it promises to be only slightly wrong and to be so only most of the time. The hype today is making ML seem like some magical panacea for all business, scientific, medical, and social problems. For better or worse, it’s only another technological breakthrough in our centuries-long adventure of making our lives safer and easier. (I’m not saying we haven’t done plenty of wrongs in that process—we have—but no one who owns a pair of glasses, a laptop, a ball-point pen, a digital piano, a smart phone, or a home-security system should be able to fail to see the good that technology has done for humankind.)
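That promise can be stated compactly. In the standard textbook form (for a finite hypothesis class in the realizable setting; the symbols are the usual ones, not anything specific to this post), “only slightly wrong, only most of the time” becomes:

```latex
% With probability at least 1 - \delta ("most of the time"),
% the learned hypothesis h errs at most \epsilon ("only slightly wrong"),
\Pr\!\big[\,\mathrm{err}(h) \le \epsilon\,\big] \;\ge\; 1 - \delta,
% provided the number of training samples m satisfies
m \;\ge\; \frac{1}{\epsilon}\left(\ln |H| + \ln \frac{1}{\delta}\right),
```

where $m$ is the number of training samples and $|H|$ is the size of the hypothesis class. Both tolerance parameters are chosen by us, and neither can be set to zero at any finite sample size: uncertainty is in the contract.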

I left the place of the field of Statistics in machine learning until the end. They are the true owners of machine learning. We engineering, business, and CS people are leasing property on their philosophical (not real) estate.

 

[Figure: BPM graph of Herbie Hancock’s “Chameleon,” from the Android app liveBPM (v. 1.2.0) by Daniel Bach]

Listening to music seems easy.

Listening to music seems easy; it even appears like a passive task.

Listening, however, is not the same as hearing. In listening, i.e., attending, we add cognition to perception. The cognition of musical structures, cultural meanings, conventions, and even of the most fundamental elements themselves such as pitch or rhythm turns out to be a complex cognitive task. We know this is so because getting our cutting-edge technology to understand music with all its subtleties and its cultural contexts has proven, so far, to be impossible.

Within small fractions of a second, humans can reach conclusions about musical audio that are beyond the abilities of the most advanced algorithms.

For example, a trained or experienced musician (or even non-musician listener) can differentiate computer-generated and human-performed instruments in almost any musical input, even in the presence of dozens of other instruments sounding simultaneously.

In a rather different case, humans can maintain time-organizational internal representations of music while the tempo of a recording or performance continuously changes. A classic example is the jazz standard Chameleon by Herbie Hancock off the album ‘HEADHUNTERS’. The recording never retains any one tempo, following an up-and-down contour and mostly getting faster. Because tempo recognition is a prerequisite to other music-perception tasks like meter induction and onset detection, this type of behavior presents a significant challenge to signal-processing and machine-learning algorithms but generally poses no difficulty to human perception.
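To make the contrast concrete, here is a bare-bones sketch of automated tempo induction (this is not the method behind liveBPM, whose internals I don’t know; it is just the common autocorrelation-of-onset-envelope approach, and the frame rate, BPM range, and synthetic click track are my own assumptions):

```python
import math

def estimate_bpm(onset_env, sr_env, bpm_range=(90, 180)):
    """Estimate one global tempo by picking the beat-period lag that
    maximizes the autocorrelation of an onset-strength envelope sampled
    at sr_env frames per second. The restricted BPM range sidesteps the
    usual octave (half-/double-tempo) ambiguity."""
    n = len(onset_env)
    mean = sum(onset_env) / n
    x = [v - mean for v in onset_env]           # zero-mean envelope
    min_lag = int(sr_env * 60.0 / bpm_range[1])  # fastest tempo -> shortest lag
    max_lag = int(sr_env * 60.0 / bpm_range[0])  # slowest tempo -> longest lag
    best_lag, best_score = None, -math.inf
    for lag in range(min_lag, max_lag + 1):
        score = sum(x[i] * x[i - lag] for i in range(lag, n)) / (n - lag)
        if score > best_score:
            best_lag, best_score = lag, score
    return 60.0 * sr_env / best_lag

# Synthetic onset envelope: a click every 0.5 s (120 BPM) at 100 frames/s.
sr = 100
env = [1.0 if i % 50 == 0 else 0.0 for i in range(10 * sr)]
print(estimate_bpm(env, sr))  # → 120.0
```

Note what the sketch assumes: one global tempo over the whole excerpt. A recording like Chameleon, whose tempo drifts continuously, violates that assumption from the first bar, which is precisely why such algorithms struggle where human listeners do not.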

Another example is the recognition of vastly different cover versions of songs: A person familiar with a song can recognize within a few notes a cover version of that song done in another genre, at a different tempo, by another singer, and with different instrumentation.

Each of these is a task well beyond the machine-learning techniques that are exhibiting remarkable successes in visual recognition, where the main challenge, invariance, is a lesser obstacle than the abstractness of music and its seemingly arbitrary meanings and structures.

Consider the following aspects of music cognition.

  • inferring a key (or a change of key) from very few notes
  • identifying a latent underlying pulse when it is completely obscured by syncopation [Tal et al., Missing Pulse]
  • effortlessly tracking key changes, tempo changes, and meter changes
  • instantly separating and identifying instruments even in performances with many-voice polyphony (as in Dixieland Jazz, Big-Band Jazz, Baroque and Classical European court music, Progressive Rock, folkloric Rumba, and Hindustani and Carnatic classical music)

These and many other forms of highly polyphonic, polyrhythmic, or cross-rhythmic music continue to present challenges to automated algorithms. Successful examples of automated tempo or meter induction, onset detection, source separation, key detection, and the like all work under tight limitations on the types of inputs. Even for a single such task, such as source separation, a universally applicable algorithm does not seem to exist. (There is some commercial software that appears to do these tasks universally, but because proprietary programs do not provide sufficiently detailed outputs, whether they really can perform all these functions, or whether they perform one function in enough detail to suffice for studio use, is uncertain. One such suite can identify and separate every individual note from any recording, but it does not perform source separation into per-instrument streams, and it presents its output in a form not conducive to analysis in rhythmic, harmonic, melodic, or formal terms, and not in a form analogous to human cognitive processing of music.)

Not only does universal music analysis remain an unsolved problem, but also most of the world’s technological effort goes toward European folk music, European classical music, and (international) popular music. The goal of my research and my lab (Lab BBBB: Beats, Beats, Bayes, and the Brain) is to develop systems for culturally sensitive and culturally informed music analysis, music coaching, automated accompaniment, music recommendation, and algorithmic composition, and to do so for popular music styles from the Global South that are not on the industry’s radar.

Since the human nervous system is able to complete musical-analysis tasks under almost any set of circumstances, in multiple cultural and cross-cultural settings, with varying levels of noise and interference, the human brain is still superior to the highest-level technology we have developed. Hence, Lab BBBB takes inspiration and direct insight from human neural processing of audio and music to solve culturally specific cognitive problems in music analysis, and to use this context to further our understanding of neuroscience and machine learning.

The long-term goal of our research effort is a feedback cycle:

  1. Neuroscience (in simulation and with human subjects at our collaborators’ sites) informs both music information retrieval and research into neural-network structures (machine learning). We are initially doing this by investigating the role of rhythm priming in Parkinson’s (rhythm–motor interaction) and in grammar-learning performance (rhythm–language interaction) in the basal ganglia. We hope to then replicate in simulation the effects that have been observed with people, verify our models, and use our modeling experience on other tasks that have not yet been demonstrated in human cases or that are too invasive or otherwise unacceptable.
  2. Work on machine learning informs neuroscience by narrowing down the range of investigation.
  3. Deep learning is also used to analyze musical audio using structures closer to those in the human brain than the filter-bank and matrix-decomposition methods typically used to analyze music.
  4. Music analysis informs cognitive neuroscience, we conjecture, as has been done in certain cases in the literature with nonlinear dynamics.
  5. Phenomena like entrainment and neural resonance in neurodynamics further inform the development of neural-network structures and data-subspace methods.
  6. These developments in machine learning move music information retrieval closer to human-like performance for culturally informed music analysis, music coaching, automated accompaniment, music recommendation, and algorithmic composition for multicultural intelligent music systems.

 

The subjunctive is scientific thinking built into the language.

The subjunctive draws a distinction between fact and possibility, between truths and wishes. The expression “if he were” (not “if he was”) is subjunctive; it intentionally sounds wrong (unless you’re used to it) to indicate that we’re talking about something hypothetical as opposed to something actual.
This is scientific thinking built into the language (a feature English shares with the Romance languages).

This is beautiful. Let’s hold onto it.

You are not disinterested.

Everyone: Stop saying ‘disinterested’. You apparently don’t know what it means. It doesn’t mean ‘uninterested’.

In fact, it means you’re truly interested. ‘Disinterested’ is when you care so deeply as to want to treat the situation objectively. It is a scientific term describing the effort to rid a study of the effects of subconscious biases.

Also, please don’t say ‘substantive’ when all you mean is ‘substantial’. They’re not the same thing. Thanks. (‘Substantial’ is a good word. You’re making it feel abandoned.)

Microsoft: Fix your use of the word ‘both’.
When comparing only two files, Windows says something like “Would you like to compare both files?” As opposed to what, just compare one, all by itself? (like the sound of one hand clapping?)
The word ‘both’ is used when the default is not that of two things. It emphasizes the two-ness to show that the twoness is special, unusual. But when the default is two, you say “the two” (as in “Would you like to compare the two files?”), not ‘both’, and DEFINITELY NOT ‘the both’. (It was cute when that one famous person said it once. It’s not cute anymore. Stop saying it.)
Back to ‘both’: A comparison has to involve two things, so ‘both’ (the special-case version of the word ‘two’) only makes sense if the two things are being compared to a third.
English is full of cool, meaningful nuances. I hope we stop getting rid of them.

Seriously, everyone: English is wonderful. Why are you destroying it?

 

PS: Same with “on the one hand”… We used to say “on one hand”, which makes sense: either one, any one, not a definite hand with a definite article.

Science-doing

There are (at least) two types of scientists: scientist-scientists and science-doers.

Both groups do essential, difficult, and demanding work that everyone, including the scientist-scientists, needs. The latter group (like the former) includes people who work in research hospitals, water-quality labs, soil-quality labs, linear accelerators, R&D labs of all kinds, and thousands of other places. They carry out the daily work of science with precision, care, and a lot of hard work. Yet, in the process of doing the doing of science, they typically do not get the luxury of stepping back, moving away from the details, starting over, and discovering the less mechanical, less operational connections among the physical sciences, the social sciences, the humanities, technology, business, mathematics, and statistics… especially the humanities and statistics.

I am not a good scientist, and that has given me the opportunity to step back, start over, do some things right this time, and more importantly, through a series of delightful coincidences, learn more about the meaning of science than about the day-to-day doing of it.[1] This began to happen during my Ph.D., but only some of the components of this experience were due to my Ph.D. studies. The others just happened to be there for me to stumble upon.

The sources of these discoveries took the form of two electrical-engineering professors, three philosophy professors, one music professor, one computer-science professor, some linguistics graduate students, and numerous philosophy, math, pre-med, and other undergrads. All of these people exposed me to ideas, ways of thinking, ways of questioning, and ways of teaching that were new to me.

As a result of their collective influence, my studies, and all my academic jobs from that period, I have come to think of science not merely as the wearing of lab coats and the carrying out of mathematically, mechanically, or otherwise challenging, complex tasks. I have come to think of science as the following of, for lack of a better expression, the scientific method, although by that I do not necessarily mean the grade-school inductive method with its half-dozen simple steps. I mean all the factors one has to take into account in order to investigate anything rigorously. These include:

- double-blinding (clinical or otherwise) to deal with confounding variables, experimenter effects, and other biases;
- setting up idiot checks in experimental protocols;
- varying one unknown at a time, or varying all unknowns with a factorial design;
- not assuming unjustified, convenient probability distributions;
- using the right statistics and statistical tests for the problem and data types at hand;
- correctly interpreting results, tests, and statistics;
- not chasing significance, and setting power targets or determining sample sizes in advance;
- using randomization and blocking in setting up an experiment, or the appropriate kind of random or stratified sampling in collecting data [see Box, Hunter, and Hunter’s Statistics for Experimenters for easy-to-understand examples];
- and the principles of accuracy, objectivity, skepticism, open-mindedness, and critical thinking, given on pp. 17 and 20 of Essentials of Psychology [third edition, Robert A. Baron and Michael J. Kalsher, Needham, MA: Allyn & Bacon, 2002].
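One of the items above, the factorial design, is concrete enough to sketch in code. The following is a minimal Python illustration, not drawn from any of the books cited here (the function name and the example factors are invented for illustration), of enumerating a full factorial design and then randomizing the run order, the latter being one simple guard against time-order effects:

```python
import itertools
import random

def full_factorial(factors, seed=0):
    """Enumerate every combination of factor levels (a full factorial
    design), then shuffle the run order so that drift over time does not
    systematically favor any one factor level."""
    names = list(factors)
    runs = [dict(zip(names, combo))
            for combo in itertools.product(*factors.values())]
    random.Random(seed).shuffle(runs)  # randomized run order
    return runs

# Hypothetical example: 3 factors at 2 levels each -> 2 x 2 x 2 = 8 runs
design = full_factorial({
    "temperature_C": [60, 80],
    "catalyst": ["A", "B"],
    "stir_rpm": [100, 200],
})
print(len(design))  # 8 runs, every combination appearing exactly once
```

Because every combination of levels appears, the design lets one estimate not just the main effect of each factor but also their interactions, which varying one unknown at a time cannot do.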

These two books, along with Hastie, Tibshirani, and Friedman’s The Elements of Statistical Learning and a few other sources, mostly heavily cited papers on the misuses of statistics, have formed the basis of my view of science. This is why I think science-doing is not necessarily the same thing as being a scientist. In a section called ‘On being a scientist’ in a chapter titled ‘Methodology Wars’, the neuroscientist Fost explains how it’s possible, although not necessarily common, to be on “scientific autopilot” (p. 209) because of the way undergraduate education focuses on science facts and methods[2] over scientific thinking and the way graduate training and faculty life emphasize administration, supervision, managerial oversight, grant-writing, and so on (pp. 208–9). All this leaves a brief graduate or post-doc period in most careers for deep thinking and direct hands-on design of experiments before the mechanical execution and the overwhelming burdens of administration kick in. I am not writing this to criticize those who do what they have to do to further scientific inquiry but to celebrate those who, in the midst of that, find the mental space to continue to be critical, skeptical questioners of methods, research questions, hypotheses, and experimental designs. (And there are many of those. It is just not as automatic as the public seems to think it is, i.e., achieved by getting a degree and putting on a white coat.)

 

Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, George E. P. Box, William G. Hunter, and J. Stuart Hunter, New York, NY: John Wiley & Sons, Inc., 1978 (0-471-09315-7)

Essentials of Psychology, third edition, Robert A. Baron and Michael J. Kalsher, Needham, MA: Allyn & Bacon, A Pearson Education Company, 2002

The Elements of Statistical Learning: Data Mining, Inference, and Prediction, second edition, Trevor Hastie, Robert Tibshirani, and Jerome Friedman, New York, NY: Springer-Verlag, 2009 (978-0-387-84858-7 and 978-0-387-84857-0)

If Not God, Then What?: Neuroscience, Aesthetics, and the Origins of the Transcendent, Joshua Fost, Clearhead Studios, Inc., 2007 (978-0-6151-6106-8)

[1] Granted, a better path would be the more typical one of working as a science-doer scientist for thirty years, accumulating a visceral set of insights, and moving into the fancier stuff due to an accumulation of experience and wisdom. However, as an educator, I did not have another thirty years to spend working on getting a gut feeling for why it is not such a good idea to (always) rely on a gut feeling. I paid a price, too. I realize I often fail to follow the unwritten rules of social and technical success in research when working on my own research, and I spend more time than I perhaps should on understanding what others have done. Still, I am also glad that I found so much meaning so early on.

[2] In one of my previous academic positions, I was on a very active subcommittee that designed critical-thinking assessments for science, math, and engineering classes with faculty from chemistry, biology, math, and engineering backgrounds. We talked often about the difference between teaching scientific facts and teaching scientific thinking. Among other things, we ended up having the university remove a medical-terminology class from the list of courses that counted as satisfying a science requirement in general studies.