Look, it’s really simple…

― Dude, did you see all this stuff people are doing with CobraSpit 17.6?!!

― Yeah, I’ve been wanting to do that for a while, but I had some trouble installing it. I went to the CobraSpit download site and it said:

If you’re using [some OS], follow these instructions

If you’re using [another OS], follow these instructions.

If you’re using Windows, please switch to Linux. We can’t be bothered.

― So I went on a bunch of forums, you know, pancakexchange and stuff, and finally found someone who said: “If you have to use Windows, install the Windows versions of the following tools native to Linux. [list of tools]”

― So, I installed everything, but SmokeRrring 4.2 conflicted with Squeek, and I found out that’s because it can only work with Squeek 3, which is completely different from Squeek 2. However, Rootabaga only works with Squeek 2 unless you install PPa (pron. Papua), which I did. Now all I had to do was uninstall Squeek 2, but then:

“If your computer ever had a Squeek 2 installation or has even been in the same ZIP code as another machine with Squeek 2, you should set fire to your machine and switch to Linux. If you insist on using Windows, you may be able to get around the Squeek-2-vs.-3 problem by using Screech 5. Screech 5 is for Ubuntu only, but you can install CrumblyCake 3 to make it work in Windows.”

(An hour later) “CrumblyCake 3 only works with Windows 7. If you have Windows 10, install WhippedCream 8.1 and Whisk 9.2 running inside Battenberg 4 in your CatUnicornLizard-8R or oRCAsCORPIONhAWKiBEXtIGER environment.”

So you look these up and (after 25 minutes of reading puns on “ate” and “eight”) find out that CatUnicornLizard-8R [CUL8R] has been bought by a corporation and incorporated into their “solution”, which costs $12,000/year (and more if you want support).

orCAscorPIONhaWKibEXtiGER is still open-source, but to run on Windows, it needs TurnTable 17.3 running on top of Onions 4. The website for Onions 4 comes up “404” and the forums suggest another dozen or so layers of stuff you can install in its place.

Still helpful and sympathetic, your friend says:

― Alternatively, you can start a Neanderthal 3 process within an Urrk channel running on top of your Dayvid stack in Aghast, and if you can talk to that with a Gennifur script—Hey! How about building a UNIX box for that from scratch? … And why were we doing this anyway?

― We were gonna use CobraSpit 17.6 to do some cool stuff really quickly.

― Oh, nobody uses that anymore. You should try PixieRust 8.0. It’s like Banana 3 but better.

― How does that work?

― Well, you need to install a SeaShell environment first, but that only works if you have Coral Wreath 9, so start by creating a Babylon 5.0 sandbox inside a BurritoCircus virtualizer running on Celery. You need Dwindle 3 for that, though, which is really just an Ohmlette 5.9 instance in a fryingPeterPan shell, so it’s no big deal if you’re running Ubuntu.

― Ah… I was on Windows, remember?

― Well, then why don’t you just install Screech 5?!

― Uhmm, that’s what we were trying to do in the first place.

 

 

[Look, it’s really simple… or, “Can we just go back to FORTRAN on VAX/VMS?”]

The subjunctive is scientific thinking built into the language.

The subjunctive draws a distinction between fact and possibility, between truths and wishes. The expression “if he were” (not “if he was”) is subjunctive; it intentionally sounds wrong (unless you’re used to it) to indicate that we’re talking about something hypothetical as opposed to something actual. This is scientific thinking built into the language (a mood the Romance languages preserve even more extensively than English does).

This is beautiful. Let’s hold onto it.

Science-doing

There are (at least) two types of scientists: scientist-scientists and science-doers.

Both groups do essential, difficult, and demanding work that everyone, including the scientist-scientists, needs. The latter group (like the former) includes people who work in research hospitals, water-quality labs, soil-quality labs, linear accelerators, R&D labs of all kinds, and thousands of other places. They carry out the daily work of science with precision, care, and a lot of hard work. Yet in the process of doing the doing of science, they typically do not get the luxury of stepping back, moving away from the details, starting over, and discovering the less mechanical, less operational connections among the physical sciences, the social sciences, the humanities, technology, business, mathematics, and statistics… especially the humanities and statistics.

I am not a good scientist, and that has given me the opportunity to step back, start over, do some things right this time, and more importantly, through a series of delightful coincidences, learn more about the meaning of science than about the day-to-day doing of it.[1] This began to happen during my Ph.D., but only some of the components of this experience were due to my Ph.D. studies. The others just happened to be there for me to stumble upon.

The sources of these discoveries took the form of two electrical-engineering professors, three philosophy professors, one music professor, one computer-science professor, some linguistics graduate students, and numerous philosophy, math, pre-med, and other undergrads. All of these people exposed me to ideas, ways of thinking, ways of questioning, and ways of teaching that were new to me.

As a result of their collective influence, my studies, and all my academic jobs from that period, I have come to think of science not merely as the wearing of lab coats and the carrying out of mathematically, mechanically, or otherwise challenging tasks. I have come to think of science as the following of, for lack of a better expression, the scientific method, although by that I do not necessarily mean the grade-school inductive method with its half-dozen simple steps. I mean all the factors one has to take into account in order to investigate anything rigorously. These include double-blinding (whether clinical or otherwise, to deal with confounding variables, experimenter effects, and other biases); setting up idiot checks in experimental protocols; varying one unknown at a time (or varying all unknowns with a factorial design); not assuming unjustified, convenient probability distributions; using the right statistics and statistical tests for the problem and data types at hand; correctly interpreting results, tests, and statistics; not chasing significance; setting power targets or determining sample sizes in advance; using randomization and blocking in setting up an experiment, or the appropriate level of random or stratified sampling in collecting data [see Box, Hunter, and Hunter’s Statistics for Experimenters for easy-to-understand examples]; and the principles of accuracy, objectivity, skepticism, open-mindedness, and critical thinking. The latter set of principles is given on pp. 17 and 20 of Essentials of Psychology [third edition, Robert A. Baron and Michael J. Kalsher, Needham, MA: Allyn & Bacon, 2002].
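To make one item in that list concrete (setting power targets and determining sample sizes in advance, rather than chasing significance afterward), here is a minimal sketch in Python using statsmodels; the effect size, alpha, and power target are conventional placeholders, not recommendations:

```python
# Minimal a-priori power analysis: decide the sample size *before*
# collecting any data. The numbers are conventional placeholders.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,          # Cohen's d we hope to detect ("medium")
    alpha=0.05,               # significance level, fixed in advance
    power=0.80,               # chance of detecting the effect if real
    alternative="two-sided",
)
print(f"Plan for about {n_per_group:.0f} subjects per group")  # ~64
```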

These two books, along with Hastie, Tibshirani, and Friedman’s The Elements of Statistical Learning and a few other sources (mostly heavily cited papers on the misuses of statistics), have formed the basis of my view of science. This is why I think science-doing is not necessarily the same thing as being a scientist. In a section called ‘On being a scientist’ in a chapter titled ‘Methodology Wars’, the neuroscientist Fost explains how it is possible, although not necessarily common, to be on “scientific autopilot” (p. 209) because of the way undergraduate education focuses on science facts and methods[2] over scientific thinking and the way graduate training and faculty life emphasize administration, supervision, managerial oversight, grant-writing, and so on (pp. 208–9). All this leaves a brief graduate or post-doc period in most careers for deep thinking and direct, hands-on design of experiments before the mechanical execution and the overwhelming burdens of administration kick in. I am not writing this to criticize those who do what they have to do to further scientific inquiry but to celebrate those who, in the midst of all that, find the mental space to remain critical, skeptical questioners of methods, research questions, hypotheses, and experimental designs. (And there are many of those. It is just not as automatic as the public seems to think it is, i.e., a matter of getting a degree and putting on a white coat.)

 

Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, George E. P. Box, William G. Hunter, and J. Stuart Hunter, New York, NY: John Wiley & Sons, Inc., 1978 (0-471-09315-7)

Essentials of Psychology, third edition, Robert A. Baron and Michael J. Kalsher, Needham, MA: Allyn & Bacon, A Pearson Education Company, 2002

The Elements of Statistical Learning: Data Mining, Inference, and Prediction, second edition, Trevor Hastie, Robert Tibshirani, and Jerome Friedman, New York, NY: Springer-Verlag, 2009 (978-0-387-84858-7 and 978-0-387-84857-0)

If Not God, Then What?: Neuroscience, Aesthetics, and the Origins of the Transcendent, Joshua Fost, Clearhead Studios, Inc., 2007 (978-0-6151-6106-8)

[1] Granted, a better path would be the more typical one of working as a science-doer scientist for thirty years, accumulating a visceral set of insights, and moving into the fancier stuff due to an accumulation of experience and wisdom. However, as an educator, I did not have another thirty years to spend working on getting a gut feeling for why it is not such a good idea to (always) rely on a gut feeling. I paid a price, too. I realize I often fail to follow the unwritten rules of social and technical success in research when working on my own research, and I spend more time than I perhaps should on understanding what others have done. Still, I am also glad that I found so much meaning so early on.

[2] In one of my previous academic positions, I was on a very active subcommittee that designed critical-thinking assessments for science, math, and engineering classes with faculty from chemistry, biology, math, and engineering backgrounds. We talked often about the difference between teaching scientific facts and teaching scientific thinking. Among other things, we ended up having the university remove a medical-terminology class from the list of courses that counted as satisfying a science requirement in general studies.

This is a sci-fi story.

The icers were out in force that night. Joey really didn’t like running into them. It wouldn’t have been dangerous if he hadn’t worn his luker leather jacket when he went out alone, but unless he was a full-time luker—and flaunting it—he felt like a traitor. He felt like he wasn’t “real” or wasn’t noteworthy enough to be out on the streets. He also didn’t want to run into any girls while not visually declaring allegiance to his chosen subculture, so he wore its identifiers even though it meant he’d likely get beaten bloody by icers or somebody else.

The icers were truly extreme, Joey thought. They claimed they would rather die than drink any but the most extreme ice-cold tap water. Lukers, like Joey, weren’t so picky, and indeed preferred avoiding brain freeze.

Water was no longer the simple commodity previous generations had taken for granted. It wasn’t exactly unobtainable, but most people had to save to get their monthly allotment, which was a very small amount, or perform community service for extra water, which would then be delivered automatically to their approved smarthomes. Now that games, movies, music, fashion, food, cars, drones, jetpacks, body alterations, and everything else a youth could want were readily available through picofabrication, automation, and biotech, it was, ironically, the most basic substance of life, water, that became scarce—because desalination remained expensive—and thus became the marker of one’s identity as a young person: their subcultural in-group.

Aside from the icers and lukers, there were half a dozen other fine-grained varieties of tap-water subcultures (like the boilers and the JCs—“just cold”).

Mostly high-school kids, and mostly idle due to the abundance of free scavenged (recycled) energy for their picoautomation and their neural implants, these youths roamed the streets in their picofabricated faux-leather jackets emblazoned with their subcultural affiliation, and picked fights with members of other groups.

After the first few years of water shortage, this expression of identity through the one scarce resource that was critical for survival began to expand. Through a naturally stochastic clustering process, some hairstyles and preferences for clothes or shoes became associated with particular groups.

It just happened the way it did. There is no reason icers should prefer fur-lined boots and Christmas sweaters. If anything, one would expect the opposite. Yet, they wear them even in the summer… in the 130 °F globally warmed summers of Cascadia. That’s how you know you’ve got a genuine subculture: The clothing has got to be uncomfortable; it’s gotta require sacrifice.

The lukers likewise somehow ended up all having to wear Havaianas, 20th-century motorcycle helmets over long green hair, tank tops (what the British call “vests”), bandannas tied right at the elbow, one on each arm, and pajama pants with teddy bears sewn onto them. (None of them knew that this last little detail originated with a bassist in a combo of ancient “rock” music from back when music was made by people playing instruments rather than by autonomous conscious AI units that wrote every kind of music straight into digital encoding.) The more teddy bears one’s pants had on them, the greater one’s status as a luker.

Joey had found the time to get his automation to sew 37 onto his favorite pajama pants and another 24 on a different pair. The fact that he consequently couldn’t run was a big part of why the icers picked on him so much. They, on the other hand, spent most of their time getting their picobots to learn to assemble themselves into fists and feet for delivering punches and kicks from a distance.

So, Joey called up a game in his neural implant as he and his 37 teddy bears set out onto the streets of Seaportouver, bracing themselves—not so much the teddy bears but Joey’s bio-body and all his affiliated picobots and neurally linked semi-autonomous genetic floaters—against the onslaught of icer attacks and against old people who looked disdainfully at his awkward teddy-bear-encumbered gait and transmitted unsolicited neuro-advice that clogged up his game for an entire interminable microsecond, in search of a thimbleful of lukewarm water.

 

This mini sci-fi story is an attempt to draw a parallel between how ridiculous and unlikely such tap-water-based subcultures of street-fighting youth might seem to us, and how the music-based subcultures of my youth in the ’80s must seem to today’s youth.

Music, after all, is like water now: You turn the tap, and it pours out—out of YouTube, Spotify, Pandora, Slacker, or SoundCloud, and in a sense, also out of GarageBand, FruityLoops, Acid, and myriad other tools for generating music from loops. A few dozen people in a few offices in LA may make those loops—they’re like the people working the dams and the people who run the municipal water bureau or whatever. They supply the water that we take for granted, and it just flows out of the tap, not requiring any thought or effort on our part about how it got there or how much of it there might be. Music today works the same way. You exchange memory cards or streaming playlists; you download free software that allows you to drag and drop loops and which makes sure they are in the same key and tempo. It’s about as complicated as making lemonade. Why would such a thing have any relation to one’s identity and individuality?

In contrast, when I was young, I had to save money for a year and still beg my parents for a long-playing record. I could also occasionally buy some cheap tapes or record songs off the radio (almost always with the beginning cut off and with a DJ talking over the end) onto cheap low-fi cassettes that had more hiss than hi-hat. My first compact disc, a birthday present from a wealthy relative, was like an alien artifact. It still looks a bit magical to me… so small and shiny. Today, I hear they’re referred to as “coasters” because… why bother putting music on a recording medium when it’s free and ubiquitous? 

Subculture-as-identity-marker has disappeared except among the old. (How old is Iggy today, or the guys from The Clash?) Young people today dress in combinations of the “uniforms” of ’50s, ’60s, ’70s, ’80s, and ’90s subcultures without having any interest in the sociopolitics or music of those subcultures. The last three times I talked to a (seemingly) fellow goth or punk rocker, they reacted with mild repulsion at the suggestion that they might listen to such music.

Expressing allegiance to a musical subculture must seem as silly to today’s youth (say, through age 30 or so) as expressing allegiance to a temperature of water would seem to anyone.

 

Zeno’s thermometers?

A friend just told me about the xkcd idea for the “Felsius scale”, which is the arithmetic mean of the Fahrenheit and centigrade (Celsius) scales. Naturally, my first thought was that this was a funny but pointless idea, since it discards the advantages of the centigrade scale. (The scale was renamed ‘Celsius’, but I’m using the old name to emphasize the 0-to-100 advantage.) A better discussion is found here: http://www.explainxkcd.com/wiki/index.php/1923:_Felsius

My friend, however, suggested that it was a step in the right direction.

If so, it isn’t enough. If this idea were to take hold, we would need another such step in the right direction, perhaps to be called the “Eelsius”, which would take us another 50% of the way to Celsius, and eventually another halfway jump to the “Delsius”, giving us (aside from running out of letters between ‘f’ and ‘c’) a nice little Zeno’s paradox of temperature scales that asymptotically approach the logical centigrade scale.
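For the record, here is the arithmetic. xkcd defines the Felsius as the plain average of the two scales; the halving sequence that follows is only my formalization of the joke:

$$ \text{Felsius} = \frac{F + C}{2} = \frac{(1.8C + 32) + C}{2} = 1.4C + 16 $$

More generally, the scale reached after $n$ halvings of the Fahrenheit–centigrade gap is

$$ S_n = C + \frac{F - C}{2^n} = C + \frac{0.8C + 32}{2^n}, $$

so $S_1$ is the Felsius, $S_2$ the Eelsius, $S_3$ the Delsius, and $\lim_{n \to \infty} S_n = C$: each step covers exactly half the remaining distance, and no step ever arrives.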

Time Travel? (It comes down to causality’s unawareness.)

Forget the usually discussed paradoxes of time travel; how would one even know whether one had come back to the proper (intended) present?

Would one rely on clocks and calendars? Clocks move at different speeds (as explained by relativity). And how much should the calendar have changed while you were off in the past? Does the travel itself take any time? What if everyone else also time-traveled while you were off doing that? Who, or what, decides what the correct time is “back home”? (To say that it would take so much energy that only one person could time-travel or get close to the speed of light at a time would be an economic argument, not a physical one, so in a discussion of physics we must allow for the possibility that everyone is time-traveling concurrently—say, starting together but going off in different directions by different amounts.) Then who or what keeps so-called real time going at its regular pace? (It doesn’t even have one, thanks to relativity!)

What we mean by time travel—if it’s not kept track of via clocks—is that causality is unaware of the traveler’s appearance and disappearance. So, what happens to the air molecules where the traveler will suddenly appear? Are they pushed out, or do they somehow combine or coexist with the traveler’s? If they are pushed out, then how massive an object can time-travel? (At some point, we’d be moving so much air that it would cause catastrophic damage to someone or something. So the paradoxes extend to the legal and safety realms as well, although those are merely practical.)

Happy 65th birthday, Deacy!

Thank you for Back Chat, Cool Cat, You And I, Spread Your Wings, In Only Seven Days, Misfire, Who Needs You, Under Pressure, You’re My Best Friend, One Year Of Love, If You Can’t Beat Them, Need Your Loving Tonight, and Pain Is So Close To Pleasure, not to mention AOBTD, FWBF & IWTBF. (And, really, every Queen song!)

You really nailed it whenever you wrote one. By the end of my life, I would like to have written one song that’s like one of yours.

What happened to ‘Windows 9’?

I was talking to an old friend a couple of years ago, just before my band was going to play a show at a bar in Seattle. He’s been at Microsoft for quite a few years and was saying that one of Vernor Vinge’s old books is like a guidebook for a lot of MS people, in terms of what future technologies the higher-ups (perhaps) would like the company to contribute to or pioneer.

Okay, so, MS folks read sci-fi; no surprise there.

Now for the other half of the puzzle: I have been reading the second 2016 issue of THE SKEPTIC, and took some extra time to dig into the arguments for and against the (eventual, distant) possibility of uploading one’s entire consciousness onto silicon (or some other substrate), partly because another friend has already invested in having his (and his wife’s) brain frozen for eventual uploading.

The author of one of the articles from THE SKEPTIC synthesizes results and possibilities from quite a bit of current and recent work in neuroscience and neurobiology, with references to numerous scientific papers from journals such as Nature, Nature Methods, The Journal of Neuroscience, Proc. NAS, Nature Neuroscience, Frontiers in Psychology, Progress in Neurobiology, Neuroscience, Journal of Neural Engineering, etc. (You get the picture.)

He also references several books: some technical, some more along the lines of popular science, and one, a fiction book.

I ordered two of the three technical/scientific books online and found the fiction (sci-fi) book at Powell’s, so I started reading that one. It’s called Kiln People, and it’s by David Brin. It reads a little like a detective-story version of The Mind’s I: Fantasies and Reflections on Self and Soul, which is an excellent compilation of stories and essays meant to exercise the reader’s understanding of self, soul, consciousness, and the boundaries of such. It was compiled/written (“composed and arranged”) by Douglas Hofstadter and Daniel Dennett. The Brin book flows better, as it is pure fiction (although this is not to say The Mind’s I doesn’t flow; it’s a superbly crafted book).

And like most good sci-fi, Kiln People is well-informed fiction (like Diaspora by Greg Egan), and that’s why, I suppose, it was listed along with dozens of peer-reviewed neuroscience articles.

In any case, my reason for writing this small post is not neuroscience, or consciousness, or spirituality, or the human soul, but a wild conjecture about the naming of Windows 10. If Microsoft programmers, managers, executives, etc., like to read sci-fi, and some of them have read Vinge, it is not unlikely that out of that many people, some have read Kiln People by David Brin. In the edition I bought, on page 396, the narrator and main character lists people whose deaths have come at the hands of their own creation. The list contains Oedipus’ father, Baron Frankenstein, and “William Henry Gates and Windows ’09.” (The book is copyrighted 2002.)

Granted, there was no year-named “Windows ’09” (that I know of). However, the abbreviation “’09” (for the year) could be pronounced just like the numeral ‘9’, and I’m guessing some sci-fi fans at MS (in marketing, for example) may have read the book and not wanted to follow Windows 8 with a Windows 9, thereby dodging Brin’s fictional prediction or, better yet, sparing Brin the embarrassment of a prediction that did not come true.

Otherwise, why go ‘Windows 7’, ‘Windows 8’, ‘Windows 10’?

Auto-(in)correct: Emergent Laziness?

Is it Google? Is it LG? Or is it emergence?

I am leasing an LG tablet running Android to go with my phone service. I thought the large screen and the consequently larger keyboard would make my life easier. The first several days of use, however, have been unreasonably annoying. The salesperson had said that this device would be slaved to my LG Android cell phone, but my settings did not seem to carry over. What’s worse, no matter how much I dig through menu trees, I can’t find certain settings I’m looking for. For example, I may want autocorrect off, or I may not want the latest e-mail in my inbox to be previewed. (I prefer a bird’s-eye view of all the recent e-mails, packed as tightly as possible, and I can usually set this up very quickly and easily, but not on this tablet.) The reasons vary: I might be about to go teach a class in a few minutes and not want to think about the e-mail on some committee issue that arrived at just that moment, so I don’t want Gmail to parade it in front of me.

So, the settings seem to be very well hidden, or maybe not even available to the user anymore (because that has been the trend in computer-and-Internet technology: Make the user think less, and have less control; so-called intelligent software will decide all your preferences for you).

And perhaps the software can deduce (or, more likely, induce) your preferences as they were at a certain time under a certain set of circumstances, but human beings expect the freedom to change their minds. Software doesn’t seem to allow this.

Furthermore, crowd-sourcing is considered the ultimate intelligence. I know and understand the algorithms behind most of these ideas, and I totally agree that they are beautiful and awesome (and really fun). However, engineers, programmers, mathematicians, and other nerds (like me) finding something super-fun should not be how life is redesigned. The crowd-sourcing of spelling and automatic correction is leading us from artificial intelligence to natural laziness. My device wants to change “I’m” to “imma”. (Before you object that I’m also ignorant and don’t know to put a period inside the quotation marks, read my disclaimer about switching to British/logical punctuation.) Am I now forced to appear to have abandoned capitalization and picked up an excessively colloquial form of spelling? And if I had, then fine, but I haven’t.

It gets worse. The learning algorithm is not learning, at least not from me. The following has now happened with several phrases and words on this new tablet, and I’ve looked further into altering this setting, to no avail.

When I type “I will”, it automatically replaces it with “I silk”. If I backspace and type “I will” again, it replaces it again. And it doesn’t learn from my actions; I have patiently (and later on, a further dozen or so times, impatiently) retyped “I will” more than 30 times, only to watch Gmail running on my Android LG device switch it back to “I silk” immediately.[1]

Where did this come from? Is there a band called “I silk”? Is this a new phrase that’s in these days, and I haven’t been overhearing my students enough to know about it?

Or is it because, earlier that day, I tried to write “I seek to …”, where the ‘seek’ was autocorrected (for who knows what reason) to ‘silk’?
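Incidentally, the classic noisy-channel spelling corrector makes this failure mode easy to reproduce. The sketch below is a toy in the style of Peter Norvig’s well-known corrector, not LG’s or Google’s actual code, and the frequency table is invented: once the crowd-sourced counts rank ‘silk’ above ‘will’ and ‘seek’ among the edit-distance candidates, popularity beats sense on every keystroke, no matter how many times I retype the word.

```python
# Toy noisy-channel autocorrect in the style of Peter Norvig's spelling
# corrector. Purely illustrative: NOT the actual LG/Google algorithm,
# and the frequency table is invented for the example.
from collections import Counter

# Pretend crowd-sourced word frequencies. One skewed data dump is enough
# to rank "silk" above "will" and "seek".
CORPUS_COUNTS = Counter({"silk": 900_000, "will": 800_000, "seek": 50_000})

LETTERS = "abcdefghijklmnopqrstuvwxyz"

def edits1(word):
    """All strings one edit (delete, transpose, replace, insert) away."""
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = {L + R[1:] for L, R in splits if R}
    transposes = {L + R[1] + R[0] + R[2:] for L, R in splits if len(R) > 1}
    replaces = {L + c + R[1:] for L, R in splits if R for c in LETTERS}
    inserts = {L + c + R for L, R in splits for c in LETTERS}
    return deletes | transposes | replaces | inserts

def edits2(word):
    """All strings two edits away."""
    return {e2 for e1 in edits1(word) for e2 in edits1(e1)}

def correct(word):
    """Deliberately naive: pool every candidate within two edits, including
    the word actually typed, and pick the most frequent. Popularity wins
    even over a correctly typed word."""
    candidates = ({word} | edits1(word) | edits2(word)) & CORPUS_COUNTS.keys()
    return max(candidates, key=CORPUS_COUNTS.__getitem__, default=word)

print(correct("seek"))  # -> 'silk' (two edits away, but far more frequent)
print(correct("will"))  # -> 'silk' ("will" is known; frequency still wins)
```

(Norvig’s real corrector prefers known words at the smallest edit distance, which is exactly the safeguard the pooled version above throws away. The distance between “makes sense” and “merely popular” is one design decision.)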

And what happens when this behavior is pushed beyond e-mail on a tablet, and I’m not able (or allowed) to write either “I will” or “I seek” as I type a blog entry such as this on my laptop, or as I try to type an e-mail to explain what’s going wrong to Google’s tech support, or someone else’s tech support?

This really doesn’t make sense. Shouldn’t machine learning give us results that make sense? (That used to be the idea.) Now, perhaps, it’s just supposed to give us results that are popular or common. It seems we’re not building artificial intelligence; we’re building artificial commonality.

This is not a rant for elitism (which, anyway, is also a term of art in machine learning, in evolutionary algorithms). It’s about the loss of freedom of speech: the freedom to say what one is trying to say exactly the way one wants to say it. The capacity for clear, unequivocal communication is not something to be eliminated from the human experience; it is something to continue to strive for. Likewise, convenience over freedom (or over accuracy) is not a good choice of values. In the end, the person pushing the little buttons with letters marked on them will be held responsible for the content. Shouldn’t that person be in charge of what words appear when they push those little buttons? Shouldn’t we at least be able to turn off autocorrect, or have some control over when it applies?

This is being taken away, little by little. “Oh, it’s just a tablet.” … “Oh, it’s just an e-mail. Nobody expects it to be spelled correctly.” Pretty soon, no one will be able to spell anything correctly, even if they know how to, because their devices won’t allow them to have that little bit of control.

 

[1] Also, do not even try to write in a foreign language, or to mix English and Turkish in one sentence. In an early e-mail I wrote on this device, I had to repeat the letter ‘i’ (which appeared only once in the actual word) five times (for a total of six ‘i’s) to stop it auto-(in)correcting “geliyorum” to something like “Selma”. I had to type “geliiiiiiyorum”.

Making sentences…

Winter term has been crazy, although in a good way. I’m teaching six classes this term, all of which are going great (I have awesome students), though one is a new prep, so I haven’t had time to post here. But I recently found, on my phone, some silliness I had come up with while waiting for a train: making sentences by combining band names.

Here they are:

As Blood Runs Black The Refused Pierce The Veil

Tower of Power Was (Not Was) Built To Spill Ashes

Barenaked Ladies Poison The Well From First To Last

Bring Me The Horizon Within Temptation At The Drive-in

Blonde Redhead Of Montreal Cursed The Only Ones

(And, of course, a band name, all by itself: I Love You But I’ve Chosen Darkness)