On forgetting, for lack of a better word

Preamble

Getting lost in the moment is, at least sometimes, quite literal. A three-day festival, a house party, a conference, the first fuzzy days with a new romantic interest. I’m just there and then.

And then, some stimulus jolts me back from the moment and into the normal meta-level consciousness of everyday life.

The rest of the world exists. Shit.

Right, find your bearings, you have a home, you have a job. When was the deadline for that report? Should probably call my parents at some point. Do I even have anything in the fridge for breakfast or will I have to run to the store once I get back?

For a moment, I have very concretely forgotten about the rest of the world, only to be brought back by some fleeting thought I have no control over, or by a random external stimulus. If I think about it too much, it makes me feel like I’m adrift, only catching hold of something by chance, because the river just happens to have enough comfortably-shaped metaphorical rocks at frequent enough intervals.

I realize I might not be typical, but I feel like I’m typical enough and have met enough other, more typical people to say that feeling something like this is common. Many people report feelings of being adrift, of not knowing what to do in life, of coasting through the years.

And I have what I can only describe as a Lovecraftian fear about human cognition related to this kind of forgetting.

My existential dread comes from the feeling that much of adult human life might be like the kind of living-in-the-moment where you forget about the fact that you didn’t do the laundry and your partner’s birthday is in just two days, except in the grand scheme, on longer and shorter timescales, and with much more important things being pushed into oblivion. All of us in that sleep-deprived, drunken stupor of the second night at an all-weekend illegal rave, all the time.

I do not mean just “coasting through life, kind of landing an education/a job/a partner that is the most easily available”, although that is a part of it. I mean, on a deeper level, that I question how much memory or understanding plays into our lives instead of just stimulus-response, the environment being so well-built to nudge us forward that we never stop to realize that there never was a plan, rhyme or reason. Maybe sometime in history there was, but then something (cultural evolution, civilization, cognitive offloading) happened, and now we are here, now we are now.

I also do not mean “deterministic universe applied to the brain”, though of course my thoughts are compatible with that idea. You could have a deterministic universe where people deterministically act based on memories and reasoning, or something very different.

Brains built for forgetting

“Hmm”, I – an adult human being – thought, “I feel like there’s way less kids’ TV nowadays.”

“Oh, wait”, I – an idiot realizing his idiocy – thought a few seconds later.

Recognizing actual change in the world based on one’s experiences alone can be surprisingly hard: not only are we limited in what and how much we can experience in total, we’re also systematically limited by what stage of our lives we’re at. You might live in the same physical environment as a child and as an adult, but what you pay attention to, and what you can pay attention to, changes.

Skills are learned through practice and maintained by practice. Skills are also, in the end, a form of memory. Some memories (such as knowing how to swim or ride a bike) are “stickier” than others, in that it is very hard to forget them and/or very easy to relearn them after a long period without practice. However, many memories degrade with lack of practice. More importantly, even the sticky memories are not always active: I know how to ride a bike, but I rarely think about it. The memory becomes active in the right context, with the right stimulus, and without those it is simply somewhere in the background, unremembered.

What’s more, the stimuli and the contexts might not be under our control at all. We do not choose to age, and I would argue most of us do not really choose to go to school even as adults any more than we choose to get jobs or pay rent. No deep thought is really necessary, the environment does quite enough for us, and all of these pushes from the environment further change the environment and thus the stimuli that help us remember. Olfactory cues triggering vivid memories is a common occurrence, but we can’t always choose to go smell a specific thing, some things we can only smell at very specific times, and most likely we don’t even remember that a certain smell could trigger a certain memory. What if a much larger share of memory than we would normally think works like a completely involuntary reaction to the smell of a spring morning?

My pithy elevator-pitch view of human cognition is that we do not really know how to do something except when we are actually doing that thing, we do not actually understand something unless we are actively involved in some sort of sense-making of that thing, and we do not actually remember something unless we are in a situation where the memorized knowledge is important or relevant through some association, from the first-person perspective. When not made relevant or important by some aspect of the situation that we may not have control over, the memories are not accessible, meaning they’re as good as forgotten – and what makes something “relevant or important” is not defined by some Platonic rationality but by the first-person perspective through idiosyncratic associations. People are usually not used to using a specific memory or skill, no matter how general, in all situations, only in the specific ones that triggered it most consistently in the past. In short, context matters, and generalization is rare.

This is a very rough draft of even half a theory, but even as it is, I feel it helps make sense of several types of “human stupidity”, at least better than many theories about cognitive biases that rely on vague ideas of processing capacity, some kind of innate folk science, or the brain somehow suppressing information.

Why do many people not know how to draw a bicycle, regardless of how often they ride a bicycle? Because the actual form and mechanics of a bicycle are unimportant unless you are building or repairing a bicycle (and even then the bicycle is right in front of you, so you don’t really need to have it etched into your visual memory to aid the completely unrelated task of drawing).

Why do physics undergrads show the same kinds of mistakes in understanding gravity in relation to a thrown object as elementary school students? Because classroom physics is a whole different task from “normal everyday physics”: different context, different kinds of practice.

Why do physicists not do any better than social scientists in avoiding teleological thinking, when physicists have much more explicit practice of rejecting teleology? Because scientific thinking is hard even when you’re actually doing science, and these people are not tasked with doing science but answering silly questions out of context.

Why might recent Harvard grads or even faculty members be unable to correctly answer questions about the phases of the moon or the change of seasons? To be fair, it might be because their education is bad, but I would argue the deeper reason for this is that it is very-hard-to-almost-impossible to teach people to think in a scientific way consistently and quickly in any given situation, not just when taking a test or writing an article.

One could spend a whole day listing all the very funny stuff where psychologists or science popularizers show how humans seemingly act in contradictory ways: “you know/claim that X, yet you act as if Y”. The common thread that I believe is behind most (seeming) cognitive biases like this is that the contexts of X and Y are different.

Of course, this idea connects to less quirky everyday things as well. Despite having been children themselves, many adults seem to be completely clueless about the way a child’s mind or the social world of children works. My claim is simply that saying “it’s because people forget with age” would be imprecise: it’s not necessarily age but a lack of continued practice and experience of the life of a child (in addition to continued practice of being an adult and listening to all flavours of “experts” on children). And we have a constant lack of continued practice and experience in so many things we used to experience and thus practice all the time, because our lives move in phases. Living means forgetting.

A world built for forgetting

In a way, the idea of human lives being mostly driven by external stimuli that just happen to be here is just a depressing version of the idea of cognitive offloading – an involuntary, chaotic version of using the physical environment to relieve a burden on memory. The world just happens to be shaped in a way that is good enough to remind us of the relevant things often enough. Or, if you prefer another framework, the world affords going to work, paying rent, and calling your parents.

But cognitive offloading is good – even if it is the case that baseline humanity is quite ADHD-like, it seems like at least some of the problem is solved. So, what does the world remind us of, and what does it not? The second, much more speculative part of my vague fear around forgetting is a feeling that the world is getting worse for us in the whole reminding business, or at the very least that the nudge conveyor belt leads somewhere it shouldn’t.

I have met university students who did not know what “folders” on a computer are, which made explaining how to use certain kinds of software unexpectedly difficult. It’s not exactly a fresh observation that young people are not actually very good with computers, but these students could not have been more than five years younger than I was at the time, and I had learned basic navigation on a Windows computer seemingly by osmosis. The obvious explanation is that increasingly convenient tech has made it unnecessary for people to learn even the very superficial stuff I know about the inner workings of an OS (or, God forbid, the actual computer).

My point isn’t that young people being bad with computers is somehow a horrible thing: it’s that things move fast, and reminders of many things disappear faster than you would think, to be replaced by new ones.

We’re all online now. Across the years, people have worried about whether constantly increasing contact with computers that offer us quick rewards of almost any kind has led to increased rates of attention deficit disorders. Maybe, but to me it seems like even neurotypical people have all the potential to act like kids who need Ritalin. More than that, we seem to want to. Social media that supercharges gossiping, a news cycle where even ongoing events are simply forgotten once the next big thing hits – these would not exist without some sort of demand. “Slow journalism” would not be a subculture if the majority of people did not have a revealed preference for faster, cheaper, now-er.

“We have access to all the knowledge in the world literally in our hands and we use it to look at pictures of cats”, we joke, but it’s not really just that. We do not just have passive access to knowledge of the world: knowledge of the world is being selectively blasted at us at all hours. There is no shortage of reminders about all kinds of things, no shortage of things the world affords, no shortage of stimulus-response opportunities. But none of these reminders are very long-term in any sense, and coherent timelines may be hard to find even if one has the patience to go looking for them. Add to this the fact that archiving “born-digital” media is hard, and that despite claims of “everything on the internet being forever”, it’s becoming increasingly possible, as the web grows more centralized, to simply remove from it stuff that the majority of people know about. More affordances for stumbling in a never-ending now, more chances to forget individually and collectively.

I wonder if this was always the case. Maybe we were always kind of bad at remembering, it was just that the world-as-a-post-it-note functioned better than it does now, at least for some outcomes.

There’s a kind of conservative complaint about kids in the current year that goes something like this: people’s lives are way too sheltered and guided from above until they become “real” adults and enter the “real” world, at which point they break down and start demanding that the kind of pampered structure they used to have become the norm everywhere. However bad the kids are these days, I think it’s interesting that this complaint carries the implication that less external guidance would be good, so that people would learn to be independent. I will concede that it sounds like a plausible idea, but I’d also like to note that maybe people of previous generations in general just needed less planned, codified, explicit external guidance because implicit external guidance was often enough. Go to school, get a job, buy a house, things just kind of fall into place, one step flows into the next. I’m no economist, but I’ve heard rumours that going to school and getting a job might not be enough to buy a house anymore. Maybe the kids are so very not alright not because people are less independent, but because the (Western) world is no longer functioning so well when people simply proceed from one societal stimulus-response set to the next. Maybe we were always bad at the whole independence thing and always needed to be nudged, but it used to not matter as much.

Why bother?

Well, I think it’s interesting. Interesting, and at the same time boringly obvious. Other people have talked about similar ideas, but I haven’t seen anyone put the ideas together in exactly this way. I hope I can give someone a new perspective on things large and small.

I find it interesting how all these bits of cognitive science and everyday observation are out there, for anyone to see, and I still haven’t seen anyone express this exact kind of uneasiness about humanity.

I find it interesting how I feel like most of these ideas fit pretty well with the 4E umbrella of cognitive science, which I usually don’t really agree with. I find it interesting specifically because I often get the impression that 4E is presented as a kind of more “human” and “holistic” and less “cold” approach to cognition, but to me, taken to its logical conclusion, it paints a much grayer picture of humanity than extremely online cynical cognitivist evo-psych.

I find it interesting how so many internet-era worries, from loot-boxes in video games and “social media addiction” through easily-forgotten outrage media cycles to obesity and increases in attention issues all seem to point in the same general direction, yet nobody points at them and says “hey, maybe the behaviorists were right”.

Academic trust issues

I have become wary of experts. I doubt their claims more and more.

I don’t exactly fit the bill here. I’ve spent the last seven years of my life in higher education and am currently in the last stretch of my PhD: a person simply doesn’t waste that many years trying to gain “expert knowledge” without caring about it in some way. If I were an outside observer and had to make a bet, I’d be all in on me being the kind of person who goes around talking about how people should trust experts more, and how the anti-intellectual atmosphere of our society is worrying, but here we are. Coincidentally, I’ve become wary of those kinds of people too.

To make one thing clear before going any further, this post is not about climate change. My wariness is mostly about experts in the social sciences (especially my own academic fields of cognitive science and psychology) and in medicine and nutrition – the usual suspects for those who’ve followed any news on the reliability of science in the last decade or so. That said, the main reason I’m mostly excluding “harder” sciences is because I don’t know enough about the inner workings of the fields, not because I think they’re immune.

The issue with experts is that they (or we) are nerds.

As the famous comic summarized, the difference between a nerd and a geek is that nerds are interested in studying things academically, whereas geeks are simply very passionate about their hobbies, and the kinds of people who care about this distinction are called “dorks”. I propose a second distinction: nerds (academics) often focus on knowledge of facts, whereas geeks (engineer types) focus on know-how. Nerds care about specific tidbits of info, geeks care about the mechanisms. Nerds tell stories and then remind you that everything is complicated; geeks try to build models.

Nerds too can get very passionate about the things they study. When you know so much about something, you want to correct other people’s mistakes about it whenever you see them. I’ve done this myself more times than I’d like to admit even when it’s not particularly relevant to the discussion. Luckily, me being annoying at parties is not that detrimental to humanity as a whole. Where it gets tricky is when people are nerds about things that actually matter in terms of our understanding of the world.

Some time ago I was trying to figure out how much I should trust Steven Pinker’s thesis that the world has gotten consistently better and is in better shape on several different measures than before. It seemed that many people who knew much more about history than I do weren’t too thrilled about the idea, and pointed out several issues with Pinker’s handling of historical data. The criticisms seemed valid enough when it came to singular facts. What bothered me was that it wasn’t clear to me whether these factual mistakes actually mattered that much in terms of the main thesis. I got the feeling that I might just be witnessing people flexing their trivia muscles. When you’re not an expert in the area yourself, it becomes harder and harder to figure out which of the arguments are actually relevant to debunking a model, and which are just angrily expressed anecdotes. From a non-nerd point of view, it can be next to impossible to even gauge how strong the disagreement truly is.

(Coincidentally, I get the feeling that most of the really detailed “climate skeptic” arguments are just bundles of angrily expressed anecdotes. Anti-establishment types are the worst nerds of all.)

The point that laymen often can’t make informed judgments about who’s right or wrong in a scientific debate (or a “debate” between an expert and an eager amateur) gets repeated year after year. But this is said with the implication that if only they had some mythical property that the scientists have, they wouldn’t have such issues. You hear claims like “we need more education and outreach”. But how much would we have to educate people to know better in some of these situations? I happened to know more than the average Joe about the issues at hand in the whole Pinker debate AND had an academic background, but I still wasn’t 100% sure what to make of it. I would basically have had to read every article and book each side of the debate had ever read to really know who was right. Since I couldn’t be bothered, the only thing I could say was that one side seemed to be pointing out many interesting tidbits the other side got wrong, without necessarily refuting the general argument.

Getting stuck at expert-level irrelevancies seems pretty common to me. The existence of rare chromosomal or hormonal abnormalities is used as an argument in favor of a spectrum of biological sex in humans, even though it’s questionable how these abnormalities refute binary sex any more than birth defects refute a standard layout of the body. Or the fact that computers can’t be programmed to have drives or needs is used to argue against the very idea of an AI turning against us, even though we live in a world where algorithms that clearly don’t have drives or needs are already causing unexpected and unwanted outcomes. If there’s a risk the experts themselves are grasping at irrelevant factoids, how much can we really blame the poor uneducated masses when they do the same with vaccine side-effects or whatever macronutrient we’ve collectively decided to hate this Thursday?

I don’t want to speak too negatively of nerds, especially given I’m one myself. Knowing when your expertise in an area is actually relevant to a model or an application (or a half-drunken discussion at 3am) is hard. It’s not exactly taught in schools – I’m not even sure we could teach it if we wanted to. Learning a lot about something and seeing for yourself how complicated things are can be humbling (there’s so much we don’t know!), but it can also lead to a false sense of seeing everything as so complex that we can’t even talk of any general trends. Nerds become too zoomed in.

This isn’t my only reason to distrust experts, though. The more I learn, the more I notice that experts are often just really, really wrong.

I read popular articles where big-name psychologists engage in amazingly shoddy reasoning with such confidence it makes me question whether it was me or them who misunderstood everything they were taught from freshman year onwards. I see all kinds of cultural experts analyzing subcultures I’ve been a part of and completely missing the mark. Or I end up in a Twitter discussion with a professor who claims that the fact that 95% of workplace deaths are working-class men is clearly only caused by them being working class, and has nothing to do with them being male. I spot these things because I either happen to know about the subject matter or have the necessary mathematical skills to notice errors – and neither my level of knowledge nor my math skills are exactly top notch. Again, what about areas where I’m not that knowledgeable?

When you top this all with replication crises in basically every field where someone has bothered to look for one and the glacial pace of change in the matter, I think it would be completely fair to actively tell everyone to remain skeptical of every new finding for a couple of decades until the scientists have sorted their shit out.

All of this puts me in a weird spot. It grates on me when someone announces with the confidence and philosophical nuance of a seventeen-year-old internet atheist that psychology is just pseudoscience – but I doubt much of psychology myself. I hate the tinfoilery of people who reject anything coming from official medical science because of “Big Pharma” – but I have really low priors on the replication of any medical finding. And it’s just abysmal to look at the misinformed mess of people who reject the mainstream views in nutrition – but… for the most part I do that exact thing myself, to be honest.

Sure, I might reject expert opinions for better reasons than those people. My views might be more nuanced and informed. Or they might just be generally the same thing, expressed in a more nerdy way and embellished with interesting tidbits and angrily expressed anecdotes. I still find noisy anti-experts idiotic, but I can’t help sympathizing with them a bit.