On forgetting, for lack of a better word

Preamble

Getting lost in the moment is, at least sometimes, quite literal. A three-day festival, a house party, a conference, the first fuzzy days with a new romantic interest. I’m just there and then.

And then some stimulus jolts me out of the moment and back into the normal meta-level consciousness of everyday life.

The rest of the world exists. Shit.

Right, find your bearings, you have a home, you have a job. When was the deadline for that report? Should probably call my parents at some point. Do I even have anything in the fridge for breakfast or will I have to run to the store once I get back?

For a moment, I have very concretely forgotten about the rest of the world, only to be brought back by some fleeting thought I have no control over, or by a random external stimulus. If I think about it too much, it makes me feel like I’m adrift, catching hold of something only by chance, because the river just happens to have enough comfortably-shaped metaphorical rocks at frequent enough intervals.

I realize I might not be typical, but I feel like I’m typical enough and have met enough other, more typical people to say that feeling something like this is common. Many people report feelings of being adrift, of not knowing what to do in life, of coasting through the years.

And I have what I can only describe as a Lovecraftian fear about human cognition related to this kind of forgetting.

My existential dread comes from the feeling that much of adult human life might be like the kind of living-in-the-moment where you forget that you didn’t do the laundry and that your partner’s birthday is in just two days, except in the grand scheme of things, on longer and shorter timescales, and with much more important things being pushed into oblivion. All of us in that sleep-deprived, drunken stupor of the second night at an all-weekend illegal rave, all the time.

I do not mean just “coasting through life, kind of landing whichever education/job/partner is most easily available”, although that is a part of it. I mean, on a deeper level, that I question how much memory or understanding actually plays into our lives, as opposed to just stimulus-response: the environment being so well built to nudge us forward that we never stop to realize that there never was a plan, rhyme or reason. Maybe sometime in history there was, but then something (cultural evolution, civilization, cognitive offloading) happened, and now we are here, now we are now.

I also do not mean “deterministic universe applied to the brain”, though of course my thoughts are compatible with that idea. You could have a deterministic universe where people deterministically act based on memories and reasoning, or something very different.

Brains built for forgetting

“Hmm”, I – an adult human being – thought, “I feel like there’s way less kids’ TV nowadays.”

“Oh, wait”, I – an idiot realizing his idiocy – thought a few seconds later.

Recognizing actual change in the world based on one’s experiences alone can be surprisingly hard: not only are we limited in what and how much we can experience in total, we’re also systematically limited by what stage of our lives we’re at. You might live in the same physical environment as a child and as an adult, but what you pay attention to, and what you can pay attention to, changes.

Skills are learned through practice and maintained by practice. Skills are also, in the end, a form of memory. Some memories (such as knowing how to swim or ride a bike) are “stickier” than others, in that it is very hard to forget them and/or very easy to relearn them after a long period without practice. However, many memories degrade with lack of practice. More importantly, even the sticky memories are not always active: I know how to ride a bike, but I rarely think about it. The memory becomes active in the right context, with the right stimulus, and without those it is simply somewhere in the background, unremembered.

What’s more, the stimuli and the contexts might not be under our control at all. We do not choose to age, and I would argue most of us do not really choose to go to school even as adults, any more than we choose to get jobs or pay rent. No deep thought is really necessary; the environment does quite enough for us, and all of these pushes from the environment further change the environment, and thus the stimuli that help us remember. It is a common occurrence for an olfactory cue to trigger a vivid memory, but we can’t always choose to go smell a specific thing, some things we can only smell at very specific times, and most likely we don’t even remember that a certain smell could trigger a certain memory. What if a much larger percentage of memory is like a completely involuntary reaction to the smell of a spring morning than we would normally think?

My pithy elevator-pitch view of human cognition is this: we do not really know how to do something except when we are actually doing that thing, we do not actually understand something unless we are actively involved in some sort of sense-making of that thing, and we do not actually remember something unless we are in a situation where the memorized knowledge is important or relevant through some association. When nothing in the situation makes a memory relevant or important – and we may have no control over what does – the memory is not accessible, meaning it’s as good as forgotten. And what counts as “relevant or important” is not defined by some Platonic rationality but by the first-person perspective, through idiosyncratic associations. People are usually not used to using a specific memory or skill, no matter how general, in all situations, only in the specific ones that triggered it most consistently in the past. In short, context matters, generalization is rare.

This is a very rough draft of even half a theory, but even as it is, I feel it helps make sense of several types of “human stupidity”, at least better than many theories about cognitive biases that rely on vague ideas of processing capacity, some kind of innate folk science, or the brain somehow suppressing information.

Why do many people not know how to draw a bicycle, regardless of how often they ride one? Because the actual form and mechanics of a bicycle are unimportant unless you are building or repairing one (and even then the bicycle is right in front of you, so you don’t really need to have it etched into your visual memory to aid the completely unrelated task of drawing).

Why do physics undergrads make the same kinds of mistakes about how gravity acts on a thrown object as elementary school students do? Because classroom physics is a whole different task from “normal everyday physics”: different context, different kinds of practice.

Why do physicists not do any better than social scientists at avoiding teleological thinking, when physicists have much more explicit practice in rejecting teleology? Because scientific thinking is hard even when you’re actually doing science, and these people are not tasked with doing science but with answering silly questions out of context.

Why might recent Harvard grads or even faculty members be unable to correctly answer questions about the phases of the moon or the change of seasons? To be fair, it might be because their education is bad, but I would argue the deeper reason is that it is very hard, bordering on impossible, to teach people to think in a scientific way consistently and quickly in any given situation, not just when taking a test or writing an article.

One could spend a whole day listing all the very funny stuff where psychologists or science popularizers show humans seemingly acting in contradictory ways: “you know/claim that X, yet you act as if Y”. The common thread that I believe lies behind most (seeming) cognitive biases like this is that the contexts of X and Y are different.

Of course, this idea connects to less quirky everyday things as well. Despite having been children themselves, many adults seem to be completely clueless about the way a child’s mind or the social world of children works. My claim is simply that saying “it’s because people forget with age” would be imprecise: it’s not necessarily age but a lack of continued practice and experience of the life of a child (in addition to continued practice of being an adult and listening to all flavours of “experts” on children). And we have a constant lack of continued practice and experience in so many things we used to experience and thus practice all the time, because our lives move in phases. Living means forgetting.

A world built for forgetting

In a way, the idea of human lives being mostly driven by external stimuli that just happen to be here is just a depressing version of the idea of cognitive offloading – an involuntary, chaotic version of using the physical environment to relieve a burden on memory. The world just happens to be shaped in a way that is good enough to remind us of the relevant things often enough. Or, if you prefer another framework, the world affords going to work, paying rent, and calling your parents.

But cognitive offloading is good – even if it is the case that baseline humanity is quite ADHD-like, it seems like at least some of the problem is solved. So, what does the world remind us of, and what does it not? The second, much more speculative part of my vague fear around forgetting is a feeling that the world is getting worse for us in the whole reminding business, or at the very least that the nudge conveyor belt leads somewhere it shouldn’t.

I have met university students who did not know what “folders” on a computer are, which made explaining how to use certain kinds of software unexpectedly difficult. It’s not exactly a fresh observation that young people are not actually very good with computers, but these students could not have been more than five years younger than I was at the time, and I had learned basic navigation on a Windows computer seemingly by osmosis. The obvious explanation is that increasingly convenient tech has made it unnecessary for people to learn even the very superficial stuff I know about the inner workings of an OS (or, God forbid, the actual computer).

My point isn’t that young people being bad with computers is somehow a horrible thing: it’s that things move fast, and reminders of many things disappear faster than you would think, to be replaced by new ones.

We’re all online now. Over the years, people have worried about whether constantly increasing contact with computers that offer us quick rewards of almost any kind has led to increased rates of attention deficit disorders. Maybe, but to me it seems like even neurotypical people have all the potential to act like kids who need Ritalin. More than that, we seem to want to. Social media that supercharges gossiping, a news cycle where even ongoing events are simply forgotten once the next big thing hits – these would not exist without some sort of demand. “Slow journalism” would not be a subculture if the majority of people did not have a revealed preference for faster, cheaper, now-er.

“We have access to all the knowledge in the world literally in our hands and we use it to look at pictures of cats”, we joke, but it’s not really just that. We do not just have passive access to knowledge of the world: knowledge of the world is being selectively blasted at us at all hours. There is no shortage of reminders about all kinds of things, no shortage of things the world affords, no shortage of stimulus-response opportunities. But none of these reminders are very long-term in any sense, and coherent timelines may be hard to find even if one has the patience to go looking for them. Add to this the fact that archiving “born-digital” media is hard, and that despite claims of “everything on the internet being forever”, it’s becoming increasingly possible, as the web becomes more centralized, to just remove stuff that the majority of people know about. More affordances for stumbling in a never-ending now, more chances to forget individually and collectively.

I wonder if this was always the case. Maybe we were always kind of bad at remembering; it was just that the world-as-a-post-it-note functioned better than it does now, at least for some outcomes.

There’s a kind of conservative complaint about kids in the current year that goes something like this: people’s lives are way too sheltered and guided from above until they become “real” adults and enter the “real” world, at which point they break down and start demanding that the kind of pampered structure they used to have become the norm everywhere. However bad the kids are these days, I think it’s interesting that this complaint carries the implication that less external guidance would be good, so that people would learn to be independent. I will concede that it sounds like a plausible idea, but I’d also like to note that maybe people of previous generations just needed less planned, codified, explicit external guidance because implicit external guidance was often enough. Go to school, get a job, buy a house, things just kind of fall into place, one step flows into the next. I’m no economist, but I’ve heard rumours that going to school and getting a job might not be enough to buy a house anymore. Maybe the kids are so very not alright not because people are less independent, but because the (Western) world no longer functions so well when people simply proceed from one societal stimulus-response set to the next. Maybe we were always bad at the whole independence thing and always needed to be nudged, but it used to not matter as much.

Why bother?

Well, I think it’s interesting. Interesting, and at the same time boringly obvious. Other people have talked about similar ideas, but I haven’t seen anyone put them together in exactly this way. I hope I can give someone a new perspective on things large and small.

I find it interesting how all these bits of cognitive science and everyday observation are out there, for anyone to see, and I still haven’t seen anyone express this exact kind of uneasiness about humanity.

I find it interesting how most of these ideas seem to fit pretty well with the 4E umbrella of cognitive science, which I usually don’t really agree with. I find it interesting specifically because I often get the impression that 4E is presented as a kind of more “human” and “holistic” and less “cold” approach to cognition, but to me, taken to its logical conclusion, it paints a much grayer picture of humanity than extremely online cynical cognitivist evo-psych.

I find it interesting how so many internet-era worries – from loot boxes in video games and “social media addiction”, through easily-forgotten outrage media cycles, to obesity and increases in attention issues – all seem to point in the same general direction, yet nobody points at them and says “hey, maybe the behaviorists were right”.