As I get older, I notice more and more that I am slowly forgetting. Things I once learned in high school or college are fading from memory. Hard-fought knowledge has since gone unused and languished into oblivion. This process is not entirely a bad thing, as some of that knowledge has never been useful since the last paper was written and the final exam was finished, and more heavily-used knowledge has taken its place. Learning and then forgetting all of that knowledge was not a total loss, either, because along the way I learned something else even more valuable: how to learn efficiently.
When learning something new, there is usually a long list of concepts, rules, and processes that we need to know in order to apply that knowledge effectively to solve problems in any given domain. We can either attack that long list through brute-force memorization, beating each concept into our heads with tedious repetition and grinding study, or we can build a foundation of knowledge up to the higher-level concepts and problem-solving skills necessary to do the work that needs to be done. The former approach may yield quicker results because we can get right into specific ways to solve this or that problem, but the latter approach may have longer-term benefits because we'll understand the problem space at a deeper level and we'll be able to handle more complex situations.
In reality, both approaches are necessary to truly learn anything. Let's explore why that is with the basic example of arithmetic. We start learning arithmetic by counting:
1, 2, 3, 4, 5, 6, 7, 8, 9, 10.
These are the first ten natural numbers. They are represented as written symbols associated with spoken words, and we need to memorize those symbols, their correct order, and how many objects each symbol represents. The ten numbers above contain all of the symbols we need to know to write down any natural number. As we continue writing down more numbers, we can see they form a pattern:
11, 12, 13, 14, 15, 16, 17, 18, 19, 20,
21, 22, 23, 24, 25, 26, 27, 28, 29, 30.
While we could memorize each number as an individual concept, along with how the numbers are related—as in 26 comes after 25 and 27 comes after 26—that tactic would quickly become tedious, and soon after that it would become overwhelming. To make the concept of counting easier, we recognize the pattern of digits within a number. The ones digit is always the rightmost digit, the tens digit is to the left of the ones digit, the hundreds digit is to the left of the tens digit, and so on. We also learn how counting to the next number changes these digits. We increment the rightmost digit, and whenever the digit being incremented is a '9', we roll it over to a '0' and increment the digit to its left instead, repeating the rollover as necessary. So the number that comes after 258,439 is 258,440 because we roll the '9' over to a '0' in the ones place and then increment the '3' to a '4' in the tens place.
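To make the rollover rule concrete, here's a small sketch of it in Python (my own illustration—the function name and the choice to work on a digit string are just for demonstration):

```python
def increment(digits: str) -> str:
    """Count to the next number by applying the rollover rule to a digit string."""
    result = list(digits)
    i = len(result) - 1  # start at the rightmost (ones) digit
    while i >= 0 and result[i] == '9':
        result[i] = '0'  # a '9' rolls over to a '0'...
        i -= 1           # ...and we move left to increment the next digit instead
    if i >= 0:
        result[i] = str(int(result[i]) + 1)
    else:
        result.insert(0, '1')  # every digit was a '9', so the number gains a digit
    return ''.join(result)

print(increment('258439'))  # 258440, matching the example above
print(increment('999'))     # 1000
```

Notice that the loop mirrors the rule exactly: roll over any trailing '9's, then do a single ordinary increment.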
Now we've formulated a rule that generalizes counting so that we know how to count starting from a really big number without iterating through all of the numbers before it. We have a better understanding of how natural numbers work on a fundamental level. Once we are comfortable with that, we can add in another concept and move to a higher level with addition. Addition is the operation of combining two numbers, or in the physical world, combining two collections of things into one collection and counting the total number of things. We can start with the number of objects in one collection, and from that number count up by the number of objects in the other collection. If we have 5 apples and we want to add 3 more apples, we start at 5 and count up: 6, 7, 8. We end up with 8 apples.
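Addition-as-counting is simple enough to sketch directly (again, a hypothetical illustration of the idea, not anything from a textbook):

```python
def add_by_counting(a: int, b: int) -> int:
    """Add two natural numbers the beginner's way: start at a, then count up b times."""
    total = a
    for _ in range(b):
        total += 1  # each step is one count: 6, 7, 8, ...
    return total

print(add_by_counting(5, 3))  # 8 apples
```

The sketch also makes the tedium obvious: adding 3 takes three counting steps, so adding 3,000,000 would take three million.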
This process of counting to do addition gets just as tedious as always counting from 1 to large numbers did, so we memorize our addition tables—adding all combinations of 0 to 9 with 0 to 9—and learn how to do multiple-digit addition and carrying with stacked addition problems. Like with counting, we've done some learning of the fundamentals and some rote memorization to speed things up. Counting also becomes a special case of addition where we are adding 1 each time we count up. Then we learn that subtraction is similar to addition, but we count down instead of count up. Then we learn how to do multiple-digit subtraction and how to borrow. More memorization also comes into play with subtraction tables.
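The stacked-addition procedure—line up the columns, add each column right to left, carry when a column overflows—can be sketched like this (my own illustration, operating on digit strings for clarity):

```python
def stacked_add(a: str, b: str) -> str:
    """Add two digit strings the way it's done on paper: column by column, with carrying."""
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)  # pad with zeros so the columns line up
    carry, digits = 0, []
    for da, db in zip(reversed(a), reversed(b)):  # work right to left
        column = int(da) + int(db) + carry  # one column plus any carry from the last
        digits.append(str(column % 10))     # the digit we write down
        carry = column // 10                # the digit we carry
    if carry:
        digits.append('1')
    return ''.join(reversed(digits))

print(stacked_add('258', '467'))  # 725
```

Each column sum comes straight from the memorized addition table; the carrying rule is the small piece of fundamental understanding that stitches the table lookups together.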
We can keep building on this foundation with multiplication, which is essentially the process of adding a number to itself some number of times. We memorize our multiplication tables and learn about stacked multiplication of multiple-digit numbers. Division follows, then fractions, and so on and so forth. At each stage we're learning fundamental concepts of how to do arithmetic and memorizing shortcuts to make our calculations more efficient. Without the fundamental knowledge we'd be stuck at counting, and without memorizing key tables of information we'd take forever to calculate any non-trivial formula. Both skills are necessary, and they complement each other nicely.
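Multiplication as repeated addition follows the same pattern as the earlier sketches (once more, an illustrative toy, not how anyone should actually multiply):

```python
def multiply_by_adding(a: int, b: int) -> int:
    """Multiply two natural numbers by repeated addition: add a to a running total, b times."""
    total = 0
    for _ in range(b):
        total += a  # e.g., 6 * 4 becomes 6 + 6 + 6 + 6
    return total

print(multiply_by_adding(6, 4))  # 24
```

Just as the memorized times tables replace this loop in our heads, each new layer of arithmetic replaces a slower procedure from the layer below it.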
Arithmetic is fairly easy to remember, since we tend to use it all the time. Depending on what we do for a living, we may also remember a lot of algebra, geometry, trigonometry, and even calculus. (And this is just considering mathematics. The other subjects are important, too!) At some level things start to get hazier as we forget the concepts and shortcuts that we don't use. This can happen even with subjects that we use fairly frequently. I was surprised by how much I had to look up and verify when I was writing the series on Everyday Statistics and Everyday DSP. These are subjects that I use all the time, and I thought I could write about them purely from memory. But it turns out that even in this case I only use a subset of Stats and DSP on a regular basis, and writing about them requires much more detailed explanations than what I normally use.
Memory is not rigid and static. It's fluid and malleable. I think of memory like a fabric of the mind. We weave more material from thread at the edges of our knowledge, relying on the fabric that's already there to attach new information and expand what we know. Over time the fabric becomes worn, faded, and thin—although unlike real fabric, the holes appear where it's least used. We need to patch those areas with new cloth, woven from new thread, and sometimes dye areas with the new colors of newly acquired information to make it all match and flow together.
I'm finding more areas of my mental fabric that need patching whenever I decide to take on a new learning experience. The most recent example is machine learning. I'd like to learn more about this exciting subject, but when starting to look into it, I quickly discovered areas of knowledge that have thinned out over time. I need to brush up on linear algebra and numerical methods, things I haven't spent much time on since college. That doesn't worry me, though. I've been through this situation many times before. It doesn't matter that I don't remember everything I learned before because I know how to learn. I know how to patch those holes in the fabric of my knowledge and make it stronger so that it can support the fabric I'm going to add onto it.
Learning specialized knowledge well is important, but knowledge can age and disappear. I'm finding that as I forget older knowledge, the specifics don't matter as much. It's not important that I learned this or that concept in school, or that I try to retain everything that I learned when I was young as if that's all I would ever be able to learn. What's important is knowing how to weave and sew and patch that knowledge, adding to the intricate fabric that's already there. The fundamentals that we learned when we were young will likely always be there to build on. Some higher level knowledge may get lost, but we only need to remember that it's a combination of memorization and understanding fundamental concepts that makes for optimal learning to keep expanding our mental fabric.