In the early ’90s, you had three reasonable choices for looking up information you did not know: a book or journal you owned, the notes you took in class, or a library. In all of these cases, you had to physically move yourself to the location where the information source was stored. The best bet for stating information quickly was to know the information yourself. Given that we are human, let’s assume the accuracy of all the information we hold in our heads is (at best) 80%.
In the late ’90s and early ’00s, the location of information expanded slightly. The reasonable choices were now: the Internet (accessed from a desktop computer plugged into a wall), a book or journal you owned, the notes you took in class, or a library. Still, you had to move to a location where the information could be accessed. And still, the best bet for sharing information quickly in any circumstance was to know the information yourself.
Today, with smartphones in the hands of the majority of people in the United States (237 million people overall, and 98% of the 18–29 age range), the choices are now: the Internet (accessed through the smartphone that you, or the person you are talking to, is carrying). Notice I stopped there. I’m not sure that the average working-age person considers the book, the notes, or the library a “reasonable choice” for getting information quickly. Yes, all three sources still exist (as my partner will tell you every time we have to move the books to a new house), but most of us do not use these sources as our first choice. It is no longer the case that knowing all the information yourself is a requirement.
Several times now, when I’ve gone to the doctor, my doctor has looked up information on the computer. And I’m okay with that. What’s the likelihood that a doctor can store all the side effects and drug dosages for all the pharmaceuticals in existence in their head more accurately than a computer? Pretty small. Personally, I’d rather they look it up and have 100% accurate information than recall it off the top of their head with 80% accuracy.
Now, with that said, we do need to store some information for the long term, which varies by chosen career. But I think we are now at a point in history where we need to carefully examine our curriculum through a new lens and ask “Given the technology and volume of available information today, how much information is too much?”
Too often now, I see courses that stop a student from pursuing their desired career because the student cannot learn some archaic skill or memorize some facts that are no longer needed in the world today. If you need to replicate some small procedure, you can search for a YouTube video that will teach you this procedure (fix your sink, complete the square, replace an Intel processor). If you need to remember a fact, you can Google it (what’s the capital of Albania, what’s the quadratic formula, how many pins does an Intel processor have).
Little-known fact: a few years ago, I started working on a BA in Software Design. Silly me, I assumed that years of experience designing software in industry would make this a fun and relatively easy degree to complete. Unfortunately for me, there were several “certs” required for the degree, and the first was A+ Certification. For those of you who have never heard of A+ Certification, it requires you to know how to repair every part of a desktop PC, going back to the first PCs that existed. You not only have to know how to repair these machines in general terms; you actually need to know the number of pins in every processor, which motherboards they are compatible with, and so on.
Now, the irony is that I actually had a job building and upgrading computers once (a very long time ago). But even then, we looked up this kind of information; we did not memorize it. And today, with Google search at our fingertips, it is especially pointless to memorize this stuff. So there I was, memorizing ancient facts about PCs so that I could continue to pursue a degree in Software Design. I spent hours trying to make myself hold this information in my head (pages and pages and pages of facts). I couldn’t do it (and I couldn’t make myself want to try anymore). To sum it up… I quit the degree program.
And then I realized… this is exactly the same thing that happens to students in many programs. I could immediately see how some of the procedures in math had become outdated with the technology we now carry. Consider mathematical procedures like completing the square, polynomial long division, and factoring a sum or difference of cubes. None of these procedures are useful in the real world. They are all artifacts of the math that was taught before we had the technology to do it any other way. Before computing technology, precalculus and calculus could only practically be taught with nice integer coefficients where everything was factorable. But these old procedures are not the math of the real world. We now have the technology to deal with non-integer math, and we have it always at our fingertips.
Students who encounter these archaic algebraic procedures probably have the same thought I did when I encountered the A+ certification fact memorization: “I can’t do it. This has nothing to do with my desire to be a [fill-in-the-blank career], and I’d rather quit this degree than learn this stupid thing I shouldn’t have to learn.”
I shared my example from math, but I do not think the problem is unique to math. Many subject areas (maybe all of them) need to reexamine their learning objectives through the lens of today’s easy access to information. What archaic procedures and facts are your degree programs and courses clinging to that just don’t make sense in today’s world? Who are you stopping from a career?
With this in mind, I’ve been working on a new lens through which we can examine learning objectives and thus assessment strategies. I call it the ESIL Lens and it has four levels.
| ESIL Level | Meaning of Level | Assessment Strategy |
| --- | --- | --- |
| Existence | Does the learner know it exists? Can they find the right search words for it later? | Not assessed; may just be demonstrated or referenced in learning |
| Supported | Can the learner do it with support from notes, tutorials, and peers? | Low-stakes, e.g. homework, project, group assignment, open-notes quiz |
| Independent | Can the learner do it independently, without assistance, and maintain the skill until the next expected refresh? | Multiple medium- or high-stakes, e.g. assignment, quiz, unit exam |
| Lifetime | Can the learner do it independently (without outside help) and maintain the skill for lifetime success? | Multiple high-stakes, e.g. quiz, exam, and cumulative final |
Let me show you how it might work with a learning objective that is typically included in a College Algebra course: Solve a quadratic equation.
The first thing you have to do is split the learning objective into the various sub-objectives that actually exist and are usually taught:
- Solve a quadratic equation by completing the square.
- Solve a quadratic equation by factoring.
- Solve a quadratic equation using the quadratic formula.
- Solve a quadratic equation using technology.
The last two techniques (quadratic formula and technology) will always yield results. Factoring only works when the equation is factorable (almost never in the real world). And completing the square takes hours to teach and practice and is rarely remembered or carried out correctly once the coefficients are no longer nice even integers. If a student is required to solve a quadratic equation by completing the square in some future class (unlikely, unless it is another math-based class that is still stuck in the dark ages of archaic algebraic techniques), they can look up “How to complete the square” and find a tutorial to walk through it. I rate the ability to factor (or understand what factors are) as slightly more important for future math success than completing the square. I want students to know what factoring is and be able to work through finding the factors, but I don’t think the ability to factor is crucial to being a literate citizen of the world. I’m not going to stop a student from majoring in [fill-in-the-blank major] because they can’t solve 6x² + x - 40 = 0 by factoring (especially if they are able to solve it by other methods).
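For concreteness, here is how the quadratic-formula route plays out on that last equation. This is just my own worked example (the equation is one I picked because it happens to factor), not something pulled from a particular textbook:

```latex
% Worked example: solving 6x^2 + x - 40 = 0 with the quadratic formula
% (a = 6, b = 1, c = -40)
\[
x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
  = \frac{-1 \pm \sqrt{1 - 4(6)(-40)}}{2(6)}
  = \frac{-1 \pm \sqrt{961}}{12}
  = \frac{-1 \pm 31}{12}
\]
% so x = 30/12 = 5/2 or x = -32/12 = -8/3.
% The factoring route lands in the same place, 6x^2 + x - 40 = (2x - 5)(3x + 8),
% but only because the equation was chosen to factor in the first place.
```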
So personally (and within the context of the institution I teach at), I would rate the objectives as follows:
- Solve a quadratic equation by completing the square. (E = Existence)
- Solve a quadratic equation by factoring. (S = Supported)
- Solve a quadratic equation using the quadratic formula. (I = Independent)
- Solve a quadratic equation using technology. (L = Lifetime)
The learning and assessment strategies for these sub-objectives will be designed to be appropriate to the level of the lens.
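To make the Lifetime-level objective concrete, here is a minimal sketch (my own illustration, not part of any course materials) of what “solve a quadratic equation using technology” can look like in Python with numpy, reusing the same equation from above:

```python
import numpy as np

# "Solve a quadratic equation using technology":
# numpy.roots takes the polynomial coefficients in descending order of degree
# and returns all of the roots, whether or not the polynomial factors nicely.
roots = np.roots([6, 1, -40])   # coefficients of 6x^2 + x - 40
print(roots)                    # the two roots, 2.5 and -2.666..., i.e. x = 5/2 or x = -8/3
```

A graphing calculator, a spreadsheet, Desmos, or WolframAlpha would do the same job; the point is that the tool carries the arithmetic, and the learner carries the understanding of what the roots mean.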
By examining each learning objective through the ESIL lens, we can increase the efficiency and relevance of the course. This frees up time to refocus on more relevant activities (in my case, examining graphs from the real world each day in class). If your institution does this for every course in the degree program, I’m betting you will see a drop in attrition rates.