Exams: We Have Changed, They Haven’t

I just had a period of exams. Don’t get me wrong, I think I did OK on most of them, so this entry is not an attempt to blame “it” on anything, but it got me thinking about the nature of exams: what they are for and how they are performed.

To the best of my knowledge, exams today are conducted more or less the same way they were 100 years ago. The details may vary, but usually it goes something like this: the student has a certain curriculum he or she must study, understand and memorize. The exam is then supposed to be a statistical sample of that curriculum, and the result to reflect approximately how much of it the student has been able to take in.

I can see how this may have been a very good format a century ago, but times have changed while exams haven’t. The reality most of us deal with today involves far more information than people had to handle back then, and far better tools for handling that information than were imaginable at the time. I think today’s exams should be more about how well people can use those tools to work on their projects than about memorizing lists that could, say, be Googled in seconds.

First of all: what are exams for? Three purposes come to mind:

  • To measure the performance of students to see where they stand.
  • To force students to learn the curriculum.
  • To serve as a certificate of certain skills for the student to show when applying for further education or a job.

All of these are important, but with so much having changed in our society over the last century, it raises the question: shouldn’t exams have changed more as well?

A century ago, schools were information centers. The information available to students in a school library usually far exceeded what they could expect to access at home or at work. Today, schools, homes and workplaces all have access to a virtually endless amount of information, and a school library pales in comparison to what is available there.

When information was that scarce, it made sense for people to know long lists of Latin words offhand and to store as much information in their heads as possible, because it was unlikely that the information would be readily available when it was needed. Even with access to a big library, it could take a long time to find what was needed; by then the patient could very well be dead or – less dramatically – research would slow to almost a standstill if the researcher had to resort to his books every time. Having the information in your head was the best and fastest index, and therefore a valuable feature of – say – a good biologist. Today, it is hard to see the value in memorizing something that you can look up in seconds, if you know the right tools and how to use them.

Now I’m not saying that it isn’t valuable to know the basics of your field offhand. What I’m saying is that knowing where and how to search for information, and how to use the tools that are available, should be part of those basics. Once out in the real world, working in, say, a biological lab somewhere, the researcher will have his computer and his Internet connection available. How good a researcher he is will depend on how well this duo works together. The researcher and his computer are one (see also Wetware entry: When People are Cheaper than Technology). The researcher’s output will be measured by what comes from him and his computer together, not by what he or she can produce alone from offhand knowledge.

To continue using the biologist as an example, it should be quite obvious that knowing how to use Excel, and even how to program a little simulation, is a more important skill for him than remembering the Latin names of the shoulder bones of a bird. Knowing where to find and how to search the best available database of animal species is more important than knowing the features of only a few such species offhand. Knowing which information to trust and which sources to avoid among the millions of pages available, and how to tell the good info from the bad, is more important than memorizing most of the contents of your 1,000-page course book.
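
To make that point concrete, here is the sort of “little simulation” I have in mind – a minimal sketch in Python of discrete logistic growth for a hypothetical bird population. The function name and every number in it are made up purely for illustration; the point is that writing and poking at a toy model like this teaches more about how populations behave than memorizing the parameters of any single species would.

    # A minimal sketch: discrete logistic growth of a hypothetical bird population.
    # All parameters below are invented for illustration only.

    def simulate_population(initial, growth_rate, capacity, years):
        """Return the population size for each year using the discrete
        logistic model: N(t+1) = N(t) + r * N(t) * (1 - N(t) / K)."""
        population = [initial]
        for _ in range(years):
            n = population[-1]
            population.append(n + growth_rate * n * (1 - n / capacity))
        return population

    if __name__ == "__main__":
        # Start with 50 birds, 30% yearly growth, a habitat that supports ~1,000.
        for year, n in enumerate(simulate_population(50, 0.3, 1000, 20)):
            print(f"year {year:2d}: {n:7.1f} birds")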

What we teach should therefore aim more at giving students the very basics of the field and the ability to use the best available tools, and exams should test those abilities. An exam should contain a lot more unseen material from a far larger and more widespread curriculum, and the student and his computer should be allowed to work together on the problems, just as they will for the rest of his or her life as a scientist, physician or almost anything else the student may go on to do after graduation.

And don’t even get me started on computer science students who take their exams with pencil and paper and have to remember correct syntax and function names!

One comment

  1. Exams, exams, how I miss them…

    I have the feeling that exams are already losing their importance as THE only criterion for how well a student is doing. Increasingly, the final grade is based on how well students do in teamwork, individual assignments, class participation, etc.

    Exams are still there to test how well students memorize the subject and solve problems under pressure. That kind of ability may be more important for some professions than others. We like our surgeons to know as much as possible about what they are doing before they cut us open, while it is fine if researchers memorise less and browse more.

    While we are on the subject… I would like to see more tests where students must use the Internet to solve problems under pressure.

    FS
