As one user of Edulastic to another, can you relate? The thrill of creating dozens of different question types, combined with instant testing feedback on a “dashboard of colors,” is like that feeling when you first step into Disney World, or walk the streets of Las Vegas for the first time. The automation is an elixir. But, like Odysseus hanging out with the Lotus Eaters, it is easy to stay in that zone and focus on the test taking rather than the test making.

Granted, many of us are using common assessments created by our curriculum leads and don’t need to think about creating tests ourselves. However, assessment expert Ruth Axman Childs, in her paper for the ERIC Digest, gives us another angle to consider when she writes, “the most instructionally-relevant achievement tests are those developed by the individual teacher for use with a particular class.”

“The most instructionally-relevant achievement tests are those developed by the individual teacher for use with a particular class.” -Childs

Essentially, incorporating small, original assessments tailored to a particular class is a valuable source of feedback. That said, the science behind creating tests shows that writing your own questions is not an amateur endeavor: a smart test taker can see through many of the screens we put up to try to “catch” them, a tempting move for any teacher who wants to find out who was really listening.

Based on our extensive research through the impressive halls of academia (OK, our Google search last Thursday), we have compiled a shortlist of tips to think about when building your own exam. The findings were genuinely interesting, especially when it came to ways to use multiple choice and multiple answer questions, a format I had personally (and wrongly) considered less effective than other question types for assessing knowledge. The information gleaned from our sources was generally centered on multiple choice style questions, but the strategies apply to many of the question types found in online formative test platforms, such as drag and drop, resequencing, labeling, and classification.

Here are our top 10 best practices for creating tests. Most of the experts were consistent about their recommendations, so the list below is a summary of many sources. Credit and kudos go to Ruth Axman Childs, Christopher Pappas, Scott Winstead, Teresa Flateby, Marjorie Devine and Nevart Yaghlian.

For a deeper dive into the academic research on creating tests, you’ll find links to articles from the PhD-types at the end of this post.

10 Best Practices for Creating Tests

1. Design

When designing the test, be clear about the specific objectives you expect students to have learned from that unit of instruction. Rank them in importance and include more questions about the most important objectives, but be sure to include something about each one.

2. Format

Spend some time thinking about how to match your test or question format to your learning objectives. For example, if performing quickly was an objective, give a timed test. If articulating how one event affected another is the goal, a short answer question may be the way to go. And if recall of dates and sequencing is the key, a drag and drop style question might be most appropriate.

3. Language

Use simple and brief language in the questions (and the instructions) so you are testing subject matter knowledge rather than a student’s language skills. An answer stated in “textbook language” may trigger a quick response because it sounds familiar or seems more correct. Similarly, if a student’s language skills are weak and he or she can’t understand the question clearly, you may never discover that the student has actually mastered the content.

Engage NY Sample Question

In this question, written by Engage NY, you’ll notice the carefully written stem and consistency in answer length. Additionally, the author included an image, which boosts engagement.

4. Answer Choices

Determine the number of answer choices in advance and keep it consistent throughout the test. Generally, four to five alternatives are recommended. This decreases the chances of guessing the right answer (with four options, a blind guess succeeds only 25% of the time, versus 50% on a true/false item) and prevents memory overload.

5. Answer Length

Make all of the answers parallel in length. Often test makers end up making the correct answer longer and more articulate, thus giving away the right answer.

6. Wrong Answers

Offer believable and attractive “wrong answers.” This tests students’ real understanding of the material. It is common for test makers to include one or two obviously wrong answers, which only makes it easier for students to guess their way to the correct one.

7. Level of Thinking


The Drag and Drop question type is a great option for elevating the level of thinking, and it also creates a more engaging question than Multiple Choice or True or False.

Integrating charts, graphs, images, or passages that require evaluation can elevate the level of thinking from recall to analysis. This is true even with multiple choice style questions. Edulastic users can use image labeling, drag and drop, classification, resequencing, and other technology-enhanced items (TEIs) to achieve similar higher levels of thinking.

8. Instructions

State clearly in the instructions whether you require the single correct answer or the best answer to each item. This is especially important with Common Core-style curriculum, which more often than not asks students to answer “best answer” type questions.

9. Answer Order

Don’t forget to switch or randomize the order of the answer selections, and be sure the location of the correct answer doesn’t follow a pattern. Savvy test takers will figure out that every other correct answer is C if you don’t mix it up. One simple tactic to avoid creating an accidental pattern is to consistently list answer options in alphabetical order.

10. Avoid Confusion

The “stem” of the item is the prompt or the question. To reduce confusion, the experts recommend the following:

  1. Put as much of the wording as possible in the stem so that the answer options are free of repetitive language;
  2. State the stem in positive form wherever possible;
  3. If you do use negative wording, highlight, italicize, or otherwise emphasize it so it is clear to students.


Once you complete your assessment, add it to the Public Library!

Next time you sign on to Edulastic to create a test, keep these ten tips in mind. If you feel great about the assessment you’ve created, go ahead and share it publicly for the whole community to see.

Keep in mind, there have been tons of studies and research on how to create great assessments. If you want to learn more, please check out the articles below. Have a tactic that you’ve found particularly helpful? Share this article with your network and chime in with your tip!

Sources and Additional Articles

Author: Ruth Axman Childs
Source: ERIC Clearinghouse on Tests, Measurement, and Evaluation, Washington, DC; American Institutes for Research, Washington, DC

Author: Teresa L. Flateby, PhD
Source: University of South Florida – A Guide for Writing and Improving Achievement Tests

Authors: Marjorie Devine and Nevart Yaghlian
Source: Center for Teaching Excellence, Cornell University

Authors: Alison Oldfield, Patricia Broadfoot, Rosamund Sutherland, and Sue Timmis
Source: Assessment in a Digital Age: A Research Review, Graduate School of Education, University of Bristol

Author: The Alberta Teachers’ Association
Source: Digital Reporting and Digital Assessment Tools: Evaluating their Value and their Impact

Author: Christopher Pappas
Source: How To Write Multiple-Choice Questions Based On The Revised Bloom’s Taxonomy

Author: Scott Winstead
Source: 6 Tips To Create Great eLearning Quiz Questions

Author: Ruby Spencer, CTDP
Source: How to Make Meaningful Elearning Assessments