Updated Assignment Tool Replaces Digital Drop Box in Blackboard NG

posted in: News

The coming upgrade to Blackboard Next Generation (NG) on May 30, 2011, is the most significant upgrade of Blackboard since it was first introduced at NIU almost ten years ago.

Blackboard NG includes over 80 new and enhanced features, including substantial updates to the Assignment Tool. As a result of new Assignment Tool features, the Digital Drop Box is obsolete. Consequently, the Digital Drop Box will no longer be available after the upgrade to Blackboard NG. Faculty who have used the Digital Drop Box should consider using the Assignment Tool as a replacement.

Assignment Tool:

– Organizes student submissions in the Grade Center for easy grading

– Allows faculty to assign points, add comments, and even return a file to students

– Tracks submission history, including content, comments, and grades

– New: Allows faculty to give specific students an additional attempt at an assignment

– New: Allows faculty to enable multiple attempts for all students and grade each attempt individually

– New: Facilitates group assignments, so that only one student in each group submits the assignment and the grade is automatically given to all group members

Student work can also be submitted through the SafeAssign tool, which helps deter plagiarism by detecting unoriginal content in student papers.

For a preview of the new Assignment features that will be available in Blackboard NG, register to attend the upcoming online workshop (http://niu.edu/blackboard/ng/workshops.shtml#bbngassignprev) on Tuesday, April 12 from 12:00 pm to 1:00 pm.

Want to learn more about Blackboard NG? Sign up for one of several preview sessions or workshops being offered by the Faculty Development and Instructional Design Center at http://www.niu.edu/blackboard/ng/workshops.shtml. For more details on Blackboard NG, visit http://www.niu.edu/blackboard/ng.

If you have any questions or concerns, please contact the ITS Helpdesk at 753-8100 or helpdesk@niu.edu.

Using Concept Inventories to Improve Instruction

In many fields, “common sense” can lead students astray. Before stepping into a classroom, students have formed hypotheses and theories based on observations and experience, but what seems to make sense based on casual observation may in fact be false. These misconceptions can be worse than complete ignorance, because they must be corrected before new information can be learned. Most of the time, students simply modify their existing understanding to accommodate new concepts rather than internalizing the correct knowledge, leading to a mash-up of correct vocabulary mixed with partially correct theories (Hestenes, 2006).

There are several important questions related to student misconceptions. First, what misconceptions do students have when they begin a course? Second, is the course effective at replacing misconceptions with a deep understanding of the concepts that are essential to the course, or are students learning the material by rote? Finally, are some teaching methods more effective at imparting this deep learning? Unfortunately, these misconceptions can be challenging to assess using conventional methods.

One way to address these misconceptions is by administering a concept inventory assessment. A concept inventory is a multiple-choice test that forces students to choose between correct concepts and common-sense alternatives (Hestenes, Wells, & Swackhamer, 1992). The inventory is administered at the beginning of a course to establish a baseline of student understanding, and again at the end of the course. The difference between the scores represents the students’ shift from misconception to an accurate, deep understanding of the concepts.

Because concept inventories are designed to assess understanding of concepts, the questions focus on reasoning, logic, and general problem solving rather than facts, definitions, or computations. An initial question may be followed by a second multiple-choice question that asks for the reason the answer was given. For example, the following two questions are part of the Chemistry Concepts Inventory (Mulford, 1996). Answers appear at the end of the article.

  1. Two ice cubes are floating in water. After the ice melts, will the water level be:
    a. Higher?
    b. Lower?
    c. The same?
  2. What is the reason for your answer?
    a. The weight of water displaced is equal to the weight of the ice.
    b. Water is denser in its solid form (ice).
    c. Water molecules displace more volume than ice molecules.
    d. The water from the ice melting changes the water level.
    e. When ice melts, its molecules expand.

Unlike traditional multiple-choice exams, concept inventory questions are criterion-referenced, meaning the questions should be directly linked to the concepts and misconceptions the inventory is designed to assess. The distracters (incorrect responses) for each question should be matched to common misconceptions.

To create a concept inventory, begin by selecting the theories or concepts that are most critical to success in the subject area. Then, identify common misconceptions that students have about those concepts. For experienced faculty members, this could be based on observation and experience, at least initially. For greater accuracy, misconceptions can also be identified through open-ended exams that require students to explain their reasoning. Interviews with students are very informative about the common sense theories they have constructed. It also may be possible to review literature on common student misconceptions about the concepts.

Use the common misconceptions to develop multiple-choice questions that are problem-oriented and concept-based rather than computational or factual. To many faculty, the questions on a concept inventory seem too easy or trivial, but that is natural (Hestenes, Wells, & Swackhamer, 1992). Because the questions are based on essential concepts rather than complexities, errors indicate a lack of understanding, while correct responses may not indicate mastery as traditionally understood.

After administering the concept inventory as both a pre- and post-test, compare the scores. Ideally, the scores should improve substantially. If there is little change overall, or little change for a particular concept, reconsider the questions, and examine the teaching strategies used. If possible, it is particularly helpful for multiple faculty members to administer the inventory to multiple sections. Over time, continue to revise teaching strategies to improve students’ mastery of the concepts they struggle with.
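The pre/post comparison described above can be summarized with a simple calculation. The sketch below uses the normalized gain metric common in physics education research (the fraction of possible improvement a class actually achieved); note that the choice of metric and the function name are illustrative assumptions, not part of the inventory procedure itself.

```python
def normalized_gain(pre_pct, post_pct):
    """Fraction of the possible improvement actually achieved.

    pre_pct and post_pct are class-average scores in percent (0-100).
    A gain near 1.0 suggests most misconceptions were corrected; a gain
    near 0 suggests little conceptual change despite instruction.
    """
    if pre_pct >= 100:
        return 0.0  # no room left to improve
    return (post_pct - pre_pct) / (100 - pre_pct)

# Example: a class averages 30% on the pre-test and 65% on the post-test.
gain = normalized_gain(30, 65)
print(gain)  # 0.5
```

Comparing gains across sections or semesters, rather than raw post-test scores, helps control for differences in students’ incoming preparation.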

Naturally, there are many factors that affect the results of a concept inventory. The ultimate goal is to identify student misconceptions and to determine whether those misconceptions are corrected. Hestenes and Halloun (1995) argue that a well-written concept inventory, like their Force Concept Inventory (FCI), is best analyzed as a whole rather than as individual questions. The result is an indication of how well students understand the concepts overall, as opposed to how they respond to specific questions.

Developing an accurate and valid concept inventory is a matter of research, time, and revision. Fortunately, many individuals who have already developed concept inventories welcome other faculty to use their exams and to add their data to the ongoing study of the instrument. Several of those examples follow.

Examples of Concept Inventories:

Concept inventories are most common in mathematics, the sciences, and engineering, but they can be applied to any field. The first widely-disseminated concept inventory was the Force Concept Inventory (Hestenes, Wells, & Swackhamer, 1992), which assesses basic understanding of Newtonian physics. There are also concept inventories to assess introductory knowledge in chemistry, digital logic (a branch of computer science), and statistics, among many others. Use the links below to view several examples (some require a password, which can be obtained by emailing the contact listed on the website). Many of the teams welcome other faculty to use the inventories and contribute additional data to ongoing evaluation projects.

  1. Force Concept Inventory (FCI) – http://modeling.asu.edu/R&E/Research.html
    • First widely-disseminated concept inventory
    • Developed by David Hestenes, Ibrahim Halloun, and Malcolm Wells
    • Assesses basic understanding of Newtonian physics
  2. Chemistry Concepts Inventory – http://jchemed.chem.wisc.edu/JCEDLib/QBank/collection/CQandChP/CQs/ConceptsInventory/CCIIntro.html
    • Developed by Doug Mulford
    • Assesses topics generally covered in the first semester of a college chemistry course
  3. Dynamics – http://www.esm.psu.edu/dci/
    • Developed by Gary Gray, Don Evans, Phillip Cornwell, Brian Self, and Francesco Costanzo
    • Assesses understanding in rigid body dynamics and particle dynamics
  4. Statistics – https://engineering.purdue.edu/SCI/index.htm
    • Developed by Teri Reed-Rhoads and Teri Jo Murphy
    • Assesses statistics understanding through 4 sub-tests: Descriptive, Probability, Inferential, and Graphical

Additional examples are available at https://engineering.purdue.edu/SCI/workshop/tools.html (Allen, 2007).

Learn More

The Faculty Development and Instructional Design Center will offer a workshop on this topic, “Concept Inventories: Measuring Learning and Quantifying Misconceptions,” on March 8, 2011, from 11:30 a.m. to 1:00 p.m. Registration details will be available soon.


References

Allen, K. (2007). Concept Inventory Central: Tools. Retrieved September 28, 2010, from https://engineering.purdue.edu/SCI/workshop/tools.html.

Hestenes, D. (2006). Notes for a Modeling Theory of Science, Cognition and Instruction. Retrieved October 1, 2010, from http://modeling.asu.edu/R&E/Notes_on_Modeling_Theory.pdf.

Hestenes, D., & Halloun, I. (1995). Interpreting the FCI. The Physics Teacher, 33, 502-506. Retrieved October 1, 2010, from http://modeling.asu.edu/R&E/InterFCI.pdf.

Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher, 30 (3), 141-151. Retrieved October 1, 2010, from http://modeling.asu.edu/R&E/FCI.PDF.

Mulford, D. (1996). Chemistry Concepts Inventory. Retrieved October 1, 2010 from http://jchemed.chem.wisc.edu/JCEDLib/QBank/collection/CQandChP/CQs/ConceptsInventory/CCIIntro.html.

Answers to sample questions

  1. C
  2. A

New Quick Tip on Classroom Civility

Have you ever wondered how to promote a respectful and civil environment in your classroom? Do you worry about how to respond to classroom disruptions and what to say to challenging students? Do you know NIU’s policy on classroom deportment? As a faculty member, what are your rights and responsibilities for maintaining a civil environment? Tim Griffin, University Ombudsman, and the Faculty Development and Instructional Design Center have created a new Quick Tip that addresses these concerns.

Quick Tip on Classroom Civility

You can view the Quick Tip here. A transcript of the presentation is available here. You can also subscribe to Faculty Development’s Quick Tips via RSS or iTunes.

Teaching Assistant Orientation Materials Available Online

posted in: Newsletter

The Fall 2010 Teaching Assistant Orientation (TAO) was a big success, and the largest yet. On August 17, 217 graduate assistants filled the Regency Room. Twelve NIU faculty and staff presented on a variety of topics, including Teaching and Teaching-Related Responsibilities, Managing Your Classroom Effectively, and Assisting Students with Emotional or Behavioral Concerns.


All of the TAO materials are now available online. While the materials are designed for graduate assistants, faculty may find the information valuable as well. If you work with a graduate assistant who was not able to attend TAO, feel free to point them to the materials. You can view the handouts and presentations here. Several of the sessions also have video tutorials that cover much of the information presented at TAO, offering a more engaging way to review the content.

Self and Peer Assessment

Sometimes, students need more than just their professors’ feedback. Students benefit from learning to assess their own work and from evaluating the work of their peers.

There are many benefits to self and peer assessment. The most obvious benefit of self-assessment is that it encourages autonomy and independence in students (Boud, 1995). It forces students to think critically about their work rather than relying on external feedback, which builds their skills in self-monitoring and self-correction (Exemplars, 2004). Both are essential skills in the workplace (Boud, 1995).

Peer assessment allows students to receive feedback from their peers. However, the greatest benefit comes from the process of assessing their peers. In many cases, students would never see any work but their own. Evaluating others’ work allows students to compare their own work to the work of their peers. The assessment process also requires students to analyze the criteria for excellence more closely, which may also cause them to internalize the criteria (Exemplars, 2004).

There are some challenges to using self and peer assessment in the classroom. Perhaps most importantly, students’ self-assessment skills may not be well developed before they arrive at the university (Boud, 1995). Students may need to be taught the skills necessary for effective critical reflection before being required to self-assess. And because self-assessment skills may be subject-specific, faculty cannot assume that skills taught in other courses will transfer to the current course.

Peer assessment is often viewed as punitive rather than constructive (Boud, 1995). Students may even fear receiving low scores from their peers. Similarly, peer assessment may focus on scores rather than providing constructive feedback. Faculty should take care to design peer assessments to encourage or require feedback and explanations as opposed to only numerical scores.

It can also be challenging to implement self and peer assessment. If the assessed work is a paper or other written document, it often becomes the faculty member’s responsibility to collect the assignments and distribute them for peer review. The faculty member must determine and track who assesses each assignment and ensure that the evaluations are collected. The Self and Peer Assessment Tool, one of the newest features in the Blackboard Course Management System, may make this process simpler.

The Self and Peer Assessment Tool allows faculty to establish criteria for assessing the assignments and to provide examples of model work. While creating the assessment, faculty can set submission and evaluation periods, which Blackboard strictly enforces. Faculty can also determine how many peer assessments each student must complete, as well as whether a self-assessment is required. Students submit their assignments using the tool, and Blackboard then randomly assigns assessment pairs and distributes the files. The faculty member may decide to make the pairs known or anonymous. Once the evaluations are complete, the faculty member may view or download the results and can send them to the Grade Center. To learn more about the Self and Peer Assessment Tool, go to http://www.niu.edu/blackboard/assessments/spa.
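To see why automated pair assignment is helpful, consider what a faculty member would otherwise coordinate by hand. The sketch below is a hypothetical illustration (not Blackboard’s actual algorithm) of one simple way to assign each student a fixed number of peers to review, with no self-review and an even reviewing load:

```python
import random

def assign_reviewers(students, k):
    """Assign each student k peers to review.

    Shuffles the roster, then has each student review the next k students
    in the shuffled order (wrapping around). This guarantees no one reviews
    their own work, everyone reviews exactly k papers, and every paper is
    reviewed exactly k times. Requires 0 < k < len(students).
    """
    if not 0 < k < len(students):
        raise ValueError("k must be between 1 and len(students) - 1")
    order = list(students)
    random.shuffle(order)
    n = len(order)
    return {
        reviewer: [order[(i + shift) % n] for shift in range(1, k + 1)]
        for i, reviewer in enumerate(order)
    }

# Example: five students, each reviewing two peers.
pairs = assign_reviewers(["Ana", "Ben", "Cy", "Dee", "Ed"], k=2)
```

Because reviewers are drawn from a shuffled roster, the instructor can still choose whether to reveal the pairings or keep them anonymous, just as the Blackboard tool allows.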

In short, both self and peer assessment are valuable tools that can increase learning by requiring students to critically evaluate their work and the work of their peers. The Blackboard Self and Peer Assessment Tool can simplify the process.


References

Boud, D. (1995). Enhancing learning through self assessment. New York, NY: RoutledgeFalmer.

Exemplars. (2004). The benefits of peer- and self-assessment. Retrieved from http://www.exemplars.com/resources/formative/assessment.html.
