By Ana Marjanovic, Instructional Designer


You have graded the first assignment of the semester and it hits you: the class as a whole did not perform as well as you expected. Why is that, you keep asking, when you spent considerable time exploring approaches to teaching and learning, setting course objectives, creating assignments, and selecting materials?

As distressing as this scenario may seem, there is, luckily, an often overlooked step in curriculum development that addresses this issue: the needs analysis. Needs analysis is defined as the process of identifying “the gap between the current state of performance and the desired state of performance” (Altschuld and Kumar qtd. in Stefaniak 34). Leading authors in the field emphasize a comprehensive and systematic approach to gathering needs analysis data (Stefaniak 47). In more general terms, the needs assessor should gather information from people who work in various departments of an institution to ensure an accurate and comprehensive account of what is causing the gap in performance. This article, however, proposes a framework and tools that help university instructors identify, understand, assess, and address students’ performance gaps and improve learning outcomes for all students. Blackboard tools (such as rubric reports, column statistics, and surveys) can help instructors gather the data needed to address student performance gaps and prioritize learning needs.

Framework

There is a variety of needs assessment models adjusted to different types of organizations and situations (Stefaniak 86). One model that helps identify and assess performance gaps is Gilbert’s Behavioral Engineering Model (BEM), developed in 1978. Thomas Gilbert determined that the three most important factors determining performance are Information, Instrumentation, and Motivation, and he considered all three in relation to both individuals and their environment (Stefaniak 88). In the context of higher education, the individuals correspond to students, while the instructor’s teaching approach, class modality, and use of technology constitute the environment.

There is no single correct way of implementing the method; indeed, Stefaniak cautions against rigidity in implementing needs assessment models (97). Therefore, for the purposes of this paper, the BEM will be interpreted through the lens of an instructor acting as the needs assessor in their own class: the model will be adjusted to compare the course design with student performance in an attempt to identify misalignments between the two. Even though this approach demands a great deal of self-reflection and introspection from instructors, it is essentially results-driven and student-centered.

The first factor, Information, takes into account the instructor’s performance expectations (e.g., assignment instructions) and the students’ existing knowledge and skill set for executing the task. Some introspective questions that can guide the instructor: Are the assignment instructions clear to students? Do my students need a prerequisite to register for my course? Does the assignment require prior knowledge that is beyond the scope of this course? The answers to these questions can then be compared with the actual grade distribution for the assignment. The instructor could also administer anonymous surveys or set up journals and discussion boards where students reflect on the same questions. The instructor can then organize all the data in a chart comparing their own answers with students’ reflections and/or assignment grades to determine what caused the gap in student performance.

The Instrumentation factor examines whether the course design, as well as institutional support, provides the resources and tools students need to achieve the desired level of performance; in relation to students, it examines their capacity to complete the assigned tasks. Some guiding questions for instructors: Do all students have access to resources such as the Internet, class materials, labs, or technology? Does my course design allow students enough time to acquire information and practice a skill in each learning unit? The instructor can ask students the same questions in a survey, discussion, journal, or paper and then triangulate the data.

Motivation, the third factor, encompasses not only the incentives and rewards students may gain in the course but also the consequences and accountability for not executing the tasks. From the instructor’s point of view, motivating students could include a fair and transparent grading scheme, access to a professional network, alignment of the course content with industry standards, and job-search support; it could also take the form of punitive class policies for late assignment submissions, attendance accountability, and so on. The instructor may administer surveys or organize discussions or one-on-one conferences to gauge students’ motivation to succeed in class.

Collecting the Data

Learning management systems (LMSs) provide excellent tools for data collection and analysis, which can help the instructor establish patterns in student performance and devise solutions for bridging the performance gap. Tests and surveys offer a multitude of question formats (multiple choice, scale, open-ended) that allow instructors to track students’ performance, attitudes, perceptions, and more. The instructor may keep responses anonymous in non-gradable surveys or administer gradable tests that record students’ names; the choice of question types depends on what the instructor wishes to achieve with the needs analysis. For example, if the needs analysis is conducted after a class assignment in the form of a survey, the instructor may ask students to rate the difficulty of the assignment on a scale and/or reflect on how they approached it. The instructor can then triangulate the actual scores with the survey data to better understand what caused the performance gap.
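The triangulation step described above can be sketched in a few lines of Python. The records below are invented for illustration; in practice they would come from an exported Grade Center column paired with survey responses:

```python
from statistics import mean

# Hypothetical data: each record pairs a student's assignment score (0-100)
# with that student's survey rating of the assignment's difficulty
# (1 = very easy, 5 = very hard).
records = [
    (92, 2), (85, 2), (64, 4), (58, 5), (71, 3),
    (88, 1), (60, 4), (55, 5), (79, 3), (95, 1),
]

# Group scores by reported difficulty to see whether students who found
# the assignment harder also tended to score lower.
by_rating = {}
for score, rating in records:
    by_rating.setdefault(rating, []).append(score)

for rating in sorted(by_rating):
    print(f"difficulty {rating}: mean score {mean(by_rating[rating]):.1f}")
```

If the mean score falls steadily as the reported difficulty rises, the survey corroborates the grade data and points to a genuine performance gap rather than, say, unclear instructions affecting only a few students.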

Grade Center columns in Blackboard’s Full Grade Center automatically calculate the grade distribution and provide measures of central tendency (mean, median, and mode) and measures of variability (standard deviation and variance).
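Blackboard computes these statistics automatically, but the same measures can be reproduced with Python’s standard-library statistics module, which is useful for checking an exported grade column outside the LMS (the score list below is invented for illustration):

```python
import statistics

# Hypothetical Grade Center column: scores on a 100-point assignment.
scores = [95, 88, 72, 88, 64, 55, 79, 88, 91, 60]

print("mean:    ", statistics.mean(scores))              # central tendency
print("median:  ", statistics.median(scores))
print("mode:    ", statistics.mode(scores))
print("stdev:   ", round(statistics.stdev(scores), 2))   # variability (sample)
print("variance:", round(statistics.variance(scores), 2))
```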

However, one of the quickest needs analysis tools in Blackboard (Bb) is the rubric evaluation report, which is generated automatically when the instructor grades a Bb assignment using a Bb-integrated rubric. Figure 1 shows the rubric statistics report for a composition assignment that assesses students’ research, analysis, evaluation, and persuasive-writing skills. The rubric has five criteria: working thesis statement, research results and annotated bibliography, sample explanatory body paragraph, essay outline structure, and grammar and style. The performance levels range from exceeding expectations to failure. The red bar shows the maximum points possible for each criterion; the blue bar represents the average student score per criterion. The graph thus provides a visual representation of the gap between actual and desired student performance.

Figure 1. Rubric Statistics Report

In addition to Blackboard tools for needs analysis, the instructor can use class time or office hours for reflections or discussions that may open new perspectives on analyzing student performance.

Closing the Performance Gap – Devising Solutions

Collecting and triangulating data from multiple sources (students’ grade distribution, survey results, reflections, journals, discussions, and feedback, alongside the instructor-led course analysis using the BEM framework described above) allows for greater transparency and accuracy in determining what caused the gap in student performance, and it helps the instructor devise solutions to close it. For example, the rubric statistics report in Figure 1 shows a pronounced performance gap of almost 8 points for the ‘research results and annotated bibliography’ criterion. To close that gap, the instructor can provide extra instruction on research techniques, organize a research workshop, or invite a librarian as a guest speaker.
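The per-criterion gap that the rubric report visualizes is simply the maximum possible points minus the class average for each criterion. A minimal sketch, using invented scores shaped like the Figure 1 rubric, shows how to compute the gaps and rank the criteria that most need intervention:

```python
from statistics import mean

# Hypothetical rubric data mirroring the Figure 1 criteria: for each
# criterion, the maximum points possible and a list of student scores.
rubric = {
    "working thesis statement":            (20, [18, 16, 19, 15, 17]),
    "research & annotated bibliography":   (20, [12, 10, 13, 11, 14]),
    "explanatory body paragraph":          (20, [17, 15, 18, 16, 14]),
    "essay outline structure":             (20, [18, 17, 16, 19, 18]),
    "grammar and style":                   (20, [16, 15, 17, 14, 18]),
}

# Performance gap per criterion: maximum points minus the class average.
gaps = {name: max_pts - mean(scores)
        for name, (max_pts, scores) in rubric.items()}

# The criterion with the widest gap is the first candidate for extra instruction.
widest = max(gaps, key=gaps.get)
for name, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{name}: gap of {gap:.1f} points")
```

With this data the research criterion surfaces at the top of the list, which is exactly the signal an instructor would use to schedule a research workshop or a librarian visit.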

Needs analysis, one of the most overlooked steps in the instructional design process, is critical to determining what causes a gap between current and desired performance. In the context of higher education, Gilbert’s Behavioral Engineering Model (BEM) was interpreted here as a framework for college instructors to conduct a comprehensive needs analysis in their courses, determine the causes of students’ performance gaps, and devise solutions to close them. A variety of Bb tools, such as Grade Center column statistics, rubric statistics reports, surveys, tests, discussions, and journals, can be used to gather data to better understand the misalignment between desired and actual student performance. Instructors, as needs assessors, should not only collect data from students but also reflect on their own course design strategies.

Works Cited

“Needs Analysis Overview.” Instructional Design Central, https://www.instructionaldesigncentral.com/needs-analysis. Accessed 28 Dec. 2021.

Stefaniak, Jill E. Needs Assessment for Learning and Performance. Taylor and Francis, 2020. Kindle Edition.
