
Thursday, April 16, 2015

Mapping Ed Tech Usage to Student Engagement - An SAMR Observation Rubric

by John R. Walkup, Ph.D.


Education technology holds some of the most exciting promises in K-12 instruction, with some calling it as essential to instruction as paper and pencils.

However, results have been mixed.

Few doubt the potential instructional power of well-designed education technology, but implementation has varied wildly across the country. While the amount of technology purchased for classrooms varies from district to district, inconsistencies in the way that technology is used in the classroom have also surfaced.

SAMR Model 

Many education technology pioneers have seen the need to categorize the quality of education technology implementation. The Substitution Augmentation Modification Redefinition (SAMR) model, developed by Ruben Puentedura, is the most commonly used and contains four levels, as described on the Educational Technology and Mobile Learning web page:
  • Substitution: Here, teachers or students are only using new technology tools to replace old ones, for instance, using Google Docs to replace Microsoft Word; the task (writing) is the same but the tools are different.
  • Augmentation: Here, we are still in the substitution mentality, but this time with added functionality. Again using the example of Google Docs, instead of only writing a document and having to manually save it and share it with others, Google Docs provides extra services like auto-saving, auto-syncing, and auto-sharing in the cloud.
  • Modification: Here, technology is used more effectively, not to do the same task with different tools but to redesign parts of the task and transform students' learning. An example is using the commenting service in Google Docs to collaborate and share feedback on a given task.
  • Redefinition: Here, students use technology to create new tasks that would have been inconceivable previously. An example of redefinition is "when students connect to a classroom across the world where they would each write a narrative of the same historical event using the chat and comment section to discuss the differences, and they use the voice comments to discuss the differences they noticed and then embed this in the class website".
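Because the four levels form an ordered scale, they can be sketched as an ordered type. The sketch below is illustrative only; the class and member names are mine, not part of the SAMR model, and the zero value anticipates the "no technology" case discussed later:

```python
from enum import IntEnum

class SAMRLevel(IntEnum):
    """SAMR levels as an ordered scale; 0 marks 'no technology in use'."""
    NONE = 0          # no education technology at all (outside SAMR proper)
    SUBSTITUTION = 1  # new tool, same task
    AUGMENTATION = 2  # same task, added functionality
    MODIFICATION = 3  # the task itself is redesigned
    REDEFINITION = 4  # a previously inconceivable task

# IntEnum gives the levels the ordering the model implies:
assert SAMRLevel.REDEFINITION > SAMRLevel.SUBSTITUTION
```

Encoding the levels as integers is what later lets an observation be scored numerically, one value per time segment.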
Another helpful resource on the SAMR model is offered on the Technology Is Learning web site, which offers insight into correlating the four levels of SAMR implementation with student engagement:

While one might argue over whether an activity can be defined as one level or another, the important concept to grasp here is the level of student engagement. One might well measure progression along these levels by looking at who is asking the important questions. As one moves along the continuum, computer technology becomes more important in the classroom but at the same time becomes more invisibly woven into the demands of good teaching and learning.

I would also recommend reading Will Kembley's blog on SAMR.

Examining the four SAMR levels, we would expect students to hold the most intense academic engagement when using technology at the Redefinition level, with engagement dropping as the SAMR levels drop to Modification, Augmentation, and finally Substitution.

However, to my knowledge, no studies have been conducted to map this correlation. This raises another question: What happens to student engagement when no education technology is used at all?

SAMR Implementation Versus Student Engagement

Mapping SAMR implementation to student engagement could offer classroom observers, such as researchers, administrators, coaches, and teachers, a clear illustration of the power of education technology. The figure below shows an observation rubric for carrying out such observations. You can download the rubric here: SAMR Engagement Rubric.

Each column of boxes corresponds to a predetermined segment of observation time, usually one minute. In the figure on the right, 15 columns have been shaded, indicating a total observation time of 15 minutes.

For an in-depth discussion of measuring academic engagement time and the use of the rubric, see our paper titled  "Bell to Bell."

Top Half

In the top half of the rubric, the four boxes in each column correspond to the four SAMR levels. When all four boxes are shaded, this denotes maximum technology implementation (i.e., the Redefinition level). Shading the three bottom-most boxes would indicate the Modification level, and so on.

For every minute of time that elapses during a classroom observation, the classroom observer simply shades the number of boxes corresponding to SAMR implementation for that time segment.
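The shading procedure for the top half can be mimicked in a short sketch. Everything here is illustrative: the function name is mine, and minutes are encoded as 0 (no technology) through 4 (Redefinition):

```python
def shade_top_half(samr_by_minute):
    """Render the rubric's top half as text: one column per minute,
    shading one box per SAMR level from the bottom up ('#' = shaded)."""
    rows = []
    for level in range(4, 0, -1):  # top row = Redefinition (4), bottom = Substitution (1)
        rows.append("".join("#" if m >= level else "." for m in samr_by_minute))
    return "\n".join(rows)

# Made-up five-minute segment: Augmentation, Augmentation,
# Modification, no technology, Substitution.
print(shade_top_half([2, 2, 3, 0, 1]))
```

A column with no shaded boxes simply records that no education technology was in use for that minute.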

Sample SAMR Implementation/Engagement rubric.
The sample shown here presents (hypothetical) results for a 15-minute observation, with each vertical column of boxes marking off a one-minute interval.

In the one-minute span after the three-minute mark, the teacher employed education technology at the Augmentation level.

For three minutes after the six-minute mark, the teacher employed technology at the Modification level.

Note that no education technology was employed for the first two minutes after the ten-minute mark. From that point on, implementation resided at the Substitution level.

Bottom Half

Now let's discuss the bottom half. For every minute of time that elapses during a classroom observation, the classroom observer simply shades in the proportion of students that appear academically engaged, with the proportion broken into fifths.
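If the observer counts engaged students directly, the engaged fraction can be rounded to the nearest fifth of the class to decide how many boxes to shade. The function below is a sketch under that assumption; the name is mine:

```python
def engaged_boxes(engaged_count, class_size):
    """Return the number of 'engaged' boxes (0-5) to shade for one minute,
    i.e., the engaged fraction rounded to the nearest fifth of the class."""
    return round(5 * engaged_count / class_size)

# 12 of 30 students engaged -> shade 2 of the 5 boxes
# (the remaining 3 boxes mark the disengaged share).
print(engaged_boxes(12, 30))
```

Shading the complement (disengaged students) in the remaining boxes is what guarantees that every column sums to five.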

The first column in the sample shows that 2/5 of the class was engaged during the first minute of observation, whereas 3/5 was disengaged.

Note that five boxes are always shaded in each column.*

Correlation Calculations

There are many ways of measuring the correlation between technology implementation and student engagement, some more straightforward than others. I suggest the Pearson correlation coefficient because Microsoft Excel already provides it as a built-in function.
Correlation calculation with the function used to display the result in Cell A7 shown in the function window.
To find the Pearson r-value, simply enter the numerical values associated with each column into an Excel spreadsheet, with one row corresponding to SAMR implementation and the other to student academic engagement (the number of boxes shaded above the horizontal line).

The PEARSON() function in Excel will generate the r-value for the two rows of data. The figure on the left shows how Excel found the Pearson coefficient for the sample rubric shown above; the formula that generated this result appears in the function window.

In social science research, an r-value of 0.71, as shown here, is considered a fairly strong correlation.

Future Thoughts

School improvement relies on clear, thorough measurement of classroom processes. Use of such a rubric could provide school administrators and teachers a powerful tool for initiating discussion on how education technology can improve the learning environment for all children.


* Separating engaged from disengaged students, rather than just marking the ratio of students engaged, has uses beyond the scope of this article. A forthcoming article, "Bell to Bell Engagement: New Tools for Measuring Academic Engagement Time in Direct Observation Studies," coauthored with my colleagues David Farbman and Ben Jones, will illustrate how to use the rubric to measure academic engagement time.


Seeking training at your school or district centered on Cognitive Rigor or Depth of Knowledge?  Call me at (559) 903-4014 or email me at jwalkup@standardsco.com. 

We will discuss ways in which I can help your teachers boost student engagement and deep thinking in their classrooms. I offer workshops, follow-up classroom observation/coaching, and curriculum analysis to anywhere in the country (and even internationally).
