Google Forms Quizzes and Substitution,
Augmentation, Modification, and Redefinition
(SAMR) Model Integration
Abstract
Web-based formative assessment technology has simplified how teachers capture
and analyze student data. As a web-based assessment and data gathering
application, Google Forms quizzes can be used to adapt content, individualize
instructional goals, collect performance data, and connect students and teachers
locally or from different parts of the world. Teachers can create and distribute
Google Forms formative assessments, resulting in synchronous student
performance feedback that communicates critical information related to
learning objectives for teachers and students. In addition, Google Forms quizzes
can be aligned and integrated with the technology benchmarks defined
in the Substitution, Augmentation, Modification, and Redefinition (SAMR) Model,
resulting in the creation of dynamic and customizable formative assessments in
ways never before conceptualized.
Keywords: Google Forms; formative assessment; SAMR Model; online
collaboration; synchronous feedback
Scott Castro
Boise State University
Google Forms is a free web-based data gathering tool that is part of the G Suite
applications by Google Cloud. Google Forms can be used to create surveys, polls, and
formative assessment quizzes. Google Forms quizzes allow teachers to create, share,
collaborate on, individualize, and distribute formative assessments to students, providing
synchronous feedback that generates measurable performance data critical for
evaluating student progress. Nicol and Macfarlane-Dick (2006) suggested “good feedback
practice is not only about providing accessible and usable information that helps students
improve their learning, but it is also about providing good information to teachers” (p.
14). Synchronous formative feedback has become a direct line for student self-regulated
learning (SRL) and instructional intervention, serving as an “important process as
teachers strive to instill SRL characteristics among their students” (Clark, 2012, p. 8).
Researchers agree that formative assessment impacts the quality of teaching and learning
and engages students in SRL and self-directed learning (SDL) environments, which can
be realized using Google Forms quizzes (Stiggins, Arter, Chappuis, & Chappuis, 2004).
Google Forms quizzes have an assortment of optional design features that
produce data for students and teachers. These optional design features, located in the
quiz creation Settings tab, include:
1. Missed questions, which show which questions were answered incorrectly;
2. Correct answers, which show the correct answer for each question;
3. Point values, which show total points and points received for each question
(“Choose a question for your form”, 2018).
Figure 1. Google Form quiz settings. Reproduced from Google Forms. Retrieved
from https://support.google.com/docs/answer/7032287?hl=en. Copyright 2018,
Google.
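To illustrate how the quiz settings shown in Figure 1 translate into a working assessment, the sketch below uses Google Apps Script's built-in FormApp service, written in TypeScript as used with the clasp tooling. This is a minimal sketch, not a procedure from the article: the form title, question, and choices are invented, and the three respondent-feedback toggles in Figure 1 remain Settings-tab options in the Forms editor, while correct answers and point values can be defined in script as shown.

```typescript
// Google Apps Script (TypeScript via clasp). FormApp and Logger are built-in
// Apps Script globals, so no imports are needed. The title, question, and
// choices below are invented examples.
function createGradedQuiz(): void {
  const form = FormApp.create('Chapter 3 Check-In'); // hypothetical quiz title
  form.setIsQuiz(true); // enables quiz mode, the basis of the Settings tab options

  // Define the correct answer and point value for one multiple choice item.
  const question = form.addMultipleChoiceItem()
    .setTitle('Which planet is closest to the sun?')
    .setPoints(2); // point value reported back per question
  question.setChoices([
    question.createChoice('Mercury', true), // marked as the correct answer
    question.createChoice('Venus', false),
    question.createChoice('Mars', false),
  ]);

  Logger.log('Edit the new quiz at: %s', form.getEditUrl());
}
```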
Google Forms quizzes include short response and multiple choice question types, but
only multiple choice questions provide synchronous feedback data. The complete question
design options in Google Forms quizzes include short response, multiple choice,
checkbox, linear scale, and grid (“Get started with forms”, 2016). Multiple
choice and checkbox questions share design characteristics, including a horizontal linear
display similar to traditional answer sheets. Additionally, multiple choice answers display
in a circular format, analogous to traditional bubble sheet recording documents like
Scantrons. One difference is checkbox answer choices, which allow
teachers to design questions that permit students to select more than one answer
choice. The option to select more than one answer follows logically from contemporary
computer-based K-12 Common Core assessments. For example, the Smarter Balanced
Assessment Consortium (SBAC) exam, presently administered in fifteen states (Gewertz,
2017), defines the function of selecting more than one answer choice as “multiple choice,
multiple correct responses” (“Smarter Balanced Question Types”, 2016). This optional
design function allows teachers to provide a testing framework equivalent to SBAC, as
the sketch below illustrates.
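Under the same Apps Script assumptions as the earlier sketch, a checkbox item can be graded with more than one correct answer, mirroring the SBAC "multiple choice, multiple correct responses" type. The prompt and choices are invented for illustration.

```typescript
// A checkbox item graded with more than one correct answer. Assumes a Form
// created as in the earlier sketch; the prompt and choices are invented.
function addMultiSelectQuestion(form: GoogleAppsScript.Forms.Form): void {
  const item = form.addCheckboxItem()
    .setTitle('Select ALL of the prime numbers below.')
    .setPoints(2);
  item.setChoices([
    item.createChoice('2', true),  // correct
    item.createChoice('3', true),  // correct
    item.createChoice('4', false),
    item.createChoice('9', false),
  ]);
}
```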
Google Forms quizzes also include several general features that provide security
measures and govern student access to data. These general features, shown in script
form below, include:
1. Collecting emails and sending receipts;
2. Restricted and required sign-in;
3. Limiting users to one response or allowing multiple;
4. Allowing respondents to edit after quiz submission;
5. Letting respondents see a summary of charts and text responses (“Choose a
question for your form”, 2018).
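A hedged sketch of how these general features map onto FormApp setters follows, under the same Apps Script assumptions as above. Note that setRequireLogin applies only to forms owned within a G Suite domain, and the email-receipt option is a Settings-tab toggle with no script setter.

```typescript
// Mapping the five general features above onto FormApp setters. Each setter
// returns the Form, so the calls chain.
function applyGeneralSettings(form: GoogleAppsScript.Forms.Form): void {
  form.setCollectEmail(true)             // 1. collect respondent email addresses
      .setRequireLogin(true)             // 2. restrict to signed-in domain users
      .setLimitOneResponsePerUser(true)  // 3. one response per user
      .setAllowResponseEdits(true)       // 4. respondents may edit after submitting
      .setPublishingSummary(true);       // 5. respondents see summary charts
}
```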
This review will explore Google Forms quizzes as a formative assessment tool and
provide comprehensive examples and integration strategies aligned with SAMR
Model technology benchmarks.
Interpreting Formative Assessment
There are several definitions of formative assessment. Black and Wiliam (1998a)
described formative assessment as “encompassing all those activities undertaken by
teachers, and/or by their students, which provide information to be used as feedback to
modify the teaching and learning activities in which they are engaged” (p. 7). Black and
Wiliam later re-examined this definition to include the notion that assessments become
formative “when evidence is actually used to meet the needs of students” (1998b, p.
140). Expanding on Black and Wiliam’s definition, Broadfoot, Daugherty, Gardner, Gipps,
Harlen, James, and Stobart (1999) argued that improving learning through assessment
depends on the following five factors:
“(1) the provision of effective feedback to pupils; (2) the active involvement of pupils in
their own learning; (3) adjusting teaching to take account of the results of assessment;
(4) a recognition of the profound influence assessment has on the motivation and self-
esteem of pupils, both of which are crucial influences on learning; and (5) the need for
pupils to be able to assess themselves and understand how to improve” (p. 7).
Furthermore, Black, Harrison, Lee, Marshall, and Wiliam (2004) argued that an assessment
activity can assist learning if it produces information that teachers and their students can
use as feedback in assessing themselves and one another. Most teachers agree that
formative assessment is a continuous instructional process that follows closely the
provisions Broadfoot et al. presented; its purpose is to evaluate, improve,
and support student learning. An example of this process can be observed when a
student is assigned a critical response essay aligned with specific learning objectives.
As an assessment tool, a teacher can use a rubric wherein feedback is recorded, offering
personalized feedback with the intent of providing a means for the student to improve
their writing. During this process, both student and teacher evaluate student performance
and address potential needs to improve upon prior performance. Black et al. (2004)
reinforced this point by suggesting such an assessment becomes formative when
the “evidence is actually used to adapt the teaching work to meet learning needs” (p. 10).
In the preceding critical response writing example, a student can amend their work based
on teacher feedback, and the teacher can use results to facilitate focused and guided
instructional objectives. When interpreting student feedback, formative assessment
provides an instructional framework for teachers, offering measures to adapt,
individualize, and differentiate instructional goals to support students. It is critical to
consider that “an assessment activity can help learning if it provides information that teachers
and their students can use as feedback in assessing themselves and one another and
in modifying the teaching and learning activities in which they are engaged” (Black et
al., 2004, p. 10). Despite varying interpretations and definitions of formative assessment,
one can summarize it as a procedure “carried out during the instructional process for
the purpose of improving teaching or learning” (Hammerness & Rust, 2005, p. 275) that
“is used to provide information on the likely performance of students...where feedback
is given to students...telling them which items they got correct” (Wiliam & Thompson,
2008, p. 60). Google Forms quizzes enable teachers to deliver formative assessment
tasks to their students, allowing the creation of dynamic and original content that can be
integrated into all four technology benchmarks of the SAMR Model.
The SAMR Model
Developed in 2006 by Dr. Ruben Puentedura as part of his work with the Maine Learning
Technology Initiative, the SAMR Model “provides a framework for teachers designed to
improve the integration of emerging technologies into their daily lessons” (Hilton, 2016, p.
68). Within this framework, the SAMR Model deploys a hierarchical structure consisting
of four distinct levels with specific technology integration benchmarks. These levels and
benchmarks include:
Substitution: technology acts as a direct tool substitute, with no functional
change;
Augmentation: technology acts as a direct tool substitute, with functional
improvement;
Modification: technology allows for significant task redesign;
Redefinition: technology allows for the creation of new tasks previously inconceivable
(Puentedura, 2013).
The SAMR Model follows closely with recent revisions of Bloom’s Taxonomy. Forehand
(2010) described Bloom’s Taxonomy as a “multi-tiered model of classifying thinking
according to six cognitive levels of complexity” (p. 42). These classifications, from lower-
level to higher-order thinking, are as follows: Knowledge, Comprehension, Application,
Analysis, Synthesis, and Evaluation. In 2001, former Benjamin Bloom student Lorin
Anderson revised Bloom’s Taxonomy and published a new list of categories from lower-
to higher-order thinking (Churches, 2008, p. 2). Anderson’s changes included eliminating
the noun form of each classification and replacing it with a verb. The revised lower-
to higher-order thinking levels are as follows: Remembering, Understanding, Applying,
Analyzing, Evaluating, and Creating (Churches, 2008).
Each revised level of Bloom’s Taxonomy can be connected to SAMR benchmarks.
For example, Substitution and Augmentation in the SAMR Model enhance learning,
meaning they use technology to replace or improve upon traditional learning exercises.
Puentedura (2014) recommended that Substitution and Augmentation be associated
with the “three lower levels of Bloom (Remember, Understand, and Apply)” (para. 3). The
tasks of Modification and Redefinition transform learning, creating new opportunities
that were previously unattainable through traditional measures absent of technology
(Kirkland, 2014). Puentedura noted that “Modification and Redefinition are associated with
the upper levels of Bloom,” including Analyze, Evaluate, and Create (Puentedura, 2014,
para. 3). Although the connection to Bloom can be used, “it is important to realize the
association between SAMR and Bloom’s Taxonomy is not a necessary or even habitual
coupling…the simple structure described is well-suited to beginning practitioners’ needs
and even retains usefulness for more experienced faculty” (Puentedura, 2014, para. 5).
It is worth considering that using Bloom’s Taxonomy as a framework with
SAMR presents a familiarity to teachers, increasing the likelihood of a smooth
transition when integrating technology into learning tasks. Teachers must also consider
the rapid development of new technologies being evaluated for classroom integration.
Because of this, SAMR benchmarks and how they are achieved appear to be in constant
fluctuation due to technology enhancements and improvements. These improvements and
enhancements have a direct impact on decision making by beginning and advanced
technology educators and play a critical role when implementing SAMR. With this in
mind, each SAMR Model benchmark related to Google Forms quiz
integration will be presented in further detail, beginning with Substitution.
Substitution. Hockly (2013) stated that Substitution is the simplest way to implement
mobile learning, and when evaluating whether an activity is part of the Substitution
phase, Puentedura (2015) posed the question, “What will be gained by replacing older
technology with the new technology?” (p. 3). Hilton (2016) defined Substitution as the
“use of technology for a task that could be accomplished without technology,” making
technology gains at this stage insignificant (p. 68). For example, during a traditional
quiz, when students are presented with multiple choice or written response questions,
they are required to use a writing tool, such as a pencil or pen, to record responses. The
assessment delivery method is tangible, including a test and answer document. As a
direct tool substitute, Google Forms quizzes can accomplish these objectives without
functional modifications using mobile or laptop technology. For example, Google Forms
quizzes may include both multiple choice and short response questions. Instead of using
a traditional pen or pencil to record responses, students use a keyboard, mouse, and
touch technology. Even though administering a Google Forms quiz with mobile or laptop
technology differs from a traditional multiple choice and written response assessment,
the outcome produces no functional change regardless of the technology hardware
involved. As teachers explore the design options in Google Forms quizzes, they will
discover advancement beyond the Substitution phase of the SAMR Model. When
transitioning from Substitution to Augmentation, teachers must investigate whether they have
added an improvement to the task process that could not be accomplished with older
technology and explore whether this feature contributes to the design (Puentedura, 2013).
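A Substitution-level sketch under the same Apps Script assumptions: the multiple choice and short response questions a paper quiz would carry, recreated digitally with no functional change beyond the delivery medium. The quiz title and questions are invented.

```typescript
// Substitution level: the same questions a paper quiz would hold, moved to a
// screen with no functional change. Titles and prompts are invented.
function createSubstitutionQuiz(): void {
  const form = FormApp.create('Unit 2 Vocabulary Quiz'); // hypothetical title
  form.setIsQuiz(true);

  // Multiple choice, just as on the paper original.
  const mc = form.addMultipleChoiceItem()
    .setTitle('Which word is a synonym of "rapid"?');
  mc.setChoices([
    mc.createChoice('swift', true),
    mc.createChoice('sluggish', false),
    mc.createChoice('gradual', false),
  ]);

  // Short written response, graded by hand exactly as on paper.
  form.addTextItem().setTitle('Use "rapid" in an original sentence.');
}
```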
Augmentation. Learning activities positioned within the Substitution and Augmentation
classifications are said to enhance learning. In the Augmentation phase, “Technology acts
as a direct tool substitute with functional improvement” (Puentedura, 2013). To establish
whether Google Forms quizzes provide a functional improvement, one must consider
the default accessibility functions on laptops and mobile devices. Traxler (2010) stated,
“Mobile devices, especially connected devices, enable students to consume—that is, to
access and store—all sorts of knowledge almost instantly and almost wherever they are,
with little or no effort compared with earlier technologies” (p. 154). For example, when
using an Apple iOS device to take a Google Forms quiz, students can select an unknown
word and use the define technology. The result of this process is a dictionary pop-up
window that displays the definition of the word and a thesaurus. These options allow
students to acquire vocabulary parallel to their independent skill level. Define technology
is a functional improvement over traditional dictionary and thesaurus counterparts
because the student has immediate access to word definitions and synonyms. In a
learning environment without this technology, the process of looking up words in physical
texts is prolonged and can take away from instructional time. Along with define options,
iOS devices also include speak technology. Speak technology allows students to hear
a phonetic pronunciation of a word spoken to them, including customizable accents.
Speak technology is a functional improvement because content adjusts to learning
styles, providing a phonetic media alternative not available in traditional dictionaries. It
is worth noting that speak and define technologies are not design features of Google
Forms quizzes; instead, they are embedded accessibility tools on laptops, desktops,
and mobile devices.
Further functional improvements in the Augmentation phase include sharing and accessing
Google Forms quizzes remotely. Accessing quizzes remotely empowers teachers to use
a flipped learning pedagogical approach, so students can access and complete content
prior to attending class. Additionally, Google Forms quizzes can integrate video and
photos into multiple choice and short response question sequencing. This feature is a
functional improvement because it merges media into quizzes, following logically from the
media-driven questioning presented during computer-based state assessments like
SBAC. Video integration in Google Forms quizzes is limited to YouTube content and
may present challenges to teachers at school because of potential network blocking
of YouTube. Despite these limitations, most YouTube content contains automated closed-
captioning technology, which can be used to individualize Google Forms quizzes for
students with hearing impairments or other learning disabilities.
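A short sketch of video integration under the same Apps Script assumptions: only YouTube URLs are accepted, and the URL below is a placeholder rather than a real video.

```typescript
// YouTube video integration in the Augmentation phase. The URL is a
// placeholder; Forms accepts only YouTube content for video items.
function addVideoPrompt(form: GoogleAppsScript.Forms.Form): void {
  form.addVideoItem()
      .setTitle('Watch this clip before answering the next questions.')
      .setVideoUrl('https://www.youtube.com/watch?v=VIDEO_ID'); // placeholder
}
```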
Photo integration with Google Forms quizzes allows more flexibility than videos. Teachers
can upload photo media from a screenshot, hard drive, URL, Google Photos album,
browser search, or Google Drive. Photo integration can also be used for multiple
choice answers, making it adaptive for visual learners. However, photo integration lacks
meta descriptions of photos, which is not equitable for students with visual impairments,
something teachers should take into consideration. Additional functional improvements
in Google Forms quizzes include creating distinct numerical point values for questions,
randomizing answer choices, restricting quizzes to specific users, allowing or
disallowing response edits, and permitting users to take a quiz multiple times or only once.
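The sketch below gathers these Augmentation options under the same Apps Script assumptions. The image URL is a placeholder; question shuffling here randomizes question order, while per-question choice shuffling remains an editor-side toggle.

```typescript
// Photo integration plus the other Augmentation options named above.
function augmentQuiz(form: GoogleAppsScript.Forms.Form): void {
  // Fetch an image from a URL and embed it in the quiz.
  const diagram = UrlFetchApp.fetch('https://example.com/diagram.png').getBlob();
  form.addImageItem()
      .setTitle('Refer to this diagram for the next three questions.')
      .setImage(diagram);

  form.setShuffleQuestions(true)         // randomize question order
      .setRequireLogin(true)             // restrict the quiz to specific (domain) users
      .setAllowResponseEdits(false)      // lock responses after submission
      .setLimitOneResponsePerUser(true); // a single attempt per student
}
```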
Modication. Learning activities positioned within the Modication and Redenition
classications transform learning. When in the Modication phase, “Technology allows
for signicant task redesign” (Puentedura, 2013). First, upon completion of a Google
Forms quiz, students receive immediate performance feedback. Winne and Butler
(1994) describe feedback as “information with which a learner can conrm, add to, and
overwrite, tune, or restructure information in memory” (p. 5740). Unlike traditional pen
and paper formative multiple-choice assessments, feedback for Google Forms multiple
choice questioning allows teachers to enable an immediate auto-grading feature that
shows incorrect and correct answers and question point values. It is critical for quiz
feedback to be visibly aligned with questions, allowing students to regulate their learning.
Clark and Mayer (2011) supported this design technique in their cognitive multimedia
research, concluding “When feedback is provided on a page separate from the quiz
question to which it is referring, it is more dicult for the learner to relate the feedback to
the quiz response” (pg. 92). The inclusion of answers with questions should be taken into
consideration by teachers designing Google Form quizzes so students can self-regulate
learning. Question point values and written answer feedback must be set by the teacher
and can be customized for each question and displayed when a quiz is completed.
Additionally, teachers can enable permissions for students to observe whole class
scoring data when they nish the quiz, so that they can measure performance outcomes
relative to peers. It is worth noting students do not have access to individual student
scores with this feature, only question-to-question class data provided in graph format.
Enabling these functions allows quality feedback and communication among teacher
and student. Nicol and Macfarlane (2005) asserted “quality feedback is information that
helps students troubleshoot their own performance and self-correct: that is, it helps
students take action to reduce the discrepancy between their intentions and the resulting
eects” (p. 10).
Another way that Google Forms quizzes enable significant task redesign is through
conditional logic-branching questions. Conditional logic-branching questions allow
teachers to create a quiz with unique sections of questions. For example, when taking a
quiz with logic-branching questions, students can complete the assessment more rapidly
based on correct answer selections. In contrast, if students answer questions incorrectly
during conditional logic-branching questioning, the quiz could continue until students
show mastery of content. Conditional logic-branching questions allow teachers to
gather valuable feedback “prioritizing areas of improvement” (Nicol & Macfarlane-Dick, 2006)
by adapting in real time based on student understanding. The inclusion of conditional
logic-branching questions and synchronous feedback constitutes a significant task
redesign with technology, allowing teachers to deliver innovative technology integration
for students.
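A sketch of conditional logic-branching under the same Apps Script assumptions: a correct answer routes past the review section, while an incorrect answer routes into it. Choices used for navigation route rather than auto-grade, so the gating question carries no point value here. Section titles and questions are invented.

```typescript
// Conditional logic-branching with page breaks: wrong answers detour
// through a remediation section before rejoining the main path.
function createBranchingQuiz(): void {
  const form = FormApp.create('Fractions Mastery Check').setIsQuiz(true);

  // Section 1: the gating question.
  const gate = form.addMultipleChoiceItem().setTitle('What is 1/2 + 1/4?');

  // Section 2: remediation, shown only to students routed here.
  const review = form.addPageBreakItem().setTitle('Review: adding fractions');
  form.addTextItem().setTitle('Explain how to find a common denominator.');

  // Section 3: students who answered correctly land here directly; students
  // who completed the review reach it by normal progression.
  const challenge = form.addPageBreakItem().setTitle('Challenge problems');

  gate.setChoices([
    gate.createChoice('3/4', challenge), // correct -> skip the review
    gate.createChoice('2/6', review),    // incorrect -> remediate first
  ]);
}
```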
Redenition. In the Redenition phase, “Technology allows for the creation of new
tasks, previously inconceivable” (Puentedura, 2013). Redenition is the highest phase
in the SAMR Model, and teachers must rst explore the Share features in Google Forms
quizzes to connect with teachers and students. For example, teachers and students
from dierent schools can work collaboratively to create and produce personalized
Google Forms quizzes by utilizing the quiz sharing option. One example of how this
activity can work is having teachers from dierent schools in the same school district
collaborate, create, and produce a series of shared Google Forms quizzes related to
shared curriculum. When a Google Forms quiz is nalized, the teachers could share it
on a learning management system like Google Classroom so students can assess their
understanding of the topic. Upon completion of the quiz, students from both schools
could engage in a discussion board related to the quiz topic. They can engage in a
collaborative analysis discussion on the performance data, allowing for the construction
of new knowledge. In this example, Google Forms quizzes serve as not only a formative
assessment tool but also one that connects students to shared learning goals through
communication and collaboration, creating a transformative technological task.
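A minimal sketch of this Share workflow under the same Apps Script assumptions: a co-teacher at another school receives edit rights, and the published link is what would be posted to a learning management system such as Google Classroom. The email address is hypothetical.

```typescript
// Cross-school collaboration on one quiz: grant edit rights and surface the
// student-facing link for posting to an LMS.
function shareQuiz(form: GoogleAppsScript.Forms.Form): void {
  form.addEditor('co.teacher@partner-school.example'); // hypothetical collaborator

  Logger.log('Student-facing link: %s', form.getPublishedUrl());
  Logger.log('Collaborator editing link: %s', form.getEditUrl());
}
```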
Using the Share function in Google Forms quizzes is not limited to teachers in the same
school or district. Teachers can also connect with a broader range of technology educators
all over the world using Google Educator Groups (GEGs). GEGs are “communities of
educators who learn, share, and inspire each other to meet the needs of their students
through technology solutions, both in the classroom and beyond” (“Google Educator
Groups”, 2018). GEGs are created with the social media platform Google Plus and are
for teachers who aspire to innovate with technology for their students. Once teachers create a GEG
or join one in progress, they can establish shared learning goals with group educators
and proceed to share, collaborate on, and create Google Forms quizzes, resulting in the
creation of new opportunities for students and teachers to connect and collaborate on
a global scale. In GEGs, Google Forms quizzes can be administered on a Google Plus
message board, so students and teachers can gather, analyze, and interpret data from
all participating group members. This example is transformative because it creates a
global task using technology that was previously implausible, redefining how teachers
can connect themselves and students to one another around the world.
Limitations of Google Forms Quizzes
As with most technology, Google Forms quizzes are most practical when administered
by individuals who have proper training. To effectively use Google Forms quizzes, it is
suggested teachers become qualified in implementation strategies by becoming a Google
Certified Educator or undergoing similar professional development. Completing certification
or professional development requires time and money teachers may not have the opportunity
to invest, presenting a potential barrier. Also, Google Forms quizzes require an internet
connection. The 2013 US Census reported the population at 316.2 million, with
98% having access to the internet (“Computer and internet use in the United States”, 2014).
These data suggest that 2% of the United States population, some of whom are students, are
without internet access, creating potential concerns about student connectivity to Google
Forms quizzes. Moreover, in a school setting, Google Forms quizzes may encounter
network security that prohibits administering assessments that contain embedded media
content, creating a potential obstruction for teachers who want to embed media content
for students with physical or learning disabilities. Lastly, GEG meetings could become
complicated because of the varying time zones of participants, making it problematic for
teachers and students to communicate in real time.
Conclusion
As a web-based formative assessment tool, and with careful consideration, Google
Forms quizzes can achieve all four SAMR Model technology benchmarks. When
designing Google Forms quizzes using SAMR Model benchmarks, teachers can embed
media content, design logic-branching questioning, individualize and adapt content, and
provide students with synchronous feedback and data to support learning through self-
regulation. Additionally, teachers can communicate and share Google Forms quizzes in a
global domain, reaching students and teachers in venues never before conceptualized.
Despite the enhanced and transformative functions in Google Forms quizzes, not all
teachers achieve each SAMR benchmark, and holding this expectation is unrealistic.
Instead, SAMR can be interpreted as a guide or reflective framework for beginning and
advanced teachers who aspire to integrate more technology into lessons. Moreover, it is
critical to consider that not all teachers maintain comparable technology skills. Teachers
using SAMR as a model for integration should work within the levels they find familiar
and comfortable, never forcing SAMR integration. Despite varying pedagogical
approaches to SAMR integration in the classroom, Google Forms quizzes provide
beginning and experienced technology educators with opportunities to carry out SAMR
Model benchmarks, functioning as a valuable formative assessment tool for teachers
and students.
References
Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and
assessing: a revision of Bloom’s taxonomy of educational objectives. New York:
Longman.
Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2004). Working inside the
black box: Assessment for learning in the classroom. Phi Delta Kappan, 86(1),
8–21.
Black, P. J., & Wiliam, D. (1998a). Assessment and classroom learning. Assessment in
Education: Principles, Policy, and Practice, 5(1), 7–73.
Black, P. J., & Wiliam, D. (1998b). Inside the black box: Raising standards through
classroom assessment. Phi Delta Kappan, 80(2), 139–148.
Broadfoot, P. M., Daugherty, R., Gardner, J., Gipps, C. V., Harlen, W., James, M., et
al. (1999). Assessment for learning: beyond the black box. Cambridge, UK:
University of Cambridge School of Education.
Butler, D.L., & Winne, P.H. (1995). Feedback and self-regulated learning: a theoretical
synthesis. Review of Educational Research, 65(3), 245-281.
Churches, A. (2008). Bloom’s taxonomy blooms digitally. Tech & Learning, 1, 1-6.
Clark, I. (2012). Formative assessment: assessment is for self-regulated learning.
Educational Psychology Review, 24(2), 205–249.
Clark, R. C., & Mayer, R. E. (2011). E-Learning and the science of instruction: proven
guidelines for consumers and designers of multimedia learning (3rd ed.). San
Francisco, CA: John Wiley & Sons.
Docs Editors Help: Choose a question for your form (2018). Retrieved April 02, 2018,
from https://support.google.com/docs/answer/7322334?hl=en
Forehand, M. (2010). Bloom’s taxonomy. In M. Orey (Ed.), Emerging perspectives on
learning, teaching, and technology (pp. 41–47).
Gewertz, C. (2017). Which states are using PARCC or Smarter Balanced: an interactive
breakdown of states’ 2016-17 testing plans. Education Week, 36(21). Retrieved
from https://www.edweek.org/ew/section/multimedia/states-using-parcc-or-smarter-balanced.html
Google Educator Groups. (n.d.). Retrieved April 22, 2018, from
https://www.google.com/landing/geg/
Google Learning Center: Get started with forms (2016). Retrieved from
https://gsuite.google.com/learning-center/products/forms/get-started/
Hilton, J. T. (2016). A case study of the application of SAMR and TPACK for reection
on technology integration into two social studies classrooms. The Social
Studies, 107(2), 68-73, doi: 10.1080/00377996.2015.1124376
Hockly, N. (2013). Technology for the language teacher: mobile learning. ELT Journal,
67(1), 80–84.
Kirkland, A. (2014). Models for technology integration in the learning commons. School
Libraries in Canada, 32(1), 14–18.
Nicol, D. J., & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated
learning: a model and seven principles of good feedback practice. Studies in
Higher Education, 31(2), 199-218.
Puentedura, R. (2013). Moving from enhancement to transformation. Retrieved from
http://www.hippasus.com/rrpweblog/archives/2013/05/29/SAMREnhancementToTransformation.pdf
Puentedura, R. (2014). SAMR and Bloom’s Taxonomy: assembling the puzzle.
Retrieved from https://www.commonsense.org/education/blog/samr-and-blooms-taxonomy-assembling-the-puzzle
Smarter Balanced Question Types. (2016). Retrieved from
http://www.cde.ca.gov/ta/tg/sa/question-types.asp
Stiggins, R. J., Arter, J. A., Chappuis, J., & Chappuis, S. (2004). Classroom assessment
for student learning: doing it right--using it well. Portland, OR: Assessment
Training Institute.
Traxler, J. (2010). Students and mobile devices. ALT-J: Research in Learning
Technology, 18(2), 149–160.
U.S. Census Bureau (2014). Computer and internet use in the United States: 2013.
American Community Survey Reports. Retrieved from
https://www.census.gov/history/pdf/2013computeruse.pdf
Wiliam, D., & Thompson, M. (2008). Integrating assessment with learning: what will it
take to make it work? In C. A. Dwyer (Ed.), The future of assessment: Shaping
teaching and learning (pp. 53–82). Mahwah, NJ: Erlbaum.
Winne, P. H., & Butler, D. L. (1994). Student cognition in learning from teaching. In
T. Husén & T. N. Postlethwaite (Eds.), International encyclopedia of education
(2nd ed., pp. 5738–5745). Oxford, UK: Pergamon.