Volume X Number 1, August 2004

Universally Designed Online Assessment: Implications For The Future

Michael Abell
Debra Bauder
Thomas Simmons
University of Louisville

ABSTRACT

The 1997 Amendments to the Individuals with Disabilities Education Act (IDEA-97) mandated that students with disabilities be afforded increased access to the general education curriculum. As an outgrowth of this mandate, educators, assessment developers, publishers, and researchers should begin to expand the research base of classroom assessment to include emerging online assessment tools. The purpose of this article is to shed light on, and begin to contextualize, how next-generation assessment tools could function to meet individual student learning and assessment needs. Future online assessment tools have the power to provide immediate, individualized feedback for both students and teachers. By offering the assessment process within a universally designed framework, students gain individualized feedback based upon their unique learning styles, while teachers gain just-in-time feedback to gauge teaching styles and content acquisition.


UNIVERSALLY DESIGNED ONLINE ASSESSMENT TOOLS

In the growing field of Universal Design, online assessment should play an expanded role. Wiggins (1998) defines assessment as the deliberate use of many methods to obtain evidence of whether students are meeting standards. Evidence is gathered through formal and informal assessment, including observations, dialogues, traditional quizzes and tests, performance tasks and projects, and student self-assessments. Online assessment has begun to emerge as a new tool that will impact student instruction and assessment. Currently, the dimensions identified by Wiggins are not prevalent in the online assessment tools available to teachers, due to technological and theoretical limitations. Many of the premier online assessment vendors offer only rudimentary, machine-scored true/false or multiple-choice responses with automatic feedback, or essay storage without scoring. Little research has been done to integrate the deep knowledge base we have of learning and assessment into future online assessment tools. Online assessment tools offer educators new ways to impact learning by accessing the rich pool of student results stored in their relational databases. When paired with dynamic assessment models integrated into these tools, the representation of assessment data in rich formats, using multiple forms of presentation and media, can be used to meet individual learning styles. Van Horn (2003) advocates a method termed computer adaptive testing (CAT), which allows the computer to custom design tests to match the individual ability level of each student and measure progress, versus the more static print-based norm or criterion measures currently used.
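
To make the adaptive idea concrete, the brief sketch below (in Python) shows one greatly simplified way a test engine could pick each question to match a running estimate of a student's ability. It is an illustration only: the item pool, the respond callback, and the fixed step size are assumptions made for this example, and production CAT systems rely on item response theory rather than this simple nearest-difficulty rule.

    from dataclasses import dataclass

    @dataclass
    class Item:
        prompt: str
        difficulty: float          # position on an arbitrary ability scale

    def run_adaptive_test(items, respond, start_ability=0.0, step=0.5, length=5):
        # Administer up to `length` items, always picking the unanswered item whose
        # difficulty is closest to the current ability estimate, then nudging the
        # estimate up or down depending on whether the response was correct.
        ability = start_ability
        remaining = list(items)
        for _ in range(min(length, len(remaining))):
            item = min(remaining, key=lambda i: abs(i.difficulty - ability))
            remaining.remove(item)
            correct = respond(item)            # True/False supplied by the test taker
            ability += step if correct else -step
        return ability

    # Example with a stub respondent who answers only the easier items correctly:
    # pool = [Item("Q%d" % n, d) for n, d in enumerate([-1.0, -0.5, 0.0, 0.5, 1.0])]
    # estimate = run_adaptive_test(pool, lambda item: item.difficulty < 0.5)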

Education reform has placed high expectations upon educational institutions to produce highly skilled and knowledgeable professionals at both the P-12 and postsecondary levels. According to Pankratz and Petrosko (2000, pp. 1-8, 268-282), lessons can be learned from past mistakes to shape new ways of teaching through higher-order learning. Ongoing assessment is one key to such learning. Case, Bauder, and Simmons (2001) examined the benefits of online learning for the student decision-making process. This decision-making process is quite powerful when placed in the context of online assessment. This forum affords the student and instructor multiple means of presenting or accessing assessment information using online technology; these might include video clips of experiments with possible answers, or digital images or audio embedded directly into the online assessment. When multiple forms of presentation are used in the online arena, one begins to delve deeper into individual learning styles than is the case with the more traditional "one size fits all" assessment approach. Online assessment also affords students an opportunity to analyze their results; further, when given multiple forms of assessment presentation, they can compare results to match their own unique learning styles. Rose and Meyer (2002, p. 64) note that multiple forms of presenting and representing information connect with individual students' learning styles, using principles found in the Universal Design for Learning (UDL) movement. UDL also addresses three neurological networks in the brain (recognition, strategic, and affective) that, when aligned with individual learning styles, could positively impact the assessment process as well as the learning process.
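
As one concrete illustration of embedding multiple media directly in an assessment item, the Python sketch below stores a single question with several representations and selects whichever format best matches a learner's preferences. The field names, file paths, and the present helper are hypothetical examples invented for this sketch, not features of any existing assessment product.

    # One assessment item stored with several media representations; the delivery
    # system chooses the format that best matches the student's preferences.
    item = {
        "id": "chem-101-q4",                                   # hypothetical item id
        "representations": {
            "text": "Which gas is produced when baking soda reacts with vinegar?",
            "audio": "media/q4_prompt.mp3",                    # hypothetical file paths
            "video": "media/q4_experiment.mp4",
            "image": "media/q4_diagram.png",
        },
        "choices": ["Oxygen", "Carbon dioxide", "Hydrogen", "Nitrogen"],
        "answer": 1,                                           # index of the correct choice
    }

    def present(item, preferred_formats):
        # Return the first available representation matching the learner's preferences,
        # falling back to plain text so every student can still take the item.
        for fmt in preferred_formats:
            if fmt in item["representations"]:
                return fmt, item["representations"][fmt]
        return "text", item["representations"]["text"]

    # Example: present(item, ["video", "audio"]) -> ("video", "media/q4_experiment.mp4")

Because every representation carries the same question, switching formats changes how the item is presented, not what it measures.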

PERSONAL DIGITAL ASSISTANTS AND OTHER APPLIANCES IN THE ASSESSMENT PROCESS

While online assessment tools hold promise, ubiquitous computing devices such as Personal Digital Assistants (PDAs) and other handheld computers allow these tools to become portable. Anderson-Inman (1999) advocates the need for sufficient technology access. Devices such as these offer that access and lend themselves to individual customization and 24/7 use both at home and at school. Given falling prices, many PDAs can be purchased for $200-400 and used directly in ongoing classroom assessment activities. Abell, Bauder, Simmons, and Sharon (2003) note the ways PDAs can be used to connect students to the learning process as well as to monitor progress. When these devices are used to present information such as informal assessment measures or to track portfolio entries, which are then uploaded into centralized online assessment tools, students are freed from the desktop computer for activities such as naturalistic assessment in the field. At this low cost, students are also able to take ownership of "anytime, anywhere" progress monitoring and assessment tracking.
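
One way to picture this capture-then-upload workflow is sketched below in Python: observations are recorded locally on the handheld with no network connection, then uploaded to a central assessment service once the device is back online. The endpoint URL, record fields, and function names are assumptions made for this illustration; an actual product would add authentication, retries, and conflict handling.

    import json
    import time
    import urllib.request

    RECORDS = []                                              # observations captured offline
    ENDPOINT = "https://example.edu/assessment/upload"        # hypothetical central service

    def capture(student_id, note, score=None):
        # Store an informal observation locally; no network is needed in the field.
        RECORDS.append({"student": student_id, "note": note,
                        "score": score, "timestamp": time.time()})

    def sync():
        # Upload everything captured so far once the device is back online.
        data = json.dumps(RECORDS).encode("utf-8")
        req = urllib.request.Request(ENDPOINT, data=data,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            if resp.status == 200:                            # keep records until confirmed
                RECORDS.clear()

    # Example: capture("s12", "explained the water cycle unprompted", score=3); later, sync()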

ROLE OF DIGITAL CONTENT IN ONLINE ASSESSMENT

Hitchcock, Meyer, Rose, and Jackson (2002) postulate that barriers to the curriculum cause an intellectual disconnect from the learning process and limit potential learning opportunities. Flexibility within the assessment environment resulting from the use of digital content (including text, video, audio, or combinations of these) provides more opportunities to connect with individual student learning styles, thereby increasing knowledge transfer. Digital content offers key elements for more dynamic assessment. The programmability of digital content to meet the unique assessment needs of individual learners, along with collapsible content, allows individualization of the assessment process (Anderson-Inman, 1998). Through such features, self-guided "smart assessments" could be tailored to guide students through the learning process by providing frequent feedback and progress monitoring. Anderson-Inman's (1987) examination of student test performance indicated that changes in test materials and in the test administrator affected student achievement. Rose (2001), in testimony before the Congressional Subcommittee on Labor, Health and Human Services, and Education, advocated that digital text and accompanying universally designed curriculum and assessment can bring down barriers to learning for all students.

ASSESSMENT AS A PROCESS OF LEARNING

When examining the various dimensions of assessment, Popham (2001, p. 104) identifies four rules of classroom assessment:

  1. Use only a modest number of major classroom tests, but make sure these tests measure learner outcomes of indisputable importance.
  2. Use diverse types of classroom assessments to clarify the nature of any learning outcome you seek.
  3. Make students' responses to classroom assessments central to your instructional decision making.
  4. Regularly assess educationally significant student affect, but only to make inferences about groups of students, not individual students.

These rules help us to conceptualize the important aspects of assessment that could play a role in various online assessment tools. Popham goes on to encourage the close examination of test items by asking, "What kind of cognitive operations must students engage in if they are to succeed on this test?" What are the cognitive demands and intellectual operations students will be required to perform? Through the use of online assessment tools that employ multiple and flexible content formats, educators would have the ability to present and monitor assessments aligned to individual cognitive and intellectual operations. Popham emphasizes that the cognitive demands tests place on students should capture teachers' attention and drive instruction. Teachers who spend even a modest amount of time analyzing a test's cognitive demands will have a better idea of how to design instruction to achieve major learning outcomes. This is the role of next-generation online assessment tools: helping teachers adjust instruction based upon ongoing assessment. Online assessment can become a tool for analyzing instructional demands through individual learner results, using the power of online assessment databases and analytical tools. With this in mind, classroom evaluation brings forth a higher calling in teachers: to improve the quality of classroom instruction.
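
As one illustration of how an assessment database could surface these cognitive demands, the short Python sketch below tags each item with the operation it requires and summarizes how a class performed on each kind of demand. The tag vocabulary and function name are invented for this example; they are not drawn from Popham or from any particular product.

    from collections import defaultdict

    def summarize_by_demand(responses, item_tags):
        # Group scored responses by the cognitive operation each item is tagged with
        # and report the proportion answered correctly for each operation.
        #   responses: [(item_id, correct_bool), ...]
        #   item_tags: {item_id: "recall" | "application" | ...}  (tag names illustrative)
        totals, correct = defaultdict(int), defaultdict(int)
        for item_id, is_correct in responses:
            demand = item_tags.get(item_id, "untagged")
            totals[demand] += 1
            correct[demand] += int(is_correct)
        return {demand: correct[demand] / totals[demand] for demand in totals}

    # Example: summarize_by_demand([("q1", True), ("q2", False), ("q3", False)],
    #                              {"q1": "recall", "q2": "application", "q3": "application"})
    # -> {"recall": 1.0, "application": 0.0}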

Sprenger (1999) advocates teaching and assessing students in a natural, enjoyable, brain-compatible way while still getting the results and data that we need to give to our school districts, administrators, and parents. When discussing procedural memory assessment, Sprenger shares the benefits of students physically demonstrating knowledge through modeling. Next-generation assessment tools could support such demonstrations in two ways: as a prompting mode that lets teachers glean assessment ideas that have proved beneficial to students in the past, as indicated in the online assessment database, and as a data repository for actual performance results. How this can be done in a thirty-student class remains the challenge. Sprenger recommends having students demonstrate through an imaginary lab experiment, thus allowing their episodic memories to work. This same principle could be implemented virtually, with students accessing online assessment tools that provide learning guides blending the actual instructional process with assessment to activate various parts of the brain, in this case episodic memory.

ONLINE ASSESSMENT TOOLS INDIVIDUALIZE LEARNING AND TEACHING

Online assessment tools also allow for powerful analysis of multiple forms of data. McTighe and Thomas (2003) advocate the analysis of multiple sources of data in a "photo album" of assessment (rather than the "snapshot" found in a single test) and the application of this learning in new situations. Online assessment technology can be designed to provide such analysis across multiple test measures and "learner scenarios" in different formats (such as video), rather than the traditional test score results averaged together for an overall class grade. Teachers and students need assessment tools that connect to individual learning styles and provide key information to teachers. This information will help guide instruction and allow students to connect with their unique learning styles. McTighe (1997) advocates the use of ongoing assessment to provide feedback and adjustment for the instructional process. Teachers can better analyze and adjust their teaching approaches based on real-time student assessment data available only through online assessment tools.
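
To show how a "photo album" might differ from a single averaged score, the brief Python sketch below keeps each student's evidence grouped by source (quiz, project, recorded performance, and so on) instead of collapsing it into one number. The source labels and data shape are assumptions made for this illustration.

    from collections import defaultdict

    def build_album(records):
        # Group assessment evidence by student and by source instead of averaging it,
        # so a quiz score, a project rubric level, and a link to a recorded performance
        # all stay visible side by side.
        #   records: [(student_id, source, entry), ...]  (source labels are illustrative)
        album = defaultdict(lambda: defaultdict(list))
        for student_id, source, entry in records:
            album[student_id][source].append(entry)
        return album

    # Example: build_album([("s1", "quiz", 0.8), ("s1", "video_lab", "media/s1_titration.mp4")])
    # keeps s1's quiz score and lab video together rather than collapsing them into one grade.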

CONCLUDING THOUGHTS

Next-generation online assessment tools that meet the high expectations of education reform and increase student outcomes are needed now. The integration of multiple presentation formats for assessments to meet individual student learning styles is one area in need of further research and development. These tools could provide timely feedback to teachers to help them understand students' grasp of content and the effectiveness of their teaching styles. Research should examine the impact online assessment has on the way students engage in the assessment process, both as a progress monitoring tool and as a criterion measure for passing benchmarks. Assessment tools must also take into account individual cognitive styles while reporting results with a combination of power and ease. Teacher and student training in the use of next-generation online assessment tools is also paramount to the institutionalization of new assessment tools for the 21st century and, therefore, deserves further research.

BIBLIOGRAPHY

Abell, M., Bauder, D., Simmons, T., & Sharon, D. (2003). Using personal digital assistants (PDA) to connect students with special needs to the general curriculum. Closing the Gap, 22(1), 20, 38.

Anderson-Inman, L. (1987). Consistency of performance across classrooms: Instructional materials versus setting as influencing variables. The Journal of Special Education, 21(2), 2-29.

Anderson-Inman, L. (1998). Electronic text: Literacy medium of the future. Journal of Adolescent & Adult Literacy, 41(8), 678-682.

Anderson-Inman, L. (1999). Computer-based solutions for secondary students with learning disabilities: Emerging issues. Reading & Writing Quarterly, 15(3), 239-249.

Case, D., Bauder, D. K., & Simmons, T. J. (2001). Decision-making in the development of web-based instruction. Education at a Distance. Retrieved June 13, 2003, from http://www.usdla.org/html/journal/MAY01_Issue/article04.html

Hitchcock, C., Meyer, A., Rose, D., & Jackson, R. (2002). Providing access to the general education curriculum: Universal design for learning. Teaching Exceptional Children, 35(2), 8-17.

McTighe, J., & Thomas, R. (2003). Backward design for forward action. Educational Leadership, 60(5), 52-55.

McTighe, J. (1997). What happens between assessments? Educational Leadership, 54(4), 6.

Pankratz, R. S., & Petrosko, J. M. (2000). Introduction: An ambitious plan for improving schools. In R. S. Pankratz & J. M. Petrosko (Eds.), All children can learn: Lessons from the Kentucky reform experience (pp. 1-8). San Francisco: Jossey-Bass.

Pankratz, R. S., & Petrosko, J. M. (2000). Conclusion: Insights from a decade of school reform. In R. S. Pankratz & J. M. Petrosko (Eds.), All children can learn: Lessons from the Kentucky reform experience (pp. 268-282). San Francisco: Jossey-Bass.

Popham, W. J. (2001). The truth about testing: An educator's call to action. Alexandria, VA: ASCD.

Rose, D. H., & Meyer, A. (2002). Teaching every student in the digital age: Universal design for learning. Alexandria, VA: ASCD.

Rose, D. (2001, July). Education technology: Hearing before a subcommittee of the committee on appropriations. Special Hearing of the United States Senate, One Hundred Seventh Congress, First Session, Washington, DC.

Sprenger, M. (1999). Learning and memory: The brain in action. Alexandria, VA: ASCD.

Van Horn, R. (2003). Computer adaptive test and computer-based test. Phi Delta Kappan, 84(8).

Wiggins, G., & McTighe, J. (1998). Understanding by design. Alexandria, VA: ASCD.