“This is the end, my only friend, the end.” This is the last week of CEP 811, and for the ‘end’ we have been asked to think about assessment as it relates to maker education and creativity. As with many of my blog posts, I would like to address this topic by providing a little background on how and why we assess here at the American University of Sharjah (AUS) Library, and by sharing my vision for the types of assessment measures we can use to assess maker kits in the classroom.
Currently, the AUS Library uses a multi-pronged approach to assessment, in which various assessment techniques and methods are used before, during, and after information literacy (IL) instruction. Assessment of AUS IL classes is essential, as it allows librarians to monitor effectiveness for accountability, identify and evaluate instructional practices, measure student achievement (i.e., how much students know), and evaluate students’ mastery of skills (i.e., what they can do). However, the majority of our assessment measures revolve around skills related to information literacy, as documented in the ACRL’s Information Literacy Competency Standards for Higher Education. This type of skills-based assessment is often rigid and relies on surveys, fixed test questions, and closed-ended questions.
Recently, however, a major shift has occurred in the way academic librarians conduct assessment. This is largely due to the development of the ACRL’s new Framework for Information Literacy for Higher Education (2015), which emphasizes grasping concepts related to IL rather than mastering a set of rigid, prescriptive skills. This has had a dramatic effect on how librarians develop and conduct assessment. The framework is heavily based on Meyer and Land’s (2010) threshold concepts theory, which emphasizes concepts that, once grasped by the learner, create new perspectives and ways of understanding a discipline or challenging knowledge domain. Grasping these concepts produces a transformation within the learner, allowing the learner to think like an expert or practitioner.
The new framework and its emphasis on threshold concepts have had an immense impact on assessment and library instruction. Librarians are beginning to move away from assessing students’ ability to learn a set of rigid skills. Instead, library instructors now use tools like performance assessment and self-assessment, which emphasize students’ ability (or inability) to grasp broad concepts. With tools like self-assessment, students become conscious of the research strategies, critical-thinking steps, and other transferable skills they employ to complete an assignment. Additionally, emphasis is placed on self-reflection and on internalizing the use and value of specific skills, allowing students to think like practitioners. Lastly, this approach goes beyond simple analysis of the final product and assesses the decisions students encounter and make while completing an assignment, and why they make certain choices (Gilchrist & Oakleaf, 2012).
Using self-assessment and performance assessment, instruction librarians teaching students to use maker kits in library classes can better assess students’ ability to work with maker kits in new and creative ways. Here the emphasis would be on the process and train of thought more than on the end product. I say this largely because I feel that measuring or assessing creativity in a quantifiable and structured manner is very difficult. Although I do not believe it to be impossible, I do not agree with some prominent educators (e.g., Grant Wiggins) that instructors can “recognize creative thinking immediately when we see it” (Wiggins, 2012). It is my humble opinion that Wiggins wrongly postulates that instructors can consistently discern what is and is not creative when assessing student work. Creativity, and what is deemed creative, is an extremely subjective matter that is heavily influenced by various external factors.
To assume, as Wiggins does, that the “right criteria” and “multiple & varied exemplars” are all that is needed to create rubrics or assessment tools that can quantifiably measure creativity is an oversimplification of the matter. The focus of assessment in maker education should shift from evaluating the end product to evaluating the process.
Association of College and Research Libraries. (2015). Framework for Information Literacy for Higher Education. Retrieved from ACRL website: http://www.ala.org/acrl/standards/ilframework
Gilchrist, D., & Oakleaf, M. (2012). An essential partner: the librarian’s role in student learning assessment. National Institute for Learning Outcomes Assessment: Occasional Paper, 14.
Meyer, J., Land, R., & Baillie, C. (2010). Threshold concepts and transformational learning. Rotterdam; Boston: Sense Pub.
Oakleaf, M. (2014). A Roadmap for Assessing Student Learning Using the New Framework for Information Literacy for Higher Education. The Journal of Academic Librarianship, 40(5), 510-514.
Wiggins, G. (2012, February 3). On assessing for creativity: yes you can, and yes you should [Web log post]. Retrieved from http://grantwiggins.wordpress.com/2012/02/03/on-assessing-for-creativity-yes-you-can-and-yes-you-should/