A framework for evaluation: Stage 1

Stage 1: Determine your position/orientation

Figure: The simplified three-stage integrated teaching development framework (Vigentini, Mirriahi & Kligyte, 2016; Pardo & Mirriahi, 2017). Used with permission.

In this first stage you need to decide where you stand with respect to the key questions that shape any evaluation:

  1. what the purpose of the evaluation is (what are the key questions?)
  2. who the stakeholders are
  3. what the most appropriate methodology is to answer your questions (how)
  4. what the most appropriate sources of evidence are

Your conscious choices in response to these questions will affect both the evaluation process and the conclusions you will be able to draw from it.

Your position will be shaped by:

  1. the questions you ask
  2. your level of engagement in the process
  3. the sources of data you collect.

Your level of engagement in the evaluation process can be represented as a continuum, ranging from little engagement (at the periphery), through that of an inside observer, to in-depth engagement (at the core).

Emerging from the authors’ academic development work and their reflections on submitted work they have assessed over the years, the following metaphors were suggested as typical examples of how these positions relate to the second stage of the model, which focuses on the selection of interpretational lenses and sources of data.

Second-hand data: ‘gold panning’

The focus is entirely on the design of the course and the teaching of the material. There is no conscious intention to ask specific questions about the effectiveness of the design, so success stories are like gold nuggets: found occasionally, more through persistence and refinement of technique than by deliberate searching. Any data collection relies on student or peer feedback and/or self-reflection.

At the periphery: the astronomer

The evaluator is aware of the reporting tools available and explores the data as a by-product of the teaching process, while becoming accustomed to considering a range of sources of evidence and taking different perspectives to appraise teaching effectiveness. The sense-making afforded by the data is like the observation of a distant planet: something that happens outside the evaluator’s direct control.

Inside observer: the ethnographer

The evaluator is actively part of the process: data is not observed from a distance but is actively collected to bring objectivity to the observation of teaching. The data thus becomes a tool for understanding the process of teaching and its effectiveness in supporting student learning, alongside one’s own reflection.

Research at the core: the researcher/scientist

Questions and hypotheses actively drive the course design and the data collection process. Appropriate ‘sensors’ are used to measure aspects of teaching and learning, and the design of the learning experience is fully informed by a hypothesis-driven testing cycle that constantly strives to learn from practice, from multiple perspectives, and to improve future iterations of course design and teaching. What is learned is then compared with existing evidence and theory, analysed, and disseminated.

Reflection point

Which position do you think you’d adopt to do an evaluation: a gold-panner, an astronomer, an ethnographer or a scientist? Why?

Talking point

What is your position on evaluation? In one or two sentences summarise what you believe the role of evaluation is in your educational context. Describe your role as an evaluator. You may wish to refer to the positions of an evaluator described in the previous step.

References

Pardo, A., & Mirriahi, N. (2017). Design, Deployment and Evaluation of a Flipped Learning First-Year Engineering Course. In C. Reidsema, L. Kavanagh, R. Hadgraft, & N. Smith (Eds.), The Flipped Classroom: Practice and Practices in Higher Education (pp. 177-191). Singapore: Springer Singapore.

Vigentini, L., Mirriahi, N., & Kligyte, G. (2016). From reflective practitioner to active researcher: Towards a role for learning analytics in higher education scholarship. In M. Spector, B. B. Lockee, & M. D. Childress (Eds.), Learning, design and technology: An international compendium of theory, research, practice, and policy. New York: Springer.

This article is from the free online course: Introduction to Enhancing Learning and Teaching in Higher Education (UNSW Sydney).