[0:07] DR. NEGIN MIRRIAHI: So an effective strategy that I've used to evaluate my teaching practice or my course design is making sure that my evaluation is holistic, and that it looks through essentially five different views, or five different lenses: student feedback, the feedback my students give me throughout the semester; peer feedback from my colleagues, if they sit in on my course or look at my online course material; the literature and the theory, in terms of how my strategy aligns with that; my own self-reflection on what I think needs to be improved, whether this semester or in a following iteration of the course; and also bringing in that analytics component, the learning analytics.

[0:49] And what I mean by the learning analytics component is what we often have access to in terms of how students are engaging with online technologies in the blended and online learning space. Students are engaging with more and more technologies, or we hope they are, but we don't necessarily know that, which is why I think it's critical that I also look at how they're actually engaging with those technologies, and that's where the data from the analytics can be useful. One example is a foundations programme that I redesigned. I evaluated it using those four lenses, plus a fifth lens being the analytics, to see what sort of feedback there was around the programme.

[1:34] So that involved getting student feedback, getting peer feedback on the programme, and getting the analytics behind it to see how students were engaging with the online activities that I'd designed. Generally speaking, you can organise surveys that match the type of questions you want answered, you can organise peers to give you the type of feedback you're looking for on particular activities, you can seek out the literature, and you can do your own self-reflection. The tough part is usually the analytics: what is available and how you can get access to it.

[2:09] We were fortunate in that we had access to the data from the learning management system at the university, and we had access to somebody who could visualise some of that data for us, so we could see when students were, for example, accessing meetings or activities: was it before an assessment was due, was it before the face-to-face session? We had a flipped classroom, a flipped learning approach, to the programme, so we were looking at how many of the students were engaging with the flipped learning pre-activities prior to class versus those who did them after class.
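The before/after engagement check described here can be sketched in a few lines of code. Everything below is illustrative: the log layout, field names, and timestamps are hypothetical, not the export schema of any real learning management system.

```python
from datetime import datetime

# Hypothetical access-log records: (student_id, activity, access_time).
# These values are invented for illustration only.
access_log = [
    ("s1", "pre-reading", datetime(2023, 3, 6, 20, 15)),
    ("s2", "pre-reading", datetime(2023, 3, 7, 11, 40)),
    ("s3", "pre-reading", datetime(2023, 3, 8, 9, 0)),
]

# Start of the face-to-face session the pre-activity was preparing for
class_start = datetime(2023, 3, 7, 14, 0)

def engagement_split(log, cutoff):
    """Count distinct students who accessed the activity before the cutoff
    versus those who only accessed it afterwards."""
    before = {sid for sid, _, t in log if t < cutoff}
    after = {sid for sid, _, t in log if t >= cutoff} - before
    return len(before), len(after)

before, after = engagement_split(access_log, class_start)
print(f"{before} students engaged before class, {after} only after")
```

The same split could be computed per week or per activity to see whether pre-class engagement changes over the semester.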

[2:47] And it's quite alarming once you get a sense of how students are actually engaging compared with how you thought they would be engaging.

[2:54] DR. RACHEL THOMPSON: In terms of teaching practice and how to collect information, I initially started with my first set of online tutorials. I used Brookfield's Critical Incident Questionnaire, altered for an online version. So I had a few basic, quantifiable questions, like 'do you think you understand this more or less?', which are pretty straightforward, but I also wanted to capture qualitative information. So I had those basic questions of Brookfield's: What do you think was the best thing about this online tutorial? What were the most difficult parts? What was the most surprising? What would you change?

[3:39] Very simple, very straightforward questions, but the information I got from the students who answered was brilliant. You tend to get the extremes: the students who understand it and want to give feedback because they think it's good and they got it, and the students who are still struggling and are trying, very helpfully, to give you some feedback on where they got stuck. So the Critical Incident Questionnaire can give you an idea of how well you're going, what's good and what's bad in what you've done, and where you might need to target improvement.

[4:13] And the good can actually help you work out where you might use that approach elsewhere. The surprising is always a really useful question, because what surprises students often makes you think outside the square; it may be something you totally didn't expect. And if it's surprising to the students, it can be an engagement point, or a point where they disengage, so it can be an interesting point. So that was what I did originally with the online tutorial, using the Critical Incident Questionnaire.

[4:46] DR. ADAM HULBERT: For me, with the Audio Culture course, I was quite nervous about putting it together. It was a pilot course for the active learning spaces, and it was the first time that I'd put the lectures online. There were a whole lot of different things going on that were new for me in terms of teaching. One thing that I found really worked was to just be honest with the students and say, hey, guess what, we're trialling this thing together. This is a new room; let's work out how it goes. This is a new format; what do you think? And to check in with the students throughout the semester.

[5:22] In terms of the room, there were some formal evaluations as well as informal discussions. That became almost a weekly thing at the start of the tutorials: checking in with the students. How are we going? How did you find this week's material? How is the room going? What can we do? That worked really well, and a lot of the feedback about the rooms, in particular, fed into future iterations of those spaces.

[5:55] For music students that was really important: being able to listen in a proper environment, the sound quality, the use of headphones. These things that are easy to overlook, or under-hear, could be fixed without too much difficulty. And that's done through conversation, so finding a way to set up that dialogue is probably the most valuable way I was able to keep track of how the course was going.

[6:19] DR. ELIZABETH ANGSTMANN: An innovation that I introduced was problem-solving workshops in Physics 1A, where the students actually come to class and solve problems themselves, rather than watching a tutor solve them. These questions are specifically designed to address common misconceptions. I evaluated this with concept inventory tests: we gave students a test before they started the unit and a test after they had finished it, and worked out how much they'd learned, how much their score had changed. It would have been ideal to apply these tests before we introduced the problem-solving workshops and then afterwards, but unfortunately I wasn't using them before we started the workshops.

[7:04] But the good news is that concept inventory tests, at least in physics, are widely used at a large number of institutions, so we can compare how our cohort of students is performing against similar cohorts elsewhere. The tests showed that the learning gains for our students were a lot higher than at most other institutions, so we can have confidence in the approach that we're taking to learning and teaching in this course: the problem-solving workshops were effective in getting basic concepts across to students.
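One common way to quantify this kind of pre/post improvement in physics education research is the normalized gain: the score change expressed as a fraction of the improvement that was possible. The transcript does not say which gain measure this course used, so the sketch below, with invented numbers, is only illustrative.

```python
def normalized_gain(pre, post, max_score=100):
    """Normalized gain: improvement achieved as a fraction of the
    improvement possible, a measure widely used with concept
    inventories in physics education research."""
    if pre >= max_score:
        return 0.0  # no room left to improve
    return (post - pre) / (max_score - pre)

# Illustrative cohort averages only (not the course's actual data):
# scoring 40% before and 70% after realises half the possible gain.
print(normalized_gain(pre=40, post=70))  # 0.5
```

Because the gain is normalized by each cohort's starting point, it lets cohorts with different pre-test scores be compared across institutions, which is what makes the cross-institution comparison above meaningful.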

[7:41] NALINI PATHER: When I first came to UNSW, eight years ago, reflection wasn't something that I did naturally in terms of my learning and teaching, I think because I was trained as a scientist and that was not something that we actually did. It was a whole new thing for me when I did the full course here. I'd done learning and teaching courses in other places before, but there wasn't that big an emphasis on reflecting on what you're doing and then thinking about improvement. So for me that was a whole new thing. And then writing reflections for the different assessments, that was a big learning process for me.

[8:19] Thinking about it now, it's actually really great, because after teaching for 15 years it put me back, for the first time, in that learning mode, where I had to do assessments and think: how am I doing this? Do I actually understand what they want from me? What is this reflection thing that they want, which I actually don't like? So it was a big, big learning process for me, and a steep learning curve to actually reflect, but in hindsight it was brilliant, just thinking about all the things that I've tried.

[8:52] It was only through reflecting on every experience, not just recounting it in my mind, which is good to do, but thinking about what worked, what didn't work, and how I would change it, that I've actually become innovative in my learning and teaching. I think if I hadn't developed that skill through [? fold ?] and then the grad cert, I would be a less innovative teacher, thinking less about what I'm doing and how it's working for the students. So in that session that we recorded, one of the things that I did was use Socrative to get feedback from the students.

[9:26] In that situation the Socrative software didn't work really well, because it couldn't display the results in the lecture hall. But the students could see them, so, thinking on my feet, I had to walk them through what they were seeing, still give them feedback, and let the situation work out as it was partly designed to. Because we had a lecture that was supposed to be this integrated and interactive session that was really, really important for them going into the exam.

[9:56] So I think one of the things is being flexible on your feet: being able to think, OK, if this doesn't work, what can I do? That's been built into me over the last few years through developing this whole reflective practice, so that now, when I stand at the front and I know something's not going to work, I can think very quickly: OK, what can I do to make it work? Because I'm thinking more now about what learning strategy I want to achieve here, and how else I can do this.

[10:26] So it's been a process for me of learning to be a teacher in this whole digital environment, using technology and all of those kinds of things. What would I do in future? It's hard to answer that for that particular situation. But in future, and I'm really, really excited about all the different possibilities there are with learning and teaching now, I'd try different things, like using VR, and 3D, and modelling, and manipulation, just to see what will get students excited.

[11:01] Because part of it is getting them excited about learning the content; the other part, for me, is getting them excited to want to continue doing some postgraduate work in this field. So them seeing all of the possibilities, and sometimes seeing that something doesn't work, is actually a good thing, and being comfortable enough to tell them, OK, that didn't work, let's do it another way now, is part of the process, I think.

[11:28] DR. PAUL EVANS: One of the primary ways that I know I'm an effective teacher is that I monitor student learning and make sure it happens effectively. You can see that informally, in interactions during tutorials, and formally, in the assessed work that students produce. So I would say the primary measure of an effective teacher is creating good student learning. Another formal measure that we have is the course evaluations that happen at the end of every semester. And we know that these are very limited in a lot of ways; one is that the response rate can be very low.

[12:12] So it's important to explain to students towards the end of the semester that you really value this information and you care about it, and that, since they have benefited from students evaluating the course in the past, it is a bit of an obligation for them to contribute to future courses by doing the same and evaluating the course. So invite students to participate in the process; that can help increase the response rate. The numerical information we get about the degree to which students are satisfied with the course and like our teaching is a reasonable indicator.

[13:00] It is imperfect, but it is a measure. And then there are the comments, which we can sift through: some we can write off, and for some you have to have a pretty thick skin, so you can't get too bothered by them. One good strategy when going through the comments is to look at the consistency of comments between students. Just because one student says something, it doesn't mean that all of the students agree, and indeed you can sift through [INAUDIBLE] and see so many contradictions. One student says, I think this lecturer is the most interesting person I've ever seen in the world.

[13:45] And then you get another student saying, I think this lecturer is the most boring person I've ever seen. So you can set those aside and look at the most consistent comments that keep popping up. They can be good feedback about either your teaching or the structure of the course, which can be good to act on.
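Looking for consistency across comments can be roughly automated with a simple term tally: terms mentioned by several different students are candidates for genuine themes, while one-off remarks are the outliers described above. This is only an illustrative sketch with made-up comments; real thematic analysis would be more careful than keyword counting.

```python
from collections import Counter
import re

# Invented free-text comments for illustration; real ones would come
# from the institution's course evaluation exports.
comments = [
    "Lectures were engaging but the workload was too heavy",
    "Too much workload in week 5",
    "Great tutorials, heavy workload though",
    "Boring lecturer",  # an outlier contradicted by other feedback
]

# Minimal stop-word list, just enough for this example
STOPWORDS = {"the", "was", "but", "in", "too", "though", "were", "a", "much"}

def recurring_terms(texts, min_students=2):
    """Count how many distinct comments mention each term, keeping only
    terms raised by at least `min_students` different students."""
    counts = Counter()
    for text in texts:
        counts.update(set(re.findall(r"[a-z]+", text.lower())) - STOPWORDS)
    return {word: n for word, n in counts.items() if n >= min_students}

print(recurring_terms(comments))  # 'workload' and 'heavy' recur; the rest are one-offs
```

Using a set per comment means each student counts at most once per term, so the tally reflects how many students raised an issue, not how often one student repeated it.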

[14:05] DR. CHINTHAKA BALASOORIYA: It's important to note that while there's a wide range of strategies available, effective learning and teaching is less about finding the perfect strategy and more about finding the strategy that fits best with the content, with the preferences of the learners, and with what you are most comfortable with. So it's really important to develop a skill base and a menu of strategies, so that you can draw on the strategy that is most appropriate for the situation you're confronted with, and really align it with the content, the learner preferences, and what you're comfortable with.

[14:56] And that will lead to a really effective learning experience.

Evaluation in practice and what is evaluated

What is evaluated?

In the previous step we looked at the questions to ask when considering evaluation. "What is the object, or focus, of the evaluation?" is a good question to start with.

As a teacher, you could focus on evaluating teaching effectiveness, teaching quality, the impact of teaching on student learning, the effect of teaching on students’ competences, the curriculum design, or the appropriateness of assessment methods.

These broad areas of focus may seem overwhelming; however, they are essential to understanding what enhancement refers to when focusing on quality learning and teaching in higher education.

To help you understand how to conduct an evaluation and consider the elements of this process, you will be doing a small-scale mini-evaluation of your practice in week 4 of this course.

“Evaluation studies are fundamentally about asking questions, and then designing ways to try to find useful answers. Studies may concern materials, projects, courses, methods, packages, or systems; in fact, anything that can be asked about in a detailed, structured fashion” (Harvey, 1998, p. 9).

We recommend you identify a specific focus for your evaluation. For example, you could focus on one of these:

  • the evaluation of one aspect of your teaching and how effective it is
  • what impact a new tool or method you designed is having on student learning
  • an appraisal of how effective a particular assessment is for student learning.

For any of these, you will need to formalise what to measure and select appropriate, relevant data based on your evaluation focus.

Examples of teaching practice evaluation

In the video above, academics from a range of disciplines share examples of evaluating their teaching practice. They talk about deciding what to measure and selecting appropriate, relevant data.

Talking point

Share one strategy or key point that you learned from the video. Include the timecode of when this appears in the video so that your colleagues in this course can revisit this specific point.

Academics in context

Information about the academic staff in this video and their professional contexts can be found in the Video Participants Information document below.

Want to know more?

If you would like to know more about this topic of evaluation in practice and what is evaluated, there are additional resources listed in the Want to know more.pdf for this step.

References

Harvey, J. (Ed). (1998). Evaluation cookbook. Edinburgh: Learning Technology Dissemination Initiative Institute for Computer Based Learning, Heriot-Watt University.


This video is from the free online course:

Introduction to Enhancing Learning and Teaching in Higher Education

UNSW Sydney