[0:06] While the problem-solving workshops were being redesigned, we also redesigned some of the lab exercises so that they aligned better with what was being taught in lectures, and we found that this has been very effective. We've measured the effectiveness using Concept Inventory Tests. In Physics there's a whole bank of Concept Inventory Tests, which are designed to test students' understanding of basic concepts. The way it works is you give these questions to the students before they do the course, then give them to the students again after they finish that topic, and you look at the difference in their results.
[0:46] If they didn't know much at the start and then knew a lot at the end, that's good: a high learning gain. So we've been measuring these learning gains in our students in the 1A course, and we found that they are just on the verge of the medium-to-high gain region, which very, very few courses manage to reach. These Concept Inventory Tests are given to students at many different institutions, so you can compare how your cohort of students is doing against students elsewhere.
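The pre/post comparison described above is commonly summarised with the normalized (Hake) gain, which measures the fraction of the available improvement a cohort actually achieved. The transcript doesn't name the exact formula used, so this is a minimal sketch assuming that standard measure; the example scores are illustrative, not the course's data.

```python
def normalized_gain(pre: float, post: float) -> float:
    """Normalized (Hake) learning gain: the fraction of the possible
    improvement (from the pre-test score up to 100%) that was achieved.

    pre, post: average cohort scores as percentages (0-100).
    """
    if pre >= 100:
        raise ValueError("pre-test score already at maximum")
    return (post - pre) / (100 - pre)

# Conventional bands: low < 0.3, medium 0.3-0.7, high > 0.7.
# A cohort moving from 40% to 82% sits right on the medium/high boundary:
print(normalized_gain(40, 82))  # -> 0.7
```

Because the gain is normalised by how much room for improvement the cohort had, it allows fairer comparison across institutions whose students start from different baselines.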
[1:20] So this gives us a lot of confidence that the way we're teaching the students is at least getting them to understand these basic concepts very well, a lot better than they did at the start of the course. OK, so I use a number of assessment strategies in blended learning contexts. I've mentioned I use quizzes.
[1:48] It's probably more a feedforward approach. I try to adopt a feedforward approach as much as possible: finding ways for the students to feel oriented before they get to the assignment, rather than giving feedback after they've done it. One way to do this for project-based work is to have a number of assignments that feed into a final assignment. So in one course, students build a virtual instrument early on and then use that instrument in a final context. That's one way. Another way of getting feedback is through engaging with wider platforms. I'm using Global Patch Bay at the moment, which is an international platform for artists to meet and discuss works.
[2:42] The students are submitting their works to that platform and getting feedback from other contributors around the world. This is good for the students, and it's also good for getting them used to developing networks and building an international portfolio of work, which I think will really help once they've left the university or moved on into other roles. I also like to provide feedback from peers. This can happen in class, through self-assessment, or through group assessment. One thing I'm trialling this year is cross-course work: my students take works that animation students made previously and compose scores for them.
[3:27] The animation students give a short discussion of what they were thinking when they made the animation. My students engage with this and create their work in response to the animation. They then get feedback from me, but also from the creators of the animation. So there's a sense of working towards something that's collegial and cross-disciplinary. I think the students really respond to working in an environment where the work is meaningful and the feedback comes from people whose work they've been involved with. In terms of this course assessment that we've designed, we've been getting feedback over the past couple of years that the students are still confused. There's a lot to learn.
[4:12] We redesigned the instructions as a PDF with a lot more detail, but it still wasn't adequate. So what we're now going to do is create some short videos that show the software package SPSS being used, with voiceover. The screen capture will show the manipulation of the data, with explanation. I'm planning, and hoping, that this will help the students learn how to use the software. It should also improve their statistical understanding, because they can play the videos over and over again, talk about them with each other, and follow along on their own computers as they listen.
[5:01] They can actually try it out. So that's one area we're going to improve. Another area where we think we could improve is what we actually ask them to do in the formative assessment. At the moment it's very basic, and I think we may be underestimating what they can achieve at that point. That's something you can often evaluate as you go: when you design something, you try it out, so long as you have evaluation in place. You're also assessing the assessments, so you have to keep an overview: what has this assessment achieved? Are these students actually understanding what we think they're understanding?
[5:43] And are they taking that, and the feedback, into account when they move on? If that isn't working, then you can target it again and rethink what you need to put into it. It might need making simpler, you might be able to make it more complex, or it may be a matter of asking questions in a different way to get the right learning in there, because formative assessment is a learning process. The decision to design and implement this assessment task was made because of an issue we identified, and a need we identified, in training and developing these particular skills in our students.
[6:26] We tried a number of strategies, and we found that assessment was one of the big drivers likely to get us to this goal. Having reached that preliminary conclusion, we then engaged our students in a conversation about what would actually work for them. We were fortunate to get some Strategic Learning and Teaching Funding to run a project with students as partners, whereby we engaged a small group of students who then went out and spoke to a larger group of students about what was working and what was needed to develop this particular set of capabilities in the student group.
[7:13] We were then able to build on what we learned from that experience and design an assessment activity that not only addressed a need we had identified, but also had features the students brought back as likely to be more effective in engaging the student group.
Good assessment and feedback design in practice
Hear four UNSW academics talk about assessment: how they redesign their assessment tasks to align with the learning outcomes, integrate technology into assessment design and feedback, and design formative assessment.
In this video (7:42), the academics highlight some of their assessment design experiences. Insights they share include:
- redesigning lab exercises to align with the lecture content
- using a feed-forward approach when designing assessments
- continually improving the assessment and support material
- using students as partners in the design process
Academics in context
Information about the academic staff in this video and their professional contexts may be found in the Academics in context document.
Each of the academics featured in the video has adopted a different approach to designing assessment. Reflect on these different approaches. What are the key strategies used for assessment design?
Want to know more?
If you would like to know more about this topic of good assessment and feedback design in practice, there are additional resources listed in the Want to know more.pdf for this step.
© UNSW Australia 2017