Daniel Waller

Hello. My name is Daniel and I work in the UK at UCLan. I have worked in TESOL for around 25 years and my areas of interest are language assessment, corpus linguistics and discourse.

Location: Preston, UK

Activity

  • The journalist Eddie Mair - always calm during interviews and able to really pull issues apart (and politicians too). When interviewing people who have been through traumatic experiences he is empathetic and supportive.

  • And sitting cheerfully on the fence - a logical decision with consideration for emotions in the communication of it. A logical decision, badly communicated, can be as devastating as a poor decision or no decision. Emotions can also help one to recognise when a decision is urgent and has to be addressed - as long as one also reflects on how this is best done and...

  • Hello, my name is Daniel and I work in educational management. I am pretty good at managing my emotions at work - at staying calm and not reacting. Having said that, I also think that there are infrequent times when showing emotion can have good results as long as it is genuine and it remains controlled.

    The question about managing the emotions of others - I...

  • @DeannaBerget It is definitely a tall order. One possible way is through sharing a set of analytic criteria. In addition to describing the learners' level, some work with the learners in class can help develop an awareness of what the next band up means. This could work well in multi-level classes since learners could be working with a student who has scored...

  • @DanGerrard Hi Dan, Well, a portfolio is a form of continuous assessment and there are different ways of grading it, depending on what you want the portfolio to do. If you consider the portfolio to be a single piece of work, perhaps like a vocabulary journal for a course, then feedback builds towards a single grade. This is relatively easy because there is, in...

  • @AnastassiaZelenyaeva Hello Anastassia. I think there are some quite good tasks that can be done to integrate skills with low-level learners. Irina has illustrated some. I remember working on a test for A1 and A2 which combined listening into the speaking test (learners had to do a task which involved listening to the interlocutor and responding). In part...

  • Hello Irina, thank you very much for your comment and description of your assessment process. I also agree that integrated tasks are probably going to be an increasing feature of future assessments, though it may take a while to get there. The key issues often are whether tasks are dependent on each other (for example, if they do badly on the listening, does...

  • Hi Dan. A portfolio should do both since it is a collection of work over time. Ideally, the portfolio is introduced to the students, explained and explored (maybe letting students see examples) and then regular class time is assigned with feedback being a key feature (both from the teacher and learners).

    One of the problems with portfolios I have...

  • Hi Deanna, that's brilliant! When portfolios engage students they really are a wonderful tool to use. I notice that you said 'some' students - I take it there are others who have not been quite as engaged :)

  • Hello Dan. Good question. Because an Achievement test is linked to a programme of study, it should reflect the aims and objectives of that programme in the assessment. This goes back to the point about having a unified curriculum (i.e. clear course aims/objectives and an assessment that measures how well learners have performed against these). Most course...

  • Hello Deanna, Thank you very much for the comment. It is definitely difficult with some learners to persuade them that formative feedback matters. But, if it is followed up with good feedback and learners see that the instructor takes it seriously, that can help to change attitudes.

  • Hello Atiya, Thank you very much for the comment - using portfolios is great - I think the key element is making sure that learners understand what they are supposed to include and why, and creating curriculum time to work on it in class along with regular formative feedback. It is a lot of work but when learners engage with it, as you have identified, it can have...

  • Thank you for the comment Anastassia. So, you use them as progress tests?

  • Hello Eduardo, that is certainly a problem but if items are built up over time then mixing and matching, and putting in new items, can be possible. Test security is another consideration for high-stakes tests, to prevent papers leaving the room.

  • Yes, copying would definitely create reliability problems. :)

  • Excellent point Matthias and one which needs to be considered when looking at a whole test, particularly if it includes different skills and subskills. If students perform evenly across it, that might be a bit odd since most learners have a spiky profile. This is why measures of whole test reliability can be problematic.

  • @AlexisCalderónRodríguez Very true Alexis.

  • @fatimahussain Hello Fatima, I agree that pictures can be a great resource for assessment but they do need trialling. Not everyone gets the same thing from a picture and some are certainly better than others for eliciting language. They are, as you say, really useful though.

  • Really good for native speakers too :) Intuition about frequency can be wrong and even fully proficient users of English (within which I include both NSs and NNSs) have come up against the gap fill completed by a learner and thought 'hmm, that almost sounds possible, but is it?'

  • Thank you Nicolette. Have you thought about trying learner diaries? They could combine some aspect of lexical/grammatical recording but can also be a great way to have short personal dialogues with your learners, with your feedback encouraging, suggesting and responding to them.

  • Absolutely Fatima. An assessment could have 100% reliability and yet lack any validity! The pursuit of reliability has been a theme in assessment, sometimes at the expense of validity. The fact is that if you want to assess things with some real-world element then it may involve the assessment having some 'messy' elements such as interaction. Great point!

  • Hi Katie, yes it can because the learners are taking the test under conditions different from the norm. One reason why assessments need handbooks is to make certain that conditions in any venue are as similar as possible. When standardising colleagues, we use videos recorded with volunteers in non-assessed simulations. Having unexpected audiences does unsettle...

  • Hello Maria, that's an interesting point and is one of the key differences between an achievement test and a proficiency test. Whereas achievement tests tend to come at the end of a course of study where the input across groups is relatively similar, a proficiency test must be accessible to anyone eligible to apply for it. This increases the need for more publicly...

  • You are absolutely right - partial points create an element of uncertainty which can be very hard to control for. So, for high stakes tests, they are often not awarded. However, if you are running a progress test you may well want to reward partially correct answers in order to guide and motivate learners. Some sort of scheme will be needed for consistency but as the...

  • Hello Carmen, you raise a good point. In most of the writing tests I've seen, creativity within the task is rewarded so long as the purpose of the task is met. The question which usually arises in marking is 'is this response communicatively effective?' If so, then creativity will make it more so. It's also important to note that linguistic ambition is often...

  • Good point Katie. The classroom is a safe place for practice and rehearsal and if we can get learners performing tasks in as close a manner as possible to how they would in the real world, then we are enabling them in the way you describe.

  • Hi Ricky, thank you for sharing your experience. Letting learners work with the criteria is a really good idea.

  • @NicoletteL Hi Nicolette, where there is lexical overload, you can do some of the filtering work for the learners - direct them towards lexis that is likely to be useful - you can use some of the tools like the Academic Word List, or the English Vocabulary Profile to help you. You can also rely on what you know about your learners and their context - for...

  • Hello Nicolette, Thank you for sharing your experience. I am glad to hear it worked well - you will find some learners who do not engage, at least not until they (a) know that everyone else is and (b) see that there are results and consequences to not doing so - and even then you will find the odd 'last minuter'. All the same, it sounds like you found the...

  • Good examples Ivan. Online tools such as OneNote really do make it much easier to build portfolios now but they still need to be very carefully planned into the curriculum, explained, time allocated and feedback used as a regular component - as your experiences illustrate.

  • Hello Alexis, Absolutely - and that is one of the most common reasons why portfolios often do not 'work' - they need time in the curriculum as you say and regular meaningful feedback.

  • @SatishThakare Hello Satish, Achievement tests usually come at the end of a relatively substantial period of study - I would suggest that what you probably need are progress tests since you want to look at development. Progress tests can be thought of as the stops along the way to the final achievement test - they usually have a formative purpose. The other...

  • Hi Dan, Field's book is brilliant - definitely a key read if you are interested in the development of listening as a skill. I am not sure he ever published the study I mentioned; he presented it at a small event at CRELLA (Centre for Research in English Language Learning and Assessment) but I suspect he has taken it to other events as well. I will have a hunt...

  • I've always been partial to the task of getting learners to pull out any chunks they thought were interesting or useful from a text and exploring them as a post reading/listening task.

  • Hi Simon - that's a really useful distinction and I agree. Formative assessment can and should be flexible and open to adaptation because it is aimed at developing the learners' skills.

  • @RickySjöberg I agree with you - as you say, assessment really ought to be considered right at the point at which you develop the course aims and outcomes. This means that when materials are developed you can make sure they address both learning and test familiarisation. You can also minimise the issues of the assessment and the course pulling learners in...

  • Hi Elena, and how does that seem to you? I pick up from your comment that you might feel it is too much. Particularly with skills assessment, you may not see clear differences in performance in short spaces of time. How long are the modules in terms of time (that would make a difference)?

  • Hi Maria, so this is for a kind of progress testing?

  • Hi Carmen, I really like your comment. There have been variations of the communicative approach which threw out explicit teaching of vocabulary and grammar but I think most teachers have always known that they are the building blocks upon which other skills stand. I remember years ago using a text book which wanted A2 learners to listen to a text for gist. Well,...

  • Hello Matthias, Yes, there are some issues with the selection of vocabulary in text books with authors often relying on intuition rather than considering corpus data. To be fair, this situation is improving and many course books are now corpus informed but they also often fail to distinguish between levels of vocabulary. For example, when a text book presents...

  • You're welcome, Aurora.

  • Hi Dan, That's a really good question. I think in part it depends on the task and the level of the students as to whether you want to repeat the listening - and also the amount of repetition built into the text naturally. Elsewhere on the MOOC, I have mentioned a small piece of work done by John Field who looked at the consequences of replaying IELTS...

  • Hello Yogita. I have to admit that I have a soft spot for the C-Test.

  • Hello Matthias, That is true. They can be used though for peer-to-peer discussion (depending on the level of the test and the students) which can be quite a good way of revisiting language. I have also found that getting learners to set cloze tests for each other works quite well.

  • Hello Fabiana, I agree that MCQs are fast to mark but not to write. I remember when I started teaching spending a good couple of hours writing a multiple choice test which my learners then breezed through in about 5 minutes - it really was not a good return for my time. The other problems you suggest are common issues too.

  • Hi Carmen, I completely agree. MCQs can be very useful but they are, as you say, hard to write well.

  • Hello Mary, it probably comes down to piloting the test with an appropriate group of learners. Timing is an essential part of assessments and it is one of the aspects that learners often have comments about.

  • Hello Carmen, That sounds like a good positive change. And if your tests help develop the learners towards the actual exam I am sure that they are appreciated.

  • Hello Yogita - those sound like good clear aims for your assessments.

  • Hello Matthias, thank you for passing that on. It sounds like your school is using best practice.

  • If you use the BYU-BNC interface (https://corpus.byu.edu/bnc/), that's a bit more user-friendly. The BNC isn't a pedagogic tool but it can be useful for pulling out concordance lines for learners to explore. For example, if you search for [j] money, it will bring up the adjective collocations for money - so simple searches like that can be used to make learners aware...

  • Hi Asta, you paste in the text you want to use and it will highlight the AWL words or gap them depending on the level you set it to. The AWL has 10 levels so if you select 10 it will highlight all the words on the AWL, but if you choose 2, for example, it will highlight words in lists 1 and 2. I hope that helps.

  • @RickySjöberg Thank you for opening an interesting discussion. Task based learning is a very effective classroom practice for motivating and pushing learners. We also know that explicit focus on language enhances learning and testing, carried out in a principled way, can support that. Even in TBLT there is an element of assessment in the form of the feedback...

  • I think that is an important point - and good assessments should facilitate and develop learning and help to guide it. It comes down to the right assessment being used.

  • Hello Olga. I think students may find these lists hard to deal with on their own. Where these lists can be useful for teachers is in helping to prioritise certain lexis when preparing students for particular text books. They can inform materials development and test design.

  • I agree that students may find these lists hard to deal with on their own - the lists at the end of the text books are useful for precisely the reasons you give. Where these lists can be useful for teachers is in helping to prioritise certain lexis when preparing students for particular text books. They can inform materials development and test design....

  • Hi Barbara, They can be useful tools to help guide teachers when planning and also for test designers to ensure that the focus is on vocabulary that is frequent and useful in different contexts. Of course it is really important to check where the list comes from - is it linked to a corpus? If so, what kind of texts went into it: written, spoken, both? What...

  • Hi, that's interesting - I wonder how often it happens to learners :)

  • Hello Carmen. I really like the idea of the word list being given at the start so that learners can work with it during the period of study, notice and re-notice the language in context and the link to assessment. Do you use any productive tests?

  • Hello Yogita. I like the idea of setting tasks that give learners the opportunity to use the lexis in appropriate contexts - that would tell you a lot about how they had learned.

  • Hello Helen. I like the use of the short tests as part of recycling within the classroom.

  • Really well put Ricky.

  • @GrahamWilson I agree - it comes back to the question of the purpose of the assessment and what the results are supposed to indicate. As you say, a test for entry to US institutions might well require US spellings and usages and a test in a location with its own variety of English might accept usage that would be correct in that context if not in others.

  • Hi Graham, I would never want to discourage teachers from writing their own tests but I would urge them towards particular formats and usually away from multiple choice which can be really hard to produce (and usually not a good investment of time for teachers). I think if teachers talk with each other about assessment and look at each other's work then better...

  • Hi Larysa, Yes, we do often build courses around what our learners need to be able to do. What the Grammar and Vocabulary profiles do is provide suggestions as to the linguistic exponents that might be appropriate at that level, which can be a useful starting point to help teachers work out what language they might present.

  • Hi Elena, It's a good question. The Grammar Profile is not an absolute guide as to what should be covered and learners will often be exposed to 'higher' forms in the lower levels. I think you could start by raising awareness of the form - maybe highlight it in a text and let them notice it and establish its meaning. Almost treat it like a lexical item and...

  • Hello Dan. Great questions. I saw a really good session by John Field who explored the issue of what impact listening twice had on students (he used an IELTS task where they usually only listen once). It was a small-scale project but he found that listening twice tended to allow students to pick up additional marks but did not take them out of their band (so...

  • I think the experiences with flawed test papers show the importance of peer review and transparency in assessment and, if possible, external scrutiny as part of the testing process. Good test writing is incredibly hard and best done collaboratively.

    The issue of correct English is a difficult one. I've worked in testing teams made up of UK, US, Australian,...

  • Hi Maria. You have hit the nail on the head here! Multiple choice items are really time consuming to write well, test receptively and often in quite a limited way, and, as you say, learners tend to fixate on the correct answers. I think for busy teachers, they are often the least efficient form of assessment (okay, they can be marked quickly, but how long was...

  • Thank you for this Larysa. The range of items is good and looks at slightly different features. Maybe the scrambled items are slightly easier than the gap fill, so perhaps those could go first?
    This would be a good set of tasks for a progress test - learners could learn from them as they worked through them and the feedback could be discussed peer-to-peer in...

  • Hi Ricky, Thank you for your post. Modals are particularly problematic in multiple choice questions because it so often comes down to the speaker's intention. You are absolutely right to add more context but I am still not sure it would work as an item - I could still see A, C and D working (not so much B but that might just be me!). However, if you wanted...

  • Hi Graham, Test centres are sometimes given additional items to pilot and there is a really rigorous review process. I haven't written for Cambridge but I have written for other boards/institutions and often there would be a writers meeting where proposed items would be gone through forensically. It was often quite a humbling process when a test you thought...

  • All good points Tom. Piloting can also work well, both with colleagues and learners as it throws up interpretations that the writer just does not come up with on their own.

  • @DanGerrard Hi Dan, yes - you're right. This can result in some very 'interesting' tests, though they might struggle to be described as valid or reliable. But a document which states, even simply, what a test is designed to assess and how, even if just a couple of pages, can go a long way to helping to get people to think about their assessments.

  • Hi Erik, Absolutely! And the practice element of the classroom is really where we need to spend the time - helping learners notice, re-notice and systematise the new language through meaningful activities so that they can use it in the real world.

  • Hello Sarah. Thank you for your comments. I thought the reflection on how children learn is important - they know the grammar intuitively even if they don't know the metalanguage. For learners, we need to make them aware of the patterns used to create meaning - or as you say, they will struggle to construct their meanings effectively.

  • Hello Hien, I agree with you - grammar is part of a communicative system and should be presented and tested in a meaningful context. Thank you for your comments.

  • Thank you for your comment Erik. When you test your learners, do they pick up on the difference between what they do in the classroom and what happens in the test? Do you give the learners a context for the vocabulary and lexis?

  • A really good point - learners revisit language across a course and it can be a surprise to them that there are different meanings/uses/forms to a word that they think they 'know'. A group of strong B2 students were once astounded to discover that 'research' could be used as a verb.

  • Hello Elena. I agree about assessing vocabulary in context. On the issue of word lists, I think that it depends what you do with them. Many students are very good at learning vocabulary this way and like it so perhaps we can adapt it and get them to use it. One text book I worked on (long, long ago) provided a weekly word list at the start of the chapter and...

  • Hello Maria, Thank you for the response. You make a good point here - that lexical knowledge is made up of different aspects of which spelling is one - so we may want different tasks for these.

  • Hello Anne, I definitely agree that short tests around grammar/vocabulary can be motivating. I think that the key element is that the language needs to be in a meaningful context.

  • Hello Sarah. That's a really good point - the expectation of the learners is a key thing we have to keep in mind. We might have reasons for deciding that an integrated skills approach is the best one for a situation, but if the learners expect an explicit grammar and lexis focus and we don't provide it, they may lose faith in us. And there is good evidence...

  • Thank you Omar. A good test specification (the document describing a test and providing guidance to test writers or users - there might be different versions of it) should set out all of those factors to help people choose texts which are suitable.

  • I agree Tony - it can be hard to discriminate between B2 and C1 - especially at the top level of B2+ where it sort of merges into C1. About 7 years ago, IELTS re-calibrated the top level of B2 and C1, which suggests that we are not alone in finding the distinction difficult. I think the idea of 'familiar' topics, including those in the learner's area of...

  • Hello Patricia, I think that would work really well in class, especially using it as you have suggested. Would you want to use it in a test though?

  • Hi Tony, I agree that humour does not always survive translation, either in terms of language or culture. In some countries where I have worked, we had to be really careful talking about traffic accidents because there were so many fatalities on the roads that just about everyone knew of someone who had been killed.

    On the other hand, I once saw a...

  • I think I fall into that category too :)

  • Hello Omar and thank you for your comment. When you say language level, do you mean the level of the test takers or the level of the text (listening or reading)? Language level would include many different aspects such as length/speed of speech, complexity of lexis and grammar, and so on.

  • The real-life aspect is important, as you suggest. The level of the learners also needs to be taken into account, which is why we might choose to manipulate the answers we are looking for - perhaps we might want low-level learners to pick particular bits of information out of the text.

  • That sounds spot on :)

  • Hello Dora, do you have a clear notion of when? As we have suggested, spelling can be considered, but the key is being clear about when and how.

  • Good that you give the feedback, Fabiana.

  • Hello John, it sounds like there is a clear agreed scheme in your context, which is what is needed. Thank you for sharing.

  • Hi Elena. I don't think it's a problem - we usually listen to things for a purpose and listen to things we have some knowledge about. For learners in academic situations, what you are doing is quite authentic because you would read about a topic or hear a talk and then go to a seminar or another class on the same topic. You then have information to compare the...

  • Hello Mahu, yes it is the last option but it is a useful thing to have ready.

  • Hello Alexis; I agree that authentic recordings will help learners begin to deal with things at speed. Dictation has its place though - I think Luke Harding talked about it at one point - very good for intensive listening and short frequent practice. You can also try the dictogloss (dictating at normal speed with learners noting as much as they can) which can...

  • 'Enchant them to read' - wonderful phrase! Love it!

  • Exposure to a wide range of accents is a good thing for learners - given that globally, learners are more likely to engage with English speakers from non-English speaking countries. If you walk round an international hotel lobby in China, you can hear Russians speaking with Germans or people from Arabic countries speaking with Indians. The outsourcing point is...