Maria Carmen LAFUENTE

Activity

  • Thank you very much.

  • Thank you very much for the course. I appreciate all the tips and resources that the educators have shared with us.

  • Questions 2, 4 and 10 seem to be problematic for different reasons.
    Q4 seems to be too easy.
    In Q2 and Q10 students in the lower group get better results than those in the higher group.
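    A minimal sketch of how this kind of discrimination check could be computed in Python, assuming item responses are stored as 0/1 scores per student (the function name and the conventional 27% upper/lower grouping are my own illustrative assumptions, not from the course):

        # Discrimination index D = p_upper - p_lower for one item.
        # A negative D flags items where the lower group outperforms
        # the higher group, as described for Q2 and Q10 above.
        def discrimination_index(scores, item):
            """scores: one list of 0/1 item scores per student."""
            ranked = sorted(scores, key=sum, reverse=True)  # highest totals first
            k = max(1, len(ranked) * 27 // 100)             # 27% upper/lower groups
            upper, lower = ranked[:k], ranked[-k:]
            p_upper = sum(s[item] for s in upper) / k
            p_lower = sum(s[item] for s in lower) / k
            return p_upper - p_lower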

  • Thank you for this most valuable tool. Unfortunately, though, I have no tests with me at the moment to try it out.

  • Q8 = 0.18: too difficult, as it falls below the acceptable range.
    Q9 = 0.81: quite easy, but OK for an achievement test.
    Q10 = 0.71: acceptable for all types of test.
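    For reference, these facility values are simply the proportion of test-takers who answered each item correctly; a minimal sketch in Python (the function name is illustrative, not from the course):

        # Facility value: proportion of correct answers for one item.
        def facility_value(item_scores):
            """item_scores: 0/1 scores for one item across all test-takers."""
            return sum(item_scores) / len(item_scores)

        # e.g. 18 correct answers from 100 test-takers -> 0.18, like Q8 above.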

  • I think that speaking tests are perhaps the ones that can best reflect real-life situations. Tasks designed to make students interact in a group or pair discussion, trying to reach an agreement, for example, using appropriate structures and register can give us reliable feedback on the students' ability to use the language.

  • Very helpful advice indeed. Test piloting with colleagues and test feedback to make changes are crucial.

  • I teach adults from the age of 16. Teenagers are usually rather self-conscious when it comes to speaking in public. They are often given tasks to do in small groups, so this should be less intimidating. Young adults tend to struggle when faced with certain topics of conversation, as their knowledge of the world is still quite limited. As a result, they seem to...

  • I have never used portfolio assessment with my groups. However, at the beginning of the course I ask my students to complete a profile where they can express their strengths and weaknesses and their expectations of the course. Most of my students use English at work or when doing research for their studies, so they have plenty of samples of writing and speaking...

  • I teach adults from the age of 16 and from different backgrounds. They are usually keen and motivated students, but sometimes they cannot attend classes regularly, which makes continuous assessment rather difficult.

  • For B.1, B.2 and C.1 our students take tests that have been set by the education authority. They have been piloted and some past papers are used for training as mock exams.

  • In my school formal assessment is planned beforehand. Less formal tests are planned as the course progresses and depend on the need to reinforce particular aspects of the learning.

  • I would like my students to have a sense of achievement not just because they have passed a final exam but because they can feel they are making progress in different skills. To this end, continuous feedback on their production is essential. Keeping learners engaged and motivated is the great challenge.

  • @SherylCooke Thanks for sharing all these resources.

  • Thank you very much for this week and the video with all the useful information.

  • Thank you very much for a really useful video, and for reassuring those who have to face the challenges of online assessment. I am sure it will be greatly appreciated.

  • I totally agree with Mr O'Sullivan's reflections on how we assess vocabulary and grammar. Thank you for shedding light on this subject.

  • All the tools above will no doubt be extremely helpful when developing my tests in the future.

  • I am familiar with cloze tasks but I had never heard of C-tests before. I think I'll try them out at some point in my future classes.

  • I have used matching tasks in vocabulary tests at lower levels, for example with adjective opposites. They are easy to mark, but I don't think they give us reliable feedback at higher levels. Unless we provide good distractors, it could be a matter of luck for some test-takers.

  • We use MCQs for placement tests as they need to be marked quickly. When designing tests in our level team we find that it is really hard to come up with 4 strong options. I agree that context is essential if we want to avoid ambiguity or various correct answers.

  • We work in level teams in our school and tests are developed and piloted before they are delivered to our students. If I want to check the knowledge of a specific grammar structure or vocabulary area I usually develop a test myself, which can be corrected in class. Placement tests are re-used every year.

  • Modal verbs are often difficult to test as they express attitude. A clear context should be provided in order to avoid ambiguity or interpretation.

  • I totally agree with Mr O'Sullivan on this matter. When I give my students feedback on their writing and speaking I always try to give them specific information about the structures and vocabulary they have used. I have noticed how most of my students make an effort to incorporate the new grammar and vocabulary input in their performances although it may not...

  • Thank you for these helpful tools.

  • First of all, I like the fact that test-takers can choose the skill they want to assess. As regards the Aptis grammar and vocabulary test, I think that the variety of tasks in the vocabulary section can offer reliable feedback on this skill. I have also noticed that this test includes items from different topics that range from general or basic to more...

  • I didn't get any feedback either.

  • You need to click on one of the links above and the links to the booklets will appear as zipped files. Good luck!

  • Our students take separate tests for listening, and for grammar and vocabulary, at levels up to B.2.2. Integrated skills are assessed at C.1 and C.2, when students are required to take notes from a lecture.

  • Before the start of the course we deliver placement tests for new students. During the year students take a number of achievement tests and a mock exam towards the middle of the course to train them for a standardised proficiency test at the end of the school year (B.1, B.2, C.1). External students can also take this final test.
    Generally speaking, placement...

  • Use of English tests cover grammar structures as well as vocabulary in written form. Grammar and vocabulary are also assessed in the students' writing and oral exams.

  • I think that each learner has their own method of learning new vocabulary. Word lists can work for some but not for everyone. Recording new words in context is key to helping learners use them effectively.

  • I like the fact that this diagnostic test covers a variety of aspects of vocabulary, such as word formation and collocation. I am not so sure of the purpose of identifying invented words, though.

  • I'm glad to see that most of the aspects considered in the article were on my list. At B.1 and B.2.1 levels I am particularly interested in teaching new language in context. Vocabulary tasks in tests will include, for example, supplying the suitable word for a definition, writing opposites, providing a suitable collocation, etc. I also make a point of...

  • As a student myself I used to take quite a lot of multiple choice grammar and vocabulary tests. With my lower-level students I have given only vocabulary tests based on specific topics before moving on to the following lesson in the coursebook. Nowadays I always test grammar and vocabulary in context using, for example, cloze tests.

  • Thank you for this second week of the course. I'd like to highlight two ideas from the video: collaborative work when designing a test and transparency towards our students. I find these two concepts essential.

  • I used to consider spelling when marking A1 and A2 listening tests. At higher levels, however, we try to design tests that do not involve writing answers.

  • I prefer the 3rd version. I would play the recording twice for an end-of-course A1 test.

  • @SherylCooke Audio 2 is definitely more authentic.

  • I think the recording is suitable for an A2 learner. Marking the task, however, can raise a few problems. I don't think students know how they are going to be scored if a word is missing or misspelt. Besides, comprehension of a number and of a proper noun ("George", for example) should not be given the same weight.

  • Our reading and listening tests include tasks with different weighting. Students do not need to write anything in this type of exam. All they have to do is tick or circle the right answer. Test takers are familiar with the marking criteria and the score of each task.

  • Thanks for the tips. I'm looking forward to reading the next section, as I am interested in test question weighting.

  • I use ready-made materials from course-books for practice but most of the time we find our resources online: news, documentaries, interviews, etc. Needless to say, these videos need to be edited, which is time-consuming. We also try to download and save them in case they are taken down or the internet connection does not work when they need to be used.

  • I've done the Aptis demo test and I like the fact that it is based on real-life situations and includes a variety of tasks. I can see the progression from asking for specific information in the first tasks towards inference later on. I think that this progression is suitable for a placement test but may not be appropriate when testing students who are...

  • Thank you for sharing the links. My students are between B.1 and B.2.1.

  • I think that at lower levels students are usually asked to listen for gist, for specific information and for detail. At higher levels, from B.2, the two more complex sub-skills, listening for inference and for attitude, can be added. I would include at least 4 different tasks in a B.2 test so that there would be a balance between all the sub-skills.

  • My B.1 and B.2.1 students usually struggle with the authentic videos which are used for the listening tests. A variety of accents and the speed of delivery are the main difficulties they encounter. We try to compensate for this by including a variety of recordings, so that a listening test will typically include at least 4 different tasks with different levels of difficulty.

  • Listening to recorded messages over the phone requires special attention because it is not face-to-face communication. Firstly, lower order processing will help us to recognise the basic units that make up the message. We may also need to focus and make a special effort if there is some background noise involved in the recording. Higher order processing will...

  • I find the tools to analyse the difficulty of a text most useful. I am definitely going to try them out. Professor O'Sullivan's advice on assessing reading is also extremely helpful and no doubt going to be taken into account.

  • We use most of the tasks shown in the examples above. At B.1 and B.2.1 levels our reading tests tend to include True / False, multiple choice, ordering tasks, gap fill and cloze. Personally I am not too fond of the True/False task, even if it includes the "not given/mentioned" option.

  • What I find difficult sometimes is to avoid ambiguity when thinking of suitable items for multiple choice. This is particularly true at higher levels. I admit that sometimes we try to make a reading test more challenging by including tasks that are not really testing the comprehension of the text but other skills. I suppose that this is a clear message that...

  • All the topics from the texts above seem to be closely related to British culture and I do not think many of my adult students would be familiar with them. Perhaps adding some graphics or a small glossary with the key words would help avoid confusion or misunderstandings. On the other hand, however, eliciting the meaning of the new words could be an...

  • I have never used any text analysis tools before. They look quite useful so I intend to use them at some point in the near future.

  • As regards contents, I make sure that the topic is appropriate to the target readers. As for the difficulty, I check the vocabulary and the complexity of the structures. If it is authentic material, like an article from a paper, for example, I consider the possibility of simplifying it to make it more comprehensible without affecting the contents.

  • 1. Abstract: aimed at university students at advanced level. It could be part of intensive reading to do research for a presentation.
    2. Financial report: aimed at adult students taking part in a business English course. Reading for detail.
    3. Hotel brochure: A2 (young) adult learners. Good to practise directions. Intensive reading looking for specific...

  • My B.1 and B.2.1 students take reading tests that involve both higher and lower level processes. The texts are always authentic and there are a variety of tasks to check their comprehension, for example, true/ false, multiple choice, multiple matching (headings with texts).

  • In the last 24 hours I have read quite a few emails and text messages, travel information from an online guide, a recipe from a cookbook, directions for handicraft projects both online and on paper, a couple of social media posts, and last, but not least, course materials. I would consider all the above part of my intensive reading, as I was reading for a...

  • I think both skills involve processing information, which is an activity in itself. In both processes the reader and the listener need to identify the elements of the speech or text that will enable him or her to decode the message successfully. Seeing and hearing do not require attention, whereas reading and listening do.

  • @RichardSpiby I've just watched it. Very interesting and useful. Thank you.

  • Thank you for the first week of the course.

  • Looking forward to it!

  • It's difficult to come up with meaningful writing tasks for adult students as their interests can vary enormously. I try to give them a choice whenever possible. At B1 level, for example, they can recommend a restaurant, a holiday destination, a book, etc. Tasks which are based on their personal experiences are the ones that work best.

  • My students are between B.1 and B.2.1. I use a correction code so that they can correct their writings themselves. They are encouraged not only to correct their mistakes, for example wrong tenses, word order or form, but also to "embellish" their writings by using the new vocabulary and structures that have been introduced and practised in the previous lessons. I...

  • We have standardisation meetings every year in which we assess students' speaking and writing with marking criteria set by the institution. I think these meetings are extremely helpful to both experienced teachers and those who have just started working in our language schools. They usually generate interesting debates when teachers justify their marks.

  • I use the analytic rating scale in order to give students as much detailed feedback on their writing as possible.

  • Unable to access the test. I'm afraid the link does not work.

  • My students are often asked to write both informal and transactional emails, which I think can reflect real-life situations. At higher levels they are often required to produce argumentative texts. I am not sure many of my students will have to write articles or opinion essays in their personal lives.

  • I would say band 5 corresponds to most of my students' level as regards fluency and lexical resource. For the next level up I would encourage them to widen their vocabulary range by using different connectives and discourse markers. I would ask them to choose a few "favourites", making sure they use them appropriately in their speaking tasks. This could help them...

  • Peer assessment seems to be a good resource.

  • Time and large groups are major constraints. Recording could be a solution, but giving students meaningful and personalised feedback takes a long time.

  • I have used all of the formats and tasks but always face-to-face.