Olena Rossi

I have an MA and a PhD from Lancaster University, specializing in language testing. I also have practical experience working as a teacher, teacher trainer, examiner, item writer, and item reviewer.

Location Lancaster, UK

Activity

  • Hi Rasika! You've made a good point - a mismatch in one option can lead to other options being answered incorrectly, too. This is a weakness of matching tasks.

  • Hi Vgokce! There is also the Aptis Advanced test, which goes up to C2 level. In fact, the Aptis test has many components; the Core test (grammar and vocabulary) is only one of them. It is compulsory, i.e. everyone who takes Aptis has to take it, while the other papers are optional, e.g. you can take the Core + Reading, or any other skill. In the Writing and Speaking papers, grammar...

  • It's great you disseminate learning!

  • Hi Fathi! I'm not aware of any research that would prove that matching tasks are a good indicator of overall language proficiency. Could you point me to some?

  • That's a good point!

  • I think you mean Aptis, Savani :)

  • Hi Fathi,
    I agree with you, generally. However, communication needs differ for different people. E.g., some might need to communicate face-to-face and, therefore, have to have listening and speaking competency. Others might need the language to read and write, e.g. in an academic context. Classroom teaching and testing should be adapted accordingly to suit...

  • Hi Robab! It seems like you have a really diverse student profile. Are they in one class?

  • I think you can use typical phrases found in academic language (e.g. http://www.phrasebank.manchester.ac.uk/) - split them in half for students to match, or to match sentence halves that contain those phrases.

  • Hi Mathieu! Dictionaries are very useful in learning vocabulary, but you need to make sure students also know how to use words in a sentence / context. Many online dictionaries are free and have multiple examples of words used in sentences taken from corpora - you can use these naturally-occurring sentences to create your own vocabulary tasks.

  • Hi Malika! Could you clarify what you mean by "poor exposure to well exposed sets"? I think you're using some terminology here I'm not familiar with.

  • Very interesting observations of the impact of age difference on learning!

  • Hi Martha! You can try to make the instructions clearer. You can also include one example showing students what they need to do.

  • Hi Kathy! Please use this link https://www.britishcouncil.org/exam/aptis/practice-materials/grammar-vocabulary , scroll down and click on the button 'take a demo test'

  • Hi Shani, immediate testing is a good idea because it shows the teacher how well the students understood the new material. However, it is also useful to test the grammar point / vocabulary later on, e.g. in a week or two, to check knowledge retention. Of course, you should also test them integrated with the skills.

  • Thanks for the detailed description of your class profile!

  • Hi Geethanjalee! What does ATI stand for?

  • Hi Sarah! Guessing is a problem with all selected-response tasks, e.g. MCQs, True/False, etc. If you add a couple of distractors to the matching task you will reduce guessing, though it's not possible to eliminate it completely.

  • A good point, Andreja!

  • I love your metaphor!

  • Hi Urszula! When teaching languages, we are not directly concerned with developing our students' thinking skills - we develop their language skills. For language testing specifically, we normally refer to language-specific cognitive processes, which can also be lower-/higher-order. For example, listening for specific detail is a lower-order cognitive process,...

  • Hi Eresha! You wrote that your students understand the vocabulary "to a great extent". I am wondering if they can be categorised as A1, then, because A1 level presupposes the knowledge of only very basic vocabulary. It might be that your students display so-called 'jagged profile': their vocabulary might be at a higher level but some other knowledge / skills...

  • Hi Edgar! Is there any reason why you focus on teaching grammar only?

  • @SANDRAVERONICAGarcia thanks, Sandra - I can access it now

  • A very good example, Neza!

  • All very good points, Nicholas!

  • A good point, Claudia! Matching tasks should be used alongside other tasks that would test grammar/vocabulary in context, e.g. as part of the four skills.

  • That's an excellent practice, Fiona!

  • Hi Lesley! It's easy to eliminate guessing by adding a couple of distractors in the right column. Also, matching tasks are seemingly easy, but care should be taken when producing them. It's particularly important to ensure that only one way of matching is correct. If you produce such tasks yourself, it's worth showing them to a couple of colleagues and asking them...

  • Hi Claudia! Are your students young learners? How old are they? What is their proficiency level, in terms of the CEFR? The two tests we're discussing in this step are not government tests, but students sometimes take them on top of what is offered at their school - as a way to cross-compare and to get a second opinion about their progress.

  • Hi Daw! There are no correct or incorrect answers to this question. It all depends on your particular context, curriculum, your students' needs, etc.

  • A great summary, thank you, Mark!

  • Good luck with your article, hope to read it!

  • Hi Mariia! Great examples, very imaginative!

  • Hi Shelley! Maybe the tasks are too difficult for the level? If so, students tend to lose interest and motivation. You can try varying them, e.g. by matching words to pictures and not only to words / phrases.

  • That is true. To limit the possibility of correct guessing, we need to have more options in column B than in column A - the extra options are called distractors.

  • Hi Olabisi, the Aptis Core paper only covers grammar and vocabulary, and you can't take it alone, without taking at least one more paper for one of the skills. E.g. you can take the Core + Speaking, or + Reading and Writing, etc.

  • Hi Aida! Both the Cambridge English tests and the Aptis test are aligned to the CEFR. Besides, as I wrote above https://www.futurelearn.com/comments/53628573, we can't say which test is easier because of the different approaches to testing: Aptis is a test for all levels while the Cambridge exams target a particular proficiency level. Therefore it won't be correct to...

  • Hi Karlygash! This is because Cambridge tests are aimed at a particular level of proficiency, i.e. students need to have an idea what level they are before they take the test to sign up for the right test for them. Aptis is a test for all levels - you can't fail it, you can only pass it at a particular level, e.g. A1, or B2, etc.

  • Hi Fiona! This is a very pertinent observation. In fact, the Core paper (grammar and vocabulary) is used to gauge students' proficiency level and then refine their final score, after they take other part(s) of the test.

  • Hi Thuy, as I wrote here https://www.futurelearn.com/comments/53628573 Aptis covers the whole range of proficiency from A1 to B2, so some items might be difficult for students because they are aimed at higher proficiency levels. But it doesn't mean the weaker students won't pass the test - they will pass it at the proficiency level they have, e.g. if they...

  • Hi Priyantha! Both Aptis and Cambridge English tests cover a similar range of proficiency levels. The difference is that Aptis covers the whole range in one test, while Cambridge English exams are aimed at a particular level of proficiency.

  • @GangaHerath Thanks, Ganga - these are good examples!

  • Hi Anita! Yes, you're right, and this is a problem with all selected-response tasks (tasks where students have to choose the correct answer from options given). We can make matching tasks more reliable by adding a couple of distractors.

  • Hi Jithmi! This is a good point! The problem can be solved with careful item-writing: those who produce the test items should make sure that it's not possible to match sentence halves only from understanding the context of the sentence, the matching should depend on knowing a particular grammar structure (the task above, with if-structures, is a good example)....

  • Hi Hiruni! Do you have a particular reason for that?

  • Hi Jane, the choice is particularly useful if we want to make the test meet students' needs. For example, a person is learning the language to be able to communicate but does not need to write in English. This is especially true for adult learners but can also apply to YLs to better align with the learning curriculum.

  • Hi Gayani! If you test grammar and vocabulary as part of writing/speaking, I don't think you test them separately - you test them as part of skills. If you tested them separately, you would administer tests of grammar and tests of vocabulary to your students.

  • Thanks for a detailed comment. Would you include pragmatics as part of discourse or as a separate system?

  • @MadhuriKannangara It's definitely important for higher-proficiency students to know the form of grammar structures. We can test it as part of productive skills, though, by including a 'grammar accuracy' criterion when assessing speaking and writing skills.

  • All good points!

  • Hi Shelley! Do you mean you assess lexicogrammar?

  • That is very true. But do they need to be assessed in a separate test, or can they be assessed as part of a reading / writing / etc. test?

  • @SANDRAVERONICAGarcia I think Karlygash meant lower-proficiency levels, not necessarily young learners. Adult learners sometimes need explicit instruction, so we might then want to test their understanding in separate grammar/vocabulary tasks.

  • @ValentinaPuchkova In classroom assessment, it's sometimes difficult to separate teaching and testing, especially when testing is formative.

  • Hi Ganga! You've made a great point about differences in testing meaning and form.

  • Hi Prabha! We don't normally call grammar and vocabulary 'skills', because skill is something one is able to do, e.g. read, write, or speak. We normally say that someone knows grammar or vocabulary, and they are often referred to as language elements.

  • Hi Mevuni! Can we also test grammar and vocabulary in speaking tasks?

  • Hi Anita! Are you talking about your local tests? Why, do you think, they're tested separately?

  • Hi Aida! Do you think they're also involved in speaking? What about listening?

  • Hi Madhuri! Can you explain why at higher levels grammar should be tested separately?

  • Hi Sthez! Multiple matching is a useful task format to test vocabulary. It's important to ensure that there are more options in column B than words in column A, otherwise the last remaining words will be matched by default. Have you also tried using other task types, e.g. multiple-choice, gap-fill? Cloze tasks can also be used, although they test more than...

  • Thank you for carefully watching the video and giving such a detailed summary!

  • Hi Rocio! Can you explain why three times? Also, you've written about differences in the recordings. What about the differences in the tasks?

  • Hi Andreja! It's a good idea to spell out the address. Why do you think that each word cannot be worth a whole point? And why is it a problem that the task is worth a lot of points altogether? I believe giving one point per word is fairer. Imagine one gap should be filled with three words. One student misses or makes a mistake in one word only, while the other...

  • Hi Rocio and Pavithra! It's good to expose your students to some native English accents, but we shouldn't forget that it's not only native speakers that our students need to understand, non-native accents are also important and valuable. Because English is primarily used as a Lingua Franca nowadays, we should expose our students to a broad range of both native...

  • Hi Andreja! I agree that finding the right listening texts for lower-proficiency students can be difficult. On the other hand, if teachers read out written texts to students, this is not authentic, because in real-life listening situations people don't read written texts out loud to each other (or at least not often). We should also bear in mind that our students...

  • Hi Olesya! It's a very good point - about the need for your students to understand a broad range of accents.

  • Hi Tin! Apparently Flash needs to be enabled. Try accessing the website in Internet Explorer (or Microsoft Edge) rather than Chrome.

  • Hi Giselle! I believe your language proficiency is much higher than your students' so probably your ability to understand the text cannot be a guide in this situation. Do you know what level your students are, e.g. in terms of the CEFR scale?

  • Hi Mariana! Our intuitions are very individual and not always correct. It might be a good idea to consult several online sources (see the next step of this course) that use statistical and corpus tools to measure text difficulty more reliably.

  • @sarahalbalawi Hi Sarah! You've made a great point - we should design tests in such a way that they discourage rote learning and promote learning the language. Then our tests will have positive washback.

  • Hi Mohammad! Field dependence/independence is related to individual differences in cognitive styles, in particular the way information is perceived and analyzed: "Field dependence indicates a tendency to rely on external frames of reference in cognitive activities and is thought to foster skill in interpersonal relations, whereas field independence suggests...

  • Haha, that's a good way to make a point! Btw, good test instructions should be concise, clear and unambiguous, i.e. they should not involve a lot of reading and information deciphering, otherwise construct-irrelevant variance is introduced into the test. When test-takers don't understand and/or follow instructions, very often it's not the test-takers' fault...

  • That's very true, Lesley!

  • Hi Re! I don't fully agree with you here. English has been a language of international communication for a long time now and, according to statistics, more people use English to communicate with non-native English speakers than with native speakers. It's often the case that two people for whom English is an L2 use this language for communication. There is the term...

  • Can you explain, Valentina? What do you mean by "the aim of the test"?

  • @ReMon I was told that you’ll need to open files in Internet Explorer or enable Flash on your browser to open the files. This information was added to the task now.

  • @AndreasKaplan I've submitted a paper on this issue :) don't know when it's going to be out yet. If you follow me on social media (e.g. Twitter or LinkedIn) I'll post it there

  • Hi Penny, thanks for sharing your experience in creating listening materials!

  • Hi Jan! The reason for listening might be different, though. Do we listen to get a general idea of the text (gist) or do we want to find out some specific information (detail)? Or maybe we want to know something that hasn't been said directly (inferencing or attitude)? Different reasons require different types of listening processing and different listening...

  • Hi Valentina! We also need to bear in mind students' proficiency level. E.g., inferencing is not normally tested at lower proficiency levels because it's a higher level of processing.

  • Hi Mai! You can find lots of planned lessons on the TeachingEnglish website https://www.teachingenglish.org.uk/

  • Hi Valentina! It's possible to base several tasks on one text, e.g. an MCQ task focussing on specific details and then a summary-writing task testing global understanding. In this case, different tasks would tap into different types of processing / different sub-skills. But I don't think you can create very many different tasks based on one text only because...

  • Hi Re! Thanks for letting us know - I've alerted the course administrators.

  • Hi Tilak! Text books often target lower-level processing, but you can supplement textbook tests with your own material, to make sure your students get practice in both lower- and higher-level processing.

  • @AndreasKaplan That's because authenticity can be overdone, too :) I've done some research into this topic and I believe item-writer training is the key here.

  • Hi Daniel! This is a very good point - the test should strive to be fair to all students. For large-scale testing, however, it might be difficult to find names and addresses that would be familiar to everyone globally.

  • Hi Andreas! There are ways to make pre-prepared scripts sound more authentic, e.g. by incorporating features of authentic spoken language such as pauses, hesitations, reformulations, etc. The script itself should be produced not like a written text but like a transcription of an authentic conversation, using transcription conventions (e.g. not sentences with...

  • Hi Anita! That is great you're skilled at producing your own listening materials. Can you share some tips on how you do it? Many teachers here wrote that creating listening texts is difficult for them.

  • Hi Priscilla! By 'adapting' do you mean you edit the listening files? I am very interested how teachers actually do it because many teachers here wrote they adapt authentic listening materials.

  • Hi Chandana! That is very interesting. Could you share how you adapt the materials? Thank you!

  • Hi Meera! Thanks, and I'm glad you found the information useful. As for reading strategies, a course focussed on teaching the language (this course's focus is testing and assessment) might be of help. You will be able to find several MOOCs for EFL teachers on FutureLearn. The TeachingEnglish website by the British Council is also full of good ideas on teaching...

  • Hi Mariia! I wouldn't agree that gist listening is less useful because it examines comprehension more passively. I don't think there is such a thing as passive comprehension, anyway. Section 2.1 of this week discussed differences between passive and receptive skills. I agree that attitude is something that is more suitable to test at higher levels of...

  • Hi Nazneen! Traditionally, the last question is about the speaker's attitude.

  • Hi Priscilla! Listening for specific information is fine at lower proficiency levels, although you might want to include some gist questions as well. As your learners move to higher levels of proficiency, it'll be more important for them to develop a range of listening sub-skills.

  • Hi Mihiri! Would you say that this task also tests listening for attitude? It is not necessary to include all listening sub-skills in one task, and it's often not possible. Several tasks might have to be used to make sure all main sub-skills are tested. Besides, what we test depends on students' proficiency level - we only test inference at higher levels of...

  • Hi Silvia! I would agree that it's difficult to talk about percentages - it all depends on the students' level and every concrete test. Different listening texts also lend themselves to testing different listening sub-skills, e.g. some are not suitable for testing attitude, while others might not contain enough concrete detail to base many questions on.

  • Hi Joyce! How have you arrived at these percentages? Is it a recommendation from your school/institution? Is it based on some research?

  • Hi Helen! It's not normally possible to find authentic texts to use at this level. You might need to choose texts with the right content (e.g. concrete information, familiar everyday topic, short and straightforward) and then adapt them for the level in terms of the grammar and vocabulary.