Using a quality framework tool
You will shortly be required to submit your specification. Thinking back to the ASPIRE framework, covered in Week 2, the specification is what forms the basis of the implementation phase. However, before getting carried away with thoughts of media development just yet, there is an important step which must be undertaken as an effective quality control measure. This is what we refer to as the peer review 1 check.
In the HELM team, we use our own quality framework tool for the peer review 1 stage. The number 1 is significant here as this is the first ‘official’ review in the process. There is a second review stage, peer review 2, which takes place after the implementation (after the RLO has been created). Both of these stages are very important in ensuring quality within an E-learning resource but, as we will not be focusing on the physical creation of the resource as part of this course, it is the first peer review stage which we will focus on.
It is reasonable to assume that all specifications, including your own, will undergo some degree of modification, so any first attempt should be considered a draft. From there, the process should be an iterative one, cycling between spotting initial errors and making the changes arising from peer review 1. A top tip during this period is to think carefully about versioning all changes so that you do not lose sight of the work in progress. Versioning allows stakeholders to compare changes or, in some cases, even revert to a previous version if necessary. It is therefore a good idea to clearly and uniquely identify each version of the specification on its way to becoming the finished article, ready for use in the implementation stage.
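For those comfortable with the command line, the versioning advice above can be sketched with a version-control tool such as git (any version-control system, or even a disciplined file-naming convention, would serve the same purpose). The folder name, filenames, tags and commit messages below are purely illustrative:

```shell
# A minimal sketch of versioning a specification with git (names are illustrative).
git init spec-project && cd spec-project
git config user.name "Reviewer"                 # local identity for this example only
git config user.email "reviewer@example.com"

echo "Draft content" > specification.txt        # first draft of the specification
git add specification.txt
git commit -m "v0.1: initial draft specification"
git tag v0.1                                    # uniquely identify this version

echo "Revised after peer review 1" > specification.txt
git commit -am "v0.2: address peer review 1 feedback"
git tag v0.2

git diff v0.1 v0.2 -- specification.txt         # compare two versions
git checkout v0.1 -- specification.txt          # revert the file to a previous version
```

Each tag gives stakeholders an unambiguous label to review against, compare, or roll back to.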
Peer review 1
There are two important aspects to a successful peer review 1 process.
The first is making sure that the right personnel are included in the process. It helps to involve a range of people who can look at the specification from different perspectives. Ideally, the specification will be reviewed first by a couple of the stakeholders who have been part of the design process. However, it is especially important to obtain reviews from sources outside of the design group. These should include someone with the knowledge to understand and make sense of the content provided (i.e. a subject expert), someone who can review the pedagogical design, and someone who can review from a technical angle to ensure the suggested functionality is feasible.
The second important aspect is making sure reviews are completed using the same quality framework tool so that feedback is consistent. The tool we use is essentially a form with a specific set of questions against which reviewers can record their thoughts and opinions.
Use of the quality framework tool in the peer review stage will help ensure:
- Accuracy and relevance of content.
- A good standard of usability.
- Appropriate interactions/media/technologies to support active online learning.
- Inclusion of some related self-test/assessment and feedback.
Once complete, it is important that this peer review is discussed with all stakeholders (or at least all those involved in the design process) and that there is an opportunity to record additional comments outside the boundaries of the review questions. All feedback should then be collated and addressed as necessary by the stakeholders. It is worth stressing again that peer review 1 will likely involve a number of iterations to produce a final specification which can be used for implementation. This is natural, and the benefit of overseeing these iterations at this stage will far outweigh the cost of trying to resolve a major dispute during the physical creation of the RLO.
Consequences: poorly defined vs. well defined specification
The following table provides a general contrast of the consequences or outcomes you could expect from a specification that has been either poorly or well defined.
| Poorly Defined Specification | Well Defined Specification |
| --- | --- |
| Many iterations during implementation | No (or few minor) iterations during implementation |
| Media elements that need to be continuously redefined | Media elements which are easy to source and go straight into the RLO |
| Contains content that is irrelevant and wordy | Content is appropriate, relevant and concise |
| Contains a heavy amount of testing or self-assessment | Makes use of a small amount of assessment (with feedback) |
| No glossary to explain abbreviations, important terms and meanings | Full glossary covering all abbreviations, important terms and meanings |
| Contains concepts which are ambiguous or inaccurate | Concepts are clearly explained and accurate |
| Lack of summary or opportunities for further study or feedback | Summary of learning, related links, resources and opportunity for feedback provided |
A poorly defined specification can have many knock-on effects. These include additional time and costs, unnecessary redevelopment, “scope creep” (where requirements constantly evolve) and general frustration for designers, stakeholders and developers alike. Hopefully, by highlighting these consequences, you will better appreciate why it is so important to produce, and work with, a well-defined specification.
Examples of a poorly defined specification
Having considered the consequences of a poorly defined specification, let’s now look in closer detail at a few examples which illustrate where specifications can go wrong.
Have a look at this extract from a specification, below. Can you identify any issues?
Looking at the same extract again, below, we will take each of the two points highlighted in turn.
Firstly, the text highlighted with the red outline shows the specification containing content that is ambiguous. This text looks like a placeholder inserted by the stakeholders perhaps as the result of being unable to agree what the content should be here or in hoping to return to it at a later stage. Whilst acceptable in an initial draft of the specification, the analogy should be clearly provided in the completed specification. It may also be valid to question, pedagogically, whether an analogy is the most appropriate method to help the learners understand this concept best.
Secondly, the text highlighted with the blue outline shows a required media element, in this case an image, which may need redefining. There are a couple of issues for consideration here. Why does the gym have to be “well-known”? It seems the stakeholders intend for the image to contain some organisational branding but this certainly poses copyright concerns for sourcing and use. Other thoughts may include whether the image is relevant or appropriate (e.g. cultural considerations).
This example is an extract taken from the same specification as the previous example. It is for a resource which has a precise learning aim to raise awareness of health-related risks in the workplace. Have an initial look at this extract, below. Can you see anything that might be a cause for concern?
Let's take a closer look at what has been highlighted below.
Breaking it down, the most noticeable problem with this entire section of content is that it does not relate to the learning aim and so should be classed as irrelevant. This content discusses vital signs and checking pulses of colleagues which is off topic. As such, it does not help address the learning aim which is strictly about raising awareness of health-related risks in the workplace.
A further point to consider is that the text highlighted with the red outline could benefit from being reworded. Textual content in an E-learning resource should ideally consist of short, clear and concise sentences, and this is arguably a little too wordy. Additionally, the blue outline highlights information which is factually inaccurate: the pulse found at the wrist is the ‘radial’ pulse (not the ‘popliteal’, which is located behind the knee). Errors can happen during the formation of content in the specification, but it is extremely important that any inaccuracies are amended immediately.
This example is taken from a different specification, for an E-learning resource on the importance of good documentation and handover. What do you think the issues are at first glance? How many issues do you notice that need to be addressed?
Now, looking at the same extract again, below, we will consider the two highlighted areas in turn (to address four issues in total).
As with the previous two examples, there are a few issues to point out with the proposed content in this specification. Firstly, the text provided in the red outlined section contains a couple of abbreviations (“SBAR” and “IG”) which should be written in full the first time they are used. They should initially appear as “Situation Background Assessment Recommendation (SBAR)” and “Information Governance (IG)” and from then onwards it would be acceptable to refer to them as “SBAR” and “IG”. Both these terms should also appear within a glossary which should be defined within the specification to appear as part of the eventual RLO.
Similarly, in this same section, the text includes some terminology which needs to be tidied up. There are a few pieces of jargon such as “info”, “obs” and “docs”. These should be replaced with the full words intended: “information”, “observations” and “documents”. Whilst abbreviating may make writing up quicker and easier, the specification is what provides the basis for the implementation, and developers will use it to create the physical resource. If things are not clear within the specification, it will only result in additional time spent by all involved trying to clarify them.
The second sentence could also benefit from being reworded. The phrase “so that the next nurse knows what they’re up against!” comes across as quite negative, and a more neutral tone would be advisable to maintain objectivity. A phrase like “so that the next nurse understands the information that is available” would be more suitable.
Finally, the suggested inclusion of a 25-question multiple-choice quiz, within the blue outline, is not good practice for RLO development and should be avoided. The use of self-assessment to support E-learning should absolutely be encouraged, but this quantity is simply too heavy. The advice would be to include a much shorter multiple-choice quiz of perhaps 3 to 5 questions.
Use the framework tool provided in this article to assist you in reviewing your own specification to ensure some level of quality assurance. This is available as an OpenDocument Text (.odt) file downloadable from Padlet, or as a PDF in the “Downloads” section at the bottom of this article.
Top Tip: Always keep the overall learning aim in mind during review.
Once you are happy with your specification, having made any amendments you wish, proceed to the next step in the course, where you will be required to submit it for peer review feedback from other learners.
The peer review process is important for quality assurance.
- From a learner’s perspective, what sort of things would frustrate you in a poorly designed resource?
- Do you agree with the points made in the examples of poor specification?
- Has the tool helped you make any notable improvements to your specification?
© The University of Nottingham 2016 (Creative Commons Attribution-NonCommercial-ShareAlike UK 2.0 Licence) except for third party materials or where otherwise indicated