Making digital exemplars

In addition to my usual classroom use of exemplars as a means of familiarising students with the assessment requirements of a specific module, this year I have created a video walk-through of an exemplar. Initially this was to enable those who missed the relevant class to catch up, but the approach was welcomed by students who attended the exemplar activity session as well as those who did not. 

How to create a digital exemplar walk-through: 

• Bring up the exemplar on screen after selecting a ‘good’ piece of work

• Read through and use comments in Word to annotate the work with points which would be surfaced in feedback, particularly comments related to the assessment criteria (of course!). Comments include things done well, things done less well which could be avoided, and opportunities for further detail and development. This tagging process acts only as an aide-mémoire so that, as I create the feedback video, I am aware of what I wanted to include. 

• Open Screencast-O-Matic to screen-record the work as a video while I re-read it and talk through each of the tags: ‘This work includes this … which is useful because…’; ‘this work used literature in this way … It might be better to … because ….’. None of this is rehearsed; that would be too time consuming. The resultant video is a commentary on performance.

• Upload the video and make it available to students.

After using the resource there was some consensus amongst my students that the value was ONLY in listening to the assessment commentary and not specifically in looking at the work. One student described how they listened but did not watch. They then recorded notes about what they should include, remember and avoid. They avoided looking at the work for fear of having their own ideas reshaped. If assessment judgments are ‘socially situated interpretive act[s]’ (Handley, den Outer & Price, 2013) then the digitised marking commentary may be a useful way of making that process more transparent for students, and indeed for other staff.

I will definitely be including this in future modules.

Handley, K., den Outer, B. & Price, M. (2013) Learning to mark: exemplars, dialogue and participation in assessment communities. Higher Education Research & Development, 32(6).

Action research and its link to UK PSF

How does action research link to UK PSF?

Action research can be a useful strand within a learning and teaching staff development strategy.

How specifically though does this type of practitioner research link to UK PSF?

This is useful to articulate so that

i) Staff developers can be conscious of how to advise colleagues
ii) Those undertaking action research can link their work to the national recognition framework
iii) Colleagues undertaking action research may use the links with UK PSF to further enhance their reflections.

The ways in which I think action research and UK PSF are linked are as follows:
1. A self-review using the Dimensions of Practice can help to inspire topics for focus in an action research project. Effectively the framework can help identify areas of practice or understanding that could be usefully progressed. While self-development is a potential motivator for action research, care must be taken that this aim does not trump the needs of students.
2. Areas of Activity (A1-A4) can be directly enhanced through action research (and with direct benefit to students) e.g. a project to develop inclusive online learning spaces (A4)
3. The act of action research can be a way of contributing to professional development (A5) – although this does require a degree of openness in the outlook of the researcher. Without this open-mindedness the project simply becomes a problem-solving exercise, rather than something that really impacts individuals and their development.
4. Knowledge ‘about’ practice can be developed through the reconnaissance phase of the action research project. This stage is the initial scoping research that helps determine an effective course of action once an issue is identified. It may involve a literature review, collegial discussion and student engagement.
5. If action research is collaborative, as ideally it should be, then the process can facilitate a better understanding of the needs of individual students (V1); this can ultimately challenge personal beliefs about what we think our students like or need.
6. Action research is a direct contributor to ‘V3’ which relates to pedagogic research and scholarship.
7. Sharing action research can start to contribute to the process of influencing others, as is a requirement of Senior Fellowship. This might be through internal institutional events, papers, or other forums (although if you do this, don’t forget to gather evidence of impact as you go … what effect did your research have on others?).
8. Given the link between theory and practice in action research, this form of scholarship is one way to demonstrate point V on Descriptor 2 (Fellowship) and Descriptor 3 (Senior Fellowship).
9. The exploration of ethical dilemmas related to action research can relate the research to the wider context (V4). For example, questions about data usage and data protection have origins and implications beyond a practitioner research project.
10. Action research can assist with progression in UK PSF. Themes and issues that might be considered in a small project can be built up into a bigger body of work with more impact and influence. A small project can sow seeds for something greater.

Bittersweet TEF reflection

Yesterday I avoided comment on social media about TEF, because I had such mixed feelings about the exercise. We should celebrate great teaching, and if TEF helps with this then that’s a good thing, but we need to take care that TEF does not divide a community which is rich in collaboration, mutual encouragement and shared mission. Despite the well-reported limitations of the TEF data, universities and individuals have been excited by some outcomes. Being a party pooper, I’d question whether we should be so celebratory when good colleagues, in good institutions, doing important work, and in many cases providing excellent teaching on the ground, are now labelled as ‘third tier’. I struggle with that. It seems the institutions which fared less well have been left alone to point out the limitations of TEF, which, in the aftermath of the results, makes those speaking out look like sore losers. I don’t believe that is fair when we are all aware of the limitations of the measure. In our immediate celebratory reactions we are potentially signalling uncritical acceptance and alienating colleagues in the process. Some collective sensitivity may be part of the answer.

Another thought as the results came in was about what TEF does to existing alumni. I have degrees from bronze-, silver- and gold-rated institutions. By far the most impactful, rewarding and engaging of these was my ‘bronze-rated degree’ (from Liverpool, by the way). Does TEF devalue the sense of worth of existing qualifications? I fear it might. As a ‘bronze alumnus’ I feel outraged at this label and am clear that it absolutely does not reflect my experience. As we start to categorise universities in this way, we need to think about the consequences for past as well as future students.

Unfortunately the celebration of TEF results is bittersweet.

From BTEC to HE – reflections on student conversations

Recently I have been reflecting on the experience of students from vocational backgrounds who come to university. We know that, in some universities, success rates amongst BTEC students are lower than those from A-level backgrounds, but I am not sure that we really understand why this is the case, or what it is really like to be a student from a vocational background in higher education. I am presently trying to understand a little more about the VQ in HE (Vocational Qualifications in Higher Education) student experience. Some observations from my recent purposeful conversations with students are shared below.

  • The academic world can be confusing and stressful after a BTEC. The courses and expectations in HE are very different from those experienced previously but, on the upside, with assistance of the right type, students can be well prepared to thrive. Assessment is an area where key differences are felt. It is not just the profile of assessment types that may differ, but the culture of ‘submit-feedback-improve-resubmit’ that seems prevalent at level 3 but is often lacking at level 4 and beyond.
  • The types of support that can be useful include academic skills development, particularly focusing on:
    • Equipping students to understand the requirements of an assessment;
    • Teaching students how to break down a task to minimise feelings of being overwhelmed;
    • Developing time management skills;
    • Developing organisation skills;
    • Building confidence to rid the imposter syndrome (simple reminders that ‘you’re doing fine’ mean so, so much);
    • Revisiting class content – going over lecture notes;
    • Getting started with writing (e.g. providing structure for students to frame their own writing);
    • Locating reading and sources;
    • Referencing (supporting these skills, and not being pedantic when students are getting to grips with sourcing information).
  • Skills development and reduction of stress were often talked about together. Academic support and counselling skills sit side by side.
  • While VQ students may face challenges with specific aspects of their course, there may be many other aspects where they feel confident and have a good degree of mastery from their vocational qualification. This raises the question of whether more can be done by way of skills exchanges or peer mentorship between A-level and BTEC students. While we should be concerned about the achievement and experience of specific groups, we need to be very careful not to create a deficit and fix-it culture. More might be done to simply recognise the specific and valuable strengths brought to the mix by students from vocational backgrounds.
  • The idea that students from a BTEC background prefer coursework because it is ‘what they are used to’ does not tally with all of the discussions that I’ve had. Students tell me that they can quickly learn to thrive with the exam format. While the first one or two are nerve-wracking, again with support and revision strategies, many students can start to feel relatively comfortable with this type of assessment. The re-introduction of exam-format assessments is, at least according to those I have spoken with, less stressful when the content of the exams aligns with the coverage of their prior vocational qualification, rather than presenting entirely unfamiliar content and demands. While this insight into exam perception challenges my own assumption that BTEC students may not be comfortable with this type of assessment, it’s important to remember that perception/preference and actual learning gain are different things.
  • Through my student conversations I have been reminded of tutor actions that can be particularly useful for VQ students (and indeed all students) in preparing for exams:
    • Provision of past papers and signposting to these
    • Revision classes
    • Providing checklists of what topics should be revised
    • Highlighting key topics in class to guide revision focus
    • Providing model answers
    • Providing a booklet of practice questions
    • Providing a menu of revision techniques to encourage active revision
    • Comprehensive session resources shared in a format that can easily be revisited
  • Overwhelmingly the students that I have spoken with said the most important point about support is that they need it to be accessible, welcoming and friendly. The tone in which support is offered absolutely matters.
  • Handing in those early pieces of work is a really big step within the university experience. Having some kind of facility to have work reviewed before submission is seen as really valuable in removing fear and anxiety. There are of course many ways that such a step can be built into the formative feedback journey.

Undoubtedly all of these points could be addressed through a universal design approach to learning, whereby the curriculum, the classroom, the learning relationships and the online environment are intended to allow as many students as possible to reach their potential.

A UK PSF compatible framework for professional reflection

One ongoing challenge I have is around how to increase the depth of reflection on teaching practice (or indeed other professional practices) within the context of formal development programmes. Sometimes we use models of reflection to assist, including Gibbs’, Johns’ and Greenaway’s models. However, existing models, and even free-flow writing, have not always yielded in-depth reflections. Based on my own experience of supporting reflection across different professional groups, I have summarised three limitations of existing models of reflection.

1) A tendency to focus on iterative improvement with less emphasis on validation of practice

Models tend to steer the reflector to assess any issues that require a change in approach (plan-do-review and variations thereof); change is king. Based on experience, sometimes colleagues find that they don’t need to change but instead they can take value from affirming their practice and recognising what they do as effective or good. Affirmation and confidence in practice are as important as identifying points for change and development.

2) A limited engagement with the idea of governing variables

Reflection models can tend to encourage single-loop learning as critical incidents are located and considered. I always encourage anyone reflecting to consider what is within their remit and control, and to focus their attention accordingly rather than locating issues within the practice of others, particularly when this leads to a sense of blame or the shifting of responsibility for personal practice. Nevertheless, it can be very useful for some attention to be given to the constructive consideration of challenging the status quo and the operational norm. New(-ish) practitioners can often assess the context in perceptive ways as they have not necessarily been acculturated and institutionalised.

To encourage a focus on the constraints and context of practice is very different from shifting the focus of a micro reflection to others because it may be easier than examining one’s own practice. It means standing back and asking: what are the things around me that I need to challenge? (Challenge is key here – challenge, not change – and the answers may not be to hand.) Possible areas to challenge include policy and established ways of working. Whilst senior staff may be able to act on these realisations, new lecturers (or practitioners in other fields) may be less empowered or confident to take action. However, if institutional staff development is joined up, then the issues raised through these reflections can filter through course leaders and assessors for discussion elsewhere.

3) A tendency to focus on incidents rather than wider periods of personal transformation and growth

A third issue with existing models of reflection is that they tend to work on an incident-by-incident basis, i.e. take a critical incident and consider it in depth, resulting in learning or a change. This approach can be simplistic and fail to make connections between a range of events and practices. The resulting reflection therefore tends to be overly descriptive and sometimes forced. Instead I am now encouraging ‘compound reflection’ – looking back over a series of events or a time period and considering the resultant personal and professional growth. This is especially powerful for identifying personal learning about practice, and for recognising evolving beliefs and values. It should also provide a chance to review meta-learning, asking: what happened across this period to assist my learning? I am not convinced that this depth occurs on an incident-by-incident basis.

I am proposing an alternative reflection model to capture some of the points above. Essentially this encourages a focus on either an event/incident or a period of time identified by the reflector (e.g. across one term, or after a CPD programme). In the model, focus is drawn to three areas, which align with the UK Professional Standards Framework Dimensions of Practice. Individuals should separately attend to their activities/practice, knowledge and values/beliefs. The actual dimensions of practice can be used to further frame thinking. For EACH of those three areas, stimulating questions can be asked to encourage external or internal conversation. Affirmation, challenge and meta-reflection are all evident.

[Image: the reflection model]

Of course this is an early attempt at shaping up a framework to assist reflection, so any thoughts by reply are very welcome.

 

Download the reflection model here.

Effective feedback through GradeMark

This week I’ve been asked to share my own thoughts on how GradeMark can be used effectively. After some recent student interviews on this topic, here are my top ten points, some of which apply outside of GradeMark too.

  1. Give sufficient emphasis in feedback to content and context, as well as to structural issues in student submissions. Students sometimes report that feedback places too much emphasis on grammar, layout and spelling and not enough on content. Some tutors may have a preference for feeding back on what is easier to see.
  2. Avoid annotations without explanation (e.g. exclamation marks, highlighting, question marks) – students don’t know what shorthand annotations mean. Give ideas on what students can do to improve their work in future, e.g. ‘you might use a range of literature sources and compare the different ideas, rather than only providing one perspective’.
  3. Use the full features of GradeMark to help with feeding forward on how to improve and to provide links to further advice. This is almost impossible to achieve by hand, so use the full range of the tool’s features.
  4. Wherever possible make sure that feedback can help with the next assignment – know what students will study next, so you can make these links.
  5. Relate comments to criteria. Use the criteria explicitly in the feedback so that students can see how decisions about their work have been made. You could even use the colour feature within GradeMark to relate comments to different criteria, e.g. pink for analysis, green for evaluation, yellow for structure and writing style.

  6. Avoid using the term “I think ….” While we all know feedback has a degree of subjectivity to it, making links to the criteria should counter the impression that personal preferences determine marks. Students often feel that their marks depend on playing to the preferences of different tutors. This can be countered by consistent use of criteria.
  7. Use summative comments as well as annotation to draw attention to the most important areas for future development and/or any specific issues arising. Students may not always be able to prioritise amongst many annotations.
  8. To speed up your workflow, consider using voice-to-text facilities alongside GradeMark. This is especially simple on Apple devices, where the built-in speech-to-text is usable and accurate.
  9. Use common errors and issues from one year to pre-prepare comments for use in feedback in the following session (assuming adjustments in teaching don’t address everything). Because these are prepared in advance the advice can be more detailed and helpful and they can be used in teaching so that students can absolutely see what they should and shouldn’t do.
  10. Personalise generic comments wherever possible; students appreciate the interaction or dialogue through feedback when it feels personalised to them or, when anonymous, to their work – “I suppose because we are all in the same boat, I suppose it wasn’t really personalized because I suppose a majority of us get the same things wrong” (student quote, 2017). So if you aren’t personalising, advise the students why this is the case. If you do want to personalise, there are many ways to do this, including:

  • Using an audio comment in addition to generic text comments
  • Adding additional specific comments on to generic comments
  • Including a summary comment in addition to annotation

I’m sure there are other points, and some of these are already well rehearsed in the literature, though point 1 in particular is little explored elsewhere. 

 

The tricky issue of word count equivalence

The challenges of managing media-rich assessments, or managing student choice in assessment, have been evident in higher education for as long as I have been employed in the sector, and probably a lot longer. Back in 2004, when I worked on the Ultraversity Programme, the course team had an underpinning vision which sought to: enable creativity; encourage negotiation of assessment formats such that the outputs were of use; and develop the digital capabilities of students (a form of assessment as learning). We encouraged mixed-media assessment submissions for all modules. At that time we debated ‘the word count issue’ and emerged with a pragmatic view that alternative media should be broadly equivalent (and yes, that is fuzzy, but ultimately it helps develop the judgment skills of students themselves).

In the HEA-accredited PgC in Teaching and Supporting Learning that I now manage, we assess using a patchwork media portfolio. Effectively there are five components (including an evaluation of assessment and feedback practices, a review of approaches used in teaching or supporting learning, and a review of inclusive practices used), plus a stitching piece (a reflection on learning). The assessment brief describes what the students should show, but it is not prescriptive about the precise format. Each element has a word guide, which those working with alternative media should use as an indication of the size of the output and the effort to apply.


Where students opt for media-rich formats, they are asked to decide on equivalence. Close contact in class sessions provides a guiding hand on judgment, crucially with peer input (‘yes, that sounds fair’). Techniques to assess equivalence include taking a rough ‘words per minute’ rate and then scaling up. For other items, such as posters and PowerPoint presentations, I again ask students to use their own approximation based on effort. Because the students on this particular programme are themselves lecturers in HE, a degree of professional reflection is applied to this issue. We don’t ask for transcripts or supplementary text when an individual submits an audio or video format, because it can add considerable work and may be a deterrent to creativity.
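The ‘words per minute’ scaling technique can be sketched as a small calculation. This is only an illustrative aide (the function name and the 60–100 wpm band are my own choices for the example, not programme policy); the point of the exercise remains that students reach their own judgment.

```python
def equivalent_word_range(duration_minutes, wpm_low=60, wpm_high=100):
    """Rough written-word equivalence for a narrated recording:
    scale a plausible words-per-minute band by the running time."""
    return (round(duration_minutes * wpm_low),
            round(duration_minutes * wpm_high))

# A ten-minute narrated video sits roughly in place of a
# 600-1000 word written piece under these assumptions.
low, high = equivalent_word_range(10)
```

A student might then check their intended video length against the word guide for the patch and adjust either the script or the format accordingly.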

Media experimentation within this programme is encouraged because of the transformative effect it can have on individuals who then feel free to pass on less traditional, more creative methods to their students. I asked one of my students to share their thoughts having just submitted a portfolio of mixed media. Their comments are below:

My benefits from using media were:

  • Opportunity to develop skills
  • Creativity
  • More applied to the role I have as a teacher than a written report would have been
  • Gave ideas to then roll out into my own assessment strategies, to make these more authentic for students
  • Enjoyable and I felt more enthused to tackle the assignment elements

But I wouldn’t say it was quicker to produce, as it takes a lot of advanced planning. And, it was tricky to evidence / reference, which is a requisite for level 7. This is where I fell down a little.

I judged equivalence with a 60-100 words per minute time frame for narrative, and / or, I wrote the piece in full (with word count) and then talked it through. I think the elements that I chose to pop into video were those that were more reflective, and lent themselves better to this approach. With the more theoretical components, where I wasn’t feeling creative or brave enough to turn it into something spangly, I stuck with the written word. The exception to this was the learning design patch, where I wanted to develop particular skills by using a different approach.

This student’s comments very much match up with those made back in 2009 by Ultraversity students, who “without exception, felt that they had improved their technical skills through the use of creative formats in assessment” (Arnold, Thomson & Williams, 2009, p. 159). Looking back at this paper, I was reminded that a key part of managing a mixed picture of assessment is through the criteria; we said: “In looking at rich media, the assessor needs be very clear about the assessment criteria and the role that technology has in forming any judgments, so as to avoid the ‘wow’ factor of quirky technology use. At the same time he/she must balance this with the reward of critical decision-making and appropriateness in the use of technology. Staff and student awareness of this issue as well as internal and external quality assurance guards against this occurrence” (p. 161). This is exactly the approach taken within the PgC Teaching and Supporting Learning. Tightly defined assessment criteria have been very important in helping to apply consistent assessment judgments across different types of submission.

If we want to receive identically formatted items, which all address the learning outcomes using the same approach, then of course mandating a single format with a strict word count is the way to go. But if we want to encourage an attitude to assessment which fosters creativity in new lecturers, and which acts as a development vehicle for their own digital skills, then we must reduce concerns about word counts and encourage junior colleagues to develop and use their professional judgment in this matter. The student quote above shows the thoughtful approach taken by one student to address the issue for themselves.

Frustratingly, even by using word count as the reference point for parity we may ‘other’ some of the more creative approaches that we seek to encourage and normalise, but ultimately wordage has long been the currency of higher education. It is good to see some universities being proactive in setting out a steer for equivalence, so that individual staff do not feel that they are being maverick with word counts when seeking to encourage creativity.