Reflecting upon what makes a university operationally ready to engage with employers in collaborative provision, I have pulled together a list of “good conditions for employer engagement”. Its purpose is simply to provide prompts for discussion about how institutions can prepare to engage with employers to meet the development needs of the workforce. This list assumes that there is already a strategic-level commitment in place, and it is of course not exhaustive.
I have been involved in informally supporting a number of HEA Senior Fellowship applicants. I thought it helpful to share my thoughts and feedback on their draft case studies, in the hope that it might also be useful for others.
Common issues have resulted in the following advice:
• Consider the impact of each case with evidence – so, for example, if someone is describing their proactive and innovative approaches to feedback, it is necessary to say how they know their approach is working well.
• Consider the impact of the case on your own values and attitudes – as a result of involvement in the case, did you rethink any assumptions about your role or your approach to supporting learning? For example, the person using innovative approaches to feedback as their case might say that student engagement led them to change their assumptions about students’ expectations: they discovered their students valued brief, timely feedback more than extensive feedback that takes longer to deliver, and this in turn led them to adjust their feedback practice.
• Consider different types of impact. The impact of the case may be on students, colleagues, the institution and its policies/systems, the discipline community or network or the sector as a whole. A Senior Fellowship application should really include evidence of institutional impact.
• Give detail about individual contributions when working on joint projects.
• Give detail about the specific context; sometimes there is a need to make the institutional and discipline context more explicit. Sticking with the feedback example, there might be a need to describe how the practical emphasis of the curriculum acts as an influence on the approach taken to feedback.
People approach the case studies in very different ways, but I would recommend some kind of structure to ensure the account goes beyond description. A proposed structure would be something similar to the following:
- Describe the activity and an outline of the rationale
- Consider what worked well and what not so well in practice
- Identify what you learnt through involvement in this activity and note any changes to your prior assumptions
- Describe the impact, but ensure the account points to sources of evidence – student feedback, colleague feedback or external feedback are all fine
- Describe what the next steps might be to advance this area of practice even further.
This of course is just one way to tackle the task and I’m sure there are many more!
During a summer of local and international educational development workshops, ‘Jing’ has had many outings. I was struck by how this really simple facility never fails to make people say, “wow, I can really use that”. It’s a low-ceiling technology with transformative potential. From my summer ‘tour’, here are some thoughts on how Jing can be used productively by those involved in teaching and supporting learning.
Formative feedback – an assignment walk through
As described here and also by Russell Stannard, Jing can be used to offer feedback on actual assignment work by enabling a visual-voice combination to be used. Feedback can be given and related to the assignment on screen. Early signs are that this approach is widely enjoyed by students, who particularly value the ability to play and replay the feedback; the personal tone of the feedback; and the privacy and convenience of getting the feedback in a location that suits them.
Best bits and “no no’s”(one to many feedback)
To feed forward and enable one group to learn from another, Jing can be a way of presenting good practice and things to avoid. This needs a little care to avoid showing individuals up, but with careful doctoring any ethical issues can be avoided! This can be used as a group feedback method, and can be a useful interim form of feedback when individual comments can’t be provided in time to be useful for the next assessment. Such videos can be added to the VLE or sent directly to students.
In addition to providing feedback, Jing can assist with directly facilitating corrections. This, I find, is particularly helpful with very specific and detailed tasks. The visual element can help enable the recipient to use tools to make future changes. An example would be a student who has issues with alignment being shown how to use the facility in Word, which shows the spaces and tab marks. In discussion with colleagues I am advised that the same principles may carry to correction of language or sentence structure.
Lots of attention is being given to teacher-led Jing feedback, but this is freeware and as a result can be easily utilised by students giving peer-to-peer feedback. This might even help with communication skills and confidence.
Summative feedback – a tour of the mark sheet
Students have fed back their desire to know how marks have been allocated. One way this can be brought about is through the use of Jing to discuss the mark itself; perhaps by the tutor talking through the feedback sheet, one section or outcome at a time. In this way Jing can be a useful complementary technology.
A reflective tool for students (an audio layer in the battle against plagiarism)
One of the ways we can mitigate plagiarism, and encourage learners to reflect on their learning processes, is through the inclusion of an annotated bibliography in any assignment. As an alternative, perhaps catering for different styles and preferences, students could review their own assignment and create a walk-through of any difficult points, any areas that they feel could be improved and anything they would do differently in future. They could also comment on how they found particular readings cited in their work.
Recalling assumptions (project management tool)
Since part of my role is project management, Jing also helps with remembering what we did and why. A two-minute voice-over on a spreadsheet means that when we go back and wonder how on earth we arrived at x, y or z, we have the detail captured from the moment. Jing is now, therefore, becoming a favourite of accountants and data managers as well as teachers!
As part of my doctoral studies I have recently undertaken an action research project relating to strengthening approaches to feedback practice. Informal reconnaissance led me to believe that feedback practice is very siloed. At the same time, in the planning process I encountered a paper by Ball (2009), who showed that collaborative, practitioner-centred action research can in itself bring about the questioning of one’s own practice. Put simply, discussing the feedback practice of others shines a light on the way that each of us works; we then ask questions of ourselves and review how we might act differently.
Informed by this, my action hunch was that the development of an electronic sharing resource for good practice could be a mechanism for modelling good practice and bringing about transparency. Influenced by Ball, I envisaged that ensuring ownership of this resource by those who would use and populate it could act as a catalyst for critical dialogue around practice.
In seeking exemplary practice to populate the resource it became clear that there were some issues to address first. The project got messy in the way described by Cook (2009). The intended action therefore was put on hold, and became a second project phase. This was clearly emergent work in progress.
The action research methodology permits these off-piste directions, and in my search for good practice to populate the resource I generated four spin-off cycles to explore: What is good feedback in this context? How can feedback practice be developed? What are the barriers to developing good feedback practice? What conditions might be needed for the benefits of practitioner sharing to be realised?
I have taken away learning about all of these points, but by far the greatest learning from this research has been the development of empathy with those engaged in the process. I have a much better sense of their experience, and in staff development terms this is important for productive ways forward. My data was not vast and my conclusions didn’t add a lot to the already overflowing pool of literature on this topic, but it felt valuable. Trying to justify your research in terms couched in feelings is something that even I, as a self-confessed navel gazer, am not used to doing. In reading around this I was drawn to the work of Dadds, who described a phenomenon called empathetic validity, which refers to “the potential of the research in its processes and outcomes to transform the emotional dispositions of people towards each other, such that more positive feelings are created between them in the form of greater empathy” (2008, p. 208).
Whether empathy can really be incorporated as a project aim I am not clear – I imagine it either happens or it doesn’t – but its benefit for me has trumped any of the intended consequences of this project.
Ball, E. (2009). A participatory action research study on handwritten annotation feedback and its impact on staff and students. Systemic Practice and Action Research, 22(2), 111-124. doi: 10.1007/s11213-008-9116-6
Cook, T. (2009). The purpose of mess in action research: Building rigour though a messy turn. Educational Action Research, 17(2), 277-291. doi: 10.1080/09650790902914241
Dadds, M. (2008). Empathetic validity in practitioner research. Educational Action Research, 16(2), 279-290. doi: 10.1080/09650790802011973
Yesterday I struggled to explain accreditation services and the point of them. So I thought it worth a little time to think about my clarity of message. My verbal inadequacy coincided with me stumbling upon a post from Ruth which showed the importance of scientists and academics writing and communicating in an accessible way. I have had fun exploring the quirky writing tool that forces us to communicate in less complex ways by using only the 1,000 most-used words. The beauty of this approach is neatly captured by the explanation of the Saturn 5 rocket in words that help even me to understand such complexity.
So here is my attempt at explaining the point of accreditation using the upgoer5 method.
To help people learn so that they can be better at their jobs we can make courses so people can leave behind their work for a day or two and spend time together, learning. They come along and share ideas and have a go at doing stuff, listening to others or talking together. After the course they might do something which shows that they have learnt things. That way others will know they were paying attention.
We have some courses that people can join or we can make one just for your work place, for people who do the same things or work in a team together. To do that we will need to know what stuff you want to learn, what ways you want people to act when they have done the course and how you want people to actually learn together (you might want to sit in a room together and talk, you might want to go out to new places or you may want to use computers). This helps make a plan which can be taken away by people who know about stuff that you want to know about and be turned into an amazing course. But you might not want us to make a course because you might be pretty good at helping people to learn too.
Another way people learn is when they take courses at work. They learn by listening to important people telling them things, by trying out new ways of doing things, by thinking through how things might be done better, by reading, playing games and by completing things on the computer. Instead of saying this learning is not as good as learning done here we can take a look and see if it is like the learning we do but just in a different place, with different people. If this is the case then we will be able to recognize your learning so that you have a certificate like students who do courses here.
Checking out your course can be hard – it means we need to send each other lots of paper and it can make people annoyed when we have to ask so many questions. It is however really important because when we are done, you know that the people doing the courses really should have their reward.
It’s really important for us to work in these different ways because we know that there are lots of people who do really good work and are really great at learning. We don’t always like the way we have to check on the way your courses are going because it can feel like we are trying to put our ideas on to you, but this also makes sure that we can show the outside world how good people are at learning.
Whilst my text here is highly imperfect it does help me to reflect on the words I use and the way that working groups assume a common language that is impenetrable to others.
Today I received confirmation that my Senior Fellowship application to the HEA had been accepted. I thought it helpful to share a few insights into the application process with others who might be considering the individual entry route.
There are three ways to achieve recognition with the HEA –
- Individual entrance route
- In-house accredited CPD schemes
- Through recognised qualifications (e.g. a PgC which leads automatically to Fellowship)
In 2011 the HEA launched new levels to their recognition scheme (there are now four levels of recognition: Associate Fellow, Fellow, Senior Fellow and Principal Fellow). I suspect many institutions will take a little while longer to develop internal CPD systems for the two new levels of recognition and so for now the individual route may remain the favoured approach for some colleagues.
I completed a Fellowship by individual entry in 2007 and have taken the same route for the Senior Fellowship. The application comprises a 6,000-7,000 word reflection on practice and two ‘case studies’ of practice. It needs to demonstrate all of the areas of activity, all the professional values and all areas of knowledge as described by the Professional Standards Framework. It is not enough to list how you meet each requirement – there is a need to show how you apply the knowledge and values in practice. The biggest challenges in the application process were:
- Managing the time needed (which is significant)
- Going beyond description in the account to ensure sufficient reflection
- Selecting areas of practice on which to reflect
The process requires some detailed planning. The approach I took was to begin with the activities. I simply plotted out what I did against each of the headings listed. So, against ‘assessment and feedback’ I located my roles, projects and practices (present and past), likewise I asked myself what had I done in the area of learning design and developing student guidance and so on for each activity area. Simply creating a list provides the raw material for the reflection.
The next thing I did was to place the ideas into chronological order so they made sense in terms of my personal progression – of course doing this showed up duplications and sparked additions. Initially I planned to create a matrix to ensure I had all values and areas of knowledge covered, but this was quite limiting and made a tick-box exercise of the process. Instead I took each activity on my list and reflected by asking a series of questions around each point, including:
- What did I do? (description of the activity)
- Why did I choose to work this way? What shaped the decision? (was it the influence of a colleague, a particular belief, a policy, an engagement with a particular academic idea or theory or case studies from elsewhere). What knowledge and understanding informed this way of working? **
- How was this way of working beneficial to students, colleagues and/or others (including industry partners)?
- What was the impact? How do I know this approach was working well?
- What was learned about working this way? Are there things in future that need to be done to refine this approach further?
**these were the most important questions as they gave opportunity to review both values and knowledge
I then tagged the emerging narrative against each of the framework requirements by adding “(v1, k3)” – these are the labels given to the framework requirements (k = knowledge, v = value). These tags were added where I believed my reflection demonstrated the criteria. By doing this I was able to see where the gaps were. My original draft was lacking in v4, for example, and so I was able to track back and ask where, in my activities, I drew upon this value.
The case study elements (perhaps these should be renamed, since they are more like illustrations of practice) are a thicker, more focused description of things that you have done. I considered these to be a zoom lens on two areas of my practice. I could adopt the same approach as for the general narrative but had more space to provide more detailed description and reflection.
During the process it was really important for me to have a critical friend who could chat through the sticking points and offer me feedback. I had hoped the application would take a day if I chained myself to my desk, but in the end it took much longer. Overall the process has been valuable (if a little intense) – it provides a useful opportunity to look back at what has been achieved, and I was particularly pleased to see that my values had not waned too much! It was quite motivational to retrace my steps over years of practice, and it was also helpful in informing planning for new CPD. While I was frustrated by the time commitment needed, without this dedicated ‘thought space’ the benefits of the reflective process would not have been realised.
Having explored a number of Jisc and independent projects on assessment and feedback, I have selected ten points which emerge as being critical for making feedback work.
- Timeliness is critical if feedback is to be useful; consider whether quality management systems are blocking the use of feedback. 4 weeks is too slow!
- Criteria are important but they can also promote tactical learning as students learn to be selectively negligent (Gibbs, 2012). Use criteria but don’t be locked in by them.
- Curriculum design is important. Too many assessments and too many small modules do not encourage deep engagement, and the associated assessment does not capture sufficient study time so as to be meaningful.
- Feedback should feed forward. It is important that staff know something about what comes next for each student. Without this understanding, the opportunity to be purposeful in feedback is severely limited.
- Little and often is better than more and infrequent.
- There is only so much individual tutors can do – there is a need to consider programmes and course suites.
- Feedback that refers to material that will not be studied again tends to be ignored.
- Where feedback stimulates dialogue about learning, the feedback is perceived as being more useful; this is where feedback crosses the line into personal development, with students taking control.
- Reflection on feedback makes learners more autonomous. It reduces dependence on the teacher as giver and makes the student an active part of the feedback process.
- Assessment diversity is good, but too much renders the feed forward process meaningless. Limit variety so progression is meaningful.