Having survived my doctoral viva on the EdD programme at the University of Liverpool, I wanted to share some of my own experiences in the hope that they might be useful to others.
- To prepare for my viva, the first thing I did was take six weeks away from the research and from even thinking about the doctorate. This was an important preparatory step: it made me more objective about the thesis when re-engaging, and allowed me to come back with fresh eyes and a clear mind. I questioned whether this was wise, as some advice says to keep reading around your topic in the gap, but I really valued the break.
- I then read my thesis back (twice), from cover to cover. As I read, I annotated typographical errors. I decided not to berate myself for their presence, since that would have been a distraction. Finding these minor typographical and phrasing errors early on was helpful, as it removed any sense that the viva would be the final step (it became clear that I would need to make modifications). Getting this realization over and done with early in the process made my expectation management much easier.
- As I read, I noted areas where I felt I should have said more. In particular, I noted areas where I had said more and then, for editorial reasons, cut back on the detail. In these cases I read through the words and diagrams that had been reluctantly cut from the final draft (I always saved copies of earlier drafts of each chapter). Logically, I figured that if I had struggled to cut certain sections, their eventual absence might concern the examiners. Reacquainting myself with this material was invaluable. For example, I had cut text which explained how the two strands of analysis in my study were synthesized. I had removed this detail in the editing process, but re-familiarizing myself with it during viva preparation allowed me to answer questions on this apparent gap in the thesis.
- I used a number of websites to generate common viva questions. I found one from the University of Leicester particularly helpful (see http://www2.le.ac.uk/departments/gradschool/training/eresources/study-guides/viva/prepare/questions ).
- Armed with the lists of questions, I generated answers in my mind. I did not write them down; frankly, I didn't think that helped. I sometimes committed two or three questions to mind and mulled them over while driving. This was a useful exercise, as I could happily mumble answers to myself in the privacy of my car. Answering two or three at a time was enough to keep concentration; tackling more than three questions in one sitting was not particularly productive for me.
- None of the ‘textbook’ questions came up, but without doubt these generic questions focused my attention and provided very good preparation.
- I made a conscious effort to talk with others about my research before the viva. This gave me an opportunity to clarify my own understanding and to make the research accessible. Responding to my ten-year-old son’s question, ‘What is your thesis about?’, was actually the most challenging and the most valuable step in this process. He pushed me to explain it in a way that he could understand.
- Another useful pre-viva question to consider was ‘which three works most influenced your research?’. Answering this really forced me to focus on how I had used different influences; in turn, this brought further clarity to the themes and ideas within the work.
- Keeping perspective was very important in getting ready. One side of my brain felt like my doctorate depended on the viva. The other side constantly reminded me that the thesis and viva are, in reality, part of an extended study journey and should not be seen, as with a PhD, as the only product of assessment. Essentially I was two-thirds of the way to success even without the project. This was a calming fact.
- I found it really useful to ask myself ‘what would be the worst questions that could come up?’. Answering this is a real test of knowing your own limitations and those of your research. Sure enough, my worst question came up (after all, if I recognised this as a weakness in my research, surely others would too!). Having accepted this area as a weakness in advance, I was able to read around the issue and fill the gap. I was therefore comfortable on the day with defending my position, while accepting that some of what I had learned by revisiting the issue could usefully be incorporated in modifications. In short: face up to the areas you know are weakest, in advance of the viva, and use the new-found impetus that this phase of the journey brings to resolve concerns that may have seemed unfathomable under the pressure to complete for submission.
- Practically, I used Post-it notes to separate the chapters of the tome. This was useful for finding my way around the parts quickly when questions were asked. I also researched the possible outcomes of the viva, so that I was prepared to hear the judgment and absorb the critical information, rather than get lost in the terminology.
- Finally, one of the biggest challenges was to manage my own, and my supporters’, expectations of completing the viva. While some friends/family/colleagues/strangers on a train congratulated me, I couldn’t fully celebrate with minor modifications outstanding. I had anticipated feeling like the viva was the end of the doctorate, but on the day it was just another milestone (albeit a significant one). This was a massive deflation. I wanted to keep the champagne on ice a little while longer, which was tricky to manage when others saw this as the finish line. In the end I settled on celebrating twice. In your mind, be clear whether you feel your doctorate is over after the viva, or when any modifications are in. For me it was the latter.
As part of my doctoral literature review I endeavoured to collate the benefits and limitations of different technologies for use in student feedback. The file below briefly summarises the findings, and can be used as a reference point for anyone evaluating different tools.
Feedback tools evaluation table
I have put together a short guide to action research, particularly to support colleagues in higher education who may be undertaking action research for the first time. It is absolutely not intended as a substitute for the literature, but is offered as a ‘first stop’ for anyone contemplating this methodology. It offers practical ideas and tips, and seeks to answer some of the key questions that I understand new action researchers to have. As ever, any feedback, additional inclusions or suggestions for revision would be welcome.
Download the booklet here: Action Research Introductory Resource
In considering how to support curriculum design for new programmes I have developed a question framework for course design teams to use to help them to deliberate and discuss the shape of new programmes. It tries to encourage a balance of looking back at what has worked before, and looking forward at how designs could be improved. It encourages discussion in the context of the discipline, but focuses on the underpinning structure of the design rather than specific content.
Download the course design think sheet
Looking at lecture capture led me to ask questions about the technology’s effectiveness. I can’t help feeling that lecture capture is counter-intuitive: we know transmission-based learning is less effective than active learning (so why would we invest more in it and replicate it?), and we know that concentration spans for online engagement don’t readily lend themselves to hour-long broadcasts (my own concentration gives way to frustration after 15 minutes!). Nevertheless, adoption is on the increase, and students clearly appreciate the opportunity to apply catch-up TV principles to learning – they value the flexibility.
As lecture capture heads towards the mainstream, I thought it useful to look at the evidence of the benefits and challenges of this technology, especially in light of a prediction that we may begin to move away from capturing lectures to viewing lectures as performances – something Professor Phil Race constantly emphasises with the idea of making the lecture unmissable and engaging.
My reading notes can be downloaded but the headline points were:
- More research is needed into the actual, rather than perceived, effectiveness of lecture capture.
- Students appreciate lecture capture and believe it helps learning, but the actual impact is unclear. Critically, there is little or no evidence that lecture capture really impacts performance. Some subsets of users appear to show higher scores, but this may be associated with their diligence rather than with their heavy usage of downloads.
- The circumstances in which lecture capture is effective, and the reasons for it, are also unclear. Research suggests that content-heavy subjects are best suited to this technology and interactive subjects less so, which makes good common sense. By implication, this raises the question: would lecture capture lead to a less interactive delivery style?
- Lecture capture is suspected of having a connection with more effective note taking, and students appear to watch lectures selectively to address tricky concepts. These recurrent findings, irrespective of the growth of lecture capture, point to the value of addressing how students take notes as an academic skill, and raise the question of how media can be used to address difficult concepts in watchable, demystifying, even (dare I say) enjoyable ways.
If they are useful please help yourself to my lecture capture quick notes.
The document posted is a collection of short narrative portraits constructed during my doctoral research, titled ‘Using technology for student feedback: Lecturer perspectives’. Within the study, fifteen participants were interviewed, each telling their story of how and why they used technology in feedback. This illuminated challenges in the development of academic practice, uncovered some of the ways in which feedback practice is formed, and showed some of the ways in which lecturers internally mediate technology selection.
Individual interview transcripts were reduced to portraits (essentially mini accounts) using a systematic and reflexive process articulated by Seidman (2013). The portraits themselves, and the process of data reduction, provided learning which fed into the wider analytical process. These portrait stories are not all included in the final thesis in their full form; however, given that narratives can provide instant knowledge (Webster & Mertova, 2007), I wanted to publish the collection. The participant portraits are presented here because they stand alone as insights into the formation of academic practice.
DOWNLOAD Participant stories – in their words
Seidman, I. (2013). Interviewing as qualitative research: A guide for researchers in education and the social sciences (4th ed.). New York: Teachers College Press.
Webster, L., & Mertova, P. (2007). Using narrative inquiry as a research method: An introduction to using critical event narrative analysis in research on learning and teaching. Abingdon: Routledge.
Part way through my thesis research I have stood back to ask: what is all this data saying? To this end I have produced a ‘pause for thought’ document about the emerging findings. This is not the finished output, but in creating it I managed to consolidate my thoughts, and in sharing it I hope for any comments that may help refine further analysis or additional data collection.
Please feel free to leave any questions or comments in response.
And thank you to those who have assisted me in setting up the next round of interviews after a recent plea for help.
Preliminary Research Findings available here.
While editing my thesis I needed to cut a short section I had written about how mixed methods can be highly compatible with critical realism. As my work emerged I did not, in the end, use mixed methods. This section was hard won, and simply deleting it felt like a waste. Attached to this post, therefore, is a short piece about the compatibility of mixed methods and critical realism, in the hope that it may be useful to someone else.
Download the short article here: Marrying mixed methods and critical realism
In considering staff experiences of choosing and using feedback technology, one of the emerging themes has been the differing views on feedback technologies and efficiencies. While the jury is still out on the data and the process is incomplete, my observation is that efficiency can be conceived in different ways in the negotiation of technology. For some, efficiency is a primary driver in the decision-making process: the search for technology and the refinement of its use is motivated and shaped by the quest for efficiencies. For others, efficiencies are a welcome benefit of technology – almost an unexpected gift – welcome, but not necessary. Efficiencies also appear to be conceived relatively; rarely are efficiencies discussed without reference to the relative enhancement gains that can be made through a technology. Wherever there is a time saving, there is a tendency to ‘re-spend’ the saved time making still more enhancements to the feedback – adding detail and depth, for example. In this way efficiencies become difficult to identify: they are theoretically achievable, but in reality they are trumped by the possibility of improvement. Efficiency also seems to be a veto concept for some; it is not a particular concern in the run of practice, but is triggered only when a particular technology is likely to encroach on other activities or impose an intolerable stress.
My ‘work in progress’ thesis for my doctoral studies at the University of Liverpool is entitled ‘Faculty experiences of feedback technology: A critical realist perspective’. I have had a personal interest in feedback technology for some time, and through a process of practice-based research and reviews of the literature it became clear that the lecturer or faculty voice is under-represented. Feedback technology often appears to be evaluated in terms of the student benefit rather than the experience of staff engagement. While there is a lot going on in the sector around making digital forms of feedback systematic, I was keen to discover what happens when staff have a choice about the technology employed in feedback.
My research therefore asks:
- What influences the choice and use of technologies for feedback?
- What are the reflective processes through which practices develop?
- What is the impact of faculty engagement with feedback technology?
Beyond answering these questions the research also seeks to shed light on beliefs about feedback and faculty relationships with technology for pedagogic practice.
So far I have undertaken ten narrative-style interviews in which staff engaged with a range of technologies (including Jing, GradeMark, Dragon, PebblePad, Track Changes and audio) have shared their motivations for engagement, some of the barriers to practice, and some of their underpinning beliefs about both feedback and technology. They have shared the deliberative process through which technologies were chosen, and they have shed light on the institutional factors which shape practice.
The critical realist approach to analysis (particularly using the work of Margaret Archer as a theoretical framework) is proving particularly revealing of the complexity of interaction between individuals, institutions and the wider environment.
I will release parts of the research on this blog as they develop, but I am hoping to interview a few more individuals who are involved in using technology for feedback within UK HEIs (and will offer a voucher as a token of appreciation). If anyone can help I would be very pleased to hear from you at firstname.lastname@example.org, and I can provide more details.