Viva preparation

Having survived my doctoral viva on the EdD programme at the University of Liverpool, I wanted to share some of my own experiences in the hope that they might be useful to others.

  • To prepare for my viva, the first thing I did was take six weeks away from the research and from even thinking about the doctorate. This was an important preparatory step: it made me more objective about the thesis when re-engaging, and allowed me to come back with fresh eyes and a clear mind. I questioned whether this was a wise thing to do, as some advice says to keep reading around your topic in the gap, but I really valued the break.
  • I then read my thesis back (twice), from cover to cover. As I read, I annotated typographical errors. I decided not to berate myself for their presence, since that would be a distraction. Finding these minor typographical and phrasing errors early on was helpful, as it removed any sense that the viva would be the final step (it became clear that I would need to make modifications). Getting this realisation over and done with earlier in the process made my expectation management much easier.
  • As I read, I noted areas where I felt I should have said more. In particular, I noted areas where I had said more and then, for editorial reasons, cut back on the detail. In these cases I read through the words and diagrams that had been reluctantly cut from the final draft (I always saved copies of earlier drafts of each chapter). I reasoned that if I had struggled to cut certain sections, their eventual absence might be of concern to the examiners. Reacquainting myself with this material was invaluable. For example, I had cut out text which explained how the two strands of analysis in my study were synthesised. Re-familiarising myself with it during viva preparation allowed me to answer questions on this apparent gap in the thesis.
  • I used a number of websites to generate common viva questions. I found one from the University of Leicester particularly helpful.
  • Armed with the lists of questions, I generated answers in my mind. I did not write them down; frankly, I didn’t think that helped. I sometimes committed two or three questions to mind and mulled them over while driving. This was a useful exercise, as I could happily mumble answers to myself in the privacy of my car. Answering two or three at a time was enough to keep concentration; tackling more than three questions in one sitting was not particularly productive for me.
  • None of the ‘textbook’ questions came up, but without doubt these generic questions focused my attention and provided very good preparation.
  • I made a conscious effort to talk with others about my research before the viva. This gave me an opportunity to clarify my own understanding and to make the research accessible. Responding to my ten-year-old son’s question, ‘What is your thesis about?’, was actually the most challenging and the most valuable step in this process. He pushed me to be able to explain it in a way that he could understand.
  • Another useful pre-viva question to consider was ‘Which three works most influenced your research?’. Answering this really forced me to focus on how I had used different influences, which in turn brought further clarity to the themes and ideas within the work.
  • Keeping perspective was very important in getting ready. One side of my brain felt like my doctorate depended on the viva; the other side constantly reminded me that the thesis and viva are, in reality, part of an extended study journey and should not be seen, as with a PhD, as the only product of assessment. Essentially I was two-thirds of the way to success even without the project. This was a calming fact.
  • I found it really useful to ask myself ‘what would be the worst questions that could come up?’. Answering this is a real test of knowing your own limitations and those of your research. Sure enough, my worst question came up (after all, if I recognised this as a weakness in my research, surely others would too!). Having accepted this area as a weakness in advance, I was able to read around the issue and fill the gap. I was therefore comfortable on the day with defending my position, while accepting that some of the things I had learned through revisiting the issue would be usefully incorporated in modifications. This is a long way of recommending that viva candidates face up to their weakest areas in advance of the viva, and use the new-found impetus this phase of the journey brings to resolve any concerns that might have seemed unfathomable under the pressure to complete for submission.
  • Practically, I used Post-it notes to separate the chapters of the tome. This was useful for finding my way around the parts quickly when questions were asked. I also researched the possible outcomes of the viva, so that I was prepared to hear the judgement and absorb the critical information, rather than get lost in the terminology.
  • Finally, one of the biggest challenges was to manage my own, and my supporters’, expectations of completing the viva. While some friends/family/colleagues/strangers on a train congratulated me, with minor modifications outstanding I couldn’t fully celebrate. I had anticipated feeling like the viva was the end of the doctorate, but on the day it was just another milestone (albeit a significant one). This was a massive deflation; I wanted to keep the champagne on ice a little while longer. It was tricky to manage when others saw this as the finish line. In the end I settled on celebrating twice. In your mind, be clear whether you feel your doctorate is over after the viva, or when any modifications are in. For me it was the latter.

Using technology for student feedback: Lecturer perspectives – in their words

The document posted here is a collection of short narrative portraits constructed during my doctoral research, titled ‘Using technology for student feedback: Lecturer perspectives’. Within the study, fifteen participants were interviewed. Each told their story of how and why they used technology in feedback. This illuminated challenges in the development of academic practice, uncovered some of the ways in which feedback practice is formed, and showed some of the ways in which lecturers internally mediate technology selection.

Individual interview transcripts were reduced to portraits (essentially mini accounts), using a systematic and reflexive process articulated by Seidman (2013). The portraits themselves, and the process of data reduction, provided learning which fed into the wider analytical process. These portrait stories are not all included in the final thesis in their full form; however, given that narratives can provide instant knowledge (Webster & Mertova, 2007), I wanted to publish the collection. The participant portraits are presented here because they stand alone as insights into the formation of academic practice.

DOWNLOAD Participant stories – in their words

Seidman, I. (2013). Interviewing as qualitative research: A guide for researchers in education and the social sciences (4th ed.). New York: Teachers College Press.

Webster, L., & Mertova, P. (2007). Using narrative inquiry as a research method: An introduction to using critical event narrative analysis in research on learning and teaching. Abingdon: Routledge.

Preliminary findings – Lecturer experiences of choosing and using technology in assessment feedback

Part way through my thesis research I have stood back to ask: what is all this data saying? To this end I have produced a ‘pause for thought’ document about the emerging findings. This is not the finished output, but creating it helped me consolidate my thoughts, and in sharing it I hope for any comments that may help refine further analysis or additional data collection.

Please feel free to leave any questions or comments in response.

And thank you to those who have assisted me in setting up the next round of interviews after a recent plea for help.

Preliminary Research Findings available here.


Marrying mixed methods and critical realism

As I edited my thesis I needed to cut a short section I had written about how mixed methods can be highly compatible with critical realism. As my work emerged I did not, in the end, use mixed methods. This section was hard won, and simply deleting it felt like a waste. Therefore, attached to this post is a short piece about the compatibility of mixed methods and critical realism, in the hope that it may be useful to someone else.

Download the short article here: Marrying mixed methods and critical realism

Efficiency, technology and feedback

In considering staff experiences of choosing and using feedback technology, one of the emerging themes has been the differing views on feedback technologies and efficiencies. While the jury is still out on the data and the process is incomplete, my observations are that efficiency can be conceived in different ways in the negotiation of technology. For some, efficiency is a primary driver in the decision-making process: the search for technology and the refinement of its use is motivated and shaped by the quest for efficiencies. For others, efficiencies are a welcome benefit of technology – almost an unexpected gift – welcome, but not necessary. Efficiencies also appear to be conceived relatively; rarely are efficiencies discussed without reference to the relative enhancement gains that can be made through a technology. Wherever there is a time saving there is a tendency to ‘re-spend’ the saved time making still more enhancements to the feedback – adding detail and depth, for example. In this way efficiencies become difficult to identify: they are theoretically achievable, but in reality they are trumped by the possibility for improvement. Efficiency also seems to be a veto concept for some; it is not a particular concern in the run of practice, but is triggered only when a particular technology is likely to encroach on other activities or impose an intolerable stress.

Empathetic validity and action research in educational development

As part of my doctoral studies I have recently undertaken an action research project relating to strengthening approaches to feedback practice. Informal reconnaissance led me to believe that feedback practice is very siloed. At the same time, in the planning process I encountered a paper by Ball (2009), who showed that collaborative, practitioner-centred action research can in itself bring about the questioning of one’s own practice. Put simply, discussing the feedback practice of others shines a light on the way that each of us works; we then ask questions of ourselves and review how we might act differently.

Informed by this, my action hunch was that the development of an electronic sharing resource for good practice could be a mechanism for modelling good practice and bringing about transparency. Influenced by Ball, I envisaged that ensuring ownership of this resource by those who would use and populate it could act as a catalyst for critical dialogue around practice.

In seeking exemplary practice to populate the resource, it became clear that there were some issues to address first. The project got messy in the way described by Cook (2009). The intended action was therefore put on hold and became a second project phase. This was clearly emergent work in progress.

The action research methodology permits these off-piste directions, and in my search for good practice to populate the resource I generated four spin-off cycles to explore: What is good feedback in this context? How can feedback practice be developed? What are the barriers to developing good feedback practice? What conditions might be needed for the benefits of practitioner sharing to be realised?

I have taken away learning about all of these points, but by far the greatest learning from this research has been the development of empathy with those engaged in the process. I have a much better sense of their experience, and in staff development terms this is important for productive ways forward. My data was not vast and my conclusions didn’t add a lot to the already overflowing pool of literature on this topic, but it felt valuable. Trying to justify your research in terms couched in feelings is something that even I, as a self-confessed navel gazer, am not used to doing. In reading around this I was drawn to the work of Dadds, who described a phenomenon called empathetic validity, which refers to “the potential of the research in its processes and outcomes to transform the emotional dispositions of people towards each other, such that more positive feelings are created between them in the form of greater empathy” (2008, p. 208).

Whether empathy can really be incorporated as a project aim I am not clear; I imagine it either happens or it doesn’t. But its benefit for me has trumped any of the intended consequences of this project.

Ball, E. (2009). A participatory action research study on handwritten annotation feedback and its impact on staff and students. Systemic Practice and Action Research, 22(2), 111-124. doi: 10.1007/s11213-008-9116-6

Cook, T. (2009). The purpose of mess in action research: Building rigour though a messy turn. Educational Action Research, 17(2), 277-291. doi: 10.1080/09650790902914241

Dadds, M. (2008). Empathetic validity in practitioner research. Educational Action Research, 16(2), 279-290. doi: 10.1080/09650790802011973

Apps 2012


As I have progressed through my EdD my ways of working have got a little smarter. There are four apps that have served me well in 2012 for supporting my studies …

1. Reminders (so in the hours where I have too much to do I can remember what they were!)

2. GoodReader – managing my online library downloads and annotating my reading without reams of paper. By far the best reading app I have found (still).

3. GoodNotes – high levels of functionality, a great jotter and annotator – good for generating diagrams and mapping out thoughts.

4. Splashtop – allows my desktop (including EndNote) to be fully functional from my iPad or phone. Excellent when not wanting to be stuck at my desk.

Two more fab apps (not study related) for 2012 have been:

5. ScreenChomp – Jing for the iPad – great for audio-visual feedback for students, and again this means there is no need to be desk-bound.

6. Spelling – my best parenting app! So the kids can input the spelling list for the week and then run the tests until the spellings stick. Very motivational for kids who hate spelling.

Strategy and mission

Having pored over some fairly hard-going documentation and policy textbooks for a few days for assorted reasons, I was pleased to stumble upon a refreshing approach to writing mission statements, which I’m sure would work equally well for policy and strategy documents!

OK, so not in a million years will HE documentation ever take this flavour, but wouldn’t it be better if it did!


Concept mapping

I am asked increasingly about concept mapping software. I have previously favoured iThoughtsHD; however, while it is very intuitive, it is not so good at enabling inter-label links (something I only realised after a little time and intensive usage!). C-map was recommended to me as an alternative. Though not native to the iPad, it has a greater focus on the links rather than the labels, and in turn this helps the author to think about structure more than the brain dump. It forces the user to clarify: why is X connected to Y?

“A concept by itself does not provide meaning, but when two concepts are connected using linking words or phrases, they form a meaningful proposition” (Villalon & Calvo, 2011, p. 18).

C-map is downloadable for Windows and Mac and, wonderfully, is free.

Below is my own mind map to demonstrate C-map (though I am confident that there are better examples!!). Click to view.
Lydia's map of learning theory

Villalon, J. and R. A. Calvo (2011). “Concept Maps as Cognitive Visualizations of Writing Assignments.” Journal of Educational Technology & Society 14(3): 16-27.

5 reasons why giving pass/fail marks, as opposed to percentage grades, might not be a bad idea

1. Grades may be an inhibitor of deeper self-reflection, which is in turn linked to self-regulated learning (White and Fantone 2010). Grade chasing distracts from meaningful learning review (see also Dweck 2010). For real examples of this, some student views visible in the comments here are useful.

2. Research shows that performance is neither reduced nor enhanced by pass/fail grading systems (Robins, Fantone et al. 1995). For those worrying about a reduction in standards caused by the removal of grades, don’t!

3. Pass-fail grades are more conducive to a culture of collaboration, which in turn links to higher levels of student satisfaction (Robins, Fantone et al. 1995; Rohe, Barrier et al. 2006; White and Fantone 2010). The increased collaboration may be especially beneficial as preparation for certain professions which require high levels of cooperative working (as noted in a medical context by Rohe, Barrier et al. 2006).

4. Pass-fail counteracts challenges brought about by grade inflation practices (Jackson 2011).

5. Pass-fail is associated with lower student anxiety and higher levels of well-being (Rohe, Barrier et al. 2006). That has to be good!

Dweck, C. S. (2010). “Even Geniuses Work Hard.” Educational Leadership 68(1): 16-20.
Jackson, L. J. (2011). “Is my school next?” Student Lawyer 39(8): 30-32.
Robins, L. S., J. C. Fantone, et al. (1995). “The effect of pass/fail grading and weekly quizzes on first-year students’ performances and satisfaction.” Academic Medicine: Journal Of The Association Of American Medical Colleges 70(4): 327-329.
Rohe, D. E., P. A. Barrier, et al. (2006). “The Benefits of Pass-Fail Grading on Stress, Mood, and Group Cohesion in Medical Students.” Mayo Clinic Proceedings 81(11): 1443-1448.
White, C. B. and J. C. Fantone (2010). “Pass-Fail Grading: Laying the Foundation for Self-Regulated Learning.” Advances in Health Sciences Education 15(4): 469-477.