Back to research – data collection phase

Today I published a questionnaire to researchers on the BA Learning Technology Research course to capture their experience of patchwork media; this will inform some ongoing research that I am undertaking in this area. As a researcher, I always find it useful to fill in questionnaires or participate in other people’s research, for several reasons. Firstly, it allows me to sit on the other side of the fence and, in my own mind, to explore the design of the research instrument (was that the best way to ask? would a different form of question have been useful here or there?). Secondly, where technology is used in data collection, it is also a good window on the tools of data collection. Thirdly, in answering the questions of others I am able to recall areas of my practice and learning that might otherwise lie forgotten; it is useful to bring to mind ideas which others value, for it helps us to realise the value of our experience. Finally, the inputs made to other people’s research can trigger moments of realisation about areas which I may further develop or research. For all the science and craftsmanship of data collection, there is a huge reliance upon the willingness of others to take part. So I am crossing my fingers and hoping for the best.

Evaluating Work-based Learning

In thinking about how work-based learning should be evaluated, I considered a range of issues …

Firstly, the evaluation design for WBL is determined by the purpose of the evaluation: is it for marketing, for research, to inform the provision of teaching/learning opportunities, or to establish a cost-benefit analysis? It may, of course, be possible to use one data set for multiple purposes, and this may be reflected in the design.

Secondly, different agents may act as evaluators, and again this may be determined by the purpose. The evaluator may be academics seeking to improve provision, the employer seeking value or specific delivery, or an external body conducting quality assessment or funded research. An additional evaluator of WBL may be the learners themselves.

The learner as an evaluator of WBL can be built into assessment design. For example …

  1. Through the inclusion of reflective writing tasks, or stitching in the patchwork approach, the learner critically evaluates the effect of the learning on themselves: on their knowledge base, skills and confidence, for example.

  2. Through the inclusion of specific learning activities, learners can critically reflect upon the achievement of standards. For example, in asking learners to evaluate themselves against the graduate learning outcomes, they undertake self-evaluation; in addition, critical reflection thereupon provides a learning opportunity and thereafter an assessment opportunity (on the academic qualities of the reflection). In both the routine reflection and the task-based reflection, the learner is learning to take control of their learning and developing skills in evaluation and planning.

  3. Through the inclusion of learning outcomes such as “Evaluate the impact of a major project, intervention or change”, the learner is evaluating impact outside of themselves. They develop skills and knowledge around the assessment of impact; for example, they may use theoretical models or evaluative frameworks.

The learner as evaluator is in keeping with the vision of the learner taking control, engaging in meta-learning and developing skills for a complex economy.

More traditionally, the evaluation of WBL is performed upon (rather than by) learners and upon the employing organisation. In both instances studies may be momentary, providing a snapshot, or they may be longitudinal. In both cases (whether evaluating impact upon learners or organisations) different channels are identifiable. These can be labelled informal practice-based evaluation, process-driven evaluation, or occasional evaluation.

|            | Impact of WBL upon learner | Impact of WBL upon organisation |
|------------|----------------------------|---------------------------------|
| Informal   | Feedback from the tutor relationship | Feedback from the working relationship, facilitated by a culture of openness and honesty |
| Process    | Module evaluations (with impact section), course exit surveys | Stakeholder forums, key informants, satisfaction surveys |
| Occasional | One-off research surveys/projects | One-off research surveys/projects |


I would tentatively suggest that the reaction time to evaluative feedback varies with each channel, being shorter for the informal and longer for the occasional, though perhaps practitioners need to ‘listen harder’ to hear the informal.

In summary, evaluating work-based learning requires a set of decisions, which are represented below.