Some suggestions for how we can capture the learner voice in employer engagement arrangements

One of my roles at Harper is to act as Student Advocate for Employer Engagement courses. This is a new arrangement and so the role and associated procedures are under development.

Feedback channels for learners to have their voices heard will vary from course to course, depending on the size of the cohort, its distribution and the teaching methods employed (face to face, online, blended, manual/distance). I am developing resources to help course teams plan for student feedback. As a first step, here is a quick brainstorm of ways in which feedback could be gathered:

  • Online survey (anonymous) – e.g. SurveyMonkey, Bristol Online Surveys.
  • Online feedback form (provides an open text box for free comments and suggestions; anonymous and less structured than a questionnaire).
  • Paper survey (feedback forms, postal questionnaire).
  • Suggestion box in the online space (this may be a forum where thoughts can be added as the module/programme progresses).
  • Student representatives (appropriate when courses are in cohorts).
  • Programme/module student advocate (is there someone on the course team or within the partnership who could receive, collate and report feedback?).
  • Facilitated group discussion (either face to face or online) – how well is the module or programme meeting your needs?
  • Employer-based student advocate.
  • Post event/module text (text your thoughts and feedback).
  • Feedback ‘graffiti walls’ – quick comments. May be done physically via a flip chart or virtually using a Wiki.
  • Phone conference (of advocates and representatives or an open forum depending on numbers).
  • Highlight the route to the HEI’s Student Advocate.

What might be missing here?

Subtle evaluation, not so subtle wellies

One of my ongoing projects is to evaluate work-based learning at Harper. I started out with a very clear process in mind: each initiative would be considered in terms of its impact on staff, learners and employers. The methods available included interviews and questionnaires as well as focus groups: all very familiar. To facilitate engagement with learners, a more natural approach was needed than making cold calls, so I looked to squeeze a focus group into the time when learners were on-site. Time was tight when learners were on campus, however, so that was neither likely nor desirable.

Instead, I bought some new wellies (at 10 pm the evening before, in Tesco), dug out some waterproofs and joined their field trip. This took me back to Geography field trips of old! Despite having very inappropriate wellies, I very much enjoyed the trip and found it a very productive way of entering into a dialogue with work-based learners. The approach was not imposing (at least I hope not), and it made use of the time between field sites and briefings to chat to learners. The biggest challenge for me was retaining the detail, but I was able to take notes when the group were on task. This is an approach I would definitely use again.

So far the evaluation work is unfolding, though perhaps not in the sequential and tidy way I would like it to, but such is the nature of the beast.
As I go forward with this work I need to use whatever moments are available to gather data; planned or unplanned, data is still data! Evaluative approaches should be non-intrusive, proportionate, appropriate and grounded in listening.

Work-based Learning Impact Study: summary notes

Ahead of undertaking some sizeable WBL evaluation I wanted to cross-check my own evaluation design against the HEA Work-based Learning Impact Study. In doing this I have jotted down the key elements of that report for future reference.

Reasons to engage in WBL

Learners:
  • Validate and formalise experience
  • Career progression
  • Increased knowledge and understanding
  • Develop practical skills to perform better in a new or future role

Employers:
  • Develop knowledge, skills and expertise (job specific and generic)
  • Retention
  • Supplement and extend existing provision

Reasons for programme choice

Learners:
  • Flexible delivery
  • Cost
  • Pace
  • Convenience of delivery in the workplace
  • Relevance

Employers:
  • Fits into the work schedule
  • Opportunity to influence change in the workplace
  • Minimum time away from work
  • Addresses day-to-day issues

Impact

Learners:
  • Confidence at work
  • Confidence outside of work
  • Higher aspirations and motivation
  • Raised personal status
  • More self-aware; learning to think and to challenge assumptions
  • A greater awareness of particular issues
  • Developed new and enhanced existing skills
  • More likely to take stock of performance
  • Wider perspective on workplace issues
  • Better understanding of the workplace organisation

Employers:
  • Clearer organisational direction
  • Development of standards, policies and contracts
  • Improvements in quality
  • Increased innovation
  • Improved performance of employees, who require less direct support
  • Positive attitudinal and behavioural change in line with the values of the organisation (capability expanding)
  • External recognition and prestige

 

Key professional benefits of WBL

  • Better performance
  • Taking on responsibility
  • Changed jobs or secured promotion
  • Secured salary increase
  • Left able to see other points of view
  • Positive workplace thinking
  • Relieved stress and increased contentment
  • Able to coach others
  • Professional recognition or membership

NB: Reflective approaches were cited as critical for realising benefits at work.

Bite-size provision was acknowledged as a catalyst for further study.

Fuzzy Evaluation (evaluating work-based learning)

Evaluating work-based learning is increasingly appearing to be a messy business! With a range of developments to evaluate, I need to focus the planning of evaluation structures at a level which attends to what we want to know, why we need to know it and what process we must go through to gather and analyse the data. The messy bit is making choices about data collection, since initiatives vary by size, longevity, commercial sensitivity and teaching and learning arrangements (e.g. on campus, blended, third-party delivery). I like neat data; in a symmetrical world I would conduct parallel surveys for each case. However, decisions about data collection are going to have to be made on a case-by-case basis, using whatever means necessary to obtain meaningful data. I did so like the days of controlled experiments, when everything fitted into neat boxes 🙂

I have created a pattern of topics to address within four broad headings: evaluating teaching and learning, evaluating impact upon learners, evaluating impact upon employers and evaluating impact upon college staff. Data will be sought to address each area.

Then, for each work-based learning initiative, a defined process steers the evaluation (see the ‘evaluation process’ diagram).

It will be interesting to see how much consistency can be found in the methods of data collection across different WBL initiatives.

I have always liked Bassey’s term ‘fuzzy generalisations’. When evaluating WBL, I think we have a situation where comparisons between initiatives can only ever be fuzzy: imperfect but highly useful.