Friday, 1 December 2017

Purple Pens: Enhancing Assessment Literacy and Student Engagement with Feedback through Students Writing Their Own Feedback

Dr Dave McGarvey, School of Chemical and Physical Sciences, Keele University

From an abstract submitted to https://www.heacademy.ac.uk/stem-conference-2017-poster-abstracts

The aim of this session is to describe and illustrate our experiences of a deceptively simple but effective strategy for improving the quality and timeliness of assessment feedback in large classes: a tutor-led dialogic technique in which students write their own feedback using distinctly coloured pens. The objective is to stimulate discussion of the distinctions between passive receipt of tutor-written feedback and students writing their own feedback in a tutor-led dialogic environment, with a view to further enhancing students’ engagement with feedback and their feedback literacy.
The UK Quality Code for Higher Education (Chapter B6) articulates indicators of sound practice as a basis for effective assessment [1]. Indicator 6 (developing assessment literacy) states:

‘Staff and students engage in dialogue to promote a shared understanding of the basis on which academic judgements are made’ [1].

This is followed by a narrative that commences: ‘Engaging students, and making use of examples and/or self and peer assessment activities, where appropriate, helps students to understand the process of assessment and the expected standards, and to develop their assessment literacy’ [1] and this captures the essence of the work described here. The use of dialogic feedback cycles provides examples of alternative approaches [2].

In the physical sciences, regular high-value, low-stakes paper-based assessments that typically involve calculations, analysis and interpretation of scientific observations and data (e.g. problem sheets, in-class tests) are common practice. Tutors marking such assessments invariably observe common errors and misconceptions, so much of the feedback provided is repeated again and again, a problem exacerbated in large classes. The desirability of rapid turnaround times, coupled with large class sizes, makes detailed feedback difficult to provide, and its effectiveness is undermined when students do not understand the feedback, or do not read it at all and look only at their mark.
We (David McGarvey, Laura Hancock, Katherine Haxton, Michael Edwards, Martin Hollamby) have recently trialled a tutor-led dialogic self-assessment method to enhance assessment literacy and feedback in selected high-value, low-stakes 1st-year Chemistry assessments. The approach comprises: (i) tutors surveying (but not marking, or writing feedback on) completed assessments to inform the feedback to be provided; (ii) prompt return of the unmarked work, together with a distinctly coloured pen, under controlled conditions; (iii) an interactive tutor-led assessment session, during which students mark and write feedback on their own work with the distinctly coloured pen; (iv) collection of the scripts to review marking and feedback annotations; and (v) return of the work within a subsequent timetabled session.

From a detailed evaluation we have learned that students value this approach to provision of feedback for a variety of reasons, not least that the students have some autonomy over the feedback and can engage in dialogue with the teacher and peers.

‘I can make notes that make sense to me/explain things in the way that I understand them’ (Keele student)
It is also quite efficient and provides an insight into students’ engagement with feedback. Detailed outcomes of the final student and tutor evaluation and examples [3] will be presented and discussed. Practical advice on adapting the methodology will be provided.

1. UK Quality Code for Higher Education, Chapter B6 (2013), http://bit.ly/2acRADP.

2. Chris Beaumont, Michelle O’Doherty & Lee Shannon (2011). ‘Reconceptualising assessment feedback: a key to improving student learning?’, Studies in Higher Education, 36(6), 671–687.

3. We thank Lydia Bennett (Keele chemistry undergraduate) for helpful comments and permission to use her annotations as examples.


Creative Commons License
Purple Pens: Enhancing Assessment Literacy and Student Engagement with Feedback through Students Writing Their Own Feedback by Dr Dave McGarvey, Keele University is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Based on a work at https://lpdcsolutions.blogspot.com/2017/11/developing-academic-practice-at-level-4.html.

Thursday, 16 November 2017

The lecture is dead, long live the lecture, by Peter G Knight, School of Geography, Geology and the Environment, Keele University

When I first started teaching, back in the Stone Age, I went on a training session to develop my skills in lecturing. Yes, we had training sessions even way back then. And in that golden, wild-west age before health and safety, political correctness, or quality assurance, those sessions were real humdingers. They must have been, because in those days a single 2-hour session was all the University required for a youngster like me to make the transition from newly qualified PhD graduate (no teaching experience required) to fully-fledged lecturer. Training was a session, not a programme. And I think it was optional.

My training session involved me and half a dozen other new staff members each delivering a short example of our lecturing, and then discussing our different approaches. I think we may have been filmed. Perhaps it was set up as part of the session’s cunning design, but watching somebody nervously reading their lecture from an over-prepared script written down on index cards was possibly the best lesson in lecturing I could have had at that time. “Look how bad it can be. Don’t be this.” Even in those primitive times, and even as a youngster, I understood that a University Lecture was not supposed to be like that.

Now, thirty years later, we have recourse to a substantial literature telling us how useless the formal lecture is as a teaching tool. But I like lectures, and I think they still have their place in our teaching armoury. Not the lectures where somebody reads off a script, even if (especially if) the script is nowadays projected on PowerPoint, published on the virtual learning environment and available for replay on the PlayBack system. That’s not what I mean when I talk about lecturing.

For me, the lecture is not mainly about information delivery. Information can be delivered more effectively in other ways. If your idea of a lecture is reading out information from a script, cancel the lecture and post your script online. If you like the sound of your own voice, make a podcast. Recordings are great for students who want to listen to your pearls of wisdom while they do the washing up, walk to the park, or fall asleep at night. But where large numbers of students want face-to-face access to the individual expertise of a small number of teaching staff, the lecture remains an effective way of teaching… as long as you are careful with it! For me, the lecture is primarily about route laying, signposting, and motivation. There is some information content in my lectures, but the real aim is to show students the learning territory that lies open to them, and to motivate them to want to go and explore that territory. The lecture is a facilitating tool, not a content holder.

So what simple steps can we take to make our lectures more engaging and effective?

There are lots of different ways of doing this, depending on your own course context and teaching style, but for me it has been effective to use a blended learning approach in which the lecture is the glue holding together a range of other media. For example, a lecture might have strong online backup including a short topic summary and readings from both undergraduate textbooks and advanced research sources, so I can be sure that students have access to the core content even if I don’t go through all of it in detail in the lecture. Students can be encouraged to do pre-reading for the lecture (not just post-reading) so that they come along already clued up (and perhaps even with questions) rather than turning up saying “what are we doing today?” (or, worse still, if they say “what are you doing today?”). Preparation can be encouraged and enhanced using resources such as YouTube mini-lectures that flag up things for students to wonder about in advance. Or you could post material onto a module’s Facebook page. Students are then familiar with the key points before we start, and the lecture can operate at a higher level than if you needed to run through the basics for 15 minutes. You can see an example of these pre-lecture mini-lectures on YouTube at https://www.youtube.com/watch?v=MKahbVbo2Ec

For courses where students do preparatory work such as pre-videos or pre-reading, the lecture sessions can be improvised in response to in-class student questions/comments about what they have already done. It is usually easy to predict what students will want to learn more about (indeed you can steer them with the pre-resources that you provide), so you can still “write a lecture” in advance if you want, but then deliver it as a response to the questions they bring from their prep-work. Alternatively, you can simply give the bits of the lecture that become relevant as they ask questions about the video or the reading.

If you don’t want to set up preparatory resources, but plan to “stand and deliver” for 50 minutes, you don’t have to simply stand and deliver for 50 minutes! One effective approach is to break the session into manageable chunks, and to use each chunk to achieve a specific goal.

Here’s an example breakdown of a 50-minute 1st-year lecture, following a model that has worked well for me:

  • minutes 0-5: “establish a teachable moment” – in other words, do something that puts the students into a frame of mind where they want to learn. They won’t learn just because you force information on them. They will learn if they feel the desire or need to know something. This can be achieved in different ways. One basic approach is to ask them a question that is interesting to them and to which they don’t know the answer, or to show them that the answer is not what they always thought. Essentially you need to make sure at the start of the lecture that the students are curious to know more about whatever it is you are covering.
  • minutes 5-20: flag up the key issues in your topic of the day. This is the “core content” section of the lecture, and needs clear signposts and subheadings so students know exactly what to go away and read up on. Remember that you don’t have to teach them everything in class, just show them that it is there and help them to realise that it is important and interesting.
  • minutes 20-25: short break, with a reminder that students could take this opportunity to review their notes from the previous 20 mins and identify questions they might want to ask.
  • minutes 25-40: present a case-study or counterpoint example (perhaps from an important research paper) that draws together key themes from the day’s topic and perhaps illustrates them in an applied context (or from a perspective that will help shed light on what you did in minutes 5-20).
  • minutes 40-45: time to deal with student queries and comments about what you’ve done (including the opportunity for them to ask questions they thought of in the mid-lecture break), and time to reiterate your key point. This might be a good point to throw in a quick Mentimeter activity to get student feedback on what they feel they have understood well or have found difficult in the session. Not heard of Mentimeter? Check it out at https://www.mentimeter.com/. It is one of a whole raft of interactive tools that are available to help make lectures more engaging. If you don’t want to use technology, then a good old-fashioned 2-minute talk-to-your-neighbour buzz group can work well at this point, too.
  • minutes 45-50: a closing activity to reinforce their learning and encourage them to do the follow-up work you may have set. One simple approach here is to give them a short self-assessed or peer-assessed mini-test on what just happened in the lecture. Alternatively, the old-fashioned “conclusions” slide still has its place! Better still, throw them a teaser to prime them for the next session.

Recently I set out to redesign an entire module using that kind of framework as a starting point. I didn’t really stick perfectly to the plan, but making a step in that direction was a big improvement on my previous style. Basically, instead of a 50-minute block covering a set of information, think of half a dozen short blocks of varied content, including student activities, designed to signpost and motivate. If you are adopting an approach like this because you think students have short attention spans (we all have short attention spans), then you can also help by making sure you switch occasionally between different modes of presentation. For example, if your core session at minutes 5-20 is delivered by PowerPoint, then perhaps try using the whiteboard, or a box of sand, or at least a Prezi instead of a PowerPoint for the case study section. Or you could use a video for the opening few minutes, then talk-and-chalk for a bit before going back into PowerPoint.

Mix it up. Stay lively.

The example above is just that: an example. I’m not pretending to be able to teach anyone how to teach. But having looked at my example, take this as your own teachable moment: I know you are thinking you could do it better, and that I’ve missed a trick, or a bit of technology, or a key pedagogic theory. Excellent, then my work here is done: now please go and develop lectures even better than I have suggested!

NB: Some of this content was previously published on Peter G Knight’s own WordPress blog.

Creative Commons License
The lecture is dead, long live the lecture by Peter G Knight, School of Geography, Geology and the Environment, Keele University is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Based on a work at https://lpdcsolutions.blogspot.com/.

Friday, 3 November 2017

Developing Academic Practice at Level 4 Mature (DAPL4M) Online

DAPL4M Online: what went wrong and why, learning from our mistakes.

Angela Rhead, Student Learning; Katy Lockett, SSDS; and Matthew Street, Student Learning / LPDC

Summary 
Student Learning launched the Developing Academic Practice at Level 5 (DAPL5) pilot in 2015/16. An intensive, seven-week open course in both semesters, DAPL5 supports Level 5 students to explore academic reading, thinking and writing practices in the context of their modules. Based on DAPL5’s success, and in partnership with SSDS’s mature student liaison officer, we piloted the DAPL4Mature (DAPL4M) course in October 2016 to re-engage mature students with learning confidently and increase awareness of the services available. A shorter, four-week course, DAPL4M focused on the preliminary aspects of scholarly study: making the most of handbooks; note-taking in seminars/lectures; managing reading lists.

With only a small number of the students who applied actually able to attend the Wednesday morning workshops (6/26), we relaunched DAPL4M as a closed online course in semester two. We envisaged that an online course, ‘attended’ at any time in the week but with a weekly ‘delivery schedule’ of content, discussion and online chat (see Table 1), would increase participation, perhaps also capturing students who had applied in semester one. We also hoped to engage a wider range of students, including those not attracted to the face-to-face, communal workshop approach. To reflect the learning journey, we added a session on using feedback, experiences and work from semester one to shape development in semester two.

Ultimately, we attracted fewer students: thirteen applied, three of whom had applied to the first DAPL4M. Nine of those thirteen engaged in the pre-course ‘Getting to Know Each Other’ KLE discussion forum; two engaged partially in the Session 1 blog discussions on ‘Being Academic’; by Session 2, no one was participating. Additionally, no one attended any of the ‘Live Chat’ discussions, intended to explore questions created by the week’s tasks. After a silent ‘Live Chat’ session in week three of the course, we reluctantly decided to close DAPL4M Online, providing details about the Write Direction 1:1 academic coaching service should anyone want to continue to focus on their academic development.

Having closed DAPL4M Online prematurely, we wanted to reflect and explore insights gained from piloting an ostensibly ‘failed’ initiative. We assessed it (entirely subjectively) at a grade of 35% in terms of success, and then shared our thoughts on two questions:

1. Why did we not judge it lower than 35%?

2. Why did we not judge it higher than 35%?

We then considered how to apply our ideas to future projects or to our wider practices.

Creative Commons License
Developing Academic Practice at Level 4 Mature (DAPL4M) Online by Angela Rhead, Katy Lockett, Matthew Street is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Based on a work at https://lpdcsolutions.blogspot.com/2017/11/developing-academic-practice-at-level-4.html.

Friday, 12 May 2017

Asking what they want won’t tell us what they need. Discuss. By Peter G Knight, School of Geography, Geology and Environment, Keele University



In one of my 1st-year tutorial exercises early in each academic year I ask the new students whether they trust their lecturers and whether they believe what they are told in lectures. The students normally say that they do trust us, and then we have the whole conversation about reliability of sources and the importance of checking everything - including what they are told in lectures - against evidence and peer-reviewed publications. It is an important exercise, but looking at this year’s student evaluations of my teaching I wonder whether I am asking the question at the wrong point in the students’ careers.

It is that time of year now when many of us are getting feedback on our teaching from students as they complete their end-of-year module evaluation forms. When I started lecturing 30 years ago there were no such forms, and I confess with some shame to having played a substantial role in developing and introducing student evaluations. The kinds of feedback that we can pick up through these anonymous forms, or, increasingly, anonymous online surveys, are different from the feedback we used to get in the old days by having conversations with our students. Perhaps the anonymity and distance of the feedback form, rather like the distance afforded by interactions on social media, change the way that people respond when asked to offer, or vent, an opinion. Even in the years since we started using student evaluations of teaching, gradual changes in the nature of the students’ comments have reflected significant changes in our learning and teaching environment.

Even in the best-case scenario, student feedback forms are to be treated with extreme caution, especially when they are read by career-young teaching staff. The level of polite professionalism that we try to maintain in our own communications with students is not always reciprocated when some individuals deliver their anonymous feedback to us. In my current role I see feedback addressed to a lot of different staff, including inexperienced staff who are still being mentored, and I try to warn them, before they look at their first batch of forms, to be ready for the small percentage of responses that will be either irrationally hateful or inappropriately affectionate. At each end of the spectrum, there are usually a few forms that challenge the notion that every student’s opinion is a valid contribution to our course development process. One of my personal favourites was a verdict passed on my teaching by an anonymous student asked to comment on the merits and shortcomings of one of my modules, who wrote, drawing together his or her reflections on my year’s pedagogic efforts: “Dr Knight looks like a turtle”.

One of the biggest changes revealed by looking back over years of feedback on my modules is the change in student expectations. On the oldest forms I see students congratulating me for including projected 35mm photographic slides in my lectures. I used the occasional OHP if I wanted to push the technological boat out. The students were very happy with that. Gradually the comments changed to reflect the students’ satisfaction or boredom with PowerPoints, Prezi presentations, YouTube pre-lectures, the flipped classroom and a succession of virtual learning environments from WebCT (remember that?) through to Blackboard. The technological support that students now take for granted was not even imagined by previous generations. Students will now quickly complain about tutors who don’t provide online notes, copies of the slides, very specific set readings and, latterly, captured recordings of the lectures themselves. But the students are quite right to expect the latest and best technology, and their feedback (if given thoughtfully) can help us to use it effectively.

Another change in student expectations, beyond the merely technical, is an increasingly prevalent assumption that learning should come easily. Perhaps it is connected to changes in technology. Almost any kind of basic information is now just a few seconds away, a few mobile thumb-clicks away, on Google. Even for more sophisticated academic materials, Google Scholar, Web of Science, or the academic search engine of your choice makes even the CD databases of a few years ago seem Stone Age in comparison. I was brought up on index cards. I was trained to expect learning to be hard work. When you get right into the intellectual puzzles, learning still is hard work, but some students find this to be an unacceptable surprise. Only once, so far, has a student actually told me that they believe their £9,000 fees pay for the hard work to be done by me, rather than by them, but that kind of thinking is out there in the classroom now.

Increasingly over the last few years student feedback on my modules has started to include complaints that my teaching has given the students difficult intellectual challenges, or has required them to search for literature themselves, or has expected them to come up with their own research-project designs. This year one final-year student wrote in the “what could be improved about this module” box that I raised lots of questions for long discussions instead of just telling them the answers. I am sure that a few years ago, with a different generation of students, that comment would have gone into the “what went well in this module” box. Most students don’t like clashing deadlines, and most of us may agree with them, but if one of the learning outcomes of a study-skills module is to develop time-management skills, then a deliberately clashed deadline is a learning opportunity. In a research-design module, giving students a ready-made project deprives them of project-formulation experience that will be invaluable to them in future employment. If you are learning a difficult analytical technique, skipping the difficult or boring bits is not good training. We have to recognise that sometimes, a bit like being at the gym, gain requires pain. Learning requires hard work on the part of the students, not just their teachers.

It is important to recognise and respond to student feedback on our modules even if sometimes we think the student is missing the point, or if a poorly designed questionnaire has failed to deliver our questions effectively. It is galling, when we know that the assessment criteria are clearly set out in the easily accessible handbook, and that they were explained at length in the opening lecture, to have a student say on the feedback form that no, the assessment criteria were not made clear in advance, but we can’t just shrug it off. We must consider why, despite all our efforts, this student did not believe the assessment criteria had been made clear. That piece of student feedback should lead us to look again at the handbook, the timetable, our lecture resources, the scheduling of big nights at the Students’ Union, or whatever else might be contributing to the problem. What we should not do is simply ignore the feedback. If that were our plan, we should not ask for feedback in the first place.

But this leads us to a key question. What is it that we are asking for? We sometimes talk about “student satisfaction surveys”, as if satisfaction will be the measure of our teaching quality. It won’t. This is becoming very important as we anticipate the incorporation of student feedback into the TEF. Ensuring students’ short-term satisfaction in a way that will be reflected in their feedback is a different matter from ensuring their long-term learning, which might be most effectively won by painful hard work. If I were to design a module to make students give me the best feedback, it would not look the same as a module designed for the best learning outcomes. The danger in adjusting modules in response to poorly designed “satisfaction” surveys is that a student’s satisfaction may not equate to their learning gain. I currently lead two 3rd-year modules. One I would describe as competent but dull in its design, while the other has won external recognition as one of the UK’s most challenging, exciting and innovative modules. The competent but dull module has repeatedly and consistently scored 100% positive student satisfaction over the last few years. The exciting and innovative module has so far never scored 100%.

We altered some of our student evaluation questions recently in light of changes to the questions in the National Student Survey. One of our new questions is about whether students feel that staff value, and act upon, student feedback. As part of an institution and a subject area that genuinely does value and act upon student feedback I was confident that the responses to this question would be uniformly positive, but they were not. In the feedback on one of my modules, one student wrote in the “what could be improved” box that I seemed to teach the module the way I wanted to, rather than the way the student would have liked. They threw in for good measure the observation that this was arrogant and condescending of me. An old-timer like me does not get upset by personal comments as much as some younger colleagues may do, but I do take care to think them through. Reflecting on that particular comment I do believe that, on the whole, staff are well placed to know both what students will like and how students will most effectively learn. We try to strike the best balance between those, recognising that students may learn better if they are learning in a way that they enjoy, but also that some learning has to be hard won. From the student’s perspective, sometimes, only part of that is obvious.

That discussion that I have with the 1st-years about whether they trust their lecturers usually includes me inviting them to set out their own programme for the remainder of the course. Most years, the students decide that they would prefer me to do it, because, they say, at this stage in their careers they do not know what they need to learn or what is the best way to learn it. As one of them once said, their dentist never asks them how to proceed with the treatment, and they are happy to trust his judgement. I use that example myself now in those discussions, and ask them whether attending classes and doing assessments is a bit like going to the dentist: not always what they would choose to do for fun, but worthwhile and important nevertheless. If the students can see that the pain of a nine o’clock lecture or a challenging assignment is leading to an eventual benefit, and if we can earn and maintain their trust in us to be doing what is best for them when we schedule those classes and set those assessments, perhaps we can all do our best work.

So perhaps that is the key question that is missing from our surveys: do you trust your tutor to be doing what is best for your learning? If we have not earned their trust, that is something we really need to find out about. I don’t need an evaluation to find out whether I look like a turtle or if I am popular with the students. I need it to check whether they feel they can trust me to be doing a good job.

Creative Commons License
Asking what they want won’t tell us what they need. Discuss. by Peter G Knight, School of Geography, Geology and Environment, Keele University is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Based on a work at https://lpdcsolutions.blogspot.co.uk/.

Friday, 3 March 2017

Making feedback more rewarding for everyone. Dr Julie Hulme



This blog post has been contributed to Solutions by Dr Julie Hulme, a Principal Fellow of the HEA, a National Teaching Fellow and a Keele Excellence Award winner.

The National Student Survey has made universities look carefully at the feedback provided to students, and most have implemented initiatives to improve students’ satisfaction (which has nonetheless remained stubbornly low). This can be a source of frustration for academics, who note glowing external examiners’ reports, hours of diligent commentary on their students’ work, and then rage that some students never read their feedback.

As a psychologist, this piques my curiosity. How can students be dissatisfied with something that should help them to learn (and get better marks)? How can academics believe that they are doing a great job, and yet deliver something that is so fundamentally flawed that students don’t even bother to read it? There appears to be a communication problem, and further investigation is needed.

Since the communication problem exists on both sides of the feedback equation, I carried out some research addressing both student and tutor perspectives. Most previous research offered insight into the ways in which students thought we could improve, but tutors were left reeling at the prospect of ‘doing more’ within an already busy job. Could we get more effective AND more efficient?

I used a mixed methods approach, combining surveys and focus groups, to investigate the perceived purposes of feedback, what people thought helped learning, and how feedback was used. The results offered a surprising degree of agreement between students and tutors, and allowed me to generate a set of recommendations, which I shared with volunteer participants online. They commented on and amended the recommendations, until they reached a consensus.

So, what can we do to make the feedback process more satisfying, and less frustrating, for everyone concerned? Let’s start with what tutors can do:

  • Students value three types of feedback: what they do well, what they can improve, and how they can improve. My research shows that tutors tend to focus on what is wrong, yet guidance on how to improve, while the hardest type of feedback to give, is the type that is most useful. Practically, how do you expect the student to provide ‘more depth’? I now provide structured feedback under these three headings.

  • Don’t give feedback on every mistake; pick the issues that are causing the most problems, and give detailed and constructive feedback about these (for writing errors, point students at Student Learning – this will save you time, and get them some expert help, while avoiding them feeling ‘picked on’).

  • Remember your audience – we sometimes use comments to justify our marks for the second marker. But students may not understand academic ‘shorthand’; what is ‘depth’, anyway? Use language that will engage and develop them.

  • Audio comments can help to make feedback feel personal and make it more accessible.

  • Early in the course, talk to your students about how to use feedback as a tool for learning; feedback in schools and colleges is very different, so they need to learn a new approach.

  • Identify yourself - sign or initial your marking so that students know who to ask if they need any clarification.

And what can students do?

  • Firstly, read your feedback! It gives you insight into how to improve your grades in future, and, according to Hattie (1987), is the most important way you can improve your learning.

  • But don’t read it instantly. When you first get work back, you’ll be emotional. Either you’ll be dancing around celebrating, or you’ll react defensively to the feedback. Give yourself time to calm down and reflect; look for the information that tells you what you do well, what to improve, and practically, how you can get better (even if this isn’t written explicitly). Devise an action plan and follow it!

  • If you’re not sure how to use your feedback, ask for an appointment with the tutor who marked your work, and ask for their help. Communication is much clearer in person, and misunderstandings can be avoided. If still in doubt, pay a visit to Student Learning for more advice.

The important thing, whether you’re a tutor or a student, is to remember that feedback involves two people. Sharing responsibility for the communication process ensures that feedback is useful, valued, and facilitates learning. Applying these principles certainly makes feedback delivery more rewarding for me, and my students often give me positive feedback on my feedback!


Hulme, J.A. and Forshaw, M.J. (2009). Effectiveness of feedback provision for psychology undergraduate students. Psychology Learning and Teaching, 8, 1, 34-38.

Creative Commons License
Making feedback more rewarding for everyone by Dr Julie Hulme, Keele University is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Based on a work at https://lpdcsolutions.blogspot.com.

Thursday, 9 February 2017

Keele Learning and Teaching conference 2017

The annual Keele Learning and Teaching conference, organised by the LPDC and sponsored by our Journal of Academic Development and Education (JADE), took place on 17 January 2017. There were over 20 presentations during the day, across a range of disciplines, sharing innovative approaches to learning and teaching.

The day was drawn to a close by a keynote from Professor Tom Ward, Deputy Vice-Chancellor for Student Education at the University of Leeds, on Strategies for Enhancing Learning and Teaching through Publication, arranged in conjunction with The Higher Education Network Keele (THiNK).

All content in this post is published using the Creative Commons licence noted below. If you find something of interest, please feel free to contact the authors for more information.
Creative Commons License
Keele Learning and Teaching conference 2017 by [relevant author from post], Keele University is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Permissions beyond the scope of this license may be available at www.keele.ac.uk/solutions.

Placing the learning at the centre of learning analytics Dr Ed de Quincey,

Presentation can be found here

The interaction and interface design of Learning Analytics (LA) systems is often based upon the ability of the developer to extract information from disparate sources and not on the types of data and interpretive needs of the user. Current systems also tend to focus on the educator’s view and very rarely involve students in the development process. We have used a User Centred Design (UCD) approach with a group of 82 second year Computer Science students to design LA interfaces (in the form of Dashboards) that will engage and motivate them as learners and personalise their own learning experience. A preliminary thematic analysis has suggested that their understanding of LA and their requirements for it are often formed by the limitations of the technologies and systems that they currently use within and outside of the University. We have found, however, that learners want to be able to access an overarching view of their previous, current and future learning activity, e.g. in a timeline. We propose that the only way of truly creating a personalised, supportive system of education is to place the learner at the centre, giving them control of their own Learning Analytics.

From broad brushstrokes to fine detail: using open badges to recognise learning achievements Mr Dan Harding & Mr Matthew Street 

Presentation can be found here

Rather than discussing a work in progress or a completed project, this presentation looks forward to the potential of open badges for extending student learning and continuing professional development by visibly endorsing uncredited activities. Open badges, founded by Mozilla, have been used in education since 2011. Used appropriately, they have the potential to open up closed learning environments (both physical and virtual), enabling learners to gain recognition for the wide range of activities they engage with. This might include adding value to the outcomes of learning, digital credentials (i.e. digitising online and offline experiences) and supporting the scaffolding of learning through gamification. For co-curricular activities, badges could be used to recognise HEAR-related activities: for example, volunteering, institutional student awards and university employment opportunities such as student ambassadors. Open badges can also bridge the gap between education and employment, providing a digital representation of achievements and improving the authenticity of learner experiences. Finally, this presentation will explore the difference between a badge and an open badge, focusing on the affordances provided by openness for the learner and institution (issuer). For example, an open approach offers greater portability, allowing learners to build a repository of credentials that encourages lifelong learning.

Congratulations! You’re Engaged! Ms Emma Hedges 

As student numbers and diversity increase, there is a need to ensure that student retention, progression and completion rates are maintained and improved. The implications of dropping out before completion for students, the institution, families and future employment options are great. Between 33% and 42% of students consider withdrawing from Higher Education prior to completion, and students are particularly likely to consider leaving during their first semester and after Christmas. It is clear that student retention is an issue that needs addressing, as much as levels of attainment. The Students’ Union believes it has a key part to play in retention, as the three key sections of the Student Experience – Academic, Social and Services – must work in harmony to create the best environment for student retention and attainment. At the symposium, the Students’ Union would like to present case studies and reports to support the idea that a culture of belonging is one of the most important reasons that students stay at university. We will examine how an individual’s engagement with the Students’ Union can aid retention and attainment through engagement, representation and involvement, with particular reference to combating isolation through its extensive student engagement programme.

Students’ conceptions of teaching quality – what is excellent assessment and feedback? Dr Jackie Potter 

This presentation explores what we can know about effective assessment and feedback by studying students’ nomination statements for Keele’s teaching awards. Published work on student conceptions of teaching excellence (Bradley et al 2015; Moore and Kuol 2007) has not focused on assessment practices. The session explores the themes emerging in the data and compares these to core principles of good assessment and feedback practice (for example, Nicol and MacFarlane-Dick, 2006). It compares the nature of the approach to the initial step of appreciative inquiry and speculates on the real potential value for such data to be part of a virtuous cycle of change and development to assessment and feedback practices. The session explores the nature of trustworthiness, credibility and validity of the data particularly in relation to how nominations are solicited or students are briefed in advance. 

Evaluating the impact of a journal club to enhance Masters student research literacy and decision making Dr Tom Laws 

The literature supports and prescribes the establishment of journal clubs to increase the research literacy of postgraduate students. There is burgeoning literature on the effect of journal clubs for nursing students, even though Evidence Based Practice is the cornerstone of their learning and quality control practices. The lecturers at Keele-Nursing established regular journal club meetings using an electronic interface, with a framework whereby students explored a range of methodologies prior to selecting a research approach for their research project. A content analysis of the student postings and discussions across five research methods was undertaken (cohort studies, case control studies, RCTs, survey design and qualitative interviews). We triangulated the research methodology literature with the lecturers’ understanding of the requisite knowledge base and the students’ online comments. We found that students displayed enthusiastic and informed interactions beyond lecturer expectations. There was a strong association between students’ ability to offer informed critique on a range of studies and what the research methodology literature recommended in the design and evaluation of research quality. This outcome equipped most students with an adequate knowledge base to justify why they had chosen a specific research methodology, and reduced the tutoring needed to design the research.

Throwing caution to the wind: using drones to teach undergraduate students Dr Alexandre Nobajas 

Presentation can be found here

Unmanned Aerial Vehicles (UAVs), also known as drones, have become increasingly popular and feature regularly in the news, for both positive and negative reasons. However controversial, their use has proven successful in a myriad of applications, ranging from disaster management to agricultural surveying. Nonetheless, due to a lack of courses offering UAV training, there is a current shortage of professionals capable of performing missions with UAVs satisfactorily, something this project aims to mitigate by introducing UAV training as part of the university offering. By designing and deploying a variety of activities, including letting them fly a drone independently, a cohort of year 3 Geography students were introduced to drone technology, in what is likely to be the first experience of its kind in the UK.

What’s a digital experience? Student perceptions at Keele Mr Matthew Street, Mr Dan Harding and Mr Tim Hinchcliffe 

Presentation can be found here

This presentation will provide an insight into information collected during Keele’s involvement with the JISC Student Digital Experience Tracker. During April 2016, Keele’s application to become a pilot institution was accepted, requiring the collection of feedback from students about their experience of the digital environment. In total, 20 higher and further education institutions took part, giving participants access to anonymised benchmarking data. Nine questions were set by JISC, linked to their work on digital capability (https://www.jisc.ac.uk/rd/projects/building-digital-capability), with a further 3 left open for institutions to define. Inclusion in the tracker was offered to all areas of the University, with programmes from each Faculty represented in the final survey. Despite only 2 weeks of data collection, 312 responses were returned. The initial reaction was that the University performs well against the benchmark data, with analysis focused on free-text responses as the richest source of information. Headline findings will be discussed during the presentation; however, clear trends in the data relate to consistency within the KLE, increasing engagement during lectures and other areas of the University’s digital environment. This session will begin the process of disseminating these findings within the University, including to those who took part, with a report to follow.


Progress with PlayBack Mr Matthew Street and Mr Phil Devine 

Presentation can be found here

This presentation covers two strands: the first focuses on the evaluation of the PlayBack pilot in 2015/16, and the second on the development of PlayBack as one tool in a growing suite of capture technologies available at Keele. A range of methods were used to collect evaluation data; these will be explored, and the lessons learned discussed alongside trends from the qualitative data captured. The evaluation highlighted the student perception of benefit from using PlayBack and showed how students use the service. Across the published literature on lecture capture, conflicting evidence is offered about the benefit of captured lectures to learning. Leadbeater et al (2013) discuss the high use of captured lectures by non-native speakers of English and students who have dyslexia, describing the benefits to these students of being able to revisit material, but also highlighting that in some cases high use can lead to surface approaches to learning. This demonstrates the need for students to be supported to use the generated resources appropriately; work completed by Cornock provides examples of this support. If lecture capture is considered a tool that creates supplementary resources, then for those resources to be educationally beneficial, students need support in learning how to use them appropriately.


Thinking and practising like a lawyer Mr Mark Davys 

It has been suggested that learning is something that takes place within the practices of a particular community rather than an activity that is contextually neutral. The aim of this project is to: 
  • consider what it means to think and practise like a lawyer; 
  • discern and deploy activities to help students become members of (and participants in) the legal academic community, rather than acquirers of expertise in discrete subjects and discrete transferable skills; and 
  • (in the longer term) assess whether this aids learning and improves employability, whether in the legal academy, legal practice or elsewhere. 
This project is at an early stage. The aim of the presentation is to share ideas and thinking, rather than conclusions; to test the reasoning behind the project, start conversations and gain insight from other disciplines. 

The challenge of remote CV guidance; does audio feedback help? Ms Keren Coney 

Presentation can be found here

CV writing is a component of employability development and allows students to effectively present themselves to employers. The aim of this project was to investigate the use of audio feedback as a tool to aid students with improving their CVs; in particular, to determine if this form of feedback could enhance the depth of feedback, level of understanding gained by students and the extent to which the feedback could be perceived as more personal. Responses were sought from 40 students using a questionnaire. Participants were incredibly positive about the format of feedback: it was unanimously agreed that the audio format aided their understanding and provided greater depth, and a majority stated that this form was more tailored and personal. As a result of this project, I have incorporated the use of audio feedback into my practice and have even started using this format when providing remote guidance on other career-related activities, such as personal statements for postgraduate applications.

Pilot evaluation of medical student perception of a novel pharmacology-based game Dr Russell Crawford and Dr Sarah Aynsley 

Gamification in higher education has been gaining traction as a plausible and useful addition to the diversity of learning resources available to both teachers and students. We have invented a card-based, role-playing team game to aid pharmacology learning in medical students. Feedback from teaching staff was overwhelmingly positive, and most felt it had a place within the curriculum as a pharmacological learning aid. The aim of the current study is to determine whether a range of medical students who played the game perceived any benefit to pharmacology learning supported in this way. Here we present pilot data based on questionnaire data and theme analysis of free-text comments collected from students who played the game. We found that students perceived a wide range of positive benefits to learning pharmacology in this manner and, in our discussion, we consider how best to capitalise on these perceived benefits.


Active learning: using interactive tools to enhance learning Dr Chris Stiff 

Presentation can be found here

This talk examines the use of active learning tools to enhance students' engagement and learning of delivered materials. Traditional lectures involve students being passive recipients of learning. However, teaching research has shown that students retain more information and are more motivated when they are actively involved in the lecture. I will demonstrate two teaching tools - Poll Everywhere and Mentimeter - that can be used to enhance content and draw students into the lecture. These tools differ from others that are available in that they require very little set up and no special apparatus – responses are made via PC or mobile phone. I will also outline preliminary data that shows using these tools does increase student engagement and overall satisfaction with the teaching they receive. 

Swipe in: electronic student attendance Mr Alex Goffe, Mr Neil Herbert, Mr Nick Vaughan and Mr Dan Daw 

Presentation can be found here

The purpose of this project is to use the student Keele Card to take electronic student attendance within the Faculty of Medicine & Health Sciences, via bespoke card readers designed and built in-house using open-source technology.

So you’ve got big data? Mr Tim Smale 

Presentation can be found here (access for Keele staff only)

Over the last 10 years it has increasingly become commonplace to evaluate every aspect of a student’s course. “How happy are the students?”, “How are students doing on their course?”, “How are students doing on placements?” and “Are the placement areas suitable?” are all common types of evaluation undertaken throughout the academic year. This data is then held separately and manually reviewed and reported on. This separation and manual processing has hampered professional courses such as Physiotherapy, Nursing and Midwifery, which are required by their respective professional bodies to collect this data and report the results within short timeframes. Over the last three years Tim Smale and Pete Lonsdale have been developing a solution that helps to collect this information from staff and students and to import data from the student record system and external regulatory bodies. The system then presents this data to staff in reports that can be easily digested and shared securely. This presentation will give a high-level overview of the system’s capabilities around data associated with clinical placements (audits carried out on the placement locations, logs of serious / safeguarding issues, student placement evaluations and debriefing records between tutor and student) and its potential to develop further.

Learning together, evaluating together: an enhanced evaluation of inter-professional postgraduate learning Dr Chris Little and Ms Jane Jervis 

Presentation can be found here

This paper will detail an investigation taking place with post-registration, postgraduate learners in the School of Nursing and Midwifery. The project sought to investigate the impact of interprofessional learning with students of an Advanced Clinical Practice module. This module recruited 47 learners from 6 professions, including nurses, pharmacists, physiotherapists, podiatrists, optometrists and paramedics. The project utilised the student response system Mentimeter to gather 22 responses via an in-class evaluative survey consisting of both qualitative and quantitative questions. These responses equate to 47% of the cohort. Students were left, unattended, to complete the in-class evaluation using their own devices or iPads provided by the School. All students in the room completed this evaluation. This paper will present a discussion about the benefits and disadvantages of presenting students with in-class evaluations and what this may mean for the data gathered. The data gathered points to an extremely positive evaluation of interprofessional learning. Students noted the new perspectives this learning had added to their academic and clinical practices. This short paper will also prompt colleagues to consider the place of in-class evaluations in their own practices and the implications for the traditional end-of-year module evaluation.

Learning in public? Assessing with blogs Dr Matthew Wyman 

Presentation can be found here

This will be a quick overview of my experiences using blogs as an assessment tool on my second year module on Russian politics. It will review technical issues, the online off-line question, assessment briefs and evidence from outcomes and evaluation about the effectiveness of the approach. I'll argue that the technical issues presented are worth it in terms of the added authenticity of the exercise and the extra levels of creativity that the blogs release. I think this is an approach which may be of wide interest, as just about every subject we teach is faced with the challenge of effective public communication of its value. 

Applying interdisciplinary approaches in the liberal arts through ‘living labs’ Ms Ella Tennant and Dr Andy Zieleniec 

Presentation can be found here

The new Liberal Arts Programme has been designed to be both interdisciplinary and innovative in teaching and learning. This is reflected in the first year core module Understanding the World Through the Liberal Arts, which provides an introduction to perspectives and approaches from various disciplines in HUMSS. True to the liberal arts’ origins in classical antiquity, this also allows students to engage with theories and perspectives outside the classroom, in ‘living labs’. These ‘field trips’ to specific locations encourage exploration and application of knowledge and methodologies, through thinking and doing, in real living communities and physical and social landscapes. With specific examples of the ‘living labs’ approach, this presentation will argue that learning outside the classroom can generate creative interactive activity, which in turn leads to new thinking through engaged and participatory, interdisciplinary teaching and learning. The linking of classroom and field activities also provides the opportunity for students to develop critical-thinking and problem-solving skills. Integration of learning activities and problem-solving reflects the educational theories of Dewey, Vygotsky, and Piaget, which have informed the field of outdoor learning and pedagogy. 
‘Living labs’ and interaction with the surrounding environment reflect the innovative nature of liberal arts at Keele. 

Embedding the Keele journey and reflective practice within the mathematics programme Dr Martyn Parker 

Presentation can be found here

Recent reports note the ‘skills challenge’ facing the UK, both from the employers’ perspective [High Fliers, London, 2016] and from the university sector [Universities UK and UKCES, 2014; Universities UK, 2015]. In particular, universities must consider ‘how effective universities are at describing the skills of their graduates’ [Universities UK, 2015]. We argue further that a fundamental question for universities is how effective they are at providing students with the means to articulate their skills. In the wider context, the Teaching Excellence Framework [Department for Business, Innovation and Skills, 2016] explicitly notes employability and destinations as a metric. Thus, student ‘added value’ and its implications for employment have never been more important. We present a model used within the mathematics programme to develop students’ reflective practice, their self-awareness and their ability to recognise and articulate their skills. We also summarise preliminary results on the effectiveness of this model.

Project Ponder: using clicker technology to make students think Dr Russell Pearson 

Project Ponder was conceived in 2014 with the aim of encouraging students to think more deeply and debate core chemistry topics as and when they are introduced during teaching sessions to improve student engagement, understanding, performance and retention. Phase 1 of the project involved 144 first year students receiving a clicker handset as part of their welcome pack for regular use during lectures, workshops and problem classes. Phase 2 dealt with the same cohort of students in their second year of study using more sophisticated clicker handsets for team based tasks and discussions that were strategically positioned at regular intervals during a range of formal teaching sessions. This two-pronged approach involving the use of two different clicker handsets for large group teaching sessions has resulted in 96% of students wanting even greater clicker usage on their course, with 90% preferring clicker handsets to using mobile phone technology and over three-quarters of the class preferring clicker-based teaching sessions to the flipped classroom approach. The pros and cons of clicker handsets when compared against other available voting technologies will be addressed and the additional instructor benefits of the work, alongside tips for optimum impact, will also be described. 

From flash fiction to flash reflection Ms Karen Taylor and Ms Helen Machin 

Presentation can be found here

This paper reports the initial findings of an exploratory study into the effectiveness of “flash reflection” to encourage social work students to develop critical reflection skills. “Flash reflection”, inspired by flash fiction, is a brief writing activity of no more than 250 words structured around a model of reflection. Critical reflection is a core requirement of the UK’s Professional Capabilities Framework (PCF) for social workers and it is believed to lead to positive social change in social work interventions (Morley, 2014). However, there is a paucity of empirical evidence about how social work educators can support students to develop critical reflection skills (Burr et al, 2016). To explore the effectiveness of flash reflection, a group of 20 social work students were asked to complete flash reflections after each day of a short, intensive module. A second phase will invite the students and educators involved in the flash reflection experiment to discuss their experiences during interviews and focus groups. Initial findings suggest that flash reflection encourages students to analyse their assumptions whilst also providing rich and immediate evaluations of taught sessions. Based on these exploratory findings, we argue that flash reflection may be added to other techniques to encourage social work students to develop critical reflection skills. 


Teaching the limits of technology through iPad usage in seminars Dr Rachel Bright 

Presentation can be found here

This paper will draw on recent pedagogical developments relating to using tablets in classroom learning environments as an integrated part of seminar interaction. It will demonstrate how iPad activities can encourage active and collaborative learning, both of which have been shown to improve academic performance (Kuh 2005, Prince 2004). This paper will specifically focus on sessions within History at Keele in 2014 and 2016, demonstrating how the technologies were used to improve student interaction, especially through group work and presentations, as well as how the iPads improved the class participation of students with a range of disabilities. As Rice (2011, para.3-4) has shown, ‘iPads increase engagement and collaboration, acting as a facilitator for more easily sharing information.’ However, this paper will also reflect on the limitations of using such technologies, which unexpectedly became evident during group work in these classes. A comparison was made between groups with iPads and those without; and occasionally the iPads proved a distraction, with worse outcomes. Rather than demonstrating that iPad use should be discontinued, however, the results subverted student expectations about using new technologies in a learning environment, constructively enabling them to reappraise how they approach research at university-level. 


Sustainability education at Keele; what do you think? Dr Zoe Robinson 

Presentation can be found here

This talk will present material from a staff (2013) and student (2015) university-wide survey looking at staff and student attitudes towards sustainability. It considers how staff and students conceptualize sustainability at Keele, how relevant they see it to their discipline, how they would like to see it implemented, and the impact that sustainability education work at Keele has had on the sustainability attributes of students from different disciplines. This data can inform future directions of education for sustainability work within the institution, and in particular highlights the importance of considering different approaches for different disciplines. 

‘Scrolling’ to map texts, cross academic conceptual thresholds and access the ‘critical’ reading portal Ms Angela Rhead

An example of scrolling can be found here

Interested in increasing students’ active and critical engagement with texts, I have been frustrated by attempts to move beyond the “scan the abstract” technique for the initial stages of reading, where criticality can be said to begin. Inspired by Middlebrook’s (1994) discussions on the benefits of creating scrolls from selected sources to explore reading processes collaboratively, and beginning with Y2 Music students, I experimented with scrolls to ‘map’ texts to precede (or circumvent) narrative reading. Twelve feet of a journal article, ten students and several marker pens later, scrolling appeared to increase students’ “dialogic engagement with the text” (Abbott, 2013, p.198) and to uncover the easily recognisable but difficult-to-explain intuitive practices of confident academic readers (Moore, 2013). Imperfect but profoundly improved, this year I have extended the experiment to a range of disciplines at undergraduate and postgraduate levels. One key finding is students’ widespread lack of understanding of the purpose of academic reading, a ‘conceptual threshold’ (Wisker and Robinson, 2009) that, unlike discipline-specific ‘threshold concepts’, describes epistemological cross-disciplinary concepts that require criticality on the part of the reader. A discussion of the emerging findings of this experiment provides the potential to focus on a significant aspect of higher learning that frustrates both students and academic staff.

Developing critical writing skills: a patchwork tour de force Ms Sara Morris

Presentation can be found here

Aim: To develop confidence and a requisite academic skill set for level 7 in a disparate graduate cohort. The ‘patchwork text’ assignment commences at the start of the MSc Nursing programme and is gradually assembled during the course of an eight-month module to meet all the learning outcomes. Winter (2003) suggests the patchwork text approach is useful for creatively exploring concepts. The assessment consists of a variety of formative short ‘patches’, using a range of written approaches (e.g. reflective account, annotated bibliography, report), each of which is complete in itself. These patches are developed around a theme (e.g. frailty) to ensure the overall unity of these component sections. Although the theme is planned in advance, the work is finalised retrospectively, drawing on formative feedback, feed-forward, discussion and peer review. The final assessment is complete when the revised ‘patches’ are ‘stitched’ together with a critical reflective commentary. So why this approach? The MSc Nursing is a fast-track programme that recruits from a wide range of graduate disciplines, all of whom are new to the Nursing profession. The use of formative patches and a range of feedback mechanisms has been instrumental in supporting and upskilling the cohort to construct new knowledge that builds on previous learning (Matt, 2000) and introduces the required academic skills in an accelerated manner at Masters level. The opportunity for critical reflection in both the discussions and the ‘stitching’ has demonstrated a deeper level of understanding of complex and multi-dimensional areas of nursing practice.

Monday, 5 December 2016

Induction week. Welcome on board! Peter G Knight, School of Geography, Geology and Environment, Keele University

It seems appropriate that my first blog entry for Solutions, which I intended to submit for the start of the academic year and which was supposed to be about teaching time management to newly arrived students, should have been delayed by two months because I have been overwhelmed by - amongst other things - teaching time management to newly arrived students.

Most academics think about the start of October in the way that other people think about the start of January: a new year, big hopes, good resolutions, a fresh start. The new academic year brings with it a fresh crop of the bright young intellects that, changing every year, help to keep our own ideas fresh and constantly renewed. It is an exciting, promising time. I have had nearly forty new years now as a university lecturer or student: enough academic fresh starts to fill two of the lifetimes of my average undergraduate. I’ve done this before, and for me it is not scary. But for you, dear student, this is the first time. We see this academic induction, this welcome week, this fresh start, from very different perspectives. And I need to remember that.

We can try to serve as good examples to our students, but we don’t always have to be models of perfection. The fact that I have time-management problems of my own does not undermine my position in teaching time-management skills to students. It strengthens it. I am teaching from the front line of right now, not from long-remembered experiences of “when I was in your position”. I can face a problem on Monday, figure out a workaround on Tuesday, and teach it to my tutorial group on Wednesday. In my mid fifties I can’t pretend to put myself in my new students’ teenage shoes or claim to be facing the same struggles that they are facing as they settle into University. But I can share with them my equivalent struggles, and show them that fighting battles, finding coping strategies, and dealing with everyday academic problems are normal things that we are all learning to do. If a student sees that I am still learning, and still struggling, perhaps the student will feel less inadequate about their own struggles and their own early setbacks. It’s OK to find University difficult. These challenges are supposed to be here.

And here, for me, is the challenge of induction week. We want to be positive, supportive, and encouraging, but we also have to be honest, realistic and pragmatic. We want to say well done for scoring those A-level grades, but we also have to point out that much of what was covered at A-level was fundamentally flawed. Many of my students begin their degrees hoping for clear answers and reliable certainties. I have to tell them that there are no clear answers and that study at University will introduce them to a whole new set of uncertainties. Welcome on board, but hold on tight.

One of my Welcome-week activities that seems to help students feel at home is the start-of-course diagnostic assessment. Students seem accustomed to having lots of tests and quizzes at school, so having a 15-minute short-answer test included alongside the many unfamiliar experiences of induction week seems to steady the ship for some of them. I tell them that the idea of the test is to help me work out the correct level to pitch material in the early sections of the course, and that it will also give them a broad indication of how far their pre-University work has prepared them for this new stage in their academic journey. They do the test, I let them mark their own or their neighbour’s paper straight away while I talk them through the answers, and as I take in their marked papers I give them a handout with all the questions and all the correct answers on it for them to take away. A couple of days later I see the students again, and tell them that in fact the first test was just a rehearsal, and the real diagnostic assessment is today. I tell them to put away any notes or devices, and I hand out the new, real diagnostic assessment. It is, to their surprise, identical to the one they did before. And here I deliver the first big lesson of the week. I was never interested in whether they knew the answers to the questions on the test. It doesn’t matter: they are only at the start of their learning journey. What I am interested in, and what I want to bring into the students’ line of sight, is what they did when they were presented with a body of information - answers - to take away. Usually, none of the students, or certainly very few, have done anything with the handout from the first test or followed up topics that they were unsure about. The marks for the second test are usually no better than the marks for the first. 
And here is the teachable moment: the students can see that they totally blew their opportunity to do well on the second test by not following up the feedback on the first test; and they see that what I care about is not their factual knowledge but their approach to learning and their engagement with course materials. If there is a “mark” for the diagnostic assessment, it is the difference in scores between test one and test two. Most important of all is what you chose to do after test one. Or perhaps it is what you will do after test two. Welcome to the programme. Welcome to university.

Generally, this two-stage diagnostic assessment works well, but one potential downside is that I am in a small way tricking or misleading the students, and I don’t like to mess with their trust too much early on. It is important that the students see the key lesson from that exercise (why I had to trick them with that first test), and it is important to (re)establish trust quickly through other activities. One small way that I try to do that is by joining in with the students on further in-class exercises that they do. If I ask a tutorial group to take 60 seconds out and try to write a one-sentence answer to a sample question, I take that same 60 seconds and try to come up with a sentence of my own. Not one that I prepared earlier, but one made in the same time that the students are making theirs. I can then be much more believable if I agree (or disagree) with their argument that 60 seconds was not long enough, and they can even sympathise with me a little if, when I read out my own attempt, there is some comical error in it. If the students can then suggest improvements to my attempt, just as I suggest improvements to theirs, then we are (as a happy by-product) well on the way towards addressing issues that surround the new NSS question about whether students feel part of a learning community. Of course they do - they are teaching me at the same time that I am trying to teach them. We are all in this together, even if we are looking at it from different perspectives of experience. And that is one of the most important lessons to incorporate into induction week. Even if it means that the time management exercise has to wait until next time!

More about Peter 

Peter G Knight is a Reader in Geography in the School of Geography, Geology and the Environment at Keele University. He studied Geography at the Universities of Oxford (BA Hons, 1980-1983) and
Aberdeen (PhD, 1984-1987) before coming to Keele as a lecturer in 1987. He is currently Head of Geography Programmes, and Course Director for Physical Geography. He has previously served as Director of Learning and Teaching for the School, Chair of the University Academic Appeals Committee, and Chair of several Internal Quality Audits.

Peter has won Keele’s “Individual Award for Excellence in Learning and Teaching” and several Teaching Innovation Awards, and was awarded a National Teaching Fellowship by the Higher Education Academy in 2008. Peter’s promotion to Reader was one of the first at Keele to be based on “outstanding distinction and achievement in the conduct, outcome, scholarship and dissemination of learning and teaching”. Peter is the author of more than 50 research papers in glaciology and geomorphology, and has published textbooks both in glaciology and in undergraduate study skills. His most recent book is the 3rd edition of the widely used “How to do your dissertation in Geography and related disciplines” (Routledge, 2015). Peter has recently been working with both the Geographical Association and the Royal Geographical Society to develop learning resources for students at A-level and in transition between A-level and University study. The Geography module “Inspirational Landscapes” (GEG-30014), which Peter developed and leads, was featured by Geographical Magazine as “one of the UK’s most interesting and unique geography modules”.

Creative Commons License
Induction week. Welcome on board! by Peter G Knight, School of Geography, Geology and Environment, Keele University is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.