Involving students in the appraisal of rubrics for performance-based assessment in Foreign Languages

By Rita Balestrini, Department of Modern Languages and European Studies

 

 

Context

In 2016, in the Department of Modern Languages and European Studies (DMLES), it was decided that the marking schemes used to assess writing and speaking skills needed to be revised and standardised in order to ensure transparency and consistency of evaluation across different languages and levels. A number of colleagues teaching language modules had a preliminary meeting to discuss what changes had to be made, what criteria to include in the new rubrics and whether the new marking schemes would apply to all levels. While addressing these questions, I developed a project with the support of the Teaching and Learning Development Fund. The project, now in its final stage, aims to enhance the process of assessing writing and speaking skills across the languages taught in the department. It intends to make assessment more transparent, understandable and useful for students; foster their active participation in the process; and increase their uptake of feedback.

 

The first stage of the project involved:

  • a literature review on the use of standards-based assessment, assessment rubrics and exemplars in higher education;
  • the organization of three focus groups, one for each year of study;
  • the development of a questionnaire, in collaboration with three students, based on the initial findings from the focus groups;
  • the collection of exemplars of written and oral work to be piloted for one Beginners language module.

I have had a few opportunities to disseminate some key ideas that emerged from the literature review: at the School of Literature and Languages’ assessment and feedback away day, at the CQSD showcase, and at the autumn meeting of the Language Teaching Community of Practice. Having only touched upon the focus groups at the CQSD showcase, I will describe here how they were organised, run and analysed, and will summarise some of the insights gained.

 

Organising and running the focus groups

Focus groups are a method of qualitative research that has become increasingly popular and is often used to inform policies and improve the provision of services. However, the data generated by a focus group are not generalisable to a population group as a whole (Barbour, 2007; Howitt, 2016).

 

After attending the People Development session on ‘Conducting Focus Groups’, I realised that the logistics of organising focus groups, the transcription of the discussions and the analysis of the data they generate require a considerable amount of time and detailed planning. Nonetheless, I decided to use them to gain insights into students’ perspectives on the assessment process and into their understanding of marking criteria.

 

The recruitment of participants was not a quick task. It involved sending several emails to students studying at least one language in the department and visiting classrooms to advertise the project. In the end, I managed to recruit twenty-two volunteers: eight for Part I, six for Part II and eight for Part III. I obtained their consent to record the discussions and use the data generated by the analysis. As a ‘thank you’ for participating, students received a £10 Amazon voucher.

 

Each focus group lasted one hour; the discussions were recorded in full and were based on the same topic guide and stimulus material. To open the discussion, I used visual stimuli and asked the following question:

  • In your opinion, what is the aim of assessment?

In all three groups, this triggered some initial interaction directly with me. I then started picking up on differences between participants’ perspectives, asking for clarification and using their insights. Slowly, a relaxed and non-threatening atmosphere developed and led to more spontaneous and natural group conversation, which followed different dynamics in each group. I then began to draw on some core questions I had prepared to elicit students’ perspectives. During each session, I took notes on turn-taking and some relevant contextual clues.

 

I ended all three focus group sessions by asking participants to carry out a task in groups of three or four. I gave each group a copy of the marking criteria currently used in the department and an empty grid reproducing the structure of the marking schemes. I asked them the following question:

  • If you were given the chance to generate your own marking criteria, what aspects of writing/speaking/translating would you add or eliminate?

I then invited them to discuss their views and use the empty grid to write down the main ideas shared by the members of their group. The most desired criteria were effort, commitment, and participation.

 

Transcribing and analysing the focus groups’ discussions

Focus groups, as a qualitative method, are not tied to any specific analytical framework, but qualitative researchers warn us not to take the discourse data at face value (Barbour, 2007:21). Bearing this in mind, I transcribed the recorded discussions and chose discourse analysis as an analytical framework to identify the discursive patterns emerging from students’ spoken interactions.

 

The focus of the analysis was on ‘words’ and ‘ideas’ rather than on the process of interaction. I read and listened to the discussions many times and, as I identified recurrent themes, I started coding excerpts. I then moved back and forth between the coding frame and the transcripts, adding or removing themes, renaming them, and reallocating excerpts to different themes.

 

Spoken discourse lends itself to multiple levels of analysis, but since my focus was on students’ perspectives on the assessment process and their understanding of marking criteria, I concentrated on those themes that seemed to offer the most insight into these specific aspects. Relating one theme to another helped me to shed new light on some familiar issues and to reflect on them in a new way.

 

Some insights into students’ perspectives

As language learners, students gain personal experience of the complexity of language and language learning, but the analysis suggests that they draw on the theme of complexity to articulate their unease with the atomistic approach to evaluation embodied in rubrics and, at times, also to contest the descriptors of the standard for a first-class mark. This made me reflect on whether the achievement of almost native-like ability is actually the standard on which we want to base our evaluation. Larsen-Freeman’s (2015) and Kramsch’s (2008) approaches to language development as a ‘complex system’ helped me to shed light on the ideas of ‘complexity’ and ‘non-linear relations’ in the context of language learning which emerged from the analysis.

 

The second theme I identified is the ambiguity and vagueness of the standards for each criterion. Students draw on this theme not so much to communicate their lack of understanding of the marking scheme, but to question the reliability of a process of evaluation that matches performances to numerical values by using opaque descriptors.

 

The third theme that runs through the discussions is the tension between the promise of objectivity of the marking schemes and the fact that their use inevitably implies an element of subjectivity. There is also a tension between the desire for an objective counting of errors and the feeling that ‘errors’ need to be ‘weighted’ in relation to a specific learning context and an individual learning path. On the one hand, there is the unpredictable and infinite variety of complex performances that cannot easily be broken down into parts in order to be evaluated objectively; on the other hand, there is the expectation that the sum of the parts, when adequately mapped to clear marking schemes, results in an objective mark.

 

Rubrics in general seem to be part of a double discourse. As an instructional tool, they are described as unreliable, discouraging and disheartening. The feedback they provide is seen as having none of the effect on language development that the complex and personalised feedback teachers provide has. Effective and engaging feedback is always associated with the expert knowledge of a teacher, never with rubrics. However, the need for rubrics as a tool of evaluation is not questioned in itself.

 

The idea of using exemplars to pin down standards and make the process of evaluation more objective emerged from the Part III focus group discussion. Students considered the pros and cons of using exemplars, drawing on the same rationales that are debated in scholarly articles. Listening to, and reading systematically through, students’ discourse was quite revealing and brought to light some questionable views on language and language assessment that most marking schemes measuring achievement in foreign languages help to promote.

 

Conclusion

The insights into students’ perspectives gained from the analysis of the focus groups suggest that rubrics can easily create false expectations in students and foster an assessment ‘culture’ based on an idea of learning as a steady increase in skills. We need to ask ourselves how we could design marking schemes that communicate a more realistic view of language development. Could we create marking schemes that students do not find disheartening, and that help them understand how to progress? Rather than just evaluation tools, rubrics should be learning tools that describe different levels of performance and avoid evaluative language.

 

However, the issues of ‘transparency’ and ‘reliability’ cannot be solved by designing clearer, more detailed or student-friendly rubrics. These issues can only be addressed by sharing our expert knowledge of ‘criteria’ and ‘standards’ with students, which can be achieved through dialogue, practice, observation and imitation. Engaging students in marking exercises and involving them in the construction of marking schemes – for example by asking them how they would measure commonly desired criteria like effort and commitment – offers us a way forward.

 

References:

Barbour, R. 2007. Doing focus groups. London: Sage.

Howitt, D. 2016. Qualitative Research Methods in Psychology. Harlow: Pearson.

Kramsch, C. 2008. Ecological perspectives on foreign language education. Language Teaching 41 (3): 389-408.

Larsen-Freeman, D. 2015. Saying what we mean: Making a case for ‘language acquisition’ to become ‘language development’. Language Teaching 48 (4): 491-505.

Potter, M. and M. Wetherell. 1987. Discourse and social psychology. Beyond attitudes and behaviours. London: Sage.

CASE STUDY: ASSESSMENT USING ELECTRONIC LEARNING JOURNALS

Dr Madeleine Davies, Department of English Literature

 

Objectives

  • To use an electronic Learning Journal to improve attendance, engagement and attainment on a Part 3 module I convene, ‘Virginia Woolf and Bloomsbury’
  • To determine whether a ‘Learning Journal + assessed essay’ assessment pattern offers a viable alternative to the ‘assessed essay + exam’ model favoured by the Department of English Literature (this in conversation with the ‘Diversifying Assessments’ TLDF project I co-lead in DEL; see http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-using-focus-groups-to-diversify-assessment/)
  • To improve my ‘return of feedback’ scores on my modules; hard copy marking has always been returned to my students within 10 days yet students select ‘3’ or ‘4’ for the ‘speed of feedback’ question in their module responses. I wanted to see whether online return of marked work within the same period ‘felt’ like ‘5’ to my students more than hard copy return did.

 

Context

The pedagogic aims of my Part 3 module, ‘Virginia Woolf and Bloomsbury’, can be summarised as follows:

  • To gradually construct, over 11 weeks, a detailed and advanced knowledge of Virginia Woolf’s often complex texts and ideas.
  • To develop students’ understanding of the socio-cultural, political and literary contexts of the inter-war period.
  • To enhance skills of close reading and critical knowledge.

 

These are challenges because Woolf’s ideas connect with theoretical models including feminism, structuralism and postmodernism. In addition, the important contexts of literary modernism and of post-impressionist art have to be taught in accessible ways so that they can be understood to an advanced level. There is a great deal to learn and only thirty teaching hours available in which to develop the level of required knowledge.

 

Before I introduced technology-enhanced assessment to the module, the assessment pattern involved the following stages:

 

  • one 1500-word formative essay in Week 5 – the instruction was, ‘answer on one text’. Rushed, late, or missing essays characterised this stage.
  • one 2500-word assessed essay in Week 11 – the instruction was, ‘demonstrate substantial knowledge of at least two texts’, one of which may be the formative assignment text.
  • a summer term exam – the instruction was, ‘answer on two texts, avoiding the texts used for the assessed essay’.

 

Not only did this model create significant question-setting work, administrative time, frustration, and paper, but it also inadvertently facilitated inconsistent attendance. Students disappeared from classes in Weeks 9 and 10 as assessed essay deadlines approached, or as they calculated that they only ‘needed’ 4 texts for assessment; once those had been selected and safely stored under their belts, students disappeared. Tougher material was avoided altogether because the assessment pattern meant that it did not have to be engaged with. The old system also caused essay-writing panic towards the end of the module, then exam-related stress, and both triggered the inevitable chain of ECF requests.

 

None of this was conducive to consistent, productive learning and to strong attainment. In addition, the old assessment system rewarded the best writers who were able to gloss ‘shallow’ knowledge effectively: these tended to be students from more traditional educational backgrounds so the assessment model was not heeding inclusivity guidelines because it only ‘recognised’ and rewarded one type of attainment and engagement.

 

Implementation

 

Increasingly dissatisfied with the assessment model, but remaining committed to the teaching and learning aims of the module, I switched to a Blackboard Learning Journal because the pedagogic principles could, I felt, be best achieved (perhaps could only be achieved) using technology.

 

The instruction given to students about the function of the Learning Journal is as follows:

‘The use of a Learning Journal as part of the assessment on this module is designed to encourage and reward consistent attention throughout the course, development in your understanding, and thoughtful reflection on your own learning. It should support you to identify and seek solutions to any problems you encounter in your studies. It also requires you to organise your time carefully in order to make regular submissions, which is a vital skill in the world of work.’

 

This instruction emphasizes ‘understanding’ (‘Mastery of the Discipline’ in the Curriculum Framework), self-motivated problem solving, and time-management (‘Graduate Skills’ in the Curriculum Framework). From a pedagogic point of view, ‘thoughtful reflection’ is being implicitly framed within the structure of continuous engagement, and this itself is understood within the language of ‘encouragement’ and ‘reward’.

 

The online Learning Journal requires students to submit 500 words every week, reflecting on the week’s teaching and textual material; after 5 weeks, two entries from the online Journal are assessed and feedback is given (this is the formative stage – no essay questions are necessary). The 10-week Journal concludes with a retrospective entry in Week 11, and an assessed essay is due for submission 4 weeks later. There is no longer a summer term exam. The Journal is marked online, and the mark for the Journal is generated by consistent completion of every entry and by the quality of entry 10 plus 4 other entries selected by each student.

 

Students know that if they miss lectures and seminars they will struggle to complete the Journal, so attendance is greatly enhanced: an average module attendance rate of 72% (2016-17) leapt to 86% (2017-18) once Journal assessment was implemented. The high level of attendance allowed me to deliver the teaching that I know works most effectively on this module, because I could rely on various connections between ideas being understood. Further, because of attendance, students are in a far stronger position when they prepare to write their assessed essays, so their anxiety is much reduced and they are able to submit their best work. It was notable that no ECFs were requested for extensions on this module in 2017-18 (18 students were enrolled), whereas 3 were submitted the previous year as the Week 11 assessed essay deadline loomed into view.

There is no exam so my marking is reduced and a redundant element of assessment is removed.

 

Impact

 

The Learning Journal initially produced some anxiety amongst students because DEL does not use Learning Journals at Part 2 so this was the first time these students were managing them. At least 5 minutes at the beginning of several seminars had to be reserved for providing students with repeated information and reassuring them that, even though the different format and requirements of the Learning Journal felt unfamiliar and even ‘wrong’, they were following the remit correctly.

 

The Learning Journal information was placed on Bb but most of the 18-strong group did not read the materials on this site; this revealed our students’ resistance to consulting Bb. DEL students seem only to recognise information when it is presented in hard copy, so I had to declare surrender and circulate the Learning Journal Guidelines to students in this form.

 

The majority of students managed to submit weekly work without difficulty and on time. Some students were worried that the Learning Journal format did not seem to adequately prepare them for the more formal writing of the assessed essay. However, by Week 8, the majority of the students expressed their growing engagement with their Journals and, through them, with the module. I also found it interesting that students were more able than usual to forge connections between texts and ideas and I wondered whether this was because the weekly Journal entries cemented the reading and seminar discussions more securely.

 

As for the feedback sheets, this module was not scheduled for assessment in 2017-18. To gather informal feedback, I asked some of the students in the group to write down (anonymously) how they rated speed of feedback: ‘5’ was registered by every student who responded. I have no idea why precisely the same time period would be viewed as ‘3’ or ‘4’ when hard copy was used and as ‘5’ when electronic feedback was used, but the implications for student satisfaction scores are clear.

 

Connecting with the SLL ‘Diversifying Assessments’ project, it is clear that Learning Journals are an increasingly popular method of assessment in DEL. The results of a 2017 Survey Monkey poll in DEL (June 2017 – see http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-using-focus-groups-to-diversify-assessment-part-2/) suggest that Student Focus Groups had correctly identified that this form of assessment was capable of challenging the traditional essay in terms of student choice.

 

Reflections

  • In the example of EN3VW (‘Virginia Woolf and Bloomsbury’), technology has allowed me to employ a pedagogic model that was always perfectly suited to the module but that was not always enabling success because students’ engagement was a desired outcome rather than a clear requirement. With the Learning Journal, the pedagogy underpinning the module works effectively for the first time.

 

  • It is clear, however, that students require a great deal of guidance when they initially use a Learning Journal, and colleagues need to be aware that increasing a student’s freedom to write in less structured forms also increases their anxiety. Time has to be reserved for writing advice and this can dent seminar time. The time investment is, however, worth it because the quality of work presented in the Journals was of a very high standard.

Jess Phillips MP at the University of Reading (16th November 2017)

Dr Madeleine Davies (Department of English Literature)

The Vice-Chancellor’s Endowment Fund generously supported the Department of English Literature and the Department of Politics and International Relations in hosting Jess Phillips MP at the University of Reading last week.

Jess Phillips was invited to deliver a talk on the topic, ‘Finding your Voice’, and to engage in a Question and Answer session led by Dr Mark Shanahan from the Department of Politics and International Relations. A book-signing for the MP’s recent book, Everywoman, was organised with the help of Blackwell, and this took place after the talk.

185 people were in the audience on the night. Members of the wider community joined us (including some in the 15-18-year-old category), and the majority of the seats were taken by colleagues and students in roughly equal proportion. The University’s live Facebook stream shows that 3,465 views were recorded during the 90-minute broadcast.

A Twitter feed from the event provided a lively flow of the MP’s comments as well as audience responses. One tweet alone was viewed by 1,334 people.

Jess Phillips herself added her ‘like’ to the feed.

Jess Phillips’ talk included her childhood experiences as a campaigner with parents who were both committed to socialist causes: she remembered attending a day-care centre run by activists and helping to produce the banners that would be used on the drive-way to Greenham Common. She also discussed a brief period of political apathy when, in the early years of the Blair governments, many situations improved and the need for constant campaigning declined (she noted that she was more a fan of Blair’s ‘early work’ than of his later concepts). The election of David Cameron reignited her political activism, and her years of experience with ‘Women’s Aid’, a refuge charity, finally persuaded her to make herself heard and to enter Parliament. Her speech also included issues of class and privilege, questions of fairness and responsibility, and all her comments were laced with wit, humanity, and a deep-seated commitment to social justice. In the speech and in the Q&A session that followed it, it was clear that Jess’s passion is for equality, not in the highly theorised sense of ‘academic feminism’, but in the ‘lived’ sense of fairness, human rights and plain decency.

All of us who met Jess were extremely impressed by her warmth and her wit: there was no gap between her public image and the real person. It was also a timely and much-needed reminder that there are many MPs who are politicians because they are driven by their convictions and who are defined by their integrity and compassion. Meeting heroes is a dangerous enterprise but not in this case.

Thank you to all colleagues and students who attended the event. Jess Phillips told me (and told many students too) how impressed she was with Reading students and I felt very proud of everyone who contributed so much to such an excellent evening.

 

SLL TEACHING AND LEARNING PROJECT ON STAFF PEER REVIEW AND REFLECTIVE PRACTICE

 

 

 

Led by Chiara Cirillo (MLES and ISLI)

Cindy Becker (SLL SDTL)

(This is a joint SLL/ISLI project)

 

Member: Michael Lyons (SLL Teaching and Learning Support Officer)

 

Central support: Eileen Hyder (Academic Developer, CQSD)

 

Timing: Three years, commencing Summer 2017

 

Purpose: To reinvigorate academic T&L peer review across SLL in response to:

  • University concerns that our peer review rates are dropping.
  • Amendments to university procedures that allow us to approach peer review in a variety of ways.
  • An understanding that reflective practices can be powerful tools in peer review.

 

Methodology: Based on the CQSD session on reflective practitioners, led by Chiara and Eileen, Chiara will produce 3-4 reflective practice exercises that can be used as part of the peer review process.

Michael will source any university guidance on peer review.

Cindy will produce a guidance document on peer review for staff, encouraging, where appropriate, a move away from simple peer observation to a more nuanced and in-depth peer review, which might be conducted in groups or as a pair, and might involve reviewing documents, discussing themes, sharing innovation and so forth.

Chiara will share her experience of pair/group peer reviews within ISLI which have gone beyond the traditional peer observation.

Michael will monitor who has carried out peer review, and how, and will be available to make notes if group peer review takes place.

In each year, a review and reflection stage will be followed by development of the system, so that we achieve effective peer review across SLL and ISLI by the end of the project.

 

Timeline of activities

Summer 2017: Material produced by Chiara, Cindy and Michael, as above.

 

October 2017: New guidelines sent out to colleagues in SLL and ISLI, but with peer review probably remaining ‘in department’ at this stage.

 

2017-18 academic year: Peer review encouraged and monitored.

 

Summer 2018: Review of engagement with both peer review and reflective practice (taken from data on completed peer reviews and a staff survey asking for responses to the reflective practice material).

 

2018-19: Moving to reflective peer review across SLL (giving colleagues in different departments the opportunity to peer review together). The SLL Teaching Links Project could be made the focus of some peer reviews. There will also be the option of peer review between colleagues in SLL and ISLI.

 

Summer 2019: review of engagement with peer review and consideration of any themes for peer review in the following year.

 

2019-2020: Monitor the situation, adjust as necessary. If needed, we could consider allocating peer review partners/groups.

 

Outcomes: Full peer review in SLL and ISLI by the end of the project; peer review being conducted in a variety of ways, using a range of reflective practice tools.

 

Funding?

Dissemination: Through the University T&L blog and the T&L in SLL blog and website.

This project will produce material suitable for conferences and pedagogical articles.

Reports to SLL Teaching Leadership Group for consideration prior to submission to SBTL.

 

 

For more information, please contact Dr Chiara Cirillo (c.cirillo@reading.ac.uk) or Dr Cindy Becker (l.m.becker@reading.ac.uk).

SLL TEACHING AND LEARNING PROJECT ON PEER ASSISTED LEARNING (PAL) DEVELOPMENT

Led by Neil Cocks (SLL STAR and PAL Academic Champion)

Cindy Becker (SLL SDTL)

 

Member: Michael Lyons (SLL Teaching and Learning Support Officer)

 

Central support: Caroline Crolla (Peer Assisted Learning Coordinator)

Chantelle Turner (Peer Mentoring Coordinator)

 

Timing: Three years, commencing Summer 2017

 

Purpose: To reinvigorate peer-assisted learning within DEL, to introduce successful peer-assisted learning throughout SLL, and to explore ways in which the STAR and PAL schemes might support each other in this context.

 

Methodology: Collation of more in-depth information on the varying practices across the school, then marketing in DEL to potential PAL Leaders for training, so that the scheme can initially be cultivated there before its adoption is facilitated in DMLES and DELAL.

 

Timeline of activities

Summer 2017: Neil and Michael start identifying modules and convenors who would be amenable to pilot adoption of the PAL scheme in other departments, and determine how to approach introducing the scheme in MLES and DELAL. Neil will also become co-convenor for EN1GC to facilitate running the scheme.

 

October 2017: Neil will start marketing the scheme to gather interest from potential PAL leaders, to start recruitment and training in advance of the 2018/19 session in DEL.

 

2017-18 academic year: PAL Leaders will be recruited and trained in DEL, ready for the scheme to be launched in DEL in the 2018-19 academic year. Staff will also be approached in DMLES and DELAL for discussions about training PAL leaders during the same year, for adoption of the scheme across the school in 2019-2020, identifying particular modules for the scheme.

 

Spring 2018: Update report produced on progress with the scheme.

 

2018-19 academic year: PAL is launched in DEL, and PAL leaders will be recruited and trained in DMLES and DELAL, for the scheme to be launched in the rest of the school in the 2019-2020 academic year.

 

Spring 2019: Update report produced on progress with the scheme.

2019-2020 academic year: PAL is launched in DMLES and DELAL. More modules in DEL are included in the scheme as well if deemed appropriate.

 

Summer 2020: A final report will be produced reviewing the scheme’s implementation across the school.

 

Outcomes: A final report including recommendations for the continuation of the scheme across the school.

 

 

Funding: TLDF and PLanT funding.

 

Dissemination: Through the University T&L blog and the T&L in SLL blog and website.

This project will produce material suitable for conferences and pedagogical articles. The project will also be showcased at an appropriate Away Day.

Reports to SLL Teaching Leadership Group for consideration prior to submission to SBTL.

 

 

For more information, please contact either Dr Neil Cocks (n.h.cocks@reading.ac.uk) or Dr Cindy Becker (l.m.becker@reading.ac.uk).

SLL WORKING PARTY ON STUDENT ENGAGEMENT

 

 

Led by Michael Lyons (SLL Teaching and Learning Support Officer)

 

Members (4-5 other members, at least one from each department in the school and one from central/support services)

Cindy Becker (SLL SDTL)

Marian Fisher (SLL Senior Programme Administrator)

Maddi Davies (SLL/DEL Senior Tutor)

Tony Capstick (DELAL Senior Tutor)

Hugo Tucker (MLES Senior Tutor)

Melani Schroeter (MLES DTL)

Christiana Themistocleous (DELAL DTL)

Chloe Houston (DEL DTL)

Paddy Bullard (DEL Student Engagement Officer)

Central support: Hannah Smithson (RUSU)

Timing: One year, commencing September 2017

 

Purpose: To share and build on practice in how student engagement is approached across the school, including academic engagement and misconduct procedures.

To look at how SSLCs are run within the school in order to gain maximum benefit from our student representation.

 

Methodology: Meetings and discussion.

 

Timeline of activities

Summer 2017 (pre-working party): Michael and Marian to collate university information regarding processes around academic engagement and misconduct, for dissemination to the group prior to a first group meeting.

 

Summer 2017 (pre-working party): Cindy to talk to Hannah Smithson about the role of student reps within schools, and the training and support that is available to them.

 

September/October 2017: first meeting of the group to share ideas and practice and to consider how we might create productive parity across the school.

 

Autumn Term 2017: Fact finding by members of the working party if needed, followed by a further meeting in November 2017.

 

Spring Term 2018: Final report produced, with clear procedures in place, agreed upon by all members of the group.

 

Summer Term 2018: Report to SBTL and a meeting of all administrators and academics involved in these aspects of student engagement so as to ensure that the processes are understood and followed.

 

Outcomes: (If the outcomes of the working party warrant it, an implementation group might be set up.)

A final report – it is unlikely that an implementation group would be needed.

 

Funding: This is unlikely to attract T&L funding.

 

Dissemination: Reports to SLL Teaching Leadership Group for consideration prior to submission to SBTL.

Through the T&L in SLL website and blog.

 

 

 

For more information on the Working Party, please contact Michael Lyons (m.lyons@reading.ac.uk).

SLL WORKING PARTY ON ATTENDANCE AND ATTAINMENT

Led by Paddy Bullard (Student Engagement Officer)

 

Members (4-5 other members, at least one from each department in the school and one from central/support services)

Grace Ioppolo (DEL)

Central support: A colleague from RISIS?

Timing: One year, commencing September 2017

 

Purpose: To look at current attendance patterns across the school, to identify any links between attendance and attainment in one department, and to consider whether attendance is a challenge we need to address and, if so, how best to do so across the school.

 

Methodology: Collation of university statistics on attendance – where does SLL sit in the overall student attendance pattern?

Statistical analysis of attendance and its potential link to attainment in DEL (conducted by MD).

 

Timeline of activities:

Summer 2017 (pre-project): MD arranging for collation of statistics in DEL.

 

By the end of Spring 2018:

– Collation of university data.

– Study of pedagogical research in this area.

– DEL data analysed and report produced.

 

 

Summer Term 2018: SLL-wide activity to ascertain the size and nature of the challenge and practical ways in which it can be overcome.

 

Outcomes (If the outcomes of the working party warrant it, an implementation group might be set up.)

A brief report on attendance and attainment in English Literature (by end of Autumn Term, 2017).

A report at school level with recommendations, if appropriate, for implementation.

 

Funding?

Dissemination: Reports to SLL Teaching Leadership Group for consideration prior to submission to SBTL.

T&L in SLL blog and website.

University Teaching and Learning blog.

For more information, please contact Dr Cindy Becker (l.m.becker@reading.ac.uk).

SLL WORKING PARTY ON INCLUSIVITY AND TEL

Led by Enza Siciliano Verruccio (MLES)

 

Members (4-5 other members, at least one from each department in the school and one from central/support services)

Adam Bailey (University TEL team) and member of the university sub-group on disability and inclusivity (the TEL strand)?

Student member (Eleanor Rush de Jesus did a project on this last year for the module ‘Literature, Language and Media’ and so might be interested).

Michael Lyons (SLL Teaching and Learning Support Officer).

Cindy Becker (SLL SDTL) and member of university sub-group on disability and inclusivity (the TEL strand).

 

Central support: We might be able to ask the TEL team to help with training in this area, once we have decided on the means we plan to adopt across the school.

 

Timing: One year, commencing September 2017

 

Purpose: To find the best way to lodge lectures and other study material on Blackboard, or elsewhere, to meet the needs of all students.

To consider the impact, if any, of the release of lecture material in advance on attendance and engagement.

 

Methodology: An exploration of pedagogical research in the area.

Basing this work on the outcome report of the university sub-group on disability and inclusivity (the TEL strand) – Cindy has details.

A survey of what is available (including efficacy, cost, student access and so forth) across the university (and elsewhere?).

SLL student response to different options, with consideration given to which options might suit different types of learning event across the school. (Survey? Focus groups? RUSU work already carried out?)

 

Timeline of activities:
Outcomes (If the outcomes of the working party warrant it, an implementation group might be set up.)

A report to SBTL, with recommendations for an implementation group if needed.

Showcasing via SLL T&L events and on SLL T&L blog and website.

Direct dissemination to SLL staff of best practice in this area, along with a brief toolkit directing staff to resources available via the university or elsewhere.

This might also require us to offer some drop-in sessions or one-to-one time to provide colleagues with the skills to perform well in TEL.

 

Funding? PLanT funding to pay for a student to work with other students through focus groups and surveys?

TLDF funding for some training provision in this area for SLL as a whole?

 

Dissemination: Feedback to the university working party on disability and inclusivity (in whatever form it takes by the end of this project).

T&L in SLL blog and website.

This working party could produce material suitable for conferences and pedagogical articles.

Reports to SLL Teaching Leadership Group for consideration prior to submission to SBTL.

 

 

For more information, please contact Dr Cindy Becker (l.m.becker@reading.ac.uk).

SLL WORKING PARTY ON SLL STUDY SUPPORT

Led by Catriona McAllister (MLES)

 

Members (4-5 other members, at least one from each department in the school and one from central/support services)

Tony Capstick (DELAL)

Mary Morrissey (DEL)

Jack Tame (Professional Track Facilitator – SLL)

Central support: Study support advisors (Michelle Reid?)

 

Timing: One year, commencing September 2017

 

Purpose:
  1. To audit current study support across the school, including the new ways of working as personal tutors.
  2. To analyse whether we have sufficient provision and whether we can share resources and best practice across the school.
  3. To create a plan for maximising the help we offer students, with any relevant costings and with reference to central support where appropriate.

 

Methodology: Collation and analysis of evidence from departments and central services.

Comparison with other schools and departments.

Engagement with academic literature on best practice.

Timeline of activities:

End of Autumn Term 2017: Report on findings.

 

Spring Term 2018: Implementation phase.

 

 

Outcomes (If the outcomes of the working party warrant it, an implementation group might be set up.)

See above.

 

Funding?

Dissemination: Reports to SLL Teaching Leadership Group for consideration prior to submission to SBTL.

T&L in SLL blog and website.

University Teaching and Learning blog.

 

 

For more information, please contact Dr Cindy Becker (l.m.becker@reading.ac.uk).