Embedding Inclusion within the Academic Environment: Lessons from the SIF II-funded Trinity Inclusive Curriculum Project Pilot.

Introduction

Background and Objectives of the TIC project

TIC Pilot of the Teaching and Learning Self-Evaluation Tool.

Pilot Methodology

Pilot Stages

Stage one: Teaching Observation

Stage two: Resource Review

Stage three: Stakeholder Feedback

Stage four: Tool Completion

Stage five: Creation and presentation of an action report

Outcomes and Lessons Learnt

Lesson 1: Importance of visible buy-in

Lesson 2: Design the tool to be quick, easy, and informative

Lesson 3: Ensure the process is collaborative in nature

Lesson 4: Highlight the progress to date

Future Work on the TIC project

Appendix – Example of Key Suggestions template

Works Cited

Introduction

In October 2008 the Strategic Innovation Fund (SIF II) funded Trinity Inclusive Curriculum (TIC) project was established in Trinity College Dublin (TCD) with the aim of embedding inclusive practices within the teaching, learning and assessment environment of TCD. This was a response to the increase in students entering TCD through non-traditional routes. The TIC project recognises that while non-traditional access routes have enhanced the opportunity for non-traditional students to enter TCD, they have not addressed inequalities in the teaching and learning environment. TIC addresses this issue, aiming to enhance the accessibility of the teaching and learning environment, thereby levelling the playing field for students from a variety of backgrounds.

This paper looks at the work done by the Trinity Inclusive Curriculum project in phase II, which ran between October 2008 and June 2010. It reports on the progress made in embedding inclusive practices within the teaching and learning environment in TCD, and shares the lessons learnt along the way.

Background and Objectives of the TIC project

In 2008 Trinity College Dublin (TCD) obtained funding from the HEA to embark upon a three-year project aimed at embedding inclusive practices within the mainstream curriculum of College. The Trinity Inclusive Curriculum (TIC) project was thus created with the following objectives:

- identifying actual and potential barriers to teaching, learning and assessment,

- identifying enabling strategies for overcoming these barriers,

- introducing these enabling strategies into the mainstream curriculum via

  • the creation of teaching and learning self-evaluation tools to be embedded into College policies and procedures,
  • training and awareness activities,
  • the creation of resources, and

- collaborating with, and disseminating information to, other higher-level institutions.

Following a period of research, in which the current teaching and learning environment in TCD was examined and the TIC project officer engaged in consultation with key stakeholders (e.g. students, academic staff and access staff), a draft teaching and learning self-evaluation tool was created.

TIC has been engaged in a pilot of this self-evaluation tool over the academic year 2009-10, and this paper takes the opportunity to report on that pilot.

TIC Pilot of the Teaching and Learning Self-Evaluation Tool.

The TIC pilot incorporates two phases, phase 2.1 and phase 2.2. Phase 2.1 ran from October to December 2009, with action reports produced in January 2010; it involved five programmes representing all three faculties in TCD, including both level eight and level nine programmes. Phase 2.2 ran through semester two, with action reports produced in early summer, and involved five programmes and two individual modules.

The aims of this pilot are to:

• Develop a user-friendly self-evaluation system that can blend into design, review and quality enhancement systems, and

• Ensure, through consultation with staff and students, that recommendations arising from the tool are realistic and attainable given the resources available.

The pilot therefore sought to assess the content and format of the self-evaluation tool, the process through which it is used, and the feasibility of the suggested actions within the resulting action report, along with the process through which they are enacted.

Pilot Methodology

Pilots included the following stages:

  1. observation of teaching and learning by the project officer,
  2. resource review,
  3. staff and student feedback,
  4. tool completion and feedback, and
  5. creation and presentation of an action report.

Stages one to four, the data-gathering stages, ran concurrently. The final stage followed the data gathering.

Each stage will now be described in detail, including lessons learnt along the way.

Pilot Stages

Stage one: Teaching Observation

A sample of classes and events was observed for each participating programme / module. On average, three or four lectures were observed per programme. Where students engaged in a variety of teaching methods (e.g. lectures, labs, seminars), the TIC project officer endeavoured to attend these. A selection of other events, including orientation events and committee meetings, was also attended where possible.

The project officer engaged in teaching observation with the aim of:

  • Getting a sense, from the students' perspective, of the physical environments in which students on the programme / module learn, so as to discover any difficulties faced (e.g. lighting, acoustics, temperature, available IT equipment),
  • Getting a sense of the variety of teaching and learning methods used across College, so as to better match suggested future actions to promote inclusion with real-world practice, and
  • Observing good practices that can enhance the advice offered to programmes / modules moving into the future.

Stage two: Resource Review

A selection of resources was reviewed for each programme and module involved in the pilot. Resources included programme handbooks, reading lists, handouts, WebCT and programme webpages.

The project officer engaged in resource review with the aim of:

  • Gauging the level of compliance with the Revised College Accessible Information Policy,
  • Getting a sense of the information conveyed to students through different media in College, and
  • Observing good practices that can enhance the advice offered to programmes / modules moving into the future.

Stage three: Stakeholder Feedback

Feedback was sought throughout the pilot from both staff and students within the pilot programmes and modules.

The project officer engaged in stakeholder feedback with the aim of:

  • Ensuring that the questions asked within the self-evaluation tool were relevant and grounded in the real experiences and concerns of stakeholders,
  • Ensuring that the suggestions arising within the action report were feasible and relevant considering the academic environment and resources available, and
  • Requesting instances of good practice that can enhance the advice offered to programmes moving into the future.

As the structure, size and organisation of each programme / module varies greatly, there was no universal method of feedback collection. The project officer adapted the feedback process to suit the needs of each individual programme or module.

Staff Feedback:

The primary source of staff feedback was personnel in senior academic positions within the programmes involved, as these were generally the primary liaisons within the pilot programmes (e.g. Programme (or module) Co-ordinators, Heads of Schools, and Directors of Teaching and Learning). These were the individuals who completed the draft tool, and so they fed back regarding its format and content. It was through these staff members that the majority of feedback regarding the usability and relevance of the tool was received.

Feedback was also sought informally from other teaching staff following lecture observations. Lecturing staff were given the chance to comment on any issues that they felt were significant to their teaching within the programme. Staff commonly took this opportunity to comment on issues within the physical environment that affected their teaching (e.g. acoustics within the classroom, the classroom layout, etc.).

Finally, all staff members were given the opportunity to give feedback on the action report arising from the pilot when it was presented at the programme committee. At this committee meeting, the TIC project officer presented the tool and action report, and explained the purpose behind the pilot. Feedback on both form and content was then welcomed from all staff.

Student Feedback:

As each programme / module varied in both size and structure, there was no one universally acceptable method of student feedback. Instead the project officer used a variety of methods in response to each programme’s individual needs.

Qualitative feedback:

For each programme / module involved in the pilot the project officer conducted a semi-structured interview with some or all of the student representatives. Student representatives were contacted in advance, and asked to gather feedback from their classmates regarding their experiences of the teaching and learning environment.

One programme involved in the pilot used peer mentoring as a source of student orientation and support, and so the project officer arranged to meet with peer mentors as well as representatives on this programme.

Finally, one programme, involving only eleven students, was too small for a survey to be administered successfully. The project officer chose instead to meet a sample of students from this programme.

Survey data:

The TIC project officer aimed to conduct a student survey with each programme / module involved in the pilot. Student surveys sought student perspectives on teaching, learning and assessment methods along with facilities within the College, and the physical environment.

Surveys were conducted either online using SurveyMonkey or in person during class. Conducting surveys in class guaranteed a higher response rate and so was the preferred method. However, while it was possible to reach all students within modules, and within some of the post-graduate programmes, in class, this was not the case for undergraduate programmes, as they involved multiple year groups.

Where it was impossible to reach all target students within the one class, an online survey took place. Online surveys achieved a response rate of between 20% and 30%.

There were a small number of programmes where no survey took place as part of the pilot, because these programmes had arranged their own surveys during the academic year and we were anxious to avoid survey fatigue amongst students, which would lead to disengagement and unreliable survey data. For these programmes, the information collated by the programmes themselves was analysed instead.

Stage four: Tool Completion

The draft self-evaluation tool was sent to the primary liaison within each programme / module involved in the pilot. This person was asked either to complete the tool themselves, or to send it on to the relevant personnel within their area.

While the liaison was given the option of dividing the sections of the self-evaluation tool between staff members, the majority chose to complete the entire tool themselves. Exceptions to this were cases where the primary liaison was an administrative member of staff; administrators always chose to pass the tool on to a more senior academic member of staff. In addition, the placement section was almost always completed by the relevant programme placement personnel.

The tool was completed in a variety of modes. Some chose to complete it alone and then to contact the project officer; others chose to complete it in the presence of the project officer at a pre-arranged meeting. Of those who completed it alone, some chose to complete the electronic version and some a printed version. Those who completed the tool in printed format reported the greatest difficulty, as this method removed access to the accompanying notes and explanations.

Once the tool was completed the project officer arranged to meet with the liaison to obtain feedback on the process. Feedback was sought on:

- Ease of completion,

- Usefulness of accompanying guidance notes, and

- Areas of ambiguity where more guidance is needed.

Stage five: Creation and presentation of an action report

Once the data was gathered (stages one to four), an action report was created for each pilot programme / module. The first task was to create a consistent template that could be adapted for each pilot volunteer. Once the template was finalised, reports were written for each volunteer.

Format of Action Report:

The action report began with an introduction that set the context and rationale for the pilot and subsequent suggested actions. The main body of the report contained, in tabular format, the questions raised in the pilot, the response given by the programme and data collected by the project officer, and the suggested future actions. A table of key actions was then collated. This table had two blank columns for the programme to complete with the time frame for actions and the person responsible.

It is important to highlight areas of current good practice as well as areas for improvement, and as such an appendix was included that highlighted examples of good practice either observed during the pilot or reported by staff and students within the programme. The aim of this section was to encourage and motivate staff by showing that much work had already been completed on the path towards inclusion. It also provided an opportunity to highlight the good practices of individual lecturers so that their colleagues could apply these to their own teaching practices.

Once the first draft of the action report was completed, a meeting was held between the primary pilot liaison and the project officer to discuss the report. This meeting allowed for a discussion of any recommendations that could be perceived as problematic or ambiguous before wider circulation. It also allowed for the highlighting of any misunderstandings, and the rewording of sections if necessary. Following this meeting, the project officer finalised the report and forwarded it to the programme / module liaison.

For programmes, the action reports were then circulated and presented at the next programme committee meeting. This step did not occur for module pilots as there was no corresponding committee.

Following the programme committee meeting (or the final meeting with the project officer for modules), the two blank columns in the table of key suggested actions at the back of the report were completed by the programme / module (see Appendix). These columns requested a timeline for each action and the agreed person responsible. It has been agreed that once these key actions are underway, the project officer and pilot liaison will meet again to discuss the process of implementation (e.g. what was viable and what was not, where difficulties lay, and any advice that could be passed on to future programmes / modules engaging with the tool).

Outcomes and Lessons Learnt

Lesson 1: Importance of visible buy-in

The visible buy-in of senior members of the academic staff was extremely important to the smooth running of the pilot. Where visible buy-in from these individuals was lacking, the whole process was significantly delayed and there was considerable disengagement.

Throughout the pilot it was noted that the level of engagement from lecturers varied. In some pilot programmes, lecturers showed great enthusiasm to contribute to the pilot. This included reorganising classroom observations when the project officer could not attend initially agreed times, e-mailing lecture handouts to the project officer and granting access to WebCT pages.

Lecturing staff on other programmes displayed less enthusiasm. Lecturers sought to avoid classroom observation and expressed reluctance to share resources.

On analysis, it was noted that within the programmes with more active engagement, the primary liaison was generally a senior member of the lecturing staff (e.g. head of school, programme co-ordinator or director of teaching and learning). These individuals liaised between the TIC project and the programme, seeking volunteers for classroom observation and gathering materials for resource review. They were also often the first to volunteer for lecture observation.

Within the programmes with more reluctant engagement, the primary liaison was often an administrative, non-teaching member of staff. These individuals, while seeking volunteers for the stages of the pilot, could not volunteer themselves. This, we suspect, gave rise to reluctance amongst those members of staff asked to volunteer for classroom observation, and possibly to the feeling that they were being singled out unfairly for external scrutiny.

Therefore, we conclude that securing buy-in from programmes and schools requires the visible enthusiasm and engagement of senior members of academic staff, along with evidence that they are willing to hold themselves to the same scrutiny as more junior members of staff.