Formative Assessment 2005 – 2006

Concurrent Pilots 2005 – 2008:

Freshman Integration Tracking System and

Faculty Academic Advising

Research & Planning Department

September 2006

Freshman Integration Tracking System and Faculty Academic Advising Concurrent Pilots 2005 – 2008

Formative Assessment 2005 – 2006

Introduction

The concurrent pilots of the Freshman Integration Tracking (FIT) System and the Faculty Academic Advising (FAA) Baseline Method began in the Fall of 2005. Development of the FAA method and the Faculty Academic Advising Handbook, along with preliminary testing of the 2 FIT System surveys, occurred in the Spring/Summer of 2005 under the guidance of the Red River College (RRC) Academic Advising Working Group (AAWG). A total of 9 programs enrolling approximately 315 students volunteered to participate.

Preliminary Testing: Methodology

The focus of preliminary testing was technical: resolving issues related to using the FIT software. This included assessing the administration requirements for scheduling and completing the 2 surveys (Partners in Education Inventory and Student Experience Inventory) and for generating the student, faculty, and counseling reports produced by the software. Two sections of first-year Culinary Arts students participated in the testing process. The College’s Computer Services and Research and Planning departments worked closely with Humber College, the developer of the FIT System, to resolve identified issues.

Preliminary Testing: Results

Software:

Preliminary testing identified the following problems:

  1. scanning the survey booklets required a conversion program and the purchase of Scan Tools Plus software.
  2. the FIT software did not accept the data files, consistently returning a ‘bad data’ message despite various attempts to load the data.

The first problem was solved easily: Humber College provided the conversion program and RRC purchased the new software. The second problem took longer to resolve. Computer Services traced it to a discrepancy in the Non-Ontario load mask for the FIT software, and numerous exchanges through July and August 2005 culminated in Humber College staff re-programming the load masks for the 2 surveys. Once this was done, the trial data loaded easily into the software using the instructions in the FIT Manual.

Generating Reports:

The reports generated during preliminary testing led to the following insights (an illustrative data-checking sketch follows the list):

  1. Where double entries are made in a single row or column, the software responds with an asterisk (*) indicating a problem and bad data. These entries are readily corrected by consulting the relevant student’s survey booklet and entering the correct information in the data file.
  2. Where questions are not answered, the software leaves a blank. These blanks reflect students’ choices about which questions they are willing to answer and are left as is.
  3. Where a column is skipped in a field, a blank is also left. This is only problematic when it occurs in the student name or student number fields, as these are the primary student identifiers needed to return the student’s Personalized Learning Plan (PLP) following each survey. Corrections can be made by comparing the student’s survey booklet to the data and editing the data before it is loaded, or after loading by using the Edit feature outlined in the FIT Manual.
  4. When a student answers “no” to the survey consent, the software produces the data and a PLP, but leaves the student’s name off the Faculty Report.

Once the survey data is loaded into the relevant survey sections of the software, the reports are generated using the FITS Report feature. Initial problems in producing the PLPs resulted from the student services paragraphs being too long: the software requires that all RRC student services paragraphs, from which the PLPs are derived, fill only one page. This necessitated editing the RRC paragraphs, creating a “no help needed” paragraph, and re-entering the revised paragraphs into the FIT COMPAS Table Maintenance. Once these changes were made, the software quickly generated the PLPs, Faculty Reports, and Counsellor Reports.

Distributing Reports:

During preliminary testing, original decisions regarding the editing and distribution of Faculty Reports had to be re-evaluated. This arose from 2 concerns:

  1. the scalability of producing edited Faculty Reports; and
  2. a discrepancy between the information on the PLPs and the Faculty Reports.

With regard to scalability, producing edited Faculty Reports was a time-consuming process even with a small sample. With the larger pilot sample, producing edited Faculty Reports would have negatively affected the turnaround time for returning reports. As the Student Information System (SIS) already provided most faculty with access to more student information than is contained in the FIT reports, it was decided to give each faculty academic advisor a Faculty Report containing all the FIT scores for their program.

During preliminary testing, a comparison between the Faculty Report and the students’ PLPs revealed some incongruities with the potential to negatively affect faculty academic advising. In general, Faculty Reports contained more information than PLPs. The Faculty Report contained the full range of scores under each need area, indicated by a dot (.) for low, an M for moderate, and an H for high. The students’ PLPs contained only the paragraphs triggered by scores in the upper end of the moderate range and in the high range for each need area, or the no-help-needed paragraph. To prevent confusion in the faculty academic advising process, a decision was made to give each faculty academic advisor copies of the PLPs for the students they had been assigned to advise.
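
As a purely hypothetical illustration of this difference (the FIT System’s actual score scale and thresholds are not documented here), the sketch below shows how a single need-area score could appear as a low/moderate/high symbol on the Faculty Report while triggering a help paragraph on the PLP only in the upper-moderate and high ranges. The cut-off values and need areas are invented for the example.

```python
# Hypothetical illustration of the Faculty Report / PLP difference.
# The cut-off values and need areas are invented; they are not the
# FIT System's real scoring rules.

def faculty_symbol(score):
    """Faculty Report shows the full range: '.' = low, 'M' = moderate, 'H' = high."""
    if score >= 75:
        return "H"
    if score >= 40:
        return "M"
    return "."

def plp_paragraph(score, help_text):
    """PLP prints a help paragraph only for upper-moderate and high scores."""
    return help_text if score >= 60 else None

# Example scores for three invented need areas (0-100 scale assumed).
# A PLP with no triggered paragraphs would carry the "no help needed" paragraph.
scores = {"study skills": 85, "finances": 50, "career planning": 30}
for area, score in scores.items():
    paragraph = plp_paragraph(score, f"Help is available with {area}.")
    print(f"{area}: Faculty Report = {faculty_symbol(score)}, PLP = {paragraph or '(none)'}")
```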

Stage 1 Pilots 2005 – 2006: Methodology

Both FIT surveys were administered in booklet format. The Partners in Education Inventory (PEI) survey was to be administered during the first week of classes, either the week of August 29 – September 2/05 or the week of September 5 – 9/05. The Student Experience Inventory (SEI) was to be administered 7 or 8 weeks later, either the week of October 10 – 14/05 or the week of October 17 – 21/05. Designated Faculty Contacts for each volunteer program were responsible for distributing the survey packages to relevant faculty and for returning the completed booklets to Research and Planning. Research and Planning was to scan the completed booklets, produce the reports, and return the completed reports to the Faculty Contact and to the program Chair within a week of receiving the booklets. The Faculty Contacts were then to distribute the reports to faculty academic advisors and to ensure the PLPs were returned to students. The FIT reports were intended to act as a trigger for faculty academic advisors to initiate contact with each student they had been assigned to advise. Based on Vincent Tinto’s research on student retention, an emphasis was placed on initiating student contact within the first 16 weeks of class.

Stage 1 Pilots 2005 – 2006: Results

Surveys:

For the most part, the pilot for the PEI survey went smoothly and the software performed well. The only problem encountered was a small batch of survey booklets with misaligned registration marks, which prevented the booklets from scanning. The problem was resolved by copying the responses into fresh booklets, which then scanned properly.

No problems were encountered in packaging the survey booklets for each pilot program or in distributing the booklets to the Faculty Contacts. Most surveys were administered the week of August 29/05 or the week of September 3/05. Faculty Contacts quickly returned the completed surveys to Research and Planning.

The FIT software generated all the reports smoothly and the results were packaged for each program. The FIT reports were returned to 4 of the 9 programs within 1 week. Delays in turnaround time for the remaining programs resulted from the time needed to acquire lists matching the faculty academic advisors with their assigned students. The last batch of PEI reports was delivered on October 4/05.

From a technical perspective, the SEI pilot went smoothly. The surveys scanned properly and the software performed as expected. For the SEI, additional reports were distributed to Student Services and to faculty. These included:

  1. Mid-Term Attitude Report
  2. Leavers Report
  3. Contact List of Student Needs

One unforeseen technical problem was that the legend did not print correctly on the Mid-Term Attitude Report.

In terms of the number of completed surveys, more PEI surveys were completed than SEI surveys. Table 1 outlines the survey participation rates; a brief check of the percentage calculations follows the table.

Table 1: Freshman Integration Tracking (FIT) System Pilot 2005 - Survey Participation Rates

Program Code / Program Name / Number of Students in Course (approximate) / Number of Completed PEI Surveys / PEI % (of smaller estimate) / Number of Completed SEI Surveys / SEI % (of smaller estimate)
ADQ / Mechanical Engineering Technology Diploma / 45 – 49 / 45 / 100.0% / 39 / 86.7%
AHC / Civil/CAD Technology Diploma / 27 – 28 / 26 / 96.3% / 23 / 85.2%
ARL / Hospitality and Tourism Management Diploma / 43 / 43 / 100.0% / 37 / 86.0%
ASB / Business Administration Diploma / 80 / 59 / 73.8% / 12 / 15.0%
ASQ / Child and Youth Care Diploma / 28 / 24 / 85.7% / 22 / 78.6%
AST / Early Childhood Education Diploma / 30 – 33 / 30 / 100.0% / 27 / 90.0%
ATI / Aboriginal Self-Government Diploma / 20 / 13 / 65.0% / 8 / 40.0%
IP / Greenspace Management Diploma / 15 / 14 / 93.3% / 13 / 86.7%
IZ / Woods Product Manufacturing Diploma / 19 / 17 / 89.5% / 17 / 89.5%
Total / 307 – 315 / 271 / 88.3% / 198 / 64.5%
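
As noted in the column headings, each rate in Table 1 is the number of completed surveys divided by the smaller end of the program’s enrolment estimate. The short Python sketch below simply recomputes the rates and totals from the figures transcribed out of the table.

```python
# Figures transcribed from Table 1: (smaller enrolment estimate,
# completed PEI surveys, completed SEI surveys) for each program code.
programs = {
    "ADQ": (45, 45, 39), "AHC": (27, 26, 23), "ARL": (43, 43, 37),
    "ASB": (80, 59, 12), "ASQ": (28, 24, 22), "AST": (30, 30, 27),
    "ATI": (20, 13, 8),  "IP":  (15, 14, 13), "IZ":  (19, 17, 17),
}

total_n = total_pei = total_sei = 0
for code, (n, pei, sei) in programs.items():
    # Each rate uses the smaller enrolment estimate as the denominator.
    print(f"{code}: PEI {pei / n:.1%}, SEI {sei / n:.1%}")
    total_n, total_pei, total_sei = total_n + n, total_pei + pei, total_sei + sei

# Totals: 307 students (smaller estimates), 271 PEI (88.3%), 198 SEI (64.5%).
print(f"Total: PEI {total_pei / total_n:.1%}, SEI {total_sei / total_n:.1%}")
```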

Feedback:

The assessment plan for the 2005 pilot was designed to use the pilot as an opportunity to build a learning community around faculty academic advising. To this end, a number of focus groups (3 with faculty and 2 with students) were planned. In addition, Student Services was to gather data on students who sought help based on the PLPs or faculty academic advising. Table 2 outlines the Student Services data for self-referrals based upon students receiving FIT survey reports.

Table 2: Freshman Integration Tracking (FIT) System Pilot 2005 – Cumulative Self-Referral Rates

Services Required / Number of Students / Self-Referral Rate*
Counseling / 17 / 7.8%
Tutoring / 53 / 24.4%

* Calculated from a total sample size of 217 students surveyed.

The number of focus groups proposed in the assessment plan for both students and faculty proved too ambitious. Although personalized invitations were emailed to about one-third (120) of the participating students, only 3 students accepted the invitation and participated in a focus group (Appendix 1). From the perspective of the pilot assessment, the low participation rate (2.5%) for the student focus group makes it difficult to extrapolate from the participants to the larger pilot sample. However, 4 items from the students’ responses were especially noteworthy:

  1. the PLPs helped to correctly identify areas where the students felt they needed help;
  2. all 3 had met with a faculty academic advisor and seemed to regard this as a positive experience;
  3. the surveys were perceived as too long and questions were seen as repetitive;
  4. the students appeared to be unaware of the full range of services available through Student Services.

Faculty completed a questionnaire (Appendix 2) and also participated in 1 focus group. As with the student focus group, the participation rate was low (13.7%), which makes it difficult to extrapolate the responses to the larger pilot sample. However, a number of items were noteworthy:

  1. the timing of the first survey needed to be re-considered as the first week of classes is very stressful for first-year students;
  2. it takes longer than anticipated to complete the surveys;
  3. the wording of the Personalized Learning Plans (PLPs) was seen as impersonal and detached; and
  4. the organizational requirements posed something of a challenge. As one participant noted: “This was an organizational task that took some time to coordinate. Simply assigning students to faculty was a challenge. Meeting with students was an additional challenge. It seemed that the students who most needed help were the hardest to set meeting times.”

Change Management

From an organizational perspective, the results of the Stage 1 Pilots have to be considered in terms of change management. The FIT System, the Faculty Academic Advising Handbook, and faculty academic advising form part of a larger strategy aimed at improving student retention at the College. They represent a challenge because they require movement towards a new way of seeing and acting in relation to student retention. From a change management perspective, the easiest kind of change to manage is technical change, because technical problems can be solved using information and experience currently at hand. Adaptive change is harder to manage because it requires problem solving and adapting to many unknowns until the desired goal is reached (Fullan, 2005). It is for this reason that a decision was made to schedule the FIT System and Faculty Academic Advising Concurrent Pilots over 3 successive years prior to full college-wide implementation.

Technical Change:

Much of the information gathered from the Stage 1 Pilots relates to the technical aspects of the pilots, particularly to the FIT System. Based on the information and feedback received in 2005 – 2006, the following changes will be made for 2006 – 2007:

  • only the PEI survey will be administered;
  • the PEI survey will be administered in week 2 or 3 of the program;
  • the PLP has been re-named; the student report will now be called How Can We Help You?;
  • a linked database has been created which will allow FIT reports to be sent to program Chairs, faculty contacts, faculty academic advisors, and Student Services electronically.

The How Can We Help You? report will continue to be paper-based and will be sent to students in individual envelopes. Distribution of the student reports will proceed as before.

Organizationally, another technical change has occurred with the approval of Policy B12: Faculty Academic Advising in September 2006 (Appendix 3). This policy helps establish the organizational infrastructure needed for implementation of faculty academic advising in support of student retention across all program areas.

Adaptive Change:

In reviewing the focus group responses from students and faculty, it is noteworthy, but not surprising, that the emphasis tended toward a critique of the FIT System. One reason for this is that the FIT System is highly tangible, which makes it easier to recognize and address. What cannot be stressed too strongly, however, is that the FIT System is not a student retention strategy. It is a tool intended to trigger a student retention strategy; in the case of this pilot, that strategy is faculty academic advising. What has to be remembered from Tinto’s research is that faculty contact was the single organizational variable most important in helping a student decide whether to persist with his/her education or to leave (Tinto, 1993). Tinto’s research was instrumental in the design of the RRC Faculty Academic Advising Baseline Method, a streamlined model of faculty academic advising with 3 functional areas: proactive contact, faculty academic advising, and referral (Figure 1).

Figure 1: Faculty Academic Advising Baseline Method
  1. Proactive contact during the first 8 weeks of attendance using the FIT tools, if available. Research shows that the first semester (16 weeks) is a critical period for students in deciding whether to persist with or abandon their educational goals (Tinto, 1993).
  2. Faculty academic advising focusing on the knowledge, professional/technical skills, and essential employability skills specific to the program and the faculty’s expertise. Faculty contact is the single most important factor in influencing the positive integration and persistence of adult learners (NVCC, 2001).
  3. Referral to Student Services for services and expertise outside the scope of the academic department. The baseline method carries with it no expectation that faculty bring anything beyond their program-specific expertise to the student advising process. Students needing other kinds of academic advising or counseling should be referred to Student Services. Or, in situations where students need assistance with program administrative matters outside of the jurisdiction of the faculty member, referral can be made to the program Chair or Coordinator.
Source: RRC Faculty Academic Advising Handbook, 2005, page 14.

As faculty noted in their focus group, organizing the faculty advising process was an adaptive challenge. This was probably most true in programs which did not have an articulated process for faculty academic advising. The organizational learning needed to address the challenge can only come from working through the problems associated with putting a new student retention process in place. It is for this reason that the emphasis of the Stage 2 Pilots 2006 formative assessment will be on faculty academic advising. To this end, the faculty contacts and faculty academic advisors will be introduced to reflective practice as a contributor to action-oriented research and to the development of a community of practice as an aid to organizational learning (Fullan, 2005; Tinto, 2002). The focus of reflective practice will be on faculty academic advising, with practitioners from the 9 volunteer programs asked to reflect upon and gather information related to:

  1. the planning and organizing required to implement faculty academic advising;
  2. the staff development needs of faculty academic advisors; and
  3. the barriers to implementing faculty academic advising and how to address them.

Participants in the Stage 2 Pilots will form the starting point for a college-wide community of practice related to student retention. As part of the formative assessment process for 2006 – 2007, events will be scheduled to reflect upon and consolidate what individual program faculty have learned. These will be organized by Research and Planning over the 2006 – 2007 academic year. The intention is to produce a guidebook for implementing faculty academic advising more broadly at RRC and to develop a cadre of experienced practitioners who can lead this process and train new RRC practitioners.