Curriculum Online Guides and Standards
Readiness and Viability Assessment Research
Background
As part of the Red River College (RRC) Curriculum Online Guides and Standards (COGS) Project, the Readiness and Viability Assessment Task Team (Pam MacAskill, Craig Edwards, Freda Robinson, and George Siemens) undertook the development of project deliverable 2(1): readiness and viability assessment. This deliverable addresses the analysis of the readiness and sustainability of any proposed online course/program development. Readiness must be assessed across a range of factors, from the departmental/divisional curriculum readiness level to the return on investment based on labour market analysis (Council on Post-Secondary Education Systems Restructuring Funds 2002 – 2003 Proposal, May 27, 2002, pages 2 – 3). The Scope Statement for the Readiness and Viability Task Team identified the need to conduct research in three areas: academic institutions, literature and websites, and policy. The purpose of the research was twofold:
- to inform the design of the Readiness and Viability Assessment instrument, and
- to ensure that the instrument supports current college policy or to identify any policy implications resulting from the design and use of the instrument.
Institutional Research
Based on its Scope Statement, the Readiness and Viability Assessment Task Team conducted a three-stage research process. The first stage involved a plan to conduct an environmental scan of 11 colleges and 1 dedicated distance education university. The questions selected for the environmental scan were:
- Does your college have a defined process for deciding whether a course or program should go online?
- Does your college use an assessment instrument for deciding whether a course or program’s curriculum is ready for online conversion?
- What factors do you consider when you decide on curricular readiness?
- Does your college have an assessment instrument for determining the viability of converting a course or program for online delivery?
- What factors do you consider when determining viability?
- How does your college prioritize courses or programs for online development and delivery? Is this process formalized?
- Are there any drivers, external or internal, for the direction your organization is taking in online development?
- If you were to construct a continuum for faculty moving to online course development and delivery with 1 representing the early adopters and 10 representing the critical mass of faculty, where would you place your college on this continuum?
- What strategies does your college use, or is it considering, to create sustainable online development?
This process yielded responses from 7 colleges and the 1 dedicated distance education university, for a response rate of 67%. The actual responses from each institutional contact are recorded on SharePoint.
Information garnered from this research indicated that none of the institutions contacted had a formal Readiness and Viability Assessment instrument. Two colleges – Mount Royal College (Alberta) and Seneca College (Ontario) – had internal grant proposal forms which contained sections for support data as part of the award assessment process. Of these, Mount Royal College’s TIPP II E-learning Planning Guide for Faculties, Schools and Centres (2002) was the most detailed. The guide contains 4 parts: Section I – Situational Analysis; Section II – Area Specific Goals and Objectives; Section III – Obstacles and Required Resources; and Appendix A – Guidelines for Selection of Candidate Programs and Courses for Delivery by E-learning. Mount Royal’s guide is quite comprehensive, requiring a careful analysis of the program (content, faculty, technology) and the audience proposed for e-learning development, as well as a review of how the proposed e-learning initiative fosters achievement of the college’s strategic goals and those of Alberta Learning. The guide also reflects Mount Royal College’s preference for funding e-learning development of entire programs.
The Seneca College Faculty of Continuing Education and Training’s Proposal Form for IBL Development focused largely on the market potential of an e-learning project. The data required to complete this form covered 7 topic areas:
- Market Characteristics, with sub-set information related to learner needs, size of the market, geographic scope, competitive advantage, student readiness and learning styles, and transferability/accreditation;
- Total Program, requiring an assessment and rationale related to the value of converting an entire program versus a course or group of courses;
- Cost Effectiveness, requiring an assessment of the cost effectiveness of the proposed project;
- Faculty Readiness, providing a description of the faculty’s readiness to undertake the proposed e-learning project, including attitudes, preparatory training, roles, and the number of faculty required;
- Subject/Program Fit to Online/Mixed Mode Delivery or Development of Learning Objects;
- Partnership Opportunities; and
- Implications for In-class Delivery, requiring an assessment of the impact e-learning or mixed-mode delivery would have on the sustainability of in-class delivery.
Since continuing education operates as a revenue generating agent at Seneca, it is understandable why this form focuses on market analysis. These considerations are equally important for RRC, where both the Continuing Education and Distance Education departments operate as revenue generating agents for the college.
The remaining 6 institutions interviewed indicated that the processes used to assess the readiness and viability of an online project were more informal. In these institutions, decisions regarding the selection of a course or program for online development were usually made by a committee headed by the Chair or Dean of a particular school. The criteria used for decision-making were related to the strategic goals of the institution and the Chair or Dean’s assessment of how a proposed e-learning development project might contribute to achieving these goals. All institutions reported internal and external drivers influencing the decision-making process. Chief among these were accessibility, contribution to the achievement of provincial priorities, and revenue generation.
Literature and Web Research
The second stage of the Readiness and Viability Assessment Task Team’s research involved a broader search for readiness and/or viability assessment instruments documented either in the literature or on the web. This research indicated a split in orientation between the academic and business sectors. The review of academic institutions found two types of readiness assessments:
- Student readiness for online learning. An example of a very comprehensive online assessment of this type can be found on the Georgia Tech website (http://alt.usg.edu/sort/index.html). Georgia Tech’s SORT (Student Online Readiness Tool) provides prospective online students with detailed feedback on a number of factors related to success in online learning. The items covered relate to personal preferences, lifestyle, learning styles, academic preparedness, technical literacy, and access to specified technology. RRC already has an instrument to help students assess their readiness for online learning; it is available to students through Distance Education and to internal faculty as part of the WebCT Resource developed by the Program and Curriculum Development department.
- Institutional readiness for e-learning. The institutions with readiness assessments of this type on their websites were already engaged in e-learning. The readiness assessment was prompted by a need to develop a comprehensive e-learning strategy as part of the institution’s strategic planning process, to evaluate expenditures on e-learning by assessing the factors for or against adopting common e-learning platforms and standards for greater interoperability, or, in some cases, both. An example is the University of North Carolina (UNC) E-Learning Readiness Assessment (eLRA) Project (2001) conducted by PriceWaterhouseCoopers (http://www.northcarolina.edu/content.php/ir/elearning/elra_report.htm). This report identified “e-learning critical success factors based on industry best practices including: technical infrastructure requirements, e-learning functionality and support services requirements” (page 11). Of relevance to the COGS Project was the identification of the following items as Critical Success Factors under e-learning support:
- Ensuring courses are based on accepted content standards; to be effective, these standards must be enforced;
- Developing articulation policies that enable cross-campus programs while maintaining academic standards at each campus; and
- Determining e-learning programs and content based on market analysis (pages 15 – 16).
In the business sector, there were a number of online readiness assessment instruments which examined a range of interrelated factors deemed critical to the successful adoption of e-learning in the corporate training sector. These instruments provided the most inspiration for the Readiness and Viability Assessment Task Team, as they covered many of the factors needed for instructional design as well as the market readiness for an e-learning product. What made these instruments instructive was their design. On the whole, they reflected the need to gather a large amount of data as quickly and efficiently as possible. Most were designed to be scored online, with feedback provided on the readiness level of the various factors deemed critical to the corporate training environment.
As an example, The E-Learning Readiness Survey: 20 Key Questions You and Your Organization Must Answer About the Sustainability of Your E-Learning Efforts (Version 1.0, 2000) by Marc Rosenberg, PhD, provides an instrument with 7 areas for consideration, a 5-point rating scale for the items under each area, and a score interpretation guide. This instrument can be scored online or in person using the scoring guide. The 7 areas covered by Rosenberg’s instrument are: 1 – Your Business Readiness, 2 – The Changing Nature of Learning and E-learning, 3 – The Value of Instruction and Information, 4 – The Role of Change Management in Building a Durable E-learning Strategy, 5 – How Training Organizations Must Reinvent Themselves to Support E-learning, 6 – The E-learning Industry, and 7 – Your Personal Commitment.

Another e-learning readiness assessment, developed by worknowledge, covers similar items under different headings: I – Employee Readiness, II – Management Readiness, III – Financial Readiness, IV – Technical Readiness, V – Environmental Readiness, and VI – Cultural Readiness. worknowledge’s E-learning Readiness Assessment was designed for online scoring and feedback, but the scoring scale was not identified on the website.

Samantha Chapnick’s E-learning Readiness Assessment (Version 1.0, 2001) reflects her background in needs assessment and uses a readily understandable 3-point scoring scale. This instrument gathers data in 8 areas, with a score provided at the end of each section. The 8 sections cover: 1 – Psychological Readiness, 2 – Sociological Readiness, 3 – Environmental Readiness, 4 – Human Resource Readiness, 5 – Financial Readiness, 6 – Technology Readiness, 7 – Equipment Readiness, and 8 – Content Readiness. Feedback is provided for each readiness factor and there is also an overall e-learning readiness score. The design of this instrument allows for the identification of specific problem areas as well as suggestions for remediation.
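The general pattern these instruments share (items grouped into readiness areas, a common rating scale, per-section scores, an overall score, and threshold-based feedback) can be sketched briefly. The Python sketch below is illustrative only: the section names, items, scale, and feedback thresholds are assumptions chosen for demonstration and do not reproduce the Rosenberg, worknowledge, or Chapnick instruments.

# Illustrative sketch of a sectioned readiness assessment scored in the general
# manner described above. All section names, items, and thresholds are invented
# for demonstration purposes.

from dataclasses import dataclass

@dataclass
class Section:
    name: str
    items: list[str]           # questions rated on the same scale

# Hypothetical sections loosely modelled on the analyses discussed in this report.
SECTIONS = [
    Section("Learner Readiness", ["Access to technology", "Self-direction"]),
    Section("Faculty Readiness", ["Attitude toward e-learning", "Preparatory training"]),
    Section("Market Viability", ["Documented demand", "Competitive advantage"]),
]

SCALE_MAX = 3                   # e.g. a 3-point scale: 1 = low, 3 = high

def section_score(ratings: list[int]) -> float:
    """Average rating for one section, expressed as a percentage of the scale."""
    return 100.0 * sum(ratings) / (len(ratings) * SCALE_MAX)

def feedback(score: float) -> str:
    """Threshold-based feedback text; cut-off points are purely illustrative."""
    if score >= 75:
        return "Ready - proceed to detailed planning."
    if score >= 50:
        return "Partially ready - address the weakest items before proceeding."
    return "Not yet ready - significant preparation required."

def assess(responses: dict[str, list[int]]) -> None:
    """Print per-section and overall results for a completed assessment."""
    scores = []
    for section in SECTIONS:
        score = section_score(responses[section.name])
        scores.append(score)
        print(f"{section.name}: {score:.0f}% - {feedback(score)}")
    overall = sum(scores) / len(scores)
    print(f"Overall readiness: {overall:.0f}% - {feedback(overall)}")

# Example: ratings keyed by section name, one rating per item on the 1-3 scale.
assess({
    "Learner Readiness": [3, 2],
    "Faculty Readiness": [2, 1],
    "Market Viability": [3, 3],
})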
Policy Research
The third stage of the Readiness and Viability Assessment Task Team’s research focused on a review of policies relevant to the development of the proposed instrument. The focus of this research stage was internal college policies. This was in keeping with the COGS proposal which identified “recommendations on policy issues” related to each deliverable as one of the project inclusions (Council on Post-Secondary Education Systems Restructuring Funds 2002 – 2003 Proposal, May 27, 2002, page 3).
The research process identified one policy which fit with the intent of Readiness and Viability Assessment, although that was not its specific focus. The policy identified was Policy B3 – New Academic Program Approval (January 18, 1994). Policy B3’s relevance for Readiness and Viability Assessment rests in its delineation of a formal process for new program or program expansion approvals. The specific steps outlined in Policy B3 are now dated and have been replaced by templates which can be downloaded internally through MS Outlook Public Folders (under Academic Vice-President and Financial Services). These templates are also readily accessible from the Council on Post-Secondary Education (COPSE) website.
Although not specifically directed at Readiness and Viability Assessment for online program development, the New Academic Program Approval policy asserts the need for market research to document the need for, and the feasibility of investing in, the development of a new program or the expansion of an existing program. The templates also address the need to establish a link between provincial priorities and new programs, i.e., how the new program will contribute to the achievement of provincial priorities. This underscored the need to address market issues in the Readiness and Viability Assessment and re-affirmed the focus on identifying both readiness and viability. Implicit in both Policy B3 and the COPSE templates is an understanding that identifying a need for a new initiative is not sufficient cause for supporting the initiative unless it can be proven that the initiative is both feasible and beneficial. Since investing in online course or program development is an expensive undertaking, the Task Team considered such development an institutional investment, one which should be approached with similar care and forethought.
Design Considerations
Based on the results of its research, the Readiness and Viability Assessment Task Team decided to design an instrument which would initially be paper-based, but which could be converted for online administration. Program and Curriculum Development’s experience with response rates for online forms indicates that paper forms/tools currently elicit better response rates than their online counterparts. This experience is supported by research conducted by the college’s Research and Planning department. Designing the Readiness and Viability Assessment for current use and future conversion to an online tool addressed the preference of current staff while recognizing the need to evolve the instrument to one which provides rapid online feedback.
Since many of the critical success factors identified in the research related to instructional design, it was quickly evident that the Readiness and Viability Assessment instrument would have to address instructional design issues. Thoughtful instructional design is also necessary to address the quality indicators identified by the Canadian Recommended E-learning Guidelines. As one of RRC’s goals is to contribute online products to Campus Manitoba and Campus Canada, addressing instructional design seemed to be a pivotal area of focus.
The decision to include market analysis in the design of the instrument was vital since all of RRC’s alternative delivery arms (Continuing Education, Distance Education and Contract Training) operate as revenue generating agents for the college. Having data to establish market support for an online course or program is a necessary precursor to considering alternative delivery. It also helps to address sustainability issues.
Design
The design of the Readiness and Viability Assessment instrument (version 1.0, June 2003) has four interrelated sections:
- An introductory section covering the purpose of the Readiness and Viability Assessment;
- An identification section which names the Assessor and briefly identifies the scale of the proposed online project;
- An analysis section covering 5 interrelated components – Learner Analysis, Content Analysis, Faculty Analysis, Market Analysis, and Technology Analysis; and
- An analysis guide for summarizing the results of the analysis.
Each of the 5 analysis sections of the instrument is compact, with related kinds of information clustered around a unifying concept. For example, under Learner Analysis, 4 types of data (gender, age, geography, and diversity) are grouped under the general concept of demographics. This was done to consolidate related kinds of information for efficiency in both data gathering and data analysis. The data fields selected were chosen for their applicability to instructional design and the potential sustainability of an online product.
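Should the instrument later be converted for online administration, this clustering of related data fields under unifying concepts maps naturally onto a simple structured representation. The sketch below is a hypothetical illustration only; apart from the demographics cluster named above, the structure, field names, and helper function are assumptions made for demonstration.

# Hypothetical sketch of how the instrument's clustered data fields could be
# represented in an online version of the tool. Only the demographics cluster
# under Learner Analysis is taken from this report; everything else is assumed.

LEARNER_ANALYSIS = {
    "demographics": {            # unifying concept
        "gender": None,          # related data fields clustered beneath it
        "age": None,
        "geography": None,
        "diversity": None,
    },
    # Further clusters (e.g. prior learning, technology access) would follow
    # the same pattern: one unifying concept per cluster.
}

def missing_fields(section: dict) -> list[str]:
    """List any data fields an assessor has not yet completed."""
    return [
        f"{cluster}.{field}"
        for cluster, fields in section.items()
        for field, value in fields.items()
        if value is None
    ]

print(missing_fields(LEARNER_ANALYSIS))
# ['demographics.gender', 'demographics.age', 'demographics.geography', 'demographics.diversity']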
Conclusion and Next Steps
At the conclusion of the COGS Project on June 30, 2003, the Readiness and Viability Assessment is a paper-based instrument available to any RRC faculty member or academic manager for voluntary completion. Initial indications are that the instrument can be readily converted to an online tool. This would require the development of an appropriate scoring mechanism and the determination of feedback points and scenarios. As version 1.0 of the Readiness and Viability Assessment instrument represents the results of the research and design phase of the COGS Project, further development must await the implementation and evaluation phase in the 2003 – 2004 academic year.
The Readiness and Viability Assessment Task Team recommends the adoption of version 1.0 of the Readiness and Viability Assessment instrument as meeting the specifications of COGS Project deliverable 2(1). The Task Team further recommends that any updating of Policy B3 – New Academic Program Approval consider inclusion of the Readiness and Viability Assessment instrument as a required component of establishing the need for and feasibility of a new program or program expansion initiative, especially if there is to be any online component in the program. The Readiness and Viability Assessment Task Team believes that, as it stands, version 1.0 of the instrument will prove useful as a resource for educating college staff about the significance of instructional design to the development of quality online learning.
Red River College, June 2003