

Benchmarking for Quality Technical Institutions

Col Dr D S Grewal

Bhai Maha Singh College of Engineering, Kotkapura Road, Muktsar 152026, Punjab


Abstract -- Benchmarking for technical quality involves measuring the quality of technical institutions: the institution to be improved is compared with one or more benchmarked institutions, with the initiator focusing on observation and investigation of processes in order to identify and adopt best practices. Organizations evaluate various aspects of their processes in relation to best practices, usually within their own sector; here, technical education.

There is no single benchmarking process that has been universally adopted; the wide appeal and acceptance of benchmarking has led to the emergence of various methodologies. The benchmark selected ought to be an indicator of the desired standards: it should belong to the same field, take similar inputs and deliver outputs of the desired quality. The processes may vary; the one adopted should achieve the required quality outputs. A technical institution selected as a benchmark should itself have achieved the desired quality standards. The present paper discusses various aspects of benchmarking for quality technical education.

Key Words: Benchmarking, Quality, Technical Institutions, Ranking.

I. INTRODUCTION

BENCHMARKING is the process of comparing one's business processes and performance metrics to industry bests and/or the best practices from other industries. Dimensions typically measured are quality, time, and cost. Improvements from learning mean doing things better, faster, and cheaper.

Benchmarking involves management identifying the best firms in their industry, or any other industry where similar processes exist, and comparing the results and processes of those studied (the "targets") to one's own results and processes to learn how well the targets perform and, more importantly, how they do it.

The term benchmarking was first used by cobblers to measure people's feet for shoes: they would place someone's foot on a "bench" and mark it out to make the pattern for the shoes. Benchmarking is most often used to measure performance using a specific indicator (cost per unit of measure, productivity per unit of measure, cycle time of x per unit of measure or defects per unit of measure), resulting in a metric of performance that is then compared to others [1].

Organizations may evaluate various aspects of their processes in relation to the best practices of the benchmarked one. Benchmarking for a particular quality involves measuring the required quality standards of the organization to be improved against those of the benchmarked organization. Benchmarking for technical quality likewise involves comparing the technical institution to be improved with the benchmarked institution(s).

II. TYPES OF BENCHMARKING

  • Process benchmarking – the initiator focuses on observation and investigation of processes with the goal of identifying and observing best practices from one or more benchmarks. Activity analysis is required where the objective is to benchmark cost and efficiency; this is increasingly applied to back-office processes where outsourcing may be a consideration.
  • Financial benchmarking - performing a financial analysis and comparing the results in an effort to assess your overall competitiveness.
  • Performance benchmarking - allows the initiator to assess its competitive position by comparing products and services with those of targeted organizations.
  • Product benchmarking - the process of designing new products or upgrades to current ones. This can involve reverse engineering, i.e., taking apart competitors' products to find their strengths and weaknesses.
  • Strategic benchmarking - involves observing how others compete.
  • Functional benchmarking - focusing benchmarking on a single function in order to improve the operation of that particular function. Complex functions such as Human Resources, Finance and Accounting and Information and Communication Technology are unlikely to be directly comparable in cost and efficiency terms and may need to be disaggregated into processes to make valid comparison.

AKGEC JOURNAL OF TECHNOLOGY, Vol.2, No.1

Benchmarking may be in terms of "best practice benchmarking", "process benchmarking" or "functional benchmarking". Here, organizations evaluate various aspects of their processes in relation to the best practices usually within their own sector; in the present case, technical education. There is no single benchmarking process that has been universally adopted. The wide appeal and acceptance of benchmarking has led to the emergence of various benchmarking methodologies. The most prominent, a 12-stage methodology, was proposed by Robert Camp [2]. It consists of:

1. Select subject

2. Define the process

3. Identify potential partners

4. Identify data sources

5. Collect data and select partners

6. Determine the gap

7. Establish process differences

8. Target future performance

9. Communicate

10. Adjust goal

11. Implement

12. Review/recalibrate.
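Step 6 of the methodology, determining the gap, amounts to comparing one's own performance metrics against the benchmark's on each measured dimension. The following is only an illustrative sketch; the metric names and values are hypothetical, not drawn from any actual institution:

```python
# Hypothetical sketch of step 6 ("Determine the gap"): compare an
# institution's own performance metrics against a benchmark
# institution's metrics on each dimension. All values are placeholders.

own = {"pass_rate": 72.0, "placement_rate": 55.0, "faculty_phd_pct": 30.0}
benchmark = {"pass_rate": 90.0, "placement_rate": 85.0, "faculty_phd_pct": 60.0}

def performance_gap(own: dict[str, float],
                    benchmark: dict[str, float]) -> dict[str, float]:
    """Gap on each metric: positive values mean the benchmark is ahead."""
    return {m: benchmark[m] - own[m] for m in benchmark}

for metric, gap in performance_gap(own, benchmark).items():
    print(f"{metric}: gap of {gap:+.1f} percentage points")
```

The resulting per-metric gaps feed directly into steps 7 and 8 (establishing process differences and targeting future performance).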

The following is an example of a typical modified version of the methodology:

  1. Identify your problem areas - Because benchmarking can be applied to any business process or function, a range of research techniques may be required. These include informal conversations with stakeholders; exploratory research techniques such as focus groups; or in-depth marketing research, quantitative research, surveys, questionnaires, re-engineering analysis, process mapping, quality control variance reports and financial ratio analysis. Before embarking on comparison with other organizations, it is essential to know your own organization's functions and processes; baselining performance provides a point against which improvement effort can be measured.
  2. Identify other institutions that have similar processes – One must identify institutions with similar characteristics. For example, to study a private institution in a remote rural area, one must choose a benchmark that is also in a remote rural area and run by a private entrepreneur.
  3. Identify organizations that are leaders in these areas - Look for the very best in any industry and in any country. Consult customers, suppliers, financial analysts, trade associations, and magazines to determine which companies are worth studying.
  4. Survey institutions for measures and practices - Companies target specific business processes using detailed surveys of measures and practices used to identify business process alternatives and leading companies. Surveys are typically masked to protect confidential data by neutral associations and consultants.
  5. Visit the "best practice" institutions to identify leading edge practices - Companies typically agree to mutually exchange information beneficial to all parties in a benchmarking group and share the results within the group.
  6. Implement new and improved practices - Take the leading edge practices and develop implementation plans which include identification of specific opportunities, funding the project and selling the ideas to the organization for the purpose of gaining demonstrated value from the process.

III. SELECTING A BENCHMARK

The benchmark selected ought to be an indicator of the desired standards. It has to be of the same field, with similar inputs and with outputs of the desired quality. The processes may vary; the one adopted ought to achieve the required quality outputs. A technical institution selected as a benchmark should have achieved the desired standards.

Accreditation is different from a ranking system. Accreditation is applicable within a country, while a ranking may be applicable globally. Examples of various global ranking systems are given below.

Global Quality Ranking Systems of Universities/ Institutions

Shanghai Jiao Tong University (SJTU) Global Academic Ranking

a) Winners of Nobel Prizes in Sciences and Economics and Fields Medals in Mathematics - 30%

b) Citations in leading journals - 20%

c) Articles in Science and Nature - 20%

d) Number of Thomson/ISI "HiCi" researchers on the basis of citations - 20%

e) Number of professional experts on regular faculty - 10%

Total - 100%
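A weighted-component scheme of this kind combines into a single composite score as a weighted sum. The sketch below uses the SJTU weights listed above; the institution's component scores are hypothetical placeholders, and the normalisation SJTU actually applies is omitted:

```python
# Illustrative sketch: a composite ranking score as a weighted sum.
# Weights follow the SJTU scheme above; the component scores (0-100
# scale) are hypothetical placeholders for a single institution.

SJTU_WEIGHTS = {
    "alumni_awards": 0.30,     # Nobel Prizes / Fields Medals
    "citations": 0.20,         # citations in leading journals
    "nature_science": 0.20,    # articles in Science and Nature
    "hici_researchers": 0.20,  # Thomson/ISI "HiCi" researchers
    "faculty_experts": 0.10,   # professional experts on regular faculty
}

def composite_score(component_scores: dict[str, float],
                    weights: dict[str, float]) -> float:
    """Weighted sum of component scores; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * component_scores[k] for k in weights)

# Hypothetical institution:
scores = {
    "alumni_awards": 10.0,
    "citations": 55.0,
    "nature_science": 40.0,
    "hici_researchers": 30.0,
    "faculty_experts": 70.0,
}
print(composite_score(scores, SJTU_WEIGHTS))  # 35.0
```

The choice of weights dominates the outcome, which is precisely why the weighting schemes of the ranking systems discussed here attract criticism.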

The ranking is based primarily on global-level research but is dominated by English-speaking nations, which account for 71% of the top 100 research universities; e.g., the USA has 17 of the top 20 and 54 of the top 100. The characteristics required for the world-class universities (WCU) and world-class research universities (WCRU) lists cannot be applied in the Indian context [2].

The Thesis Higher Education Supplement (THES) Rankings:

It produces a summative, holistic ranking, but one based primarily on existing reputation:

a) Reputational survey of academics (peer review) - 40%

b) Survey of global employers - 10%

c) Proportion of international students - 5%

d) Proportion of international faculty/staff - 5%

e) Faculty-student ratio - 20%

f) Research citations per faculty - 20%

Total - 100%

The sampling procedure is seriously biased and the weighting of components is not fully justified. Teaching quality is not appropriately measured, and the research achievement measure is biased towards the humanities and social sciences. The classification covers regional rather than truly international institutions. The criteria are invalid and not fit for Indian conditions; the results have been biased by marketing strategies, where truth becomes irrelevant because the data lack credibility [2].

The German Centre for Higher Education Development (CHE) Approach: It is a Europe-wide system based on surveys of students and staff, tabulating data on student experience and satisfaction along with academic recommendations on the best departments in each of 36 academic subjects. The surveys are supplemented with information from independent sources, which comprises one third of the database. Data about teaching and learning, institutional functioning, teaching quality, teaching instructions and techniques for assessment have not been effectively included.

Essential parameters that need to be covered are: human development, global competences, business climate, quality of governance, computation, health, education, innovation, transparency, transport and happiness [5].

In a survey entitled ‘India’s Best Professional Colleges’ conducted in June 2010 by Outlook-MDRA, the parameters adopted are as follows.

Table 1


Table 2

Table 3

Table 4


Table 5

Table 6

The methodology and process adopted to rank the best professional colleges began with extensive secondary research to establish the criteria for selecting the master list of colleges and institutions. Only those with government recognition (accreditation/affiliation) and with at least three graduated batches were considered. A detailed questionnaire was then sent to over 1000 colleges in nine streams: engineering, medicine, dentistry, law, hotel management, social work, fashion technology, architecture and mass communication.

Given the quantum and quality of responses, only three streams - engineering, law and hotel management - were deemed fit for effective analysis. These streams were evaluated on a combination of objective and perceptual data with equal weights. A panel of 120 experts from the corporate sector and academia was carefully selected for consultation on establishing the parameters and sub-parameters for ranking and their importance; weights were assigned on this basis. A fully perceptual analysis was done for the remaining fields in key (metropolitan) cities.

Field researchers carried out questionnaire based interviews among 331 faculty members and 357 final year students from different streams. Another 147 recruiters/professionals were also interviewed to select and rank the top professional colleges for streams relevant to their fields, nationally as well as in their zones. Total marks were calculated by cumulating the ranks given by the three categories (students, faculty, recruiters/ professionals) after giving separate weights to each category.
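The cumulation step described above can be sketched as a weighted combination of the marks given by the three respondent categories, followed by sorting. The category weights and the colleges' marks below are hypothetical placeholders, since the actual Outlook-MDRA weights are not reproduced here:

```python
# Hypothetical sketch of the Outlook-MDRA cumulation step: marks given
# by the three respondent categories are combined using category
# weights, and colleges are then ranked by their weighted totals.
# All weights and marks are illustrative placeholders.

CATEGORY_WEIGHTS = {"students": 0.3, "faculty": 0.3, "recruiters": 0.4}

def total_marks(marks_by_category: dict[str, float],
                weights: dict[str, float] = CATEGORY_WEIGHTS) -> float:
    """Weighted total of the marks given by each respondent category."""
    return sum(weights[c] * marks_by_category[c] for c in weights)

# Marks (out of 100) two hypothetical colleges received per category:
colleges = {
    "College A": {"students": 80.0, "faculty": 70.0, "recruiters": 90.0},
    "College B": {"students": 90.0, "faculty": 85.0, "recruiters": 60.0},
}

# Rank colleges by their weighted totals, highest first.
ranking = sorted(colleges, key=lambda c: total_marks(colleges[c]), reverse=True)
print(ranking)  # ['College A', 'College B']
```

As with any such scheme, the chosen category weights directly determine the final ordering, which is one reason the survey's weighting decisions are open to the criticism discussed below.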

The Outlook-MDRA survey again has its pitfalls. Its parameters are centred on the IITs, and it is heavily biased towards elitist institutions. No indigenous features are included. Out of 1000 marks, the weights for key areas like student-faculty ratio (20.3) and laboratories (38.5) are too low, while those for type of entrance exam (59.7), fee structure (59.7) and number of patents held (21.7) are too high. The number of samples selected is also too small. The result is total concentration on the IITs or institutions in the south.

From the above, it is seen that rankings based on surveys conducted by various players cannot be accepted as fully reliable. A certain degree of bias has also been found, especially in the reports by Mail Today and Zee Business. These are probably business ventures timed for the admission season, a sort of advertisement process in which paid results cannot be discounted.

There is certainly a need for an independent agency, preferably a government agency like the NBA, to rank each college based on field study. The process adopted by the NBA of publicising the strong points of accredited institutes helps in setting up a benchmark.

There is a requirement for an unbiased system of benchmarks, categorised according to major criteria such as urban/rural and government funded/unfunded, and the score card must be meticulously planned.

IV. REFERENCES

[1].

[2]. Robert C. Camp, Benchmarking: The Search for Industry Best Practices that Lead to Superior Performance, ASQC Quality Press, 1989 (the first book on benchmarking).

[3]. Simon Marginson, 2006, "Ranking Ripe for Misleading", The Australian, Dec. 6.

[4]. Richard Holmes, 2007, "The THES University Rankings: Are They Really World Class?", Asian Journal of University Education, MARA University of Technology, Malaysia, 1/1, 1-14.

[5]. Outlook India, June 28, 2010, pp.33-42.

Col Dr D S Grewal, presently Director-Principal of BMS College of Engineering, Muktsar, India, is a renowned education administrator and prolific writer who has written 24 books on different subjects and published 258 papers nationally and internationally. His specialization is education management and nanotechnology. He has ten years' experience as principal of engineering colleges and over 20 years of teaching UG and PG (M.Tech. & MBA) students.

He has been presented with many national and international awards.
