Appendix 1. List of Abbreviations/Acronyms
3D Three-dimensional
CAI Computer-Assisted Instruction
DOF Degrees of Freedom
HVEI Human-VE Interaction
LOD Level of [task] Difficulty
MTQ Memory Test Questionnaire
MUSTe Multidimensional User-centred Systematic Training Evaluation
PVE Perceived VE Efficacy
SE Self-efficacy
UI User Interface
VE Virtual Environment
VTs Virtual Training Systems
Appendix 2. Post-VE Exposure Questionnaire: Self-efficacy (SE)
Post-VE Exposure Questionnaire
Participant ID:
Rating scale: 0 / 10 / 20 / 30 / 40 / 50 / 60 / 70 / 80 / 90 / 100, where 0 = Not at all capable, 50 = Moderately capable, and 100 = Completely capable.
1. How capable do you think you are of completing the training test accurately (correctly)?
2. How certain are you of your ability to complete the training test accurately?
3. How capable do you think you are of completing the training test efficiently (in a timely manner)?
4. How certain are you of your ability to complete the training test efficiently?
5. How capable do you think you are of completing the training test effectively (correctly and in a timely manner)?
6. How certain are you of your ability to complete the training test effectively?
7. Please indicate the test score that you expect to receive (score range from 0 to 100).
Appendix 3. Post-VE Training Test Questionnaire: Perceived VE Efficacy (PVE) – P1
Post-VE Training Test Questionnaire
Participant ID:
Section 1
Please rate each of the following statements on a 7-point scale, from 1 (strongly disagree) to 7 (strongly agree).
1. I was able to focus my attention on learning the assembly procedures rather than on the input control tools (e.g. the haptic device).
2. The virtual training environment helped me increase my understanding of the required learning tasks.
3. I could easily understand the objectives of the training tasks.
4. I understood well what was expected of me.
5. I was very aware of my task status and what I was supposed to do to accomplish these tasks.
6. It was easy to self-direct my learning experience in the virtual training environment.
7. Performing tasks in the virtual training environment was physically demanding.
8. Performing tasks in the virtual training environment was mentally demanding.
9. I felt pressured to finish the task in the given time.
10. The interactive features of the virtual training environment improved my ability to perform tasks.
Appendix 4. Post-VE Training Test Questionnaire: Perceived VE Efficacy (PVE) – P2
Section 2
1. Did you experience any side effects such as motion sickness or dizziness?
2. What events did you find difficult to deal with while interacting with the virtual training environment?
3. What are the most negative aspect(s) of the virtual training environment?
4. What are the most positive aspect(s) of the virtual training environment?
5. Please provide suggestions for a better design (e.g. user interface design, instructional design) of the virtual training environment.
6. Please provide a sentence or more describing your experience of the day. Did you enjoy it?
7. Please provide feedback on areas of improvement to enhance the virtual training environment's efficacy.
Appendix 5. Pre-training Questionnaire
Please answer the following questions:
1. Please tick the gender category that applies to you: Male / Female
2. Please tick the age category that applies to you: 18-24 / 25-34 / 35-45 / Over 46
3. How often do you use a computer? Every day / Every 2-3 days / Once a week / Less than once a week
4. What do you use a computer mostly for? Computer games / Surfing the internet / Word processing / Drawing 3D images (e.g. Studio Max)
5. For how long have you been using computers regularly? <6 months / 1-3 years / 3-6 years / >6 years
6. How experienced are you in manipulating 3D objects in a gaming or computer environment (using a keyboard or regular mouse)? Very experienced / Moderately experienced / Minimally experienced / Not experienced at all
7. How experienced are you in manipulating 3D objects in a virtual environment (using a head-mounted display and haptic device)? Very experienced / Moderately experienced / Minimally experienced / Not experienced at all
8. How experienced are you in object assembly using electrical and mechanical tools? Very experienced: use every day / Moderately experienced: use on a weekly basis / Minimally experienced: use on a monthly basis / Not experienced at all: never use them
9. How challenging do you find assembling pre-made items (such as bed or bookcase parts from IKEA)? Very challenging: cannot complete the assembly on my own / Moderately challenging: need to constantly refer to the manual / Hardly challenging: need to read the manual only when I encounter difficulties / Not challenging at all: can complete the assembly without referring to the manual
Appendix 6. Flowchart of the utilization of the MUSTe method
The MUSTe assessment process commences with the collection of users' demographic information using the 'Pre-training Questionnaire' via the MUSTe tool (see Appendix 5). The Pre-training Questionnaire is followed by VE training. In this training stage, users gain first-hand experience of interacting with a variety of I/O devices and interfaces. Users should be able to operate the VE to perform simple tasks and to practise more complex tasks. The 'Self-efficacy Questionnaire' is then distributed to users so that they can report their self-efficacy beliefs about their ability to perform tasks in the VE. This is followed by a training assessment of users' task performance. As marked '1', this assessment of user performance is software- and application-based, and a logging tool records each user's performance in detail. All the relevant objective measures are used as inputs to the VE efficacy assessment. A 'Perceived VE Efficacy Questionnaire' is then distributed to users, who rate their perceptions of the interaction and learning experience as well as how well the system facilitated their learning. Finally, after a gap of approximately two weeks, the 'Memory Test Questionnaire' is used to assess users' memory performance. This objective measure tests users' ability of recognition and recall based on their interaction and learning experience with the VE. Once the critical data has been gathered, the evaluator decides on the 'assessment depth'.
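The implementation of the logging tool is not described here; purely as an illustration, the following minimal Python sketch shows how per-user performance events of the kind fed into the VE efficacy assessment might be recorded. All class, field and file names are hypothetical.

import csv
import time
from dataclasses import dataclass, field

@dataclass
class PerformanceLogger:
    """Records time-stamped task events for one participant (illustrative)."""
    participant_id: str
    events: list = field(default_factory=list)

    def log(self, task, event, correct=None):
        # Each entry captures what happened, when, and (where applicable)
        # whether the action was correct: example objective measures for
        # the VE efficacy assessment.
        self.events.append((time.time(), self.participant_id, task, event, correct))

    def save(self, path):
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["timestamp", "participant", "task", "event", "correct"])
            writer.writerows(self.events)

# Example usage during a training session:
logger = PerformanceLogger("P01")
logger.log("assembly_step_1", "part_picked")
logger.log("assembly_step_1", "part_placed", correct=True)
logger.save("P01_performance.csv")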
It is noticeable that the evaluator plays only a minor role in the MUSTe evaluation. As marked '2', the evaluator may observe user performance in the VE and note down any usability problems he/she finds. However, the evaluator's observation notes and feedback are not used in the final stage of determining system efficacy. Therefore, if the time and budget for the evaluation are limited, this step can be omitted.
Once all the required evaluation data have been obtained, the MUSTe tool can perform analysis on quantitative and/or qualitative data (if coded) and determine cognitive, skill-based and affective learning outcomes, depending on the purpose, time, cost and depth required. When an assessment based on quantitative data is satisfactory, the MUSTe tool enables a variety of quantitative and statistical analyses, determines the system efficacy, and produces an efficacy score. This is referred to as the 'Analytical evaluation phase'. On the other hand, if an in-depth assessment is preferable and a single score is considered insufficient, the evaluator can analyse the qualitative data, as marked '3' in the flowchart, to achieve a more in-depth assessment. This analysis yields a deeper understanding of users' experience and interaction with the VE, and enables the evaluator to provide a detailed analysis of the problems identified. The evaluator can transfer the results of the qualitative analysis to the MUSTe tool. When combined with the quantitative results, the system efficacy can be determined in greater detail; this is referred to as the 'Holistic evaluation phase'. As addressed before, the data analysis can stop at either the 'Analytical evaluation phase' or the 'Holistic evaluation phase' in determining the VE's efficacy, depending on the purpose, time, cost and thoroughness required.
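The aggregation used to produce the efficacy score is not specified here; the sketch below shows one plausible (assumed) scheme that rescales the SE (0-100), PVE (1-7 Likert) and objective test-score (0-100) measures to a common range and averages them with equal weights. The function name, scales and weighting are illustrative assumptions, not the MUSTe tool's actual computation.

def efficacy_score(se_items, pve_items, test_score):
    """Combine self-efficacy items (0-100), perceived VE efficacy items
    (1-7 Likert) and the objective test score (0-100) into one 0-100 score."""
    se = sum(se_items) / len(se_items)                     # already on a 0-100 scale
    pve = (sum(pve_items) / len(pve_items) - 1) / 6 * 100  # rescale 1-7 to 0-100
    return (se + pve + test_score) / 3                     # equal weights (an assumption)

# Example: three SE ratings, four PVE Likert ratings, and a test score of 82
print(round(efficacy_score([70, 80, 75], [5, 6, 4, 5], 82), 1))  # -> 74.6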
Appendix 7. MUSTe: The Software Engineering Tool
The MUSTe tool supports a wide range of data collection and analysis at varying levels of complexity. It supports data entry and data analysis, and produces a variety of evaluation results. As an example of a MUSTe window, screenshot (a) shows the pre-test questionnaire screen, where users can quickly enter their demographic information via the tool. Similarly, the other MUSTe user perception measures can also be collected quickly and electronically during experimentation or user testing, without any interruption to the evaluation process. Screenshot (b) gives an example of a utility function of the post-VE questionnaire, which enables the evaluator to check the missing data of a specific user or gain an overall view of the completeness of the data collection. The utility functions also enable an evaluator to check the completeness of the video recordings. It is also possible and easy to associate a user's input data with his/her video-recorded task performance (c).
(a) Pre-test questionnaire screenshot
(b) Utility functions of Post-VE questionnaire screenshot
(c) A user's demographic information associated with his/her video-recorded task performance
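As an illustration of the kind of completeness check shown in (b), the Python sketch below assumes questionnaire responses are stored as one row per participant; the column names and the pandas-based implementation are assumptions, not the MUSTe tool's actual code.

import pandas as pd

def completeness_report(responses, participant_id=None):
    """Report missing post-VE questionnaire items, either for a single
    participant or across the whole data set (illustrative)."""
    if participant_id is not None:
        row = responses[responses["participant_id"] == participant_id]
        return row.isna().sum()            # count of missing items for this user
    return responses.isna().mean() * 100   # percentage of missing values per item

# Example with hypothetical column names:
df = pd.DataFrame({
    "participant_id": ["P01", "P02"],
    "pve_q1": [5, None],
    "pve_q2": [6, 4],
})
print(completeness_report(df, "P01"))  # missing items for one participant
print(completeness_report(df))         # overall completeness of the data set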
Appendix 8. Memory-test questionnaire