Done-IT
LLP Project No. 511485-LLP-1-2010-1-NO-KA3-KA3MP
LLP KA3-ICT Project 2011-13

Done-IT

Development of open system services for smartphones that facilitate new evaluation methods and enhance the use of immediate feedback on evaluation results obtained in tests as a creative learning tool.
WP 8: Developing an evaluation system for 4 operating system platforms for smartphones
D 8.2: Technical Implementation Report
Author and editor: R.P. Pein
Co-Authors: T. M. Thorseth, M. Uran, S. Keseling, B. Voaidas, R. Støckert, B. Zorc, M. Jovanović, A. Koveš
Version: Final
Date: 31.03.2013
Start month: 1
End month: 27
Package leader: Institut za Varilstvo (IzV), Ljubljana, Slovenia
Language of the report: English

This document may not be copied, reproduced, or modified in whole or in part for any purpose without written permission from the Done-IT consortium. In addition to such written permission, any copy, reproduction, or modification of this document in whole or in part must clearly acknowledge the authors of the document and include all applicable portions of the copyright notice. All rights reserved.

This document may change without notice; consortium members should be informed, and the version number, stage, and date should be stated.

Project consortium

  • Sør-Trøndelag University College, Faculty of Technology, Trondheim, Norway
  • Centrum for Flexible Learning, Söderhamn, Sweden
  • Petru Maior University of Targu-Mures, Tirgu Mures, Romania
  • Magyar Hegesztéstechnikai és Anyagvizsgálati Egyesülés (MHtE), Budapest, Hungary
  • Institut za Varilstvo (IzV), Ljubljana, Slovenia
  • HiST Contract Research, Trondheim, Norway

Table of Contents

Done-IT

Introduction

Basic data model

Data formats used in communication

Offline operation - Mobile PeLe Service Unit

Typical scenario

Preparation

Assessment time

Post-assessment time

Post class processing

Testing and evaluation

Design

Use cases

Register to system

Login as teacher

Create new assessment

Edit assessment

(UCT4) Start assessment

(UCT5) Monitor assessment

(UCT6) Close assessment

(UCT7) Post assessment activity

(UCS1) Register to the system

(UCS2) Login to system

(UCS3) Start assessment

(UCS4) Respond to assessment

(UCS5) Retrieve assessment feedback

(UCS6) Participate in post assessment activity

Messages – communication workflow

General Requests

Before Lecture

Assessment Phase

Interactive (SRS) Phase

After Lecture

Scoring Model

Item Level

Section Level

Assessment Level

Normalized Scores

PeLe Service Documentation

Design

RESTful Interface

Authentication

Access Rights

HTTP Status Codes

HTTP Headers

REST Resources

REST Resource Details

Implementation

Package rs.sol.Hist.TestDefinition

Package rs.sol.Hist.Testing

Package rs.sol.Hist.UserManagement

Teacher client documentation

Design

Use cases

UI and user experience

Code architecture - packages

The context

Communication infrastructure

Implementation

The context

hist.net - Communication with the service

hist.model - Data model

The main application DoneiTC.mxml

hist.doneit.datamodels

hist.doneit.events

SessionControlEvent

hist.doneit.gui

hist.doneit.gui.about

hist.doneit.gui.admin

hist.doneit.gui.create

hist.doneit.gui.create.simulate

hist.doneit.gui.end

hist.doneit.gui.general

hist.doneit.gui.itemRenderers

hist.doneit.gui.itemRenderers.adg

hist.doneit.gui.menu

hist.doneit.gui.monitor

hist.doneit.gui.postasess

hist.doneit.gui.start

Student app documentation

Use cases

UI and user experience

Code architecture

Project file structure

Implementation

Data processing

Mobile PeLe Service Unit Documentation

Why

What

How

Index of Tables

Table 1: Definitions for identifiers and names
Table 2: User roles
Table 3: Access rights matrix
Table 4: HTTP status codes
Table 5: Mime-types
Table 6: Rights for ASI resources based on the role
Table 7: Custom metadata fields for ASI resources
Table 8: Custom metadata fields
Table 9: QTI-XML resources
Table 10: Service folders
Table 11: REST resource details
Table 12: Assessment parameters
Table 13: Completion state values
Table 14: Resource state values
Table 15: Assessment type values
Table 16: Section type values
Table 17: User account properties
Table 18: Attributes of the ASIEntity class
Table 19: Assoc. ends of the ASIEntity class
Table 20: Attributes of the ArbitraryData class
Table 21: Attributes of the DefaultSection class
Table 22: Assoc. ends of the DefaultSection class
Table 23: Attributes of the PossibleResponse class
Table 24: Assoc. ends of the PossibleResponse class
Table 25: Attributes of the Question class
Table 26: Assoc. ends of the Question class
Table 27: Attributes of the Test class
Table 28: Assoc. ends of the Test class
Table 29: Enumeration literals of the AsessmentType enumeration
Table 30: Enumeration literals of the ContentType enumeration
Table 31: Enumeration literals of the ResponseMultiplicity enumeration
Table 32: Enumeration literals of the ResponseType enumeration
Table 33: Attributes of the DefaultSectionCompletionState class
Table 34: Assoc. ends of the DefaultSectionCompletionState class
Table 35: Attributes of the DSLifeCycleState class
Table 36: Assoc. ends of the DSLifeCycleState class
Table 37: Attributes of the QuestionState class
Table 38: Assoc. ends of the QuestionState class
Table 39: Attributes of the SRSSection class
Table 40: Assoc. ends of the SRSSection class
Table 41: Attributes of the StudentResponse class
Table 42: Attributes of the StudentTestRecord class
Table 43: Assoc. ends of the StudentTestRecord class
Table 44: Attributes of the Testing class
Table 45: Assoc. ends of the Testing class
Table 46: Attributes of the TestingListener class
Table 47: Assoc. ends of the TestingListener class
Table 48: Enumeration literals of the ASILifeCycle enumeration
Table 49: Enumeration literals of the CompletionState enumeration
Table 50: Attributes of the User class
Table 51: Assoc. ends of the User class
Table 52: Enumeration literals of the Role enumeration
Table 53: Attributes and pins of the CmdCreateUser command
Table 54: Association ends and pins of the CmdCreateUser command
Table 55: Attributes and pins of the CmdCreateUsersBatch command
Table 56: Association ends and pins of the CmdDeleteUser command
Table 57: Attributes and pins of the CmdEditProfile command
Table 58: Association ends and pins of the CmdEditProfile command
Table 59: Association ends and pins of the CmdFetchCurrentUser command
Table 60: Classes in the hist.doneit.gui.create package
Table 61: Classes in the hist.doneit.gui.create.simulate package

Figure Index

Figure 1: Post assessment dilemma
Figure 2: PeLe System overview
Figure 3: System overview when using Mobile PeLe Service Unit
Figure 4: Main use cases of the system
Figure 5: Main communication workflow between system components
Figure 6: ASI Basic
Figure 7: ASI Definition
Figure 8: Testing
Figure 9: Listener
Figure 10: User Management
Figure 11: The current context minimizes the need for complicated dependencies between the components
Figure 12: Communication infrastructure
Figure 13: Context and communication related classes
Figure 14: Request flows
Figure 15: Server information
Figure 16: Login and zero configuration related classes
Figure 17: Resource definition data model
Figure 18: Results data model
Figure 19: Resource state related classes
Figure 20: About box
Figure 21: adminCanvas class and related classes
Figure 22: adminCanvas user interface (design view from IDE), showing the adminCanvas with the export function and the datagrid displayed
Figure 23: CreateAssessmentWizard classes and their relations
Figure 24: The editAssessmentComponent as seen in design mode in the development environment
Figure 25: The EditAssessmentComponent connected to the yellow model objects, the EditQuestionItemRenderer and AlternativesItemRenderer, and how they are wrapped inside AIR components (blue)
Figure 26: The QuestionItemRenderer displayed in design mode in the development environment
Figure 27: The ViewAlternativeComponent and how a hist.model.Alternative is rendered
Figure 28: How the ControlComponent, MonkeycageClass, monkeyResponseClass and the HistogramForMonkeycage are related; communication and control is done through events up to the main application
Figure 29: The simulation control dialog
Figure 30: The end interface in the “sectionClosed” state, where results from the assessment are displayed
Figure 31: Class diagram of the ResultsOverview and how it uses hist.doneit.gui.monitor.SummaryViewComponent to display results from the model
Figure 32: The SessionInformationPanel, used to display more detailed information about sessions from a list of sessions
Figure 33: The menu with the menu elements and the selector triangle that highlights the current position in the “user flow” of the program
Figure 34: The MonitorComponent, the two related views (SummaryViewComponent and MatrixViewComponent), the related model (yellow) and related item renderers
Figure 35: The MonitorComponent displaying the default MatrixViewComponent
Figure 36: The MatrixViewComponent inside the MonitorComponent
Figure 37: The SummaryViewComponent with two sections; the default section is the main assessment, and the second chance section is not currently used in the application. To the right, a histogram displays the selected question in more detail
Figure 38: The ItemSummaryResponseItemRenderer2 displaying results as percentage bars
Figure 39: The component in the NoDocument state
Figure 40: The component in the DocumentNotReady state
Figure 41: The assessment in the DocumentReady state; the Start button is made available
Figure 42: The component in the DocumentUploaded state; normally the user is redirected to monitoring when the assessment has been started, and the user has to deliberately enter the start state to reach this view
Figure 43: Relation between the views
Figure 44: System overview - use of MPSU with the internal AP enabled
Figure 45: MPSU architecture overview
Figure 46: Use of MPSU with one AP/Wireless Router added
Figure 47: Use of MPSU with multiple AP/Wireless Routers added for combined wireless network capacity

Introduction

The PeLe (Peer Learning Assessment Services) system, developed within the frame of the Done-IT project, aims to provide teachers and students with a set of services for information exchange and aggregation that support evaluation activities, enabling easy and fast verification- and elaboration-based learning immediately after a test has been completed. The main targets are higher education and VET, covering contexts both inside and outside the classroom.

The PeLe system allows, on the one hand, students to use their own smartphones, tablets or portable computers to respond to assessments and post-assessment activities and, on the other hand, allows the teacher to monitor the activity and give verifying or elaborative feedback to individual students or groups of students immediately after a test or activity.

This is a key factor in helping students improve their skills through active, collaboratively supported learning. While they still remember the questions in the test, students learn why the correct answer is correct and why the other options are incorrect. Thus, mobile technology enables new evaluation and testing practices within education and training.


The traditional focus when introducing electronic assessment is to save correction time and give immediate feedback on learning. This project turns that perspective around and focuses on the learning itself. Suppose you have run an assessment: the students have been working on a problem for some time and have delivered their results digitally, so you as a teacher can see the results immediately. What do you do next?

Given good feedback, the teacher can proceed in many different ways depending on the results. With the Done-IT assessment solution, our target is to provide the teacher with instant feedback on the status of the assessment: which questions the students solved correctly and which questions caused more problems. The teacher can then do any of the following for each question:

a) Continue as usual and only give the results.

b) Give the students verifying feedback and explain what has been misunderstood.

c) Give the students a hint about what the problem with this question might be, but not the actual solution.

d) Give the students the results ("this is what you voted") and allow them to discuss the problem.

e) Pick one question, as set up in the assessment, and send it out prepared for an SRS session on that question.

f) Collect several of the questions after a pedagogical treatment and allow the students to resubmit their answers to parts of the assessment.

In cases c) and d), should the students be allowed to take part of the test again, or should they, after a group or peer discussion, be able to renegotiate their response? There are several possibilities we can open up to the teacher.

The main advantage of revisiting the test subject immediately after the test is, first of all, that the problem is fresh in mind, since the student has just been working with it. The student might have spent time on parts of the assessment without finding the right answer. Depending on the nature of the subject being taught, a single hint from a peer student or a teacher may be enough to solve the problem.

The system has been developed using an iterative prototyping process in which the main system stakeholders, teachers and students, were involved. The very context of use dictates that special emphasis be placed on the human-computer interaction aspects: the educational activity must be supported in a transparent and usable way. The system has been evaluated through a number of usability tests that included both expert-based evaluation and user testing.


The system is based on a client-server architecture, and the components communicate over the HTTP protocol. The main components of the system (figure 1) are a server application, a student client application and a teacher control application. The responsibilities of each component are defined as follows:

The server component includes the PeLe web service, the back-end database (in our case collocated with the web service, though by no means restricted to be) and portal pages. The server also hosts the students' web app. The web service manages the central data model and keeps the entire system in a consistent state at all times. It provides an interface to remotely modify and read the resources, following the "Representational State Transfer" (REST) principles. The web service is the main communication node, and its data represents the "truth" if synchronization issues occur.
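
As an illustration of this REST principle, the sketch below shows what a client interaction with the service could look like. The host name, the /assessments/{id} path and the JSON fields are hypothetical placeholders, not the documented PeLe resource layout; the actual resources are specified in the "REST Resources" section.

```python
# Minimal sketch of a RESTful client interaction, assuming a hypothetical
# /assessments/{id} resource and HTTP Basic authentication. The real PeLe
# resource paths and payloads are documented in the "REST Resources" section.
import requests

BASE_URL = "https://pele.example.org/service"  # placeholder host
AUTH = ("teacher1", "secret")                  # placeholder credentials

# Read the current representation of an assessment resource.
resp = requests.get(f"{BASE_URL}/assessments/42", auth=AUTH)
resp.raise_for_status()
assessment = resp.json()

# Modify the resource remotely, e.g. unlock the assessment for students.
assessment["state"] = "open"
resp = requests.put(f"{BASE_URL}/assessments/42", json=assessment, auth=AUTH)
resp.raise_for_status()
```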

The teacher software is responsible for modelling the workflow for a given use case. This is achieved by modifying the data inside the web service according to the required workflow. The teacher role is granted write access to various parameters, which allows a custom workflow to be shaped.

The student software usually reacts to the actions of the teacher. Again, the user interface has to implement the workflow for the respective use case. A central aspect of the student client is that it can be remotely controlled: depending on the state of the data maintained in the web service, the student client may block input entirely or provide restricted read/write access to a set of resources.

Apart from these main components, the system includes a few other components that aim at improving the user experience and facilitating system maintenance:

The authentication service allows integration with other authentication methods and also supports the scenario in which one user accesses several servers.

The autoupdate service allows the teacher client to update over the internet. When a new build is available, it is published at the autoupdate location. The teacher clients check that location periodically and notify the user when an update is available.

The zero configuration service allows the teacher client to automatically retrieve the information required to connect to a given server.
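
As a sketch only, the snippet below illustrates how such an autoupdate check could work, assuming a hypothetical JSON version descriptor published at the autoupdate location; the actual teacher client relies on the update mechanism of its own runtime.

```python
# Sketch of a periodic autoupdate check against a hypothetical JSON
# descriptor such as {"version": "1.5.0"}; names and URL are placeholders.
import requests

UPDATE_URL = "https://pele.example.org/autoupdate/version.json"  # placeholder
CURRENT_VERSION = (1, 4, 0)

def update_available() -> bool:
    info = requests.get(UPDATE_URL, timeout=5).json()
    latest = tuple(int(part) for part in info["version"].split("."))
    return latest > CURRENT_VERSION  # tuples compare lexicographically
```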

The PeLe tool is a web service in combination with use-case-specific clients. During lectures, the system is controlled by the teacher, and the student clients react according to the situation, thus creating an interactive environment with immediate feedback between teacher and students.

The ability to react to the teacher's actions poses certain challenges in the implementation, as the HTTP protocol does not yet specify a reliable back-channel for browsers. Possible approaches to address this issue are polling, long-polling or WebSockets, each of which has specific drawbacks.
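
The snippet below sketches the simplest of these approaches, plain polling: the student client periodically re-reads a state resource and reacts when the teacher has changed it. The /assessments/{id}/state endpoint and the field name are assumed placeholders, and the interval is an arbitrary trade-off between latency and server load.

```python
# Plain-polling sketch of the student client's back-channel, assuming a
# hypothetical state resource that reflects the teacher's actions.
# Long-polling or WebSockets would reduce latency and request volume.
import time
import requests

STATE_URL = "https://pele.example.org/service/assessments/42/state"  # placeholder

def poll_state(interval_seconds: float = 2.0) -> None:
    last_state = None
    while True:
        state = requests.get(STATE_URL).json().get("state")
        if state != last_state:
            # React to the teacher's action, e.g. unlock the answer form
            # or block all input, depending on the new state.
            print(f"state changed: {last_state} -> {state}")
            last_state = state
        time.sleep(interval_seconds)
```
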
In this interactive environment, both the teacher's and the students' tools are designed to be streamlined. The number of available options is kept small, and the important functionality for the targeted workflow can be reached with a minimal number of interactions. Quick interaction with the service is necessary because of the time pressure in the interactive environment: neither teacher nor students should be distracted or overwhelmed by special functionality, and any delay in using the tool directly causes the other users to wait for the intended change.
Outside the lecture, other clients (e.g. a dynamic web page) are used to work on the resources that have been collected inside the classroom. These tools can offer a rather complex user interface, as the time pressure of a live session does not apply here. Typically, the teacher may create new content or modify existing content. Further, both teachers and students have read access to the collected data. Naturally, the teacher is allowed to see all data related to his/her lectures. Students usually only have access to unlocked data, further filtered for personal content. The data of other students remains entirely invisible.
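
To make these visibility rules concrete, here is a minimal sketch using assumed class and field names; the authoritative rules are given in the access rights matrix (Table 3) of the service documentation.

```python
# Sketch of the read-visibility rule described above; class and field names
# are illustrative, the authoritative rules are in the access rights matrix.
from dataclasses import dataclass
from typing import Iterable, List, Sequence

@dataclass
class Record:
    owner: str      # student who produced the data
    lecture: str    # lecture the record belongs to
    unlocked: bool  # whether the teacher has released it

def visible_records(records: Iterable[Record], user: str, role: str,
                    teacher_lectures: Sequence[str] = ()) -> List[Record]:
    if role == "teacher":
        # Teachers see all data related to their own lectures.
        return [r for r in records if r.lecture in teacher_lectures]
    # Students see only unlocked data, filtered to their personal content.
    return [r for r in records if r.unlocked and r.owner == user]
```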

Basic data model

The system data model includes data structures mainly for assessment definitions, assessment results and users. In the part of the data model referring to assessments we consider three levels: Assessment, Section and Item (ASI). An assessment is a single working unit that is created by an instructor and used for a specified task, e.g. a single lesson during a course. Each assessment can have one or more sections, which in turn can contain items. An assessment item corresponds to a single question in a test.
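
The following is a minimal sketch of the three-level ASI hierarchy with illustrative field names; the actual attributes of the corresponding classes are documented in the rs.sol.Hist.TestDefinition package.

```python
# Sketch of the three-level ASI hierarchy (Assessment > Section > Item).
# Field names are illustrative; the real attributes are listed in the
# rs.sol.Hist.TestDefinition package documentation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Item:
    """A single question in a test."""
    prompt: str
    possible_responses: List[str] = field(default_factory=list)

@dataclass
class Section:
    """A group of items within an assessment."""
    title: str
    items: List[Item] = field(default_factory=list)

@dataclass
class Assessment:
    """A single working unit created by an instructor, e.g. for one lesson."""
    title: str
    sections: List[Section] = field(default_factory=list)
```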