Feature Wiki

Information about planned and released features

Routing Rules in Test Parts for Adaptive Testing Scenarios

This is a sub-entry of the feature request Competence-driven Question Groups and Test Parts

1 Initial Problem

2 Conceptual Summary

Test parts should be able to react to certain conditions: it should be possible to skip a test part when the relevant conditions are met.
This would support adaptive testing and competence measurement scenarios.
The idea behind this is to use the results / data of previous "trigger questions" or test parts as a kind of switch that decides whether the (supposedly) next test part is shown or not. The ILIAS Survey already supports this kind of routing rule to control its playback behaviour.

  • two basic modes of a test part
    • static mode: relevant for the global result of the test
      • score is included in the global test result
      • not possible to skip the test part via a precondition
    • dynamic mode: not relevant for the global result of the test
      • score is not included in the global mark at all
      • test part can hold preconditions and can be skipped
      • the test part as such has a result which is shown in the global result presentation but is marked as "optional part - not counted"
  • test parts react to conditions or triggers to decide whether they are presented or not
    • new subtab in the Test Parts tab
    • options:
      • [default] always shown
      • only shown if a (pre)condition is fulfilled
    • basic assumptions:
      • fixed preset order of test parts
        • parts can be skipped
        • not possible to jump back via certain conditions
      • the "next" test part is shown depending on the result of one or more previous test parts
        • i.e. if the user fails at a specific question / test part / question group, the corresponding test parts will be shown / entered next
    • logic
      • the precondition handling should use the same logic that is used by the competence management of the test
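The behaviour described above can be sketched in pseudocode-style Python. This is a minimal illustration, not the ILIAS implementation (ILIAS itself is written in PHP); all class and function names here (`TestPart`, `Mode`, `parts_to_play`) are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, Optional

class Mode(Enum):
    STATIC = "static"    # counts toward the global test result, cannot be skipped
    DYNAMIC = "dynamic"  # optional part: may carry a precondition and be skipped

@dataclass
class TestPart:
    title: str
    mode: Mode
    # Precondition evaluated against the results of previously played parts;
    # only honoured for DYNAMIC parts. None means "[default] always shown".
    precondition: Optional[Callable[[dict], bool]] = None

def parts_to_play(parts: list[TestPart], results: dict) -> list[TestPart]:
    """Walk the fixed preset order once (no jumping back) and skip any
    dynamic part whose precondition is not fulfilled."""
    shown = []
    for part in parts:
        if part.mode is Mode.DYNAMIC and part.precondition is not None:
            if not part.precondition(results):
                continue  # condition not met: skip this optional part
        shown.append(part)
    return shown

parts = [
    TestPart("Basics", Mode.STATIC),
    # Remedial part is only entered if the learner scored below 50% on "Basics"
    TestPart("Remedial", Mode.DYNAMIC,
             precondition=lambda r: r.get("Basics", 1.0) < 0.5),
]
```

With a failing result such as `{"Basics": 0.4}`, `parts_to_play` keeps the remedial part; with `{"Basics": 0.9}` it is skipped. The global mark would then be computed from static parts only, with dynamic parts listed as "optional part - not counted".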

A wider perspective on adaptive learning scenarios is currently being discussed in the entry: Starting to Support Adaptive Learning Scenarios

3 User Interface Modifications

3.1 List of Affected Views

{Please list all views (screens) of ILIAS that should be modified, newly introduced or removed.}

3.2 User Interface Details

{For each of these views please list all user interface elements that should be modified, added or removed. Please provide the textual appearance of the UI elements and their interactive behaviour.}

3.3 New User Interface Concepts

{If the proposal introduces any completely new user interface elements, please provide a link to separate feature wiki entries for each of them according to the kitchen sink template.}

4 Technical Information

{The maintainer has to provide necessary technical information, e.g. dependencies on other ILIAS components, necessary modifications in general services/architecture, potential security or performance issues.}

5 Contact

  • Author of the Request: {Please add your name.}
  • Maintainer: {Please add your name before applying for an initial workshop or a Jour Fixe meeting.}
  • Implementation of the feature is done by: {The maintainer must add the name of the implementing developer.}

6 Funding

If you are interested in funding this feature, please add your name and institution to this list.

7 Discussion

8 Implementation

{The maintainer has to give a description of the final implementation and add screenshots if possible.}

Test Cases

Test cases completed at {date} by {user}

  • {Test case number linked to Testrail} : {test case title}

Approval

Approved at {date} by {user}.

Last edited: 17. Mar 2017, 09:59, Kunkel, Matthias [mkunkel]