(Project) E-Exams with ILIAS

This is a project page that bundles several feature wiki pages which belong to a larger development activity for the ILIAS component Test & Assessment and Exercise (?).

1 Aim of Project

Using the Test & Assessment (T&A) for e-exams is one of the most important usage scenarios for this component – and the highest rated by the Splitting-up Test&Assessment Working Group. Having a reliable and secure piece of software to realise e-exams is of highest importance for organisations and institutions that must offer these types of assessment. Due to the monolithic structure of the T&A, a huge amount of code and settings are part of a test but not needed for an e-exam.

The COVID-19 pandemic has increased the usage of the test module in ILIAS and revealed bigger, underlying issues with the current T&A implementation when it comes to e-exams:

  • ILIAS tests are built as a generic tool to cover a lot of use cases, some of which are detrimental when realizing e-exams with ILIAS tests
  • ILIAS tests do not support the surrounding process (and the persons/roles involved) of e-exams well (preparation of e-exam, marking of e-exam, archiving of e-exams etc.)
  • e-exams have specific "must have" requirements that need to be considered on a conceptual level to ensure a useful implementation
  • e-exams need a heavy focus on the best possible performance to support larger numbers of test participants
The following is meant to serve as a general concept/overview of the necessary changes to the T&A module in ILIAS in order to fulfill the requirements of e-exams. All listed features and modifications still require a detailed specification in the form of separate feature requests.

Please note that some of the requirements of e-exams described here are solved (in part at least) by publicly available plugins. These features are still listed as they MUST be integrated into the core software, if ILIAS is meant to support e-exams out of the box.

Indication of Requirement Level

In order to provide a clear understanding of the importance/relevance of a requirement, all feature requests use the following keywords as described in RFC 2119:

  • MUST, MUST NOT
  • REQUIRED
  • SHALL, SHALL NOT
  • SHOULD, SHOULD NOT
  • RECOMMENDED
  • MAY
  • OPTIONAL

2 Requirements for E-Exams with ILIAS

The working group for e-exams does not represent all stakeholders for the T&A module. This feature request should be discussed with the whole project group, not least because the concrete technical architecture of the new components and their interfaces is still under discussion.

In a first step, the working group analysed the current T&A and classified all settings and options concerning their need and usage in an e-exam scenario. It quickly became apparent that the problems we are facing cannot be solved by adding or removing some configuration settings. Instead, a refactoring of the T&A module may be needed and some functions/features may need a complete overhaul. This offers the chance to adapt the test player for e-exams and to better support users during the full process surrounding e-exams, which consists of:

  • preparation of e-exam
  • execution of e-exam
  • marking / scoring of results
  • inspection of published results for participants
  • archiving of results and export of data
  • administration of test objects
This list of requirements is the result of the working group for e-exams. New features and/or modifications are ordered by the part of the e-exam process for which they are needed. In addition, we have provided user stories (see below) on which these requirements are based. The list of user stories is incomplete and can only serve as an example. Where necessary, the user stories need to be expanded for individual feature requests.

Guiding Users through Preparation of E-Exams

The T&A module offers a wide array of settings and covers a plethora of use cases. The result can quickly become overwhelming, especially for new and inexperienced users. While this breadth is necessary for power users, the current implementation SHOULD better guide new users during the preparation of e-exams. Mandatory features/settings must be covered, features that SHOULD NOT be applied in this context SHOULD be removed. Wherever possible, the complexity of the T&A SHOULD be reduced.

We see three ways to achieve this:

  1. Redefine the tabs/subtabs of the test repository object so that they follow the e-exam process (see above)
  2. Implementing a Guide/Checklist feature for e-exams that users can follow to make sure their configuration is valid for e-exams; something like this already exists for learning-objectives-based courses, although it would have to be enhanced for our purposes (see the sketch after this list)
  3. Implement/support a workflow covering the complete process of e-exams
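To make the guide/checklist idea more concrete, here is a minimal sketch of how such a feature could be modelled. All names (ChecklistStep, TestConfig, the individual checks) are hypothetical illustrations, not existing ILIAS classes:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class TestConfig:
    """Hypothetical stand-in for the test settings relevant to e-exams."""
    terms_of_use_id: int | None = None
    working_time_minutes: int | None = None
    section_count: int = 0

@dataclass
class ChecklistStep:
    label: str
    is_satisfied: Callable[[TestConfig], bool]  # validator over the test config

# A guide would walk the examiner through steps like these and flag open ones.
EEXAM_CHECKLIST = [
    ChecklistStep("Terms of use selected", lambda c: c.terms_of_use_id is not None),
    ChecklistStep("Global working time set", lambda c: c.working_time_minutes is not None),
    ChecklistStep("At least one test section", lambda c: c.section_count >= 1),
]

def open_steps(config: TestConfig) -> list[str]:
    """Labels of checklist steps that are not yet satisfied."""
    return [step.label for step in EEXAM_CHECKLIST if not step.is_satisfied(config)]
```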

Test Sections

A major point of criticism in the preparation concerns the limited possibilities of didactic test design. From a didactic point of view it is a disadvantage that most settings regarding questions, i.e. the selection mode and behavior of questions, affect the test as a whole. There are scenarios that require the combination of fixed and random sets of questions, like an initial part with a fixed set of questions, a middle part with randomly selected ones, and a final part with a fixed set. The order of test sections can be fixed or randomized.

Further, there is a need to assign different behaviors to different selections of questions in the same test. For some questions, shuffling may be useful to gain variety in the presentation of questions. Other questions build on the correct answers of previous questions, refer to a certain topic, and have to be displayed in a fixed order with locked answers before follow-up questions can be accessed. Finally, some exams that cover multiple topics offer a selection of these topics to participants; participants are only required to work on 2 of 3 topics, for example.

Exams typically contain a high quantity of questions that refer to different main topics. Examiners and participants prefer a structured presentation of questions according to their associated topics. Random tests or tests with shuffled questions, in particular, currently prevent an organized structure of questions.

As a result, tests SHOULD allow for sections with different settings for the selection mode and behavior of questions, to meet didactic requirements and improve usability. A test MUST always contain at least one (1) section. A sketch of such a section model follows below.
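As an illustration of this requirement, a test could be modelled as an ordered list of sections, each with its own selection mode and behaviour. This is only a sketch under the assumptions described above; all class and field names are hypothetical:

```python
from dataclasses import dataclass, field
from enum import Enum

class SelectionMode(Enum):
    FIXED = "fixed"    # a fixed set of questions
    RANDOM = "random"  # a random selection from assigned question pools

@dataclass
class SectionBehaviour:
    shuffle_questions: bool = False  # vary the presentation order
    fixed_order: bool = False        # questions must be worked on in order
    lock_answers: bool = False       # lock answers to unlock follow-up questions

@dataclass
class TestSection:
    title: str
    selection_mode: SelectionMode
    behaviour: SectionBehaviour
    question_ids: list[int] = field(default_factory=list)

@dataclass
class Test:
    sections: list[TestSection]

    def __post_init__(self) -> None:
        # Requirement from above: a test MUST contain at least one section.
        if not self.sections:
            raise ValueError("a test must contain at least one section")
```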

Question Groups

In addition to test sections we need a way to group questions together. We see this functionality as a part of the test question pool.

Global Working Time Instead of Test Access by Date

Test access settings and working time are confusing and often come into conflict with each other. They MUST be combined into one set of settings, so that a "wrong" application of the two separate settings is no longer possible.

Aside from extensions of test access / working time for participants with disadvantages (see below), test access / working time MUST be equal for all participants. There MUST NOT be any individual working time on an e-exam that depends on when a participant starts the test, etc.

It is sufficient to set a global working time in minutes (e.g. 120 min) for the test. The starting point is determined by the examiner.

Granting Individual Test Working Time Extension for Test Participants with Disadvantages

Students with disadvantages can apply for a time extension before the exam. Currently, a time extension can only be granted after the test has started. Additionally, the finishing time cuts off any given time extension. Consequently, an examiner has to change the finishing time for all students in order to grant a time extension to a single student.

In order to solve this, examiners MUST be able to extend the set test working time for individual participants during preparation of the e-exam. The global setting for test working time will only apply to a participant if it has not been extended by the examiner. It MUST NOT be possible to reduce the global test working time period for a given participant.

A test working time extension MAY be granted as a fixed amount (e.g. +20 min) or as a percentage (e.g. +20%), as the sketch below illustrates.
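A minimal sketch of the resulting calculation (the function name and signature are hypothetical). The key property is that extensions can only add to the global working time, never reduce it:

```python
def effective_working_time(global_minutes: int,
                           extension_minutes: int = 0,
                           extension_percent: float = 0.0) -> int:
    """Working time for one participant: the global working time plus an
    optional nominal and/or percentage extension granted by the examiner."""
    if extension_minutes < 0 or extension_percent < 0:
        raise ValueError("an extension must not reduce the global working time")
    return (global_minutes + extension_minutes
            + round(global_minutes * extension_percent / 100))

# A 120 min exam: +20 min yields 140 min, +20% yields 144 min.
assert effective_working_time(120, extension_minutes=20) == 140
assert effective_working_time(120, extension_percent=20) == 144
```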

Terms of Use for Tests

Exams need to be taken under specific regulations. In general, participants MUST agree to these regulations before they can start and/or hand in the exam. Currently, ILIAS tests do not support any sort of test-specific terms of use. Administrators SHOULD be able to define system-wide terms of use for tests. Examiners who create e-exams MUST select one of the predefined terms of use the system offers. This way, examination regulations can be managed by system administrators, and examiners do not need to copy and paste this information themselves.

In addition, examiners MUST be able to add specific/individual rules to each e-exam/test (basically an addendum to the terms of use). This is needed so that examiners can define specifics, for instance what tools/software participants are allowed to use for one e-exam in particular. This addendum MUST be saved separately from the general terms of use for tests.

ILIAS MUST also save who accepted which terms of use for which test, when, and which version(!) of the terms of use was accepted.

Terms of Use SHOULD be accessible before test start.
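A sketch of the acceptance record this implies, one immutable entry per participant and test (field names are hypothetical):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class TermsAcceptance:
    """Who accepted which version of which terms of use for which test, and
    when. The examiner's addendum is referenced separately, as required."""
    user_id: int
    test_id: int
    terms_id: int
    terms_version: str       # exact version that was shown and accepted
    addendum_id: int | None  # test-specific addendum, stored separately
    accepted_at: datetime

record = TermsAcceptance(user_id=4711, test_id=42, terms_id=3,
                         terms_version="2024-03", addendum_id=7,
                         accepted_at=datetime.now(timezone.utc))
```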

Randomizing Test Passes before Test Start

For better performance, ILIAS SHOULD generate the final test data for participants before the test starts. This includes randomized parameters for formula questions etc. Test data SHOULD NOT be generated at runtime (when participants start the test or open a question).

The test SHOULD be able to generate test pass data for a fixed list of participants or for a given number of planned participants (1-N). In the latter case, a generated test pass is randomly assigned to the first participant starting the test.

Examiners can view final test data for a test pass during the preparation of e-exams.
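A sketch of the pre-generation and assignment logic under these assumptions (data layout and names are hypothetical; the real pass data would contain the full question selection and randomized parameters):

```python
import random

def pregenerate_passes(test_id: int, count: int, question_ids: list[int],
                       seed: int | None = None) -> list[dict]:
    """Generate final test pass data for a planned number of participants
    before the test starts, so nothing has to be computed at runtime."""
    rng = random.Random(seed)
    return [{
        "test_id": test_id,
        "pass_no": i,
        "question_order": rng.sample(question_ids, k=len(question_ids)),
        "assigned_to": None,  # filled in when a participant starts the test
    } for i in range(count)]

def assign_pass(passes: list[dict], user_id: int) -> dict:
    """Randomly hand one unassigned, pre-generated pass to a participant
    who starts the test."""
    free = [p for p in passes if p["assigned_to"] is None]
    chosen = random.choice(free)
    chosen["assigned_to"] = user_id
    return chosen
```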

Dashboard Key Features for Examiners

The test dashboard SHOULD be reworked to become the central view for examiners to control the exam during execution. The current implementation of the dashboard for ILIAS tests does not offer features needed by examiners running an e-exam, for example:

  • examiners SHOULD have a way to inform participants during the test (e.g. to announce the imminent end of the working time); see the plugin ExamNotifications as an example
  • examiners MUST have a way to restart/unlock a test pass that has been finished unintentionally (manually) as long as the test access/working time has not been reached
  • examiners MUST have a way to extend the test working time for a given participant on the fly in case of technical difficulties
  • examiners MUST have a way to unlock an automatically finished test pass (by adding a time extension)
  • examiners MUST have a way to see the real current status for a given participant (exam submitted, processing time elapsed or time left) 
    • Detailed specification for "test pass status" pending

Accessibility and Usability Improvements for Test Player GUI

  • Students SHOULD have a way to comment on a question/answer while working on the e-exam.
  • The test player GUI needs improvements for accessibility.
  • Test GUI elements MUST be controlled by specific settings (currently the display of the countdown depends on whether individual working time is active or not).
    • It should be evaluated whether certain configurations of the test player GUI can be removed, e.g. the display of the overview of questions could be mandatory.

No Automatic Scoring of Answers During Test Run or When Test is Finished

Currently, ILIAS scores the answers to a given question each time an answer is saved during a test run (if the question can be scored automatically). This is NOT RECOMMENDED for e-exams, as it costs performance, and participants SHOULD NOT be presented with results before the automatic results have been checked and scored by examiners.

Automatic scoring of answers SHOULD be run after all participants have finished working on the test, preferably triggered by the examiner or via cronjob/run as a background task.
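A minimal sketch of such a batch scoring step, assuming automatic scoring functions per question (all names hypothetical):

```python
from typing import Any, Callable

def score_finished_test(answers: dict[int, dict[int, Any]],
                        scorers: dict[int, Callable[[Any], float]]) -> dict[int, float]:
    """Score everything in one batch after all participants have finished:
    `answers` maps user id -> (question id -> given answer), `scorers` maps
    question id -> automatic scoring function. Nothing is scored while
    answers are being saved during the run."""
    return {
        user_id: sum(scorers[qid](answer) for qid, answer in user_answers.items())
        for user_id, user_answers in answers.items()
    }
```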

Finishing Test and Submission of Answers

ILIAS MUST distinguish between participants manually finishing the test (i.e. handing in the exam!) and participants who do not finish on time. The latter MUST be a defined event and logged separately. Tests finished by an ILIAS cron job should also be marked as such and be logged!

Participants SHOULD have the option to hand in the test either manually or after the working time has been reached.
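The distinction could be modelled as explicit, logged finish events, for example (a sketch; names hypothetical):

```python
from enum import Enum

class FinishEvent(Enum):
    """Every way a test run can end is a distinct, logged event."""
    HANDED_IN_MANUALLY = "handed_in_manually"      # participant submitted the exam
    WORKING_TIME_ELAPSED = "working_time_elapsed"  # time ran out for the participant
    FINISHED_BY_CRON = "finished_by_cron"          # closed by an ILIAS cron job

def log_finish(test_id: int, user_id: int, event: FinishEvent) -> None:
    # Sketch only: append the event to the (extended) test protocol.
    print(f"test {test_id}: user {user_id} finished via {event.value}")
```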

Extending Test Protocol for E-Exams

ILIAS creates a test protocol for each ILIAS test. This protocol can only be accessed by admins. It MUST be accessible for examiners (read only!). In addition, the protocol needs to be extended: it MUST encompass all activities regarding the e-exam, including but not necessarily limited to:

  • changes to test settings
  • creating sections
  • adding questions and/or question groups to a section
  • changing/defining a scoring schema (per section)
  • adding participants to the fixed set of allowed participants for a test
  • extending the test working time for a given participant (beforehand or during the e-exam)
  • participants accepting the terms of use
  • test/e-exam start by participant
  • answering a question during execution by participants
  • test finish by participant (manually)
  • test closing due to the access limit being reached (automatically)
  • participant withdrawing from an e-exam
  • commenting on a question during execution (including the content of the comment)
  • removing a question from scoring of the e-exam
  • providing manual feedback during scoring (including the feedback content) by all examiners
  • adjusting points for answers during scoring
  • publishing scored answers
  • generation of test archive export data
  • messages the examiners sent to all participants during execution of the test
Note: Detailed specification pending.
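As an illustration, each of the activities above could become one append-only entry in the protocol, along these lines (a sketch; field names are hypothetical and not part of the pending specification):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class ProtocolEntry:
    """One append-only entry in the extended e-exam protocol."""
    timestamp: datetime
    test_id: int
    actor_id: int     # the examiner, participant, or admin who acted
    actor_role: str   # "examiner", "participant", "admin"
    action: str       # one of the activities listed above
    payload: dict     # action-specific details, e.g. the content of a comment
```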

Problems with the Scoring and Marking Process

There are several problems concerning the scoring and marking process of the ILIAS Test.

  1. Firstly, there are at least four different pages that contain some information regarding a participant's results. This is neither user-friendly nor easy to comprehend.
  2. Furthermore, marking an exam is quite often a process that requires flexibility and a mixture of information and adjustment.
    1. There are often different examiners, which is not yet reflected in the ILIAS Test.
    2. Also, there are usability issues, like the manual scoring of one question for several participants at once or the obligation to change pages to make adjustments within "Corrections".
    3. Another major issue is the integrity of the content of questions that already have connections to participant data. These questions should not be allowed to be entirely erased from a test after its completion, as is currently the case.
  3. Scoring of results has no clear trigger and thus can be started unintentionally by examiners.
  4. Scored results do not have a history and can be "overwritten" by a subsequent "recalculation of results". This is made worse by the fact that subsequent scoring of results can lead to differences if the codebase of questions changes etc. For that reason, scored results need to be "locked" and tagged with a timestamp.
  5. Finally, in order to support the process of an e-exam, the completion of the marking process should be logged from beginning to end; the marking process needs a start- and finish-time stamp that is connected to the examiner’s ID.
  6. Best solutions for questions cannot be modified after the test has been run.

Merging Results, Scoring and Statistics Tabs

We suggest one page "Results" with different subcategories that also lead to a marking process for each participant or question. The main page "Results", as well as every individual result page of each individual participant and each individual question, should have a section that shows aggregated information and statistics.

Beyond the general statistical information on a test, admins/examiners should be able to decide whether they want to focus on the information on individual participants or on the questions. Jumping into the data of each participant or each question is also where the marking process should happen. See User Stories for more details.

New Permissions for Scoring and Marking

Currently there is only one permission "test results". This allows scoring and marking of results as well as deleting complete test data. The permission to delete test data MUST be separated, at the very least to prevent data loss and to protect examiners from catastrophic mistakes.

New permissions (see the sketch after this list):

  • read access to test results
  • score test results (needs read access to be effective)
  • delete test results
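A sketch of how the split permissions could interact, including the note that scoring is only effective together with read access (names hypothetical):

```python
from enum import Flag, auto

class TestResultPermission(Flag):
    """The single "test results" permission split into three, so that
    scoring rights no longer imply the right to delete test data."""
    READ = auto()
    SCORE = auto()   # only effective in combination with READ
    DELETE = auto()

def may_score(granted: TestResultPermission) -> bool:
    # Scoring requires read access to be effective.
    needed = TestResultPermission.READ | TestResultPermission.SCORE
    return (granted & needed) == needed
```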

Rework of Manual Scoring / Corrections

Manual Scoring and Corrections are two features that SHOULD be combined. Comments need to be exported and all activities need to be logged. In addition:

  • it MUST NOT allow changing the text of a question
  • it MUST allow changing the points for a question/answer
  • it MUST NOT allow deleting a question
  • it MUST allow changing the best solution to a question
  • it MUST allow commenting on the answer of an individual participant
  • it MUST allow commenting on a question for all participants
  • it MUST allow manual scoring of a question if the question type needs manual scoring
  • it SHOULD hide participants' names to allow anonymous scoring
  • it SHOULD track corrections by examiner
  • it MUST track the status of questions (scored, not scored)
  • it SHOULD lock the scoring of a question for others while one examiner is actively scoring it
  • it MUST allow the correction of randomized tests
  • it MUST show the complete question including content from the ILIAS page editor
In general the views for corrections are in need of a redesign to improve usability.

Historizing Test Results

Changes to the ILIAS codebase (bugfixes, new features etc.) may change the automatic calculation of test results. In order to prevent updates from changing the calculated results of tests that have been published, test results MUST be historized. That way, if test results are recalculated after a change, the original results will still be accessible.

The history of test results SHOULD include the commit id of the codebase at the time of generation.
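A sketch of such a historized result record; recalculation appends a new snapshot instead of overwriting the old one (field names hypothetical):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class ResultSnapshot:
    """One immutable, historized version of a participant's calculated
    results. A recalculation after a code change creates a new snapshot."""
    test_id: int
    user_id: int
    points: float
    calculated_at: datetime
    codebase_commit: str  # commit id of the ILIAS codebase at generation time
    published: bool       # only published snapshots are shown to participants
```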

Publishing Results for Inspection

Examiners MUST publish test results for inspection (if the inspection is handled via ILIAS). That means that a version of the calculated/manually scored test results MUST be manually published by examiners.

Access to test results MUST NOT be granted automatically by ILIAS.

Extending Test Protocol for E-Exams

See above.

In general, we think that a rework of the views/displays is needed to improve usability.

Limit Access to Test Results to Fixed Timeframe

In the context of e-exams, access to test results is granted for the purpose of inspection by participants. This requires a start and an end date. Currently, ILIAS only offers a fixed start date from which participants can access test results. ILIAS MUST also offer an end date. When the end date is reached, test result access is automatically deactivated for test participants again.
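The access check then reduces to a simple window test (a sketch):

```python
from datetime import datetime

def results_accessible(now: datetime, start: datetime, end: datetime) -> bool:
    """Participants may inspect their results only inside the configured
    window; once the end date is reached, access is deactivated again."""
    return start <= now < end
```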

Mandatory Settings for Test Results of E-Exams

ILIAS MUST always present the following information for the purpose of insight into the results of an e-exam:

  • best solution to all questions/versions of a question
  • scored answers
  • feedback by examiner (if given)
  • test pass id / exam id
  • participant id (i.e. matriculation number)
Currently some of this data is only shown when the corresponding setting is activated.

Streamlining Exports for Results and Statistics

At the moment, ILIAS offers different export formats for different things (results, statistics, the test itself) in different locations. A concept for these seems to be missing. For instance, some export formats containing test results are located under the statistics tab (e.g. "Results by Question" as PDF export). That makes no sense. Further, another result format can be found under Test -> Export, usually only used to generate ILIAS-specific import files for the object in question. Some exports under statistics contain different data depending on test settings (e.g. fixed selection of questions vs. randomized selection of questions).

To complete this mess, some export data under statistics is labeled as evaluation data, which suggests its primary use would be a kind of quality evaluation for exams; instead, the data provided in this export contains not only statistics for the test but also answers by participants. The format of this export is then actually more useful for scoring a test than most of the other export formats for "results".

We suggest re-defining all results/statistics-related exports and locating them properly. The tab "export" should only contain ILIAS-specific exports concerning the repository object and the full "archive export file".

Data Export as Asynchronous Background Task or Cron-Job

Note: By "export of data" we do not mean the export of test objects for the purpose of re-importing them on another ILIAS installation. This section deals with the export of test results, which is required to archive e-exams!

We suggest implementing an export routine for test data as a background task / cron job in the ILIAS core, as with the TestArchiveCreator plugin.

HTML Export for Archives

The current implementation for data export does not perform well with big datasets and is prone to errors when it comes to the export of user generated content (i.e. media elements in different formats/sizes).

This is especially true when the data export is used to archive the results of e-exams, and it is further emphasized by the most common format for archive exports: PDF. In this case the data/content has to be converted from HTML with media files into PDF files, all of which is done on the server. Not only are these files big and require a lot of disk space, but their generation also takes a long time.

Instead ILIAS MUST be able to create export files for the purpose of archiving in HTML format. Media elements need to be exported "as is" and MUST be embedded locally in the HTML files. A PDF version of the export MAY be offered in addition.

Focus on Text-based Formats

Text/string-based file formats for data exports MUST be offered wherever possible. Proprietary file formats like .xls etc. SHOULD only be a secondary option if needed, e.g. CSV over XLS, HTML over PDF.

RBAC Permission to Access/Create Export Files for Archiving

Currently there is no separate permission to control who can export data for tests. This MUST be implemented so that, if needed, the creation of export files for archiving can be centralized via the export tab.

Content Definition for Export Data of E-Exams for Archiving

Export files for archiving MUST contain (see the sketch after this list):

  • all questions that were part of the e-exam/test, including variants if the test has been randomized; each question/question variant MUST have a unique identifier
  • inputs (i.e. answers) from participants for each question
  • best solution for each question and question variant
  • scored results (i.e. scored inputs/answers to questions)
  • human readable e-exam/test protocol as CSV (vs test.log), including all actions taken during preparation(?), execution, scoring, insight(?) by examiners, participants, and admins
  • version of the terms of use participants have accepted including specifics added by the examiner
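Taken together, the required archive content could be sketched as a single manifest structure like the following (all field names hypothetical; the concrete file layout is left open):

```python
from dataclasses import dataclass, field

@dataclass
class ArchiveExport:
    """Required content of an archive export for one e-exam."""
    questions: dict[str, dict]        # unique question/variant id -> definition
    best_solutions: dict[str, dict]   # best solution per question/variant id
    answers: dict[int, dict[str, object]]        # user id -> answers per question
    scored_results: dict[int, dict[str, float]]  # user id -> points per question
    protocol_csv: str                 # human-readable protocol (vs. test.log)
    accepted_terms: dict[int, str] = field(default_factory=dict)  # user id -> terms version
```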

Revision of Test Section in the ILIAS Administration

The current implementation for the Test&Assessment section in the ILIAS administration does not meet the demands for administrators handling e-exams.

  1. There is no way to search and/or filter for ILIAS tests in the repository.
  2. Exporting test protocol data is cumbersome and inefficient (see (1)).
  3. There is no way to take actions on a test, e.g. set it offline/online etc.
In order to improve the administration of e-exams with ILIAS, we propose a full revision of the test section in the ILIAS administration. This revision SHOULD solve the aforementioned problems as well as improve the general usability for admins.

Firstly, an admin dashboard (see the sketch after this list)...
  • MUST allow to search and/or filter ILIAS tests in the repository by
    • title
    • owner
    • ref-id
    • date (test access)
    • status
  • SHOULD allow the admin to
    • set a given test offline/online
    • export the test protocol
    • jump to the test in question (link) to make further changes
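A sketch of the search/filter part of such a dashboard (names and fields hypothetical; date filtering omitted for brevity):

```python
from dataclasses import dataclass

@dataclass
class TestSummary:
    ref_id: int
    title: str
    owner: str
    status: str  # e.g. "online" / "offline"

def filter_tests(tests: list[TestSummary], *, title: str = "",
                 owner: str = "", status: str = "") -> list[TestSummary]:
    """Filter the repository's tests for the admin dashboard by title,
    owner, and status; empty criteria match everything."""
    return [t for t in tests
            if title.lower() in t.title.lower()
            and owner.lower() in t.owner.lower()
            and (not status or t.status == status)]
```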

User Stories

EE_Prep_1: Activate Available Terms of Use (Examiner, Preparation)
As examiner I select e-exam terms of use from the list of globally available terms of use for tests as the terms of use for my test under test settings.

EE_Prep_2: Set Terms of Use Addendum (Examiner, Preparation)
As examiner I can add requirements/information specific to my e-exam to the selected terms of use. The addendum MUST be saved separately.

EE_Prep_3: Create Test Sections (Examiner, Preparation)
As examiner I create one or multiple test sections according to my didactic design to structure and organize questions or groups of questions within the test.

EE_Prep_4: Add Questions to Sections (Examiner, Preparation)
As examiner I add a fixed or random set of questions or groups of questions from one or multiple question pools to each section and determine the question behaviour for each section.

EE_Prep_5: Moving Questions between Sections (Examiner, Preparation)
As examiner I can move questions/question groups via drag & drop.

EE_Prep_6: Rearranging Sections (Examiner, Preparation)
As examiner I can rearrange the order of test sections via drag & drop.

EE_Prep_7: Extending Test Access for Participants (Examiner, Preparation)
As examiner I can extend the set test access period on an individual level per participant, either by adding a fixed amount of time (e.g. +20 min) or by adding a percentage (e.g. +20%).

EE_Score_1: Receiving Insights (Examiner, Scoring)
As examiner I open the page "Results" and find general information on the test's results at the top of the page. If I need further and deeper statistical information, I have an option that activates it. Moreover, I can switch between insights on my test participants' individual results and insights on the questions and how they performed.

EE_Score_2: Scoring by Participant (Examiner, Scoring)
If I need to mark an essay question of one particular student, I look for the student's aggregated results in the list of participants. I choose to take a deeper look into the detailed results of that student. Within that view I can search for the essay question. I activate the grading mode for that question, give my marking and maybe leave a comment in the comment field. I click on "Save" and leave that view.

EE_Score_3: Scoring by Question (Examiner, Scoring)
If I need to mark an essay question of several or all students, I can switch to a view that shows all the questions of the test instead of the test's participants. There I can choose the question that I need to mark (for example an essay question). I receive a result dashboard that shows the question's original condition and overall performance. I also receive a view where all answers to that one question are shown in an expandable list, and I can go through them one by one, activate a "grading mode", give a mark and maybe leave a comment. After the marking process is finished and saved, the result dashboard shows detailed and updated information on the question's performance.

EE_Score_4: Changing a Question (Examiner, Scoring)
If I need to change a multiple choice question, I can go back and choose the question that needs changing. I receive a dashboard that shows the question's original condition and overall performance. I can activate a "grading mode" and alter the points assigned to the question's answer options. I cannot manipulate the original content afterwards. After the marking process is finished and saved, the result dashboard shows updated information on the question's performance.

EE_Score_5: Expanding Solution to a Question (Examiner, Scoring)
If I need to expand the accepted varieties of answers to a cloze question, because students have given answers that I did not foresee but that are still viable, I need to change the original condition of that question. I choose the cloze question and receive a dashboard of the question's original condition, how often participants gave the predetermined answers, and also the unforeseen varieties and how often they were given. I choose the varieties that are still viable, add them to the correct answers and distribute points to them. After saving, the dashboard shows updated information on the question's performance.

EE_Score_6: Removing Question from Scoring Schema (Examiner, Scoring)
If I need to remove a question from the scoring of the test, because it is unsuitable to give a credible measure of the students' capabilities, I can take it out or attribute full points to it, each without unbalancing the whole test. I choose the question that needs removing, select it, and choose to remove it. Then I can choose a) to fully remove the question from the scoring of the test. This will result in lower overall points for every participant. The question is not completely erased but merely deactivated. Or b) I can attribute full points to every participant. This means that the overall points for the test will remain the same, but the mark schema then needs slight shifting. This happens automatically.

EE_Score_7: Accessing Scoring History (Examiner, Scoring)
I would like to know if one of my colleagues has already done some marking or changed anything. On the page "Results" I find a button that leads to a change log or history. I find a list of logged actions that I can filter and/or arrange by categories like date, user, question or participant. The whole change log (or only certain parts of it) can be exported separately to XLSX format.

EE_Archive_1: Planning Generation of Data Export (Examiner/Admin, Archiving)
As examiner/admin I add my/a test to the queue of tests for which ILIAS will automatically generate export files for archiving, when the system resources allow or when the next cron job starts.

EE_Admin_1: Searching for Tests in Administration (Admin, Administration)
As admin I can search the repository for tests under Administration - Test - Dashboard by: title, owner.

EE_Admin_2: Filter Search Results for Tests in Administration (Admin, Administration)
As admin I can filter search results for tests on the admin dashboard by date, time, terms of use, etc.

EE_Admin_3: Generate Export Data for Archive (Admin, Administration)
As admin I can generate export data for a given test at runtime via the admin test dashboard.

EE_Admin_4: Administration of Asynchronous Data Export (Admin, Administration)
As admin I activate and configure cron jobs / background tasks (time of day to run, max. number of concurrent tasks etc.).

EE_Admin_5: Administration of Available File Formats for Export Files (Admin, Administration)
As admin I can define the file formats available to users exporting test data for each single export file, e.g. I can disable the PDF file format for archive export files.

EE_Admin_6: Create E-Exam Terms of Use (Admin, Administration)
As admin I activate and configure global terms of use for e-exams within the repository (root). Tests for e-exams MUST show a selection list of all terms of use to be activated for a particular test under test settings.

3 Involved Maintainers and Stakeholders

TBD

4 Timeline

TBD

5 Related Feature Requests and Status

Feature Request | Suggested by | Funding | Planned Release | Status
Administration of Tests: Log Data Export Improvements | Sesterhenn, Fabian [sesterhenn] | TH Köln | ILIAS 9 | Outdated
Clarification of Log Data for Tests | Sesterhenn, Fabian [sesterhenn] | TH Köln | ILIAS 9 | Outdated
Show Only Available Points | Rabah, Rachid [rabah]; Sesterhenn, Fabian [sesterhenn] | TH Köln, Uni Bonn | ILIAS 9 | Released
Revision of Logging in Test: Interface Overhaul | Strassner, Denis [dstrassner]; Kergomard, Stephan [skergomard]; Sesterhenn, Fabian [sesterhenn] | Universität Hohenheim, TH Köln | ILIAS 10 | Scheduled for ILIAS 10
Revision of Logging in Test: Refactoring and Migration | Strassner, Denis [dstrassner]; Kergomard, Stephan [skergomard]; Sesterhenn, Fabian [sesterhenn] | Universität Hohenheim, TH Köln | ILIAS 10 | Scheduled for ILIAS 10

6 Further Results

7 Additional Information

Test-Player E-Exams Working Group

  • FH Bielefeld
  • HS Bremen
  • TH Köln
  • Uni Bonn
  • Uni Köln
  • Uni Marburg

8 General Discussion

Please discuss specific questions of feature requests on the related feature wiki pages. This discussion section is only for a general discussion of the project and its realisation.
