Auto grading for Jupyter Notebook labs
Jupyter Notebook labs now support automatic grading. An author can create an assignment cell within a notebook, set its maximum score, and provide a set of unit tests used to check the learner’s code. An author can make tests visible to learners or hide them. When the learner completes the lab, the tests are run against the learner’s solution. The score for the assignment is calculated from the test results and reported to the LMS.
Submitting partially completed tasks
A learner can now submit a task even if it does not pass all the tests and receive a partial score.
New UI for Jupyter Notebook labs
A new UI was implemented for labs using Jupyter Notebooks. It has a modern look and feel and provides instructors and learners with these new options:
- Uploading additional files (data files, notebooks) for learners to work with
- Importing existing notebooks to use in a lab
- Working with multiple windows
- Opening the console and running commands
Manual assessment for Jupyter Notebook labs
Now an instructor can configure a Jupyter Notebook lab to be assessed manually. After a learner completes the lab, the solution is reported to the instructor. The instructor can:
- Review the code in the learner’s solution
- Leave feedback on the learner’s solution
- Assign scores for the tasks with manual grading
- Adjust scores for auto-graded tasks based on the quality of the learner’s code
After the lab task is assessed, the feedback report is generated and presented to the learner.
Masked tests
An instructor can now hide the details of some tests in a lab by configuring them as “masked”. Learners see only the result of such tests (passed or failed), without any details about the test data.
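The masked-test behavior can be sketched as a filter over the learner-facing report: masked tests keep only their pass/fail status, while regular tests include failure details. The function and report format below are illustrative assumptions:

```python
def learner_report(test_results, masked_tests):
    """Build a learner-facing report. For masked tests, show only
    pass/fail; for regular tests, include the failure details."""
    report = []
    for name, (passed, details) in test_results.items():
        if name in masked_tests and not passed:
            # Hide test data and assertion messages for masked tests.
            report.append(f"{name}: failed")
        else:
            status = "passed" if passed else f"failed ({details})"
            report.append(f"{name}: {status}")
    return report


results = {
    "test_sum": (True, ""),
    "test_hidden_edge_case": (False, "expected 42, got 41"),
}
print(learner_report(results, masked_tests={"test_hidden_edge_case"}))
# ['test_sum: passed', 'test_hidden_edge_case: failed']
```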
Unit tests for Eiffel
Coding labs in the Eiffel programming language now support unit tests. These tests automatically verify and grade a learner’s solution.
More detailed error messages
Error messages displayed when working in the Coding Lab authoring mode now provide more detailed error descriptions and include the filename and the location of the error in the file.
Enhanced environment management
Now an author can do the following:
- Manage lab environments (create, edit, and delete them) on a separate page
- Monitor how many labs, and which ones, use each environment
- Update an environment that is used in several labs with a single action, for example, replacing a VM image that all of them use
New UI languages
The Virtual Lab learner UI is now available in the following new languages:
- Chinese (China)
- Portuguese (Portugal)
Improved access to lab variables from scripts
A new library is now available that stores variables for use in Custom Actions and Acceptance Criteria scripts. It provides a uniform and simplified way of reading and setting lab variables.
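A uniform variables interface might look like the sketch below. The class and method names are hypothetical, since the release note does not describe the library's actual API:

```python
class LabVariables:
    """Hypothetical uniform store for lab variables, shared between
    Custom Actions and Acceptance Criteria scripts."""

    def __init__(self):
        self._store = {}

    def get(self, name, default=None):
        # Read a lab variable, with an optional fallback value.
        return self._store.get(name, default)

    def set(self, name, value):
        # Set or update a lab variable for later scripts to read.
        self._store[name] = value


# A Custom Action script sets a value...
lab_vars = LabVariables()
lab_vars.set("db_user", "student01")
# ...and an Acceptance Criteria script later reads it back.
print(lab_vars.get("db_user"))              # student01
print(lab_vars.get("missing", default=""))  # empty fallback
```

The point of such a shared store is that both script types read and write variables through one interface instead of each parsing environment files or session state on its own.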
Lab series
Authors can now configure lab series. A series consists of several labs that follow one after another.
When a learner runs labs from a series, the environment with the learner’s work is transferred from the current lab to the next one in the series. This way, learners can continue their work as they progress through the series and get a more cohesive experience of learning the software.
Enhanced session management and preview
The ‘Sessions’ page now has a new UI with options for sorting records, searching by any field, and quick filters for the most frequently used search options. It allows users to better manage lab sessions.
The improved active-session details view now includes a live preview of the learner’s VM screen.
Improved lab performance
Authors can now configure labs to pre-start and buffer VMs, with an option to pause or suspend a buffered VM. Using this option can significantly reduce lab start time.