Add manually graded questions within the same Gradescope assignment. For example, when configuring Gradescope programming assignments directly (without otter-grader), rich formatting can be added to the autograder output via its "html" output format. This means manual test cases can be treated as "autograder" tests (in the otter-grader sense) to generate the associated .py test files. Those files can then be run to plot student functions and write the plots to HTML by creating a temporary file or BytesIO object. By wrapping the student's plotting code in a function, the images can be rendered in the output of the same assignment. Below is an example of what the feature could potentially look like (the source does not use otter-grader or notebook homeworks, only standalone functions).
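The plot-to-HTML step described above could be sketched roughly as follows. This is a minimal illustration, not an otter-grader API: `figure_to_html_img` and the lambda are made-up names, and the matplotlib usage is only shown in a comment so the helper stays library-agnostic.

```python
import base64
import io


def figure_to_html_img(save_fn) -> str:
    """save_fn is any callable that writes PNG bytes to a file-like object,
    e.g. `lambda buf: plt.gcf().savefig(buf, format="png")` with matplotlib.
    Returns an <img> tag with the image embedded as base64, suitable for
    Gradescope's "html" output format."""
    buf = io.BytesIO()          # in-memory stand-in for a temporary file
    save_fn(buf)
    encoded = base64.b64encode(buf.getvalue()).decode("ascii")
    return f'<img src="data:image/png;base64,{encoded}"/>'


# Stand-in for a student's plot: here we just write some placeholder bytes.
html = figure_to_html_img(lambda buf: buf.write(b"\x89PNG\r\n\x1a\nfake"))
```

The returned string can then be placed in a test's `output` field so the image renders inline in the Gradescope results view.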
Implementation Strategies
Strategy 1:
Create a custom plugin that configures the Gradescope autograder with the "html" output format.
Strategy 2 (compliments of @chrispyles):
- Add a new key to the AutograderConfig to enable this behavior (something like `export_manual_responses`?).
- Update `otter assign` to link manually graded portions of notebooks to their question names.
- Add a method to the `GradingResults` object that accepts a notebook, extracts all of the manual response cells, and stores them keyed by question name.
- Update `GradingResults.to_gradescope_dict` to convert the manual cells to Gradescope's format and include them in the results dictionary it generates.
- Update both the Python and R runners' `run` methods to call the new `GradingResults` method with the submission notebook. (Note that it's possible to submit non-notebook files, which also get run by these runners, so this step should only be performed if the submission file's extension is `.ipynb`.)
- Add unit tests for all new behavior and update the changelog.
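The steps above might fit together roughly as in this sketch. Everything here is hypothetical: the `otter.question` metadata key, and the helper names `extract_manual_responses`, `manual_responses_to_gradescope`, and `maybe_collect` are placeholders for illustration, not otter-grader's actual API.

```python
import pathlib


def extract_manual_responses(nb: dict) -> dict:
    """Collect manually graded response cells from a parsed notebook,
    keyed by a question name assumed to be written into cell metadata
    by `otter assign` (the metadata key here is invented)."""
    responses = {}
    for cell in nb.get("cells", []):
        name = cell.get("metadata", {}).get("otter", {}).get("question")
        if name is not None:
            responses[name] = responses.get(name, "") + "".join(cell.get("source", []))
    return responses


def manual_responses_to_gradescope(responses: dict) -> list:
    """Convert manual responses to Gradescope test entries: 0/0 score so no
    points are awarded, "html" output format so rich content renders."""
    return [
        {"name": f"{q} (manual)", "score": 0, "max_score": 0,
         "output": text, "output_format": "html", "status": "passed"}
        for q, text in responses.items()
    ]


def maybe_collect(submission_path: str, nb: dict) -> list:
    """Only run the extraction for .ipynb submissions, since the runners
    also accept non-notebook files."""
    if pathlib.Path(submission_path).suffix != ".ipynb":
        return []
    return manual_responses_to_gradescope(extract_manual_responses(nb))
```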
Feature advantages
The feature would enable students to submit a single notebook file, with all manual and automatic test case outputs written directly to the Gradescope autograder, streamlining the grading process for large undergraduate courses that rely on Python notebook submissions.
I like this feature idea a lot. There are a lot of use cases here, I think, all of which are related.
Given that Gradescope only renders notebooks (or any code file) that are <= 10MB, you could, in theory, extend this idea to simply extract sections of a notebook, or rather a specific cell and its output.
I do think we'd need to be careful about how the "test case" output is displayed, since Gradescope only displays pass/fail, and we'd want to be clear that a passing "test case" which merely extracts content to be graded later doesn't mean the student gets the points. (A 0/0 score with a passing status seems fine, but we'd probably want to put some message in the results too.)
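For illustration, such an entry in Gradescope's `results.json` might look like the following (the question name and message text are made up):

```json
{
  "tests": [
    {
      "name": "q3 (manual)",
      "score": 0,
      "max_score": 0,
      "status": "passed",
      "output_format": "html",
      "output": "<em>This response was extracted for manual grading; the passing status does not award points.</em>"
    }
  ]
}
```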
I think this is a good feature idea. Unfortunately I don't have the bandwidth at the moment to work on it and I'm trying to get v6 out ASAP, so this won't make it in to that version.
Regarding strategy 2: otter keeps a copy of the executed notebook in the GradingResults object, so grabbing plots generated by student code/responses to other manually-graded questions after grading should be very easy. The question is how to communicate the regions of the notebook that are manually-graded; just tagging cells wouldn't work 100% of the time since if a student adds or deletes response cells, they won't get included in the output. Otter currently demarcates manually-graded portions with HTML comments, and this strategy could be reused to create mini-notebooks which nbconvert could then convert to HTML.
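A rough sketch of the mini-notebook idea, working on plain notebook dicts to keep it self-contained: the BEGIN/END marker text is illustrative (otter's real delimiters may differ), and `split_manual_regions` is an invented name.

```python
# Illustrative markers; otter's actual HTML-comment delimiters may differ.
BEGIN, END = "<!-- BEGIN QUESTION -->", "<!-- END QUESTION -->"


def split_manual_regions(nb: dict) -> list:
    """Group cells between BEGIN/END markers. Each returned cell list could
    be wrapped in a minimal notebook dict and rendered to HTML with
    nbconvert's HTMLExporter."""
    regions, current = [], None
    for cell in nb["cells"]:
        src = "".join(cell.get("source", []))
        if cell["cell_type"] == "markdown" and BEGIN in src:
            current = []                      # a manually-graded region starts
        if current is not None:
            current.append(cell)              # collect cells inside the region
        if current is not None and cell["cell_type"] == "markdown" and END in src:
            regions.append(current)           # region closed
            current = None
    return regions
```

Note that this scans for the markers on every submission rather than relying on cell tags, which sidesteps the problem of students adding or deleting response cells.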
There is also #632 about adding rich output for the autograder results which might also be something to consider as part of this effort.
Happy to review any PRs related to this if anyone wants to contribute, otherwise I will try to shoot for this in a future release (but not sure when).