Tag Archives: Assessment

Budgeting for Assessment

Workloads for Academics in Higher Education are often very complex, with teaching loads, research tasks and administration all competing for our attention, and frequent task switching adding to the complexity. For many academics, teaching loads are a significant part of their work, but explicitly looking at the time spent on assessment could bring better results for staff and students alike.

Contact Hours to Admin Hours

There are usually ad-hoc assumptions about the amount of administration time a module takes above and beyond the contact hours spent in front of a class. For our purposes, contact hours could just as easily be synchronous and asynchronous hours delivered on-line as time spent in more traditional on-campus delivery.

A common assumption is that each hour of contact takes two hours of this administration time, sometimes more. That administration time can be subdivided into various tasks such as preparation of teaching materials, delivery of assessment, and correspondence with students.
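To put numbers on that: under the two-to-one assumption, a module with 36 contact hours implies around 72 further hours of administration, and a substantial share of that will typically be assessment work.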

Preparation time on teaching materials can obviously be markedly higher for the first presentation of a module, or after significant changes, and many academics are currently putting in substantial additional preparation time to re-factor materials for on-line delivery.

Assessment Hours

I want to focus on the time spent on assessment because I feel this is a serious time sink for most academics. This is partly because when it comes to reform of learning and teaching in a curriculum, assessment is often the last consideration because we are nervous about the serious consequences of getting things wrong. It is also partly because we can sometimes draw the conclusion that time spent on assessment equates directly to quality.

How often have you attempted to place a budget on your assessment time before delivering a module? I mean the time taken to design an assessment, deliver it to students, assess the submissions and deliver feedback. My guess is that very few of us have done this.
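As a purely illustrative budget: a class of 100 students submitting coursework that takes 30 minutes each to mark and give feedback on represents 50 hours of work – well over a working week – before any time for designing or delivering the assessment is counted.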

The outcomes of this can be serious. We often design assessments with a focus on the first half of these tasks, and they then take a tremendous and unquantified amount of labour to deliver in full, often much more than we really expected. The result can be a very stressed academic or team of academics, feedback delivered too late to be of effective use to the students, or feedback whose quality and depth suffers. Any combination of these outcomes is also possible.

Budget influences Design, poor Design blows the Budget

Agreeing a budget with a line manager, or even with yourself, can be informative. If you think the budget is too low you are faced with a choice: make the argument that additional resource is genuinely required, or re-design the assessment to fit within your budget.

Of course, there are often times when additional resource really is required, but my argument here is that this should be a conscious choice, planned for and, if possible, agreed with your line manager, who may be able to bring practical assistance, or at least balance out the rest of your workload.

Even if you agree a generous budget, a good plan and a good design minimise the risk of blowing it.

Design Choices

So what choices can we make to reduce the time burden? Some choices not only have no adverse effect on quality, but can actually deepen the quality of feedback or reflection opportunities for students. Here’s a very non-exhaustive list of thoughts in this direction.

  • Do you really need all those questions to confirm your learning outcomes? Do you have some questions that are just repeating the assessment of the same aspects? Trim them if so. Extra material can be used for tutorials instead.
  • Have you considered using a good rubric if you aren’t already? This can improve the transparency of outcomes to students both before and after assessments, provide some generic feedback, and hugely improve the speed of marking, leaving you with more time to give focused feedback.
  • Can you partially automate some of the assessment? If assessments are being delivered on-line, many Virtual Learning Environments allow you to set questions with fixed or calculated answers so that some of the marking and feedback can be automated. You can combine these with deeper, free-response questions.
  • Can peer assessment accomplish some of your goals? If you are nervous about using peer assessment (and it does need care), what about using it in a formative way as part of your assessment diet? This can also greatly deepen students’ understanding of how their work is marked and assessed.
  • Can self assessment accomplish some of your goals? This can encourage highly reflective learning and allow you to guide the feedback based on the students’ initial assumptions.

What are your ideas to reduce your assessment budget while keeping, or even deepening, the quality?

Even if you don’t undertake this formally with your line manager, try setting yourself an assessment budget, and consider how to work within it so that you can deliver authentic assessments and quality feedback in a way that leaves you time, focus and attention for the other parts of your job.

Assessment handling and Assessment Workflow in WAM

Some time ago I began writing a Workload Allocation Modeller aimed at Higher Education, and I’ve written some previous blog articles about this.

As is often the way, the scope of the project broadened and I found myself writing in support for handling assessments and the QA processes around them. At some point this will necessitate renaming WAM to something more general (answers on a postcard please), but for now, development continues.

Last year I added features to allow Exams, Coursework, and their Moderation and QA documents to be uploaded to WAM. This was reasonably successful, but a bit clunky. We gave several External Examiners access to the system; they were able to look in at the modules for which they were examiners, and the feedback was pretty good.

What Worked

One of the things that worked best about last year’s experiment was that we put in information about the Programmes (Courses) each Module was on. It’s not at all unusual for many Programmes to have the same Module within them.

This can cause a headache for External Examination, since an External Examiner is normally assigned to a Programme. In short, the same Module can end up being looked at by several Examiners. While this is OK, it can be wasteful of effort, and it creates potential problems when two Examiners have different perspectives on the Module.

So within WAM, I encoded an assumption about what we should be doing in paper-based systems anyway – that every Module should have a “Lead Programme”. The Examiner for that Programme is the one with primacy; furthermore, where Examiners are presented with other Modules on the Programme for which they aren’t the “lead” Examiner, they know these are for information, and they may not be required to delve into them in as much detail – unless they choose to.
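For the technically curious, here is a minimal sketch of how the “Lead Programme” idea can be expressed in Django terms (WAM is a Django app); the model and field names are illustrative, not WAM’s actual schema:

from django.db import models

class Programme(models.Model):
    title = models.CharField(max_length=200)

class Module(models.Model):
    title = models.CharField(max_length=200)
    # All the Programmes on which this Module appears.
    programmes = models.ManyToManyField(Programme, related_name='modules')
    # Exactly one Programme has primacy for External Examining purposes.
    lead_programme = models.ForeignKey(
        Programme, on_delete=models.PROTECT, related_name='lead_modules')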

This aspect worked well, and the External Examiners have a landing screen that shows which Modules they are examining, and which they are the lead Examiner.

What Didn’t Work

I had written code that was intended to look at which assessment artefacts had been uploaded since a user’s last login, and email them the relevant material.

This turned out to be problematic, partly because one had to unpick who should get what, but mostly because I’m using remote authentication with Django (the Python framework in which WAM is written), and it seems that the last login time isn’t always updated properly when you aren’t using Django’s built-in authentication.
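One possible workaround, sketched below, is to stop relying on last_login altogether and stamp your own “last seen” time on every authenticated request. This assumes a profile object with a last_seen DateTimeField reachable as request.user.profile; both names are illustrative rather than anything WAM actually ships:

from django.utils import timezone

class LastSeenMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if request.user.is_authenticated:
            profile = request.user.profile
            profile.last_seen = timezone.now()
            # update_fields keeps the write cheap and avoids clobbering
            # other profile fields saved elsewhere.
            profile.save(update_fields=['last_seen'])
        return self.get_response(request)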

But the biggest problem was a lack of any workflow. This was a bit deliberate since I didn’t want to hardcode my School or Faculty’s workflow.

You should never design your software product for HE around your own University too tightly. Because your own University will be a different University in two years’ time.

So, I wanted to ponder this a bit. The lack of workflow made visibility of what was going on a little difficult. It looked a bit like this (not exactly, as this is a screenshot from a newer version of an older module):

Old view of Assessment Items

with items shown from oldest at the bottom to newest at the top. You can kind of infer the workflow state by the top item, and indeed, I used that in the module list.

But staff uploaded files they then wanted to delete (which was disallowed for audit reasons), and the workflow wasn’t very clear, which made notifications more difficult.

What’s New

So, in a beta version of 2.0 of the software I have implemented a workflow model. I did this by:

  • defining a model that represents the potential states a Module can be in; each state defines who can trigger it, what can happen next, and who should be notified;
  • defining a model that records a “sign off” event (a rough sketch of both follows below).
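As a rough sketch of what those two models might look like in Django (field names illustrative, not WAM’s actual schema):

from django.conf import settings
from django.db import models

class WorkflowState(models.Model):
    # A state a Module's assessment workflow can be in.
    name = models.CharField(max_length=100)
    # The kind of stakeholder allowed to trigger this state
    # (e.g. coordinator, moderator, examiner).
    triggered_by = models.CharField(max_length=50)
    # The states that may legitimately follow this one.
    next_states = models.ManyToManyField('self', symmetrical=False, blank=True)
    # Who should be emailed when this state is reached.
    notify = models.CharField(max_length=50)

class SignOff(models.Model):
    # A record that a user signed a Module off into a given state.
    module = models.ForeignKey('Module', on_delete=models.CASCADE)
    state = models.ForeignKey(WorkflowState, on_delete=models.PROTECT)
    signed_by = models.ForeignKey(settings.AUTH_USER_MODEL,
                                  on_delete=models.PROTECT)
    created = models.DateTimeField(auto_now_add=True)
    notes = models.TextField(blank=True)

The key design point is that the valid transitions and notification targets live in data rather than code, so a School can adjust its workflow without a redeploy.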

Once it became possible to issue a “sign off” of where we were in the workflow, a lot of things became easier. This screenshot shows how it looks now.

Example of new assessment workflow

Ok, it’s a bit of a dumb example, since I’m the only user triggering states here (and I can only do that in some cases because I’m a Superuser; otherwise some states can only be triggered by the correct stakeholder – the moderator or examiner).

However, you can see that now we can still have all the assessment resources, but with sign offs at various stages. The sign off could (and likely would) have much more detailed notes in a real implementation.

This in turn has made notification emails much easier to create. Here is the email triggered by the final sign off above.

The detailed notes aren’t shown in the email, in case other eyes are on it and there are sensitive comments.

All of this code is available at GitHub. It’s working now, but I’ll probably do a few more bits before an official 2.0 release.

I will be demoing the system at the Royal Academy of Engineering in London next Monday, although that will focus entirely on WAM’s workload features.

OPUS and Assessment 3 – Regime Change

This is the third and final article in a short series on how OPUS, a system for managing placement on-line, handles assessment. You probably want to read the first and second article before getting into this.

Regime Change

It’s not just in geo-political diplomacy that regime change is a risky proposition. In general you should not change a regime once it has been established and students have been entered onto it. If you do, there is a risk that marks and feedback will become unavailable for existing assessments, that marks are calculated incorrectly, and so on. Obviously it is also poor practice for the transparency of assessment.

Instead you should create a new regime in advance of a new academic year, change the assessment settings in the relevant programmes of study to indicate that the regime will come into force in the new year, and brief all parties appropriately. All of this is done with the techniques covered in the first two articles. If you have done all that, well done, and you can stop reading now.

This article is about what to do if students are on a given assessment regime in OPUS, and somebody decides to change that regime midstream, when marks are already recorded for early items.

TL;DR DON’T DO THIS, TURN BACK NOW!

This shouldn’t ever happen; as noted, you really need to ensure your regime changes are correctly configured and enabled before any students start collecting marks.

And yet, it does happen, or at least it has happened to me twice that I have been asked to make tweaks to a regime where student marks already exist. Indeed it happened to me this week, hence this article.

Even changing small details like titles will affect the displayed data for students from previous years. Tweaking weightings could cause similar or more serious problems.

So what happens if we create a new regime and move our students onto it midstream? Well, the existing marks and feedback are recorded against the old regime, so they will “disappear” unless and until the students are placed back on that regime.

If you want to do this, and copy the marks over from the old regime into the new one, there is a potential way to do it. It has only been used a handful of times and should be considered dangerous. It also probably won’t work if your original regime has the same assessment appearing more than once for any given student.

But, if you’re here and want to proceed, it will probably be possible using what was deliberately undocumented functionality.

You will need command-line root access (deliberately – this is not a bug) in order to do this. If you haven’t got root access, you need to get someone who does. Read all the instructions before starting.

0. BACK UP ALL YOUR DATA NOW

Before contemplating this insanity, ensure your OPUS database is backed up appropriately. For good measure, I’d also extract a broadsheet of all existing collected assessment from the Information, Reports section of the Admin interface.

That said, this functionality deliberately copies data, it doesn’t delete it – but still.

0. NO REALLY, BACK UP ALL YOUR DATA NOW, I REALLY MEAN IT.

 

Ok, you’re still here.

First of all, this approach only makes sense (obviously) if the marks you have already captured remain valid – i.e. the assessment(s) you want to change are still in the future for the students and haven’t yet been recorded. If not, then obviously OPUS can’t help you do anything meaningful with the marks you have already collected.

1. Make your New Assessment(s)

Maybe you plan simply to change from one stock assessment to another, or to adjust a weighting on an existing assessment that hasn’t yet been undertaken by students this year. In that case, you can skip this step.

But if needed, create and test any new assessments following the approach laid out in the second article in this series. Do make sure you spend some time testing the form.

2. Add and Configure a New Assessment Regime

Create your new assessment regime, as detailed in the first article, but don’t link it to any programmes yet.

Your new regime should be configured as you wish it to be. Remember, for there to be any point in this exercise, the early assessments already undertaken by the students need to be the same (though not necessarily in the same order) – otherwise OPUS can’t help and you need to sort out all the marks in transition entirely manually.

3. Note the IDs of the Old and New Regimes

Things start to get clunky at this point. Remember, we are heading off road. You will need the database ID of both the old regime and the new one.

You can obtain these by, for instance, going to Assessment Groups in the Configuration menu and editing the regimes in turn. The URL will show something like this:

URL

At the very end, you will see “id=2”, so 2 is the id we want. Write these down for both regimes, noting carefully which is the old and which the new. It’s almost certain the new id will be larger than the old one.

4. Choose your timing well

You want to complete the steps from here on in smoothly, in a relatively short time period. It is advisable to switch OPUS into maintenance mode in a scheduled way, with prior warning. This can be done from the Superuser, Services menu in the admin interface if you are a superuser-level admin – if you aren’t, you shouldn’t be doing this without the help of such a user. You can also enter maintenance mode with the command line tool.

5. Use the Command Line Tool with root access

OPUS ships with a command line utility. With luck, typing “opus” from a root command prompt will reveal it. It’s usually installed in /usr/sbin/ and may not require root access in general, but it most certainly will insist on it for this use.

OPUS Command Line Tool
If that didn’t work, go find it in the cron directory of your OPUS install and run it with

php opus.php

If you needed that fallback to get it to work, use “php opus.php” in place of “opus” in the next command too. We need a command called copy_assessment_results, and you’ll note it’s not on the list. It’s not on the dev_help list either, because … did I mention this is a stupid thing to do? Enter the command as follows, changing the ids for the old and new regimes to those you wrote down in step 3. All on one line.

opus copy_assessment_results old_regime_id=1&new_regime_id=2
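One caution, which is an observation about shells rather than about OPUS itself: in bash and similar shells an unquoted & is treated as a background operator, so you may need to escape it (old_regime_id=1\&new_regime_id=2) for the whole argument to reach the tool intact.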

Don’t run this more than once: the code isn’t smart enough to avoid copying over an additional set of data, with possibly “exciting” results.

This copies assessment results, feedback and marks from one regime to the other. It’s potentially wasteful, because it can’t identify exactly the right students, and as an obvious precaution it doesn’t delete any data.

6. Enable the New Regime for Students

Even in maintenance mode, Superuser admins can log in and act. You can switch over your regime now. Maybe do this for one programme and test the results before using the bulk change facility discussed in the previous article.

With luck you will see your shiny new assessment regime, with the marks and feedback for existing work copied over from the old regime. Older students on the old regime should still show their results and feedback correctly.

If not – well, this is what that backup in step 0 was for, right? And you’ll have to do it manually from the broadsheet you exported as well.

7. Re-enable Normal Access

Either from the command line tool with

opus start

or from the Superuser, Services menu, re-open OPUS for formal access.

8. Corrective Action

Explain to relevant colleagues the pain and stress of having to do this and that in future all assessment regime changes should be done appropriately, before students begin completing assessments.

OPUS and Assessment 2 – Adding Custom Assessments

This is a follow-on to the previous article on setting up assessment in OPUS, an on-line system for placement learning. You probably want to read that first. This article is much more advanced and requires some technical knowledge (or someone who has it).

Making New Assessments

Suppose OPUS doesn’t have the assessment you want; then you will have to build your own, from scratch or by modifying an existing one. This takes some minor HTML skill, and access to your OPUS code to add a new file, so if you can’t do this yourself, ensure you get appropriate support.

Look at an existing assessment closely first. Go back to Advanced on the OPUS admin menu, and then Assessments.

For each assessment, clicking on Structure allows access to underlying variables that are captured. These can be numeric, text, or checkboxes, and some validation is possible too.

The Structure of an Assessment

You need to work out what things you will capture, and create a skin for the assessment, most usually by modifying one from another assessment. The following snippet from a related Smarty template shows that this is just HTML, but OPUS, through Smarty, drops in an assessment variable that gives access to any existing values and any validation errors.

{* Smarty *}
{* Template for SEME Final Visit *}
{* Assessment specific layout *}
{* Really, only the form contents need to be added *}
{* Note the use of get_value and flag_error to *}
{* bring in assessment specific material *}

...

<tr>
<td class="property" rowspan="5">Use of English</td>
<td colspan="1"><strong>Mark</strong></td>
<td colspan="4"><strong>Comments</strong></td>
</tr>

<tr>
<td colspan="1">
{$assessment->flag_error("mark5")}
<input type="text" class="data_entry_required" size="2" value="{$assessment->get_value('mark5')}" name="mark5">
</td>
<td colspan="4">
{$assessment->flag_error("comment5")}
{include file="general/assessment/textarea.tpl" name="comment5" rows="7" cols="60"}
</td>
</tr>

<tr><td colspan="5">Marking Scheme</td></tr>
<tr>
<td> Uses language to clearly express views concisely </td>
<td> Expresses clearly but with some minor errors </td>
<td> Good expression, logical flow, reasonably concise </td>
<td> Reasonable flow, some contorted expressions, a little verbose </td>
<td> Poor expression, verbose, some colloquialisms </td>
</tr>
This is a representative snippet; you can see the full template here. Note the “special” code between braces { }. The variables in the template correspond to the names in the structure.

Create and Save Your Template

Create your template, probably using one of the existing ones to help you understand the format. This provides the layout and skin for your pro-forma and allows you to do anything you wish with HTML/CSS. Be mindful of security considerations, but you aren’t writing main code, just an included fragment – OPUS will top and tail the file for you when it runs.

Save it under the templates/assessments directory in your OPUS install. I recommend you make a subdirectory for your institution.

Avoid using the “uu” directory. This is used for pre-shipped assessments and those used at Ulster University. There is a chance your changes will get clobbered by a new OPUS version if you put your template in there.

Adding the Assessment variables into OPUS

Then you need to create your new Assessment item itself, as at the top of the article. Once you have created it, click on structure and add each variable you will capture in turn – whether it is text, a number, or a checkbox – along with any simple validation rules, such as minimum or maximum values.

The final detail of one variable

The description appears in feedback and validation, so make sure it is meaningful to the end user. The name is the variable name as it appears in your template. The weighting field determines whether numeric values contribute to the score: usually use 1 if you want the score to be counted, and 0 if you want it to be ignored. Finally, you can choose whether each field is compulsory or not. Optional fields will be ignored in the total when OPUS creates a percentage.
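As an illustrative worked example, assuming I’ve described the weighting correctly: a form with two compulsory numeric fields, each marked out of 5 and weighted 1, would turn marks of 4 and 3 into a total of 7 out of 10, which OPUS would report as 70%, while a field weighted 0 would appear on the form but not in the total.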

Once complete, add your new assessment into a test regime as detailed in the first article and do some careful testing before adding the regime to live students.

OPUS and Assessment 1 – The Basics

OPUS is a FOSS (Free and Open Source Software) web application I wrote at Ulster University to manage work-based learning. It has been, and is, used by some other universities too.

Among its features is a way to capture the assessment structure for different groups, and how it changes over the years, in such a way that legacy data remains correct for audit.

You don’t have to use the in-built assessment functionality in OPUS, but the features were written to promote transparency of assessment, and ensure all stakeholders could easily access assessment information for a student.

So here’s how to do it. It takes a bit of set-up, but should then run smoothly until you decide to change how you assess. This is the first of a short series on the matter.

Assessment Regimes

OPUS uses a “bank” of individual assessments that can be combined, with different weightings, into an assessment regime. To be precise, OPUS provides a means of capturing the rubric for each assessment and the feedback to students. Each assessment has a Smarty template which “skins” the assessment form. These can be found in the Assessment section of the Advanced tab of the admin interface.

A list of OPUS assessments

For most people using OPUS, you build an assessment regime from these components in a pick-and-mix fashion. Head to the Configuration tab, and select Assessment Groups. This may well be empty in an out-of-the-box install, in which case create a group with an appropriate name and some commentary on what it is for.

A list of Assessment Groups

Once you have a group, you will see an option to edit the regime that is associated with it.

A typical assessment regime.

When we add an item, a dialog appears to enter some information.

A regime item.

In this we pick which of the assessments from the bank we want to use; you might decide, for instance, to use the same assessment twice in a given regime, at different stages. Give a description of what the assessment name should be for students, and a weighting (which could be zero for formative-only assessments).

You can also specify who should assess this – it could be the academic tutor assigned to the student, the workplace supervisor, the student themselves or labelled as “other”.

The year is specified relative to the year of placement, and should usually therefore be zero. Finally, start and end are the month and day (MMDD) for when work should begin on the assessment, and its deadline. These are used to help prompt staff and to order assessments for students.
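As a purely illustrative example: a mid-placement employer report might use a stock assessment from the bank, be shown to students as “Interim Employer Report”, carry a weighting of 20, be assessed by the workplace supervisor, sit in year 0, and have a start of 0101 and an end of 0131 – i.e. work begins on 1 January and the deadline is 31 January.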

Adding Regimes to Programmes

Once an assessment regime has been created, you need to tell OPUS you want to use it with students in a given programme.

Go to Configuration, then Organisation Details, navigate to the relevant school of study and pull up its list of programmes. For each programme you can click on assessment; from here you can select which regime is appropriate for the programme, and the years in which the regime starts and stops being valid. You can leave out an end year to let the decision roll on.

More often than not you will wish to apply these changes to at least a whole School. Clicking on Bulk Change Assessment will allow you to select all the programmes within a School, the new assessment regime you want, and the start year, and it will do the rest.

Once you have done this, the functionality in OPUS to show the assessments, their structures and marks, and to enable marking, will appear for all relevant students and the staff working with them.

Sample Assessment Information

A table like that above will appear under each related student (this one shows dummy information), and students can click view to see the pro-forma, whether complete or not, to understand how they will be assessed, or what the results were, as appropriate.

An assessment pro-forma

Naturally staff who have no business with a student cannot see the marks or information pertaining to them.

When completing an assessment on a student, a member of staff has 24 hours to edit their findings before the results “lock”; the lock can then only be removed by an administrator. This allows most minor errors to be corrected.