Department of Student and Academic Administration

Data Standards

In addition to making data returns, the team is accountable for ensuring that data meets the standards laid down by the external agencies. This is achieved through a continuous process of clarifying the rules and validating/verifying any student and course management system data that has a bearing on the quality of the returns we submit. We take a holistic approach to data quality in relation to the returns: the important factor is that the data should be correct in the first instance for the purposes of managing students. In this way verification and validation are ongoing and are undertaken as close to the point of entry as possible.

We all have a responsibility for data quality: ensuring that it is timely, valid and accurate.

Validation Stages

The philosophy is that, firstly, validation and correction should occur as close to the point of entry as possible and, secondly, all errors and warnings are addressed by correcting the source data except in exceptional circumstances. This means that when errors and warnings are identified, the records are corrected in the Student Course Management System (SCMS).

We see three stages of validation:

  1. Primary - as part of the business process

Teams that manage the associated business process undertake verification and validation on a day-to-day basis as part of that process, supported by various documents detailing the process. All staff using the system are trained by IS and are not allowed to use the system before attending the training.


  2. Secondary - as part of ongoing validation on SCMS by:

  • Academic Registry (RSDS validation team) in association with Departments. The RSDS team has a series of Discoverer reports that identify records in error for many of the operational processes of the University. They manage the running of these, in association with departments, on a regular basis. Extensive guidance on those reports is available for departments to consult and use.
  • ERDS – as part of fortnightly GAP validation

Every fortnight a report is run and its output migrated to a GAP validation spreadsheet. The report covers the majority of HESA errors and warnings, as well as some additional UoP-derived errors which are implicitly linked to them.
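As an illustration of the kind of checking such a report performs, the sketch below flags records against a small rule set. It is a minimal, hypothetical example: the rule codes, field names and checks are invented for illustration, and the real rules are defined by the HESA specification and the SCMS data model.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Rule:
        code: str       # rule identifier (hypothetical)
        severity: str   # "error" or "warning"
        message: str
        check: Callable[[dict], bool]  # True when the record breaks the rule

    # Invented rules in the spirit of HESA-style checks; the real rule set
    # and field names live in the HESA specification and SCMS.
    RULES = [
        Rule("UOP-001", "error", "Missing domicile code",
             lambda r: not r.get("domicile")),
        Rule("CHK-002", "warning", "End date precedes start date",
             lambda r: bool(r.get("end_date")) and r["end_date"] < r["start_date"]),
    ]

    def validate(records):
        """Return one row per (record, broken rule), ready for a GAP-style sheet."""
        findings = []
        for record in records:
            for rule in RULES:
                if rule.check(record):
                    findings.append({"student_id": record.get("student_id"),
                                     "code": rule.code,
                                     "severity": rule.severity,
                                     "message": rule.message})
        return findings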

The GAP validation spreadsheet pulls information from a ‘solutions’ spreadsheet which provides, for each error/warning, a description, likely cause(s) and probable solutions. By also merging in the ‘actions’ recorded against the previous run, it effectively acts as a progress chart. Team members then work through the errors/warnings in priority and volume order, recording the status at each stage.
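A minimal sketch of that merge step, assuming hypothetical file names and column layouts (the real spreadsheets are maintained by the ERDS team), might look like this:

    import pandas as pd

    # Hypothetical files: this fortnight's findings, the 'solutions' lookup,
    # and the previous run's sheet with its recorded actions.
    findings = pd.read_excel("gap_report_output.xlsx")
    solutions = pd.read_excel("solutions.xlsx")  # code -> description, cause(s), solution
    previous = pd.read_excel("gap_validation_prev.xlsx")

    # Attach the guidance held against each error/warning code.
    sheet = findings.merge(solutions, on="code", how="left")

    # Carry forward the action recorded against the same record and code last
    # time, so the sheet doubles as a progress chart.
    sheet = sheet.merge(previous[["student_id", "code", "action"]],
                        on=["student_id", "code"], how="left")

    # Order for working through: errors before warnings ("error" < "warning"
    # alphabetically), then highest-volume codes first.
    sheet["volume"] = sheet.groupby("code")["student_id"].transform("count")
    sheet = sheet.sort_values(["severity", "volume"], ascending=[True, False])
    sheet.to_excel("gap_validation_current.xlsx", index=False)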

The key is to address errors/warnings as quickly as possible, with the aim of clearing as many as possible within the cycle in which they were first noted.