User Acceptance Testing (UAT)

The goal of this document is to provide a basic framework for performing UAT so that you can accept new releases of Practique (Server and/or iPad application) while maintaining confidence that the core functionality of the software works as before, or as desired.

For the purposes of this document we consider electronic exam delivery to be the core functionality of Practique, and we focus on providing a scenario to verify that this functionality works as desired. We do not cover other areas of Practique such as exam creation, the item bank, or statistical reports.

Requirements

  • You need a test exam ready on your Practique Server (see Preparation)
  • Test with at least one iPad per role you use (1x Examiner, 1x Marshal, 1x Candidate), preferably more
  • Make sure you're using the correct versions of Practique Server and Practique for iPad (they must be the same versions you have chosen to use for your live exam)
  • Make sure your test environment matches (or closely resembles) your real exam environment. This includes WiFi network settings and iPad configuration (MDM settings, time zones, language, restrictions, Single App Mode, etc.)
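The requirements above can be captured as a quick pre-flight check to run before each UAT session. The sketch below is illustrative only: the role names, minimum counts and version strings are assumptions, and nothing here calls a real Practique API.

```python
# Minimum number of test iPads per role (assumed values; adjust to your exam).
REQUIRED_ROLES = {"Examiner": 1, "Marshal": 1, "Candidate": 1}

def preflight(ipads_per_role: dict,
              server_version: str,
              ipad_app_version: str,
              live_exam_version: str) -> list:
    """Return a list of problems; an empty list means ready to test."""
    problems = []
    # Check iPad coverage for every role used in the exam.
    for role, minimum in REQUIRED_ROLES.items():
        if ipads_per_role.get(role, 0) < minimum:
            problems.append(f"Need at least {minimum} iPad(s) for the {role} role")
    # Both Server and iPad app must match the version chosen for the live exam.
    if server_version != live_exam_version:
        problems.append("Practique Server version differs from the live exam version")
    if ipad_app_version != live_exam_version:
        problems.append("Practique for iPad version differs from the live exam version")
    return problems
```

A run such as `preflight({"Examiner": 1, "Marshal": 1, "Candidate": 1}, "6.1", "6.1", "6.1")` returning an empty list would indicate the environment is ready.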

Preparation

To make your UAT reliable, you have to be consistent in both the method and the data you use to perform testing.

Method - how you test Practique should be based on the test scenario outlined below. You may customise the test scenario to fit your use of Practique and your testing capabilities.

We recommend you follow some basic principles:

  1. Document your final test scenario so it is easy for you and your colleagues to follow and repeat
  2. Always use the same test scenario to achieve consistency
  3. Always follow the test scenario in full and start from the beginning of the scenario to ensure a known state of the system

The data used to test Practique takes the form of a specially crafted test exam on your Practique server. Use this test exam for all your tests and keep it on the server so you can always quickly test a new release of either the Server or the iPad application. Update the test exam in line with Practique development so you can be sure you are testing with a valid dataset for the version of Practique under test.

Your test exam should be prepared with the following properties:

  1. Have a realistic size
    1. We're aware that it is not practical to test with an exam of 5 sessions with 7 circuits each, but having the correct number of stations with 1-2 circuits and 1 session is feasible and adequate for most cases.
  2. Have realistic items
    1. Mark sheets - use your real mark sheets to test (covering all question types you would use). If you have more than one type of mark sheet, or each station has a different one, include as many stations as practical.
    2. Resources - include stations with the documents, images and other multimedia you will be using in the real exam. You may include stations both with and without resources.
  3. Use authentic user IDs for logins; for example, if your examiners log in with GMC numbers, provide valid GMC numbers or use IDs that follow the same format.
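If you generate test logins in bulk, a small format check can help keep them authentic. The sketch below assumes, purely for illustration, a GMC-style ID of seven digits; substitute whatever format your real logins use.

```python
import re

# Assumed format for illustration: a GMC-style reference number,
# i.e. a string of exactly seven digits. Replace this pattern with
# the ID format your organisation actually uses.
GMC_STYLE = re.compile(r"^\d{7}$")

def is_valid_test_id(user_id: str) -> bool:
    """Return True if a test login ID follows the real-exam format."""
    return bool(GMC_STYLE.match(user_id))

test_ids = ["1234567", "7654321", "12345", "ABC1234"]
valid = [uid for uid in test_ids if is_valid_test_id(uid)]
# valid == ["1234567", "7654321"]
```

Running every generated test login through a check like this before the exam is reset catches IDs that would never occur in a live exam.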

Exam delivery test scenario for OSCE

  1. Reset your test exam
  2. Push the exam to the iPads
  3. Verify exam synchronisation onto the iPads
  4. Log in Examiners, Marshals and Candidates
    1. Check the schedule screens for each role
    2. Check that Examiners have the correct mark sheets & case resources
    3. Depending on your settings, check that mark sheets are locked to prevent early marking
  5. Start the round using the Marshal's iPad(s)
    1. Check that Examiners are now able to mark
  6. Enter marks using the Examiner iPads
    1. Test mark sheet validation by trying to submit an empty mark sheet and checking that missing marks are flagged
    2. Enter partial marks, return to the Schedule, and continue marking
    3. Submit your marks
    4. Amend marks (if configured)
    5. Check that marks are synchronised to the Marshal iPad
    6. Check that marks are synchronised to the Practique Server
  7. Complete marking for one whole circuit
  8. Check collected marks on the Server
    1. Check the marks of a few randomly chosen examiners and verify them against what was entered on the iPads
    2. Check standard setting for expected values (for example, if you failed 10 candidates you should expect a certain number of passes)
  9. Check Candidate feedback reports
    1. Do they look as they should?
    2. Do they include all the data you require? For example, written feedback and so on.
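Since one of the principles above is to document the scenario so it is easy to follow and repeat, the steps could also be recorded as a simple pass/fail checklist per UAT run. The sketch below is illustrative only; the class and step names are our own and not part of Practique.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Step:
    description: str
    passed: Optional[bool] = None  # None means the step was not executed yet

@dataclass
class TestScenario:
    name: str
    steps: List[Step] = field(default_factory=list)

    def record(self, index: int, passed: bool) -> None:
        """Record the outcome of one step of the scenario."""
        self.steps[index].passed = passed

    def complete(self) -> bool:
        """A run only counts if every step was executed and passed."""
        return all(s.passed is True for s in self.steps)

# Condensed version of the OSCE scenario above, one Step per top-level item.
osce = TestScenario("OSCE exam delivery", [
    Step("Reset the test exam"),
    Step("Push the exam to the iPads"),
    Step("Verify exam synchronisation onto the iPads"),
    Step("Log in Examiners, Marshals and Candidates"),
    Step("Start the round from the Marshal iPad"),
    Step("Enter, submit and (if configured) amend marks"),
    Step("Complete marking for one whole circuit"),
    Step("Check collected marks on the Server"),
    Step("Check Candidate feedback reports"),
])
```

Keeping one such record per release tested gives you the repeatability and auditability the principles above call for, whether you track it in code, a spreadsheet, or on paper.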