This page describes the practical for the Informatics Software Testing course. It will be marked out of 100 points and is worth 25% of the assessment for the course. The practical will be undertaken in groups, normally of four (or possibly five), and will be assessed on a group report and individual submissions. It should take approximately 20 hours of work from each participant to complete. If at any time you believe you are likely to exceed this estimate by more than 30%, please get in touch with me (soa@staffmail.ed.ac.uk) to discuss what can be done.
The submission deadline for the practical is: Thursday, 16:00, 20 March 2014 (week 9 if we do not count Innovative Learning Week)
The penalty for late submission follows the uniform policy described in the UG3 course guide. Please read that page, and pay particular attention to the section on plagiarism.
For this practical you will be split, where possible, into groups of 4 or 3. There are 7 tasks which, depending on the size of your group, you will approach as a group or individually, as follows:
Each of the first three tasks counts for 25% of the final assessment (so the group activity counts for 75% in total), and each individual task counts for 25%. Each member of the group is therefore allocated the group score on the group tasks plus their individual score on the task they tackled individually.
The overall goal of this project is to produce a short report on the testing of a small system. The report should be at most around 20 pages in length, supported by various other technical deliverables (code for tests). It should be split into a main body and appendices (both counted in the 20-page total). The main body should consist of an introduction followed by sections describing the results of the group and individual tasks. Clearly label each individual section with the author's student number so that marks can be allocated to the correct individual. The appendices should include numbered screenshots, figures (e.g. a control flow graph) and any small pieces of code you would like to refer to from your explanations; they should serve as visual aids to the main body of the report. Appendices should be referred to and explained from the main body; any that are not will not be counted towards your final assessment.
You should be able to complete the tasks described below with around 15-20 hours of effort per group member, so each group has a "budget" of 60-100 hours of effort depending on the size of the group. You should consciously manage that effort.
In this practical you will consider the testing of part of an educational software system that is reasonably mature and has been developed for use by students. The system is called VMAP (follow this link for an intro) and is intended to help users create "mindmaps". For more explanation of what the system does, see the introduction and help on the web page. There is also a link on the VMAP page that allows you to run VMAP from your web browser. The system is now quite old and exhibits some aspects of legacy code.
The source code is available locally as a tar file compressed using gzip. Please download it from here (and NOT from the project page) and extract it with tar -xzf vmap-0.91.tar.gz, or whatever tool you normally use.
Note that, because VMAP is a legacy system written for Java 1.4, it uses variable names that are incompatible with later versions of Java. You have two options to cope with this: modify build.xml to include source="1.4", or, if you're using Eclipse, make the same change in the project's compiler configuration.
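The build.xml change might look something like the following sketch; the task, srcdir and destdir values here are assumptions about what VMAP's build file contains, and only the source="1.4" attribute is the point being made:

```xml
<!-- Sketch only: everything except source="1.4" is an assumed
     example of what a javac task in build.xml might look like. -->
<javac srcdir="vmap" destdir="build" source="1.4">
  <classpath refid="project.classpath"/>
</javac>
```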
You can choose either to use the Eclipse IDE or just to use JUnit and other tools on their own; I have no strong preference, though you may find some of the tools available in Eclipse useful. You will need some of the following:
Each of the group tasks has an associated tutorial which will help you prepare for it. Please consult the tutorial guidelines and arrange for your tutorials. Good planning and organisation will be necessary: the earlier you do the four tutorials, the more time you will have to improve your work on the tasks before the practical deadline!
Now you should work through the following activities:
Preparation: You should have a look at preparatory activity 1, which covers basic material on the use of JUnit. You should also have a look at preparatory activity 2, which provides an overview of what you will work on in the first couple of tasks in the practical.
You should spend some time looking at the VMAP web page, and should consider what you need to download and begin to set up the environment to support you in the tasks given below.
You should also read all of this Practical specification and create a plan that specifies at least:
Deliverables: The plan and allocation of tasks - this should be an internal document, shared and evolved by the group (not to be submitted with the practical).
Preparation: You should thoroughly read and understand section 11.2 of Pezze and Young and the defining paper on the Category Partition method by Ostrand and Balcer, then get together with your group and do tutorial 1.
Tutorial: Contact the tutor (Stuart Anderson) for this session to arrange a tutorial meeting as indicated in the tutorial guidelines
In this task you will generate a test suite in JUnit by first constructing test case specifications using the category partition approach. You will test the method String toRelativeURL(URL base, URL target), which can be found in the file vmap-0.91/vmap/main/Tools.java. You should document the following parts of the process:
You should then implement your test case specification and test the code for the function. In giving a grade for this part of the practical I will take account of the performance of your test set on a collection of variants of the method.
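To illustrate how category choices translate into concrete JUnit-style test cases, here is a minimal, self-contained sketch. The toRelativeURL below is a simplified stand-in written purely for illustration; it is NOT the VMAP implementation in Tools.java, and your real tests must run against the actual method:

```java
import java.net.URL;

// Sketch of category-partition-derived tests. The method here is a
// hypothetical simplification, not VMAP's toRelativeURL.
class Task1Sketch {

    // Assumed behaviour for illustration: return the target unchanged
    // when protocol or host differ; otherwise strip the base URL's
    // directory prefix from the target path.
    static String toRelativeURL(URL base, URL target) {
        if (!base.getHost().equals(target.getHost())
                || !base.getProtocol().equals(target.getProtocol())) {
            return target.toString();
        }
        String basePath = base.getPath();
        String dir = basePath.substring(0, basePath.lastIndexOf('/') + 1);
        String targetPath = target.getPath();
        if (targetPath.startsWith(dir)) {
            return targetPath.substring(dir.length());
        }
        return targetPath;
    }

    public static void main(String[] args) throws Exception {
        // Category: same protocol and host, target inside base's directory.
        URL base = new URL("http://example.org/maps/index.html");
        URL inside = new URL("http://example.org/maps/a.mm");
        System.out.println(toRelativeURL(base, inside));

        // Category: different host, so no relativisation is possible.
        URL other = new URL("http://elsewhere.org/maps/a.mm");
        System.out.println(toRelativeURL(base, other));
    }
}
```

Each test case corresponds to one combination of category choices (protocol, host, path relationship); in your Task1.java these would become individual JUnit test methods with assertEquals checks against the real Tools method.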
Deliverables: Task1.java, which contains the JUnit tests.
Preparation: You should read Pezze and Young chapter 12 and then do tutorial 2 on this topic before arranging your meeting with the tutor.
Tutorial: Contact the tutor (Donal Stewart) for this session to arrange a tutorial meeting as indicated in the tutorial guidelines
Using some appropriate coverage tool (please specify which), assess the level of statement coverage achieved by your test suite. Then do the following:
Deliverables: Task2.java, containing any new tests.
Preparation: Read Pezze and Young Chapter 10 on Adequacy and then do tutorial 3 which covers mutation. This should help to decide how to generate mutants.
Tutorial: Contact the tutor (Hadi Daneshvar) for this session to arrange a tutorial meeting as indicated in the tutorial guidelines
In this section you should consider using mutations to check the adequacy of the test set you developed under Task 1. You should do the following for the toRelativeURL method:
Deliverables: Your mutants of toRelativeURL (call the variant files toRelativeURLVar1.java, toRelativeURLVar2.java, and so on), together with your strengthened test suite in a file named Task1-strong.java or something similar.
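The idea of a mutant and of "killing" it can be shown with a small self-contained example; the method below is hypothetical and not VMAP code:

```java
// Illustration of a single mutant and how a test kills it.
// fileName is a hypothetical example method, not VMAP code.
class MutantDemo {

    // Original: strip everything up to and including the last '/'.
    static String fileName(String path) {
        return path.substring(path.lastIndexOf('/') + 1);
    }

    // Mutant: the single-token change '+ 1' -> '- 1' still compiles
    // but alters the behaviour.
    static String fileNameMutant(String path) {
        return path.substring(path.lastIndexOf('/') - 1);
    }

    public static void main(String[] args) {
        String p = "/maps/a.mm";
        System.out.println(fileName(p));
        System.out.println(fileNameMutant(p));
        // A test asserting fileName("/maps/a.mm").equals("a.mm") passes
        // on the original but fails on the mutant: the mutant is killed.
        // A mutant no test distinguishes from the original reveals a
        // weakness in the test set.
    }
}
```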
Preparation: You should review the slides for lecture 10 and the slides for Chapters 15 and 21 of Pezze and Young. Have a look at tutorial 4.
In this section you will assess the effort required to test the integration of a relatively modest system. Assessing the integration of a system like VMAP can be very time-consuming, so, since this is an individual task, you are being asked to write a report rather than carry out the testing.
Inspect the code for the classes Vmap, Tools and Controller, and the coverage attained by your test suite. Write a short report that:
Deliverables:
Preparation: To prepare for this task you should review the slides for lectures 7 and 8 and the slides for chapters 6 and 13 of Pezze and Young, and work on tutorial 5.
For this section, consider only the method openDocument in the Vmap class and the use of the parameter url. You should do the following for the Vmap and Tools classes:
Note that in the tutorial we look at coupling through return values as well; that's not necessary here.
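A coupling def-use pair can be illustrated with a small self-contained example; the code below is hypothetical and stands in for the real Vmap/Tools interaction, where the caller defines the url value and the callee uses it:

```java
// Hypothetical illustration of coupling def-use pairs through a
// parameter, in the spirit of Task 5. This is NOT VMAP code.
class CouplingDemo {

    // Callee: both branches are 'uses' of the coupling parameter url.
    static String openDocument(String url) {
        if (url.startsWith("file:")) {
            return "local";
        }
        return "remote";
    }

    public static void main(String[] args) {
        // Caller: two 'defs' of the coupling variable, chosen so that
        // each def reaches a different use in the callee.
        String localUrl = "file:/home/user/map.mm";     // def 1
        String remoteUrl = "http://example.org/map.mm"; // def 2
        System.out.println(openDocument(localUrl));
        System.out.println(openDocument(remoteUrl));
    }
}
```

Coupling-based adequacy asks that the tests exercise each such def-use pair across the call, not merely each statement in caller and callee separately.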
Deliverables:
Preparation: Re-read the lecture slides that list some of the varieties of testing you might apply during system testing.
Select the three different kinds of system test you consider most appropriate for VMAP. Investigate the best approaches to carrying out these tests on the VMAP system.
Deliverables: A short, individually labelled, section in your report detailing your choice of the three most appropriate system testing activities, together with a justification for this choice. For each of the chosen system testing activities, provide a brief outline of how you would go about such testing for the VMAP system. Try to be reasonably specific, providing examples of test input and identifying what each is intended to test.
Preparation: One important aspect of system testing for VMAP is the testing of the GUI used to develop the mind maps. Prepare to answer this question by reading Coverage Criteria for GUI Testing and some more pragmatic advice on the use of the Abbot tool in GUI testing. You might also find this checklist useful.
For this task, you should prepare a short report outlining the key elements in the VMAP GUI that require testing, and for each of these elements you should outline how you would test it. The description of your approach should include any necessary scaffolding code, the kind of test to be applied and the expected result.
Note that you are not being asked to carry out this testing work. Your deliverable is a report outlining the testing that is needed for the VMAP GUI.
Deliverables: A short section in your report outlining how to go about testing the GUI provided in VMAP.
The final task is a review of the main activities of the group in completing the practical. Write a short report on the progress of the group. Your report should include:
Deliverables: A section in your report providing a review of the planning of your group's practical work.
After completing the practical you should have the main report and additional files of tests etc. It will help me with marking if you adhere exactly to these names (including upper/lower case). For other sections, if you submit additional files, use the naming convention Taskn-XXX.ttt, where n is the task number the file relates to and XXX.ttt is a descriptive name and file extension:
To submit your work you should designate one member of the group as a submitter for the group. The report should be clearly labeled with your group number. The submitter will gather together the files you wish to submit, and execute this command (if for any reason you have not produced one of the listed files you should omit it from the submit command). The dots at the end of the command signify all the other relevant files:
submit st 1 report.pdf Task1.java Task2.java ...
toRelativeURL() from the vmap source, and test it separately?
Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB, Scotland, UK
Tel: +44 131 651 5661, Fax: +44 131 651 1426, E-mail: school-office@inf.ed.ac.uk. Unless explicitly stated otherwise, all material is copyright © The University of Edinburgh.