This page describes the practical for the Informatics Software Testing course. It will be marked out of 100 points, and is worth 25% of the assessment of the course. This practical will be undertaken in groups of 2, and will be assessed on the basis of the group submission.
The penalty for late submission follows the uniform policy described in the UG3 course guide. Coursework will be scrutinised for plagiarism and academic misconduct. Information on academic misconduct and advice for good scholarly conduct is available here.
In this practical you will consider the specification and Java program available in the following GitHub repository.
You can choose either to use the Eclipse IDE or to use JUnit and other tools standalone; I have no strong preference. Many people find the tools available in Eclipse useful (if you haven't used Eclipse before, now may be the time to give it a try). You will need some of the following:
Most of the tasks have an associated tutorial which will help you prepare for it. Please prepare in advance for the tutorial to get the most out of it.
Preparation: If you don't have Eclipse installed and want to use it, you should download and install it. You can find Eclipse here. Once you have installed Eclipse, look at the tutorial, and do enough of the "getting started" tutorial that you have JUnit as a project in Eclipse. You should also install EclEmma if you don't have it and intend to use it; you can delay this, since it is not essential for the first task.
You should spend some time looking at the JUnit project in Eclipse and become familiar with its structure.
In this task you will implement JUnit tests using the specification provided in the GitHub repository. The repository also provides the implementation as a JAR file, ST_Coursework.jar, so you can execute your JUnit tests and observe test results. The specification is described in detail, with helpful examples where necessary, in the Specifications.pdf file.
Functional testing is a black-box testing technique, so use the specification file, not the source code, to derive tests. The JAR file under the jar directory can be used to execute the tests derived from the specification. We have also provided a sample JUnit test case, TemplateEngineTest.java, to illustrate a typical test case for the implementation in ST_Coursework.jar. All the files referred to above can be found in the GitHub repository.
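As a rough illustration of deriving a black-box test from a specification clause, the sketch below uses a hypothetical stand-in for the engine under test. The real API and its behaviour are defined in Specifications.pdf and ST_Coursework.jar; the `evaluate` method, its parameters, and the `${...}` placeholder syntax here are assumptions for illustration only, not the actual coursework interface.

```java
// Sketch of a black-box functional test derived from a specification clause.
// The engine below is a hypothetical stand-in: the real implementation lives
// in ST_Coursework.jar and its behaviour is described in Specifications.pdf.
public class FunctionalTestSketch {

    // Hypothetical stand-in for the engine under test: replaces ${name}
    // placeholders in a template with a supplied value.
    static String evaluate(String template, String name, String value) {
        if (template == null) return null;  // spec-style boundary case
        return template.replace("${" + name + "}", value);
    }

    public static void main(String[] args) {
        // Typical case from the (assumed) specification: a placeholder
        // is substituted.
        String out = evaluate("Hello ${who}!", "who", "world");
        if (!out.equals("Hello world!")) throw new AssertionError(out);

        // Boundary case: a null template is returned unchanged.
        if (evaluate(null, "who", "world") != null) throw new AssertionError();

        System.out.println("all checks passed");
    }
}
```

In real coursework code the plain `main` and `if`/`throw` checks would be JUnit test methods annotated with `@Test`, using `assertEquals` and `assertNull`, following the structure of the provided TemplateEngineTest.java.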
In giving a grade for this part of the practical I will take into account the performance of your test set on a collection of variants/mutants of the specification.
Deliverables: Task1-Functional.java, which contains your JUnit tests.
Using some appropriate coverage measurement tool (such as EclEmma mentioned in "Tools" and "Setting Up"), assess the following:
Deliverables: Task1-Coverage.java, containing all the JUnit tests used to achieve maximum coverage.
To submit your work you should designate one member of the group as a submitter for the group. The submitter will gather together the files you wish to submit for Task 1 - Part 1: Functional Testing and Part 2: Coverage Analysis, and execute this command. The dots at the end of the command signify any other relevant files:
submit st cw1 Task1-Functional.java Task1-Coverage.java coverage-1.jpg coverage-2.jpg ...
We have provided 10 buggy versions of the implementation (in the form of JAR files), available in the folder "mutations" in the GitHub repository. The goal of this task is to develop tests that fail for each of the buggy versions. A test fails if the output from the buggy version differs from the output obtained from the correct implementation provided in Task 1; a failing test indicates that the implementation has a fault. A failing test can be one of the existing tests from previous tasks or a new test.
Deliverables: Task2_Mutation.java, containing tests that fail (existing and/or new) for each of the 10 buggy versions.
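The idea of a test "killing" a buggy version can be sketched as follows. Both implementations below are hypothetical stand-ins for ST_Coursework.jar and one of the JARs in the "mutations" folder; the point is only that a test asserting the correct output will fail when run against the mutant.

```java
// Sketch of how a test distinguishes a buggy version: the same input is
// run against the correct implementation and a buggy variant, and the
// outputs differ. Both methods are hypothetical stand-ins.
public class MutationSketch {

    static int correctSum(int a, int b) { return a + b; }

    // Mutant: arithmetic operator replaced (+ became -), a classic mutation.
    static int buggySum(int a, int b) { return a - b; }

    public static void main(String[] args) {
        int expected = correctSum(2, 3);  // 5, per the correct implementation
        int actual = buggySum(2, 3);      // -1, from the buggy version
        // A test asserting the correct output fails on the mutant,
        // which is exactly what this task asks you to achieve for
        // each of the 10 buggy JARs.
        if (expected == actual) {
            System.out.println("mutant survived -- a stronger test is needed");
        } else {
            System.out.println("mutant killed");
        }
    }
}
```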
The new specification is described in new_spec.pdf, available in the GitHub repository.
Deliverables: This task involves a two-part submission. Part 1 is Task2_TDD_1.java, containing tests for the new additional specification (in "new_spec.pdf").
Please check to make sure the tests developed for the new specification fail on the existing implementation available in the "st" folder. Then modify TemplateEngine.java in the "src/st" folder to support the new specification.
Check that all the tests developed in Part 1 of TDD pass for the modified implementation. If they don't, modify the implementation and/or the tests so that the entire test suite passes and the new specification is implemented correctly.
Submit both the modified implementation of TemplateEngine.java and the test suite from Part 1, Task2_TDD_1.java, including any revisions or modifications, as Task2_TDD_2.java.
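The test-first cycle described above can be sketched as follows. The "new rule" here (case-insensitive matching) is purely hypothetical, chosen only to show the red/green steps; the actual new requirement is whatever new_spec.pdf states.

```java
// Sketch of the TDD cycle: write a test from the new specification that
// fails on the current behaviour ("red"), change the implementation until
// it passes ("green"), then re-run the whole suite. The rule below is a
// hypothetical example, not the real content of new_spec.pdf.
public class TddSketch {

    // Suppose (hypothetically) the new specification requires names to
    // be matched case-insensitively.
    static boolean matches(String a, String b) {
        return a.equalsIgnoreCase(b);  // "green": updated implementation
        // return a.equals(b);         // "red": old behaviour fails the test
    }

    public static void main(String[] args) {
        // Test written first, directly from the new specification.
        if (!matches("Name", "name")) {
            throw new AssertionError("red: new spec not yet implemented");
        }
        System.out.println("green: new-spec test passes");
    }
}
```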
To submit your work you should designate one member of the group as a submitter for the group. The submitter will gather together the files you wish to submit for Task 2 - Part 1: Faulty Versions and Part 2: Test-Driven Development, and execute this command. The dots at the end of the command signify any other relevant files:
submit st cw2 Task2_Mutation.java Task2_TDD_1.java TemplateEngine.java Task2_TDD_2.java ...
Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB, Scotland, UK
Tel: +44 131 651 5661, Fax: +44 131 651 1426, E-mail: school-office@inf.ed.ac.uk