Software Testing: Practical 1

This page describes the first practical for the Informatics Software Testing course. It will be marked out of 100 points, and is worth 12.5% of the assessment of the course. This practical should take approximately 20 hours of work to complete.

Deadline

This practical was issued on Monday 18th January. The submission deadline for the practical is Monday 15th February at 16:00. The penalty for late submission follows the uniform policy described in the UG3 course guide. Please read that page, and pay particular attention to the section on plagiarism.

Background

In this practical you will construct a test suite for a very small part of an educational software system that is reasonably mature and has been developed for use by students. The system is called VMAP (follow this link for an intro) and is intended to help users create “mindmaps”. For more explanation of what the system does see the introduction and help on the web page. There is also a link on the VMAP page that allows you to run VMAP from your web browser.

Note that VMAP is written for Java 1.4 and uses variable names that clash with reserved words introduced in Java 1.5. You have a number of options to cope with this:

Resources

The system is open source and all of the source is available in a range of formats from the project page. In addition the source code is available locally as a tar file compressed using bzip2 (extract with tar xjf vmap-0.91.tar.bz2).

Tools

You can choose either to use the Eclipse IDE or just to use JUnit on its own; I have no strong preference. You will probably need some of the following tools:

  1. If necessary you can download JUnit from here. If you are using Eclipse it is probably already installed in the IDE. This article is a reasonable introduction to using JUnit with Eclipse, but bear in mind its age: in particular it's focussed on JUnit 3. Here's a good introduction to JUnit 4 (free registration required); a minimal JUnit 4 skeleton is sketched just after this list.
  2. You will need some kind of coverage analysis tool.
    • With a more recent version of Eclipse you can use EclEmma. It's easy to install through Eclipse's built-in software update mechanism.
    • If you are using Eclipse version 3.2 or earlier (the version installed on DICE is 3.4), then Coverlipse provides a very simple coverage analyser within Eclipse.
    • For stand-alone coverage you should consider something like jcoverage or Cobertura.
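As a point of orientation for the JUnit 3/JUnit 4 distinction mentioned above: in JUnit 4 a test is simply a public method annotated with @Test, rather than a test* method in a subclass of junit.framework.TestCase. A minimal sketch (the class and method names here are purely illustrative):

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    // Minimal JUnit 4 test class: there is no need to extend TestCase;
    // any public void method annotated with @Test is run as a test.
    public class ExampleTest {

        @Test
        public void additionWorks() {
            assertEquals(4, 2 + 2);
        }
    }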

Now you should work through the following activities:

Task 0: Setting up (1 hr, no credit)

You should spend some time looking at the VMAP web page, consider what you need to download, and begin to set up the environment that will support you in the tasks given below.

Deliverables: None.

Task 1: Category-Partition Testing (10 hrs, 50 marks)

In this task you will generate a test suite in JUnit by first constructing test case specifications using the category-partition approach. The method under test is String toRelativeURL(URL base, URL target), which can be found in the file vmap-0.91/vmap/main/Tools.java. You should document the following parts of the process:

  1. A short specification of the function you are testing (you must infer this from the source code and associated comments).
  2. The parameters and environment elements you have identified.
  3. Partitions for the parameters and environment elements you have identified.
  4. Value classes for each of the partitions you have considered.
  5. Constraints on combinations of the value classes you have identified.
  6. The test case specification you have arrived at. This should include the rationale for reducing the number of test cases to at most 20 tests.
  7. At least two erroneous variants (or “mutations”) of the toRelativeURL method that your test suite can detect, i.e. two copies of toRelativeURL, each modified in a small way to introduce a bug that your tests reveal.

You should then implement your test case specification and test the code for the function. In giving a grade for this part of the practical I will take account of the performance of your test set on a collection of variants of the method.
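To indicate the expected shape of Task1.java, here is a minimal JUnit 4 sketch of two tests derived from hypothetical value classes. It assumes that you have isolated Tools.java into the default package (see the Questions section below) and that toRelativeURL can be called as a static method; the expected values are placeholders that you must replace with whatever your inferred specification predicts.

    import static org.junit.Assert.assertEquals;

    import java.net.MalformedURLException;
    import java.net.URL;

    import org.junit.Test;

    public class Task1 {

        // Hypothetical value class: base and target share protocol, host and
        // directory, and differ only in the final path component.
        @Test
        public void sameDirectoryDifferentFile() throws MalformedURLException {
            URL base = new URL("http://example.com/dir/base.html");
            URL target = new URL("http://example.com/dir/target.html");
            // Placeholder expectation: substitute the value your inferred
            // specification predicts for this value class.
            assertEquals("target.html", Tools.toRelativeURL(base, target));
        }

        // Hypothetical value class: the target is on a different host, so no
        // useful relative form may exist. What should happen here (absolute
        // URL returned? exception thrown?) must come from your specification.
        @Test
        public void differentHost() throws MalformedURLException {
            URL base = new URL("http://example.com/dir/base.html");
            URL target = new URL("http://example.org/other/page.html");
            // Placeholder check: replace with the behaviour you have specified.
            assertEquals(target.toExternalForm(), Tools.toRelativeURL(base, target));
        }
    }

Keeping one test method per test frame in your specification makes it straightforward to trace each JUnit test back to the value classes and constraints documented in your report.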

Deliverables: A section in your report containing your design rationale for the tests; a file Task1.java containing the JUnit tests; and files Variant1.java and Variant2.java, each containing one variant of the toRelativeURL method.

Task 2: Common Java Problems (2 hrs, 10 marks)

This article outlines the ten most common Java coding errors. Read the article, then review your tests and the supplied code, and suggest three additional tests you could add to your suite as a result. You may also want to look at Eric Allen's work on diagnosing Java; this work is quite extensive, so you are not expected to read the articles in detail. Implement two of your test ideas.
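For illustration, here is the kind of test that such reading might prompt, again only as a sketch: mishandling of null references is a classic Java error, but whether toRelativeURL should reject a null argument or tolerate it is for you to decide from your inferred specification, so the expected exception below is an assumption.

    import java.net.MalformedURLException;
    import java.net.URL;

    import org.junit.Test;

    public class Task2 {

        // Mishandling of null references is a common Java error. This test
        // probes how toRelativeURL treats a null target; the expected outcome
        // (a NullPointerException) is an assumption to revise once you have
        // decided what the method ought to do with null arguments.
        @Test(expected = NullPointerException.class)
        public void nullTargetIsRejected() throws MalformedURLException {
            URL base = new URL("http://example.com/dir/base.html");
            Tools.toRelativeURL(base, null);
        }
    }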

Deliverables: A section in your report giving a short rationale for adding three tests that have been prompted by your reading for this section, and the file Task2.java containing implementations of two of your test ideas.

Task 3: Coverage Analysis (8 hrs, 40 marks)

Using some appropriate tool, assess the level of statement coverage achieved by your test suite. Examine the coverage data carefully. Then do one of the following:

Deliverables: A section in your report (include a screenshot to help document the coverage you achieved), and a file named Task3.java if you developed any new tests.

Submission of the Practical Work

After completing the practical you should have five or six files. It will help me with marking if you adhere exactly to these names (including upper/lower case):

report.pdf
A report comprising three sections with your written answers. If you cannot produce PDF, contact me for advice: I can cope with a number of other formats but (for example) don't have access to the most recent version of Microsoft Office.
Task1.java
Your tests for Task 1.
Variant1.java
Your first variant of toRelativeURL.
Variant2.java
Your second variant of toRelativeURL.
Task2.java
Your tests for Task 2.
Task3.java
Your tests for Task 3, if you wrote any.

Please write your tests in the default package (i.e. no package).

To submit your work you should collect the files you wish to submit, and execute this command (only including Task3.java if you have one!):

submit cs3 st cw1 report.pdf Task1.java Variant1.java Variant2.java Task2.java Task3.java

Questions

I'll keep an up-to-date list of answers to questions about the practical here. Here are three to start with:
Is it ok to isolate toRelativeURL() from the vmap source, and test it separately?
If it will make things easier for you, then certainly.
What's the difference between a partition and a value class?
They're essentially the same thing (“value class” isn't mentioned in Ostrand & Balcer's original paper), but it might be useful to think of partitions as being slightly higher level verbal descriptions (e.g. “none”, “one”, “several”, and “very many”) corresponding to more technical value classes (e.g. 0, 1, 2-100 and 101+). In a more complex project this distinction would be more useful.
Do we need to include our test results?
Yes. Your test case specification should include the results that you expect (cf. P&Y p. 189, Table 11.2). You should also document the actual results you obtained. A brief commentary, to the effect that all tests pass, or that failures occur and why, would be helpful in demonstrating that you implemented and executed your specification.
My coverage is over 90%. What does the second part of task 3 entail?
I've talked about different coverage criteria, and how they're related, and Ntafos' paper in the reading gives a good overview of this. You've just been working with statement coverage. You should make the case for another coverage criterion, and possibly add one or two test cases to improve your score with that criterion.

Version 1.5, 2010/02/03 19:01:32

