Software Testing: Practical

This page describes the practical for the Informatics Software Testing course. It will be marked out of 100 points, and is worth 25% of the assessment of the course. This practical will be undertaken in groups of four or five and will be assessed on a group report and individual submissions. This practical should take approximately 20 hours of work from each participant to complete. If at any time you believe you are likely to exceed this estimate by more than 30% please get in touch with me to discuss what can be done.

Deadline

The submission deadline for the practical is: Thursday, 1600, 15th March 2012 (week 9 - but really week 8 because of Innovative Learning Week)

The penalty for late submission follows the uniform policy described in the UG3 course guide. Please read that page, and pay particular attention to the section on plagiarism.

Background

The overall goal of this project is to produce a short report on the testing of a small system. The report should be at most 12 pages in length, supported by various other technical deliverables (the code for your tests). The report will consist of an introduction followed by eight or nine sections (eight for four-person groups, nine for five-person groups) describing the results of the tasks enumerated below. You should be able to complete the tasks described below with around 20 hours of effort per group member, so each group has a "budget" of 80-100 hours of effort, and you should consciously manage that effort.

In this practical you will consider the testing of a part of an educational software system that is reasonably mature and has been developed for use by students. The system is called VMAP (follow this link for an intro) and is intended to help users create “mindmaps”. For more explanation of what the system does see the introduction and help on the web page. There is also a link on the VMAP page that allows you to run VMAP from your web browser.

Note that VMAP is written for Java 1.4 and uses identifiers that became reserved words in Java 1.5, so it may not compile cleanly with a more recent compiler. You have a number of options for coping with this, such as setting your compiler's source compatibility level to 1.4 or renaming the offending identifiers in your local copy.

Resources

The system is open source and all of the source is available in a range of formats from the project page. In addition, the source code is available locally as a gzip-compressed tar file (extract with tar -xzf vmap-0.91.tar.gz).

Tools

You can choose either to use the Eclipse IDE or just to use JUnit on its own; I have no strong preference. You will probably need some of the following tools:

  1. If necessary you can download JUnit from here. If you are using Eclipse it is probably already installed in the IDE. This article is a reasonable introduction to using JUnit with Eclipse, but bear in mind its age: in particular it's focussed on JUnit 3. Here's a good introduction to JUnit 4 (free registration required). A minimal JUnit 4 test class is sketched after this list.
  2. You will need some kind of coverage analysis tool.
    • With a more recent version of Eclipse you can use EclEmma. It's easy to install through Eclipse's built in software update mechanism.
    • If you are using Eclipse version 3.2 or earlier (the version installed on DICE is 3.4), then Coverlipse provides a very simple coverage analyser within Eclipse.
    • For stand-alone coverage you should consider something like jcoverage or Cobertura.
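
If you have not written JUnit 4 tests before, the following minimal sketch (a made-up example, not part of VMAP and not a required deliverable) shows the basic shape of a test class: each test method carries the @Test annotation and uses assertions to compare expected and actual values.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class ExampleTest {

        // The @Test annotation marks this method as a test; the assertion
        // compares an expected value against an actual one.
        @Test
        public void additionWorks() {
            assertEquals(4, 2 + 2);
        }
    }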

The goal of this practical is to create a test report on the VMAP system. Tasks 1-4 should be done as a group activity. The remaining five tasks (5-9) should be done individually, with one person responsible for writing each section; in four-person groups you should omit one of the individual tasks. Each of the first four tasks counts for 20% of the final assessment and the individual task counts for a further 20%, so each member of the group will be allocated the group score on the first four tasks plus their individual score on the task they tackled individually. You should clearly label each individual section with the author's student number so that marks can be allocated to the correct individual.

Each of tasks 1-4 is supported by a tutor, who will run a tutorial session for two groups at a time once those groups have completed some preparatory reading and tutorials. Each of these tasks therefore has a list of preparatory work and a tutor contact. When your group is ready, email the tutor; once a second group is also ready, the tutor will reply suggesting some times for the tutorial.

Some groups have n members where n is smaller than four. If this applies to you, your group should attempt n of tasks 1-4, and each member of the group should still complete one of the individual tasks 5-9.

The tutorials are designed to help you with the task and will usually involve working on a similar problem to the task with a tutor to call on for help. You will also be able to ask questions of clarification concerning the task.

Now you should work through the following activities:

Task 0: Setting up (1 hr, no credit)

Preparation: You should have a look at tutorial 1, which covers basic material on the use of JUnit. You should also have a look at tutorial 2, which provides an overview of what you will work on in the first couple of tasks in the practical.

You should spend some time looking at the VMAP web page, and should consider what you need to download and begin to set up the environment to support you in the tasks given below.

You should also read all of this Practical specification and create a plan that specifies at least:

  1. The allocation of individuals to the individual tasks, i.e. a single individual allocated to each of tasks 5-9.
  2. When and how much effort you plan to devote to each of the tasks. This can be in any form you like, but it should be shared by the whole group and should be easy to change. Remember that not everyone needs to work on every task, so you might allocate two people to two of the group tasks and three to the other two (this might reduce communication overhead...).

Deliverables: The plan and allocation of tasks - this should be an internal document, shared and evolved by the group.

Task 1: Category-Partition Testing (20 marks, group activity)

Preparation: Read the tutorial that is relevant to this task together with the reading specified by the tutorial. Get together in your group and tackle as much of the task there as you can.

Tutorial: Contact the tutor (Cristina Alexandru) for this session to arrange a tutorial meeting.

In this task you will generate a JUnit test suite by first constructing test case specifications using the category-partition approach. You will test the method String toRelativeURL(URL base, URL target), which can be found in the file vmap-0.91/vmap/main/Tools.java. You should document the following parts of the process:

  1. Provide a short specification of the function you are testing (you must infer this from the source code and associated comments). You should make it clear which independently testable feature (ITF) you are testing; this should be the primary function of the method.
  2. The parameters and environment elements you have identified that are relevant to the method.
  3. Partitions for the parameters and environment elements you have identified (see the Questions section at the end of this page for an explanation of the difference between partitions and value classes).
  4. Value classes for each of the partitions you have considered.
  5. Constraints on combinations of the value classes you have identified. Here you should attempt to eliminate as many as possible of the combinations of value classes that test situations that will not occur.
  6. The test case specification you have arrived at. This is not the set of test cases themselves, but what you believe to be the most significant test specifications for the function. This needs to be short enough that you can document and implement the test cases in a limited amount of time.

You should then implement your test case specification and test the code for the function. In giving a grade for this part of the practical I will take account of the performance of your test set on a collection of variants of the method.
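
As an illustration of the expected shape of Task1.java only, a test suite might look something like the sketch below. The package name vmap.main, the assumption that toRelativeURL is static, and the expected return values are all guesses that you must check against the actual source and replace with the results your own test case specifications predict.

    import static org.junit.Assert.assertEquals;

    import java.net.MalformedURLException;
    import java.net.URL;

    import org.junit.Test;

    import vmap.main.Tools;

    public class Task1 {

        // One test method per test case specification. The expected values
        // below are illustrative guesses; replace them with the results your
        // specification predicts.
        @Test
        public void targetInSameDirectoryAsBase() throws MalformedURLException {
            URL base = new URL("file:/home/user/maps/base.mm");
            URL target = new URL("file:/home/user/maps/other.mm");
            assertEquals("other.mm", Tools.toRelativeURL(base, target));
        }

        @Test
        public void targetInSubdirectoryOfBase() throws MalformedURLException {
            URL base = new URL("file:/home/user/maps/base.mm");
            URL target = new URL("file:/home/user/maps/sub/other.mm");
            assertEquals("sub/other.mm", Tools.toRelativeURL(base, target));
        }
    }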

Deliverables:

  1. A section in your report containing your rationale for the tests.
  2. A file Task1.java that contains the JUnit tests.

Task 2: Coverage Analysis (20 marks, group activity)

Preparation: You should read Pezze and Young chapter 12 and then do the tutorial on this topic before arranging your meeting with the tutor.

Tutorial: Contact the tutor (Aurora Constantin) for this session to arrange a tutorial meeting.

Using an appropriate tool, assess the level of statement coverage achieved by your test suite and examine the coverage data carefully. Then, guided by that data, add any new tests needed to improve your coverage and document the coverage you achieve (see also the Questions section at the end of this page for what to do if your statement coverage is already high).
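
If you want the coverage tool to measure all of your tests in one run, one option (assuming JUnit 4, and that the test classes named below exist on the classpath) is a small suite class such as this optional sketch:

    import org.junit.runner.RunWith;
    import org.junit.runners.Suite;

    // Groups the test classes from earlier tasks so a coverage tool can
    // measure them in a single run; add further classes as you create them.
    @RunWith(Suite.class)
    @Suite.SuiteClasses({ Task1.class, Task2.class })
    public class AllTests {
    }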

Deliverables:

  1. A section in your report (include a screenshot to help document the coverage you achieved).
  2. A file named Task2.java containing any new tests.

Task 3: Integration Testing and Coupling-based Coverage (20 marks, group activity)

Preparation: You should review the slides for lecture 10 and the slides for Chapters 15 and 21 of Pezze and Young. Then as a group tackle the activities in Tutorial 7. Once you have done this you can go ahead and contact the tutor to arrange a meeting.

Tutorial: Contact the tutor (Partha Pani) for this session to arrange a tutorial meeting.

In this section you will assess the effort required to test the integration of a relatively modest system. Assessing the integration of a system like VMAP can be very time consuming. To limit the task you will consider integration testing for some classes in the top level packages: vmap.main and vmap.controller. You should do the following tasks:

  1. Construct a test suite, using JUnit, for the Vmap class. The test suite should be directed at exercising as many of the methods of the Vmap class as possible. Use EclEmma or another coverage tool to check the level of statement coverage of your test suite.

    Note that here you're testing straight from JUnit and focusing on broad API coverage, without reference to a test case specification or the Category-Partition method. This is because integration testing is almost always a "white-box" activity in which you can assume you have access to the code. Of course, in some instances you may only have the interfaces, in which case you would need to use black-box methods at the interface. If you decide not to test some methods, give a very brief outline of your reason(s). (A small sketch after this list shows one way to enumerate the public methods you need to consider.)

  2. Inspect the code for the classes Vmap, Tools and Controller and the coverage attained by your test suite. Write a short report that:
    • Identifies the main ways in which these three classes are coupled. For example, what methods are called, are there shared variables, …
    • Discusses what would be necessary in a test suite to achieve full statement coverage of the code in these three classes.
    • Outlines what would be necessary to meet the coupling-based coverage criteria.
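
One way to build the checklist of methods mentioned above is with a small reflection helper like the sketch below. It is not a deliverable, and it assumes the class is vmap.main.Vmap; adjust the fully qualified name if the source says otherwise.

    import java.lang.reflect.Method;
    import java.lang.reflect.Modifier;

    // Prints the public methods declared by the Vmap class, as a checklist
    // for deciding which calls your Task3 test suite should exercise.
    public class VmapMethodLister {

        public static void main(String[] args) throws ClassNotFoundException {
            Class<?> vmapClass = Class.forName("vmap.main.Vmap");
            for (Method m : vmapClass.getDeclaredMethods()) {
                if (Modifier.isPublic(m.getModifiers())) {
                    System.out.println(m);
                }
            }
        }
    }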

Deliverables: The deliverables from the tasks of this section are:

  1. Your test suite for Vmap, in a file called Task3.java. The class should use the default package (i.e. no package).
  2. A section in your report documenting your tests and including your report on coupling. Regarding the tests, while you're not working through as detailed a process as in Task 1, you should identify any important choices or assumptions you made while creating these tests, and discuss your test results.

Task 4: Using Coupling-based Coverage (20 marks, group activity)

Preparation: To prepare for this task you should review the slides for lectures 7 and 8 and the slides for chapters 6 and 13 of Pezze and Young, and work together to complete as much of tutorial 5 as you can.

Tutorial: Contact the tutor (Howard Lin) for this session to arrange a tutorial meeting.

For this section, consider just the method openDocument in the Vmap class and the use of the parameter url. You should do the following (a small self-contained illustration of last defs and first uses follows the list):

  1. Identify all the last defs and first uses of url, concentrating on coupling between the Vmap and Tools classes.

    Note that in the tutorial we look at coupling through return values as well; that's not necessary here.

  2. Work out the coupling paths that must be followed if the test is to conform to the all-coupling-uses coverage criterion.
  3. Develop a test suite for the method that exercises all the coupling paths that you have identified. If for some reason you fail to meet the criterion provide a short explanation why you cannot conform to it in your report.
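
To make the terminology concrete, here is a self-contained sketch using made-up classes (not VMAP code); in the actual task the caller is Vmap.openDocument and the callee is the Tools code it invokes.

    // Illustration only: a caller/callee pair showing parameter coupling.

    class Callee {
        static String describe(String url) {
            // First use of the coupled parameter in the callee.
            if (url.startsWith("http:")) {
                return "remote";
            }
            return "local";
        }
    }

    public class CouplingExample {
        public static void main(String[] args) {
            String url = "http://example.org/map.mm";
            url = url.trim();                   // last def of url before the call
            String kind = Callee.describe(url); // the call is the coupling point
            System.out.println(kind);
            // A coupling path runs from a last def in the caller, through the
            // call, to a first use in the callee. All-coupling-uses coverage
            // requires tests that execute a path from each last def to each
            // reachable first use.
        }
    }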

Deliverables: The deliverables from the tasks of this section are:

  1. A section in your report annotating the source of the relevant methods with the first uses and last defs of the variable url and identifying the coupling paths.
  2. Your test suite for Vmap.openDocument, in a file called Task4.java. The class should use the default package (i.e. no package).
  3. An explanation in your report — if it is necessary — detailing why the coupling-based coverage cannot be met.

Task 5: Mutation-Based Test Adequacy (20 marks, individual activity)

Have a look at tutorial 6 which should help a bit with deciding how to generate mutants.

Consider using mutation testing to check the adequacy of the test set you developed in Task 1. Develop at least two erroneous variants (or "mutants") of the toRelativeURL method which your test set can detect, i.e. two copies of toRelativeURL which you have modified in a small way to introduce a bug that your test suite will detect. If possible, also generate a variant that produces different results in some cases but which would not be detected by your Task 1 test set, and include that as well.
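
To illustrate what a mutant looks like (the method below is made up; your variants must be small modifications of the real toRelativeURL body), a single-operator change is typically enough:

    // Illustration only: the shape of a "mutant".
    public class MutationExample {

        // Original: counts how many characters two strings share at the start.
        static int commonPrefixLength(String a, String b) {
            int i = 0;
            while (i < a.length() && i < b.length() && a.charAt(i) == b.charAt(i)) {
                i++;
            }
            return i;
        }

        // Mutant: a single operator change (== becomes !=) that a good test,
        // e.g. one expecting commonPrefixLength("abc", "abd") to be 2, detects.
        static int commonPrefixLengthMutant(String a, String b) {
            int i = 0;
            while (i < a.length() && i < b.length() && a.charAt(i) != b.charAt(i)) {
                i++;
            }
            return i;
        }
    }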

Deliverables:

  1. A number of variants of toRelativeURL; call the variant files toRelativeURLVar1.java, toRelativeURLVar2.java, and so on.
  2. A short report on the adequacy of your test set as assessed by mutation testing. This should include some indication of the performance of the test suite on the variants you have generated.

Task 6: Common Java Problems (20 marks, individual activity)

In this section you are asked to review the effects of programming language choice on test sets.
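
As an illustration of the kind of test such a review might prompt, the sketch below exercises two well-known Java pitfalls; these examples are generic and not tied to VMAP, and your own Task6.java tests should target the code you are actually examining.

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class PitfallExamples {

        @Test
        public void floatingPointArithmeticIsInexact() {
            // 0.1 + 0.2 is not exactly 0.3, so comparisons need a tolerance.
            assertEquals(0.3, 0.1 + 0.2, 1e-9);
        }

        @Test
        public void integerArithmeticOverflowsSilently() {
            // Integer overflow wraps around rather than raising an error.
            assertTrue(Integer.MAX_VALUE + 1 < 0);
        }
    }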

Deliverables:

  1. A section in your report giving a short rationale for adding three tests that have been prompted by your reading for this section.
  2. A file named Task6.java containing implementations of two of your test ideas.

Task 7: System Testing (20 marks, individual activity)

Re-read the lecture slides that list some of the varieties of testing you might apply during system testing. Select the three different kinds of system test you consider most appropriate for VMAP. Investigate the best approaches to carrying out these tests on the VMAP system.

Deliverables:

  1. A short section in your report detailing your choice of the three most appropriate system testing activities, together with a justification for this choice. For each of the chosen system testing activities, provide a brief outline of how you would go about such testing for the VMAP system. Try to be reasonably specific, providing examples of test input and identifying what they are intended to test.

Task 8: GUI Testing (20 marks, individual activity)

One important aspect of system testing for VMAP is the testing of the GUI used to develop the mind maps. Prepare to answer this question by reading Coverage Criteria for GUI Testing and some more pragmatic advice on the use of the Abbot tool in GUI testing. You might also find this checklist useful. For this task, you should prepare a short report outlining the key elements of the VMAP GUI that require testing, and for each element you should outline how you would test it. Each description should include any necessary scaffolding code, the kind of test to be applied, and the expected result.

Note that you are not being asked to carry out this testing work. Your deliverable is a report outlining the testing that is needed for the VMAP GUI.
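
Although you are not asked to implement GUI tests, the following minimal self-contained sketch may help make the idea of "scaffolding code" concrete. It drives a plain Swing button rather than the real VMAP windows, which you would instead drive with a tool such as Abbot; the button label and listener are invented for illustration.

    import static org.junit.Assert.assertEquals;

    import java.awt.event.ActionEvent;
    import java.awt.event.ActionListener;
    import java.util.concurrent.atomic.AtomicInteger;

    import javax.swing.JButton;

    import org.junit.Test;

    // Minimal GUI-test scaffolding: set up a component, simulate the
    // interaction, and assert on the observed effect.
    public class GuiScaffoldingExample {

        @Test
        public void clickingTheButtonFiresItsAction() {
            final AtomicInteger clicks = new AtomicInteger();
            JButton button = new JButton("New node");
            button.addActionListener(new ActionListener() {
                public void actionPerformed(ActionEvent e) {
                    clicks.incrementAndGet();
                }
            });

            button.doClick(); // simulate the user pressing the button

            assertEquals(1, clicks.get());
        }
    }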

Deliverables:

  1. A short section in your report outlining how to go about testing the GUI provided in VMAP.

Task 9: Project Review (20 marks, individual activity)

The final task is a review of the main activities of the group in completing the practical. Write a short report on the progress of the group. Your report should include:

  1. The group's plan for the tasks.
  2. A review of deviations from the plan in terms of time allocation, degree of completion, sequence, completion dates, ...
  3. A short summary of what you see as the successes of the group in project planning and execution.
  4. A short list of three or four areas where you see room for improvement, with suggestions of how you might improve in these areas.

Deliverables:

  1. A section in your report providing a review of the planning of your group's practical work.

Submission of the Practical Work

After completing the practical you should have the following files (exactly which ones depends on the individual tasks your group tackled). It will help me with marking if you adhere exactly to these names (including upper/lower case):

report.pdf
A report comprising an introduction and the sections with your written answers to the tasks.
Task1.java
Your tests for Task 1.
Task2.java
Your tests for Task 2.
Task3.java
Your tests for Task 3.
Task4.java
Your tests for Task 4.
Task6.java
Your tests for Task 6.
toRelativeURLVar1.java
Your first variant of toRelativeURL (and similarly for the other variants).

Please write your tests in the default package (i.e. no package).

To submit your work you should designate one member of the group as the submitter for the group. The report should be clearly labelled with your group number. The submitter should gather together the files you wish to submit and execute this command (if for any reason you have not produced one of the listed files, omit it from the submit command):

submit st cw1 report.pdf Task1.java Task2.java Task3.java Task4.java Task5.java Task6.java toRelativeURLVar1.java toRelativeURLVar2.java toRelativeURLVar3.java

Assessment

In Weeks 9 and 10 of the semester each group will be invited to give a 30-minute presentation of their results, where they will have an opportunity to demonstrate the tests they have generated and receive feedback on their work. Details of the signup process will appear shortly on the web page.

Questions

Here are some relevant questions for Tasks 1 and 2:

Q: Is it OK to isolate toRelativeURL() from the vmap source and test it separately?
A: If it will make things easier for you, then certainly.

Q: What's the difference between a partition and a value class?
A: They're essentially the same thing (“value class” isn't mentioned in Ostrand & Balcer's original paper), but it might be useful to think of partitions as slightly higher-level verbal descriptions (e.g. “none”, “one”, “several”, and “very many”) corresponding to more technical value classes (e.g. 0, 1, 2-100 and 101+). In a more complex project this distinction would be more useful.

Q: Do we need to include our test results?
A: Yes. Your test case specification should include the results that you expect (cf. P&Y p. 189, Table 11.2). You should also document the actual results you got. A brief commentary to the effect that all tests passed, or that failures occurred and why, would be helpful in demonstrating that you implemented and executed your specification.

Q: My statement coverage is over 90%. What should I do in Task 2?
A: I've talked about different coverage criteria and how they're related, and Ntafos' paper in the reading gives a good overview of this. You've just been working with statement coverage. You should make the case for another coverage criterion, and possibly add one or two test cases to improve your score with that criterion.
Further discussion of Tasks 3 and 4 can be found here.

