IBSA’s experience in testing has been demonstrated in all manner of projects and roles by our dedicated Software Testing practice. Automated testing forms a keystone of the IBSA development methodology and IBSA has provided separate testing services for the National Australia Bank, Telstra, MotorOne, Mercedes Benz, Victoria University, GE Money and Veda Advantage.

This experience includes the development, implementation and review of detailed test strategies and the analysis of testing requirements. The majority of IBSA’s projects involve one or more of our team of testers, and all our service offerings are therefore integrated with this area of our practice. As one of the pioneers of agile development in Australia (adopted soon after commencing operations in 1999), IBSA is a strong advocate of automated testing as a means of improving the quality of software systems. Every system developed by IBSA has been built with a comprehensive test harness allowing repeated, continuous automated testing.

This test-centric approach has important consequences:

  • code is designed for automated testing and is therefore highly modular,
  • automated tests provide a safety net for continuous code improvement and design evolution without fear of regressions,
  • test cases provide detailed documentation of expected behaviour, and their specification acts as a common communication point between developers and subject matter experts for clarifying edge cases and exception conditions.

IBSA acknowledges that quality must be built into the entire development process and cannot be “tested in” at the end. Our approach to delivering testing services is therefore based on the part of the selected IBSA development methodology that describes the strategy, process, workflows and techniques used to plan, organise, execute and manage testing of the product, ensuring it conforms to specification and is fit for purpose.

All our methodologies require the early development of a Project Testing Plan, which defines overall test objectives and requirements for the solution and sets out a detailed test approach, including test stages, cycles and cases.

While the testing process must be tailored to the specifics of each project, a number of best-practice activities are usually involved, and we remain aware at all times that visibility, traceability and accountability are key to successful outcomes.

For software development projects, IBSA adopts automated testing processes based on JUnit and NUnit (for Java and .NET unit testing respectively) and Selenium for browser-based UI testing. Test cases are specified for each function to be developed, and development is not “code complete” until those cases are implemented as automated test code. Tests combine white-box testing at the function and unit level with black-box testing at the unit, integration and UI level. Exploratory, ad hoc testing is performed manually by expert testers to identify unexpected or inconsistent behaviour and to verify edge-case robustness.
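This test-first workflow can be sketched as follows. The DiscountCalculator class and its thresholds are hypothetical, invented purely for illustration, and the checks use plain Java assertions so the example is self-contained; in an actual JUnit or NUnit harness each check would be an annotated test method:

```java
public class DiscountCalculatorTest {
    // Hypothetical class under test: its test cases were specified before
    // the code was written, so each branch below answers a documented case.
    static class DiscountCalculator {
        static double discountFor(int quantity) {
            if (quantity < 0) {
                throw new IllegalArgumentException("quantity must be non-negative");
            }
            if (quantity >= 100) return 0.10; // bulk rate
            if (quantity >= 10)  return 0.05; // volume rate
            return 0.0;
        }
    }

    public static void main(String[] args) {
        // Each check documents an expected behaviour, including boundary
        // and exception conditions agreed with subject matter experts.
        check(DiscountCalculator.discountFor(1) == 0.0,
                "small orders receive no discount");
        check(DiscountCalculator.discountFor(10) == 0.05,
                "boundary: 10 items earns the volume rate");
        check(DiscountCalculator.discountFor(100) == 0.10,
                "boundary: 100 items earns the bulk rate");

        boolean rejected = false;
        try {
            DiscountCalculator.discountFor(-1);
        } catch (IllegalArgumentException e) {
            rejected = true;
        }
        check(rejected, "negative quantities are rejected");

        System.out.println("all tests passed");
    }

    static void check(boolean condition, String description) {
        if (!condition) throw new AssertionError("FAILED: " + description);
    }
}
```

Note how the boundary values (10 and 100) appear explicitly in the tests: the test suite itself becomes the precise, executable record of the agreed behaviour, which is the “documentation” role described above.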

Each release is generally accompanied by a User Acceptance Test Entry Report detailing the tests performed and the documents involved.

When engaged to provide testing separately from development, IBSA’s approach is to create a Test Strategy Document and one or more Test Plans. The Test Strategy Document contains sections titled:

  • Scope and Objectives,
  • Business issues,
  • Roles and Responsibilities,
  • Communication and Status Reporting,
  • Deliverability,
  • Industry Standards,
  • Automation and Tools,
  • Measurements and Metrics,
  • Risks and mitigation,
  • Defect Reporting and Tracking,
  • Change and Configuration Management, and
  • Training Plan.

Multiple Test Plans may be required to cover different phases or components. Each contains sections for:

  • Introduction,
  • Test items,
  • Features to be tested,
  • Features not to be tested,
  • Test techniques,
  • Testing tasks,
  • Suspension criteria,
  • Feature pass or fail criteria,
  • Test environment (entry criteria, exit criteria),
  • Test deliverables,
  • Staff and training needs,
  • Responsibilities, and
  • Schedule.