
Unit of competency details

ICAT5079B - Perform integration test (Release 1)

Summary

Usage recommendation:
Superseded
Mapping:
Is superseded by and equivalent to ICASAS514A - Perform integration tests
Notes: Outcomes deemed equivalent. Added application of unit. Changes to range statement, required skills and knowledge and evidence guide.
Date: 17/Jul/2011

Releases:
Release 1 (this release): 08/Jul/2010

Classifications

Scheme: ASCED Module/Unit of Competency Field of Education Identifier
Code: 029999
Classification value: Information Technology, N.e.c.

Classification history

Scheme: ASCED Module/Unit of Competency Field of Education Identifier
Code: 029999
Classification value: Information Technology, N.e.c.
Start date: 08/Jul/2010

Modification History

Not Applicable

Unit Descriptor

Unit descriptor 

This unit defines the competency required to ensure that the components of the system operate together to the expected standard.

This unit is linked with the following unit, and together they form an appropriate cluster:

  • ICAA5050B Develop detailed component specifications from project specifications

No licensing, legislative, regulatory or certification requirements apply to this unit at the time of publication.

Application of the Unit

Application of the unit 

Licensing/Regulatory Information

Refer to Unit Descriptor

Pre-Requisites

Prerequisite units 

Employability Skills Information

Employability skills 

This unit contains employability skills.

Elements and Performance Criteria Pre-Content

Elements describe the essential outcomes of a unit of competency.

Performance criteria describe the performance needed to demonstrate achievement of the element. Where bold italicised text is used, further information is detailed in the required skills and knowledge section and the range statement. Assessment of performance is to be consistent with the evidence guide.

Elements and Performance Criteria

ELEMENT 

PERFORMANCE CRITERIA 

1. Prepare for test

1.1. Prepare the test environment 

1.2. Prepare the test scripts (online test) or test run (batch test) for running

1.3. Review expected results against test and acceptance criteria 

1.4. Confirm pre-existing modules and compile modification logs

1.5. Perform static tests of each point of integration and verify correctness of arguments, positional parameters and return values in each integration suite

1.6. Review results of earlier component testing and ensure critical issues are identified and taken into account

2. Conduct test

2.1. Select appropriate test tools 

2.2. Run test scripts and document the results against software life cycle  model

2.3. Ensure that memory leakage, global name space pollution and static variables are specifically addressed for each integration unit in line with test and acceptance criteria 

2.4. Follow and adopt integration standards where appropriate in line with quality benchmarks 

2.5. Compare test results to requirements on completion of each integration component

3. Analyse and classify results

3.1. Summarise and classify test results and highlight areas of concern

3.2. Compare the test results against the requirements and design specification and prepare report

3.3. Notify operations of completion of the testing

3.4. Ensure attendees' details/comments are logged and signatures gained

3.5. Schedule a feedback meeting to discuss report and possible next actions with stakeholders if necessary

3.6. Ensure test reporting complies with documentation and reporting standards
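
Performance criterion 1.5 above calls for static tests of each point of integration, verifying arguments, positional parameters and return values. The sketch below is one hedged illustration of how such a check might be automated in Python; the 'Billing' component, the 'charge' interface and the expected-interface table are invented for the example and are not part of this unit.

import inspect


# Stand-in for a component under integration (illustrative only).
class Billing:
    def charge(self, customer_id: str, amount: float, currency: str = "AUD") -> bool:
        return amount > 0


# Expected interface at the integration point: required positional
# parameters, in order, and the expected return annotation.
EXPECTED = {
    "charge": (["customer_id", "amount"], bool),
}


def check_integration_points(component):
    """Statically verify arguments, positional parameters and return values."""
    failures = []
    for name, (expected_params, expected_return) in EXPECTED.items():
        func = getattr(component, name, None)
        if func is None:
            failures.append(name + ": not provided by component")
            continue
        sig = inspect.signature(func)
        required = [p.name for p in sig.parameters.values()
                    if p.default is inspect.Parameter.empty]
        if required != expected_params:
            failures.append(f"{name}: required parameters {required} "
                            f"do not match expected {expected_params}")
        if sig.return_annotation is not expected_return:
            failures.append(f"{name}: return annotation {sig.return_annotation} "
                            f"does not match expected {expected_return}")
    return failures


if __name__ == "__main__":
    for failure in check_integration_points(Billing()):
        print("FAIL:", failure)
    print("Static integration check complete.")

In practice the expected interfaces would be drawn from the requirements and design specification rather than hard-coded as above.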
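
Performance criterion 2.3 above requires that memory leakage, global name space pollution and static variables are specifically addressed for each integration unit. The following Python sketch, with an invented 'integration_unit' stand-in and an assumed memory threshold, shows one possible way to automate two of those checks around a single run; it is illustrative only, not a prescribed method.

import tracemalloc


def integration_unit():
    """Stand-in integration unit; deliberately pollutes the global name space."""
    globals()["_leaked_cache"] = list(range(10_000))


def run_with_checks(unit, threshold_bytes=100_000):
    """Run one integration unit and report name space and memory concerns."""
    findings = []
    names_before = set(globals())

    tracemalloc.start()
    unit()
    still_allocated, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()

    new_names = set(globals()) - names_before
    if new_names:
        findings.append(f"global name space pollution: {sorted(new_names)}")
    if still_allocated > threshold_bytes:
        findings.append(f"{still_allocated} bytes still allocated after the run "
                        f"(peak {peak}); exceeds threshold {threshold_bytes}")
    return findings


if __name__ == "__main__":
    for finding in run_with_checks(integration_unit):
        print("CONCERN:", finding)

The threshold and the use of tracemalloc are assumptions for the sketch; the actual limits and tooling would come from the test and acceptance criteria for the project.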

Required Skills and Knowledge

REQUIRED SKILLS AND KNOWLEDGE 

This section describes the skills and knowledge required for this unit.

Required skills 

  • Problem solving skills for a defined range of unpredictable problems involving participation in the development of strategic initiatives (e.g. when static tests of each point of integration are performed and correctness of arguments, positional parameters and return values in each integration suite are verified)
  • Plain English literacy and communication skills in relation to analysis, evaluation and presentation of information (e.g. when attendees' details/comments are logged and signatures are gained)
  • Data analysis skills in relation to analysis, evaluation and presentation of information (e.g. when static tests of each point of integration are performed and correctness of arguments, positional parameters and return values in each integration suite are verified, when each test script is run and results are documented, and when memory leakage, global name space pollution and static variables are specifically addressed for each integration unit)
  • Research skills for identifying, analysing and evaluating broad features of system testing and best practice in system testing; high-order problem solving skills (e.g. when results of earlier unit testing are reviewed and critical issues to take into account are identified)
  • Programming skills in programming language/s relevant to project (e.g. when static tests of each point of integration are performed and correctness of arguments, positional parameters and return values in each integration suite are verified, and when each test script is run and results are documented)

Required knowledge 

  • Broad knowledge of at least two programming languages, with detailed knowledge of programming languages required by system
  • Detailed knowledge of system/application being tested
  • Broad knowledge of testing techniques, with detailed knowledge of features and processes in some areas
  • Broad knowledge of automated test tools, with detailed knowledge of features and processes in some areas
  • Detailed knowledge of underlying test data
  • Detailed knowledge of input/output requirements

Evidence Guide

EVIDENCE GUIDE 

The evidence guide provides advice on assessment and must be read in conjunction with the performance criteria, required skills and knowledge, range statement and the Assessment Guidelines for the Training Package.

Overview of assessment 

Critical aspects for assessment and evidence required to demonstrate competency in this unit 

Evidence of the following is essential:

  • Assessment must confirm sufficient knowledge of the integration requirements for the units of the particular system.
  • Assessment must confirm the ability to determine whether the units of the system operate according to requirements specifications.

To demonstrate competency in this unit the person will require access to:

  • Acceptance criteria
  • Test plan
  • Integration standards
  • Requirements and design documents used in the analysis of the test
  • System/application suitable for testing

The person will need to ensure that:

  • Components have been compiled, linked, and loaded together
  • Components have successfully passed the integration tests at the interface level between each component

Context of and specific resources for assessment 

It should be noted that it is not the quality of the code that is being assessed, but rather competency in testing the components.

Integration testing involves formal testing of the combined parts of an application to determine if they function together correctly and is usually performed after unit and functional testing. This type of testing is especially relevant to client/server and distributed systems.
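
As a concrete illustration of the paragraph above, the minimal Python sketch below tests two invented components (a parser and a calculator) through their shared interface, exercising the combined parts together rather than each part in isolation as a unit test would; the components and expected values are assumptions made for the example.

import unittest


# Illustrative components; in practice these are the real modules under test.
def parse(expression):
    """Parse 'LEFT OP RIGHT' into two operands and an operator."""
    left, op, right = expression.split()
    return float(left), op, float(right)


def calculate(left, op, right):
    """Apply a supported operator to two operands."""
    return {"+": left + right, "-": left - right}[op]


class ParserCalculatorIntegrationTest(unittest.TestCase):
    """Checks that the combined parts function together at their interface."""

    def test_parse_output_is_acceptable_calculate_input(self):
        self.assertEqual(calculate(*parse("2 + 3")), 5.0)

    def test_unsupported_operator_is_surfaced(self):
        with self.assertRaises(KeyError):
            calculate(*parse("2 / 3"))


if __name__ == "__main__":
    unittest.main()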

The breadth, depth and complexity covering planning and initiation of alternative approaches to skills or knowledge applications across a broad range of technical and/or management requirements, evaluation and coordination would be characteristic.

Assessment must ensure:

  • The demonstration of competency may also require self-directed application of knowledge and skills, with substantial depth in some areas where judgement is required in planning and selecting appropriate equipment, services and techniques for self and others.

  • Applications involve participation in development of strategic initiatives as well as personal responsibility and autonomy in performing complex technical operations or organising others. It may include participation in teams including teams concerned with planning and evaluation functions. Group or team coordination may also be involved.

Method of assessment 

The purpose of this unit is to define the standard of performance to be achieved in the workplace. In undertaking training and assessment activities related to this unit, consideration should be given to the implementation of appropriate diversity and accessibility practices in order to accommodate people who may have special needs. Additional guidance on these and related matters is provided in ICA05 Section 1.

  • Competency in this unit should be assessed using summative assessment to ensure consistency of performance in a range of contexts. This unit can be assessed either in the workplace or in a simulated environment. However, simulated activities must closely reflect the workplace to enable full demonstration of competency.

  • Assessment will usually include observation of real or simulated work processes and procedures and/or performance in a project context as well as questioning on underpinning knowledge and skills. The questioning of team members, supervisors, subordinates, peers and clients where appropriate may provide valuable input to the assessment process. The interdependence of units for assessment purposes may vary with the particular project or scenario.

Guidance information for assessment 

Holistic assessment with other units relevant to the industry sector, workplace and job role is recommended, for example:

  • ICAA5050B Develop detailed component specifications from project specifications

An individual demonstrating this competency would be able to:

  • Demonstrate understanding of a broad knowledge base incorporating theoretical concepts, with substantial depth in some areas
  • Analyse and plan approaches to technical problems or management requirements
  • Transfer and apply theoretical concepts and/or technical or creative skills to a range of situations
  • Evaluate information, using it to forecast for planning or research purposes
  • Take responsibility for own outputs in relation to broad quantity and quality parameters
  • Take some responsibility for the achievement of group outcomes
  • Maintain knowledge of industry products and services

Range Statement

RANGE STATEMENT 

The range statement relates to the unit of competency as a whole. It allows for different work environments and situations that may affect performance. Bold italicised wording, if used in the performance criteria, is detailed below. Essential operating conditions that may be present with training and assessment (depending on the work situation, needs of the candidate, accessibility of the item, and local industry and regional contexts) may also be included.

Test environment  may include:

  • data
  • program libraries
  • network/communications and other equipment
  • operating system
  • other support software

Software life cycle  may include:

  • AS/NZS ISO/IEC 12207:1997 Information technology - Software life cycle processes
  • AS/NZS 15271:1999 Guide for AS/NZS ISO/IEC 12207 Information technology - software life cycle processes

Test and acceptance processes  may include:

  • AS 4006-1992 Software test documentation
  • AS/NZS 14143.1:1999 Information technology - software measurement - functional size measurement - definition of concepts
  • AS/NZS 15026:1999 Information technology - system and software integrity levels
  • AS 4006-1992 Software test documentation, IEEE Standard for software unit testing
  • International and Australian Standards are updated and changed on a regular basis. It is therefore important to check the Standards Australia website on a regular basis for new standards: http://www.standards.com.au

Quality benchmarks 

Several organisations have developed standards for software review, mainly the US Department of Defence (DoD), the IEEE, the Software Engineering Institute (SEI) and the ISO.

Relevant quality standards include:

  • AS 4043-1992 Software configuration management
  • AS 4042-1992 Software configuration management plans
  • AS 3925.1-1994 Software quality assurance - plans
  • AS/NZS 4258:1994 Software user documentation process
  • AS/NZS ISO/IEC 12207:1997 Information technology - software life cycle processes
  • AS/NZS 14102:1998 Information technology - guideline for evaluation and selection of CASE tools

International and Australian Standards are updated and changed on a regular basis. It is therefore important to check the Standards Australia website on a regular basis for new standards: http://www.standards.com.au

Test and acceptance criteria 

  • Dependent on the type of test (e.g. functional, efficiency, cohesion)

Documentation and reporting 

  • Documentation for version control may follow ISO/IEC/AS standards. Audit trails, naming standards, version control, project management templates and report writing styles will vary according to organisational approach. Information gathering processes may have associated templates

Test tools may include:

  • Code/unit/class testing: AssertMate, BoundsChecker, C-Cover, CodeReview, CodeWizard, DeepCover, FailSafe, Hindsight, Insure++, JCAST, Logiscope, JavaPureCheck
  • Stress load testing: automated test facilities, e-Load, E-TEST Suite, e-MONITO, Astra SiteManager, Astra SiteTest, AutoTester Web, LoadRunner, JavaLoad
  • Applications testing: DataShark, Cyrano Suite, Datatect, preVue-C/S

Unit Sector(s)

Unit sector 

Test

Co-requisite units

Co-requisite units 

Competency field

Competency field