
Software Testing Methodologies

by Shaik Shabeer
Institute: Nimra College of Engineering and Technology | Course: B.Tech, Computer Science Engineering

STANNS COLLEGE OF ENGINEERING & TECHNOLOGY
SOFTWARE TESTING METHODOLOGIES

Unit II

Verification and Validation: Verification & Validation Activities, Verification, Verification of Requirements, High-Level and Low-Level Designs, How to Verify Code, Validation
Dynamic Testing I: Black-Box Testing Techniques: Boundary Value Analysis, Equivalence Class Testing, State Table Based Testing, Decision Table Based Testing, Cause-Effect Graphing Based Testing, Error Guessing

Verification and Validation

Verification & Validation (V&V) Activities:
V&V activities are best understood with the help of the phases of the SDLC:

Requirements Gathering: This is an essential part of any project and of project management. Fully understanding what a project will deliver is critical to its success.

Requirement Specification or Objectives: A Software Requirements Specification (SRS) is a description of a software system to be developed. It lays out functional and non-functional requirements, and may include a set of use cases that describe the interactions users will have with the software. The SRS establishes the basis for an agreement between
customers and contractors or suppliers on what the software product is to do, as well as what it is not expected to do.

High-Level Design or Functional Design: A functional design ensures that each modular part of a system has only one responsibility and performs that responsibility with minimal side effects on other parts. Functionally designed modules tend to have low coupling. High-level design (HLD) explains the architecture to be used for developing a software product. The architecture diagram provides an overview of the entire system, identifying the main components that will be developed for the product and their interfaces. The HLD uses possibly non-technical to mildly technical terms that should be understandable to the administrators of the system.

Low-Level Design (LLD): This is a component-level design process that follows a step-by-step refinement approach. It can be used for designing data structures, the required software architecture, source code and, ultimately, performance algorithms. Overall, the data organization may be defined during requirement analysis and then refined during data design work. After the build, each component is specified in detail. The LLD phase is the stage where the actual software components are designed.

Coding: The goal of the coding phase is to translate the design of the system into code in a given programming language. The coding phase affects both testing and maintenance profoundly. Well-written code reduces the testing and maintenance effort. Since the testing and maintenance costs of software are much higher than the coding cost, the goal of coding should be to reduce the testing and maintenance effort. Hence, during coding the focus should be on developing programs that are easy to read and understand; simplicity and clarity should be strived for during the coding phase.

Verification: Verification is done at the start of the development process. It includes reviews, meetings, walkthroughs, inspections, etc. to evaluate documents, plans, code, requirements and specifications. It answers questions like:
--Am I building the product right?
--Am I accessing the data right (in the right place; in the right way)?
Verification is a low-level activity. Following the Capability Maturity Model (CMMI-SW v1.1), we can also define verification as "the process of evaluating software to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase. [IEEE-STD-610]".

Advantages of Software Verification:
--Verification helps in lowering the count of defects in the later stages of development.
--Verifying the product at the starting phase of development helps in understanding the product in a better way.
--It reduces the chances of failure in the software application or product.
--It helps in building the product as per the customer's specifications and needs.

The goals of verification are:
--Everything must be verified.
--Results of the verification may not be binary.
--Even implicit qualities must be verified.
Verification of Requirements:
In this type of verification, all the requirements gathered from the user's viewpoint are verified. For this purpose, an acceptance criterion is prepared. An acceptance criterion defines the goals and requirements of the proposed system and the acceptance limits for each of those goals and requirements. The tester works in parallel by performing the following two tasks:
1. The tester reviews the acceptance criteria in terms of completeness, clarity and testability.
2. The tester prepares the Acceptance Test Plan, which is referred to at the time of Acceptance Testing.

Verification of Objectives:
After requirements gathering, specific objectives are prepared considering every specification. These objectives are recorded in a document called the SRS. In this activity the tester performs two parallel activities:
1. The tester verifies all the objectives mentioned in the SRS.
2. The tester also prepares the System Test Plan, which is based on the SRS.

How to Verify Requirements and Objectives:
An SRS is verifiable if, and only if, every requirement stated therein is verifiable. A requirement is verifiable if, and only if, there exists some finite, cost-effective process with which a person or machine can check that the software product meets the requirement. In general, any ambiguous requirement is not verifiable. Non-verifiable requirements include statements such as "works well", "good human interface", and "shall usually happen". These requirements cannot be verified because it is impossible to define the terms "good", "well", or "usually". An example of a verifiable statement is "Output of the program shall be produced within 20 s of event x 60% of the time; and shall be produced within 30 s of event x 100% of the time". This statement can be verified because it uses concrete terms and measurable quantities. If a method cannot be devised to determine whether the software meets a particular requirement, then that requirement should be removed or revised.
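A requirement stated in concrete, measurable terms like the timing example above can be checked mechanically. The sketch below (a minimal illustration; the function name and the sample measurements are hypothetical, standing in for real response times collected during testing) verifies both bounds: at least 60% of outputs within 20 s, and 100% within 30 s.

```python
def verify_timing_requirement(response_times):
    """Check: output within 20 s of event x at least 60% of the time,
    and within 30 s of event x 100% of the time."""
    n = len(response_times)
    within_20 = sum(1 for t in response_times if t <= 20.0)
    within_30 = sum(1 for t in response_times if t <= 30.0)
    return (within_20 / n) >= 0.60 and within_30 == n

# Hypothetical measurements (in seconds) gathered during testing.
samples = [12.1, 18.4, 19.9, 22.0, 25.3, 14.7, 28.9, 17.2, 21.5, 16.0]
print(verify_timing_requirement(samples))  # True: 7/10 within 20 s, all within 30 s
```

Because the requirement names measurable quantities, the check is a finite, cost-effective process, which is exactly the definition of verifiability given above; a requirement phrased as "works well" admits no such check.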
Following are the points against which every requirement in the SRS should be verified.

Correctness: An SRS is correct if, and only if, every requirement stated therein is one that the software shall meet. Generally speaking, however, there is no tool or procedure to ensure correctness. That is why the SRS should be compared against superior documents (including the System Requirements Specification, if one exists) during a review process, in order to filter out possible contradictions and inconsistencies. Reviews should also be used to get feedback from the customer side on whether the SRS correctly reflects the actual needs. This process can be made easier and less error-prone by traceability.

Unambiguity: An SRS is unambiguous if, and only if, every requirement stated therein has only one interpretation. As a minimum, this requires that each characteristic of the final product be described using a single unique term. In cases where a term used in a particular context could have multiple meanings, the term should be included in a glossary where its meaning is made more specific.

Completeness: An SRS is complete if, and only if, it includes the following elements:
--All significant requirements imposed by a system specification should be acknowledged and treated.
--Full labels and references to all figures, tables, and diagrams in the SRS, and definitions of all terms and units of measure.
However, completeness is very hard to achieve, especially for business systems, where the requirements always change and new requirements arise.

Consistency: Consistency refers to internal consistency. If an SRS does not agree with some higher-level document, such as a system requirements specification, that is a violation of correctness. An SRS is internally consistent if, and only if, no subset of the individual requirements described in it conflict. The three types of likely conflicts in an SRS are as follows:
--The specified characteristics of real-world objects may conflict. For example, the format of an output report may be described in one requirement as tabular but in another as textual; or one requirement may state that all lights shall be green while another may state that all lights shall be blue.
--There may be logical or temporal conflict between two specified actions. For example, one requirement may specify that the program will add two inputs and another may specify that the program will multiply them; or one requirement may state that "A" must always follow "B", while another may require that "A" and "B" occur simultaneously.
--Two or more requirements may describe the same real-world object but use different terms for that object. For example, a program's request for a user input may be called a "prompt" in one requirement and a "cue" in another. The use of standard terminology and definitions promotes consistency.

Modifiability: An SRS is modifiable if, and only if, its structure and style are such that any changes to the requirements can be made easily, completely, and consistently while retaining the structure and style. Modifiability generally requires an SRS to:
a. Have a coherent and easy-to-use organization with a table of contents, an index, and explicit cross-referencing;
b. Not be redundant (i.e., the same requirement should not appear in more than one place in the SRS);
c. Express each requirement separately, rather than intermixed with other requirements.

Traceability: An SRS is traceable if the origin of each of its requirements is clear and if it facilitates the referencing of each requirement in future development or enhancement documentation.
The following two types of traceability are recommended:
a. Backward traceability (i.e., to previous stages of development). This depends upon each requirement explicitly referencing its source in earlier documents.
b. Forward traceability (i.e., to all documents spawned by the SRS). This depends upon each requirement in the SRS having a unique name or reference number.

Verification of High-Level Design:
As with the verification of requirements, the tester is responsible for two parallel activities at this stage as well:
1. The tester verifies the high-level design. Since the system has been decomposed into a number of sub-systems or components, the tester should verify the functionality of these components and their interfaces. Every requirement in the SRS should map to the design.
2. The tester also prepares a Function Test Plan, which is based on the SRS. This plan will be referenced at the time of Function Testing.

How to Verify High-Level Design:
Verification of Data Design:
--Check whether the sizes of data structures have been estimated appropriately.
--Check the provisions for overflow in a data structure.
--Check the consistency of data formats with the requirements.
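The overflow-provision check in the data-design list above can be exercised with a small test. In the sketch below (a hypothetical example: the BoundedBuffer class is a stand-in for whatever data structure the design specifies), the verification step confirms that the structure enforces its estimated capacity instead of growing silently past it.

```python
class BoundedBuffer:
    """Hypothetical fixed-capacity buffer whose design includes an
    explicit overflow provision: appends beyond capacity are rejected."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []

    def append(self, item):
        # Overflow provision: refuse to grow past the designed capacity.
        if len(self.items) >= self.capacity:
            raise OverflowError("buffer capacity exceeded")
        self.items.append(item)

# Verification check: the overflow provision must trigger exactly at the limit.
buf = BoundedBuffer(capacity=3)
for value in (1, 2, 3):
    buf.append(value)
try:
    buf.append(4)
    print("FAIL: overflow not detected")
except OverflowError:
    print("PASS: overflow provision works")
```

The same pattern applies to the other two checks in the list: a size-estimation check compares the designed capacity against the data volumes stated in the SRS, and a format-consistency check compares field types and units against the requirements.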