QA LifeCycle & Software Testing Approach
Most activities in the IT industry follow a defined process, whether for the entire project, development activities, software testing activities, or management activities. In software testing interviews, questions are asked mainly about the different life cycles: their phases, components, importance, and so on. This article covers the life cycles followed in software testing, along with a few commonly used test approaches.
1. Software Testing Life Cycle
This section covers what the Software Testing Life Cycle is and what its phases are, along with the activities performed in each phase.
2. What is Software Testing Life Cycle?
The Software Testing Life Cycle is the life cycle in which every testing activity is categorized into phases, and these phases are followed sequentially. It starts right at the beginning of the project, when requirement analysis begins, and ends when software testing activities are closed. Below are the testing phases that are followed in sequence to form the Software Testing Life Cycle.
- Requirement Analysis
- Test Planning / Test Strategy
- Test Design / Test Case Development
- Test Bed / Environment Setup
- Test Execution
- Test Closure
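The fixed ordering of these phases can be sketched as a tiny sequence model. This is a minimal illustration, and the enum names are shorthand invented here for the six phases listed above:

```python
from enum import IntEnum

# Hypothetical shorthand names for the six STLC phases; the IntEnum
# values encode the sequential order described above.
class Phase(IntEnum):
    REQUIREMENT_ANALYSIS = 1
    TEST_PLANNING = 2
    TEST_DESIGN = 3
    ENVIRONMENT_SETUP = 4
    TEST_EXECUTION = 5
    TEST_CLOSURE = 6

def next_phase(current):
    """Return the phase that follows `current`, or None after closure."""
    return Phase(current + 1) if current < Phase.TEST_CLOSURE else None
```

For example, `next_phase(Phase.TEST_DESIGN)` yields `Phase.ENVIRONMENT_SETUP`, reflecting that environment setup only begins once test design is done.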
3. Explain the Requirement Analysis phase
The requirement analysis phase is where the requirement specifications shared by the customer are thoroughly analyzed and understood. Testable and non-testable requirements are categorized, and a brainstorming session is carried out to identify ambiguous and unclear requirements. These are documented and shared with the customer as a query log to get confirmation on each point, so that nothing is misunderstood or mistaken. Main motto: DO NOT ASSUME.
4. Explain Test Planning / Test Strategy phase
The Manager / Lead defines the test strategy. It determines factors such as the effort needed, scope, schedules/milestones, test approach, and cost estimates for the entire project. Based on the requirement analysis, the Test Plan is drafted. The Test Plan is a living document that captures all software testing activities, the scope to test, resource information, and so on.
5. Explain Test Design / Test Case Development phase
Detailed test case writing is started by the testing team for the testable requirements, along with the preparation of test data for execution. All test cases undergo review, either by customers, peers, or leads, to ensure good test coverage. Along with this, the Requirement Traceability Matrix (RTM) is prepared to map each test case to a testable requirement. This helps determine test coverage in terms of test cases.
Also, in some projects, test scenarios are identified and documented in this phase instead of test cases. These are written at a very high level, describing only what the scenario has to do, so developing test scenario documentation requires highly skilled testers.
Test cases / test scenarios here are written for the Smoke Test and the Regression Test.
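An RTM is essentially a mapping from requirements to the test cases that cover them. The sketch below uses hypothetical requirement and test case IDs to show how coverage and gaps can be read off the matrix:

```python
# Sketch of a Requirement Traceability Matrix (RTM): hypothetical
# requirement IDs mapped to the test cases that cover them.
rtm = {
    "REQ-001": ["TC-001", "TC-002"],
    "REQ-002": ["TC-003"],
    "REQ-003": [],            # no mapped test case -- a coverage gap
}

def coverage_gaps(matrix):
    """Return requirement IDs that have no mapped test case."""
    return [req for req, cases in matrix.items() if not cases]

def coverage_percent(matrix):
    """Percentage of requirements covered by at least one test case."""
    covered = sum(1 for cases in matrix.values() if cases)
    return round(100 * covered / len(matrix), 1)
```

Here `coverage_gaps(rtm)` flags `REQ-003` as untested, which is exactly the question an RTM is built to answer before execution begins.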
6. Explain Test Bed / Environment Setup phase
The testbed will be set up with all the required hardware and software. Once the Test Environment is set up, the Smoke test cases will be executed to check the readiness of the test environment.
7. Explain the Test Execution phase
Once the smoke test passes, regression test cases are executed by the software testing team. Passed test cases are marked PASSED, and failed ones FAILED; each FAILED test case has its defect logged in the bug management tool, and the test case and the bug are linked to each other. A test case may also be marked CONDITIONAL PASSED when it has minor issues that do not hamper the functionality; the issue should still be logged in the bug management tool and linked to the test case. Any test case that cannot be executed due to a blocker issue is marked BLOCKED, with the issue logged and linked to it. The test execution phase helps determine how correctly a feature has been implemented.
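The bookkeeping rule above (every non-passing result must link a logged bug) can be sketched as a small helper. The status names follow the text; the test case and bug IDs are hypothetical:

```python
# Statuses used during test execution, as described above.
VALID_STATUSES = {"PASSED", "FAILED", "CONDITIONAL PASSED", "BLOCKED"}

def record_result(test_case, status, bug_id=None):
    """Record a test result; any non-PASSED status must link a logged bug."""
    if status not in VALID_STATUSES:
        raise ValueError(f"unknown status: {status}")
    if status != "PASSED" and bug_id is None:
        raise ValueError(f"{status} requires a linked bug")
    return {"test_case": test_case, "status": status, "bug": bug_id}
```

For example, `record_result("TC-007", "FAILED", bug_id="BUG-101")` succeeds, while marking a case FAILED with no bug raises an error, enforcing the test-case-to-bug link.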
8. Explain the Test Cycle Closure phase
The test cycle is closed based on criteria such as test coverage, quality, cost, time, and business objectives. This phase also includes a retrospective to evaluate what went well, what went badly, which areas need improvement, and the lessons learned from the current cycle. Test metrics are prepared to analyze the success rate and the deviation from the plan.
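The two closure metrics mentioned, success rate and deviation from plan, are simple ratios. A minimal sketch, with invented example numbers:

```python
def success_rate(passed, executed):
    """Percentage of executed test cases that passed."""
    return round(100 * passed / executed, 1)

def plan_deviation(planned, executed):
    """Positive result means fewer cases were executed than planned."""
    return planned - executed

# Hypothetical cycle: 200 cases planned, 180 executed, 162 passed.
rate = success_rate(162, 180)       # 90.0
deviation = plan_deviation(200, 180)  # 20 cases short of plan
```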
9. Bug Management
In Software Testing Interview, questions on bug management through its life cycle, components, severity, and priority are commonly asked and very important as well.
10. Explain the bug life cycle in Software Testing
The bug life cycle / bug management is one of the most important activities followed by the entire team (developers, testers, leads, managers). The bug life cycle should be followed very strictly, as customers and stakeholders keep a keen eye on bugs. Moving their statuses correctly and on time helps in judging the success of any component or project. A large number of open bugs hampers both the product's quality and the customer's faith. Below is the life cycle of any bug logged by the tester during software testing.
- When the tester finds the bug, it gets logged into the Bug Management tool with all the required information. Bug status will be NEW at this point
- Once the bug is logged, it is reviewed by Test Lead / Manager and is assigned to the Development team. Bug status moves from NEW to ASSIGNED at this stage
- The developer reviews the bug, and if it is valid, the bug is opened and the status moves from ASSIGNED to OPEN.
- Once the bug is fixed, the developer moves the bug status from OPEN to FIXED. At this stage, the bug is assigned back to the tester who logged it.
- The tester then retests the fixed bug, and if it is working as expected, the status is marked VERIFIED.
- Once the status moves to VERIFIED, the tester closes the bug by marking it CLOSED.
- If the bug is not fixed and the issue still exists, the tester reopens the bug and assigns it back to the developer, with all the information gathered while testing it. Here the status moves from FIXED to REOPEN.
- The developer reviews the bug, and if more information is needed to analyze it, moves the bug status to NEED MORE INFO and assigns it to the tester. The tester then provides the required information and assigns the bug back to the developer.
- The developer reviews the bug, and if it is invalid, moves the bug status to INVALID and closes it.
- The developer reviews the bug, and if it has already been logged by another tester, moves the status to DUPLICATE and closes it.
- The developer reviews the bug, and if it cannot be fixed in the current release, then after discussion with managers, the bug is moved to DEFERRED. This means the bug will be fixed in a future release.
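The life cycle above is a state machine: from each status, only certain moves are legal. The transition table below mirrors the bullet list; real bug trackers differ in names and allowed moves, so treat it as illustrative:

```python
# Allowed bug status transitions, mirroring the life cycle described above.
TRANSITIONS = {
    "NEW":            {"ASSIGNED"},
    "ASSIGNED":       {"OPEN", "INVALID", "DUPLICATE", "NEED MORE INFO"},
    "OPEN":           {"FIXED", "DEFERRED"},
    "FIXED":          {"VERIFIED", "REOPEN"},
    "REOPEN":         {"FIXED", "NEED MORE INFO"},
    "NEED MORE INFO": {"OPEN", "REOPEN"},
    "VERIFIED":       {"CLOSED"},
    "INVALID":        {"CLOSED"},
    "DUPLICATE":      {"CLOSED"},
    "DEFERRED":       {"ASSIGNED"},   # picked up again in a future release
    "CLOSED":         set(),          # terminal state
}

def move(status, new_status):
    """Validate a status change against the life cycle."""
    if new_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"illegal transition {status} -> {new_status}")
    return new_status
```

Encoding the transitions this way makes it obvious, for instance, that a bug can never jump straight from NEW to CLOSED without passing through review.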
11. What are the different components of the bug and explain them
While logging a bug, the tester has to provide a lot of information. The bug management tool has fields for this information, to be filled in while logging the bug. Below are the components generally found in any logged bug:
- Summary: Short title for the bug that summarizes what is happening. It should be catchy and clear; just by reading the summary, anyone should be able to understand the issue.
- Description: Detailed steps to reproduce the bug, starting from launching the application until the bug is encountered. All the steps should be clearly described, with input data wherever needed.
- Actual Result: What is happening. The issue should be clearly described along with the analysis made. It is also good to give the URL of the page where the bug is seen.
- Expected Result: What should happen. The expected behavior should be described correctly, with references to the relevant requirement.
- Component: Which component the bug belongs to. Select the correct feature that the bug belongs to. This is because a bug gets assigned to the developer who is working on a particular feature.
- Severity: Level of the impact of the bug – Critical, Major, Normal, Minor
- Priority: Urgency to fix the bug – Critical, High, Medium, Low
- Release info: Release name of the application. Its naming depends on the project (example: App 1.0, Maintenance 1.0, etc.)
- Build info: Current build number in the release. This indicates the build in which the bug is found
- Environment: Testing environment name (testing, staging, prod, etc.). Testing can happen in several environments; this indicates the one in which the bug was found.
- Operating System: OS version of the machine where a bug is found
- Browsers: The browsers in which the bug is encountered (usually one or two browsers are listed, unless cross-browser testing comes into the picture).
- Attachment: Screenshot or video recording of the bug. The attachment should indicate the area of the bug. Highlight all possible information that should be noted in the screenshots/video.
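The components above can be modeled as a record. The sketch below is a hypothetical shape for a bug report, with default values assumed for illustration; real trackers define their own field sets:

```python
from dataclasses import dataclass, field

# Hypothetical bug-report record carrying the components listed above.
@dataclass
class BugReport:
    summary: str                 # short, clear title
    description: str             # steps to reproduce
    actual_result: str
    expected_result: str
    component: str               # feature the bug belongs to
    severity: str = "Normal"     # Critical / Major / Normal / Minor
    priority: str = "Medium"     # Critical / High / Medium / Low
    release: str = ""
    build: str = ""
    environment: str = "testing"
    operating_system: str = ""
    browsers: list = field(default_factory=list)
    attachments: list = field(default_factory=list)
```

Making the first five fields mandatory reflects the point above: a bug without a summary, reproduction steps, actual/expected results, and a component is not actionable.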
12. Explain different Severity Levels that the bug could have
- Critical / Show Stopper (S1): Bug that hampers critical functionality and blocks the user from proceeding with any action on the particular feature, or on the application as a whole. Example: submissions not working at all, 404 errors, etc.
- Major or Severe (S2): Bug where the feature misbehaves and produces wrong results, deviating completely from the requirement. Examples: wrong calculations, incorrect logic, no results for valid inputs, many results for invalid inputs, login with no validation checks, broken links, etc.
- Moderate / Normal (S3): Bug that hampers the functionality indirectly; the affected functionality can still be tested in other ways to produce correct results. Examples: pages not updating instantly upon data input, broken images that still work when clicked, etc.
- Low or Minor (S4): Bug that does not have an impact on the functionality. UI issues, alignments, spellings, grammar. Example: Logo misplaced, widget borders overlapped or crossing page borders, etc.
13. Explain different Priority Levels that the bug could have
- Priority 1 Critical (P1): Bug has to be fixed immediately, within a few hours of logging it.
- Priority 2 High (P2): Once the P1 bugs are fixed, P2 bugs are taken into consideration. These have to be fixed within 24-48 hours of logging it.
- Priority 3 Medium (P3): Once the P2 bugs are fixed, P3 bugs are taken into consideration. These have to be fixed within 2-4 days of logging it or within a week or so.
- Priority 4 Low (P4): Once the P3 bugs are fixed, P4 bugs are taken into consideration. These can be scheduled for the next builds or releases. Bugs that do not hamper functionality and are not very visible to the user are fixed at this priority.
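The fix windows above can be captured as a small lookup. The numbers here simply restate the text; real projects tune these targets in their own SLAs:

```python
# Hypothetical fix-window targets per priority, matching the text above.
FIX_WINDOW_HOURS = {
    "P1": 4,      # immediately, within a few hours
    "P2": 48,     # within 24-48 hours
    "P3": 96,     # within 2-4 days, possibly up to a week
    "P4": None,   # next build or release -- no fixed window
}

def is_overdue(priority, hours_open):
    """True when a bug has been open longer than its fix window."""
    window = FIX_WINDOW_HOURS[priority]
    return window is not None and hours_open > window
```

A P1 bug open for six hours is already overdue, while a P4 bug is never flagged, since it is simply queued for a future build.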
14. Test approaches
In software testing interviews, questions are asked about which approach is best for testing and which factors to consider before deciding on an approach. The approach to testing a system/product/application is not easy to decide, as many criteria are involved. A few of them are below:
- Schedule
- Budget constraints
- Availability of skilled testers who can adapt to any approach
- The complexity of the system/product/application
- Expectations of the revenue returned
Software testing interview questions will be based on the testing phases in any of these models. Here we go through common approaches that focus heavily on testing activities. All of the approaches below can be tweaked to a project's needs while keeping the main process intact.
15. Explain the testing phases in V-model
The V-model is the model where development and testing go hand-in-hand: every development activity has a relevant testing activity bound to it. The V-model has proved to be one of the approaches that prevents many bugs from entering the source code and gets many bugs fixed at an early stage of development. Below is how the model works:
- At the Requirements Study phase, testing activity to identify and design User Acceptance Test Scenarios / Cases are performed.
- At the System Design phase, testing activity to identify and design System Test Scenarios / Cases are performed
- At the Architecture / High-level design, testing activity to identify and design Integration Test Scenarios / Cases is performed
- At the Module / Low-level design phase, testing activity to identify and design Unit Test Scenarios / Cases is performed.
- Upon completion of implementation, each level's test scenarios/cases are executed in turn.
Test scenarios/cases designed at a particular stage are executed at their relevant phase.
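The pairing above can be expressed as a simple mapping from development phase to the test level designed alongside it; execution then runs back up the V, from unit tests to acceptance tests:

```python
# The V-model pairing described above: design phase -> test level
# whose scenarios/cases are designed at that phase.
V_MODEL = {
    "Requirements Study":               "User Acceptance Test",
    "System Design":                    "System Test",
    "Architecture / High-level design": "Integration Test",
    "Module / Low-level design":        "Unit Test",
}

# After implementation, execution proceeds in reverse design order:
# unit tests run first, acceptance tests last.
EXECUTION_ORDER = list(reversed(list(V_MODEL.values())))
```

Reversing the design order captures the shape of the V: the tests designed earliest (acceptance) are the ones executed last.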
16. Explain the Iterative model
The iterative model is the approach where newly developed features are added to the existing system. Once the new features are added, the entire system is tested, covering both existing functionality and the new features. Regression testing is performed extensively in this approach to maintain the quality of the whole system. The major goal is to ensure that all functionality that worked before adding a new feature still works correctly afterwards. It also involves modifications to existing features as per changing requirements (CRs).
Here each release is divided into iterations, and each iteration integrates new features into the existing system and modifies existing features. As the iterations increase, the testing effort also increases tremendously to ensure the quality of the whole system. Automation is preferred for regression of existing features, while new or modified features are tested manually.
17. Explain Agile Scrum
Agile Scrum is one of the most commonly followed approaches in today's IT industry. Customers who expect quick returns on their investments go for this approach. Release timelines are short, usually one release every 2-3 months, and each release has a few new features and/or modifications to existing features. The standard process varies from project to project, i.e., the same approach is followed in different ways in different projects. The team size is smaller than in other approaches, and the team is usually cross-skilled: business analysts, developers, testers, managers, and customers form a single team. Customer interaction is very high in Agile Scrum, and there is not much documentation. The common process goes as below:
- Each release is divided into sprints, typically of 2 weeks each. Within these 2 weeks, the below activities are performed.
- The business analyst collects requirements from customers, customers' customers, backlog issues, etc. Each requirement is logged in the test management tool as a Story, and its expectations are termed Acceptance Criteria.
- Each story is described to the entire team in JAD sessions. Developers and testers jointly discuss and decide which functionalities can be implemented and tested, respectively. The effort required from both sides is estimated at a very high level.
- Stories are selected and finalized for the sprints in which they will be taken up.
- Developers proceed with designing, implementing the features. Testers proceed with identifying the test scenarios for the features as per the acceptance criteria. Smoke test and regression test scenarios are documented and reviewed by peers, leads, business analysts, and/or customers.
- Once the build is ready for testing, testers perform the smoke test for each of the stories implemented and will report blockers, if any. Upon passing the smoke test, testers will start performing regression testing for stories and also the impacted features.
- Any issues found are logged into the bug management tool, and developers take quick action on them for fixing it.
- Upon completion of regression testing and bug verification for the fixed ones, testers will report the success % for each of the stories and will come up with the retrospective document for the sprint. This document will have what went well in the current sprint, what went bad, and what could have been better.
- Retrospective points are discussed within the team, and corrective measures are taken to enhance the process further.
- Each story is given points at the end of the sprint based on effort, success, failure, open bugs, and fixed bugs.
- Any unfixed bug will move to the next sprint.
- Along with these activities, a scrum meeting is held daily by Scrum master (usually the manager), where every team member has to update on their completed and current tasks, issues, risks, dependencies. Scrum master collects all the points from the entire team and coordinates with customers and/or team to resolve issues then and there.
This common process varies from project to project. Because documentation and tracking are light, this approach guarantees product quality only if it is followed strictly: the team has to take responsibility for the tasks being carried out and hold themselves accountable.