Table of Contents (Draft Version)


Define testing type per environment for the Weekly Release

Assumption: the weekly release will not include changes to automations (Flows, triggers). If it does, more robust testing should be considered: Happy Path, Alternative Flows, and Negative (-ve) scenarios.

Define the test script repository (Jira folder) for each testing type

Test Repository :: This will be our test case library, with the following limitation: the test repository is a project-level feature, so each test repository can only contain tests from its own project (we cannot mix tests from multiple projects into a single repository and organize them together).

All test cases we create end up in the Test Repository by default.


Test Sets :: A test set is a group of tests; we can consider it the equivalent of a single folder inside the test repository.

Test sets are more flexible than the test repository, in that a test set can contain tests from multiple projects, and a single test case can live in multiple test sets.

Under test sets we create all E2E tests, regression tests, and test release plans.


The following were agreed with Dina ::

Test Repository Decision:











Naming Convention :: Release Name :: Project Name_Test Plan (e.g. Almond_R75 :: Migration 2 Flows_Test Plan)

The Test Plan contains sub-folders with test cases that specifically target the specified testing type, e.g. INT_Regression, INT_Core Testing, and INT_iCARe Testing.











Naming Convention :: Test Execution for <Test Plan Name>, e.g. Test Execution for Almond_R75 :: Migration 2 Flows_Test Plan



The BAs will need to create or update test scripts and identify whether a test script needs to be executed for a certain release by adding or updating the Fix Version. This will allow the QA lead to create the Test Sets, Test Plan, and Test Execution.
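Once the BAs have tagged test scripts with a Fix Version, the QA lead can pull them back with a JQL search against Jira's standard search endpoint. A hedged sketch (the fix version below is a placeholder; only the query is built here, not the HTTP call):

```python
from urllib.parse import urlencode

FIX_VERSION = "Almond_R75"  # placeholder release name

# JQL: all Xray Test issues tagged with the release's Fix Version.
jql = f'issuetype = Test AND fixVersion = "{FIX_VERSION}"'

# Query string for Jira's standard search endpoint (/rest/api/2/search).
query = urlencode({"jql": jql, "fields": "summary,status"})
search_url = f"/rest/api/2/search?{query}"
print(search_url)
```

The same JQL can be pasted directly into Jira's issue search or used as a saved filter when building the Test Sets.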


Reporting :: Reports can be run from the Reports section of Xray, based on the Tests list, Test Set list, Test Plan list, or Test Execution list.




Define the testing process, including bug creation and fixing


Once a story is created on Jira, the testing process generally follows these steps:

Story Refinement Session: A story refinement session is a collaborative meeting where the development team, product owner, and stakeholders review and clarify user stories in the product backlog. The goal is to ensure that stories are well-understood, properly scoped, and ready for upcoming sprints. During the session, the team clarifies requirements, estimates effort, discusses technical challenges, reviews and refines acceptance criteria, and prioritizes the stories. The outcome is a set of clearly defined, estimated, and prioritized stories that are ready for development, helping streamline the subsequent sprint planning process.

Here are some good habits for writing stories on JIRA to ensure clarity, completeness, and efficiency in the development and testing processes:

  1. Clear, Concise Title

What: Keep the story title brief and to the point, summarizing the key action or outcome.

Why: A clear title makes it easy to identify stories in JIRA's backlog, boards, or reports.

Example: Instead of "Fix error," use "Fix payment gateway timeout error during checkout."

  2. User Story Format

What: Write stories from the perspective of the end user, following the “As a [user], I want [feature], so that [benefit]” format.

Why: This provides clear context and helps the development and testing teams understand the business value.

Example: “As a customer, I want to save my credit card details, so I can make faster payments in the future.”

  3. Acceptance Criteria

What: List out specific, measurable conditions that must be met for the story to be considered complete.

Why: Acceptance criteria guide development and testing, ensuring alignment on what "done" means.

Example:

User can add a new credit card.

The system encrypts credit card information.

Users are notified of a successful save.

  4. Attachments and Details

What: Attach relevant screenshots, wireframes, design mockups, or detailed specs.

Why: Visual aids and additional details help developers and testers understand the requirement better, reducing back-and-forth clarifications.

Example: Attach a UI wireframe of how the credit card form should look.

  5. Break Down Large Stories

What: If the story is too complex, break it down into smaller, manageable tasks or sub-tasks.

Why: Smaller stories make it easier to track progress, estimate effort, and ensure more frequent, smaller deployments.

Example: Instead of “Implement payment module,” break it down into “Implement credit card payments,” “Implement PayPal payments,” etc.


Test Case Design: Test Case Design involves QA engineers creating detailed test cases that outline how to validate a story's functionality. These test cases cover all possible scenarios, including both expected (positive) outcomes and potential errors or edge cases (negative scenarios). Each test case is carefully linked to the relevant user story in Jira, ensuring traceability and making it easy to track testing progress and results against specific requirements.

Naming Convention for the Test Description of a test case (this helps identify the test case related to the story under test)

Ex :: iCARe_Story Number :: Description of Test / Core_Story Number :: Description of Test

iCARe_CCCME-1234 :: Verify that the user successfully logged into iCARe application.

Core_CCCME-6789 :: Verify that the user successfully logged into Core application.



Test Execution: Once development is complete and the code is deployed to the testing environment (usually in the Integration Testing (INT) phase), the QA team executes the test cases. Any issues or bugs encountered are logged as separate Jira tickets; the story is marked as failed, and the bug is linked to the original story and assigned to the relevant developer.

Naming Convention for the Bug Description (this helps identify the environment in which the bug was raised).

Ex :: INT_iCARe :: Description of Bug/ INT_Core :: Description of the bug

INT_iCARe :: User unable to log into iCARe application.
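The bug description convention is the same shape as the test description one, with an environment prefix instead of a story key. A minimal sketch that also rejects environments not named in this process:

```python
VALID_ENVS = {"INT", "UAT"}  # environments where bugs are raised per this process

def bug_description(env: str, app: str, description: str) -> str:
    """Build a bug description as '<Env>_<App> :: <Description>'."""
    if env not in VALID_ENVS:
        raise ValueError(f"Unknown environment: {env}")
    return f"{env}_{app} :: {description}"

print(bug_description("INT", "iCARe", "User unable to log into iCARe application."))
# INT_iCARe :: User unable to log into iCARe application.
```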


PO Testing :: PO Testing is a phase after Integration Testing (INT) where the Product Owner (PO) personally verifies that the completed story meets the business requirements and acceptance criteria before it moves to User Acceptance Testing (UAT). During PO Testing, the PO reviews the functionality to ensure it aligns with the expected outcomes and overall product vision. If the PO identifies any issues, they can request changes or adjustments, and any bugs found are logged in Jira for the development team to address. Once the PO is satisfied with the implementation, the story is typically approved to proceed to UAT.

Bug Fixing and Retesting: Developers work on fixing any reported bugs, updating the status of the bug tickets in Jira (e.g., from "Open" to "In Progress," and then "Resolved/ Done"). The QA team then retests the fixes and updates the status of the bug tickets accordingly.

Once the bug fix has passed and the bug is closed, the QA team retests the associated story and updates its status accordingly.

User Acceptance Testing (UAT): User Acceptance Testing (UAT) is the final testing phase, where the story is validated by business users or stakeholders after it has passed Integration Testing (INT). In UAT, these users ensure that the implementation meets the defined acceptance criteria and aligns with business requirements. If any issues or bugs are discovered during this phase, they are logged in Jira, where they are tracked, updated, and reassigned to developers for resolution. The process continues until the users are satisfied that the system functions correctly and meets their needs, signalling that the story is ready for production.

Note :: Bugs raised in UAT should also be tested in lower environments, i.e. INT.

Naming Convention for the Bug Description (this helps identify the environment in which the bug was raised).

Ex :: UAT_iCARe :: Description of Bug/ UAT_Core :: Description of the bug

UAT_Core :: User unable to log into Core application.


Jira Bug Status and Associated Roles


Bug Priority :: Low, Medium, High, Critical


Flow Diagram ::



Story Closure: Marks the final step in the development process, where a user story is formally completed and closed in Jira. This occurs after the story has successfully passed all testing phases, including Integration Testing (INT), Product Owner (PO) Testing, and User Acceptance Testing (UAT). The QA team or product owner verifies that all acceptance criteria have been met, all associated bugs have been resolved, and the functionality works as intended in the production environment. The Jira story's status is then updated to "Done" or "Closed," signalling that no further work is required and the feature is ready for release. Story closure also involves ensuring that any related documentation is updated and that the story is archived for future reference.


Define Guidelines on how to report a Bug


Guidelines for Reporting a Bug

Reporting a bug effectively is crucial for developers and QA teams to understand and resolve issues quickly. Below are comprehensive guidelines on how to report a bug:

  1. Title

Be Specific and Concise: Summarize the issue in a short, clear sentence.

Include Key Details: Mention the area or feature affected, and the nature of the problem (e.g., "On iCARe error when selecting Products").

  2. Description

Detailed Overview: Provide a brief description of the bug, including the expected behaviour and the actual behaviour observed.

Context: Mention where and how often the bug occurs, and whether it’s reproducible or intermittent.

  3. Steps to Reproduce

Clear Steps: Outline the exact steps needed to reproduce the bug.

Numbered List: Use a numbered list to make it easy to follow the steps.

Initial Conditions: Include any necessary conditions or prerequisites (e.g., "User must be logged in," "Browser cache cleared").

  4. Expected vs. Actual Results

Expected Result: Describe what should happen when the steps are followed.

Actual Result: Describe what actually happens, emphasizing the difference from the expected outcome.

  5. Environment / Sprint / Release Details

System Information: Include details about the environment, sprint, and release where the bug was encountered (e.g. INT, UAT, Prod).

  6. Attachments

Screenshots: Attach screenshots showing the issue, highlighting the problematic area if necessary.

Video Clips: If a screenshot isn’t enough, include a short video clip showing the steps leading to the bug and the bug itself.

Logs and Error Messages: Provide any relevant logs or error messages, either copied as text or attached as files.

Test Data: Include any specific test data used to reproduce the bug.

  7. Severity and Priority

Severity: Indicate the severity of the bug (e.g., Critical, Major, Minor). This assesses the impact on functionality.

Priority: Suggest the priority level (e.g., High, Medium, Low), indicating how soon it should be addressed.

  8. Additional Information

Related Issues: Reference any related issues or bugs that might be connected to the reported problem.

Workarounds: Mention if there’s a known workaround that can temporarily mitigate the issue.

  9. Review Before Submission

Double-Check Details: Ensure all information is accurate, complete, and clearly presented.

Clarity: Make sure the report is easy to understand, avoiding jargon or abbreviations unless widely known.

Example of Bug ::

Title :: INT_iCARe Migration Flows :: System generating an error message when trying to add a Relationship to an account.

Description

  1. Log into iCARe as Admin

  2. Create a Contact ABC

  3. Select an account which has no Account Origin and which is not linked to the above created contact

  4. Navigate to Business Tab

  5. Click on Add Relationship --> Under Contact select the above created Contact ABC

  6. Select Send Portal Invitation check box and click on Save

Expected Result :: The record should be saved.

Actual Result :: The system generates the below-attached error message.

Attachments ::





Priority : High

Environment : INT

Sprint : Sprint 123

Release Details / Fix Version: Samosa R76
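The example bug above maps naturally onto the fields of a Jira create-issue request. A minimal sketch of the payload shape, assuming the project key and issue type name are illustrative placeholders (the summary, priority, and fix version come from the example):

```python
# Sketch of a Jira create-issue payload for the example bug above.
# The project key "CCCME" and issue type name are assumed placeholders.
bug_payload = {
    "fields": {
        "project": {"key": "CCCME"},
        "issuetype": {"name": "Bug"},
        "summary": ("INT_iCARe Migration Flows :: System generating an error "
                    "message when trying to add a Relationship to an account."),
        "description": (
            "Steps to reproduce:\n"
            "1. Log into iCARe as Admin\n"
            "2. Create a Contact ABC\n"
            "3. Select an account with no Account Origin, not linked to Contact ABC\n"
            "4. Navigate to Business Tab\n"
            "5. Add Relationship -> select the created Contact ABC\n"
            "6. Tick 'Send Portal Invitation' and click Save\n\n"
            "Expected: the record is saved.\n"
            "Actual: the system generates the attached error message."
        ),
        "priority": {"name": "High"},
        "fixVersions": [{"name": "Samosa R76"}],
    }
}
```

Structured payloads like this keep the naming convention in the summary and the environment/release details in the standard Jira fields, so reports and filters can find the bug later.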