0% found this document useful (0 votes)
25 views44 pages

Sta Lab Record

The document is a practical record notebook for B.E. Computer Science and Engineering students at Indira Institute of Engineering & Technology, focusing on Software Testing and Automation for the academic year 2024-2025. It includes a certificate of authenticity, an index of experiments, and detailed procedures for developing test plans, designing test cases, and reporting defects for an e-commerce application. The document outlines various testing methodologies, roles, responsibilities, and a defect tracking system, ensuring comprehensive testing practices.

Uploaded by

yaminishrvani
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
25 views44 pages

Sta Lab Record

The document is a practical record notebook for B.E. Computer Science and Engineering students at Indira Institute of Engineering & Technology, focusing on Software Testing and Automation for the academic year 2024-2025. It includes a certificate of authenticity, an index of experiments, and detailed procedures for developing test plans, designing test cases, and reporting defects for an e-commerce application. The document outlines various testing methodologies, roles, responsibilities, and a defect tracking system, ensuring comprehensive testing practices.

Uploaded by

yaminishrvani
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 44

INDIRA INSTITUTE OF ENGINEERING &TECHNOLOGY

PANDUR, THIRUVALLUR

PRACTICAL RECORD NOTE BOOK

ANNA UNIVERSITY CHENNAI

B.E. COMPUTER SCIENCE AND ENGINEERING

V SEMESTER/ III-YEAR

CCS366-SOFTWARE TESTING AND AUTOMATION

2024-2025
INDIRA INSTITUTE OF ENGINEERING &TECHNOLOGY
Pandur, Thiruvallur – 631 203

University Register No:

CERTIFICATE

This is to certify that this is a bonafide record of work done by

Of 5TH S Semester B.E. (CSE) in the CCS366-Software

Testing And Automation during the academic year 2024-2025.

FACULTY-IN-CHARGE HEAD OF THE DEPARTMENT

Submitted for the University Practical Examination held on

Internal Examiner External Examiner


INDEX
PAGE SIGN
EXP. DATE EXPERIMENT NAME
NO
NO
DEVELOP THE TEST PLAN FOR TESTING AN
E-COMMERCE WEB/MOBILE APPLICATION
(www.amazon.com)
DESIGN THE TEST CASES FOR TESTING THE
E-COMMERCE APPLICATION
TEST THE E-COMMERCE APPLICATION AND
REPORT THE DEFECTS IN IT
DEVELOP THE TEST PLAN AND DESIGN THE
TEST CASES FOR AN INVENTORY CONTROL
SYSTEM
EXECUTE THE TEST CASES AGAINST A
CLIENT SERVER OR DESKTOP APPLICATION
AND IDENTIFY THE DEFECTS
TEST THE PERFORMANCE OF THE E-
COMMERCE APPLICATION
AUTOMATE THE TESTING OF E-COMMERCE
APPLICATIONS USING SELENIUM
INTEGRATE TestNG WITH THE ABOVE TEST
AUTOMATION
BUILD A DATA-DRIVEN FRAMEWORK
USING SELENIUM AND TestNG
BUILD PAGE OBJECT MODEL USING
SELENIUM AND TestNG
BUILD BDD FRAMEWORK WITH SELENIUM,
TestNG AND CUCUMBER
EX. NO: 1 DEVELOP THE TEST PLAN FOR TESTING AN E-
COMMERCE WEB/MOBILE APPLICATION
DATE:
(www.amazon.com)

AIM
Develop the Test Plan for Testing an E-Commerece Web/Mobile Application
(Www.Amazon.Com)

PROCEDURE:

1. Objective
 The objective is to validate that the Amazon e commerce website functions as
expected across all features, including product search, shopping cart, order
processing, payment systems, and user accounts.
 The goal is to identify any bugs or usability issues that could affect the user
experience or business operations.
2. Scope
 In Scope: Product browsing, filtering, adding items to the cart, checkout process,
payment gateways, order confirmation, user account creation, and customer service
interactions.
 Out of Scope: Backend inventory management, internal Amazon warehouse
systems, and third party logistics systems.
3. Test Methodology
 Functional Testing: Verify that the features work as expected.
 Usability Testing: Ensure the site is user friendly and intuitive.
 Performance Testing: Check that the website performs well under different loads.
 Security Testing: Protect customer data and transactions from vulnerabilities.
 Cross browser Testing: Ensure the website works consistently across different
browsers and devices.
4. Approach
 Test scenarios will be created based on user stories and requirements.
 Test cases will be written for each scenario, and manual and automated tests will
be executed.
 Regression testing will be performed whenever there are changes or new features
added to the system.
5. Assumptions
 Users will have stable internet connections.
 Payment gateways will have real time connectivity with banks.
 Browsers used will be up to date with JavaScript enabled.
6. Risks
 Unavailability of test environments.
 Delays in receiving updates from third party payment services.
 High traffic during major sales events could impact performance testing.
7. Mitigation Plan or Contingency Plan
 If test environments are unavailable, prioritize testing on the production replica.
 Work with payment service providers to set up a mock testing environment.
 Schedule performance testing during non peak hours to reduce server load.
8. Roles and Responsibilities
 Test Lead: Oversees the testing activities, assigns tasks, and manages timelines.
 Test Engineers : Create and execute test cases, report defects.
 Automation Engineers : Develop and maintain automated test scripts.
 Product Owners : Clarify requirements and validate acceptance criteria.
9. Schedule
 Week 1-2 : Requirement analysis and test planning.
 Week 3-4 : Test case creation and review.
 Week 5-6 : Test execution (manual and automation).
 Week 7 : Bug fixes and retesting.
 Week 8 : Final regression testing and sign off.
10. Defect Tracking
 Defects will be logged and tracked using a tool like JIRA.
 Each defect will be assigned a severity level and status, including Open, In
Progress, Resolved, and Closed.
 Weekly review meetings will be held to discuss and prioritize the defects.
11. Test Environment
 Test environments will replicate the production setup with real time product
catalogs.
 Environments include devices like desktops, smartphones, and tablets to test
responsiveness.
12. Entry and Exit Criteria
 Entry Criteria : All test cases are created, test environments are set up, test data is
ready.
 Exit Criteria : All critical defects are resolved, and all planned test cases have been
executed successfully.
13. Test Automation
 Automate regression test cases for critical features like login, search, checkout, and
order tracking.
 Use tools like Selenium or TestNG for automated testing on multiple browsers.
14. Effort Estimation
 Manual Testing: 300 hours (Functional + Regression).
 Automation Testing: 200 hours (Script Development + Execution).
 Performance Testing: 100 hours.
15. Test Deliverables
 Test Plan document.
 Test Cases and Test Scripts.
 Test Summary Report.
 Defect Report and Root Cause Analysis.
 Final Test Sign off document.
16. Template Summary
 Objective : To ensure that Amazon's e commerce platform is user friendly, secure,
and meets business requirements.
 Scope : Includes all front end functionalities and excludes back end logistics.
 Methodology : Functional, usability, performance, and security testing.
 Approach : Combination of manual and automated testing.
 Risks and Mitigation : Address test environment and third party integration issues.
 Roles and Responsibilities : Clearly defined tasks for the testing team.
 Defect Tracking : Organized defect tracking using tools like JIRA.
 Automation : Focus on automating regression and repetitive tests.

Result:
Thus the Test Plan for Testing an E-Commerece Web/Mobile Application
(Www.Amazon.Com) is created successfully
EX. NO: 2 DESIGN THE TEST CASES FOR TESTING THE E-
COMMERCE APPLICATION
DATE:

AIM:
To design the test cases for testing the e-commerce application.
PROCEDURE:
 The user should be able to navigate to all the pages in the website
 There should be a fallback page for any page load errors
 Verify that all the links and banners work properly
 Search results should be displayed with the most relevant item being shown
first
 All data related to the product – title, price, images, and description are all
visible clearly
 Maintain a session for each user and test verify the session times out after a
while
TEST CASES

Test Cases for Home Page Functionality

Test Description Input Expected Output


Case ID
TC_01 Verify if the home page is Valid login credentials. Home page is displayed.
displayed after a successful
login.
TC_02 Verify if the user name is Logged-in user session. User name is displayed
displayed on the correctly on the homepage.
homepage.
TC_03 Verify if the home page Access home page on Home page is displayed
renders properly on different browsers. properly on all tested
different browsers. browsers.
TC_04 Verify if products are Access home page. Products are displayed on
displayed on the the homepage.
homepage.
TC_05 Verify if search Access home page. Search bar is visible on the
functionality is available homepage.
on the homepage.
TC_06 Verify if products on the Click on any product Product page is opened.
homepage are clickable. on the homepage.
TC_07 Verify if the alignment on View homepage layout. All elements are aligned
the homepage is correct. properly as per design
specifications.
TC_08 Verify if products are Access home page. Products are displayed in
displayed in categories on their respective categories.
the homepage.
TC_09 Verify if the user profile is Access home page User profile section is
available on the homepage. while logged in. visible on the homepage.

Test Cases for Search Functionality

Test Description Input Expected Output


Case
ID
TC_01 Verify if the search box is Access home page or Search box is visible.
available. search page.
TC_02 Verify if the length of the Enter a long query in The query length is
search query is as per the search box. restricted to the specified
specification. limit.
TC_03 Verify if the search results Enter a search query. Results displayed are
match the search query. relevant to the query.
TC_04 Verify if the user can Enter a query by Relevant results are
search by product name, name/brand/spec. displayed.
brand name, or
specification.
TC_05 Verify if the search button Click the search button Search results page opens
works correctly. after entering a query. with relevant results.
TC_06 Verify if the search field Enter various character Search field accepts all
accepts alphabets, types in the search field. valid character types.
numbers, and symbols.
TC_07 Verify if the sorting option Access search results Sorting options are
is available on the search page. displayed on the search
results page. results page.
TC_08 Verify if the number of View search results The number of search
search results per page is page. results per page is
displayed. displayed as per
specifications.
Test Cases for Product Detail Page

Test Description Input Expected Output


Case ID
TC_01 Verify if product images are Access a Product images are
displayed correctly. product detail displayed without distortion.
page.
TC_02 Verify if the price of the product Access a Product price is visible on
is displayed. product detail the page.
page.
TC_03 Verify if the product Access a Product specifications are
specifications are displayed. product detail listed clearly.
page.
TC_04 Verify if product reviews are Access a Reviews section is
displayed. product detail displayed with user reviews.
page.
TC_05 Verify if stock status (In Access a Stock status is shown
Stock/Out of Stock) is product detail accurately.
displayed. page.
TC_06 Verify if shipping information is Access a Shipping information is
displayed. product detail visible.
page.
TC_07 Verify if payment options are Access a Payment options are listed
displayed. product detail on the page.
page.
TC_08 Verify if related products are Access a Related products section is
displayed. product detail visible.
page.
Test Cases for Cart Functionality

Test Description Input Expected Output


Case ID
TC_01 Verify if the "Add to Cart" Click on "Add to Button is clickable, and
button is clickable. Cart" button for a product is added to the
product. cart.
TC_02 Verify if products move to the Click "Add to Cart" Product is added to the
cart after clicking "Add to for a product. cart successfully.
Cart".
TC_03 Verify if the number of items Add multiple items to Cart counter increases
in the cart increases. the cart. accurately.
TC_04 Verify if the user can continue Add items to cart and User can continue
shopping after adding items to browse products. shopping without issues.
the cart.
TC_05 Verify if the total amount of Add items to the cart.Total cost of items is
items is displayed. calculated and displayed.
TC_06 Verify if tax information is View the cart page. Tax details are shown
displayed. clearly.
TC_07 Verify if shipping charges are Enter shipping Shipping charges update
displayed as per location. location in the cart. based on location.
TC_08 Verify the maximum number Add maximum System restricts
of quantities allowed. allowed quantity of a quantities beyond the
product. allowed limit.
TC_09 Verify if products can be Remove an item from Product is successfully
removed from the cart. the cart. removed from the cart.
TC_10 Verify if multiple products can Select and remove All selected items are
be removed from the cart. multiple items. removed.
TC_11 Verify if the user can apply Enter a valid coupon Coupon is applied, and
coupon codes. code. discount is reflected.

RESULT:
Test cases for testing the e-commerce application has been successfully
created.
EX. NO: 3 TEST THE E-COMMERCE APPLICATION AND REPORT
THE DEFECTS IN IT
DATE:

AIM:
To test the e-commerce application and report the defects in it.
PROCEDURE:
Testing Scenarios:
1. User Registration:
 Create a new account with valid and invalid email formats.
 Test the password strength requirements.
 Check for proper error messages when registration fails.
2. Product Browsing:
 Browse different product categories.
 Verify that product details (price, description, images) are displayed
correctly.
 Check for any broken or missing images.
3. Adding Items to Cart:
 Add items to the cart and verify that the cart updates accurately.
 Test adding items with different quantities.
4. Cart Functionality:
 Update item quantities in the cart and ensure the cart total updates
accordingly.
 Remove items from the cart and verify that the cart updates properly.
5. Checkout Process:
 Proceed through the checkout process as both a registered and guest
user.
 Test different payment options (credit card, PayPal, etc.).
 Ensure that tax, shipping, and discounts are calculated correctly.
6. User Account Management:
 Test updating account information (name, email, password).
 Check for proper validation when changing account information.
 Verify that users can log in and log out successfully.
7. Search Functionality:
 Search for products using different keywords.
 Check that the search results are relevant and displayed correctly.
8. Responsive Design:
 Test the application on various devices and screen sizes to ensure
responsive design.
9. Security Testing:
 Test for SQL injection and other common security vulnerabilities.

Defect Report

Defect Module Description Severity Priority Steps to Expected


ID Reproduce Behavior
UR_01 User Application High High 1. Go to the Registration
Registration accepts registration should fail with
invalid email page. an appropriate
formats 2. Enter an error message
during invalid email when the email
registration. format (e.g., format is
"useremail.com invalid.
").
3. Submit the
form.
UR_02 User Weak High Medium 1. Enter a Application
Registration password password like should enforce
accepted "12345" during password
without registration. strength
showing any 2. Submit the requirements
warning or form. (e.g., minimum
error. length, special
characters).
PB_01 Product Some Medium Medium 1. Browse a All product
Browsing product product images should
images are category. load correctly
missing or 2. Check the without errors.
broken. images for each
product in the
category.
CART_ Cart Removing Medium High 1. Add two The total price
02 Functionalit items from items to the should update
y the cart does cart. immediately
not update 2. Remove one after removing
the cart total. item. an item.
3. Check the
cart total.
CHECK Checkout Tax High High 1. Proceed to Tax should be
_01 Process calculation is checkout. calculated
incorrect for 2. Select a correctly based
some shipping on the user's
locations. address in a location.
taxable region.
3. Verify the tax
calculation.
CHECK Checkout Guest users Critical Critical 1. Add items to Guest users
_02 Process are unable to the cart. should be able
complete the 2. Proceed to to checkout
checkout checkout without
process due without logging requiring
to a missing in. account
"Guest 3. Notice that creation.
Checkout" no guest option
option. is visible.
SEARC Search Irrelevant Medium Medium 1. Enter a Search results
H_01 Functionalit results keyword (e.g., should be
y displayed for "Phone"). relevant to the
specific 2. Check the entered
search relevance of keyword.
keywords. displayed
results.
RESP_0 Responsive Application High Medium 1. Open the Application
1 Design layout application on a should render
breaks on mobile device correctly on all
smaller with a 320px devices and
screens screen width. screen sizes.
(320px 2. Check for
width). layout issues.
SEC_02 Security Passwords Critical Critical 1. Register or Passwords
Testing are stored in log in with a should be
plaintext and user account. encrypted and
visible in 2. Intercept the not visible in
network request using a network
requests. tool like requests or
Postman. stored in
plaintext in the
database.

RESULT:
The defects in the e-commerce application is reported successfully
EX. NO: 4 DEVELOP THE TEST PLAN AND DESIGN THE TEST CASES
FOR AN INVENTORY CONTROL SYSTEM
DATE:

AIM:
To develop the test plan and design the test cases for an inventory control
system
Test Plan for an Inventory Control System:
1. Introduction
The Inventory Control System is designed to efficiently manage inventory,
track stock levels, and ensure accurate stock information. This test plan outlines the
testing approach, scope, objectives, and methodologies to verify the functionality,
performance, and usability of the system.
2. Objectives
The primary objectives of this test plan are to:
 Validate that the Inventory Control System meets functional requirements.
 Verify the accuracy and consistency of inventory tracking.
 Assess system performance under varying loads.
 Ensure the system's security mechanisms are effective.
 Evaluate the system's usability and user-friendliness.
3. Test Scope
The testing will cover the following aspects of the Inventory Control System:
 User authentication and authorization.
 Adding, updating, and deleting products in the inventory.
 Monitoring stock levels and receiving alerts for low stock.
 Generating various inventory reports.
 User interfaces for different user roles (admin, manager, employee).
4. Test Environment
 Hardware: Servers, workstations, mobile devices for testing on different
platforms.
 Software: Inventory Control System build, web browsers, databases.
 Network: LAN/WAN with various network conditions (latency, bandwidth).
 Test Data: Realistic inventory data and user scenarios for testing.
5. Test Data
 Test data will include a range of products, quantities, prices, and suppliers.
 Both normal and boundary test cases will be created to cover various scenarios.
 Data will be structured to validate different calculations and reports.
6. Types of Testing
 Unit Testing: Testing individual modules and functions.
 Integration Testing: Testing interactions between different system components.
 Functional Testing: Validating functional requirements.
 Performance Testing: Evaluating system response times and resource usage.
 Security Testing: Checking for vulnerabilities and unauthorized access.
 Usability Testing: Assessing user-friendliness and navigation.
7. Defect Management
 Defects will be logged using a defect tracking tool.
 Defects will be classified by severity (critical, major, minor) and priority.
 Defects will be retested after resolution.
8. Test Schedule
Testing Phases and Estimated Timeline:
 Unit Testing: 1 week
 Integration Testing: 2 weeks
 Functional Testing: 2 weeks
 Performance Testing: 1 week
 Security Testing: 1 week
 Usability Testing: 1 week
9. Risk Assessment
 Identified Risks: Data loss, system downtime, security breaches.
 Mitigation Strategies: Regular data backups, redundancy, security protocols.
10. Test Deliverables
The following deliverables will be produced during the testing process:
 Test cases document.
 Test execution results and defect reports.
 Performance testing results and analysis.
 Usability testing observations and feedback.
12. Exit Criteria
 All high-priority defects are resolved.
 Key performance indicators meet predefined targets.
 All test cases are executed and passed.

TEST CASES FOR INVENTORY MANAGEMENT


Sr. Test Action Steps Input Data Expected Actual Result Status
No Case Result
ID
1 TC-1 Invoice Enter Input 1021 It should accept Invoice no is Pass
no invoice invoice no. accepted
no
2 TC-2 Bill date Enter Input It should accept Bill date is Pass
bill date 27/11/2021 bill date accepted
3 TC-3 Item Select Item name Item name is Pass
name item should be reflecting
name automatically automatically
reflected
4 TC-4 Available Click on It should reflect Item stock is Pass
item stock textbox item stock reflecting
automatically automatically
5 TC-5 Quantity Enter Input 5000 Item quantity Item quantity Pass
item should be is accepted
quantity accepted
6 TC-6 Price Click on Price should be Price is Pass
textbox reflected reflecting
automatically automatically
7 TC-7 Total Click on Total should be Total is Pass
textbox reflected reflecting
automatically automatically
8 TC-8 Receive Enter Input Receive bill Receive bill Pass
bill date receive 29/1/2021 date should be date is
bill date accepted accepted
9 TC-9 Add item Click on Item should be Item is Pass
add item added and reflecting in
reflected in database
database

RESULT:
Developing the test plan and designing the test cases for an inventory control
system was been successfully created.
EX. NO: 5 EXECUTE THE TEST CASES AGAINST A CLIENT SERVER
OR DESKTOP APPLICATION AND IDENTIFY THE
DATE:
DEFECTS

AIM:
To execute the test cases against a client server or desktop application and
identify the defects.
ID Action Steps Input Expected Actual Status Defect
Data Result Result Details
TC-1 Login Enter valid Userna Login Login Pass -
Validation username/ me: should succeeded
password admin succeed
Passwor with valid
d: credential
pass123 s
TC-2 Login Enter Userna Login Login Pass -
Validation invalid me: should failed
username/ invalid fail with with
password Passwor an error error:
d: message "Invalid
wrong1 Credential
23 s"
TC-3 Invoice Enter Invoice Invoice Save Fail Defect
Creation invoice No: should be button is ID:
details 1234 saved unrespons DEF-
Date: successful ive 001
25/11/2 ly Severity
024 : Major
Save
button
not
function
ing
TC-4 Stock Enter item Item: Stock Stock Fail Defect
Check details Laptop details details not ID:
should be displayed DEF-
displayed 002
automatic Severity
ally : Critical
Stock
check
function
ality
broken
TC-5 Quantity Enter Quantit Quantity Quantity Pass -
Input quantity for y: 5000 should be is
an item accepted accepted
TC-6 Price Select item Item: Total Price Fail Defect
Calculatio and quantity Laptop price calculatio ID:
n Quantit should be n DEF-
y: 5 calculated incorrect 003
automatic Severity
ally : Major
Incorrec
t price
calculati
on logic
TC-7 Add Item Add an item Item: Item Item is Pass -
to Cart to the cart Laptop should be added
Quantit added to successful
y: 2 the cart ly
successful
ly

RESULT:
Thus executing the test cases against a client server or desktop application
and identification of defects is done
EX. NO: 6 TEST THE PERFORMANCE OF THE E-COMMERCE
APPLICATION
DATE:

AIM:
To test the performance of the e-commerce application.
Testing the Performance of an E-Commerce Application
1. Define Performance Requirements
Clearly outline acceptable performance criteria:
 Response Times: Define acceptable time limits for key actions (e.g., homepage
load, search results, checkout).
 Concurrent Users: Specify the maximum number of users the system should
handle simultaneously without degradation.
 Other Metrics: Include throughput, error rate, CPU/memory usage, and
database response times.
2. Identify Performance Testing Scenarios
Determine critical areas for testing, such as:
 Load Testing: Evaluate the system’s ability to handle expected user loads under
normal conditions.
 Stress Testing: Identify breaking points by testing with loads beyond expected
capacity.
 Scalability Testing: Assess how the system performs when resources (e.g.,
servers, memory) are added.
 Endurance Testing: Test system stability under a sustained load for an extended
period.
 Peak Load Testing: Simulate maximum user load during sales or high traffic
events.
 Response Time Testing: Measure the time taken for various transactions (e.g.,
login, search, checkout).
 Concurrency Testing: Analyze the system’s ability to handle multiple
simultaneous user sessions.
 Database Performance Testing: Evaluate the performance of database queries,
updates, and transactions.
 Checkout Process Testing: Test the speed and reliability of the checkout
process.
3. Select Performance Testing Tools
Use suitable performance testing tools, such as:
 Apache JMeter: For load and stress testing.
 LoadRunner: Comprehensive performance testing with scalability insights.
 Gatling: Lightweight and efficient tool for concurrency testing.
 Google Lighthouse: To measure page load times and other frontend metrics.
4. Prepare Test Data
Simulate realistic test data, such as:
 User profiles (regular users, admins).
 Products and categories.
 Transactions, orders, and payments.
5. Create Test Scripts
Develop automated scripts to simulate user activities:
 Browsing products.
 Searching for items.
 Adding items to the cart.
 Completing purchases.
6. Execute Performance Tests
Run the test cases defined in your scenarios using the selected tools.
 Monitor key metrics:
 Response times.
 Resource utilization (CPU, memory, disk I/O).
 Error rates.
7. Analyze Test Results
Identify bottlenecks and performance issues by analyzing test results.
Focus on:
 Response times exceeding acceptable limits.
 High error rates or failed transactions.
 System resource consumption during peak loads.
8. Optimize and Retest
 Work with developers and system administrators to address bottlenecks.
 Optimize the application and repeat tests to verify improvements.
9. Reporting
 Generate detailed reports, including:
 Test metrics (response times, error rates).
 System bottlenecks.
 Recommendations for performance improvements.
10. Monitoring in Production
 Use monitoring tools like New Relic, Datadog, or Dynatrace to continuously
assess application performance post-deployment.
 Set alerts for key metrics (e.g., load times, error rates) to catch issues in real-
time.

Performance Test Cases for E-Commerce Application

ID Scenario Steps Input Expected Actual Status Defect Details


Data Result Result
PT-01 Homepage 1. N/A Homepage Homepage Fail Defect ID:
Load Time Navigate should load loaded in 3 DEF-001
to the in ≤ 2 seconds. Severity:
homepage. seconds. Moderate
2. Measure Slow
the time homepage
taken to load time.
load fully.
PT-02 Search 1. Search Search Search Results Fail Defect ID:
Function for an item Query: results loaded in DEF-002
Response ("Laptop"). "Laptop" should load 1.5 seconds. Severity:
2. Measure in ≤ 1 Moderate
response second. Search
time for function is
results. slower than
expected.
PT-03 Add to Cart 1. Select a Product: Add to Cart Item added Pass -
Performance product. Laptop operation in 1.8
2. Click should take seconds.
"Add to ≤2
Cart" and seconds.
measure
response
time.
PT-04 Checkout 1. Add Product: Entire Checkout Fail Defect ID:
Process items to Laptop checkout completed DEF-003
Time cart. Payment process in 6 Severity:
2. Proceed Method: should seconds. Major
to Card complete in Checkout
checkout. process is
3. ≤5 slower than
Complete seconds. expected.
payment.
PT-05 Load 1. Simulate 500 Users
Application System Pass -
Testing 500 Simulated
should performance
concurrent handle the is normal.
users load without
accessing any errors
the or
website. slowdowns.
PT-06 Stress 1. Simulate 2,000 System Application Fail Defect ID:
Testing 2,000 Users should crashed at DEF-004
concurrent Simulated degrade 1,800 users. Severity:
users gracefully Critical
accessing or provide System
the an error crashes under
website. message stress load.
without
crashing.
PT-07 Scalability 1. Increase 1,000 Application Application Pass -
Testing server Users performance performance
resources. Simulated, should improved.
2. Simulate Resources improve,
1,000 Scaled and no
concurrent errors
users. should
occur.

RESULT:
Testing the performance of the e-commerce application has been done and the
errors has been executed successfully.
EX. NO: 7 AUTOMATE THE TESTING OF E-COMMERCE
APPLICATIONS USING SELENIUM
DATE:

AIM:
To automate the testing of e-commerce applications using selenium
PROCEDURE:
 Set up the WebDriver with the correct browser driver (ChromeDriver in this
case) and launch the browser.
 Open the e-commerce website by navigating to its URL.
 Locate Elements such as username, password and login button by its id
 Use WebDriverWait to ensure that the username, password input fields, and
login button are visible and interactable.
 Check the fields if it was working correctly.
PROGRAM:
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;
import java.time.Duration;
public class ECommerceTest {
public static void main(String[] args) {
// Set the path to your ChromeDriver executable (preferably set the path in
system environment variable)
System.setProperty("webdriver.chrome.driver", "path/to/chromedriver");
// Initialize WebDriver
WebDriver driver = new ChromeDriver();
try {
// Test Case 1: Navigate to login page
driver.get("url_of_your_ecommerce_application");
// Define the locators
WebElement usernameInput = driver.findElement(By.id("username"));
WebElement passwordInput = driver.findElement(By.id("password"));
WebElement loginButton = driver.findElement(By.id("loginButton"));
// Ensure elements are visible
WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(10));
wait.until(ExpectedConditions.visibilityOf(usernameInput));
wait.until(ExpectedConditions.visibilityOf(passwordInput));
wait.until(ExpectedConditions.elementToBeClickable(loginButton));
// Test Case 2: Verify fields are displayed
if (usernameInput.isDisplayed()) {
System.out.println("Test Case 1 Passed: Username field is displayed");
} else {
System.out.println("Test Case 1 Failed: Username field is not displayed");
}
if (passwordInput.isDisplayed()) {
System.out.println("Test Case 2 Passed: Password field is displayed");
} else {
System.out.println("Test Case 2 Failed: Password field is not displayed");
}
// Test Case 3: Enter valid credentials and click login
usernameInput.sendKeys("valid_username");
passwordInput.sendKeys("valid_password");
loginButton.click();
// Test Case 4: Check if login was successful (assuming successful login
redirects to a dashboard)
WebElement dashboard =
wait.until(ExpectedConditions.presenceOfElementLocated(By.id("dashboard")));
if (dashboard.isDisplayed()) {
System.out.println("Test Case 3 Passed: Login Successful, Dashboard is
displayed.");
} else {
System.out.println("Test Case 3 Failed: Login was unsuccessful.");
}
// Add more test cases and actions here...
} catch (Exception e) {
System.out.println("An error occurred: " + e.getMessage());
} finally {
// Close the browser
driver.quit();
}
}
}

OUTPUT:
Console Output (Success Case):
Test Case 1 Passed: Username field is displayed
Test Case 2 Passed: Password field is displayed
Test Case 3 Passed: Login Successful, Dashboard is displayed.

RESULT:
Automating the testing of e-commerce applications using selenium is
successfully executed.
EX. NO: 8 INTEGRATE TestNG WITH THE ABOVE TEST
AUTOMATION
DATE:

AIM:
To integrate TestNG with the above test automation
PROCEDURE:
 Initialize WebDriver and TestNG Setup
 Use driver.get(url) to navigate to the e-commerce login page.
 Create test cases to check if the module is working.
 Close the browser using driver.quit() to clean up after the test.
 The TestNG framework will manage the execution order of the test methods
based on the priorities defined (@Test(priority))
PROGRAM:
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;
import java.time.Duration;
public class ECommerceLoginTest {
WebDriver driver;
WebDriverWait wait;
// Set the path to your ChromeDriver executable (preferably set the path in the
system environment variable)
String driverPath = "path/to/chromedriver";
String url = "url_of_your_ecommerce_application";
@BeforeMethod
public void setUp() {
// Set up ChromeDriver
System.setProperty("webdriver.chrome.driver", driverPath);
driver = new ChromeDriver();
wait = new WebDriverWait(driver, Duration.ofSeconds(10));
// Open the website
driver.get(url);
}
@Test(priority = 1)
public void verifyUsernameField() {
WebElement usernameInput = driver.findElement(By.id("username"));
Assert.assertTrue(usernameInput.isDisplayed(), "Username field is not
displayed.");
}
@Test(priority = 2)
public void verifyPasswordField() {
WebElement passwordInput = driver.findElement(By.id("password"));
Assert.assertTrue(passwordInput.isDisplayed(), "Password field is not
displayed.");
}
@Test(priority = 3)
public void performLogin() {
WebElement usernameInput = driver.findElement(By.id("username"));
WebElement passwordInput = driver.findElement(By.id("password"));
WebElement loginButton = driver.findElement(By.id("loginButton"));
usernameInput.sendKeys("valid_username");
passwordInput.sendKeys("valid_password");
loginButton.click();
WebElement dashboard =
wait.until(ExpectedConditions.presenceOfElementLocated(By.id("dashboard")));
Assert.assertTrue(dashboard.isDisplayed(), "Login failed. Dashboard not
displayed.");
}
@AfterMethod
public void tearDown() {
// Close the browser
if (driver != null) {
driver.quit();
}
}
}
OUTPUT:
ECommerceLoginTest
Tests run: 3, Failures: 1, Skips: 0, Time elapsed: 5.014 sec
===============================================
PASSED: verifyUsernameField
PASSED: verifyPasswordField
FAILED: performLogin
TestNGException: expected [true] but found [false]

RESULT:
Thus the program to integrate TestNG with the above test automation is
successfully executed.
EX. NO: 9 A. BUILD A DATA-DRIVEN FRAMEWORK USING
SELENIUM AND TestNG
DATE:

AIM:
To build a data-driven framework using selenium and testing.
PROCEDURE:
 Initialize WebDriver and TestNG Setup
 Define a @DataProvider method to supply test data (login credentials) from a
CSV file.
 Write Test Cases for login valid and invalid credentials.
 In the @AfterMethod, call driver.quit() to close the browser after each test
case
 Execute the test suite defined in the testng.xml file.
PROGRAM:
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
public class LoginTest {
WebDriver driver;
@BeforeMethod
public void setUp() {
// Initialize WebDriver before each test
System.setProperty("webdriver.chrome.driver", "path/to/chromedriver");
driver = new ChromeDriver();
driver.get("url_of_your_application_login_page");
}
@Test(dataProvider = "loginData")
public void testLoginWithValidCredentials(String username, String password) {
// Locate username and password fields
WebElement usernameField = driver.findElement(By.id("username"));
WebElement passwordField = driver.findElement(By.id("password"));
WebElement loginButton = driver.findElement(By.id("loginButton"));
// Enter valid username and password
usernameField.sendKeys(username);
passwordField.sendKeys(password);
// Click on the login button
loginButton.click();
// Verify successful login
WebElement welcomeMessage =
driver.findElement(By.id("welcomeMessage"));
Assert.assertTrue(welcomeMessage.isDisplayed(), "Login failed for user: " +
username);
// Optionally, verify the welcome message content
Assert.assertEquals(welcomeMessage.getText(), "Welcome, " + username);
}
@Test(dataProvider = "loginData")
public void testLoginWithInvalidCredentials(String username, String password) {
// Locate username and password fields
WebElement usernameField = driver.findElement(By.id("username"));
WebElement passwordField = driver.findElement(By.id("password"));
WebElement loginButton = driver.findElement(By.id("loginButton"));
// Enter invalid username and password
usernameField.sendKeys(username);
passwordField.sendKeys(password);
// Click on the login button
loginButton.click();
// Verify error message
WebElement errorMessage = driver.findElement(By.id("errorMessage"));
Assert.assertTrue(errorMessage.isDisplayed(), "Error message not displayed for
user: " + username);
}
@DataProvider(name = "loginData")
public Object[][] readData() {
List<Object[]> testData = new ArrayList<>();
String csvFile = "path/to/testdata.csv";
try (BufferedReader br = new BufferedReader(new FileReader(csvFile))) {
String line;
while ((line = br.readLine()) != null) {
String[] data = line.split(",");
testData.add(data);
}
} catch (IOException e) {
e.printStackTrace();
}
Object[][] result = new Object[testData.size()][2];
for (int i = 0; i < testData.size(); i++) {
result[i] = testData.get(i);
}
return result;
}
@AfterMethod
public void tearDown() {
// Close the browser after each test
if (driver != null) {
driver.quit();
}
}
}

TestNG XML Configuration:

<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">


<suite name="TestSuite">
<test name="DataDrivenTest">
<classes>
<class name="LoginTest"/>
</classes>
</test>
</suite>
OUTPUT:
Test Case 1:
- Username: testUser1
- Password: password123
- Result: Login Successful
- Passed

Test Case 2:
- Username: testUser2
- Password: wrongpassword
- Result: Login Failed
- Passed

Test Case 3:
- Username: testUser3
- Password: password456
- Result: Login Successful
- Passed

RESULT:
Thus the program to build a data-driven framework using selenium and testing
was successfully executed.
EX. NO: 9 B. BUILD PAGE OBJECT MODEL USING SELENIUM
AND TestNG
DATE:

AIM:
To build page object model using selenium and testing.
PROCEDURE:
 Start by initializing the WebDriver. This is typically done in the Test class.
 Create a Page Object Class for Each Web Page
 Inside the Page Object Class, define the locators for all the page elements (e.g.,
buttons, text fields, checkboxes)
 Define methods for actions the user can perform on the page (e.g., login, form
submission).
 The TestNG test methods should use the Page Object methods to perform the
tests.
PROGRAM:
LoginPage.java (Page Object)
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
public class LoginPage {
private final WebDriver driver;
// Constructor to initialize WebDriver
public LoginPage(WebDriver driver) {
this.driver = driver;
}
// Method to open the login page with a given URL
public void openLoginPage(String url) {
driver.get(url);
}
// Method to enter username in the username field
public void enterUsername(String username) {
WebElement usernameField = driver.findElement(By.id("username"));
usernameField.sendKeys(username);
}
public void enterPassword(String password) {
WebElement passwordField = driver.findElement(By.id("password"));
passwordField.sendKeys(password);
}
public void clickLoginButton() {
WebElement loginButton = driver.findElement(By.id("loginButton"));
loginButton.click();
}
public boolean isErrorMessageDisplayed() {
WebElement errorMessage = driver.findElement(By.id("errorMessage"));
return errorMessage.isDisplayed();
}
}

LoginTest.java (Test Class)


import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;
public class LoginTest {
private WebDriver driver;
private LoginPage loginPage;
// Setup WebDriver and initialize the LoginPage object
@BeforeMethod
public void setUp() {
System.setProperty("webdriver.chrome.driver", "path/to/chromedriver"); //
Ensure the correct path is set
driver = new ChromeDriver();
loginPage = new LoginPage(driver);
}
// Test login with valid credentials
@Test
public void testLoginWithValidCredentials() {
loginPage.openLoginPage("url_of_your_application_login_page");
loginPage.enterUsername("valid_username");
loginPage.enterPassword("valid_password");
loginPage.clickLoginButton();
// For example, if login redirects to a dashboard:
// Assert.assertTrue(driver.getCurrentUrl().contains("dashboard"));
}
// Test login with invalid credentials
@Test
public void testLoginWithInvalidCredentials() {
loginPage.openLoginPage("url_of_your_application_login_page");
loginPage.enterUsername("invalid_username");
loginPage.enterPassword("invalid_password");
loginPage.clickLoginButton();
Assert.assertTrue(loginPage.isErrorMessageDisplayed(), "Error message not
displayed for invalid login");
}
@AfterMethod
public void tearDown() {
driver.quit();
}
}

OUTPUT:
[INFO] Starting ChromeDriver...
[INFO] WebDriver initialized successfully.
[INFO] Opening the login page...
[INFO] Entering username: valid_username
[INFO] Entering password: valid_password
[INFO] Clicking login button...
[INFO] Asserted: URL contains "dashboard" (this indicates successful login).

PASSED: testLoginWithValidCredentials

RESULT:
Thus the program to build page object model using selenium and testing has
been successfully executed.
EX. NO: 9 C. BUILD BDD FRAMEWORK WITH SELENIUM,
TestNG AND CUCUMBER
DATE:

AIM:
To build BDD framework with selenium, TestNG and cucumber.
PROCEDURE:
 Set up the Maven project with dependencies for Selenium, TestNG, and
Cucumber
 Write the test scenarios in Gherkin syntax (Feature, Scenario, Given, When,
Then).
 Create page classes with locators and reusable methods for interacting with
UI elements.
 Display results for each scenario and step in the console, indicating pass or
fail status.Start execution.
PROGRAM:
pom.xml
<dependencies>
<!-- Selenium WebDriver -->
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>4.12.0</version>
</dependency>
<!-- TestNG -->
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>7.8.0</version>
</dependency>
<!-- Cucumber Java -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-java</artifactId>
<version>7.14.0</version>
</dependency>
<!-- Cucumber TestNG -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-testng</artifactId>
<version>7.14.0</version>
</dependency>
<!-- Cucumber JVM -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-jvm-deps</artifactId>
<version>1.0.6</version>
</dependency>
</dependencies>

Login.feature
Feature: Login to the Application
Scenario: Valid Login
Given the user is on the login page
When the user enters valid credentials
And clicks the login button
Then the user should be redirected to the dashboard
Scenario: Invalid Login
Given the user is on the login page
When the user enters invalid credentials
And clicks the login button
Then an error message should be displayed

LoginPage.java
package com.example.pages;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;

public class LoginPage {


private WebDriver driver;
// Locators
private By usernameField = By.id("username");
private By passwordField = By.id("password");
private By loginButton = By.id("loginButton");
private By errorMessage = By.id("errorMessage");
// Constructor
public LoginPage(WebDriver driver) {
this.driver = driver;
}
// Actions
public void openLoginPage(String url) {
driver.get(url);
}
public void enterUsername(String username) {
driver.findElement(usernameField).sendKeys(username);
}
public void enterPassword(String password) {
driver.findElement(passwordField).sendKeys(password);
}
public void clickLoginButton() {
driver.findElement(loginButton).click();
}
public boolean isErrorMessageDisplayed() {
return driver.findElement(errorMessage).isDisplayed();
}
}
LoginSteps.java
package com.example.stepdefinitions;

import com.example.pages.LoginPage;
import io.cucumber.java.en.*;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;

public class LoginSteps {


private WebDriver driver;
private LoginPage loginPage;
@Given("I am on the login page")
public void iAmOnTheLoginPage() {
System.setProperty("webdriver.chrome.driver", "path/to/chromedriver");
driver = new ChromeDriver();
loginPage = new LoginPage(driver);
loginPage.openLoginPage("https://yourapp.com/login");
}
@When("I enter a valid username and password")
public void iEnterAValidUsernameAndPassword() {
loginPage.enterUsername("valid_username");
loginPage.enterPassword("valid_password");
}
@When("I enter an invalid username and password")
public void iEnterAnInvalidUsernameAndPassword() {
loginPage.enterUsername("invalid_username");
loginPage.enterPassword("invalid_password");
}
@And("I click the login button")
public void iClickTheLoginButton() {
loginPage.clickLoginButton();
}
@Then("I should be redirected to the dashboard")
public void iShouldBeRedirectedToTheDashboard() {
Assert.assertTrue(driver.getCurrentUrl().contains("dashboard"), "User not
redirected to dashboard!");
}
@Then("I should see an error message")
public void iShouldSeeAnErrorMessage() {
Assert.assertTrue(loginPage.isErrorMessageDisplayed(), "Error message not
displayed!");
}
@After
public void tearDown() {
if (driver != null) {
driver.quit();
}
}
}
TestRunner.java
package com.example.testrunners;

import io.cucumber.testng.AbstractTestNGCucumberTests;
import io.cucumber.testng.CucumberOptions;

@CucumberOptions(
features = "src/test/java/features",
glue = "com.example.stepdefinitions",
plugin = {"pretty", "html:target/cucumber-reports.html"},
monochrome = true
)
public class TestRunner extends AbstractTestNGCucumberTests {
}
OUTPUT:
Successful Test Scenario: Login with valid credentials

[INFO] Starting ChromeDriver...


[INFO] WebDriver initialized successfully.
[INFO] Opening the login page...
[INFO] Entering username: valid_username
[INFO] Entering password: valid_password
[INFO] Clicking login button...
[INFO] Asserted: User redirected to dashboard.
[INFO] Browser closed.

Feature: Login functionality

Scenario: Login with valid credentials


Given I am on the login page
When I enter a valid username and password
And I click the login button
Then I should be redirected to the dashboard

[INFO] Scenario Passed: Login with valid credentials

RESULT:
Thus the program to build BDD framework with selenium, TestNG and
cucumber has executed successfully.

You might also like

pFad - Phonifier reborn

Pfad - The Proxy pFad of © 2024 Garber Painting. All rights reserved.

Note: This service is not intended for secure transactions such as banking, social media, email, or purchasing. Use at your own risk. We assume no liability whatsoever for broken pages.


Alternative Proxies:

Alternative Proxy

pFad Proxy

pFad v3 Proxy

pFad v4 Proxy