STA Record Final
Name :
Register No :
Year/Semester :
Branch :
UNIVERSITY COLLEGE OF ENGINEERING KANCHEEPURAM
( A Constituent College of Anna University Chennai )
KANCHIPURAM – 631 552
BONAFIDE CERTIFICATE
REGISTER NO
3. Test the E-commerce Application and Report the Defects in It.
4. Develop the Test Plan and Design the Test Cases for an Inventory Control System.
5. Execute the Test Cases against a Client-Server or Desktop Application and Identify the Defects.
6. Test the Performance of the E-commerce Application.
7. Automate the Testing of E-commerce Applications Using Selenium.
8. Integrate TestNG with the Above Test Automation.
EX. NO : 01 Develop the Test Plan for Testing an E-commerce Web/Mobile
Application (www.amazon.in)
DATE :
AIM :
The aim of this experiment is to develop a comprehensive test plan for testing the
functionality and usability of the e-commerce web/mobile application www.amazon.in.
ALGORITHM:
1. Identify the Scope: Determine the scope of testing, including the features and
functionalities that need to be tested.
2. Define Test Objectives: Specify the primary objectives of testing, such as functional
testing, usability testing, performance testing, security testing, etc.
3. Identify Test Environment: Define the platforms, browsers, devices, and operating
systems on which the application will be tested.
4. Determine Test Deliverables: Decide on the documents and artifacts that will be
generated during the testing process, such as test cases, test reports, and defect logs.
5. Create Test Strategy: Develop an overall approach for testing, including the testing
techniques, entry and exit criteria, and the roles and responsibilities of the testing team.
6. Define Test Scope and Schedule: Specify the timeline for each testing phase and the
scope of testing for each phase.
7. Risk Analysis: Identify potential risks and their impact on the testing process, and devise
risk mitigation strategies.
8. Resource Planning: Allocate the necessary resources, including the testing team, hardware,
and software required for testing.
9. Test Case Design: Prepare detailed test cases based on the requirements and
functionalities of the e-commerce application.
10. Test Data Setup: Arrange test data required for executing the test cases effectively.
11. Test Execution: Execute the test cases and record the test results.
12. Defect Reporting: Document any defects encountered during testing and track
their resolution.
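Steps 11 and 12 above (test execution and defect reporting) can be sketched as a small defect-log structure. This is a minimal illustration only; the class names, fields, and status values are assumptions, not part of any particular defect-tracking tool.

```java
// Minimal sketch of the defect log described in steps 11-12.
// All names here are illustrative, not tied to any standard or tool.
import java.util.ArrayList;
import java.util.List;

public class DefectLog {
    enum Severity { LOW, MEDIUM, HIGH, CRITICAL }

    static class Defect {
        final String id;
        final String summary;
        final Severity severity;
        String status = "Open";   // Open -> Fixed -> Closed

        Defect(String id, String summary, Severity severity) {
            this.id = id;
            this.summary = summary;
            this.severity = severity;
        }
    }

    private final List<Defect> defects = new ArrayList<>();

    // Record a defect found during test execution (step 12)
    public Defect report(String id, String summary, Severity severity) {
        Defect d = new Defect(id, summary, severity);
        defects.add(d);
        return d;
    }

    // Count defects still awaiting resolution
    public long openCount() {
        return defects.stream().filter(d -> d.status.equals("Open")).count();
    }

    public static void main(String[] args) {
        DefectLog log = new DefectLog();
        log.report("D-001", "Cart total not updated after removing item", Severity.HIGH);
        Defect d2 = log.report("D-002", "Typo on checkout page", Severity.LOW);
        d2.status = "Fixed";
        System.out.println("Open defects: " + log.openCount()); // prints "Open defects: 1"
    }
}
```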
TEST PLAN:
The test plan should cover the following sections:
1. Introduction: Briefly describe the purpose of the test plan and provide an overview of
the e-commerce application to be tested.
2. Test Objectives: State the primary objectives of testing.
3. Test Scope: Specify the features and functionalities to be tested and any limitations on testing.
4. Test Environment: Describe the hardware, software, browsers, and devices to be used
for testing.
5. Test Strategy: Summarize the overall testing approach, including entry and exit criteria.
6. Test Schedule: Outline the timeline for each testing phase.
7. Risk Analysis: Identify potential risks and the strategies to mitigate them.
8. Resource Planning: List the team members, hardware, and software allocated for testing.
9. Test Case Design: Include a summary of the test cases developed for the application.
10. Test Data Setup: Describe the process of arranging test data for testing.
11. Defect Reporting: Explain the procedure for reporting and tracking defects.
TEST CASE TABLE:
| Process | No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|---|
| Test Plan | TC001 | Scope of Testing | 1. Review the test plan document. | Verify the scope of testing. | Done | The test plan includes all features. | | |
| | TC002 | Test Objectives | 1. Review the test plan document. | Verify the test objectives. | Done | The test objectives are well-defined. | | |
| | TC003 | Test Environment | 1. Review the test plan document. | Check the specified environments. | Done | Test environments are mentioned. | | |
| | TC004 | Test Deliverables | 1. Review the test plan document. | Ensure all deliverables are listed. | Done | The test plan includes all deliverables. | | |
| | TC005 | Test Strategy | 1. Review the test plan document. | Verify the overall approach. | Done | The test strategy is clearly stated. | | |
| | TC006 | Test Scope and Schedule | 1. Review the test plan document. | Check the schedule and scope. | Done | The schedule and scope are defined. | | |
| | TC007 | Risk Analysis | 1. Review the test plan document. | Ensure potential risks are identified. | Done | Risks and mitigation strategies are mentioned. | | |
| | TC008 | Resource Planning | 1. Review the test plan document. | Check the required resources. | Done | Resources needed for testing are listed. | | |
| | TC009 | Test Case Design | 1. Review and execute the test cases. | Validate the prepared test cases. | Done | Test cases are accurate and functional. | | |
| | TC010 | Test Data Setup | 1. Review the test data setup process. | Verify the availability of test data. | Done | Test data is available for testing. | | |
| | TC011 | Test Execution | 1. Run the test cases and document the outcomes. | Execute the test cases. | Done | Test results are recorded and documented. | | |
| | TC012 | Defect Reporting | 1. Log defects with detailed information. | Ensure defects are reported correctly. | Done | Defects are reported with sufficient details. | | |
| | TC013 | Defect Tracking | 1. Monitor defect status and updates. | Verify the tracking of defects. | Done | Defects are tracked until resolution. | | |
Explanation:
The test plan is a crucial document that outlines the entire testing process. It ensures that all
aspects of the e-commerce application are thoroughly tested, and the results are
systematically documented.
RESULT:
Upon completion of the experiment, you will have a well-structured test plan that provides a
clear roadmap for testing the e-commerce web/mobile application www.amazon.in.
EX.NO : 02 Design the Test Cases for Testing the E-commerce
DATE : Application
AIM:
The aim of this experiment is to design a set of comprehensive and effective test cases for testing
the e-commerce application www.amazon.in.
ALGORITHM:
1. Understand Requirements: Familiarize yourself with the functional and non-
functional requirements of the e-commerce application.
2. Identify Test Scenarios: Based on the requirements, identify different test scenarios that
cover all aspects of the application.
3. Write Test Cases: Develop test cases for each identified scenario, including
preconditions, steps to be executed, and expected outcomes.
4. Cover Edge Cases: Ensure that the test cases cover edge cases and boundary
conditions to verify the robustness of the application.
5. Prioritize Test Cases: Prioritize the test cases based on their criticality and relevance to
the application.
6. Review Test Cases: Conduct a peer review of the test cases to ensure their accuracy
and completeness.
7. Optimize Test Cases: Optimize the test cases for reusability and maintainability.
Each test case should also record its preconditions: the conditions that must be satisfied before the test case is executed.
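Step 5 of the algorithm (prioritizing test cases) can be sketched in code: give each test case a numeric priority and sort the suite so that the most critical cases run first. The TestCase type and its fields below are purely illustrative assumptions.

```java
// Sketch of test case prioritisation (step 5). Types and names are illustrative.
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class TestCasePriority {
    // Illustrative test case type: an id plus a numeric priority (1 = most critical)
    static class TestCase {
        final String id;
        final int priority;
        TestCase(String id, int priority) { this.id = id; this.priority = priority; }
    }

    // Order test case ids from most to least critical (stable for equal priorities)
    public static List<String> executionOrder(List<TestCase> cases) {
        return cases.stream()
                .sorted(Comparator.comparingInt((TestCase tc) -> tc.priority))
                .map(tc -> tc.id)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<TestCase> cases = Arrays.asList(
                new TestCase("TC003", 2),   // search results relevant
                new TestCase("TC001", 1),   // user registration
                new TestCase("TC006", 1));  // checkout completes
        System.out.println(executionOrder(cases)); // prints [TC001, TC006, TC003]
    }
}
```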
RESULT:
Upon completion of the experiment, you will have a set of well-defined test cases ready
for testing the e-commerce application www.amazon.in.
Ex. No : 03 Test the E-commerce Application and Report the Defects
in It
DATE :
AIM:
The aim of this experiment is to execute the designed test cases and identify defects or issues in
the e-commerce application www.amazon.in.
ALGORITHM:
1. Test Environment Setup: Set up the testing environment with the required hardware,
software, and test data.
2. Test Case Execution: Execute the test cases designed in Experiment 2, following the
specified steps.
3. Defect Identification: During test execution, record any discrepancies or issues encountered.
4. Defect Reporting: Log the identified defects with detailed information, including
steps to reproduce, severity, and priority.
5. Defect Tracking: Track the progress of defect resolution and verify fixes as they
are implemented.
6. Retesting: After defect fixes, retest the affected areas to ensure the issues are resolved.
7. Regression Testing: Conduct regression testing to ensure new changes do not introduce
new defects.
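Steps 4 and 5 (defect reporting and tracking) imply a defect lifecycle. A minimal sketch of such a lifecycle as a state machine follows; the state names and the allowed transitions are illustrative, since teams and tools define their own workflows.

```java
// Sketch of a defect lifecycle: NEW -> ASSIGNED -> FIXED -> RETEST -> CLOSED,
// with REOPENED looping back. The transition table is illustrative only.
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Map;
import java.util.Set;

public class DefectLifecycle {
    enum State { NEW, ASSIGNED, FIXED, RETEST, CLOSED, REOPENED }

    private static final Map<State, Set<State>> ALLOWED = new EnumMap<>(State.class);
    static {
        ALLOWED.put(State.NEW, EnumSet.of(State.ASSIGNED));
        ALLOWED.put(State.ASSIGNED, EnumSet.of(State.FIXED));
        ALLOWED.put(State.FIXED, EnumSet.of(State.RETEST));
        ALLOWED.put(State.RETEST, EnumSet.of(State.CLOSED, State.REOPENED));
        ALLOWED.put(State.REOPENED, EnumSet.of(State.ASSIGNED));
        ALLOWED.put(State.CLOSED, EnumSet.noneOf(State.class));
    }

    // Check whether a status change is valid under this workflow
    public static boolean canMove(State from, State to) {
        return ALLOWED.get(from).contains(to);
    }

    public static void main(String[] args) {
        System.out.println(canMove(State.FIXED, State.RETEST)); // prints true
        System.out.println(canMove(State.NEW, State.CLOSED));   // prints false
    }
}
```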
| Process | No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|---|
| Test Case Design | TC001 | User Registration | 1. Navigate to the registration page. | Verify user registration process. | Done | User can successfully register. | | |
| | TC002 | User Login | 1. Navigate to the login page. | Verify user login process. | Done | User can successfully log in. | | |
| | TC003 | Search Functionality | 1. Enter a keyword in the search bar. | Verify search functionality. | Done | Search results are relevant to the keyword. | | |
| | TC004 | Add to Cart | 1. Browse the product catalog. | Verify adding products to the cart. | Done | Product is added to the shopping cart. | | |
| | TC005 | Shopping Cart Validation | 1. Click on the shopping cart icon. | Verify the shopping cart contents. | Done | Items in the shopping cart are displayed. | | |
| | TC006 | Checkout Process | 1. Click on the "Checkout" button. | Verify the checkout process. | Not Started | Checkout process proceeds as expected. | | |
EXPLANATION:
Testing the e-commerce application aims to validate its functionality and usability. By identifying
and reporting defects, you ensure the application's quality and reliability.
RESULT:
Upon completion of the experiment, you will have a list of identified defects and their status
after resolution.
EX. NO: 04 Develop the Test Plan and Design the Test Cases for
an Inventory Control System
DATE :
AIM:
The aim of this experiment is to create a comprehensive test plan and design test cases for
an Inventory Control System.
ALGORITHM:
Follow the same algorithm as described in Experiment 1 for developing the test plan
for an inventory control system.
Follow the same algorithm as described in Experiment 2 for designing test cases for an
inventory control system.
TEST PLAN:
| Process | No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|---|
| Test Plan | TC001 | Scope of Testing | 1. Review the requirements and project documentation. 2. Identify the modules to be tested. 3. Determine the out-of-scope items. | Verify the scope of testing. | Done | The test plan includes all essential features. | | |
| | TC002 | Test Objectives | 1. Review the requirements and project documentation. 2. Discuss with stakeholders to understand expectations. | Verify the test objectives. | Done | The test objectives are clearly defined. | | |
| | TC003 | Test Environment | 1. Identify the hardware and software requirements. 2. Set up the required hardware and software. | Verify the required environments. | Not Started | The test environment is defined. | | |
| | TC004 | Test Deliverables | 1. Determine the documents and artifacts to be produced. 2. Create templates for test reports, defect logs, etc. | Verify the required deliverables. | Not Started | All necessary documents are listed. | | |
| | TC005 | Test Strategy | 1. Decide on the testing approach and techniques. 2. Determine the entry and exit criteria. | Verify the overall approach for testing. | Not Started | The test strategy is defined. | | |
| | TC006 | Test Scope and Schedule | 1. Define the timeline for each testing phase. 2. Determine the scope of testing for each phase. | Verify the schedule for testing. | Not Started | The schedule is established. | | |
| | TC007 | Risk Analysis | 1. Identify potential risks in the testing process. 2. Discuss risk mitigation strategies with the team. | Verify risk analysis and mitigation strategies. | Not Started | Potential risks are identified with mitigation plans. | | |
| | TC008 | Resource Planning | 1. Allocate the required resources for testing. 2. Determine the roles and responsibilities of the team. | Verify the availability of resources. | Not Started | Resources needed for testing are allocated. | | |
EXPLANATION:
An inventory control system is critical for managing stock and supplies. Proper testing
ensures the system functions accurately and efficiently.
RESULT:
Upon completion of the experiment, you will have a well-structured test plan and a set of
test cases ready for testing the Inventory Control System.
Ex.NO:05 Execute the Test Cases against a Client-Server
or Desktop Application and Identify the Defects
DATE :
AIM :
The aim of this experiment is to execute the test cases against a client-server or desktop
application and identify defects.
ALGORITHM:
1. Test Environment Setup: Set up the testing environment, including the client-server or
desktop application, required hardware, and test data.
2. Test Case Execution: Execute the test cases designed in Experiment 2 against the application.
3. Defect Identification: During test execution, record any discrepancies or issues encountered.
4. Defect Reporting: Log the identified defects with detailed information, including
steps to reproduce, severity, and priority.
5. Defect Tracking: Track the progress of defect resolution and verify fixes as they
are implemented.
6. Retesting: After defect fixes, retest the affected areas to ensure the issues are resolved.
7. Regression Testing: Conduct regression testing to ensure new changes do not introduce
new defects.
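Step 7 (regression testing) requires choosing which test cases to re-run after a change. One simple approach, sketched below under the assumption that each test case is tagged with the modules it exercises, is to select every case that touches the changed module.

```java
// Sketch of regression test selection (step 7). The module-tagging scheme
// shown here is an assumption for illustration, not a standard technique name.
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class RegressionSelector {
    // Given a map of test case id -> modules it exercises, return the
    // ids of all test cases affected by a change to one module.
    public static List<String> select(Map<String, List<String>> coverage, String changedModule) {
        return coverage.entrySet().stream()
                .filter(e -> e.getValue().contains(changedModule))
                .map(Map.Entry::getKey)
                .sorted()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, List<String>> coverage = Map.of(
                "TC001", List.of("login"),
                "TC002", List.of("login", "validation"),
                "TC003", List.of("upload"));
        System.out.println(select(coverage, "login")); // prints [TC001, TC002]
    }
}
```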
| Process | No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|---|
| Test Case Execution | TC001 | User Login | 1. Launch the application. 2. Enter valid login credentials. 3. Click on the "Login" button. | Verify user login process. | Not Started | User can successfully log in. | | |
| | TC002 | Data Validation | 1. Access a data input form. 2. Enter invalid data in the form fields. 3. Submit the form. | Verify data validation on the form. | Not Started | Invalid data shows appropriate error messages. | | |
| | TC003 | File Upload | 1. Access the file upload feature. 2. Select a file from the system. 3. Click on the "Upload" button. | Verify file upload functionality. | Not Started | File is uploaded successfully. | | |
| | TC004 | Network Connectivity | 1. Disconnect the network. 2. Attempt to perform an action requiring network access. | Verify the application's response. | Not Started | Application gracefully handles disconnection. | | |
| | TC005 | Concurrent Users | 1. Simulate concurrent user sessions. 2. Perform actions simultaneously. | Verify application performance. | Not Started | Application performs well under load. | | |
| | TC006 | Compatibility | 1. Test the application on different platforms. 2. Execute tests on various browsers. | Verify cross-platform compatibility. | Not Started | Application works on all specified platforms. | | |
| | TC007 | Client-Server Communication | 1. Monitor network traffic between client and server. | Verify communication integrity. | Not Started | Data is correctly transmitted and received. | | |
Explanation:
Testing a client-server or desktop application verifies its functionality across different platforms
and environments.
Result:
Upon completion of the experiment, you will have a list of identified defects and their status after
resolution for the client-server or desktop application.
EX.NO : 06 Test the Performance of the E-commerce
Application.
DATE :
Aim:
The aim of this experiment is to test the performance of the e-commerce application
www.amazon.in.
Algorithm:
1. Identify Performance Metrics: Determine the performance metrics to be measured,
such as response time, throughput, and resource utilization.
2. Define Test Scenarios: Create test scenarios that simulate various user interactions and
loads on the application.
3. Select Performance Testing Tools: Choose a performance testing tool (for example,
Apache JMeter) and configure it for the defined scenarios.
4. Execute Performance Tests: Run the performance tests using the defined scenarios and
collect performance data.
5. Analyze Performance Data: Analyze the collected data to identify any performance
bottlenecks or issues.
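The metrics named in step 1 can be computed directly from raw measurements. The sketch below shows average response time and throughput; the method names and the sample numbers are made up for illustration.

```java
// Sketch of the performance metrics from step 1: average response time
// and throughput computed from raw measurements. Sample data is invented.
import java.util.Arrays;
import java.util.List;

public class PerformanceMetrics {
    // Average response time in milliseconds across the sampled requests
    public static double averageMs(List<Long> responseTimesMs) {
        return responseTimesMs.stream().mapToLong(Long::longValue).average().orElse(0);
    }

    // Throughput: completed requests per second over a measured window
    public static double throughputPerSec(int completedRequests, double windowSeconds) {
        return completedRequests / windowSeconds;
    }

    public static void main(String[] args) {
        List<Long> samples = Arrays.asList(120L, 250L, 180L, 90L, 310L);
        System.out.println("Average response time: " + averageMs(samples) + " ms"); // 190.0 ms
        System.out.println("Throughput: " + throughputPerSec(500, 60.0) + " req/s");
    }
}
```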
Performance Table:
| Process | No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|---|
| Performance Testing | TC001 | Response Time for Home Page | 1. Access the home page of the e-commerce application. 2. Use a performance testing tool to record the time. 3. Analyze the recorded data to determine response time. | Measure the response time. | Not Started | The home page loads within the specified response time threshold. | | |
| | TC002 | Throughput during Peak Hours | 1. Simulate peak-hour traffic on the application. 2. Execute performance tests during peak hours. 3. Analyze the data to determine the throughput. | Measure the throughput. | Not Started | The application can handle peak-hour traffic without significant delays. | | |
| | TC003 | Resource Utilization | 1. Monitor CPU, memory, and network usage during testing. 2. Execute performance tests while monitoring resources. 3. Analyze the data to assess resource utilization. | Measure resource utilization. | Not Started | Resource utilization remains within acceptable limits. | | |
| | TC004 | Concurrent Users | 1. Simulate multiple concurrent users accessing the app. 2. Increase the number of concurrent users gradually. 3. Record the application's behavior with increased load. | Measure app performance under load. | Not Started | The application remains stable and responsive under load. | | |
| | TC005 | Stress Testing | 1. Apply maximum load to test the system's breaking point. 2. Apply the maximum user load the application can handle. 3. Observe the application's response under stress. | Measure system behavior under extreme load. | Not Started | The system recovers gracefully after stress is removed. | | |
| | TC006 | Performance Tuning | 1. Identify performance bottlenecks and areas of improvement. 2. Analyze the performance test results. 3. Implement the necessary optimizations. | Improve application performance. | Not Started | Performance bottlenecks are addressed and the application performs better. | | |
Explanation:
Performance testing helps to identify bottlenecks in the e-commerce application, ensuring it can
handle real-world user loads effectively.
Result:
Upon completion of the experiment, you will have performance test results and any optimizations
made to improve the application's performance.
EX.NO : 07 Automate the testing of e-commerce applications using
Selenium.
DATE :
Aim:
The aim of this task is to automate the testing of an e-commerce web application
(www.amazon.in) using Selenium WebDriver, which will help improve testing efficiency and
reliability.
Algorithm:
1. Set up the environment:
- Install Java Development Kit (JDK) and configure the Java environment variables.
- Install an Integrated Development Environment (IDE) like Eclipse or IntelliJ.
- Download Selenium WebDriver and the required web drivers for the browsers you
intend to test (e.g., ChromeDriver for Chrome, GeckoDriver for Firefox).
7. Report defects:
- Document any defects found during the automated testing process.
- Provide detailed information about each defect, including steps to reproduce and
expected results.
Program:
package program;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class selenium {
    public static void main(String[] args) {
        // Point Selenium at the local ChromeDriver binary
        System.setProperty("webdriver.chrome.driver",
                "C:\\Users\\Admin\\Downloads\\chromedriver-win64\\chromedriver-win64\\chromedriver.exe");
        WebDriver d = new ChromeDriver();
        d.get("https://www.amazon.in");

        // Open the sign-in page and submit the credentials
        d.findElement(By.xpath("//*[@id=\"nav-link-accountList\"]/span/span")).click();
        d.findElement(By.id("ap_email")).sendKeys("youremail@gmail.com");
        d.findElement(By.xpath("//*[@id=\"continue\"]")).click();
        d.findElement(By.id("ap_password")).sendKeys("your password");
        d.findElement(By.xpath("//*[@id=\"signInSubmit\"]")).click();

        // Verify the sign-in by checking the post-login URL
        String u = d.getCurrentUrl();
        if (u.equals("https://www.amazon.in/?ref_=nav_ya_signin")) {
            System.out.println("Login test passed");
        } else {
            System.out.println("Login test failed");
        }
        d.quit();
    }
}
Automation Process:
Console output:
Result:
The successful completion of this task will yield:
- Automated test scripts for the e-commerce application using Selenium WebDriver.
- Identification of defects, if any, in the application.
EX.NO : 08 Integrate TestNG with the above test automation.
DATE :
Aim:
The aim of this task is to integrate TestNG with the existing Selenium automation scripts for the
e-commerce application, enhancing test management, parallel execution, and reporting
capabilities.
Algorithm:
1. Set up TestNG in the project:
- Add TestNG library to the existing Java project.
Program:
package mytest;

import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

public class AmazonTest {
    WebDriver driver;

    @BeforeMethod
    public void setUp() {
        // Launch a fresh browser session before every test method
        System.setProperty("webdriver.chrome.driver",
                "C:\\selenium\\chromedriver_win32\\chromedriver.exe");
        driver = new ChromeDriver();
        driver.get("https://amazon.in");
        driver.manage().window().maximize();
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(5));
    }

    @Test
    public void verifyHomePageTitle() {
        // The home page title should match the expected Amazon.in title
        String actualTitle = driver.getTitle();
        String expectedTitle = "Online Shopping site in India: Shop Online for Mobiles, Books, Watches, Shoes and More - Amazon.in";
        Assert.assertEquals(actualTitle, expectedTitle);
    }

    @Test
    public void verifyLogoIsDisplayed() {
        // The Amazon logo should be visible on the home page
        boolean flag = driver.findElement(By.xpath("//a[@id='nav-logo-sprites']")).isDisplayed();
        Assert.assertTrue(flag);
    }

    @AfterMethod
    public void tearDown() {
        // Close the browser after each test method
        driver.quit();
    }
}
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>MiniProject2</groupId>
  <artifactId>MiniProject2</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-java</artifactId>
      <version>4.3.0</version>
    </dependency>
    <!-- TestNG is required for the @Test annotations used above -->
    <dependency>
      <groupId>org.testng</groupId>
      <artifactId>testng</artifactId>
      <version>7.7.1</version>
    </dependency>
  </dependencies>
  <build>
    <sourceDirectory>src</sourceDirectory>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.8.1</version>
        <configuration>
          <release>16</release>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
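Beyond running tests from the IDE, TestNG can also be driven by a suite file. A minimal testng.xml sketch follows; the class name mytest.AmazonTest is an assumption and must match the actual test class, and the parallel-execution settings are optional.

```xml
<!-- Illustrative testng.xml; the class name below is an assumption -->
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="EcommerceSuite" parallel="methods" thread-count="2">
  <test name="AmazonTests">
    <classes>
      <class name="mytest.AmazonTest"/>
    </classes>
  </test>
</suite>
```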
Output:
Result:
The successful completion of this task will yield:
- Integration of TestNG with the existing Selenium automation scripts.
- Enhanced test management and reporting capabilities.
- Identification of defects, if any, in the application and improved efficiency in handling
test scenarios.