
UNIVERSITY COLLEGE OF ENGINEERING KANCHEEPURAM

(A Constituent College of Anna University, Chennai)


KANCHIPURAM – 631 552

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

CCS366 - SOFTWARE TESTING AND AUTOMATION

Name :
Register No :
Year/Semester: Branch :
UNIVERSITY COLLEGE OF ENGINEERING KANCHEEPURAM
(A Constituent College of Anna University, Chennai)
KANCHIPURAM – 631 552

BONAFIDE CERTIFICATE

REGISTER NO

Certified that this is the Bonafide record of work done by


Mr/Ms............................................... of ...... semester B.E. Computer Science
and Engineering Branch / Batch during the academic year 20...... to 20...... in
the CCS366 - SOFTWARE TESTING AND AUTOMATION.

Staff In-Charge Head of the Department

Submitted for the University Practical examination held on..........................

Internal Examiner External Examiner


TABLE OF CONTENTS

Ex. No. | Date | Experiment Title | Page No. | Signature

1. Develop the Test Plan for Testing an E-commerce Web/Mobile Application (www.amazon.in)
2. Design the Test Cases for Testing the E-commerce Application.
3. Test the E-commerce Application and Report the Defects in It.
4. Develop the Test Plan and Design the Test Cases for an Inventory Control System.
5. Execute the Test Cases against a Client-Server or Desktop Application and Identify the Defects.
6. Test the Performance of the E-commerce Application.
7. Automate the testing of e-commerce applications using Selenium.
8. Integrate TestNG with the above test automation.
EX. NO : 01 Develop the Test Plan for Testing an E-commerce Web/Mobile Application (www.amazon.in)
DATE :

AIM :
The aim of this experiment is to develop a comprehensive test plan for testing the
functionality and usability of the e-commerce web/mobile application www.amazon.in.

ALGORITHM:
1. Identify the Scope: Determine the scope of testing, including the features and
functionalities that need to be tested.

2. Define Test Objectives: Specify the primary objectives of testing, such as functional
testing, usability testing, performance testing, security testing, etc.

3. Identify Test Environment: Define the platforms, browsers, devices, and operating
systems on which the application will be tested.

4. Determine Test Deliverables: Decide on the documents and artifacts that will be
generated during the testing process, such as test cases, test reports, and defect logs.

5. Create Test Strategy: Develop an overall approach for testing, including the testing
techniques, entry and exit criteria, and the roles and responsibilities of the testing team.

6. Define Test Scope and Schedule: Specify the timeline for each testing phase and the
scope of testing for each phase.

7. Risk Analysis: Identify potential risks and their impact on the testing process, and devise
risk mitigation strategies.

8. Resource Planning: Allocate the necessary resources, including the testing team, hardware,
and software required for testing.

9. Test Case Design: Prepare detailed test cases based on the requirements and
functionalities of the e-commerce application.

10. Test Data Setup: Arrange test data required for executing the test cases effectively.

11. Test Execution: Execute the test cases and record the test results.
12. Defect Reporting: Document any defects encountered during testing and track
their resolution.
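
To make step 12 concrete, a single entry in the defect log might look like the following (the defect shown is hypothetical; the field names follow common defect-tracking convention rather than anything prescribed in this record):

Defect ID: D-001
Summary: Search returns no results for a valid keyword
Steps to Reproduce: 1. Open www.amazon.in. 2. Type a valid product keyword in the search bar. 3. Press Enter.
Expected Result: A list of matching products is displayed.
Actual Result: An empty results page is shown.
Severity: Major
Priority: High
Status: Open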

TEST PLAN:
The test plan should cover the following sections:

1. Introduction: Briefly describe the purpose of the test plan and provide an overview of
the e-commerce application to be tested.

2. Test Objectives: List the primary objectives of testing the application.

3. Test Scope: Specify the features and functionalities to be tested and any limitations on testing.

4. Test Environment: Describe the hardware, software, browsers, and devices to be used
for testing.

5. Test Strategy: Explain the overall approach to be followed during testing.

6. Test Schedule: Provide a detailed timeline for each testing phase.

7. Risk Analysis: Identify potential risks and the strategies to mitigate them.

8. Resource Planning: Specify the resources required for testing.

9. Test Case Design: Include a summary of the test cases developed for the application.

10. Test Data Setup: Describe the process of arranging test data for testing.

11. Defect Reporting: Explain the procedure for reporting and tracking defects.
TEST CASE TABLE:

Process: Test Plan

TC001 - Scope of Testing
Steps: 1. Review the test plan document.
Description: Verify the scope of testing.
Expected Result: The test plan includes all features.
Status: Done

TC002 - Test Objectives
Steps: 1. Review the test plan document.
Description: Verify the test objectives.
Expected Result: The test objectives are well-defined.
Status: Done

TC003 - Test Environment
Steps: 1. Review the test plan document.
Description: Check the specified environments.
Expected Result: Test environments are mentioned.
Status: Done

TC004 - Test Deliverables
Steps: 1. Review the test plan document.
Description: Ensure all deliverables are listed.
Expected Result: The test plan includes all deliverables.
Status: Done

TC005 - Test Strategy
Steps: 1. Review the test plan document.
Description: Verify the overall approach.
Expected Result: The test strategy is clearly stated.
Status: Done

TC006 - Test Scope and Schedule
Steps: 1. Review the test plan document.
Description: Check the schedule and scope.
Expected Result: The schedule and scope are defined.
Status: Done

TC007 - Risk Analysis
Steps: 1. Review the test plan document.
Description: Ensure potential risks are identified.
Expected Result: Risks and mitigation strategies are mentioned.
Status: Done

TC008 - Resource Planning
Steps: 1. Review the test plan document.
Description: Check the required resources.
Expected Result: Resources needed for testing are listed.
Status: Done

TC009 - Test Case Design
Steps: 1. Review and execute the test cases.
Description: Validate the prepared test cases.
Expected Result: Test cases are accurate and functional.
Status: Done

TC010 - Test Data Setup
Steps: 1. Review the test data setup process.
Description: Verify the availability of test data.
Expected Result: Test data is available for testing.
Status: Done

TC011 - Test Execution
Steps: 1. Run the test cases and document the outcomes.
Description: Execute the test cases.
Expected Result: Test results are recorded and documented.
Status: Done

TC012 - Defect Reporting
Steps: 1. Log defects with detailed information.
Description: Ensure defects are reported correctly.
Expected Result: Defects are reported with sufficient details.
Status: Done

TC013 - Defect Tracking
Steps: 1. Monitor defect status and updates.
Description: Verify the tracking of defects.
Expected Result: Defects are tracked until resolution.
Status: Done

Explanation:
The test plan is a crucial document that outlines the entire testing process. It ensures that all
aspects of the e-commerce application are thoroughly tested, and the results are
systematically documented.

RESULT:
Upon completion of the experiment, you will have a well-structured test plan that provides a
clear roadmap for testing the e-commerce web/mobile application www.amazon.in.
EX. NO : 02 Design the Test Cases for Testing the E-commerce Application
DATE :

AIM:
The aim of this experiment is to design a set of comprehensive and effective test cases for testing
the e-commerce application www.amazon.in.

ALGORITHM:
1. Understand Requirements: Familiarize yourself with the functional and non-
functional requirements of the e-commerce application.

2. Identify Test Scenarios: Based on the requirements, identify different test scenarios that
cover all aspects of the application.

3. Write Test Cases: Develop test cases for each identified scenario, including
preconditions, steps to be executed, and expected outcomes.

4. Cover Edge Cases: Ensure that the test cases cover edge cases and boundary
conditions to verify the robustness of the application.

5. Prioritize Test Cases: Prioritize the test cases based on their criticality and relevance to
the application.

6. Review Test Cases: Conduct a peer review of the test cases to ensure their accuracy
and completeness.

7. Optimize Test Cases: Optimize the test cases for reusability and maintainability.

TEST CASE DESIGN:


The test case design should include the following components for each test case:

1. Test Case ID: A unique identifier for each test case.

2. Test Scenario: Description of the scenario being tested.


3. Test Case Description: Detailed steps to execute the test.

4. Precondition: The necessary conditions that must be satisfied before executing the test case.

5. Test Steps: The sequence of actions to be performed during the test.

6. Expected Result: The outcome that is expected from the test.
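
Putting these six components together, the first entry of the table below (User Registration) would read in full like this (the precondition and the detailed steps are plausible illustrations, not values taken from this record):

Test Case ID: TC001
Test Scenario: User Registration
Test Case Description: Verify that a new user can create an account on www.amazon.in.
Precondition: The user does not already have an account for the e-mail address used.
Test Steps: 1. Navigate to the registration page. 2. Fill in the name, e-mail/mobile number, and password fields. 3. Submit the form.
Expected Result: User can successfully register.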

TEST CASE TABLE:

Process: Test Case Design

TC001 - User Registration
Steps: 1. Navigate to the registration page.
Description: Verify user registration process.
Expected Result: User can successfully register.
Status: Done

TC002 - User Login
Steps: 1. Navigate to the login page.
Description: Verify user login process.
Expected Result: User can successfully log in.
Status: Done

TC003 - Search Functionality
Steps: 1. Enter a keyword in the search bar.
Description: Verify search functionality.
Expected Result: Search results are relevant to the keyword.
Status: Done

TC004 - Add to Cart
Steps: 1. Browse the product catalog.
Description: Verify adding products to the cart.
Expected Result: Product is added to the shopping cart.
Status: Done

TC005 - Shopping Cart Validation
Steps: 1. Click on the shopping cart icon.
Description: Verify the shopping cart contents.
Expected Result: Items in the shopping cart are displayed.
Status: Done

TC006 - Checkout Process
Steps: 1. Click on the "Checkout" button.
Description: Verify the checkout process.
Expected Result: Checkout process proceeds as expected.
Status: Not Started

EXPLANATION:
Test cases are designed to validate the functionality and behaviour of the e-commerce
application. They ensure that the application performs as intended and meets the specified
requirements.

RESULT:
Upon completion of the experiment, you will have a set of well-defined test cases ready
for testing the e-commerce application www.amazon.in.
EX. NO : 03 Test the E-commerce Application and Report the Defects in It
DATE :

AIM:
The aim of this experiment is to execute the designed test cases and identify defects or issues in
the e-commerce application www.amazon.in.

ALGORITHM:
1. Test Environment Setup: Set up the testing environment with the required hardware,
software, and test data.

2. Test Case Execution: Execute the test cases designed in Experiment 2, following the
specified steps.

3. Defect Identification: During test execution, record any discrepancies or issues encountered.

4. Defect Reporting: Log the identified defects with detailed information, including
steps to reproduce, severity, and priority.

5. Defect Tracking: Track the progress of defect resolution and verify fixes as they
are implemented.

6. Retesting: After defect fixes, retest the affected areas to ensure the issues are resolved.

7. Regression Testing: Conduct regression testing to ensure new changes do not introduce
new defects.

TEST CASE TABLE:

Process: Test Case Design

TC001 - User Registration
Steps: 1. Navigate to the registration page.
Description: Verify user registration process.
Expected Result: User can successfully register.
Status: Done

TC002 - User Login
Steps: 1. Navigate to the login page.
Description: Verify user login process.
Expected Result: User can successfully log in.
Status: Done

TC003 - Search Functionality
Steps: 1. Enter a keyword in the search bar.
Description: Verify search functionality.
Expected Result: Search results are relevant to the keyword.
Status: Done

TC004 - Add to Cart
Steps: 1. Browse the product catalog.
Description: Verify adding products to the cart.
Expected Result: Product is added to the shopping cart.
Status: Done

TC005 - Shopping Cart Validation
Steps: 1. Click on the shopping cart icon.
Description: Verify the shopping cart contents.
Expected Result: Items in the shopping cart are displayed.
Status: Done

TC006 - Checkout Process
Steps: 1. Click on the "Checkout" button.
Description: Verify the checkout process.
Expected Result: Checkout process proceeds as expected.
Status: Not Started

EXPLANATION:
Testing the e-commerce application aims to validate its functionality and usability. By identifying
and reporting defects, you ensure the application's quality and reliability.

RESULT:
Upon completion of the experiment, you will have a list of identified defects and their status
after resolution.
EX. NO : 04 Develop the Test Plan and Design the Test Cases for an Inventory Control System
DATE :

AIM:
The aim of this experiment is to create a comprehensive test plan and design test cases for
an Inventory Control System.

ALGORITHM:
Follow the same algorithm as described in Experiment 1 for developing the test plan
for an inventory control system.
Follow the same algorithm as described in Experiment 2 for designing test cases for an
inventory control system.

TEST PLAN:

Process: Test Plan

TC001 - Scope of Testing
Steps: 1. Review the requirements and project documentation. 2. Identify the modules to be tested. 3. Determine the out-of-scope items.
Description: Verify the scope of testing.
Expected Result: The test plan includes all essential features.
Status: Done

TC002 - Test Objectives
Steps: 1. Review the requirements and project documentation. 2. Discuss with stakeholders to understand expectations.
Description: Verify the test objectives.
Expected Result: The test objectives are clearly defined.
Status: Done

TC003 - Test Environment
Steps: 1. Identify the hardware and software requirements. 2. Set up the required hardware and software.
Description: Verify the required environments.
Expected Result: The test environment is defined.
Status: Not Started

TC004 - Test Deliverables
Steps: 1. Determine the documents and artifacts to be produced. 2. Create templates for test reports, defect logs, etc.
Description: Verify the required deliverables.
Expected Result: All necessary documents are listed.
Status: Not Started

TC005 - Test Strategy
Steps: 1. Decide on the testing approach and techniques. 2. Determine the entry and exit criteria.
Description: Verify the overall approach for testing.
Expected Result: The test strategy is defined.
Status: Not Started

TC006 - Test Scope and Schedule
Steps: 1. Define the timeline for each testing phase. 2. Determine the scope of testing for each phase.
Description: Verify the schedule for testing.
Expected Result: The schedule is established.
Status: Not Started

TC007 - Risk Analysis
Steps: 1. Identify potential risks in the testing process. 2. Discuss risk mitigation strategies with the team.
Description: Verify risk analysis and mitigation strategies.
Expected Result: Potential risks are identified with mitigation plans.
Status: Not Started

TC008 - Resource Planning
Steps: 1. Allocate the required resources for testing. 2. Determine the roles and responsibilities of the team.
Description: Verify the availability of resources.
Expected Result: Resources needed for testing are allocated.
Status: Not Started

TEST CASE DESIGN:

Process: Test Case Design

TC001 - Module A - Functionality Test
Steps: 1. Review the requirements related to Module A. 2. Identify test scenarios for Module A. 3. Develop detailed test cases for Module A.
Description: Verify the functionality of Module A.
Expected Result: All functionalities of Module A are tested.
Status: Not Started

TC002 - Module B - Integration Test
Steps: 1. Review the requirements related to Module B. 2. Identify integration points with other modules. 3. Design test cases for testing integration scenarios.
Description: Verify the integration of Module B with other modules.
Expected Result: Module B is successfully integrated.
Status: Not Started

TC003 - Module C - Performance Test
Steps: 1. Review the performance requirements for Module C. 2. Determine performance metrics to be measured. 3. Develop performance test cases for Module C.
Description: Verify the performance of Module C.
Expected Result: Module C performs optimally under load.
Status: Not Started

TC004 - Module D - Usability Test
Steps: 1. Review the usability requirements for Module D. 2. Identify usability aspects to be tested. 3. Create test cases for evaluating Module D's usability.
Description: Verify the usability of Module D.
Expected Result: Module D is user-friendly and intuitive.
Status: Not Started

TC005 - Module E - Security Test
Steps: 1. Review the security requirements for Module E. 2. Identify potential security vulnerabilities. 3. Design security test cases to assess Module E.
Description: Verify the security of Module E.
Expected Result: Module E is protected against security threats.
Status: Not Started

EXPLANATION:
An inventory control system is critical for managing stock and supplies. Proper testing
ensures the system functions accurately and efficiently.

RESULT:
Upon completion of the experiment, you will have a well-structured test plan and a set of
test cases ready for testing the Inventory Control System.
EX. NO : 05 Execute the Test Cases against a Client-Server or Desktop Application and Identify the Defects
DATE :

AIM :
The aim of this experiment is to execute the test cases against a client-server or desktop
application and identify defects.

ALGORITHM:
1. Test Environment Setup: Set up the testing environment, including the client-server or
desktop application, required hardware, and test data.

2. Test Case Execution: Execute the test cases designed in Experiment 2 against the application.

3. Defect Identification: During test execution, record any discrepancies or issues encountered.

4. Defect Reporting: Log the identified defects with detailed information, including
steps to reproduce, severity, and priority.

5. Defect Tracking: Track the progress of defect resolution and verify fixes as they
are implemented.

6. Retesting: After defect fixes, retest the affected areas to ensure the issues are resolved.

7. Regression Testing: Conduct regression testing to ensure new changes do not introduce
new defects.

TEST CASE TABLE:

Process: Test Case Execution

TC001 - User Login
Steps: 1. Launch the application. 2. Enter valid login credentials. 3. Click on the "Login" button.
Description: Verify user login process.
Expected Result: User can successfully log in.
Status: Not Started

TC002 - Data Validation
Steps: 1. Access a data input form. 2. Enter invalid data in the form fields. 3. Submit the form.
Description: Verify data validation on the form.
Expected Result: Invalid data shows appropriate error messages.
Status: Not Started

TC003 - File Upload
Steps: 1. Access the file upload feature. 2. Select a file from the system. 3. Click on the "Upload" button.
Description: Verify file upload functionality.
Expected Result: File is uploaded successfully.
Status: Not Started

TC004 - Network Connectivity
Steps: 1. Disconnect the network. 2. Attempt to perform an action requiring network access.
Description: Verify the application's response.
Expected Result: Application gracefully handles disconnection.
Status: Not Started

TC005 - Concurrent Users
Steps: 1. Simulate concurrent user sessions. 2. Perform actions simultaneously.
Description: Verify application performance.
Expected Result: Application performs well under load.
Status: Not Started

TC006 - Compatibility
Steps: 1. Test the application on different platforms. 2. Execute tests on various browsers.
Description: Verify cross-platform compatibility.
Expected Result: Application works on all specified platforms.
Status: Not Started

TC007 - Client-Server Communication
Steps: 1. Monitor network traffic between client and server.
Description: Verify communication integrity.
Expected Result: Data is correctly transmitted and received.
Status: Not Started
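
TC005 above calls for simulating concurrent user sessions. As one possible sketch of how this can be scripted (standard-library Java only; the URL and the number of users are placeholder values, not taken from this record), a fixed thread pool can fire simultaneous requests and log the latency of each:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ConcurrentUsersSketch {
    public static void main(String[] args) throws InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        // Placeholder URL: point this at the client-server application under test.
        HttpRequest request = HttpRequest.newBuilder(URI.create("http://localhost:8080/")).GET().build();

        int users = 10; // placeholder: number of simulated concurrent users
        ExecutorService pool = Executors.newFixedThreadPool(users);
        for (int i = 0; i < users; i++) {
            pool.submit(() -> {
                try {
                    long start = System.nanoTime();
                    HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
                    long ms = (System.nanoTime() - start) / 1_000_000;
                    // Each simulated user logs the status code and latency for later analysis.
                    System.out.println("HTTP " + response.statusCode() + " in " + ms + " ms");
                } catch (Exception e) {
                    System.out.println("Request failed: " + e.getMessage());
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}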

Explanation:
Testing a client-server or desktop application ensures its functionality across different platforms
and environments.

Result:
Upon completion of the experiment, you will have a list of identified defects and their status after
resolution for the client-server or desktop application.
EX. NO : 06 Test the Performance of the E-commerce Application
DATE :

Aim:
The aim of this experiment is to test the performance of the e-commerce application
www.amazon.in.

Algorithm:
1. Identify Performance Metrics: Determine the performance metrics to be measured, such as
response time, throughput, and resource utilization.

2. Define Test Scenarios: Create test scenarios that simulate various user interactions and
loads on the application.

3. Performance Test Setup: Set up the performance testing environment with appropriate
hardware and software.

4. Execute Performance Tests: Run the performance tests using the defined scenarios and
collect performance data.

5. Analyze Performance Data: Analyze the collected data to identify any performance
bottlenecks or issues.

6. Performance Tuning: Implement necessary optimizations to improve the
application's performance.

Performance Table:

Process: Performance Testing

TC001 - Response Time for Home Page
Steps: 1. Access the home page of the e-commerce application. 2. Use a performance testing tool to record the time. 3. Analyze the recorded data to determine response time.
Description: Measure the response time.
Expected Result: The home page loads within the specified response time threshold.
Status: Not Started

TC002 - Throughput during Peak Hours
Steps: 1. Simulate peak-hour traffic on the application. 2. Execute performance tests during peak hours. 3. Analyze the data to determine the throughput.
Description: Measure the throughput.
Expected Result: The application can handle peak-hour traffic without significant delays.
Status: Not Started

TC003 - Resource Utilization
Steps: 1. Monitor CPU, memory, and network usage during testing. 2. Execute performance tests while monitoring resources. 3. Analyze the data to assess resource utilization.
Description: Measure resource utilization.
Expected Result: Resource utilization remains within acceptable limits.
Status: Not Started

TC004 - Concurrent Users
Steps: 1. Simulate multiple concurrent users accessing the application. 2. Increase the number of concurrent users gradually. 3. Record the application's behavior with increased load.
Description: Measure application performance under load.
Expected Result: The application remains stable and responsive under load.
Status: Not Started

TC005 - Stress Testing
Steps: 1. Apply the maximum user load the application can handle to find the system's breaking point. 2. Observe the application's response under stress.
Description: Measure system behavior under extreme load.
Expected Result: The system recovers gracefully after stress is removed.
Status: Not Started

TC006 - Performance Tuning
Steps: 1. Identify performance bottlenecks and areas of improvement. 2. Analyze the performance test results. 3. Implement the necessary optimizations.
Description: Improve application performance.
Expected Result: Performance bottlenecks are addressed and the application performs better.
Status: Not Started
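
TC001 above measures home-page response time. In practice a dedicated tool (for example, Apache JMeter) would take this measurement; purely as an illustration of the measurement itself, the sketch below times a single request with Java's standard HTTP client. Note that it times only the initial HTML response, not full page rendering, and the 3-second threshold is an example value, not one specified in this record:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ResponseTimeSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create("https://www.amazon.in")).GET().build();

        long start = System.nanoTime();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        long ms = (System.nanoTime() - start) / 1_000_000;

        // 3000 ms is an example threshold, not a value specified in this record.
        System.out.println("HTTP " + response.statusCode() + ", home page fetched in " + ms + " ms");
        System.out.println(ms <= 3000 ? "Within threshold" : "Exceeds threshold");
    }
}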

Explanation:
Performance testing helps to identify bottlenecks in the e-commerce application, ensuring it can
handle real-world user loads effectively.

Result:
Upon completion of the experiment, you will have performance test results and any optimizations
made to improve the application's performance.
EX. NO : 07 Automate the testing of e-commerce applications using Selenium
DATE :

Aim:
The aim of this task is to automate the testing of an e-commerce web application
(www.amazon.in) using Selenium WebDriver, which will help improve testing efficiency and
reliability.

Algorithm:
1. Set up the environment:
- Install Java Development Kit (JDK) and configure the Java environment variables.
- Install an Integrated Development Environment (IDE) like Eclipse or IntelliJ.
- Download Selenium WebDriver and the required web drivers for the browsers you
intend to test (e.g., ChromeDriver, GeckoDriver for Firefox).

2. Create a new Java project in the IDE:
- Set up a new Java project in the IDE and include the Selenium WebDriver library.

3. Develop test cases:
- Identify the key functionalities and scenarios to test in the e-commerce application.
- Design test cases covering various aspects like login, search, product details, add to
cart, checkout, etc.

4. Implement Selenium automation scripts:
- Write Java code using Selenium WebDriver to automate the identified test cases.
- Utilize different Selenium commands to interact with the web elements, navigate
through pages, and perform various actions.

5. Execute the automated test cases:
- Run the automated test scripts against the e-commerce application.
- Observe the test execution and identify any failures or defects.

6. Analyze the test results:
- Review the test execution results to identify any failed test cases.
- Debug and fix any issues with the automation scripts if necessary.

7. Report defects:
- Document any defects found during the automated testing process.
- Provide detailed information about each defect, including steps to reproduce and
expected results.

Program:

package program;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class selenium {
    public static void main(String[] args) {
        // Point Selenium at the local ChromeDriver binary.
        System.setProperty("webdriver.chrome.driver",
                "C:\\Users\\Admin\\Downloads\\chromedriver-win64\\chromedriver-win64\\chromedriver.exe");
        WebDriver d = new ChromeDriver();

        // Open Amazon and drive the sign-in flow.
        d.get("https://www.amazon.in");
        d.findElement(By.xpath("//*[@id=\"nav-link-accountList\"]/span/span")).click();
        d.findElement(By.id("ap_email")).sendKeys("youremail@gmail.com");
        d.findElement(By.xpath("//*[@id=\"continue\"]")).click();
        d.findElement(By.id("ap_password")).sendKeys("your password");
        d.findElement(By.xpath("//*[@id=\"signInSubmit\"]")).click();

        // A successful login redirects back to the home page with this URL.
        String u = d.getCurrentUrl();
        if (u.equals("https://www.amazon.in/?ref_=nav_ya_signin")) {
            System.out.println("Test Case Passed");
        } else {
            System.out.println("Test Case Failed");
        }
        d.close();
    }
}
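
When the script runs, Chrome is launched, the sign-in flow is performed automatically, and the console prints either "Test Case Passed" or "Test Case Failed" depending on whether the post-login URL matches the expected value. The e-mail address and password in the script are placeholders and must be replaced with real credentials before execution.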

Automation Process:
Console output:

Result:
The successful completion of this task will yield:
- Automated test scripts for the e-commerce application using Selenium WebDriver.
- Identification of defects, if any, in the application.
EX. NO : 08 Integrate TestNG with the above test automation
DATE :

Aim:
The aim of this task is to integrate TestNG with the existing Selenium automation scripts for the
e-commerce application, enhancing test management, parallel execution, and reporting
capabilities.

Algorithm:
1. Set up TestNG in the project:
- Add the TestNG library to the existing Java project.

2. Organize test cases using TestNG annotations:
- Add TestNG annotations (@Test, @BeforeTest, @AfterTest, etc.) to the existing test cases.
- Group similar test cases using TestNG's grouping mechanism.

3. Implement data-driven testing (optional):
- Utilize TestNG's data providers to implement data-driven testing if required (see the
sketch after this algorithm).

4. Configure the TestNG test suite:
- Create an XML configuration file for TestNG to define test suites, test groups, and
other configurations.

5. Execute the automated test cases using TestNG:
- Run the automated test suite using TestNG.
- Observe the test execution and identify any failures or defects.

6. Analyze the test results:
- Review the TestNG-generated test reports to identify any failed test cases.
- Utilize TestNG's reporting capabilities to understand the test execution status.

7. Report defects (if any):
- Document any defects found during the automated testing process.
- Provide detailed information about each defect, including steps to reproduce and
expected results.
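
Step 3 above mentions TestNG data providers. The following is a minimal, self-contained sketch of the idea (the class, method, and keyword values are illustrative and are not part of Program1.java below): a @DataProvider supplies rows of arguments, and TestNG invokes the @Test method once per row.

import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class DataDrivenSketch {

    // Each row becomes one invocation of the test method below.
    @DataProvider(name = "searchKeywords")
    public Object[][] searchKeywords() {
        return new Object[][] {
            { "laptop" },
            { "mobile" },
            { "headphones" },
        };
    }

    // TestNG runs this once per data row; in a real suite the keyword would
    // drive a Selenium search against the application under test.
    @Test(dataProvider = "searchKeywords")
    public void searchKeywordIsValid(String keyword) {
        Assert.assertNotNull(keyword);
        Assert.assertFalse(keyword.trim().isEmpty());
    }
}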

Program Code (Program1.java) :

package mytest;

import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

public class Program1 {

    WebDriver driver;

    @BeforeMethod
    public void setUp() {
        // Point Selenium at the local ChromeDriver binary and open the site.
        System.setProperty("webdriver.chrome.driver",
                "C:\\selenium\\chromedriver_win32\\chromedriver.exe");
        driver = new ChromeDriver();
        driver.get("https://amazon.in");
        driver.manage().window().maximize();
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(5));
    }

    @Test
    public void verifyTitle() {
        // The page title should match Amazon.in's home-page title exactly.
        String actualTitle = driver.getTitle();
        String expectedTitle = "Online Shopping site in India: Shop Online for Mobiles, Books, Watches, Shoes and More - Amazon.in";
        Assert.assertEquals(actualTitle, expectedTitle);
    }

    @Test
    public void verifyLogo() {
        // The Amazon logo should be visible on the home page.
        boolean flag = driver.findElement(By.xpath("//a[@id='nav-logo-sprites']")).isDisplayed();
        Assert.assertTrue(flag);
    }

    @AfterMethod
    public void tearDown() {
        driver.quit();
    }
}

Program Code (pom.xml) :

<project xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>MiniProject2</groupId>
  <artifactId>MiniProject2</artifactId>
  <version>0.0.1-SNAPSHOT</version>

  <dependencies>
    <!-- https://mvnrepository.com/artifact/org.seleniumhq.selenium/selenium-java -->
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-java</artifactId>
      <version>4.3.0</version>
    </dependency>
    <!-- TestNG is needed for the org.testng imports in Program1.java -->
    <dependency>
      <groupId>org.testng</groupId>
      <artifactId>testng</artifactId>
      <version>7.4.0</version>
    </dependency>
  </dependencies>

  <build>
    <sourceDirectory>src</sourceDirectory>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.8.1</version>
        <configuration>
          <release>16</release>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>

Program Code (testng.xml) :


<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Suite">
<test name="Test">
<classes>
<class name="mytest.Program1"></class>
</classes>
</test> <!-- Test -->
</suite> <!-- Suite -->
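
With TestNG on the classpath, this suite can be executed from the IDE (for example, via the TestNG plug-in for Eclipse using Run As > TestNG Suite on testng.xml); after a run, TestNG writes its HTML report to the test-output folder by default.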

Output:

Result:
The successful completion of this task will yield:
- Integration of TestNG with the existing Selenium automation scripts.
- Enhanced test management and reporting capabilities.
- Identification of defects, if any, in the application and improved efficiency in handling
test scenarios.
