TEST MANAGEMENT

Learning Objectives: test planning, the contents of a test plan, test approach and strategy, and test reporting.

Test Planning:
Test planning describes the approach followed by the organization, the responsibilities, the risks, and the resources for testing. It identifies the test cases and the resources needed; both depend on the type of software being tested and the staff available. The output of test planning is the test plan document.

Steps generally followed to develop a test plan:
1. Set objectives of the test plan: Every test plan is prepared for a purpose. Before developing the plan, its objectives/goals must be set, and these depend on the objectives of the software itself. For example, if the objective of the software is to meet user requirements, then a test plan is generated to check that the user requirements are met.
2. Develop a test matrix: A test matrix indicates the software components to be tested and the tests required to check them. It provides proof that a test exists for every component, and it also indicates the degree of testing applied to the software.
3. Develop test administration: The administrative part of the test plan specifies the time schedule and the resources, so that software testing can begin as soon as possible.

Test Plan:
A test plan can be defined as a strategic document that describes the intended software testing activities, so that testing of an application can be carried out in an effective way. Test plans change during the development process: when requirements change as part of change management, the test plan also changes. Test plans are not static; because of delays at other stages, the system as a whole may not be testable as originally scheduled, and the plan has to be revised when the software is available. Like other development activities, test planning is incremental. A test plan typically also addresses unit, integration and system tests, the hardware and software requirements, and the constraints affecting testing.
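The test matrix described in step 2 can be illustrated with a small sketch. The component and test-case names below are hypothetical; the point is the "test proof" check that every component has at least one test:

```python
# Minimal sketch of a test matrix: maps each software component to the
# tests that exercise it, and flags components with no test coverage.

test_matrix = {
    "login":        ["TC01_valid_login", "TC02_invalid_password"],
    "user_profile": ["TC03_update_profile"],
    "reports":      [],  # no test yet -> flagged by the check below
}

def uncovered_components(matrix):
    """Return the components for which no 'test proof' exists."""
    return [component for component, tests in matrix.items() if not tests]

print(uncovered_components(test_matrix))  # -> ['reports']
```

A matrix like this makes the planning gap visible before execution begins: any component returned by the check still needs a test case written for it.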
A test plan, following the widely used IEEE 829 template, contains sections such as:
1. Test Plan Identifier: a unique identifier for the test plan.
2. References: referenced documents, such as the versioned release of the software, development and test process plans, methodology guidelines, and corporate standards.
3. Introduction: a brief introduction about the project and the document.
4. Test Items: a test item is a software item that is the object of testing (the application under test).
5. Software Risk Issues: the software project risks that affect testing.
6. Features to be Tested: the features that need to be tested.
7. Features not to be Tested: identify the features excluded from testing, with the reason.
8. Approach: details about the overall approach to testing.
9. Item Pass/Fail Criteria: documented criteria for deciding whether a test item has passed or failed.
10. Suspension Criteria and Resumption Requirements: when testing is suspended and what must hold before it resumes.
11. Test Deliverables: the deliverables of the testing process, such as test plans, test specifications and test reports.
12. Remaining Test Tasks: all remaining tasks related to testing.
13. Planning Risks and Contingencies: control activities and metrics to answer how much of the plan is at risk and what to do about it.

Identifying Features to be Tested:
The features to be tested are prioritized using factors such as the following:
1. Features that are new and critical: customer expectations of a new program put these features high on the list. The plan must ensure that such features get enough testing, and the product marketing team helps decide what is to be tested.
2. Features whose failures can be disastrous: these have to be high on the list for the release.
3. Features that are expected to be complex: identify them early so that appropriate resources can be lined up in time.
4. Features which are extensions of earlier features that had defects: make sure old defects do not creep in again, and test such features against the stable earlier features.

Deciding Test Approach / Strategy:
Once we have a prioritized features list, the next step is the test approach/strategy, which results in identifying the right types of tests and an estimation of size, effort, and schedule. This includes identifying the tools to be used:
1. Are any special tools to be used?
2. Will the tool require special training before it can be used, and what are the training needs?

The test reporting artifacts produced during and after execution include the test progress report, the test incident report, and the test summary report.
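The feature-prioritization factors above can be sketched as a simple risk-weighted scoring scheme. The weights and feature names below are illustrative assumptions, not values from the text:

```python
# Hypothetical sketch: rank features for testing by the risk factors named
# in the text (new/critical, disastrous on failure, complex, historically
# defect-prone). The weights are illustrative assumptions.

def priority_score(feature):
    weights = {
        "new_and_critical": 4,
        "disastrous_failure": 4,
        "complex": 2,
        "defect_prone_history": 2,
    }
    return sum(w for key, w in weights.items() if feature.get(key))

features = [
    {"name": "payment",   "new_and_critical": True, "disastrous_failure": True},
    {"name": "help_page"},
    {"name": "search_v2", "complex": True, "defect_prone_history": True},
]

# Highest-risk features first: this is the order in which to plan testing.
ranked = sorted(features, key=priority_score, reverse=True)
print([f["name"] for f in ranked])  # -> ['payment', 'search_v2', 'help_page']
```

Even a crude score like this makes the prioritization discussion concrete: resources and schedule are lined up for the top of the list first.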
Standards in Testing:
Testing requires compliance to standards.
1. External standards: standards imposed from outside the organization, such as customer standards (defined by the contract with the customer) and international/industry standards.
2. Internal standards: standards developed within the organization, such as test planning standards, test coding standards, and test reporting standards.

Defect Repository:
A defect repository captures all the defects found within a software project, so that they can be classified, tracked, and analyzed.

Service Level Agreement (SLA) for Testing:
An SLA helps establish a shared vision between the development and testing teams: a consistent definition of quality, the communication mechanisms between the teams, and, ultimately, a way of weighing the issues faced by each side.

Test Incident Report: a report raised when an incident (an unexpected result) occurs while running a test cycle.

Test Summary Report: a report produced at the end of each test cycle summarizing:
(a) Defects found,
(b) Progress made,
(c) Outstanding issues, and
(d) Any variations observed.
There are two types of test summary report: the phase-wise test summary and the final test summary. The final test summary report, also called the release report, summarizes the testing done before the product release.

DEFECT MANAGEMENT

INTRODUCTION
Quality is defined as meeting the customer's requirements. A defect is found when something has not worked as planned. Root causes of defects include:
1. Requirements are not clear.
2. Designs are wrong.
3. People are not trained for the work.
4. Processes are imperfect.
A defect is a product anomaly measured against the business requirements, and every defect has its own life cycle.

Software engineering is a systematic, well-defined approach that uses methods, practices and standards. The three main phases of the software development life cycle in this context are the requirements, design and implementation phases, with a deliverable at each phase. Errors made by developers result in faults in a computer program; the practice of finding and removing such errors is known as debugging, and various methods and techniques are adopted for preventing them.

What is Defect Management?
Defect management is the process of recognizing, investigating, taking action on and disposing of defects. It involves recording defects, classifying them and identifying their impact.
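Recording and classifying defects, as just defined, might be sketched as follows. The category and severity vocabularies and the field names are illustrative assumptions for this example:

```python
# Illustrative sketch of recording a defect with an enforced classification.
# Category names follow a common defect-classification scheme; all names
# here are assumptions for illustration, not a standard.

from dataclasses import dataclass

CATEGORIES = {"algorithmic", "module_interface", "user_interface",
              "system_interface", "coding"}

@dataclass
class Defect:
    defect_id: str
    summary: str
    category: str
    severity: str = "medium"

    def __post_init__(self):
        # Reject records that cannot be classified under the scheme.
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")

d = Defect("D-101", "Wrong loop bound in report total", "algorithmic")
print(d.category)  # -> algorithmic
```

Validating the category at record time is what makes the repository analyzable later: every defect lands in exactly one bucket of the classification scheme.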
The defect management approach includes counting, analyzing and managing defects.

Need for Defect Management:
All stages of software development require defect mitigation. Preventing the migration of defects from the requirements phase into the implementation phase finds them sooner and at lower cost. Managing defects also protects the most important attributes of software quality: maintainability, efficiency and portability.

Defect Classification:
Defects may be a result of wrong implementation of requirements, of implementing something extra than required, or of missing something required by the customer. Defects may be classified in different ways under different schemes (Fig. 4.1: Classification of Defects):
(i) Algorithmic defects: the logic or algorithm of a module is wrong.
(ii) Module interface defects: a module interacts incorrectly with other modules.
(iii) User interface defects: defects in how the user interacts with the application.
(iv) System interface defects: defects arising from environmental factors and interfaces with other systems or hardware.
(v) Coding defects: defects introduced by violating development/coding standards when designs are implemented. Some coding defects are:
(a) Variable declaration/initialization defects: variables are not declared or initialized, or are declared with a wrong data type.
(b) Commenting/documentation defects: missing or wrong comments make the code less readable and maintainable in future.
(c) Database-related defects: for example, a query that is not optimised, or a database design that is not implemented correctly in the application.
(d) Testing defects: wrong test data or scenarios applied during testing, leading to wrong testing.

Principles of Defect Management Process:
The following general principles apply:
1. The primary goal is to prevent defects. Where this is not possible, the aim is to find them early and minimize their impact.
2. The defect management process should be risk driven, i.e., strategies and priorities should be based on risk.
3. The defect management process should be integrated into the software development process.
4. As much as possible, the capture and analysis of the information should be automated.
5. Defect information should be used to improve the process.
6. Most defects are caused by imperfect or faulty processes.

Steps in Defect Management Process:
Fig. 4.2 shows the following steps in the defect management process:
1. Defect Prevention: implementation of techniques, methodology and standard processes to reduce the risk of defects.
2. Deliverable Baseline: establishment of milestones where deliverables are considered complete and ready for further development work.
When a deliverable is baselined, any further changes are controlled; errors in a deliverable are not considered defects until after the deliverable is baselined.
3. Defect Discovery: a defect is only termed "discovered" when it has been documented and acknowledged by the team responsible for the component.
4. Defect Resolution: the developers prioritize, schedule, fix and document the resolution of the defect.
5. Process Improvement: the defect data is analyzed to identify and improve the faulty processes that let the defect in.
6. Management Reporting: defect information is analyzed and reported to assist management decisions.

Identify Critical Risks: it is necessary to identify the critical risks facing the project, such as:
(i) Missing a key requirement;
(ii) Critical application software that does not function properly;
(iii) Vendor-supplied software with limitations or problems;
(iv) Performance that is not acceptable.

Management Reporting: a defect management process serves a number of purposes. Operational metrics help project management make decisions (for example, the need for more testing), and summary metrics keep senior management informed.

DEFECT/BUG LIFE CYCLE AND DEFECT TEMPLATE

A bug moves through a defined set of states during its life cycle, such as:
1. New: the defect has just been reported.
2. Open: the defect has been acknowledged and assigned.
3. Fixed: the developer has made the code change.
4. Review: the fix is verified by the tester.
5. Closed: the fix is confirmed and the defect is closed.
6. Deferred: the defect will be taken up in a future release.

The severity of a defect is classified as:
(i) Critical,
(ii) High,
(iii) Medium, and
(iv) Low.
A priority is also assigned to each defect, to decide the order in which defects are fixed.
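The life-cycle states above can be modelled as a small transition table. The exact set of allowed transitions below is an assumption based on the states named in the text:

```python
# Minimal sketch of the bug life cycle as a state machine. The transition
# set is an illustrative assumption built from the states named above.

TRANSITIONS = {
    "new":      {"open", "deferred"},
    "open":     {"fixed", "deferred"},
    "fixed":    {"review"},
    "review":   {"closed", "open"},   # reopened if the fix fails review
    "deferred": {"open"},
    "closed":   set(),                # terminal state
}

def move(state, target):
    """Advance a defect to a new state, rejecting illegal jumps."""
    if target not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {target}")
    return target

state = "new"
for nxt in ("open", "fixed", "review", "closed"):
    state = move(state, nxt)
print(state)  # -> closed
```

Encoding the life cycle this way is how defect trackers prevent, for example, a defect jumping straight from "new" to "closed" without a fix and a review.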
TECHNIQUES FOR FINDING DEFECTS, ESTIMATING THE EXPECTED IMPACT OF DEFECTS, AND REPORTING A DEFECT

1. Ways to Handle Risk:
In the case of a predicted risk, management, with the help of stakeholders, may respond in the following ways:
(a) Accept the Risk: some risks do not have any solution (like natural disasters), or solutions exist but it is not feasible to implement them. These risks may be accepted by the organization as they are; such decisions are generally taken by senior management. The organization may prepare a fallback arrangement, but may not define ways of minimising or eliminating the risk; the team is psychologically prepared to accept the risk, so that the impact on the organisation/project/user can be reduced to some extent. Listing the known defects in the product indicates acceptance of risk by the management: customers/users are given information about probable failures and the effects of such failures.
(b) Avoid/Bypass the Risk: if a situation is very risky to the users or customer, one may avoid the risk by avoiding the particular approach. Bypassing is required when the probability or impact arising due to realisation of the risk cannot be accepted by the user and no corrective action can be taken.
2. Minimise Risk Impact: risk is a product of probability, impact and detection ability, so risk minimisation has three different dimensions of handling: reducing its probability, reducing its impact, and improving the ability to detect it.

Techniques for Finding Defects:
1. Static Techniques: examining the software (documents, designs, code) without executing it, e.g., reviews.
2. Dynamic Techniques: testing in which the software is executed with test data.
3. Operational Techniques: defects found while the software is in live operation (largely beyond the scope of this discussion).
Both static and dynamic techniques are required for an effective defect management process aimed at finding defects.

Reporting a Defect:
Reporting a defect is an important step for an organisation: reporting helps establish the capability of a process, and corrections can be made only if problems are found and reported. Once discovered, defects must be reported. Reporting a defect serves to:
1. Correct the defect.
2. Report the status of the system/application.
3. Gather statistics to predict failure patterns.
4. Improve the process: testing is not only about fixing defects, and fixing defects alone cannot improve the productivity of the team. An organisation must stress quality improvement of the process itself, so that additional defects are not introduced into the system.
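A defect report that serves the purposes above might carry fields like the following. This is a hypothetical, minimal structure for illustration, not a standard template:

```python
# Hypothetical, minimal defect-report structure: enough information to
# reproduce the defect and act on it. All field names and values are
# illustrative assumptions.

defect_report = {
    "id": "DR-2041",
    "summary": "Invoice total wrong for discounts over 50%",
    "steps_to_reproduce": [
        "Create an invoice with one line item of 100.00",
        "Apply a 60% discount",
        "Open the invoice summary page",
    ],
    "expected_result": "Total shown as 40.00",
    "actual_result": "Total shown as 60.00",
    "severity": "high",
    "priority": "P1",
    "environment": "build 1.4.2, Windows 10, Chrome 126",
}

# Sanity-check: a report must give enough detail to reproduce the defect.
required = {"summary", "steps_to_reproduce", "expected_result", "actual_result"}
assert required <= defect_report.keys()
print("report is complete enough to act on")
```

The expected-vs-actual pair plus exact reproduction steps is what lets a developer fix the defect without a round trip back to the tester.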
A good defect report should:
1. Give sufficient and high-quality information to reproduce and fix the defect.
2. Be well written and simple to understand.
3. Enable stakeholders to make wise decisions.

Review Questions:
1. Describe the need for defect management.
2. Explain defect classification in detail.
3. With the help of a diagram, describe the defect management process.
4. Enlist the principles of the defect management process.

TESTING TOOLS AND MEASUREMENTS

Automation testing tests an application or software by taking the help of automation tools: test scripts are written and executed by a tool to run the application, so that failures can be found without a human stepping through every screen.

MANUAL TESTING

Manual testing is the oldest and most rigorous type of software testing. It is the process of testing software by a human sitting in front of a computer, carefully going through the application screens, trying various usage and input combinations, comparing the results to the expected behavior, and recording the observations.

Process of Manual Testing:
Manual testing follows a written test plan with a set of important test cases, describing the conditions under which the software or application is checked. For each requirement there may be one or more test cases, and the requirement is met when all of its test cases pass.

Advantages of Manual Testing:
1. It can be done without any tool.
2. It requires less time and expense to begin.
3. Test cases can easily be reduced, added or changed.
4. It is easy to learn for new people.
5. Manual test cases may later be automated.

Limitations of Manual Testing:
1. Manual testing is slow and costly: because it is labor-intensive, it takes a long time to complete tests, and increasing headcount increases cost.
2. Manual tests do not scale well: as the complexity of the software grows, the testing problem grows exponentially, and with it the testing time and total cost.
3. Manual testing is not consistent or repeatable: one tester may approach an application differently from another, producing different results for tests that should be performed identically.
4. Lack of training is a common problem, so testers may not exercise the software in the same way.
5. Testing is difficult to manage: there are more unknowns and variables in testing than in code development.
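The limitations above are what automation addresses. A minimal example of what an automated test looks like in practice, using Python's built-in unittest as a stand-in for the commercial tools discussed in this chapter — the function under test is a hypothetical illustration:

```python
# Minimal automated test script. unittest is used here as a stand-in for
# tools like Selenium or QTP; apply_discount is a hypothetical function
# under test, invented for this illustration.

import unittest

def apply_discount(price, percent):
    """Hypothetical function under test."""
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    def test_ten_percent(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_zero_percent(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

# Run the suite and produce a result report (the "create result reports"
# step of an automation workflow):
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTests)
result = unittest.TextTestRunner(verbosity=2).run(suite)
print("all passed:", result.wasSuccessful())
```

Unlike a manual pass through the screens, this script runs identically every time and can be re-executed after every fix at essentially no cost — which is exactly the regression scenario discussed below.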
AUTOMATION TESTING

Modern software development requires structure in testing; if you do not have sufficient structure, an automation framework provides it — the guidelines, tools and practices within which suitable test scripts are created and executed.

Steps in automation testing:
1. Selection of the appropriate tool for the application.
2. Writing of test scripts.
3. Development of test suites.
4. Execution of the scripts.
5. Creation of result reports.
6. Identification of any potential bugs or performance issues.

Software Testing Tools:
Following are tools which can be used for automation testing: HP QuickTest Professional, Selenium, IBM Rational Functional Tester, SilkTest, TestComplete, Testing Anywhere, WinRunner, and LoadRunner.

Automated Testing:
1. Programmable: you can program sophisticated tests that bring out hidden information from the application.
2. Fast: automated tools run tests significantly faster than human users.
3. Cost Reduction: the number of resources required for regression testing is reduced.

Disadvantages of Automation Testing:
1. Skill is required to write the automation test scripts.
2. Debugging the test script is a major issue: any error present in the script may lead to misleading results.
3. Test maintenance is costly in the case of playback methods: even though only a minor change occurs in the GUI, the test script has to be re-recorded or replaced by a new test script.
4. Maintenance of test files is difficult if the test script covers many screens.

Comparison between Manual Testing and Automation Testing:
1. Time: running tests manually can be very time consuming; automation runs the same tests significantly faster.
2. Reliability: manually repeating the same test cases frequently invites human error; automated tests perform the same operations identically each time.
3. Programming knowledge: manual testing needs no programming; automation involves a programming task to write and maintain the test scripts.
4. Platform coverage: running the same tests on different machine and platform combinations is not practical manually; with automation the same tests can be executed across different machines and platform combinations.
5. Human judgement: a human observer is still very much needed for UI and exploratory testing, where manual testing catches defects that scripted checks miss.

Advantages of Automation:
1. Automation avoids human mistakes.
2. It reduces the overall cost of software development and opens up opportunities to test more.

NEED OF AUTOMATED TESTING TOOLS

A test tool is a vehicle for running the testing process. In software testing, test automation is the use of special software (separate from the software being tested) to control the execution of tests and the comparison of actual outcomes with predicted outcomes. Test automation can automate repetitive but necessary tasks in a formalized testing process already in place: it is basically performing the tests through a programmed, automated method, and testing tools are computerized aids for these tasks.

Consider a common software development cycle: code is built, bugs are fixed, and before the software is released you are testing not just the particular fixed feature — you need to run the related tests not once, but dozens of times. You will check that the bugs were indeed fixed and that no new bugs were introduced into features that passed in previous test runs. Even a small software project can have several thousand test cases; running them once might be barely feasible, and running them numerous times manually might be impossible. Software test tools and automation can help solve this problem by providing a more efficient way to run your tests than manual testing.

The principal attributes of tools and automation are:
1. Speed: think about how long it would take you to manually try a few thousand input combinations. A human might enter one value every five seconds or so; automation may be able to run 10, 100, even 1,000 times that fast.
2. Efficiency: while a testing tool is running test cases, the tester is freed for other tasks.
3. Accuracy and Precision: a tool performs a certain test exactly the same way every time, greatly reducing error.
4. Resource Reduction: sometimes it is physically impossible or too expensive to create the real-world conditions necessary to perform a test; a tool can reduce the resources required.
5. Simulation and Emulation: a tool or application can replace hardware or software that would otherwise be needed, and can then be used to drive your software in ways that might otherwise be difficult.
6. Relentlessness: tools and automation never tire or give up.

ADVANTAGES AND DISADVANTAGES OF USING TOOLS

Advantages of using Tools:
In short, tools work faster than humans:
1. Fast: tools run tests significantly faster than human users.
2. Repeatable: testers can test how the software reacts after repeated execution of the same operations.
3. Reusable: tests can be re-used on different versions of the software.
4. Reliable: tests perform precisely the same operations each time they are run, thereby eliminating human error.
5. Comprehensive: testers can build suites of tests that cover every feature in the software application.
6. Programmable: testers can program sophisticated tests that bring out hidden information.
7. Consistent: tools do not get tired and are not prone to fatigue.

Disadvantages of using Tools:
Although there are many benefits that can be achieved by using tools to support testing activities, there are also many risks that are associated with tool support when it is introduced and used. Risks include:
1. Unrealistic expectations from the tool: unrealistic expectations may be one of the greatest risks to success with tools. It is very important to have clear and realistic objectives for what the tool can do.
2. Underestimating the time, cost and effort for the introduction of the tool: once you purchase a tool, you want a number of benefits immediately, but there will be technical issues to overcome and people-related issues to handle before the tool can deliver; the way to great success with a tool begins with an unglamorous initial phase.
3. Miscalculating the time and effort needed to achieve significant and continuing benefits from the tool: mostly in the initial phase, when the tool is new to the people, they miscalculate the time and effort needed. Just think back to the last time you did anything new for the very first time: it takes time to develop ways of using the tool in order to achieve what is expected.
4. Developing a tool in-house versus buying one: an in-house tool needs more planning and effort than tools sold by vendors, and in either case people must be properly trained; training usually covers the scripting languages and customizing the tool to the project.

For all the above strong reasons, adequate attention must be paid to tool selection, and the requirements form the basis for it.

Criteria for Selecting Test Tools:
The categories of criteria for selecting an appropriate testing tool are:
1. Meeting requirements,
2. Technology expectations,
3. Training/skills, and
4. Management aspects.

1. Meeting Requirements: firstly, there are plenty of tools available in the market, but rarely do they meet all the requirements of a given product or a given organization. Secondly, test tools are usually one generation behind and may not provide backward compatibility with the product under test. Thirdly, a test tool may not have gone through the same amount of testing rigor as the product itself, so using some of the tools could introduce problems of their own.

METRICS AND MEASUREMENTS

Metrics enable a product to be tracked and its progress monitored effectively. The parameters measured may relate to the product or to the process, and the data measured is presented in an analyzable form to enable decisions on improving both. Some basic measurements:
1. Effort: the actual time that is spent on an activity or a phase.
2. Elapsed days: the difference between the start and the completion of an activity.
3. Productivity: productivity numbers help estimate the number of days required to complete a set of activities; the elapsed days for a set of activities become the schedule.

Collecting and analyzing metrics involves effort and several steps, as depicted in Fig. 5.6: identifying what to measure, defining and collecting the data, tracking and disseminating it, analyzing it to point out areas of improvement in product quality and process, and finally acting on the findings while highlighting and sustaining what already works well.

For predicting the completion of defect fixing, two data points are needed: the number of defects fixed per elapsed day, calculated from a measure of defects fixed over time, and the number of outstanding defects waiting to be fixed. Predicting the number of defects that can be fixed in a period of time gives the defect-fixing capability of the team, and from this the release date can be estimated. Testing and defect fixing can be executed simultaneously, and defect fixes may arrive after the regular test cycles are completed; these fixes must be verified before arriving at the release date for the product.

The metrics are not only used for the release decision but also for in-process correction and preventive activities. Metrics and their analysis help in preventing defects rather than merely finding them: if there is a type of defect (say, coding defects) that keeps recurring, a formal code review can be introduced to prevent those defects rather than find and fix them downstream. Metrics also help management to identify the right resourcing for product development and testing, and help in identifying defect density (formally defined later); across-module analysis of defects clarifies the scope of the product, and metrics help in making these decisions in time. As for the release decision: the product should be released only with known quality. The idea of metrics is to know the quality of the product and to ascertain whether it is being released with that known quality, based on what customers need.

PRODUCT AND PROJECT METRICS

The product metrics are numbers that can be collected and used for planning and tracking testing activities; they help in planning and estimating testing. Fig. 5.7 shows the types of metrics.

Project Metrics:
Project/product metrics are mainly related to the measures of the software project. They enable us to:
1. Minimize the development schedule by making the adjustments necessary to avoid delays and mitigate potential problems and risks.
2. Assess product quality on an ongoing basis and modify the technical approach as needed.
They are also used in estimation techniques: once the project scope is finalized, it is translated to size; the size estimate gets translated into effort using the available productivity data; effort is allocated to each of the phases; and the base-lined effort estimate is used to respond fast to changing requirements.

Effort Variance (Planned vs Actual): the deviation of actual effort from planned effort; a negative variance is an indication of an over-estimate.

Schedule Variance (Planned vs Actual): software projects are concerned not only about the variance in effort but also about meeting schedules, which leads us to the schedule variance metric. Schedule variance is the deviation of the actual schedule from the estimated schedule, and it is calculated at the end of every milestone to find out how the project is doing with respect to the schedule. To get a real picture of the schedule in the middle of project execution, it is important to calculate the "remaining days yet to be spent" on the project and compare it with the planned schedule: the "actual schedule" is obtained by adding up the "days already spent" and the "remaining days yet to be spent". This metric gives the variation of the actual schedule from the planned schedule as a percentage:

Schedule variance (%) = (Actual number of days - Planned number of days) / (Planned number of days) x 100

Cost Variance (CV): this metric gives the deviation of the actual cost from the planned cost, represented as a percentage.

Size Variance: the deviation of the actual size of the product from its estimated size.

Testing Defect Metrics:
- Component-wise Defect Distribution: the distribution of defects across the components/modules of the product, pointing to the components that need attention.
- Test Cases Developed per Period: the count of test cases developed (corresponding to added/modified features) for a period, against the effort spent in test case development.
- Defects per 100 Test Cases: a measure of how many defects get uncovered during testing. The effectiveness of testing — the ability of the test cases to uncover defects — depends on how well the test cases are designed:
  Defects per 100 test cases = (Total defects found for a period / Total test cases executed for the same period) x 100
- Defects per 100 Failed Test Cases: a good measure to find out how granular the test cases are. It indicates (i) how many test cases need to be executed when a defect is fixed, (ii) what defects need to be fixed so that an acceptable number of test cases reach the pass state, and (iii) how the fail rate of test cases and defects affect each other, for release readiness analysis:
  Defects per 100 failed test cases = (Total defects found for a period / Total test cases failed due to those defects) x 100
- Test Phase Effectiveness: testing is not the job of testers alone. Developers perform unit testing, and during the component, integration and system testing phases there could be multiple testing teams; the defects found in each phase measure that phase's effectiveness.

PROCESS METRICS

Process metrics measure each phase of the testing process:
1. Test Case Preparation Productivity: the number of test cases prepared against the effort spent on preparation.
   Test Case Preparation Productivity = No. of test cases prepared / Effort spent on preparation
   For example: No. of test cases = 240; the productivity is 240 divided by the preparation effort.
2. Test Design Coverage: helps to measure the percentage of requirements mapped to test cases.
   Test Design Coverage = (Total number of requirements mapped to test cases / Total number of requirements) x 100
   For example: with a total of 10 requirements, the coverage is the share of them mapped to at least one test case.
3. Test Execution Productivity: determines the number of test cases executed against the effort spent on execution.
   Test Execution Productivity = No. of test cases executed / Effort spent on execution of test cases

OBJECT ORIENTED METRICS IN TESTING

Object oriented metrics are generally defined on classes and on the relationships between them (a class is an abstraction of objects; its responsibilities are the services other classes expect of it). Some of these metrics are:
1. Cyclomatic Complexity (CC): the number of independent paths through a method; combined over the methods of a class, it indicates how difficult the class is to test.
2. Message Response for a Class: the set of methods that can be invoked in response to a message received by an object of the class.
3. Cohesion: a measure of how closely the methods of a class are related; low cohesion shows up as disjoint sets of methods within the class.
4. Coupling: coupling exists between two classes when one entity acts on another, for example (a) when a message is passed between objects of the two classes, or (b) when methods of one class use methods or attributes of other classes. Coupling Between Object classes (CBO) counts such relationships.
5. Inheritance metrics: inheritance increases the complexity contributed by the abstraction of objects. The relevant measures are:
   (i) Depth of Inheritance Tree (DIT): the depth of a class within the inheritance hierarchy, indicating the amount of inherited behaviour that is potentially in play.
   (ii) Number of Children (NOC): the number of immediate subclasses of a class; with many children (and with multiple inheritance), the potential influence of the class grows.

Review Questions:
5. Explain the need of automated tools.
6. Explain static and dynamic testing tools.
8. Compare manual and automated testing.
9. How to select a testing tool? State the characteristics of CASE tools.
10. When to use automated test tools? Also explain some of them.
11. What is metrics? Define it.
12. What is measurement? Define it.
13. Explain the various types of metrics in detail.
14. Explain project metrics.
15. Describe progress metrics in detail.
16. Explain testing defect metrics.
17. Explain development defect metrics.
18. Describe productivity metrics.
19. Explain process metrics in detail.
20. Describe the term object oriented metrics.
Describe the term object oriented m Scanned by TapScanner \ Laboratory Manual for Digital Techniques (22320) \ Semester-IIl (DE/ES/ET/ENEX/RQUEASAC/MU/COIMICW) Maharashtra State Board of Technical Education, Mumbai (Autonomous) (IS0:9001 2018) (ISOMEC 27001:2013)

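As a consolidated, worked illustration of the metric formulas defined in this chapter (variance, defects per 100 test cases, fix-rate-based backlog estimation, preparation productivity, and design coverage) — only the 240-test-case and 10-requirement figures come from the text's examples; every other sample number is an assumption for illustration:

```python
# The chapter's metric formulas expressed directly as helpers.
import math

def variance_percent(actual, planned):
    """Schedule/effort/cost variance: positive = overrun, negative = over-estimate."""
    return (actual - planned) / planned * 100

def defects_per_100_test_cases(defects_found, test_cases_executed):
    return defects_found / test_cases_executed * 100

def days_to_clear_backlog(outstanding_defects, fixes_per_elapsed_day):
    """Release estimation: elapsed days to fix the known defect backlog."""
    return math.ceil(outstanding_defects / fixes_per_elapsed_day)

def preparation_productivity(test_cases_prepared, effort_person_days):
    return test_cases_prepared / effort_person_days

def design_coverage_percent(requirements_mapped, total_requirements):
    return requirements_mapped / total_requirements * 100

print(variance_percent(50, 40))             # 25.0  -> 25% schedule overrun
print(defects_per_100_test_cases(15, 300))  # 5.0   defects per 100 executed
print(days_to_clear_backlog(42, 6))         # 7     days at the current fix rate
print(preparation_productivity(240, 10))    # 24.0  test cases per person-day
print(design_coverage_percent(8, 10))       # 80.0  % of requirements covered
```

Each helper is a one-line restatement of a formula from the chapter, so the same functions can be reused to check answers to the review questions above.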