
Unit - 1

The document discusses various aspects of machine learning, including learning problems, performance measures, and the search for hypotheses in large spaces. It highlights the importance of training experiences and the evaluation of hypotheses based on their consistency with training data. Additionally, it explores concepts such as candidate elimination algorithms and the representation of hypotheses in the context of learning tasks.

Uploaded by

Sampath Perka
(Machine Learning) Syllabus: Learning problems - perspectives and issues - concept learning - version spaces and candidate elimination - inductive bias - decision tree learning - representation - algorithm - heuristic space search.

=> A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.
=> To specify a learning problem we must identify three features:
   * the class of tasks T
   * the measure of performance P to be improved
   * the source of experience E

A checkers learning problem:
   * Task T: playing checkers
   * Performance measure P: percent of games won against opponents
   * Training experience E: playing practice games against itself

A handwriting recognition learning problem:
   * Task T: recognizing and classifying handwritten words within images
   * Performance measure P: percent of words correctly classified
   * Training experience E: a database of handwritten words with given classifications

A robot driving learning problem:
   * Task T: driving on public four-lane highways using vision sensors
   * Performance measure P: average distance traveled before an error (as judged by a human overseer)
   * Training experience E: a sequence of images and steering commands recorded while observing a human driver

Figure: final design of the checkers learning program. The Experiment Generator proposes a new problem (an initial game board); the Performance System plays the game; the Critic extracts training examples from the solution trace (game history); the Generalizer outputs the next hypothesis (the evaluation function).

Perspectives and issues in machine learning

=> One useful perspective on machine learning is that it involves searching a very large space of possible hypotheses to determine one that best fits the observed data and any prior knowledge held by the learner.
=> Consider the space of hypotheses that could in principle be output by the above checkers learner.
=> This hypothesis space consists of all evaluation functions that can be represented by some choice of values for the weights w0 through w6.
=> The learner's task is thus to search through this vast space to locate the hypothesis that is most consistent with the available training examples.
=> The LMS algorithm for fitting weights achieves this goal by iteratively tuning the weights, adding a correction to each weight each time the hypothesized evaluation function predicts a value that differs from the training value. This algorithm works well when the hypothesis representation defines a continuously parameterized space of potential hypotheses.
=> This perspective of learning as a search problem lets us characterize learning methods by their search strategies and by the underlying structure of the search spaces they explore. We can also analyze the relationship between the size of the hypothesis space to be searched, the number of training examples available, and the confidence that a hypothesis consistent with the training data will correctly generalize to unseen examples.

Issues in machine learning

=> What algorithms exist for learning general target functions from specific training examples? In what settings will particular algorithms converge to the desired function, given sufficient training data? Which algorithms perform best for which types of problems and representations?
=> How much training data is sufficient? What general bounds can be found to relate the confidence in learned hypotheses to the amount of training experience and the character of the learner's hypothesis space?
=> When and how can prior knowledge held by the learner guide the process of generalizing from examples? Can prior knowledge be helpful even when it is only approximately correct?
=> What is the best strategy for choosing a useful next training experience, and how does the choice of this strategy alter the complexity of the learning problem?
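The iterative weight tuning described above can be sketched in a few lines. This is a minimal illustration of an LMS-style update for a linear evaluation function V̂(b) = w0 + w1·x1 + ... + w6·x6; the feature values, training value, and learning rate below are made-up stand-ins, not taken from the notes.

```python
def v_hat(w, x):
    """Hypothesized evaluation function: w0 + w1*x1 + ... + w6*x6."""
    return w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))

def lms_update(w, x, v_train, eta=0.01):
    """One LMS step: nudge each weight to shrink (v_train - v_hat)."""
    error = v_train - v_hat(w, x)
    return [w[0] + eta * error] + [wi + eta * error * xi
                                   for wi, xi in zip(w[1:], x)]

# Toy training pair: six hypothetical board features and a target value.
w = [0.0] * 7
x = [3, 0, 1, 0, 0, 0]
for _ in range(200):
    w = lms_update(w, x, v_train=100.0)
print(round(v_hat(w, x), 1))  # prediction has converged near 100.0
```

Each correction is proportional to the prediction error, so repeated presentations of the same training value drive the error toward zero.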
=> What is the best way to reduce the learning task to one or more function approximation problems? Put another way, what specific functions should the system attempt to learn? Can this process itself be automated?
=> How can the learner automatically alter its representation to improve its ability to represent and learn the target function?

Concept learning

=> Concept learning: inferring a boolean-valued function from training examples of its input and output.

A concept learning task

=> Consider the example task of learning the target concept "days on which Aldo enjoys his favorite water sport". The given table represents days described by a set of attributes; the attribute EnjoySport indicates whether or not Aldo enjoys his favorite water sport on that day.
=> The task is to learn to predict the value of EnjoySport for an arbitrary day, based on the values of its other attributes.
=> Let us begin by considering a simple representation in which each hypothesis consists of a conjunction of constraints on the instance attributes.
=> Let each hypothesis be a vector of six constraints, specifying the values of the six attributes Sky, AirTemp, Humidity, Wind, Water, and Forecast. For each attribute, the hypothesis will either:
   * indicate by "?" that any value is acceptable for this attribute,
   * specify a single required value (e.g., Warm) for the attribute, or
   * indicate by "∅" that no value is acceptable.
=> If some instance x satisfies all the constraints of hypothesis h, then h classifies x as a positive example.
=> The hypothesis that Aldo enjoys his favorite sport only on cold days with high humidity is represented by the expression <?, Cold, High, ?, ?, ?>.
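Checking whether an instance satisfies a hypothesis in this representation takes only a few lines. This is an illustrative sketch (the tuple encoding and the function name are my own, not from the notes), using "?" for the any-value constraint.

```python
def matches(h, x):
    """h classifies instance x as positive iff every constraint is satisfied.

    Each position of h is "?" (any value acceptable) or a single required value.
    """
    return all(hc == "?" or hc == xv for hc, xv in zip(h, x))

h = ("?", "Cold", "High", "?", "?", "?")  # the hypothesis from the notes
day1 = ("Sunny", "Cold", "High", "Strong", "Warm", "Same")
day2 = ("Sunny", "Warm", "High", "Strong", "Warm", "Same")
print(matches(h, day1), matches(h, day2))  # True False
```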
Positive and negative training examples for the target concept EnjoySport:

   Example | Sky   | AirTemp | Humidity | Wind   | Water | Forecast | EnjoySport
   1       | Sunny | Warm    | Normal   | Strong | Warm  | Same     | Yes
   2       | Sunny | Warm    | High     | Strong | Warm  | Same     | Yes
   3       | Rainy | Cold    | High     | Strong | Warm  | Change   | No
   4       | Sunny | Warm    | High     | Strong | Cool  | Change   | Yes

=> The most general hypothesis - that every day is a positive example - is represented by <?, ?, ?, ?, ?, ?>.
=> The most specific possible hypothesis - that no day is a positive example - is represented by <∅, ∅, ∅, ∅, ∅, ∅>.
=> In general, any concept learning task can be described by the set of instances over which the target function is defined, the target function, the set of candidate hypotheses considered by the learner, and the set of available training examples.

Notation

=> In general, c can be any boolean-valued function defined over the instances X; that is, c : X -> {0, 1}.
=> Here the target concept corresponds to the value of the attribute EnjoySport (i.e., c(x) = 1 if EnjoySport = Yes, and c(x) = 0 if EnjoySport = No).

The inductive learning hypothesis

=> Any hypothesis found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other unobserved examples.

Concept learning as search

=> Concept learning can be viewed as the task of searching through a large space of hypotheses implicitly defined by the hypothesis representation.
=> It is important to note that by selecting a hypothesis representation, the designer of the learning algorithm implicitly defines the space of all hypotheses that the program can ever represent and therefore can ever learn.
=> Consider, for example, the instances X and hypotheses H in the EnjoySport learning task. Given that the attribute Sky has three possible values, and that AirTemp, Humidity, Wind, Water, and Forecast each have two possible values:
   * the instance space X contains exactly 3 . 2 . 2 . 2 . 2 . 2 = 96 distinct instances;
   * a similar calculation shows that there are 5 . 4 . 4 . 4 . 4 . 4 = 5120 syntactically distinct hypotheses within H.
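These counts follow directly from the attribute domains; a quick check of the arithmetic (nothing beyond what the notes state):

```python
# Sky has 3 values; the other five attributes have 2 values each.
domain_sizes = [3, 2, 2, 2, 2, 2]

# Distinct instances: the product of the domain sizes.
instances = 1
for n in domain_sizes:
    instances *= n

# Syntactically distinct hypotheses: each attribute additionally allows
# "?" and the empty constraint, giving (n + 2) choices per attribute.
hypotheses = 1
for n in domain_sizes:
    hypotheses *= n + 2

print(instances, hypotheses)  # 96 5120
```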
=> Viewing learning as a search problem, it is natural that our study of learning algorithms will examine different strategies for searching the hypothesis space.

General-to-specific ordering of hypotheses

=> Many algorithms for concept learning organize the search through the hypothesis space by relying on a very useful structure that exists for any concept learning problem: a general-to-specific ordering of hypotheses.
=> To illustrate the general-to-specific ordering, consider the two hypotheses
      h1 = <Sunny, ?, ?, Strong, ?, ?>
      h2 = <Sunny, ?, ?, ?, ?, ?>
   Because h2 imposes fewer constraints, every instance classified positive by h1 is also classified positive by h2; hence h2 is more general than h1.

Definition: Let hj and hk be boolean-valued functions defined over X. Then hj is more general than or equal to hk (written hj >=g hk) if and only if
      (∀x ∈ X) [(hk(x) = 1) -> (hj(x) = 1)]

Figure: instances X and hypotheses H arranged from specific to general; e.g., h1 = <Sunny, ?, ?, Strong, ?, ?> and h3 = <Sunny, ?, ?, ?, Cool, ?> are both less general than h2 = <Sunny, ?, ?, ?, ?, ?>.

Candidate elimination

=> The candidate-elimination algorithm addresses several of the limitations of Find-S.
=> The key idea in the candidate-elimination algorithm is to output a description of the set of all hypotheses consistent with the training examples. It computes this description without explicitly enumerating all of its members.
=> Such learning can be applied to problems such as learning regularities in chemical mass spectroscopy (Mitchell 1979) and learning control rules for heuristic search (Mitchell et al. 1983).

Representation (definition): A hypothesis h is consistent with a set of training examples D if and only if h(x) = c(x) for each example <x, c(x)> in D:
      Consistent(h, D) ≡ (∀<x, c(x)> ∈ D) h(x) = c(x)
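For the conjunctive representation used here, the >=g relation reduces to a per-attribute check; a small sketch (the function name and the syntactic test are my own framing):

```python
def more_general_or_equal(hj, hk):
    """hj >=g hk: every constraint of hj is "?" or agrees with hk's constraint.

    For conjunctions of attribute constraints, this syntactic test implies
    that any instance satisfying hk also satisfies hj.
    """
    return all(a == "?" or a == b for a, b in zip(hj, hk))

h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")
print(more_general_or_equal(h2, h1), more_general_or_equal(h1, h2))  # True False
```

The asymmetric result reflects the partial ordering: h2 is strictly more general than h1, never the reverse.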
Definition: The version space, denoted VS_H,D, with respect to hypothesis space H and training examples D, is the subset of hypotheses from H consistent with the training examples in D:
      VS_H,D ≡ {h ∈ H | Consistent(h, D)}

The List-Then-Eliminate algorithm

=> The List-Then-Eliminate algorithm first initializes the version space to contain all hypotheses in H, then eliminates any hypothesis found inconsistent with any training example:
   1. VersionSpace <- a list containing every hypothesis in H
   2. For each training example <x, c(x)>:
         remove from VersionSpace any hypothesis h for which h(x) != c(x)
   3. Output the list of hypotheses in VersionSpace

Figure: a version space with its general and specific boundary sets, S = {<Sunny, Warm, ?, Strong, ?, ?>} and G = {<Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>}.

Definition: The general boundary G, with respect to hypothesis space H and training data D, is the set of maximally general members of H consistent with D:
      G ≡ {g ∈ H | Consistent(g, D) ∧ (¬∃g' ∈ H) [(g' >g g) ∧ Consistent(g', D)]}

Definition: The specific boundary S, with respect to hypothesis space H and training data D, is the set of minimally general (i.e., maximally specific) members of H consistent with D:
      S ≡ {s ∈ H | Consistent(s, D) ∧ (¬∃s' ∈ H) [(s >g s') ∧ Consistent(s', D)]}

Version space representation theorem: Let X be an arbitrary set of instances and let H be a set of boolean-valued hypotheses defined over X. Let c : X -> {0, 1} be an arbitrary target concept defined over X, and let D be an arbitrary set of training examples {<x, c(x)>}. For all X, H, c, and D such that S and G are well defined,
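Because H here is small enough to enumerate, List-Then-Eliminate can actually be run on the EnjoySport data; a sketch (hypothesis tuples, the restriction argument in the comment, and all names are my own):

```python
from itertools import product

def matches(h, x):
    """h classifies x positive iff each constraint is "?" or equals x's value."""
    return all(c == "?" or c == v for c, v in zip(h, x))

DOMAINS = [["Sunny", "Cloudy", "Rainy"], ["Warm", "Cold"], ["Normal", "High"],
           ["Strong", "Weak"], ["Warm", "Cool"], ["Same", "Change"]]
EXAMPLES = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Cool", "Change"), True),
]

# Hypotheses containing the empty constraint classify every instance negative,
# so once a positive example is seen they can never be consistent; it suffices
# to enumerate the 4 * 3^5 = 972 value-or-"?" conjunctions.
pool = product(*[d + ["?"] for d in DOMAINS])
version_space = [h for h in pool
                 if all(matches(h, x) == label for x, label in EXAMPLES)]
print(len(version_space))  # 6
```

The six survivors are exactly the version space bounded by the S and G sets shown in the figure.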
      VS_H,D = {h ∈ H | (∃s ∈ S)(∃g ∈ G) (g >=g h >=g s)}

The candidate-elimination algorithm

=> The candidate-elimination algorithm computes the version space containing all hypotheses from H that are consistent with an observed sequence of training examples.
=> It begins by initializing the version space to the set of all hypotheses in H; that is, by initializing the G boundary set to contain the most general hypothesis in H,
      G0 <- {<?, ?, ?, ?, ?, ?>}
   and initializing the S boundary set to contain the most specific (least general) hypothesis,
      S0 <- {<∅, ∅, ∅, ∅, ∅, ∅>}

Candidate-elimination algorithm:
   * Initialize G to the set of maximally general hypotheses in H
   * Initialize S to the set of maximally specific hypotheses in H
   * For each training example d:
      * If d is a positive example:
         * Remove from G any hypothesis inconsistent with d
         * For each hypothesis s in S that is not consistent with d:
            * Remove s from S
            * Add to S all minimal generalizations h of s such that h is consistent with d and some member of G is more general than h
            * Remove from S any hypothesis that is more general than another hypothesis in S
      * If d is a negative example:
         * Remove from S any hypothesis inconsistent with d
         * For each hypothesis g in G that is not consistent with d:
            * Remove g from G
            * Add to G all minimal specializations h of g such that h is consistent with d and some member of S is more specific than h
            * Remove from G any hypothesis that is less general than another hypothesis in G

Trace on the EnjoySport training examples:
   S0: {<∅, ∅, ∅, ∅, ∅, ∅>}    G0: {<?, ?, ?, ?, ?, ?>}
   Training example 1: <Sunny, Warm, Normal, Strong, Warm, Same>, EnjoySport = Yes
      S1: {<Sunny, Warm, Normal, Strong, Warm, Same>}
   Training example 2: <Sunny, Warm, High, Strong, Warm, Same>, EnjoySport = Yes
      S2: {<Sunny, Warm, ?, Strong, Warm, Same>}
   After all four examples the boundaries are
      S: {<Sunny, Warm, ?, Strong, ?, ?>}
      G: {<Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>}
   and the version space between them also contains
      <Sunny, ?, ?, Strong, ?, ?>, <Sunny, Warm, ?, ?, ?, ?>, <?, Warm, ?, Strong, ?, ?>

Figure: the final version space for the EnjoySport concept learning problem and training examples described earlier.
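The boundary updates above can be sketched for this conjunctive representation. This is an illustrative implementation (the tuple encoding, list-based boundaries, and all names are my own, not from the notes); run on the four EnjoySport examples it reproduces the final S and G of the trace.

```python
ANY, NONE = "?", "0"

def covers(h, x):
    """h classifies x positive iff every constraint is "?" or matches x."""
    return all(c == ANY or c == v for c, v in zip(h, x))

def more_general_eq(h1, h2):
    """Syntactic h1 >=g h2 test for conjunctive hypotheses."""
    return all(a == ANY or a == b for a, b in zip(h1, h2))

def min_generalize(s, x):
    """Minimal generalization of s that covers the positive example x."""
    return tuple(v if c == NONE else (c if c == v else ANY)
                 for c, v in zip(s, x))

def min_specialize(g, x, domains):
    """Minimal specializations of g that exclude the negative example x."""
    out = []
    for i, c in enumerate(g):
        if c == ANY:
            out += [g[:i] + (v,) + g[i + 1:] for v in domains[i] if v != x[i]]
    return out

def candidate_elimination(examples, domains):
    S = [tuple(NONE for _ in domains)]   # most specific boundary
    G = [tuple(ANY for _ in domains)]    # most general boundary
    for x, positive in examples:
        if positive:
            G = [g for g in G if covers(g, x)]
            new_S = []
            for s in S:
                if covers(s, x):
                    new_S.append(s)
                else:
                    h = min_generalize(s, x)
                    if any(more_general_eq(g, h) for g in G):
                        new_S.append(h)
            new_S = list(dict.fromkeys(new_S))
            S = [s for s in new_S
                 if not any(t != s and more_general_eq(s, t) for t in new_S)]
        else:
            S = [s for s in S if not covers(s, x)]
            new_G = []
            for g in G:
                if not covers(g, x):
                    new_G.append(g)
                else:
                    new_G += [h for h in min_specialize(g, x, domains)
                              if any(more_general_eq(h, s) for s in S)]
            new_G = list(dict.fromkeys(new_G))
            G = [g for g in new_G
                 if not any(t != g and more_general_eq(t, g) for t in new_G)]
    return S, G

DOMAINS = [["Sunny", "Cloudy", "Rainy"], ["Warm", "Cold"], ["Normal", "High"],
           ["Strong", "Weak"], ["Warm", "Cool"], ["Same", "Change"]]
EXAMPLES = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Cool", "Change"), True),
]
S, G = candidate_elimination(EXAMPLES, DOMAINS)
print(S)  # [('Sunny', 'Warm', '?', 'Strong', '?', '?')]
print(G)  # [('Sunny', '?', '?', '?', '?', '?'), ('?', 'Warm', '?', '?', '?', '?')]
```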
Inductive bias

=> The candidate-elimination algorithm will converge toward the true target concept provided it is given accurate training examples and provided its initial hypothesis space contains the target concept.

A biased hypothesis space

=> Consider again the EnjoySport example, in which we restricted the hypothesis space to include only conjunctions of attribute values. Because of this restriction, the hypothesis space is unable to represent even simple disjunctive target concepts such as "Sky = Sunny or Sky = Cloudy".

   Example | Sky    | AirTemp | Humidity | Wind   | Water | Forecast | EnjoySport
   1       | Sunny  | Warm    | Normal   | Strong | Cool  | Change   | Yes
   2       | Cloudy | Warm    | Normal   | Strong | Cool  | Change   | Yes
   3       | Rainy  | Warm    | Normal   | Strong | Cool  | Change   | No

=> The most specific hypothesis consistent with the first two examples and representable in the given hypothesis space H is
      S2: <?, Warm, Normal, Strong, Cool, Change>
   but this hypothesis is overly general: it incorrectly covers the third (negative) example.

An unbiased learner

=> An unbiased learner uses a hypothesis space capable of representing every possible subset of the instances X. In general, the set of all subsets of a set X is called the power set of X.
=> The disjunctive concept above could then be represented as
      <Sunny, ?, ?, ?, ?, ?> ∨ <Cloudy, ?, ?, ?, ?, ?>

The futility of bias-free learning

=> Consider a concept learning algorithm L for the set of instances X. Let c be an arbitrary concept defined over X, and let Dc = {<x, c(x)>} be an arbitrary set of training examples of c. Let L(xi, Dc) denote the classification assigned to the instance xi by L after training on the data Dc.
=> Definition: The inductive bias of L is any minimal set of assertions B such that for any target concept c and corresponding training examples Dc,
      (∀xi ∈ X) [(B ∧ Dc ∧ xi) ⊢ L(xi, Dc)]

Inductive bias of the candidate-elimination algorithm: the target concept c is contained in the given hypothesis space H.
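The scale of the unbiased space can be seen with a one-line computation: with the 96 instances counted earlier, the power set of X contains 2^96 candidate target concepts, dwarfing the 5120 syntactically distinct conjunctive hypotheses (only the arithmetic is new here).

```python
instances = 96                     # |X| for EnjoySport, computed earlier
target_concepts = 2 ** instances   # size of the power set of X, about 7.9e28

conjunctive = 5120                 # syntactically distinct conjunctive hypotheses
print(target_concepts > conjunctive)  # True
```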
=> The inductive bias of the candidate-elimination algorithm can be modeled by an equivalent deductive system; the figure takes two views of the same learner:
   * Inductive system: training examples and a new instance are input to the candidate-elimination algorithm using hypothesis space H, which outputs a classification of the new instance, or "don't know".
   * Equivalent deductive system: training examples, a new instance, and the assertion "H contains the target concept" are input to a theorem prover, which outputs a classification of the new instance, or "don't know".

Figure: modeling inductive systems by equivalent deductive systems.

Decision tree learning

=> Decision tree learning is a method for approximating discrete-valued target functions, in which the learned function is represented by a decision tree.
=> Learned trees can also be re-represented as sets of if-then rules to improve human readability.

Decision tree representation

=> Decision trees classify instances by sorting them down the tree from the root to some leaf node, which provides the classification of the instance.
=> Each node in the tree specifies a test of some attribute of the instance, and each branch descending from that node corresponds to one of the possible values for this attribute.

Figure: decision tree for the concept PlayTennis. The root tests Outlook (Sunny / Overcast / Rain); under Sunny, Humidity (High -> No, Normal -> Yes); under Overcast, Yes; under Rain, Wind (Strong -> No, Weak -> Yes).

=> The given tree corresponds to the expression
      (Outlook = Sunny ∧ Humidity = Normal)
    ∨ (Outlook = Overcast)
    ∨ (Outlook = Rain ∧ Wind = Weak)

The basic decision tree learning algorithm

=> Most algorithms that have been developed for learning decision trees employ a top-down, greedy search through the space of possible decision trees.

Which attribute is the best classifier?

=> Information gain is a statistical property that measures how well a given attribute separates the training examples according to their target classification. ID3 uses this information gain measure to select among the candidate attributes at each step while growing the tree.
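The PlayTennis tree described above can be transcribed directly as nested conditionals, which makes the sorting-down-the-tree behavior concrete; a sketch (the function name and argument order are my own):

```python
def play_tennis(outlook, humidity, wind):
    """Classify an instance by sorting it down the PlayTennis tree."""
    if outlook == "Sunny":
        return "Yes" if humidity == "Normal" else "No"
    if outlook == "Overcast":
        return "Yes"
    # remaining branch: outlook == "Rain"
    return "Yes" if wind == "Weak" else "No"

print(play_tennis("Sunny", "High", "Weak"))      # No
print(play_tennis("Overcast", "High", "Strong")) # Yes
print(play_tennis("Rain", "Normal", "Strong"))   # No
```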
Entropy measures homogeneity of examples

=> In order to define information gain precisely, we begin with a measure commonly used in information theory, called entropy, that characterizes the (im)purity of an arbitrary collection of examples.
=> Given a collection S containing positive and negative examples of some target concept, the entropy of S relative to this boolean classification is
      Entropy(S) = -p⊕ log2(p⊕) - p⊖ log2(p⊖)
   where p⊕ is the proportion of positive examples in S and p⊖ is the proportion of negative examples.
=> Example: suppose S is a collection of 14 examples of some boolean concept, including 9 positive and 5 negative examples, written [9+, 5-]. Then
      Entropy([9+, 5-]) = -(9/14) log2(9/14) - (5/14) log2(5/14) = 0.940

ID3(Examples, Target_attribute, Attributes):
   * Create a Root node for the tree
   * If all Examples are positive, return the single-node tree Root with label = +
   * If all Examples are negative, return the single-node tree Root with label = -
   * If Attributes is empty, return the single-node tree Root with label = the most common value of Target_attribute in Examples
   * Otherwise begin:
      * A <- the attribute from Attributes that best classifies Examples
      * The decision attribute for Root <- A
      * For each possible value of A, add a branch below Root and grow the corresponding subtree by applying ID3 recursively to the matching subset of Examples, with A removed from Attributes
   * End
   * Return Root

Figure: the entropy function relative to a boolean classification, as the proportion p⊕ of positive examples varies between 0 and 1.

=> Information gain, Gain(S, A), of an attribute A relative to a collection of examples S, is defined as
      Gain(S, A) = Entropy(S) - Σ_{v ∈ Values(A)} (|Sv| / |S|) Entropy(Sv)

Hypothesis space search in decision tree learning

=> ID3 can be characterized as searching a space of hypotheses for one that fits the training examples.
=> The hypothesis space searched by ID3 is the set of possible decision trees.

Inductive bias in decision tree learning

=> Approximate inductive bias of ID3: shorter trees are preferred over larger trees.
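The entropy and gain formulas can be checked numerically. The sketch below recomputes the [9+, 5-] entropy from the notes and, as an extra illustration, the gain of a Wind attribute whose values split those 14 examples into [6+, 2-] (Weak) and [3+, 3-] (Strong); that particular split is an assumption borrowed from the standard textbook example, not from these notes.

```python
from math import log2

def entropy(pos, neg):
    """Entropy of a collection with pos positive and neg negative examples."""
    total = pos + neg
    e = 0.0
    for count in (pos, neg):
        p = count / total
        if p > 0:                 # 0 * log2(0) is taken to be 0
            e -= p * log2(p)
    return e

# Entropy([9+, 5-]) from the notes:
print(round(entropy(9, 5), 3))    # 0.94

# Gain(S, Wind) with Weak -> [6+, 2-] and Strong -> [3+, 3-]:
gain = entropy(9, 5) - (8 / 14) * entropy(6, 2) - (6 / 14) * entropy(3, 3)
print(round(gain, 3))             # 0.048
```

A pure split ([n+, 0-]) has entropy 0 and an even split has entropy 1, which is why gain rewards attributes that sort the examples into homogeneous subsets.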
Issues in decision tree learning

   * Avoiding overfitting the data
      * Reduced-error pruning
      * Rule post-pruning
   * Incorporating continuous-valued attributes
   * Alternative measures for selecting attributes
   * Handling training examples with missing attribute values
   * Handling attributes with differing costs
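For the continuous-valued attribute issue, ID3 is typically extended by dynamically creating boolean attributes of the form A < t, where candidate thresholds t are midpoints between adjacent sorted values whose classifications differ. A sketch using the textbook's Temperature illustration (the specific values 40..90 and their PlayTennis labels are an assumption taken from that example, not from these notes):

```python
def candidate_thresholds(values, labels):
    """Midpoints between adjacent sorted values whose labels differ."""
    pairs = sorted(zip(values, labels))
    return [(a + b) / 2
            for (a, la), (b, lb) in zip(pairs, pairs[1:])
            if la != lb]

temperature = [40, 48, 60, 72, 80, 90]
play = ["No", "No", "Yes", "Yes", "Yes", "No"]
print(candidate_thresholds(temperature, play))  # [54.0, 85.0]
```

Each resulting threshold defines a boolean test (e.g. Temperature < 54) that can then compete on information gain like any discrete attribute.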
