AAI Module 2

These are the notes for Module 2 of AAI in summarized format.

CHAPTER 2
Generative Adversarial Network

University Prescribed Syllabus

Basics of GAN: Generative Adversarial Networks (GANs) architecture, the discriminator model and generator model, architecture and training of GANs, Vanilla GAN architecture. GAN variants and improvements (DCGAN, WGAN, Conditional GAN, CycleGAN). Challenges: training instability and mode collapse. GAN applications in image synthesis and style transfer.

2.1.4 Vanilla GAN Architecture

Q. Describe advantages of Vanilla GAN. (2 Marks)
Q. Describe disadvantages of Vanilla GAN. (2 Marks)
Q. Explain architecture of Vanilla GAN. (5 Marks)
Q. Write a code for implementation of Vanilla GANs. (10 Marks)

• A Vanilla Generative Adversarial Network is a type of artificial neural network architecture introduced by Ian Goodfellow and his colleagues in 2014.
• The figure below illustrates the schematic diagram of a Vanilla GAN, where the generator G produces a fake image from a random latent code z ~ p_z and the discriminator D learns to distinguish between real and fake samples.

[Fig. 2.1.6: Vanilla GAN — the generative network maps a low-dimensional latent space to generated fake images in the high-dimensional sample space; the discriminative network classifies samples as real or fake.]

• The key idea of GANs is usually defined as a game-play problem with a min-max objective: the discriminator tries to maximize the value function V(D, G), while the generator tries to minimize it (in other words, to maximize the discriminator's loss). It can be mathematically described by the formula below:

$$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\text{data}}(x)}[\log D(x)] + \mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z)))]$$

where,
G = Generator network
D = Discriminator network
$p_{\text{data}}(x)$ = distribution of real data
$p_z(z)$ = distribution of the generator's input noise
$x$ = sample from $p_{\text{data}}(x)$
$z$ = sample from $p_z(z)$
$D(x)$ = discriminator's output for a sample $x$
$G(z)$ = generator's output for a latent code $z$

• Researchers aim to obtain the optimal generator, which can generate high-resolution, vivid images similar to natural images, by tuning the hyper-parameters properly.

Training Procedure

• Training a Vanilla GAN involves iteratively updating the weights of both the generator and discriminator networks.
• Typically, stochastic gradient descent (SGD) or one of its variants is used for optimization.
• The training process continues until a convergence criterion is met, such as when the generator produces convincing samples or when the discriminator can no longer distinguish between real and fake samples.

Advantages of Vanilla GAN

• Generative Model: GANs can generate various types of data, including images, audio, and text.
• Unsupervised Learning: GANs operate in an unsupervised learning setting, which means they do not require labeled data for training. This makes them applicable to a wide range of tasks where labeled data is scarce or expensive.
• Flexibility: GANs are highly flexible and can be applied to various data modalities, including images, text, audio, and more. They are not limited to a specific type of data.
• Realistic Samples: When trained successfully, GANs can produce samples that are often more realistic than those generated by other generative models.
• Variational Learning: GANs provide a novel approach to learning data distributions through a game-theoretic framework. This can be beneficial for modeling complex, high-dimensional data.

Disadvantages of Vanilla GAN

• Training Instability: Training GANs can be challenging and prone to instability. They are known to suffer from problems like mode collapse, where the generator produces limited diversity in generated samples. Achieving a balance between the generator and discriminator during training can be tricky.
• Vanishing Gradients and Mode Collapse: Weak gradients can lead to convergence issues and slow training. Several GAN variants (e.g., WGANs, Conditional GANs) address these problems, but they introduce additional complexities and may not completely solve them.
• Hyperparameter Sensitivity: GANs are sensitive to hyperparameter choices, including learning rates, architectures, and initialization methods.
• Need for Large Datasets: GANs typically require large datasets to achieve good results. Training them on small datasets can lead to overfitting, making them less suitable for some applications.
• Expensive Computation: Training GANs can be computationally expensive, especially for large and deep networks. This requires access to powerful GPUs or TPUs.
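The 10-mark question above asks for an implementation, but only a stray fragment of the original listing (`torch.sigmoid(self.fc2(x))` in the discriminator) survives in these notes. Below is a minimal PyTorch sketch consistent with this section's description: fully connected networks, a sigmoid discriminator output, SGD, and the two binary cross-entropy terms of the min-max objective. The layer sizes, learning rate, and random stand-in data are illustrative assumptions, not the textbook's original code.

```python
import torch
import torch.nn as nn

latent_dim, data_dim = 100, 784   # assumed: 100-d latent code, flattened 28x28 images

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(latent_dim, 256)
        self.fc2 = nn.Linear(256, data_dim)

    def forward(self, z):
        h = torch.relu(self.fc1(z))
        return torch.tanh(self.fc2(h))            # fake sample in [-1, 1]

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(data_dim, 256)
        self.fc2 = nn.Linear(256, 1)

    def forward(self, x):
        h = torch.relu(self.fc1(x))
        return torch.sigmoid(self.fc2(h))         # probability that x is real

G, D = Generator(), Discriminator()
opt_g = torch.optim.SGD(G.parameters(), lr=1e-3)  # SGD, as described in the text
opt_d = torch.optim.SGD(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(1000):
    real = torch.randn(64, data_dim)              # stand-in for a real data batch
    fake = G(torch.randn(64, latent_dim))         # G(z) with z ~ p_z

    # Discriminator step: maximize log D(x) + log(1 - D(G(z))) by minimizing
    # the equivalent BCE loss; detach() blocks gradients from reaching G.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: fool D, i.e. push D(G(z)) toward 1.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

In practice the `real` batch would come from a dataset loader, and training runs until the convergence criterion described above is met (for example, the discriminator's accuracy hovering around 50%).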
Advantages of Generative Adversarial Networks (GANs)

• Synthetic Data Generation: GANs can generate new, synthetic data that resembles some known data distribution, which can be useful for data augmentation, anomaly detection, or creative applications.
• High-Quality Results: GANs can produce high-quality, photorealistic results in image synthesis, video synthesis, music synthesis, and other tasks.
• Unsupervised Learning: GANs can be trained without labeled data, making them suitable for unsupervised learning tasks where labeled data is scarce or difficult to obtain.
• Versatility: GANs can be applied to a wide range of tasks, including image synthesis, text-to-image synthesis, image-to-image translation, and others.

Disadvantages of Generative Adversarial Networks (GANs)

• Training Instability: GANs can be difficult to train, with the risk of instability, mode collapse, or failure to converge.
• Computational Cost: GANs can require a lot of computational resources and can be slow to train, especially for high-resolution images or large datasets.
• Overfitting: GANs can overfit the training data, producing synthetic data that is too similar to the training data and lacks diversity.
• Bias and Fairness: GANs can reflect the biases and unfairness present in the training data, leading to discriminatory or biased synthetic data.
• Interpretability and Accountability: GANs can be opaque and difficult to interpret or explain, making it challenging to ensure accountability, transparency, and fairness in their applications.

2.2.1 Deep Convolutional GANs (DCGAN)

• Deep Convolutional GANs (DCGANs) are a class of generative adversarial networks specifically designed for generating high-quality images, particularly in computer vision tasks. DCGANs were introduced by Radford et al. in their 2015 paper "Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks."
• DCGANs leverage convolutional neural networks (CNNs) for both the generator and the discriminator, which makes them highly effective at generating realistic, high-resolution images.

[Fig.: DCGAN generator — the latent code is projected and reshaped into a small feature map, then upsampled by transposed convolutions into an image.]

Key Components of DCGANs

• Convolutional Layers: The discriminator uses strided convolutional layers to downsample input images into feature maps for classification.
• Transposed Convolution Layers: The generator uses transposed convolution layers (also called fractionally-strided convolutions) to upsample feature maps from a lower-dimensional latent space to the desired image output.
• Batch Normalization: Applied after most layers in both the generator and the discriminator; it stabilizes training and helps convergence.
• LeakyReLU Activation: DCGANs use LeakyReLU activations in the discriminator to allow small, non-zero gradients for negative inputs, which helps mitigate the vanishing-gradient problem.
• No Fully Connected Layers: DCGANs typically avoid fully connected layers in favor of convolutional layers.
• Input: The generator takes random noise as input, sampled from a simple distribution such as a Gaussian.

Training DCGANs follows the standard GAN training procedure with some specific considerations:

• Loss: The discriminator learns to distinguish between real and generated images, while the generator aims to produce images that are indistinguishable from real ones. The loss function used is typically binary cross-entropy (BCE).
• Optimizers: Commonly, the Adam optimizer is used for both the generator and the discriminator, though other optimizers can also be effective.
• Initialization: Weight initialization matters for training stability; initializing weights with small random values drawn from a zero-mean normal distribution is common practice.

Advantages: DCGANs are highly effective at generating realistic images, there are well-established best practices for training them, and the architecture is relatively easy to implement and experiment with.

Limitations: DCGANs can still be somewhat sensitive to hyperparameter choices and can suffer from training instability.
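The key components listed above map almost line-for-line onto code. Below is a minimal PyTorch sketch of a DCGAN generator and discriminator for 64x64 RGB images; the latent size (nz = 100), channel widths (ngf = ndf = 64), and image size are illustrative assumptions in the spirit of Radford et al., not a verbatim reproduction of the paper's configuration.

```python
import torch
import torch.nn as nn

class DCGANGenerator(nn.Module):
    """Latent vector (nz) -> 64x64 RGB image, via transposed convolutions."""
    def __init__(self, nz=100, ngf=64):
        super().__init__()
        self.net = nn.Sequential(
            # "Project and reshape": 1x1 latent -> 4x4 feature map.
            nn.ConvTranspose2d(nz, ngf * 8, 4, 1, 0, bias=False),
            nn.BatchNorm2d(ngf * 8), nn.ReLU(True),
            # Each block doubles resolution: 4 -> 8 -> 16 -> 32 -> 64.
            nn.ConvTranspose2d(ngf * 8, ngf * 4, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf * 4), nn.ReLU(True),
            nn.ConvTranspose2d(ngf * 4, ngf * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf * 2), nn.ReLU(True),
            nn.ConvTranspose2d(ngf * 2, ngf, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf), nn.ReLU(True),
            nn.ConvTranspose2d(ngf, 3, 4, 2, 1, bias=False),
            nn.Tanh(),                                # image in [-1, 1]
        )

    def forward(self, z):                             # z: (batch, nz)
        return self.net(z.view(z.size(0), -1, 1, 1))

class DCGANDiscriminator(nn.Module):
    """64x64 RGB image -> probability of being real, via strided convolutions."""
    def __init__(self, ndf=64):
        super().__init__()
        self.net = nn.Sequential(
            # Strided convs halve resolution: 64 -> 32 -> 16 -> 8 -> 4 -> 1.
            nn.Conv2d(3, ndf, 4, 2, 1, bias=False),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(ndf, ndf * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ndf * 2), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(ndf * 2, ndf * 4, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ndf * 4), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(ndf * 4, ndf * 8, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ndf * 8), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(ndf * 8, 1, 4, 1, 0, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x).view(-1, 1)
```

Training reuses the BCE loop from the Vanilla GAN sketch, typically with `torch.optim.Adam(..., lr=2e-4, betas=(0.5, 0.999))` in line with the Adam recommendation above.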
2.1.2 The Discriminator Model and Generator Model

• The discriminator in a GAN is simply a classifier. It tries to distinguish real data from the data created by the generator. It can use any network architecture appropriate to the type of data it is classifying.

[Fig. 2.1.2: Generator-Discriminator Architecture — real images provide one sample stream and the generator, fed with random input, provides the other; the discriminator scores both, and the discriminator loss and generator loss are backpropagated to update the respective networks.]

Applications of GANs

GANs have a wide range of applications:

1. Image Generation: GANs can generate high-quality images that resemble photographs, paintings, or other types of art.
2. Style Transfer: GANs can transfer the style of one image onto another, creating visually appealing results.
3. Super-Resolution: GANs can enhance the quality of images, making them sharper and more detailed.
4. Data Augmentation: GANs can generate synthetic data for machine-learning models, improving their performance.
5. Anomaly Detection: GANs can be used to learn the distribution of normal data and identify anomalies that deviate from this distribution.
6. Image-to-Image Translation: GANs can translate images from one domain to another, such as converting satellite images to maps or black-and-white photos to color.
7. Face Aging and De-aging: GANs can simulate the aging or de-aging of human faces in images.
8. Drug Discovery: GANs are applied to generate molecular structures for potential drug compounds.

Challenges and Considerations

• Mode Collapse: GANs can sometimes suffer from mode collapse, where the generator produces a limited set of samples, ignoring other modes in the data distribution.
• Training Stability: Training GANs can be challenging and unstable. Techniques like Wasserstein GANs and progressively growing GANs have been developed to address stability issues (a minimal critic-update sketch follows this list).
• Hyperparameter Tuning: Proper hyperparameter selection is crucial for GANs to train stably and converge.
• Evaluation Metrics: Measuring the quality of generated samples is an open research problem, as traditional metrics like likelihood do not apply directly.
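The Training Stability bullet above names Wasserstein GANs. As a hedged illustration (this sketch is not part of the original notes), the core change in a WGAN is that the discriminator becomes an unbounded "critic" trained on a difference of mean scores instead of a BCE loss, kept approximately 1-Lipschitz here by weight clipping. The function name is hypothetical; the 0.01 clip value follows the original WGAN paper's default.

```python
import torch

def wgan_critic_step(critic, generator, real, z, opt_c, clip=0.01):
    """One WGAN critic update: maximize E[f(real)] - E[f(G(z))].

    `critic` outputs an unbounded real-valued score (no sigmoid), so we
    minimize the negated objective: mean score of fakes minus reals.
    """
    loss = critic(generator(z).detach()).mean() - critic(real).mean()
    opt_c.zero_grad()
    loss.backward()
    opt_c.step()
    # Weight clipping keeps the critic approximately 1-Lipschitz.
    for p in critic.parameters():
        p.data.clamp_(-clip, clip)
    return loss.item()
```

Later WGAN variants replace weight clipping with a gradient penalty, which tends to train more reliably.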
