Learn Everything AI

Uploaded by Umar Naseer

Copyright © All Rights Reserved
INTRO TO DEEP LEARNING
- Deep learning can deal with both structured and unstructured data.
- One of the biggest breakthroughs has been moving from sigmoid to ReLU activations, which makes gradient descent faster.
- Faster computation is important: it speeds up the iterative experiment cycle.
- Network architectures: standard NN, convolutional NN, recurrent NN.

SUPERVISED LEARNING
- Structured data (tables of features)        -> standard NN
- Image                                       -> convolutional NN (CNN)
- Audio -> text transcript                    -> recurrent NN (RNN)
- English -> Chinese                          -> recurrent NN (RNN)
- Image + radar -> position of other cars     -> custom / hybrid

WHY NOW?
- Scale drives progress: more labeled data, bigger networks, faster computation (GPUs), better algorithms (e.g. ReLU instead of sigmoid).

BINARY CLASSIFICATION: LOGISTIC REGRESSION AS A NEURAL NET
- The guess: y-hat = sigma(w^T x + b), with sigma(z) = 1 / (1 + e^(-z)).
- Finding the minimum (gradient descent): repeat w := w - alpha * dJ/dw (and likewise for b) until you find the best w, b.
- Putting it all together, the task is learning. But how? By optimizing how good the guess is, i.e. minimizing the difference between the guess (y-hat) and the truth (y):
    loss (one sample):      L(y-hat, y) = -(y log y-hat + (1 - y) log(1 - y-hat))
    cost (the entire set):  J(w, b) = (1/m) * sum_i L(y-hat^(i), y^(i))
- Calculate the gradient of the cost, take a step toward the minimum, and repeat until it converges.

2-LAYER NEURAL NET
- Input x -> hidden layer -> output layer -> y-hat.
- Each unit computes z = w^T x + b followed by an activation a = g(z); stacking layers gives z^[l] = W^[l] a^[l-1] + b^[l], a^[l] = g(z^[l]).

ACTIVATION FUNCTIONS
- sigmoid: only useful in the output layer of a binary classifier.
- tanh: almost always beats sigmoid in hidden layers.
- ReLU: the default choice for hidden units.
- leaky ReLU: variant that keeps a small gradient for negative inputs.

INITIALIZING A NEURAL NET
- What if we init everything to 0? All the units compute exactly the same features, and gradient descent never breaks the symmetry.
- So: random init, but we also want the weights small, e.g. W = np.random.randn(shape) * 0.01.

DEEP NEURAL NETS
- "Deep" simply means neural nets with many hidden layers (vs. shallow nets with one or two). Why deep neural nets?

(sketchnotes © @tessferrandez)
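The logistic-regression loop above (guess, loss, gradient step, repeat) can be sketched in NumPy. This is a minimal illustration, not code from the notes; the function name `train_logistic_regression` and the toy data are my own:

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, alpha=0.1, iterations=1000):
    """Gradient descent on the cross-entropy cost J = (1/m) * sum L(y_hat, y).

    X: (n_features, m) matrix of m samples, y: (1, m) labels in {0, 1}.
    """
    n, m = X.shape
    w = np.zeros((n, 1))   # zeros are fine here: logistic regression has no hidden units
    b = 0.0
    for _ in range(iterations):
        y_hat = sigmoid(w.T @ X + b)   # forward pass: the guess
        dz = y_hat - y                 # derivative of the loss w.r.t. z
        dw = (X @ dz.T) / m            # dJ/dw, averaged over the set
        db = dz.sum() / m              # dJ/db
        w -= alpha * dw                # gradient-descent step
        b -= alpha * db
    return w, b
```

For example, training on the four points of a logical AND (a linearly separable toy problem) recovers a separating boundary.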
WHY DEEP NEURAL NETS?
- Deep nets learn a hierarchy of features. Audio example: low-level waveform -> phonemes -> words -> sentences.
- It takes lots of data and lots of compute power to train them; compute on GPUs.
- Lots of hyperparameters: learning rate alpha, # hidden units, # iterations, choice of activation function, # hidden layers, momentum, mini-batch size, regularization, ...

SETTING UP YOUR ML APP
- Classic ML (100 - 10,000 samples): train / dev / test = 60% / 20% / 20%.
- Deep learning (~1M samples): train / dev / test = 98% / 1% / 1%.
- Dev and test should come from the same distribution, or your experiments will mislead you.

BIAS / VARIANCE  (assuming humans get ~0% error on the task)
    train error   dev error    diagnosis
    1%            11%          high variance
    15%           16%          high bias
    15%           30%          high bias & high variance
    0.5%          1%           low bias, low variance

THE ML RECIPE
- High bias (bad train performance)?   -> bigger network, train longer, (maybe a different NN architecture).
- High variance (bad dev performance)? -> more data, regularization, (maybe a different NN architecture).

REGULARIZATION: PREVENTING OVERFITTING      [Improving Deep Neural Nets: Coursera]
- L2 regularization: add (lambda / 2m) * ||W||^2 to the cost; weights shrink toward 0, giving simpler networks. (L1 regularization is the sparser alternative.)
- Dropout: nodes are randomly dropped during training based on a keep-prob; we get simpler networks and less chance to rely on single features.
- Data augmentation: generate new samples from existing ones.
- Early stopping: stop when dev error turns upward. Problem: it affects both bias and variance at once.

OPTIMIZING TRAINING: NORMALIZING INPUTS
- Step 1: center around 0 (subtract the mean).
- Step 2: scale so the variance is the same for every feature (divide by the standard deviation).
- Use the training set's mean/variance to normalize dev and test too.
- Why do we do this? If we normalize, the cost surface is rounder and we can use a much larger learning rate.

VANISHING / EXPLODING GRADIENTS
- Example: a deep net where every layer multiplies by the same W. If W = [[1.5, 0], [0, 1.5]] the activations explode; if W = [[0.5, 0], [0, 0.5]] they vanish, and gradient descent takes a very long time.
- Partial solution: choose initial values more carefully, e.g. for ReLU layers W^[l] = np.random.randn(shape) * sqrt(2 / n^[l-1]).

GRADIENT CHECKING
- If your cost does not decrease on each iteration, you may have a backprop bug.
- Gradient checking approximates the gradients numerically so you can verify your calculation.
- NOTE: only use gradient checking when debugging, since it is slow.

OPTIMIZATION ALGORITHMS: MINI-BATCH GRADIENT DESCENT
- Split your data into mini-batches and do a gradient-descent step after each batch; this way you make progress after just a short while instead of waiting for a full pass.
- Choosing the mini-batch size:
    size = m -> batch gradient descent (accurate, but each step is slow)
    size = 1 -> stochastic gradient descent (noisy, loses the vectorization speedup)
    something in between -> fastest learning in practice.

HUMAN-LEVEL PERFORMANCE
- Human-level error is used as a proxy for the Bayes (optimal) error.
- Avoidable bias = train error - human-level error. If it dominates: train a bigger network, train longer.
- Variance = dev error - train error. If it dominates: more data, regularization.
- Once you surpass human-level performance, improving gets harder: humans can no longer help (provide labels and insights), and the remaining errors are difficult to analyze.

OPTIMIZING VS SATISFICING METRICS
- Pick one optimizing metric to maximize (e.g. accuracy, or F1, which combines recall and precision) and treat the others as satisficing constraints (e.g. runtime <= 100 ms).

ERROR ANALYSIS                               [Structuring ML Projects: Coursera]
- Say you have 10% error on a cat classifier and notice some dogs are misclassified as cats. Should you work on dogs?
    1. Pick ~100 mislabeled dev-set examples.
    2. Count the error reasons (if only 5/100 are dogs, fixing dogs helps by at most 0.5%).
    3. Add an extra column to the error-analysis spreadsheet for each new reason you observe.
- Build your first system quick, then iterate.
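The mini-batch split described above can be sketched as a shuffle-then-slice helper. The function name `random_mini_batches` and the default batch size are illustrative choices, not from the notes:

```python
import numpy as np

def random_mini_batches(X, y, batch_size=64, seed=0):
    """Shuffle the m training columns, then slice them into mini-batches.

    X: (n_features, m), y: (1, m). The last batch may be smaller than batch_size.
    """
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    perm = rng.permutation(m)                  # shuffle so each batch is a random sample
    X_shuf, y_shuf = X[:, perm], y[:, perm]
    batches = []
    for start in range(0, m, batch_size):      # slice off one batch at a time
        end = start + batch_size
        batches.append((X_shuf[:, start:end], y_shuf[:, start:end]))
    return batches
```

A training loop would then run one gradient-descent step per `(X_batch, y_batch)` pair, reshuffling each epoch.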