Unit 4
Using the CART algorithm, we can build the tree. The resulting decision tree looks like this:

Fig 4.2: Predicting Disease based on Age and Cholesterol Levels
[Tree diagram: the root tests the patient's age; the left branch tests "Cholesterol <= 200" and ends in "Disease: Yes" / "Disease: No" leaves; the right branch ends in a "Disease: No" leaf.]

The decision tree in this illustration begins at the top, where the age condition is evaluated. If a patient is below the age threshold, we proceed to the left branch, where the "Cholesterol <= 200" condition is examined. The forecast is "Yes Disease" if the cholesterol level is less than or equal to 200, and "No Disease" if the patient's cholesterol level is more than 200. However, if the patient is above the age threshold, we switch to the right branch, where "No Disease" is predicted.

4.3 CHAID

4.3.1 Chi-Square Automatic Interaction Detection

CHAID (Chi-Square Automatic Interaction Detection) is a statistical method used to detect the interaction between different categories of variables. It is particularly useful when the variables are categorical.

© Department of Distance & Continuing Education, Campus of Open Learning, School of Open Learning, University of Delhi

Introduction to Business Analytics

Fig: Determining Satisfaction Levels of Customers
[Tree diagram: Age Group at the root, then Gender, then Purchase Frequency (Low / Medium / High), ending in "Satisfied" / "Not Satisfied" leaves.]

This flowchart shows how CHAID creates a hierarchical structure. It enables us to visualise the links between the most important predictor factors and their effects on the target variable (Customer Satisfaction). Age Group is the first variable on the flowchart, and it has two branches: "Young" and "Middle-aged." We further examine the Gender variable within the "Young" branch, with branches for "Male" and "Female." The Purchase Frequency variable is next examined for each gender subgroup, yielding three branches: "Low," "Medium," and "High." These lead to the leaf nodes, which represent the customer satisfaction outcome and are either "Satisfied" or "Not Satisfied."
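The prediction logic of the tree in Fig 4.2 can be sketched as a plain rule function. Note that the age cutoff of 50 is a hypothetical value — the actual threshold is not legible in the source figure:

```python
def predict_disease(age, cholesterol, age_cutoff=50):
    """Follow the decision tree of Fig 4.2.

    age_cutoff is a hypothetical placeholder; the figure's real
    threshold is not legible in the source.
    """
    if age < age_cutoff:          # left branch: younger patients
        if cholesterol <= 200:
            return "Yes Disease"
        return "No Disease"
    return "No Disease"           # right branch: older patients

print(predict_disease(45, 180))   # left branch, cholesterol <= 200 -> "Yes Disease"
print(predict_disease(45, 220))   # left branch, cholesterol > 200 -> "No Disease"
print(predict_disease(60, 180))   # right branch -> "No Disease"
```

A fitted CART model is just a nested set of such threshold tests, which is why the tree can be read off directly as if/else rules.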
4.3.2 Bonferroni Correction

The Bonferroni correction is a statistical method used to adjust the significance levels (p-values) when conducting multiple hypothesis tests at the same time. It helps control the overall chance of falsely claiming a significant result by making the criteria for significance more strict.

To apply the Bonferroni correction, we divide the desired significance level (usually denoted α) by the number of tests being performed (denoted as m). This adjusted significance level, denoted as α′ or α_B, becomes the new threshold for determining statistical significance.

Mathematically, the Bonferroni correction can be represented as:

α′ = α / m

For example, suppose we are conducting 10 hypothesis tests and we want a significance level of 0.05 (α = 0.05). By applying the Bonferroni correction, we divide α by 10, resulting in an adjusted significance level:

α′ = 0.05 / 10 = 0.005

Now, when we assess the p-values obtained from each test, we compare them against the adjusted significance level (α′) instead of the original α. If a p-value is less than or equal to α′, we consider the result to be statistically significant.

Let's consider an example. Suppose we have conducted 10 independent hypothesis tests and we obtain p-values of 0.02, 0.07, 0.01, 0.03, 0.04, 0.09, 0.06, 0.08, 0.05, and 0.02. Using the Bonferroni correction with α = 0.05 and m = 10, the adjusted significance level becomes α′ = 0.05 / 10 = 0.005.
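The computation above is a one-liner; a minimal sketch applied to the ten p-values from the example:

```python
def bonferroni_adjust(p_values, alpha=0.05):
    """Return the Bonferroni-adjusted threshold and a significance flag per test."""
    alpha_adj = alpha / len(p_values)           # alpha' = alpha / m
    return alpha_adj, [p <= alpha_adj for p in p_values]

p_values = [0.02, 0.07, 0.01, 0.03, 0.04, 0.09, 0.06, 0.08, 0.05, 0.02]
alpha_adj, significant = bonferroni_adjust(p_values)
print(alpha_adj)         # 0.005
print(sum(significant))  # 0 -- none of the ten p-values fall at or below 0.005
```

Note the effect of the correction: several of these p-values would clear the unadjusted α = 0.05, but none survive the stricter α′ = 0.005 threshold.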
Quality and Number of Clusters

The Minkowski distance is a generalized distance measure that includes the Euclidean and Manhattan distances as special cases. For two vectors p = (p1, p2, ..., pn) and q = (q1, q2, ..., qn), it is given by

d(p, q) = (|p1 − q1|^r + |p2 − q2|^r + ... + |pn − qn|^r)^(1/r)

where r = 2 yields the Euclidean distance and r = 1 the Manhattan distance.

Cosine similarity measures the cosine of the angle between two vectors, indicating how similar their directions are. This makes it particularly suitable for dealing with high-dimensional data.

(A) Quality of Clustering

Several factors are important in assessing the quality of clusters:

• Compactness: refers to how close together the points within a cluster are.
• Separability: refers to how distinct the clusters are from one another; good clusters should be well separated.
• Stability: measures the consistency of clustering results under perturbations, such as different initializations or subsets of the data. A stable clustering is less prone to variations and demonstrates robustness.
• Application-specific Measures: Depending on the application domain, measures specific to the problem can be used. For example, the silhouette coefficient can be used to evaluate how effectively a segmentation captures meaningful customer groups.

(B) Determining the Optimal Number of Clusters in K-means Clustering

Determining the optimal number of clusters, K, in K-means clustering is a crucial step in clustering analysis. Selecting the appropriate number of clusters is important for obtaining reliable results and extracting meaningful information from the data.
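The two distance measures above can be sketched in a few lines of plain Python; Euclidean and Manhattan fall out as the r = 2 and r = 1 cases of the Minkowski formula:

```python
import math

def minkowski_distance(p, q, r=2):
    """Minkowski distance: r=2 gives Euclidean, r=1 gives Manhattan."""
    return sum(abs(pi - qi) ** r for pi, qi in zip(p, q)) ** (1 / r)

def cosine_similarity(p, q):
    """Cosine of the angle between p and q; 1 means identical direction."""
    dot = sum(pi * qi for pi, qi in zip(p, q))
    norm_p = math.sqrt(sum(pi * pi for pi in p))
    norm_q = math.sqrt(sum(qi * qi for qi in q))
    return dot / (norm_p * norm_q)

print(minkowski_distance((0, 0), (3, 4), r=2))  # 5.0 (Euclidean)
print(minkowski_distance((0, 0), (3, 4), r=1))  # 7.0 (Manhattan)
```

Cosine similarity ignores vector magnitude entirely: (1, 2) and (2, 4) point the same way, so their similarity is 1.0 even though their lengths differ.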
Several methods are commonly used to determine the optimal number of clusters:

• Elbow Method: "The elbow method involves plotting the within-cluster sum of squares (WCSS) as a function of the number of clusters." The plot resembles an arm, and the optimal number of clusters is identified at the "elbow point", where the rate of decrease in WCSS slows down significantly. In the figure below, k = 3 is the optimal number of clusters.

Fig 4.10: Elbow Method
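The elbow method can be illustrated with a small, self-contained k-means sketch. This is plain Python on toy data with three well-separated blobs; the iteration cap and the deterministic farthest-point initialization are implementation choices for the sketch, not part of the method itself:

```python
def kmeans_wcss(points, k, iters=30):
    """Basic Lloyd's k-means (farthest-point init); returns the WCSS."""
    def sqdist(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    # Deterministic init: repeatedly pick the point farthest from chosen centroids.
    centroids = [points[0]]
    while len(centroids) < k:
        centroids.append(max(points, key=lambda p: min(sqdist(p, c) for c in centroids)))

    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda j: sqdist(p, centroids[j]))].append(p)
        # Move each centroid to the mean of its cluster (keep it if the cluster is empty).
        centroids = [(sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
                     if cl else centroids[j] for j, cl in enumerate(clusters)]

    return sum(min(sqdist(p, c) for c in centroids) for p in points)

# Three well-separated toy blobs: WCSS drops sharply up to k = 3, then flattens,
# so the "elbow" of the curve sits at k = 3.
blobs = [(cx + 0.1 * dx, cy + 0.1 * dy)
         for cx, cy in [(0, 0), (10, 0), (5, 9)]
         for dx in range(5) for dy in range(5)]
wcss = {k: kmeans_wcss(blobs, k) for k in range(1, 6)}
for k, w in sorted(wcss.items()):
    print(k, round(w, 2))
```

Plotting k against these WCSS values reproduces the arm-shaped curve described above: large drops from k = 1 to k = 3, then only marginal improvement beyond the elbow.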