

Machine Learning Lab (Semester 6)

Course Code: BCSL606
Teaching Hours/Week (L:T:P:S): 0:0:2:0
Credits: 01
CIE Marks: 50
SEE Marks: 50
Exam Hours: 100
Examination type (SEE): Practical
Course objectives:
1. To become familiar with data and visualize univariate, bivariate, and multivariate data using statistical
techniques and dimensionality reduction.
2. To understand various machine learning algorithms such as similarity-based learning, regression, decision
trees, and clustering.
3. To become familiar with learning theories and probability-based models, and to develop the skills required for
decision-making in dynamic environments.

Sl.NO Experiments
1 Develop a program to create histograms for all numerical features and analyze the distribution of each feature.
Generate box plots for all numerical features and identify any outliers. Use the California Housing dataset.

Book 1: Chapter 2
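A minimal sketch for Experiment 1, assuming a Python environment with scikit-learn (for the fetch_california_housing loader), pandas, and matplotlib; the syllabus does not prescribe particular libraries:

```python
# Histograms and box plots for all numerical features of California Housing.
import matplotlib.pyplot as plt
from sklearn.datasets import fetch_california_housing

# Load the dataset as a pandas DataFrame (8 features plus the MedHouseVal target).
df = fetch_california_housing(as_frame=True).frame

# Histograms of every numerical feature to inspect each distribution.
df.hist(bins=30, figsize=(12, 8))
plt.suptitle("Feature distributions (California Housing)")
plt.tight_layout()
plt.show()

# Box plots of every numerical feature to highlight outliers.
fig, axes = plt.subplots(3, 3, figsize=(12, 8))
for ax, col in zip(axes.ravel(), df.columns):
    ax.boxplot(df[col])
    ax.set_title(col)
plt.tight_layout()
plt.show()
```
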
2 Develop a program to compute the correlation matrix to understand the relationships between pairs of
features. Visualize the correlation matrix using a heatmap to identify which variables have strong
positive/negative correlations. Create a pair plot to visualize pairwise relationships between features. Use
the California Housing dataset.

Book 1: Chapter 2
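A minimal sketch for Experiment 2, assuming seaborn is used for the heatmap and pair plot (one reasonable choice; the syllabus leaves the plotting library open):

```python
# Correlation matrix, heatmap, and pair plot for the California Housing features.
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.datasets import fetch_california_housing

df = fetch_california_housing(as_frame=True).frame

# Correlation matrix between every pair of numerical features.
corr = df.corr()
print(corr)

# Heatmap makes strong positive/negative correlations easy to spot.
sns.heatmap(corr, annot=True, fmt=".2f", cmap="coolwarm")
plt.title("Correlation heatmap")
plt.show()

# Pair plot of a subset of features (the full 9x9 grid is slow to render).
sns.pairplot(df[["MedInc", "AveRooms", "HouseAge", "MedHouseVal"]], diag_kind="hist")
plt.show()
```
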
3 Develop a program to implement Principal Component Analysis (PCA) for reducing the dimensionality of the
Iris dataset from 4 features to 2.

Book 1: Chapter 2
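A minimal sketch for Experiment 3 using scikit-learn's PCA on the built-in Iris loader:

```python
# PCA: project the 4-dimensional Iris data onto 2 principal components.
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

iris = load_iris()
X, y = iris.data, iris.target          # 150 samples, 4 features

pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)            # shape (150, 2)
print("Explained variance ratio:", pca.explained_variance_ratio_)

# Scatter plot of the two principal components, coloured by species.
for label in range(3):
    plt.scatter(X_2d[y == label, 0], X_2d[y == label, 1],
                label=iris.target_names[label])
plt.xlabel("PC1")
plt.ylabel("PC2")
plt.legend()
plt.show()
```
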
4 For a given set of training data examples stored in a .CSV file, implement and demonstrate the Find-S
algorithm to output the most specific hypothesis consistent with the training examples.

Book 1: Chapter 3
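A minimal sketch of Find-S for Experiment 4. The file name training_data.csv and the convention that the last column holds a Yes/No label are assumptions, since the syllabus only says the examples are stored in a .CSV file:

```python
# Find-S: return the most specific hypothesis consistent with the positive examples.
import csv

def find_s(rows):
    hypothesis = None
    for row in rows:
        attributes, label = row[:-1], row[-1]
        if label.strip().lower() != "yes":     # Find-S ignores negative examples
            continue
        if hypothesis is None:
            hypothesis = list(attributes)      # initialise with the first positive example
        else:
            # Generalise: replace mismatching attribute values with '?'
            hypothesis = [h if h == a else "?" for h, a in zip(hypothesis, attributes)]
    return hypothesis

# "training_data.csv" is a hypothetical file name; its last column is assumed to be the label.
with open("training_data.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)                               # skip the header row
    print("Final hypothesis:", find_s(list(reader)))
```
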
5 Develop a program to implement the k-Nearest Neighbour algorithm to classify 100 randomly generated values
of x in the range [0, 1]. Perform the following on the generated dataset:

1. Label the first 50 points {x1, ……, x50} as follows: if (xi ≤ 0.5), then xi ∊ Class1, else xi ∊ Class2.
2. Classify the remaining points x51, ……, x100 using KNN. Perform this for k = 1, 2, 3, 4, 5, 20, 30.

Book 2: Chapter – 2
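A minimal sketch for Experiment 5, using scikit-learn's KNeighborsClassifier and a fixed random seed (an assumption, for reproducibility):

```python
# k-NN classification of 100 random points in [0, 1), labelled by the rule in the syllabus.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
x = rng.random(100).reshape(-1, 1)            # 100 values, as a column vector

# Label the first 50 points: Class1 if x <= 0.5, otherwise Class2.
y_train = np.where(x[:50, 0] <= 0.5, "Class1", "Class2")
X_train, X_test = x[:50], x[50:]

for k in [1, 2, 3, 4, 5, 20, 30]:
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(X_train, y_train)
    y_pred = knn.predict(X_test)
    # The true labels of the test points follow the same rule, so accuracy can be reported.
    y_true = np.where(X_test[:, 0] <= 0.5, "Class1", "Class2")
    print(f"k={k:>2}: accuracy = {np.mean(y_pred == y_true):.2f}")
```
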
6 Implement the non-parametric Locally Weighted Regression algorithm to fit data points. Select an
appropriate data set for your experiment and draw graphs.

Book 1: Chapter – 4
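A minimal sketch of Locally Weighted Regression for Experiment 6, implemented from the weighted least-squares formula and fitted to a synthetic noisy sine curve (the syllabus leaves the data set choice to the student):

```python
# Locally Weighted Regression (LWR) with a Gaussian kernel.
import numpy as np
import matplotlib.pyplot as plt

def lwr_predict(x_query, X, y, tau=0.3):
    """Predict y at x_query using weighted least squares with bandwidth tau."""
    Xb = np.c_[np.ones_like(X), X]                       # design matrix with bias column
    xq = np.array([1.0, x_query])
    w = np.exp(-((X - x_query) ** 2) / (2 * tau ** 2))   # point-wise weights
    W = np.diag(w)
    # theta = (X^T W X)^(-1) X^T W y
    theta = np.linalg.pinv(Xb.T @ W @ Xb) @ Xb.T @ W @ y
    return xq @ theta

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 2 * np.pi, 100))
y = np.sin(X) + rng.normal(scale=0.2, size=X.shape)

x_grid = np.linspace(0, 2 * np.pi, 200)
y_fit = [lwr_predict(xq, X, y, tau=0.3) for xq in x_grid]

plt.scatter(X, y, s=10, label="noisy samples")
plt.plot(x_grid, y_fit, color="red", label="LWR fit (tau=0.3)")
plt.legend()
plt.show()
```
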
7 Develop a program to demonstrate the working of Linear Regression and Polynomial Regression. Use the Boston
Housing dataset for Linear Regression and the Auto MPG dataset (for vehicle fuel efficiency prediction) for
Polynomial Regression.

Book 1: Chapter – 5
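A minimal sketch for Experiment 7. Since load_boston was removed from recent scikit-learn releases, the Boston data is read from the original CMU StatLib source using the recipe given in the scikit-learn deprecation notice, and Auto MPG is loaded through seaborn; both downloads assume internet access:

```python
# Linear Regression on Boston Housing and Polynomial Regression on Auto MPG.
import numpy as np
import pandas as pd
import seaborn as sns
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# --- Linear Regression on Boston Housing (each record spans two lines in the raw file) ---
raw = pd.read_csv("http://lib.stat.cmu.edu/datasets/boston",
                  sep=r"\s+", skiprows=22, header=None)
X = np.hstack([raw.values[::2, :], raw.values[1::2, :2]])   # 13 features
y = raw.values[1::2, 2]                                     # median house value
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)
lin = LinearRegression().fit(X_tr, y_tr)
print("Boston Housing R^2:", r2_score(y_te, lin.predict(X_te)))

# --- Polynomial Regression on Auto MPG (horsepower -> mpg) ---
mpg = sns.load_dataset("mpg").dropna()
Xp, yp = mpg[["horsepower"]], mpg["mpg"]
poly = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(Xp, yp)
print("Auto MPG R^2 (degree-2 polynomial):", r2_score(yp, poly.predict(Xp)))
```
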
8 Develop a program to demonstrate the working of the decision tree algorithm. Use the Breast Cancer dataset to
build the decision tree and apply this knowledge to classify a new sample.

Book 2: Chapter – 3
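A minimal sketch for Experiment 8 using scikit-learn's DecisionTreeClassifier on its built-in Breast Cancer loader; the "new sample" is taken from the held-out test split:

```python
# Decision tree on the Breast Cancer data set, then classification of a new sample.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

data = load_breast_cancer()
X_tr, X_te, y_tr, y_te = train_test_split(data.data, data.target, random_state=42)

tree = DecisionTreeClassifier(max_depth=4, random_state=42).fit(X_tr, y_tr)
print("Test accuracy:", tree.score(X_te, y_te))

# Apply the learned tree to a "new" sample (here the first test instance).
new_sample = X_te[0].reshape(1, -1)
print("Predicted class:", data.target_names[tree.predict(new_sample)[0]])
```
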


9 Develop a program to implement the Naive Bayesian classifier using the Olivetti Faces dataset for training.
Compute the accuracy of the classifier on a few test samples.

Book 2: Chapter – 4
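A minimal sketch for Experiment 9. Gaussian Naive Bayes is assumed because the Olivetti pixel intensities are continuous; fetch_olivetti_faces downloads the data on first use:

```python
# Gaussian Naive Bayes on the Olivetti Faces data set.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

faces = fetch_olivetti_faces(shuffle=True, random_state=42)
X, y = faces.data, faces.target          # 400 images of 40 people, 4096 pixels each

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=42)
nb = GaussianNB().fit(X_tr, y_tr)
print("Test accuracy:", accuracy_score(y_te, nb.predict(X_te)))
```
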
10 Develop a program to implement k-means clustering using the Wisconsin Breast Cancer dataset and visualize the
clustering result.

Book 2: Chapter – 4
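A minimal sketch for Experiment 10 using scikit-learn's KMeans on the built-in Wisconsin Breast Cancer loader; the 30-dimensional data is projected onto two principal components purely for visualization (an added step, not required by the syllabus):

```python
# k-means on the Wisconsin Breast Cancer data, visualised in 2-D via PCA.
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = StandardScaler().fit_transform(load_breast_cancer().data)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=42)
labels = kmeans.fit_predict(X)

# Project both the points and the cluster centres onto 2 principal components.
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)
centres_2d = pca.transform(kmeans.cluster_centers_)

plt.scatter(X_2d[:, 0], X_2d[:, 1], c=labels, cmap="viridis", s=15)
plt.scatter(centres_2d[:, 0], centres_2d[:, 1], c="red", marker="x", s=100, label="centroids")
plt.xlabel("PC1")
plt.ylabel("PC2")
plt.legend()
plt.title("k-means clusters (PCA projection)")
plt.show()
```
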
Course outcomes (Course Skill Set):

At the end of the course the student will be able to:


1. Illustrate the principles of multivariate data and apply dimensionality reduction techniques.
2. Demonstrate similarity-based learning methods and perform regression analysis.
3. Develop decision trees for classification and regression problems, and Bayesian models for probabilistic
learning.
4. Implement clustering algorithms on a suitable dataset and visualize the clustering results.


Assessment Details (both CIE and SEE)


The weightage of Continuous Internal Evaluation (CIE) is 50% and that of the Semester End Examination (SEE) is
50%. The minimum passing mark for the CIE is 40% of the maximum marks (20 marks out of 50), and for the SEE the
minimum passing mark is 35% of the maximum marks (18 out of 50 marks). A student shall be deemed to have
satisfied the academic requirements and earned the credits allotted to each subject/course if the student
secures a minimum of 40% (40 marks out of 100) in the sum total of the CIE (Continuous Internal Evaluation)
and SEE (Semester End Examination) taken together.

Continuous Internal Evaluation (CIE):


CIE marks for the practical course are 50 Marks.
The split-up of CIE marks for the record/journal and test is in the ratio 60:40.
1. Each experiment is to be evaluated for conduction with an observation sheet and record write-up.
Rubrics for the evaluation of the journal/write-up for hardware/software experiments are designed
by the faculty who is handling the laboratory session and are made known to students at the
beginning of the practical session.

2. The record should contain all the experiments specified in the syllabus, and each experiment write-up will
be evaluated for 10 marks.

3. Total marks scored by the students are scaled down to 30 marks (60% of maximum marks).

4. Weightage is to be given for neatness and timely submission of the record/write-up.

5. The department shall conduct a test of 100 marks after the completion of all the experiments listed in the
syllabus.

6. In the test, the write-up, conduct of the experiment, acceptable result, and procedural knowledge will
carry a weightage of 60%, and the remaining 40% is for the viva-voce.

7. Suitable rubrics can be designed to evaluate each student’s performance and learning ability.

8. The marks scored shall be scaled down to 20 marks (40% of the maximum marks).

The sum of the scaled-down marks scored in the report write-up/journal and the marks of the test is the total
CIE marks scored by the student.
Semester End Evaluation (SEE):
1. SEE marks for the practical course are 50 Marks.

2. SEE shall be conducted jointly by two examiners of the same institute; the examiners are appointed
by the Head of the Institute.

3. The examination schedule and the names of the examiners are informed to the university before the
conduct of the examination. These practical examinations are to be conducted within the schedule
mentioned in the academic calendar of the University.

4. All laboratory experiments are to be included for practical examination.


5. (Rubrics) The breakup of marks and the instructions printed on the cover page of the answer script are
to be strictly adhered to by the examiners, OR, based on the course requirement, evaluation rubrics
shall be decided jointly by the examiners.

6. Students can pick one question (experiment) from the question lot prepared jointly by the examiners.

7. Evaluation of the test write-up/conduct procedure and result/viva will be conducted jointly by the
examiners.

8. General rubrics suggested for SEE: write-up 20%, conduct procedure and result 60%, and viva-voce 20% of the
maximum marks. SEE for the practical course shall be evaluated for 100 marks, and the scored marks shall be
scaled down to 50 marks (however, based on the course type, rubrics shall be decided by the examiners).

A change of experiment is allowed only once, and 15% of the marks allotted to the procedure part are to be
made zero.

The minimum duration of the SEE is 02 hours.

Suggested Learning Resources:


Books:

1. S Sridhar and M Vijayalakshmi, “Machine Learning”, Oxford University Press, 2021.


2. M N Murty and Ananthanarayana V S, “Machine Learning: Theory and Practice”, Universities Press (India) Pvt.
Limited, 2024.

Web links and Video Lectures (e-Resources):

1. https://www.drssridhar.com/?page_id=1053
2. https://www.universitiespress.com/resources?id=9789393330697
3. https://onlinecourses.nptel.ac.in/noc23_cs18/preview

