BACHELOR OF TECHNOLOGY
(Computer Science and Engineering)
SUBMITTED BY:
PRAKASH GUPTA
Random Forest, and Neural Networks. The study involved collecting and
preprocessing historical weather data, and model performance was evaluated
with Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). Results
showed that ensemble methods like Random Forest and neural networks
outperformed simpler regression models, with potential applications across
weather-dependent sectors.
ACKNOWLEDGMENT
by Babu Banarasi Das University and the Department of Computer Science and Engineering.
skills.
build a strong foundation in machine learning. Through their guidance, I was
I also want to acknowledge my friends and family for their unwavering support
during this journey. Their constant motivation and belief in my abilities were
training experience.
developer. I have not only gained technical skills but also developed valuable
tailored to fields like Artificial Intelligence, Machine Learning, and Full Stack Development
prepares students for the demands of the tech industry. Its curriculum, designed
expertise.
One of Ikigai School’s standout features is its mentorship model, where students
work directly with industry experts who provide guidance on technical skills,
that each student can navigate complex topics and gain insights that are directly
stack applications and machine learning models, giving them the confidence to
tackle real-world challenges. The institute also offers career support, including
Chapter 1: Introduction
1.1 BACKGROUND OF THE TOPIC
1.2 THEORETICAL EXPLANATION
1.3 SOFTWARE TOOLS LEARNED
1.4 HARDWARE TOOLS LEARNED
3.1 RESULTS
3.2 DETAILED OBSERVATIONS
3.3 CHALLENGES
Chapter 4: Conclusion
4.1 CONCLUSION
APPENDIX
Chapter 1: Introduction
computational overhead.
analyze data, learn from it, and make predictions. For rainfall prediction,
1. Training Phase: The model learns from historical data, mapping input
features to outputs (rainfall amount or occurrence).
2. Prediction Phase: The trained model makes predictions on unseen data based on learned
patterns.
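These two phases can be illustrated with a minimal scikit-learn sketch on synthetic data; the feature set, model choice, and coefficients below are illustrative assumptions, not the report's actual setup:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins for historical weather features
# (e.g. temperature, humidity, pressure) and rainfall in mm.
X_hist = rng.random((100, 3))
y_hist = X_hist @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.1, 100)

# Training phase: learn a mapping from input features to rainfall.
model = LinearRegression().fit(X_hist, y_hist)

# Prediction phase: apply the learned mapping to unseen inputs.
X_new = rng.random((5, 3))
y_pred = model.predict(X_new)
print(y_pred.shape)  # (5,)
```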
Key ML Algorithms for Rainfall Prediction:
1. Regression Models: Linear Regression and Support Vector Regression (SVR).
2. Ensemble Methods: Random Forest and related techniques that combine multiple base models.
3. Neural Networks: models that can capture non-linear patterns in high-dimensional data, though they depend heavily on data volume and computational resources.
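As a rough sketch, one representative scikit-learn estimator per algorithm family could be instantiated as follows; the hyperparameters shown are illustrative defaults, not the configuration used in the report:

```python
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

# One representative estimator per algorithm family.
models = {
    "linear_regression": LinearRegression(),
    "svr": SVR(kernel="rbf", C=1.0),
    "random_forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "neural_network": MLPRegressor(hidden_layer_sizes=(64, 32),
                                   max_iter=1000, random_state=0),
}

for name, model in models.items():
    print(name, type(model).__name__)
```

Each estimator shares the same `fit`/`predict` interface, which makes it straightforward to train and compare all families on the same dataset.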
Machine learning workflows rely on various software tools for data analysis,
neural networks.
results presentation.
learning:
computing resources.
The training process began with sourcing weather datasets from publicly
available sources. Datasets included:
pressure.
Target: Daily rainfall data (in mm).
imputation techniques.
4. Splitting: Data was divided into training (80%) and testing (20%)
subsets.
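The 80/20 split described above corresponds to a standard `train_test_split` call; the data here is a synthetic placeholder for the weather features:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# 50 samples of 2 placeholder features and a rainfall-like target.
X = np.arange(100).reshape(50, 2)
y = np.arange(50, dtype=float)

# 80% training / 20% testing; random_state fixes the shuffle for
# reproducibility.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
print(len(X_train), len(X_test))  # 40 10
```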
rainfall. Dimensionality reduction (e.g., PCA) was used to retain only the most informative features.
relationships.
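A minimal PCA sketch, assuming synthetic features in place of the report's weather variables: passing a float to `n_components` keeps the smallest number of components that explain at least that fraction of the variance.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))   # 200 samples, 10 placeholder features

# Keep the fewest components explaining >= 95% of the variance.
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)           # (200, k) with k <= 10
```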
performance.
accuracy.
performance.
Models were evaluated using metrics such as MAE, RMSE, and R² scores.
results.
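The evaluation metrics named above are all available in scikit-learn; on a toy pair of actual and predicted rainfall values (illustrative numbers only), they can be computed as:

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Toy actual vs. predicted rainfall values (mm); illustrative only.
y_true = np.array([2.0, 0.0, 3.0, 5.0])
y_pred = np.array([2.5, 0.0, 2.0, 5.5])

mae = mean_absolute_error(y_true, y_pred)           # 0.5
rmse = np.sqrt(mean_squared_error(y_true, y_pred))  # ~0.612
r2 = r2_score(y_true, y_pred)                       # ~0.885
print(mae, rmse, r2)
```

MAE and RMSE are in the same units as the target (mm of rainfall), while R² is unitless, which is why reports typically quote all three.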
3.1 Results
The performance of trained models on the test dataset is summarized below:
1. Linear Regression:
patterns.
2. Random Forest:
3. Neural Networks:
computational resources.
dropout layers.
4. Visualization of Results:
Scatter plots comparing actual vs. predicted rainfall highlighted
prediction deviations.
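A scatter plot of this kind can be sketched with matplotlib on synthetic values standing in for the report's test-set predictions; points far from the diagonal line indicate prediction deviations:

```python
import matplotlib
matplotlib.use("Agg")            # headless backend; no display needed
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
y_true = rng.uniform(0, 20, 50)         # synthetic actual rainfall (mm)
y_pred = y_true + rng.normal(0, 2, 50)  # synthetic noisy predictions

fig, ax = plt.subplots()
ax.scatter(y_true, y_pred, alpha=0.6)
ax.plot([0, 20], [0, 20], "r--", label="perfect prediction")
ax.set_xlabel("Actual rainfall (mm)")
ax.set_ylabel("Predicted rainfall (mm)")
ax.legend()
fig.savefig("actual_vs_predicted.png")
```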
3.3 Challenges
1. Data Limitations:
generalizability.
2. Computational Costs:
3. Model Interpretability:
interpretability.
4.2 References
Appendix
2. Visualization Snapshots:
training.