3.4 SVM Classifier Training: The key feature that sets this classification algorithm apart from its predecessors is its approach to defining the decision boundary. A Support Vector Machine (SVM) seeks the hyperplane with the greatest margin, that is, the largest distance between the decision boundary and the data points of each class that lie closest to it (the support vectors). In the case of a linear SVM classifier, this hyperplane is a straight line between two classes, assigning every data point on one side to one class and those on the other side to the other.
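For illustration, a minimal sketch of training such a linear SVM on a NASA-style defect dataset with scikit-learn follows; the file name nasa_defects.csv, the label column "defects", and all parameter settings are assumptions made for the example, not details taken from the study.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

data = pd.read_csv("nasa_defects.csv")        # hypothetical dataset file
X = data.drop(columns=["defects"]).values     # static code metrics as features
y = data["defects"].values                    # 1 = defective module, 0 = clean

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

scaler = StandardScaler()                     # SVMs are sensitive to feature scale
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# A linear kernel draws a single maximum-margin hyperplane between the classes.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))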
Fig-4. Finding of defects in the dataset using K-nearest Algorithm

3.5 ELM Classifier Training: In an Extreme Learning Machine (ELM), the input weights can be assigned in various ways while remaining consistent with the technique above. Because the input weights are fixed and constant, the solution is straightforward and does not necessitate iteration; for a given linear output layer, it can be computed rapidly and easily. When the weight range of the solution is limited, symmetrical information sources produce a greater volume in the solution space.
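The paper describes this non-iterative training step but does not list it, so the following is a minimal sketch of it in NumPy; the hidden-layer width, the sigmoid activation, and the random seed are illustrative assumptions rather than choices taken from the study.

import numpy as np

def train_elm(X, y, n_hidden=50, seed=0):
    # Input weights and biases are drawn once at random and never updated.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # hidden-layer activations (sigmoid)
    # The linear output layer is solved in closed form via the pseudoinverse,
    # so no iterative optimization is required.
    beta = np.linalg.pinv(H) @ y
    return W, b, beta

def predict_elm(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta   # continuous scores; threshold at 0.5 for binary labels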
RESULTS

Conclusion

This study examines and compares two classification algorithms that are comparable in accuracy and in training time. The NASA dataset is used in this study to train the SVM and ELM classifiers. The best accuracy was provided by the ELM classifier model, which is also more efficient than the SVM. Both SVM and ELM are sophisticated machine learning algorithms that can be used to predict observations. Overfitting occurs when a model learns the training data too well and performs badly on new data; validation is a strategy used to prevent overfitting.
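Because validation is named here as the guard against overfitting, a brief sketch of k-fold cross-validation with scikit-learn follows; it assumes the X and y arrays from the SVM sketch above, and the choice of five folds is illustrative.

from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Each fold is held out once for testing, so the averaged score estimates
# performance on unseen data rather than on the training set.
scores = cross_val_score(SVC(kernel="linear"), X, y, cv=5)
print("Fold accuracies:", scores, "Mean:", scores.mean())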
References

1. LeCun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324.
2. Lawrence, S.; Giles, C.L.; Ah Chung, T.; Back, A.D. Face recognition: A convolutional neural-network approach. IEEE Trans. Neural Netw. 1997, 8, 98–113.
3. Hinton, G.E.; Salakhutdinov, R.R. Reducing the dimensionality of data with neural networks. Science 2006, 313, 504–507.
4. Yang, F.-J. An Implementation of Naive Bayes Classifier. In Proceedings of the 2018 International Conference on Computational Science and Computational Intelligence (CSCI), Las Vegas, NV, USA, 12–14 December 2018; pp. 301–306.
5. Boetticher, G.; Menzies, T.; Ostrand, T. PROMISE
Repository of Empirical Software Engineering Data; West
Virginia University, Department of Computer Science:
Morgantown, WV, USA, 2007.
6. Rath, S.K.; Sahu, M.; Das, S.P.; Mohapatra, S.K. Hybrid
Software Reliability Prediction Model Using Feature
Selection and Support Vector Classifier. In Proceedings of
the 2022 International Conference on Emerging Smart
Computing and Informatics (ESCI), Pune, India, 9–11
March 2022; pp. 1–4.
7. Rong, H.-J.; Ong, Y.-S.; Tan, A.-H.; Zhu, Z. A fast pruned extreme learning machine for classification problem. Neurocomputing 2008, 72, 359–366.
8. Dash, M.; Liu, H. Feature Selection for
Classification; Intelligent Data Analysis; Elsevier:
Amsterdam, The Netherlands, 1997; pp. 131–156.
9. Menzies, T.; Greenwald, J.; Frank, A. Data mining
static code attributes to learn defect predictors. IEEE
Trans. Softw. Eng. 2007, 33, 2–13.
10. Mahaweerawat, A.; Sophatsathit, P.; Lursinsap, C.; Musilek, P. MASP: An enhanced model of fault type identification in object-oriented software engineering. J. Adv. Comput. Intell. Intell. Inform. 2006, 10, 312–322.
11. Ritu, R. Concepts for Energy Management in the Evolution of Smart Grids. In Proceedings of the International Conference on Hybrid Intelligent Systems, December 2022; Springer Nature: Switzerland; pp. 917–928.
12. Bhambri, P. A CAD System for Software Effort Estimation. In Proceedings of the 2022 2nd International Conference on Technological Advancements in Computational Sciences (ICTACS), October 2022; IEEE; pp. 140–146.
13. Ritu; Bhambri, P. Software Effort Estimation with Machine Learning: A Systematic Literature Review. In Agile Software Development: Trends, Challenges and Applications; 2023; pp. 291–308.
14. Bieniek, A.; Moga, A. An efficient watershed algorithm based on connected components. Pattern Recogn. 2000, 33, 907–916.