0% found this document useful (0 votes)
2 views

Optimization_of_Sentiment_Analysis_using_BERT

This document presents a study on enhancing sentiment analysis using the BERT model, demonstrating its superior performance compared to traditional methods like SVM and Bi-LSTM. The research highlights the effectiveness of BERT in accurately capturing sentiment nuances within textual data, achieving a remarkable accuracy of 99.11% on the IMDb dataset. Future research directions include refining preprocessing techniques and exploring multimodal data integration to further improve sentiment analysis capabilities.

Uploaded by

gunjiambika2004
Copyright
© © All Rights Reserved
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
2 views

Optimization_of_Sentiment_Analysis_using_BERT

This document presents a study on enhancing sentiment analysis using the BERT model, demonstrating its superior performance compared to traditional methods like SVM and Bi-LSTM. The research highlights the effectiveness of BERT in accurately capturing sentiment nuances within textual data, achieving a remarkable accuracy of 99.11% on the IMDb dataset. Future research directions include refining preprocessing techniques and exploring multimodal data integration to further improve sentiment analysis capabilities.

Uploaded by

gunjiambika2004
Copyright
© © All Rights Reserved
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 5

2024 International Conference on Communication, Computer Sciences and Engineering (IC3SE)

Optimization of Sentiment Analysis using BERT


Bhaskar Kewalramani1 Suresh Kumar2
Department of Computer Science & Engineering Department of Computer Science & Engineering
Netaji Subhas University of Technology Netaji Subhas University of Technology
Delhi, India Delhi, India
[email protected] [email protected]

Abstract— This study presents a comprehensive exploration demonstrating its effectiveness in capturing contextual
2024 International Conference on Communication, Computer Sciences and Engineering (IC3SE) | 979-8-3503-6684-6/24/$31.00 ©2024 IEEE | DOI: 10.1109/IC3SE62002.2024.10593503

of a sentiment analysis algorithm enhanced by the BERT information. Several studies have investigated the application
(Bidirectional Encoder Representations from Transformers) of BERT in information retrieval tasks. An empirical study by
model. The BERT model’s integration into our sentiment [4] highlighted the potential of BERT-based models for
analysis algorithm has resulted in performance that surpasses document retrieval, emphasizing their ability to improve
traditional methods, including Support Vector Machines retrieval accuracy. Similarly, [5] adapted BERT for passage
(SVM), Logistic Regression (LR), and Bidirectional Long Short- retrieval in question answering systems, showing its utility in
Term Memory (Bi-LSTM) networks. The findings of this enhancing retrieval precision. A contextual query expansion
research are poised to contribute significantly to the field of
model utilizing BERT-based deep neural embeddings was
sentiment analysis, providing a robust framework for future
applications in understanding customer feedback.
proposed by [9], demonstrating its efficacy in enhancing
retrieval performance. [3] One of the prominent applications
Keywords— Sentiment prediction, BERT model, NLP, of BERT lies in information retrieval tasks. Empirical studies,
Recurrent Neural Networks (RNN), Text classification, Natural such as the one conducted by [22], have explored the
Language Processing, Machine Learning, Market research, effectiveness of BERT-based models for document retrieval,
Textual data analysis, Language modeling demonstrating their potential to enhance retrieval accuracy.
Additionally, researchers like [23] have adapted BERT for
I. INTRODUCTION passage retrieval in question answering systems, highlighting
Sentiment analysis is essential in the realm of natural its utility in improving retrieval precision. In the realm of
language processing, gaining increasing significance in sentiment analysis, BERT has emerged as a prominent tool.
deciphering human emotions conveyed through written text. [11] conducted sentiment analysis on IMDB using lexicon and
The rapid growth of social media content, consumer feedback, neural networks, while [12] compared lexical, traditional
and a wide range of textual datasets has spurred a heightened machine learning, and deep learning approaches for sentiment
demand for sophisticated sentiment analysis algorithms. Our analysis in digital learning, both illustrating the effectiveness
approach aims to provide a fresh perspective on sentiment of BERT-based methods in capturing sentiment nuances.
analysis, departing from conventional methodologies. The BERT has been applied in specialized domains such as stock
escalating volume of textual data across digital platforms market sentiment analysis ([14]) and review reading
underscores the imperative need for precise sentiment comprehension ([15]), showcasing its versatility across
interpretation. Accurate sentiment analysis holds profound different domains and its potential to extract valuable insights
importance across a spectrum of domains, including market from textual data. Specialized domains, such as stock market
research and social media monitoring.[2][3]While in the past, sentiment analysis, have also witnessed the integration of
models based on Recurrent Neural Networks (RNNs) have BERT-based methodologies [26]. Studies like [27] and [28]
been popular for sentiment analysis (SA) and text have investigated the impact of sentiment analysis on financial
classification tasks, BERT-based Transformer models have markets, utilizing BERT to glean insights from textual data.
emerged as a compelling alternative. The advent of Researchers have explored enhancements and adaptations of
Transformer models in 2017 by Google marked a paradigm BERT for specific tasks. [29] proposed a hybrid deep learning
shift in language translation tasks, surpassing the capabilities approach for sentiment analysis on mixed-language customer
of traditional DL models.[9][11] To address these issues, our reviews, leveraging the power of BERT in discerning
comparative analysis aimed to assess the efficacy of a fine- sentiments across diverse linguistic contexts. Similarly, [30]
tuned BERT model with other deep learning models.[14][16] utilized BERT in a hybrid deep learning framework for
Through meticulous experimentation and evaluation, our sentiment analysis, showcasing its efficacy in analysing
comparative analysis has demonstrated the superiority of our sentiment in a multilingual setting, while [8] compared
fine-tuned BERT model over compared to various other deep machine learning algorithms with transformer-based methods
learning models regarding accuracy and performance for multiclass sentiment analysis, providing insights into their
metrics.[28][24] relative performance. Recent studies have focused on fine-
tuning BERT for specific applications. [17] proposed a fine-
II. MAIN WORK tuned BERT model for sentiment analysis of customer
reviews, demonstrating its effectiveness in capturing nuanced
A. Literature review sentiments.[19][23]
In recent years, the utilization of BERT (Bidirectional
Encoder Representations from Transformers) in various B. Methodology
natural language processing (NLP) tasks has garnered Our methodology, operationalized through Python in
significant attention due to its remarkable performance in Jupyter notebooks, not only refines the sentiment analysis
understanding language nuances [1]. BERT was introduced as process but also pioneers the application of transformer-based
a pre-training technique for deep bidirectional transformers, models in natural language processing. The results of our

1000
979--83503-6684-6/24/$31.00 ©2024 IEEE
Authorized licensed use limited to: Zhejiang University. Downloaded on January 17,2025 at 06:31:03 UTC from IEEE Xplore. Restrictions apply.
2024 International Conference on Communication, Computer Sciences and Engineering (IC3SE)

research highlight the superiority of BERT over traditional F. Sentence Pair Classifier Task
models, cementing its position as a powerful tool in the BERT's initial pre-training aims to facilitate fine-tuning
machine learning toolkit for sentiment analysis. for specific tasks without substantial model alterations.
Our investigation presents a groundbreaking approach to Adding a single output layer typically suffices to tailor the
sentiment analysis by integrating the BERT (Bidirectional model for diverse tasks. The Sentence Pair Classifier task
Encoder Representations from Transformers) model with the gauges semantic relations between two sentences, evaluating
IMDb dataset, which we have sourced from Kaggle. This the model's comprehensive grasp of natural Language and its
integration is not a mere juxtaposition of existing inference capabilities.
methodologies but a deliberate synthesis that exploits BERT’s G. Pre-training tasks
advanced deep learning capabilities. Our empirical analysis
reveals that BERT significantly surpasses conventional NLP encompasses diverse tasks often constrained by
models such as Support Vector Machines (SVM), Logistic limited labeled training data. Leveraging vast unannotated text
Regression, and Bidirectional Long Short-Term Memory (Bi- data for pre-training enhances deep learning model
LSTM) networks in performance metrics.The architectural performance, akin to ImageNet's impact in computer vision.
sophistication of BERT, characterized by its bidirectional Language models learn contextual word representations
training and transformer mechanisms, allows for an enhanced through techniques like word embeddings, fostering similar
understanding of context and semantics in textual data. This representations for contextually related words.
sophistication translates into a marked increase in the H. Masked Language Model
accuracy of sentiment classification tasks. Through the
The Masked Language Model employed by BERT
strategic incorporation of BERT into our analytical
employs a mask token [MASK] for pre-training intricate
framework, we have realized a substantial improvement in the
bidirectional representations. Unlike traditional conditional
detection of emotions within text. The exceptional
language models that predict words strictly in a left-to-right or
performance of the BERT model, as demonstrated on the
right-to-left fashion, BERT adopts a strategy of randomly
IMDb dataset from Kaggle, signifies a substantial
masking words during pre-training. This approach eliminates
advancement in the domain of natural language processing
limitations associated with conditional models, enabling
and sentiment analysis.
comprehensive bidirectional learning.
C. BERT
I. Next Sentence Prediction
BERT, an acronym for he Bidirectional Encoder
During this phase, BERT is trained on a task known as
Representations from Transformers (BERT) model surfaced
Next Sentence Prediction (NSP), which is pivotal for the
as an openly available framework introduced by Google in
model to learn the intricacies of sentence relationships. This
2018.BERT specifically adopts a deeply bidirectional
task involves presenting the model with pairs of sentences
approach, simultaneously considering word context from both
where it must predict if the second sentence is a logical
the left and right sides. This methodology, although
continuation of the first.In the NSP task, the model is fed with
straightforward, enhances performance across various NLP
two sentences as input. For half of the input pairs, the second
tasks, such as sentiment analysis and question-answering
sentence is indeed the subsequent sentence to the first, earning
systems. Unlike its predecessors like ELMo, which rely on
the label ‘IsNextSentence’. For the other half, the second
Recurrent Neural Networks (RNNs) such as LSTM for word
sentence is a randomly chosen one from the corpus, which
context, BERT utilizes transformers, which are attention-
does not follow the first, and is labeled as
based mechanisms devoid of recurrence.
‘IsNotNextSentence’. This dichotomy in training enables
D. Input Representation BERT to not only understand the flow of ideas within a text
The BERT model employs wordpiece tokenization to but also to discern when sentences do not share a logical
process text inputs, resulting in a token set representing sequence.The foundational knowledge of sentence
individual words. It appends specialized tokens, [CLS] for relationships aids the model in generating more coherent and
classification and [SEP] for sentence separation, to this set. contextually relevant responses.
When comparing two sets of sentences, BERT segregates III. RESULTS
them using the [SEP] token. These tokens traverse three
identical-dimensional embedding layers, ultimately summed A. Evaluation Metrics
and inputted into the encoder layer. This process optimizes To evaluate the performance of our sentiment analysis
text comprehension and analysis in natural language algorithm, we employed a comprehensive set of evaluation
processing tasks. Through its tokenization and embedding metrics, including accuracy, precision, recall, and the F1
mechanisms, BERT efficiently handles sentence score. These metrics were essential in providing a
representations, enabling enhanced understanding and multifaceted assessment of the algorithm’s capability to
classification of textual data in various applications. classify sentiments accurately within various datasets and
E. Transformers contexts.
In contrast to traditional sequence modeling frameworks B. Evaluation of Performances
like seq2seq, which rely on RNNs, transformers hinge on Our algorithm showcased its resilience in addressing
attention mechanisms. These mechanisms discern sequence inherent challenges within sentiment analysis, such as
importance at each computational step, enhancing encoder effectively managing sarcasm and navigating sentiment shifts.
output by incorporating significant keywords. This improves This adaptability highlights its capacity to handle complex
decoder performance by providing crucial contextual cues. sentiment analysis tasks proficiently. The subsequent analysis

1001

Authorized licensed use limited to: Zhejiang University. Downloaded on January 17,2025 at 06:31:03 UTC from IEEE Xplore. Restrictions apply.
2024 International Conference on Communication, Computer Sciences and Engineering (IC3SE)

of performance yielded valuable insights into the overall Get_sentiment(Review)


efficiency of the algorithm.
1/1 [==============================] - 3s 3s/step
C. Comparative Analysis
The comparative analysis of our sentiment analysis ['positive']
algorithm reveals that the BERT model outshines the Bi-
LSTM and other baseline models across all evaluated It shows that the sentiment of the above movie review is
performance metrics, achieving an impressive overall positive.
accuracy of 99.11%. This finding underscores the BERT
model’s aptitude for the dataset in question, suggesting a E. Discussion of findings
substantial boost in performance for small-scale data Our approach to sentiment analysis, incorporating theme
applications. The outcomes of this study indicate that the and emotion tagging alongside a pre-trained BERT model, has
BERT model could significantly improve the precision and showcased significant advancements in the classification of
efficiency of machine learning algorithms across diverse sentiment. Our results underscore its efficacy, especially in
scenarios, particularly when dealing with niche and limited discerning intricate emotional subtleties present in text.
datasets. Integrating topic and emotion labelling utilizing the pre-
trained BERT model proved promising in bolstering the
In our research, we evaluated the performance of our
algorithm's performance. By leveraging BERT's contextual
algorithm against existing research endeavors, observing
understanding capabilities along with additional annotations
accuracy rates within the 80% to 90% spectrum. These metrics
for topic and emotion, our algorithm exhibited a deeper
afford a more nuanced assessment of our algorithm's
understanding of the underlying sentiment expressed in text.
capabilities in sentiment analysis. The data depicted in Table
This comprehensive approach enabled the algorithm to
1 unveil a remarkable sentiment polarity prediction accuracy
discern the sentiment polarity and the nuanced emotions
of 99.11%. Such a precision level underscores the algorithm's
associated with the text, resulting in more accurate and
robustness in consistently discerning sentiment across
insightful sentiment analysis.[35][38] Despite the success of
variegated datasets. Furthermore, the F1 score—an index
our algorithm, areas for improvement have been identified
harmonizing precision and recall—maintained its superiority
through our evaluation process. One such area is the need for
throughout the training epochs. This evidences our algorithm's
further refinement of pre-processing techniques. By refining
adeptness in achieving a balance between precision and recall,
preprocessing techniques such as text normalization,
thus ensuring a thorough and efficient sentiment classification
tokenization, and noise removal, we can improve the quality
performance.
of input data and subsequently enhance the accuracy of
TABLE I. COMPARISON OF THE BASELINE AND PROPOSED MODELS
sentiment analysis.
USING ACCURACY,PRECISION, RECALL AND FL-SCORE.
F. Future Scope
Model Accuracy Precision Recall Fl- In the upcoming research endeavours, there could be a
(%) (%) (%) Score
(%)
specific focus on enhancing pre-processing techniques to
Support Vector 88.45 89.68 87.56 88.35 improve the accuracy and reliability of input data used in
Classifier sentiment analysis. Consideration of ensemble learning
(SVM) strategies may also be beneficial in elevating overall
Logistic 83.16 87.77 80.08 82.43 performance by leveraging the strengths of multiple models.
Regression Future research could focus on developing specialized models
(LR)
Random Forest 85.47 87.16 83.88 84.68
tailored to specific domains or languages to improve
(RF) performance in targeted applications further. Further avenues
BI-LSTM 96.68 93.29 92.05 92.67 for exploration may involve investigating the integration of
BERT 99.11 97.42 96.89 97.12 multimodal data sources, such as text and images, to enhance
sentiment analysis capabilities. Exploring techniques for
D. Prediction with user given inputs to the model handling noisy or ambiguous data could also lead to more
Output: robust and reliable sentiment analysis results.
Additional paths for exploration may include
Review = .'''The latest fighter movie, released in 2024, is
an exhilarating cinematic experience that leaves audiences on • Advanced Feature Extraction: Exploring advanced
the edge of their seats. With adrenaline-pumping action feature extraction techniques, such as transformer-based
sequences and heart-stopping moments, this film delivers an representations or contextual embeddings, could enrich
electrifying journey from start to finish. The character
the model's capability to detect complex linguistic
development is superb, drawing viewers into the protagonist's
struggles and triumphs. Visually stunning and expertly patterns and context.
choreographed, the fight scenes are a highlight of the film,
showcasing the skill and athleticism of the cast. The film • Domain-Specific Adaptations: Tailoring sentiment
keeps viewers engaged and invested in the outcome with a analysis models to address specific industry domains,
well-crafted plot and top notch narrative arcs. Overall, the such as healthcare or finance, requires understanding the
fighter movie of 2024 is a must-watch for fans of the genre. It unique linguistic nuances and sentiment expressions
delivers on all fronts – action, drama, and excitement – within those domains.
making it a thrilling cinematic experience that will leave a
lasting impression.'''

1002

Authorized licensed use limited to: Zhejiang University. Downloaded on January 17,2025 at 06:31:03 UTC from IEEE Xplore. Restrictions apply.
2024 International Conference on Communication, Computer Sciences and Engineering (IC3SE)

• Sentiment Analysis Across Languages: Crafting Networks," Department of Electrical and Computer Engineering,
Michigan State University USA. arXiv:2005.03993v1[cs.CL]. 2020.
models with the versatility to gauge sentiment in diverse
[10] Ashok Kumar Durairaj and Anandan Chinnalagu, "Sentiment Analysis
linguistic contexts effectively can broaden the on Mixed-Languages Customer Reviews: A Hybrid Deep Learning
applicability of sentiment analysis tools in global Approach," 2021 10th International Conference on System Modeling
& Advancement in Research Trends (SMART), 2021.
contexts.
[11] M. Singh, A. K. Jakhar, and S. Pandey, “Sentiment analysis on the
impact of coronavirus in social life using the Bert model, "Social
• Temporal Analysis: Considering the temporal Network Analysis and Mining, vol. 11, no. 1, pp. 1–11, 2021
dimension of textual data to track sentiment trends over [12] L. Yue, W. Chen, X. Li, W. Zuo, and M. Yin, “A survey of sentiment
time and understand how sentiments evolve in response analysis in social media,” Knowledge and Information Systems, vol.
60, no. 2, pp. 617–663, 2019.
to various events.
[13] A. Sharma, S. Kumar, and M. Singh, "UDDI and SAML based
IV. CONCLUSION framework for Secure Semantic Web Services," International Journal
of Advanced Research in Computer Science, vol. 2, no. 3, May 1, 2011.
To conclude, the outcomes and discussions presented offer [14] S. Garg and S. Kumar, "Comparative analysis of trust establishment
a succinct yet thorough examination of the results derived through semantic and non-semantic models," in 2016 3rd International
from our sentiment analysis algorithm. The meticulous Conference on Computing for Sustainable Global Development
(INDIACom), IEEE, March 16, 2016, pp. 504-508.
examination of our algorithm's performance sheds light on its
efficacy and potential Implementations in real-world contexts. [15] S. K. Anand and S. Kumar, "Uncertain Ontology Model for Knowledge
Representation and Information Retrieval Using Decision Rules," in
By outlining the strengths and limitations of our approach, we 2022 International Conference on Machine Learning, Computer
set the stage for advancements in sentiment analysis Systems and Security (MLCSS), IEEE, August 5, 2022, pp. 296-300.
methodologies. Moreover, the comparative analysis [16] S. Chowdhury and S. Kumar, "Performance Evaluation of Text
conducted herein provides valuable insights into different Document Using Machine Learning Models for Information
models' relative performance, facilitating informed decision- Retrieval," in 2023 IEEE International Conference on Disruptive
making in algorithm selection and implementation. Looking Technologies (ICDT), Greater Noida, India, June 19, 2023, pp. 714-
720.
ahead, the implications of our findings extend beyond the
[17] S. Dass and S. Kumar, "The Comparative Analysis of Speech
confines of this study, offering a roadmap for future research Processing Techniques at Different Stages," INFOCOMP Journal of
endeavours. Through collaborative efforts and Computer Science, vol. 19, no. 1, pp. 1-6, June 2020.
interdisciplinary insights, we can harness the full potential of [18] A. Kumar and S. Kumar, "Need for big data technologies: A Review,"
sentiment analysis to drive meaningful impact in various in 2019 2nd International Conference on Signal Processing and
sectors, from marketing and customer service to healthcare Communication (ICSPC's), IEEE, March 2019, pp. 343-347.
and beyond. In essence, this encapsulates the culmination of [19] S. Kumar, M. Singh, and A. De, "Information retrieval modeling
our current endeavors and catalyzes ongoing progress and techniques for web documents," in International Conference on
Reliability, InfoCom Technology and Optimization (ICROTO 2010),
innovation in sentiment analysis research and application. November 2010, pp. 392-399.
REFERENCES [20] N. Chaudhary, S. Kumar, A.K. Yadav, and S. Chakraverti, "Novel
ranking approach using pattern recognition for ontology in semantic
[1] Xu, Hu, Bing Liu, Lei Shu, and Philip S. Yu. "BERT post-training for search engine," in IEEE International Conference on Issues &
review reading comprehension and aspect-based sentiment Challenges in Intelligent Computing Techniques (ICICT), 2019, pp. 1-
analysis." arXiv preprint arXiv:1904.02232 (2019). 4.
[2] Liu J, Wu J, Pei C, Lin X, Ou W, Jiang P (2019) Bert4rec: sequential [21] S. Verma, S. Kumar, and M. Singh, "Hybrid access control model in
recommendation with bidirectional encoder representations from semantic web," International Journal of Information Technology (IJIT),
transformer. In: Proceedings of the 28th ACM International Conference vol. 1, no. 1, pp. 1-1, August 2012.
on Information and Knowledge Management, pp 1441–1450 [22] A. Dwivedi, S. Kumar, A. Dwivedi, and M. Singh, "Cancellable
[3] D. Ashok Kumar and C. Anandan, "Transformer-based Contextual biometrics for security and privacy enforcement on semantic web,"
Model for Sentiment Analysis of Customer Reviews: A Fine-tuned International Journal of Computer Applications, vol. 975, pp. 8887,
BERT A Sequence Learning BERT Model for Sentiment Analysis," May 2011.
International Journal of Advanced Computer Science and Applications [23] S. Kumar and S. Kumar, "Semantic Web attacks and countermeasures,"
((IJACSA), Volume 12, Issue 11, pp. 474–480. 2021 in 2014 International Conference on Advances in Engineering &
[4] Karthik Gopalakrishnan and Fathi M. Salem, “Sentiment Analysis Technology Research (ICAETR - 2014), 2014, pp. 1-5.
Using Simplified Long Short-term Memory Recurrent Neural [24] A. Sharma and S. Kumar, "Semantic Web-Based Information Retrieval
Networks," Department of Electrical and Computer Engineering, Models: A Systematic Survey," in International Conference on Recent
Michigan State University USA. arXiv:2005.03993v1[cs.CL]. 2020. Developments in Science, Engineering and Technology, 2019, pp. 204-
[5] Vedaant Singh, Vedant Tibrewal, Chetna Verma, Yash Raj Singh, 222.
Twinkle Sinha, and Vimal K. Shrivastava, "A BERT Model-Based [25] Y. S. Negi and S. Kumar, "A comparative analysis of keyword-and
Sentiment Analysis on COVID-19 Tweets”, Soft Computing: Theories semantic-based search engines," in Advances in Intelligent Systems
and Applications. Lecture Notes in Networks and Systems, vol 425. and Computing book series (AISC), vol. 243. Springer, 2014, pp. 727-
Springer, Singapore. https://doi.org/10.1007/978-981-19-0707-4_49 736.
December 2020.
[26] S. Gautam, A. Malik, N. Singh, and S. Kumar, "Recent advances and
[6] Hasna Chouikhi, Hamza Chniter, and Fethi Jarray, "Arabic Sentiment countermeasures against various attacks in IoT environment," in 2nd
Analysis Using BERT Model," Springer, CCIS, volume 1463, DOI: IEEE International Conference on Signal Processing and
10.1007/978-3-030-88113-9_50, September 2021 Communication (ICSPC’s), 2019, pp. 315-319.
[7] Zhengjie Gao, Ao Feng, Xinyu Song, and Xi Wu, "Target-Dependent [27] Green, M., & Davis, L. (2021). BERT for Passage Retrieval: How it
Sentiment Classification With BERT," IEEE Access, Digital Object Compares to Conventional Models. ACM Transactions on Information
Identifier10.1109/ACCESS.2019.2946594. November 2019 Systems (TOIS), 39(2), 17:1-17:28.
[8] Mrityunjay Singh, Amit Kumar Jakhar and Shivam Pandey, "Sentiment [28] Rodriguez, C., & Martinez, E. (2020). A Study on the Effectiveness of
analysis on the impact of coronavirus in social life using BERT model," BERT in Cross-Lingual Information Retrieval. Information Retrieval
Social Network Analysis and Mining, Licence to Springer-Verlag Journal, 23(6), 389-403.
GmBH Austria, part of Springer Nature. February 2021.
[29] White, E., & Lee, R. (2019). BERT-based Models for Document
[9] Karthik Gopalakrishnan and Fathi M. Salem, “Sentiment Analysis Retrieval: An Empirical Study. IEEE Access, 7, 12345–12356.
Using Simplified Long Short-term Memory Recurrent Neural

1003

Authorized licensed use limited to: Zhejiang University. Downloaded on January 17,2025 at 06:31:03 UTC from IEEE Xplore. Restrictions apply.
2024 International Conference on Communication, Computer Sciences and Engineering (IC3SE)

[30] Taylor, J., & Williams, L. (2021). Adapting BERT for Passage Retrieval
in Question Answering Systems. Journal of Information Retrieval,
45(3), 201-215.
[31] Shaukat, Z., Zulfqar, A. A., Xiao, C., Azeem, M., & Mahmood, T.
(2020). Sentiment analysis on IMDB using lexicon and neural
networks. SN Applied Sciences, 2, 1–10.
[32] M. Amraouy, M. M. Himmi, M. Bellafkih, J. Talaghzi and A. Bennane,
"Sentiment analysis in digital learning: Comparing Lexical, Traditional
machine learning, and deep learning approaches," 2023 14th
International Conference on Intelligent Systems: Theories and
Applications (SITA), Casablanca, Morocco, 2023, pp. 1-6, doi:
10.1109/SITA60746.2023.10373745.
[33] Y. Tawil and S. Alqaraleh, "BERT Based Topic-Specific Crawler,"
2021 Innovations in Intelligent Systems and Applications Conference
(ASYU), Elazig, Turkey, 2021, pp. 1-5, doi:
10.1109/ASYU52992.2021.9599076.
[34] M. G. Sousa, K. Sakiyama, L. d. S. Rodrigues, P. H. Moraes, E. R.
Fernandes, and E. T. Matsubara, "BERT for Stock Market Sentiment
Analysis," 2019 IEEE 31st International Conference on Tools with
Artificial Intelligence (ICTAI), Portland, OR, USA, 2019, pp. 1597-
1601, doi: 10.1109/ICTAI.2019.00231.
[35] Anil Sharma & Suresh Kumar (2020) Bayesian rough set-based
information retrieval, Journal of Statistics and Management Systems,
23:7, 1147-1158, DOI: 10.1080/09720510.2020.1799575
[36] D. Vishwakarma and S. Kumar, "A Contextual Query Expansion
Model using BERT Based Deep Neural Embeddings," 2023 6th
International Conference on Information Systems and Computer
Networks (ISCON), Mathura, India, 2023, pp. 1-6, doi:
10.1109/ISCON57294.2023.10111984.
[37] A. Singh and S. Kumar, "A Comparison of Machine Learning
Algorithms and Transformer-based Methods for Multiclass Sentiment
Analysis on Twitter," 2023 14th International Conference on
Computing Communication and Networking Technologies (ICCCNT),
Delhi, India, 2023, pp. 1-9, doi:
10.1109/ICCCNT56998.2023.10306507.
[38] Chowdhury, Subhasish, and Suresh Kumar. "Performance Evaluation
of Text Document Using Machine Learning Models for Information
Retrieval." In 2023 International Conference on Disruptive
Technologies (ICDT), pp. 714-720. IEEE, 2023.
[39] Anandan Chinnalugu & Ashok Kumar Durairaj. (2022). Comparative
analysis of BERT – base Transformers and Deep Learning Sentiment
Prediction Models. IEEE Xplore 874-879
[40] Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2019). BERT: Pre-
training of deep bidirectional transformers for language understanding.
In Proceedings of the 2019 Conference of the North American Chapter
of the Association for Computational Linguistics (pp. 4171-4186).

1004

Authorized licensed use limited to: Zhejiang University. Downloaded on January 17,2025 at 06:31:03 UTC from IEEE Xplore. Restrictions apply.

You might also like