Sentence-Level Granularity Oriented Sentiment Analysis of Social Media Using Long Short-Term Memory (LSTM) and IndoBERTweet Method
DOI: https://doi.org/10.26555/jiteki.v9i1.25765
Keywords: Sentiment analysis sentence-level granularity oriented, LSTM, TF-IDF, IndoBERTweet, Word2Vec
Abstract
The dissemination of information through social media has become rampant, especially on the Twitter platform. This information invites a variety of opinions from users expressing their points of view on the topic being discussed. These opinions can be collected and processed with sentiment analysis to assess public tendencies and provide a fundamental source for decision-making. However, conventional procedures are not optimal because they fail to recognize the meaning of the words in opinion sentences. With sentence-level granularity-oriented sentiment analysis, the system can explore the "sense of the word" in each sentence by assigning a granularity weight that the system considers when recognizing word meaning. To construct this procedure, this research uses LSTM as the classification model combined with TF-IDF and IndoBERTweet for feature extraction. In addition, this research applies Word2Vec feature expansion, built from Twitter and IndoNews corpora, to produce a word-similarity corpus and capture effective word semantics. To fully satisfy the granularity requirements, both manual labeling and system labeling were performed with the granularity weight taken into account, so that model performance could be compared. After combining these methods, this research achieved 88.97% accuracy on manually labeled data and 97.80% on system-labeled data. The experimental results show that the granularity-oriented sentiment analysis model outperforms the conventional sentiment analysis system, as reflected in the high performance of the resulting system.
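As an illustration of how these components might fit together, the sketch below combines the named methods in Python: TF-IDF features via scikit-learn, a gensim Word2Vec model standing in for the Twitter + IndoNews similarity corpus, IndoBERTweet token embeddings (assuming the HuggingFace checkpoint indolem/indobertweet-base-uncased), and a small LSTM classifier over the embedding sequence. This is a minimal sketch under those assumptions, not the authors' exact pipeline; the granularity weighting and the fusion of TF-IDF with the contextual features are not reproduced here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from gensim.models import Word2Vec
from transformers import AutoTokenizer, AutoModel
import torch
import torch.nn as nn

# TF-IDF baseline features over the tweet corpus (toy documents for illustration).
tfidf = TfidfVectorizer(ngram_range=(1, 1))
X_tfidf = tfidf.fit_transform(["pelayanan bagus sekali", "harga naik terus"])

# Word2Vec feature expansion: a similarity model (trained here on a toy corpus;
# the paper builds it from Twitter and IndoNews) used to swap words missing from
# the feature vocabulary for their nearest known neighbour.
w2v = Word2Vec(sentences=[["pelayanan", "bagus", "sekali"],
                          ["harga", "naik", "terus"]],
               vector_size=100, window=5, min_count=1)

def expand(tokens, feature_vocab):
    out = []
    for tok in tokens:
        if tok in feature_vocab or tok not in w2v.wv:
            out.append(tok)                                     # keep known / unexpandable words
        else:
            out.append(w2v.wv.most_similar(tok, topn=1)[0][0])  # nearest neighbour
    return out

# IndoBERTweet contextual embeddings: one 768-d vector per token.
CKPT = "indolem/indobertweet-base-uncased"   # assumed HuggingFace checkpoint
tokenizer = AutoTokenizer.from_pretrained(CKPT)
encoder = AutoModel.from_pretrained(CKPT)

def embed(texts, max_len=64):
    batch = tokenizer(texts, padding="max_length", truncation=True,
                      max_length=max_len, return_tensors="pt")
    with torch.no_grad():
        return encoder(**batch).last_hidden_state               # (batch, max_len, 768)

# LSTM sentiment classifier over the embedding sequence (negative/neutral/positive).
class SentimentLSTM(nn.Module):
    def __init__(self, emb_dim=768, hidden=128, n_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):
        _, (h, _) = self.lstm(x)          # final hidden state summarizes the tweet
        return self.fc(h[-1])

model = SentimentLSTM()
logits = model(embed(["pelayanan aplikasi ini sangat membantu"]))
print(logits.softmax(dim=-1))             # per-class sentiment probabilities
```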
Published: 2023-02-07
How to Cite
[1] N. M. Azahra and E. B. Setiawan, "Sentence-Level Granularity Oriented Sentiment Analysis of Social Media Using Long Short-Term Memory (LSTM) and IndoBERTweet Method," J. Ilm. Tek. Elektro Komput. dan Inform., vol. 9, no. 1, pp. 85–95, Feb. 2023.
Section: Articles
License
Authors who publish with JITEKI agree to the following terms:
- Authors retain copyright and grant the journal the right of first publication with the work simultaneously licensed under a Creative Commons Attribution License (CC BY-SA 4.0) that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.
This work is licensed under a Creative Commons Attribution 4.0 International License