AttentiveBugLocator: A Bug Localization Model using Attention-based Semantic Features and Information Retrieval
DOI: https://doi.org/10.14738/tecs.122.16373

Keywords: Bug report, Bug Localization, Deep learning, Software Maintenance, Bug tracking systems

Abstract
In recent years, deep learning-based algorithms such as CNNs, LSTMs, and auto-encoders have been proposed to rank suspicious buggy files. Meanwhile, representation learning has proven to be an effective approach for extracting rich semantic features from bug reports and source code, reducing their lexical mismatch. In this paper, we propose AttentiveBugLocator, a Siamese-based representation learning model for improved bug localization performance. AttentiveBugLocator employs BERT and code2vec embedding models to produce richer semantic representations, and a Siamese BiLSTM network with context attention to learn semantic matching between bug reports (BRs) and source files (SFs). To further improve the effectiveness of AttentiveBugLocator, the semantic matching features are carefully fused with VSM, stack trace, and code complexity features. Evaluation results on four data sets show that AttentiveBugLocator can identify buggy files with scores of 56% MAP and 62% MRR, outperforming several state-of-the-art approaches.
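The abstract's pipeline (attention-pooled semantic representations of a bug report and a source file, a semantic similarity score, then fusion with VSM, stack trace, and complexity features) can be illustrated with a minimal sketch. This is not the authors' implementation: the attention pooling, cosine similarity, and the linear fusion weights shown here are illustrative assumptions standing in for the paper's trained Siamese BiLSTM and learned fusion.

```python
import numpy as np

def attention_pool(H, w):
    """Context-attention pooling over token states.
    H: (seq_len, dim) token representations (e.g., BiLSTM outputs);
    w: (dim,) learned context vector (random here, for illustration)."""
    scores = H @ w
    alpha = np.exp(scores - scores.max())   # softmax attention weights
    alpha /= alpha.sum()
    return alpha @ H                        # weighted sum -> (dim,) vector

def cosine(a, b):
    """Cosine similarity used as the semantic matching score."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def suspiciousness(br_states, sf_states, w,
                   vsm, trace, complexity,
                   weights=(0.7, 0.1, 0.1, 0.1)):
    """Fuse semantic matching with VSM, stack-trace, and complexity
    features via a simple weighted sum (weights are hypothetical)."""
    sem = cosine(attention_pool(br_states, w),
                 attention_pool(sf_states, w))
    feats = np.array([sem, vsm, trace, complexity])
    return float(np.array(weights) @ feats)
```

Ranking candidate source files by this score gives a toy analogue of the model's output; in the paper, the pooled representations come from BERT/code2vec embeddings fed through the Siamese BiLSTM, and the fusion is learned rather than hand-weighted.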
License
Copyright (c) 2024 Aminu Ahmad, Yu Lasheng, Hussaini Aliyu Idris, Buhari Aliyu, Adamu Muhammad, Muhammad Umar Diginsa
This work is licensed under a Creative Commons Attribution 4.0 International License.