Using Machine Learning Techniques to Classify and Predict Static Code Analysis Tool Warnings

Date
2018-10
Language
English
Found At
IEEE
Abstract

This paper discusses our work on using software engineering metrics (i.e., source code metrics) to classify an error message generated by a Static Code Analysis (SCA) tool as a true-positive, false-positive, or false-negative. Specifically, we compare the performance of Support Vector Machine (SVM), K-Nearest Neighbor (KNN), Random Forests, and Repeated Incremental Pruning to Produce Error Reduction (RIPPER) over eight datasets. The performance of the techniques is assessed by computing the F-measure metric, which is defined as the weighted harmonic mean of the precision and recall of the predicted model. The overall results of the study show that the F-measure value of the predicted model, which is generated using the Random Forests technique, ranges from 83% to 98%. Additionally, the Random Forests technique outperforms the other techniques. Lastly, our results indicate that the complexity and coupling metrics have the most impact on whether an SCA tool will generate a false-positive warning or not.
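
The abstract defines the F-measure as the weighted harmonic mean of precision and recall. In its general form (the abstract does not state the weight; beta = 1 is the common default) it can be written in LaTeX as

    F_\beta = \frac{(1 + \beta^2)\,\mathrm{precision}\cdot\mathrm{recall}}{\beta^2\,\mathrm{precision} + \mathrm{recall}}

which for beta = 1 reduces to F_1 = 2 * precision * recall / (precision + recall).

As a rough illustration of the classification setup described above (a minimal sketch, not the authors' implementation), the following Python/scikit-learn snippet trains a Random Forest on metric vectors per warning and reports the F-measure; the feature names, data, and labels are hypothetical placeholders:

    # Minimal sketch: classify SCA warnings (true-positive vs. false-positive)
    # from source-code metrics with a Random Forest, scored by F-measure.
    # All data below is synthetic and purely illustrative.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import f1_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    # Hypothetical metric vector per warning, e.g. complexity, coupling,
    # lines of code, fan-out (placeholders, not the paper's feature set).
    X = rng.random((500, 4))
    # Hypothetical labels: 1 = true-positive warning, 0 = false-positive.
    y = rng.integers(0, 2, 500)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)

    pred = clf.predict(X_test)
    print("F-measure:", f1_score(y_test, pred))
    # Feature importances indicate which metrics drive the prediction,
    # analogous to the paper's finding that complexity and coupling dominate.
    print("Feature importances:", clf.feature_importances_)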

Cite As
Alikhashashneh, E. A., Raje, R. R., & Hill, J. H. (2018). Using Machine Learning Techniques to Classify and Predict Static Code Analysis Tool Warnings. 2018 IEEE/ACS 15th International Conference on Computer Systems and Applications (AICCSA), 1–8. https://doi.org/10.1109/AICCSA.2018.8612819
Journal
2018 IEEE/ACS 15th International Conference on Computer Systems and Applications
Type
Conference proceedings
Version
Author's manuscript