Cite: Hillah, N. The origins of severe software defects method. J. Digit. Sci. 2(2), 23–30 (2020). https://doi.org/10.33847/2686-8296.2.2_3
Abstract. The goal of our research was to identify the causes of software defects that can generate high financial damage. To reach this goal, we conducted a case study on a system in the field of education. We studied the software defects of this system over several months and classified them according to two classification concepts: the EVOLIS framework and defect severity. Severe defects prevent any essential operation or activity from being conducted through the affected system or through other systems connected to it. The occurrence of these failures thus imposes a double financial cost on organizations: one for fixing them and another caused by the unavailability of the system or systems. We targeted three types of software defects as sources of these failures. We conducted this study by classifying 665 software defects of a school management system, and we found that the two most frequent trigger groups are the technology and IS architecture groups. This result led us to propose a method for identifying the origins of severe software defects.
Keywords: Severe software defect, Software defect triggers, Software defect classification.
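The classification step described in the abstract (tagging each defect with an EVOLIS trigger group and a severity level, then tallying severe defects per group) can be sketched as follows. This is a minimal illustration, not the paper's actual procedure: the record fields, the sample defects, and the trigger-group labels other than "technology" and "IS architecture" are assumptions made for the example.

```python
from collections import Counter

# Hypothetical classified defects; field names and values are illustrative
# assumptions, not the study's dataset of 665 defects.
defects = [
    {"id": 1, "trigger": "technology",      "severity": "severe"},
    {"id": 2, "trigger": "IS architecture", "severity": "severe"},
    {"id": 3, "trigger": "IS/users fit",    "severity": "minor"},
    {"id": 4, "trigger": "technology",      "severity": "severe"},
]

# Tally severe defects per trigger group to surface the dominant origins.
severe_by_trigger = Counter(
    d["trigger"] for d in defects if d["severity"] == "severe"
)

for trigger, count in severe_by_trigger.most_common():
    print(trigger, count)
```

With real classification data, ranking the trigger groups this way is what lets the top origins of severe defects (here, "technology") stand out.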
- R. N. Charette, “Why software fails [software failure],” IEEE Spectr., vol. 42, no. 9, pp. 42–49, 2005.
- R. Kaur and D. J. Sengupta, “Software Process Models and Analysis on Failure of Software Development Projects,” Int. J. Sci. Eng. Res., vol. 2, no. 2, p. 4.
- IEEE Std 1044-2009, IEEE Standard Classification for Software Anomalies, 2009.
- A. Métrailler and T. Estier, “EVOLIS Framework: A Method to Study Information Systems Evolution Records,” in 2014 47th Hawaii International Conference on System Sciences (HICSS), 2014, pp. 3798–3807.
- A. A. Alshazly, A. M. Elfatatry, and M. S. Abougabal, “Detecting defects in software requirements specification,” Alex. Eng. J., vol. 53, no. 3, pp. 513–527, Sep. 2014.
- G. K. Rajbahadur, S. Wang, Y. Kamei, and A. E. Hassan, “The impact of using regression models to build defect classifiers,” in Proceedings of the 14th International Conference on Mining Software Repositories, 2017, pp. 135–145.
- R. Binder, Testing object-oriented systems: models, patterns, and tools. Reading, Mass: Addison-Wesley, 2000.
- D. Vallespir, F. Grazioli, and J. Herbert, “A framework to evaluate defect taxonomies,” in XV Congreso Argentino de Ciencias de La Computación, 2009.
- M. Leszak, D. E. Perry, and D. Stoll, “Classification and evaluation of defects in a project retrospective,” J. Syst. Softw., vol. 61, no. 3, pp. 173–187, 2002.
- N. Mellegård, Improving Defect Management in Automotive Software Development, LiDeC—A Light-weight Defect Classification Scheme. Chalmers University of Technology, 2013.
- R. Chillarege et al., “Orthogonal defect classification-a concept for in-process measurements,” IEEE Trans. Softw. Eng., vol. 18, no. 11, pp. 943–956, 1992.
- S. Wagner, “Defect classification and defect types revisited,” in Proceedings of the 2008 workshop on Defects in large software systems, 2008, pp. 39–40.
- J. T. Huber, “A Comparison of IBM’s Orthogonal Defect Classification to Hewlett Packard’s Defect Origins, Types, and Modes.” Hewlett Packard Company, 1999.
- L. Yu and S. R. Schach, “Applying association mining to change propagation,” International Journal of Software Engineering and Knowledge Engineering, vol. 18, no. 08, pp. 1043–1061, 2008.
- G. Murphy and D. Cubranic, “Automatic bug triage using text categorization,” in Proceedings of the Sixteenth International Conference on Software Engineering & Knowledge Engineering, 2004.
- W. Dickinson, D. Leon, and A. Podgurski, “Finding failures by cluster analysis of execution profiles,” in Proceedings of the 23rd International Conference on Software Engineering (ICSE 2001), Toronto, Ont., Canada, 2001, pp. 339–348.
- Atlassian, “Jira | Issue and project tracking software,” Atlassian. [Online]. Available: https://fr.atlassian.com/software/jira. [Accessed: 06-Apr-2020].
Published online 29.12.2020