Software intelligence

From HandWiki

Software Intelligence is insight into the inner workings and structural condition of software assets, produced by software designed to analyze database structures, software frameworks, and source code in order to better understand and control complex software systems in Information Technology environments.[1][2] Similarly to Business Intelligence (BI), Software Intelligence is produced by a set of software tools and techniques for mining data and the software's inner structure. Results are produced automatically and feed a knowledge base containing technical documentation and blueprints of the inner workings of applications,[3] which is made available to business and software stakeholders so they can make informed decisions,[4] measure the efficiency of software development organizations, communicate about software health, and prevent software catastrophes.[5]

History

The term Software Intelligence was used as early as 1979 by Kirk Paul Lafler, an American engineer, entrepreneur, and consultant, who founded Software Intelligence Corporation that year. At that time, it was mainly related to SAS activities, in which he has been an expert since 1979.[6]

In the early 1980s, Victor R. Basili contributed to several papers detailing a methodology for collecting valid software engineering data, evaluating software development, and analyzing variations.[7][8] In 2004, several software vendors in the software analysis field started using the term as part of their product naming and marketing strategy.

Then in 2010, Ahmed E. Hassan and Tao Xie defined Software Intelligence as a "practice offering software practitioners up-to-date and pertinent information to support their daily decision-making processes", adding that "Software Intelligence should support decision-making processes throughout the lifetime of a software system". They predicted it would have a "strong impact on modern software practice" in the coming decades.[9]

Capabilities

Because of the complexity and wide range of components and subjects involved in software, Software Intelligence is derived from several aspects of software:

  • Software composition is the construction of software application components.[10] Components result from software coding, as well as from the integration of source code from external components: open source, third-party components, or frameworks. Other components can be integrated using application programming interface (API) calls to libraries or services.
  • Software architecture refers to the structure and organization of the elements of a system, and the relations and properties among them.
  • Software flaws designate problems that can compromise security, stability, or resiliency, or cause unexpected results. There is no standard definition of software flaws, but the most widely accepted comes from The MITRE Corporation, which catalogs common flaws as the Common Weakness Enumeration.[11]
  • Software grades assess attributes of the software. Historically, the classification and terminology of attributes have been derived from ISO 9126-3 and the subsequent ISO 25000:2005[12] quality model.
  • Software economics refers to the evaluation of software resources, past, present, or future, in order to make decisions and to govern.[13]
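As a hypothetical illustration of the software grades aspect above, the sketch below aggregates per-attribute scores into an overall grade, loosely modeled on ISO 25000-style quality characteristics. The attribute names, the 1-4 score scale, and the weights are illustrative assumptions, not part of any standard or vendor model.

```python
# Illustrative sketch: combining per-attribute quality scores into one
# overall software grade via a weighted average. Attribute names, scale,
# and weights are assumptions for demonstration only.

def overall_grade(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of attribute scores (each assumed on a 1-4 scale)."""
    total_weight = sum(weights[a] for a in scores)
    return sum(scores[a] * weights[a] for a in scores) / total_weight

weights = {"reliability": 0.3, "security": 0.3,
           "efficiency": 0.2, "maintainability": 0.2}
scores = {"reliability": 3.2, "security": 2.8,
          "efficiency": 3.5, "maintainability": 2.4}

print(round(overall_grade(scores, weights), 2))  # → 2.98
```

A real Software Intelligence platform would derive such scores from static analysis rather than hand-entered values, but the aggregation principle is the same.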

Components

The capabilities of Software intelligence platforms include an increasing number of components:

  • Code analyzer to serve as an information basis for other Software Intelligence components, identifying objects created by the programming language as well as external objects from open source, third-party components, frameworks, APIs, or services
  • Graphical visualization and blueprinting of the inner structure of the software product or application under consideration,[14] including dependencies, from data acquisition (automated and real-time data capture, end-user entries) to data storage, the different layers[15] within the software, and the coupling between all elements
  • Navigation capabilities within components and impact analysis features
  • List of flaws and architectural and coding violations against standardized best practices,[16] cloud blockers preventing migration to a cloud environment,[17] and rogue data calls compromising the security and integrity of software[18]
  • Grades or scores of structural and software quality aligned with industry standards such as OMG, CISQ, or SEI, assessing reliability, security, efficiency, maintainability, and scalability to cloud or other systems
  • Metrics quantifying and estimating software economics including work effort, sizing, and technical debt[19]
  • Industry references and benchmarking allowing comparisons between analysis outputs and industry standards
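The metrics component above includes technical debt estimation. A common way to frame such an estimate is as total remediation effort for recorded violations; the sketch below is a minimal illustration of that idea. The severity categories, the per-fix effort figures (in hours), and the hourly rate are all illustrative assumptions, not figures from any standard or tool.

```python
# Illustrative sketch: estimating technical debt as remediation effort.
# Severity categories and per-fix effort figures (hours) are assumptions.

REMEDIATION_HOURS = {"blocker": 8.0, "critical": 4.0, "major": 1.0, "minor": 0.25}

def technical_debt_hours(violation_counts: dict[str, int]) -> float:
    """Total estimated effort (hours) to fix all recorded violations."""
    return sum(REMEDIATION_HOURS[sev] * n for sev, n in violation_counts.items())

def technical_debt_cost(violation_counts: dict[str, int], hourly_rate: float) -> float:
    """Debt expressed as a monetary figure at an assumed hourly rate."""
    return technical_debt_hours(violation_counts) * hourly_rate

counts = {"blocker": 3, "critical": 10, "major": 120, "minor": 400}
print(technical_debt_hours(counts))       # → 284.0
print(technical_debt_cost(counts, 60.0))  # → 17040.0
```

In practice, violation counts would come from the code analyzer component, and effort figures would be calibrated per organization rather than fixed constants.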

User Aspect

Some considerations must be made in order to successfully integrate the use of Software Intelligence systems in a company. Ultimately, a Software Intelligence system must be accepted and used by its users in order to add value to the organization. As M.-A. Storey noted in 2003, if the system does not add value to the users' mission, they simply do not use it.[20]

At the code and system-representation level, Software Intelligence systems must provide different levels of abstraction: an abstract view for designing, explaining, and documenting, and a detailed view for understanding and analyzing the software system.[21]

At the governance level, user acceptance of Software Intelligence covers different areas related to the inner functioning of the system as well as its output. It encompasses these requirements:

  • Comprehensive: missing information may lead to wrong or inappropriate decisions, and completeness is a factor influencing user acceptance of a system.[22]
  • Accurate: accuracy depends on how the data is collected to ensure fair and indisputable opinion and judgment.[23]
  • Precise: precision is usually judged by comparing several measurements from the same or different sources.[24]
  • Scalable: lack of scalability in the software industry is a critical factor leading to failure.[25]
  • Credible: outputs must be trusted and believed.
  • Deployable and usable.

Applications

Software Intelligence has many applications in all businesses relating to the software environment, whether software for professionals, software for individuals, or embedded software. Depending on how the components are combined and used, applications relate to:

  • Change and modernization: uniform documentation and blueprinting of all inner components, integrated external code, and calls to internal or external components of the software[26]
  • Resiliency and security: measurement against industry standards to diagnose structural flaws in an IT environment;[citation needed] compliance validation regarding security, specific regulations, or technical matters.
  • Decision making and governance: providing analytics about the software itself or the stakeholders involved in its development, e.g. productivity measurement to inform business and IT leaders about progress towards business goals.[27]
  • Assessment and benchmarking to help business and IT leaders make informed, fact-based decisions about software.[28]

Marketplace

Software Intelligence is a high-level discipline that has been gradually growing to cover the applications listed above. Several markets drive the need for it:

  • Application Portfolio Analysis (APA), aimed at improving enterprise performance.[29][30]
  • Software assessment, for producing software KPIs and improving quality and productivity.[31]
  • Software security and resiliency measures and validation.
  • Software evolution or legacy modernization, for which blueprints of the software systems are needed, as well as tools for improving and facilitating modifications.[citation needed]

References

  1. Dąbrowski R. (2012) On Architecture Warehouses and Software Intelligence. In: Kim T., Lee Y., Fang W. (eds) Future Generation Information Technology. FGIT 2012. Lecture Notes in Computer Science, vol 7709. Springer, Berlin, Heidelberg
  2. Hinchey, Mike; Jain, Amit; Kaushik, Manju; Misra, Sanjay (Jan 2023). "Guest Editorial: Intelligence for systems and software engineering". Innovations in Systems and Software Engineering (Springer) 19 (1): 1–4. doi:10.1007/s11334-023-00526-1. PMID 36744022. 
  3. Bartoszuk, C., Dąbrowski, R., Stencel, K., & Timoszuk, G. "On quick comprehension and assessment of software.", In Proceedings of the 14th International Conference on Computer Systems and Technologies, June 2013, pp. 161-168 doi:10.1145/2516775.2516806
  4. Raymond PL Buse, and Thomas Zimmermann. "Information needs for software development analytics." 2012 34th International Conference on Software Engineering (ICSE). IEEE, June 2012, pp. 987-996 doi:10.1109/ICSE.2012.6227122
  5. Ahmed E. Hassan and Tao Xie. 2010. Software intelligence: the future of mining software engineering data. In Proceedings of the FSE/SDP workshop on Future of software engineering research (FoSER '10). ACM, New York, NY, USA, 161–166
  6. "Mr. Kirk Paul Lafler". 21 December 2015. https://www.ithistory.org/honor-roll/mr-kirk-paul-lafler. 
  7. Basili, Victor R. (1981). Data collection, validation and analysis. Software Metrics: An Analysis and Evaluation. MIT Press. p. 143. ISBN 0-262-16083-8. http://www.cs.umd.edu/~basili/publications/chapters/C12.pdf. 
  8. Basili, Victor R.; Weiss, David M. (Nov 1984). "A Methodology for Collecting Valid Software Engineering Data.". IEEE Transactions on Software Engineering (IEEE Trans. Softw. Eng. 10, 6 (November 1984)) (6): 728–738. doi:10.1109/TSE.1984.5010301. https://www.researchgate.net/publication/220070466. 
  9. Ahmed E. Hassan and Tao Xie. 2010. Software intelligence: the future of mining software engineering data. In Proceedings of the FSE/SDP workshop on Future of software engineering research (FoSER '10). ACM, New York, NY, USA, 161–166. doi:10.1145/1882362.1882397
  10. Nierstrasz, Oscar, and Theo Dirk Meijler. "Research directions in software composition." ACM Computing Surveys 27.2 (1995): 262-264 doi:10.1145/210376.210389
  11. Kanashiro, L., et al. "Predicting software flaws with low complexity models based on static analysis data." Journal of Information Systems Engineering & Management 3.2 (2018): 17 doi:10.20897/jisem.201817
  12. "ISO 25000:2005". Archived from the original on 2013-04-14. https://web.archive.org/web/20130414112148/http://webstore.iec.ch/preview/info_isoiec25000%7Bed1.0%7Den.pdf. Retrieved 2013-10-18. 
  13. Boehm, Barry W., and Kevin J. Sullivan. "Software economics: a roadmap." Proceedings of the conference on The future of Software engineering. 2000. doi:10.1145/336512.336584
  14. Renato Novais, José Amancio Santos, Manoel Mendonça, Experimentally assessing the combination of multiple visualization strategies for software evolution analysis, Journal of Systems and Software, Volume 128, 2017, pp. 56–71, ISSN 0164-1212, doi:10.1016/j.jss.2017.03.006.
  15. Rolia, Jerome A., and Kenneth C. Sevcik. "The method of layers." IEEE Transactions on Software Engineering 21.8, 1995, 689-700, doi:10.1109/32.403785
  16. "Software Engineering Rules on code quality". Object Management Group, Inc. 2023. https://www.it-cisq.org/standards/code-quality-standards. 
  17. Balalaie, Armin, Abbas Heydarnoori, and Pooyan Jamshidi. "Microservices architecture enables DevOps: Migration to a cloud-native architecture." IEEE Software 33.3, May–June 2016, 42-52, doi:10.1109/MS.2016.64
  18. Q. Feng, R. Kazman, Y. Cai, R. Mo and L. Xiao, "Towards an Architecture-Centric Approach to Security Analysis," 2016 13th Working IEEE/IFIP Conference on Software Architecture (WICSA), Venice, 2016, pp. 221-230, doi:10.1109/WICSA.2016.41.
  19. R. Haas, R. Niedermayr and E. Juergens, "Teamscale: Tackle Technical Debt and Control the Quality of Your Software," 2019 IEEE/ACM International Conference on Technical Debt (TechDebt), Montreal, QC, Canada, 2019, pp. 55-56, doi:10.1109/TechDebt.2019.00016.
  20. Storey MA. (2003) Designing a Software Exploration Tool Using a Cognitive Framework. In: Zhang K. (eds) Software Visualization. The Springer International Series in Engineering and Computer Science, vol 734. Springer, Boston, MA.
  21. Seonah Lee, Sungwon Kang, What situational information would help developers when using a graphical code recommender?, Journal of Systems and Software, Volume 117, 2016, pp. 199–217, ISSN 0164-1212, doi:10.1016/j.jss.2016.02.050.
  22. Linda G. Wallace, Steven D. Sheetz, The adoption of software measures: A technology acceptance model (TAM) perspective, Information & Management, Volume 51, Issue 2, 2014, pp. 249–259, ISSN 0378-7206, doi:10.1016/j.im.2013.12.003
  23. "Utilization of information technology: examining cognitive and experiential factors of post-adoption behavior". IEEE Transactions on Engineering Management. August 2005. pp. 363-381. https://ieeexplore.ieee.org/document/1468406. 
  24. Banker, R.D.; Kemerer, C.F. (December 1992). "Performance Evaluation Metrics for Information Systems Development: A Principal-Agent Model". Information Systems Research 3 (4): 379-400. https://www.jstor.org/stable/23010648. Retrieved 8 December 2023. 
  25. Crowne, M. (9 July 2003). "Why software product startups fail and what to do about it. Evolution of software product development in startup companies". IEEE International Engineering Management Conference. pp. 338-343. doi:10.1109/IEMC.2002.1038454. https://ieeexplore.ieee.org/document/1038454. 
  26. Parnas, David Lorge, Precise Documentation: The Key to Better Software, The Future of Software Engineering, 2011, 125–148, doi:10.1007/978-3-642-15187-3_8
  27. "Big data, analytics and the path from insights to value". 21 December 2010. pp. 21-32. https://sloanreview.mit.edu/article/big-data-analytics-and-the-path-from-insights-to-value. 
  28. Janez Prašnikar; Žiga Debeljak; Aleš Ahčan (3 December 2010). "Benchmarking as a tool of strategic management". Total Quality Management & Business Excellence 16 (2): 257-275. doi:10.1080/14783360500054400. https://www.tandfonline.com/action/showCitFormats?doi=10.1080%2F14783360500054400. Retrieved 8 December 2023. 
  29. "Gartner Glossary - Applications Portfolio Analysis (APA)". Gartner, Inc. 2023. https://www.gartner.com/en/information-technology/glossary/application-portfolio-analysis. 
  30. "Gartner Research - Effective Strategies to Deliver Sustainable Cost Optimization in Application Services". Gartner, Inc.. 4 October 2017. https://www.gartner.com/en/documents/3812067. 
  31. "About the Automated Function Points Specification Version 1.0". Object Management Group. December 2013. https://www.omg.org/spec/AFP.