Static application security testing

Static application security testing (SAST) is used to secure software by reviewing its source code to identify sources of vulnerabilities. Although the process of statically analyzing source code has existed as long as computers have existed, the technique spread to security in the late 1990s, with the first public discussion of SQL injection appearing in 1998, when Web applications integrated new technologies like JavaScript and Flash. Unlike dynamic application security testing (DAST) tools, which perform black-box testing of application functionality, SAST tools perform white-box testing, focusing on the code content of the application. A SAST tool scans the source code of an application and its components to identify potential security vulnerabilities in the software and its architecture. Static analysis tools can detect an estimated 50% of existing security vulnerabilities.[1]

In the software development life cycle (SDLC), SAST is performed early in the development process at the code level, and also when all pieces of code and components are put together in a consistent testing environment. SAST is also used for software quality assurance,[2] even if the many resulting false positives impede its adoption by developers.[3]

SAST tools are integrated into the development process to help development teams, whose primary focus is developing and delivering software that meets the requested specifications.[4] SAST tools, like other security tools, aim to reduce the risk that applications suffer downtime or that private information stored in them is compromised.

For 2018 alone, the Privacy Rights Clearinghouse database[5] shows that more than 612 million records were compromised by hacking.

Overview

Application security testing examines applications before their release using three main techniques: static application security testing (SAST), dynamic application security testing (DAST), and interactive application security testing (IAST), a combination of the two.[6]

Static analysis tools examine the text of a program syntactically, looking for a fixed set of patterns or rules in the source code. Theoretically, they can also examine a compiled form of the software; this technique relies on instrumentation of the code to map compiled components back to source code components and identify issues. Static analysis can also be done manually, as a code review or audit of the code for different purposes including security, but this is time-consuming.[7]
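
As an illustration of rule-based matching, the following minimal sketch (not modeled on any particular tool) scans Python source files for a small fixed set of textual patterns commonly associated with vulnerabilities; the rules and messages are illustrative only:

```python
import re
import sys

# A fixed set of textual rules: pattern -> issue description.
# Real SAST rule sets are far larger and more nuanced.
RULES = [
    (re.compile(r"\beval\s*\("), "use of eval() on potentially untrusted input"),
    (re.compile(r"\bpickle\.loads?\s*\("), "deserialization of untrusted data"),
    (re.compile(r"password\s*=\s*['\"]"), "possible hard-coded credential"),
]

def scan(path):
    """Print every line of a source file that matches a rule."""
    with open(path, encoding="utf-8") as src:
        for lineno, line in enumerate(src, start=1):
            for pattern, message in RULES:
                if pattern.search(line):
                    print(f"{path}:{lineno}: {message}")

if __name__ == "__main__":
    for path in sys.argv[1:]:
        scan(path)
```

Purely lexical matching of this kind is fast but blind to context, which is one source of the false positives discussed under SAST weaknesses below.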

The precision of SAST tool is determined by its scope of analysis and the specific techniques used to identify vulnerabilities. Different levels of analysis include:

  • function level - sequences of instructions.
  • file or class level - an extensible program-code template for object creation.
  • application level - a program or group of programs that interact.

The scope of the analysis determines its accuracy and capacity to detect vulnerabilities using contextual information.[8]

At the function level, a common technique is the construction of an abstract syntax tree (AST) to trace the flow of data within the function.[9]
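
A minimal sketch of the idea, using Python's built-in ast module purely for illustration: the scanner parses a snippet into an abstract syntax tree and flags calls to eval whose argument is not a constant, i.e. whose value may flow in from elsewhere in the function:

```python
import ast

SOURCE = """
def handler(request):
    expr = request.args["q"]   # value flows in from the caller
    return eval(expr)          # dangerous: evaluated without validation
"""

tree = ast.parse(SOURCE)

# Walk every node of the abstract syntax tree and inspect call sites.
for node in ast.walk(tree):
    if (isinstance(node, ast.Call)
            and isinstance(node.func, ast.Name)
            and node.func.id == "eval"
            and node.args
            and not isinstance(node.args[0], ast.Constant)):
        print(f"line {node.lineno}: eval() called on a non-constant value")
```

Because the check operates on the tree rather than on raw text, it is unaffected by formatting and can reason about the structure of each call site.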

Since the late 1990s, the need to adapt to business challenges has transformed software development toward componentization,[10] reinforced by processes and the organization of development teams.[11] Following the flow of data between all the components of an application or group of applications makes it possible to validate that required calls to dedicated sanitization procedures are made and that tainted data is tracked and handled correctly in specific pieces of code.[12][13]
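
The sketch below illustrates the principle with a deliberately simplified taint tracker: values returned by a source are marked tainted, a call to a sanitizer clears the mark, and a tainted value reaching a sink is reported. The names get_param, escape, and run_query are hypothetical stand-ins for an application's real sources, sanitizers, and sinks:

```python
import ast

SOURCES, SANITIZERS, SINKS = {"get_param"}, {"escape"}, {"run_query"}

SOURCE_CODE = """
raw = get_param("id")        # source: attacker-controlled
clean = escape(raw)          # sanitizer: taint removed
run_query(clean)             # safe sink call
run_query(raw)               # tainted data reaches the sink
"""

tainted = set()
for stmt in ast.parse(SOURCE_CODE).body:
    # Propagate taint through simple assignments of call results.
    if isinstance(stmt, ast.Assign) and isinstance(stmt.value, ast.Call):
        func = stmt.value.func.id
        args = [a.id for a in stmt.value.args if isinstance(a, ast.Name)]
        target = stmt.targets[0].id
        if func in SOURCES:
            tainted.add(target)          # source output is tainted
        elif func in SANITIZERS:
            tainted.discard(target)      # sanitizer output is assumed clean
        elif any(a in tainted for a in args):
            tainted.add(target)          # taint flows through other calls
    # Report tainted arguments arriving at a sink.
    elif isinstance(stmt, ast.Expr) and isinstance(stmt.value, ast.Call):
        if stmt.value.func.id in SINKS:
            for a in stmt.value.args:
                if isinstance(a, ast.Name) and a.id in tainted:
                    print(f"line {stmt.lineno}: tainted '{a.id}' reaches sink")
```

Real tools apply the same source-sanitizer-sink model interprocedurally, across files and components, which is what the broader analysis scopes described above enable.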

The rise of web applications entailed testing them: the 2016 Verizon Data Breach Investigations Report found that 40% of all data breaches exploited web application vulnerabilities.[14] Alongside external security validation, there is a growing focus on internal threats. The Clearswift Insider Threat Index (CITI) reported that 92% of respondents to a 2015 survey had experienced IT or security incidents in the previous 12 months and that 74% of those breaches originated with insiders.[15][16] Lee Hadlington categorized internal threats into three categories: malicious, accidental, and unintentional. The explosive growth of mobile applications likewise implies securing applications earlier in the development process to reduce the development of malicious code.[17]

SAST strengths

The earlier a vulnerability is fixed in the SDLC, the cheaper it is to fix: costs to fix a defect during development are 10 times lower than during testing, and 100 times lower than in production.[18] SAST tools run automatically, either at the code level or the application level, and require no human interaction. When integrated into a CI/CD pipeline, SAST tools can automatically stop the integration process if critical vulnerabilities are identified.[19]
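
A minimal sketch of such a gate: a pipeline step runs the scanner, exports its findings as JSON, and fails the build when severe issues remain. The report format assumed here (a list of findings with severity, file, line, and message fields) is hypothetical; real tools each define their own output schema, SARIF being one common choice:

```python
import json
import sys

BLOCKING_SEVERITIES = {"critical", "high"}  # severities that stop the build

def gate(report_path):
    """Return a non-zero exit code if the report contains blocking findings."""
    with open(report_path, encoding="utf-8") as fh:
        findings = json.load(fh)  # assumed: a JSON list of finding objects
    blocking = [f for f in findings
                if f.get("severity", "").lower() in BLOCKING_SEVERITIES]
    for f in blocking:
        print(f"{f.get('file')}:{f.get('line')}: [{f['severity']}] {f.get('message')}")
    return 1 if blocking else 0

if __name__ == "__main__":
    sys.exit(gate(sys.argv[1]))
```

Run as a pipeline step (for example, python gate.py sast-report.json), the non-zero exit code is what causes the CI/CD system to halt the integration.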

Because the tool scans the entire source code, it can cover 100% of it, while dynamic application security testing covers only the executed behaviour, possibly missing parts of the application[6] or insecure settings in configuration files.

SAST tools can offer extended functionality such as quality and architectural testing. There is a direct correlation between quality and security: poor-quality software is also poorly secured software.[20]

SAST weaknesses

Even though developers view SAST tools positively, several challenges hinder their adoption.[4] In particular, the usability of the output these tools generate limits how much developers can make of them: research shows that despite the lengthy output the tools produce, it may lack usability.[21]

With agile processes in software development, early integration of SAST surfaces many bugs, as developers using this framework focus first on features and delivery.[22]

Scanning many lines of code with SAST tools may result in hundreds or thousands of vulnerability warnings for a single application, including many false positives that increase investigation time and reduce trust in such tools. This is particularly the case when the tool cannot capture the context of the vulnerability.[3]
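
For instance, a purely pattern-based rule that flags any variable interpolated into a SQL string will also flag code whose input has already been constrained upstream. In this hypothetical snippet, the tool cannot see that the allow-list check makes the interpolation safe:

```python
ALLOWED_COLUMNS = {"name", "email", "created_at"}  # fixed allow-list

def build_query(column: str) -> str:
    # Context a naive scanner misses: column is validated against a
    # closed set of constants before it is ever placed in the query.
    if column not in ALLOWED_COLUMNS:
        raise ValueError(f"unknown column: {column}")
    # A pattern rule for "string formatting into SQL" flags this line,
    # even though only the three allow-listed values can reach it.
    return f"SELECT {column} FROM users"
```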

References

  1. Okun, V.; Guthrie, W. F.; Gaucher, H.; Black, P. E. (October 2007). "Effect of static analysis tools on software security: preliminary investigation". Proceedings of the 2007 ACM Workshop on Quality of Protection (ACM): 1–5. doi:10.1145/1314257.1314260. https://samate.nist.gov/docs/SA_tool_effect_QoP.pdf. 
  2. Ayewah, N.; Hovemeyer, D.; Morgenthaler, J.D.; Penix, J.; Pugh, W. (September 2008). "Using static analysis to find bugs". IEEE Software (IEEE) 25 (5): 22–29. doi:10.1109/MS.2008.130. 
  3. 3.0 3.1 Johnson, Brittany; Song, Yoonki; Murphy-Hill, Emerson; Bowdidge, Robert (May 2013). "Why don't software developers use static analysis tools to find bugs?". ICSE '13 Proceedings of the 2013 International Conference on Software Engineering: 672–681. ISBN 978-1-4673-3076-3. 
  4. 4.0 4.1 Oyetoyan, Tosin Daniel; Milosheska, Bisera; Grini, Mari (May 2018). "Myths and Facts About Static Application Security Testing Tools: An Action Research at Telenor Digital". International Conference on Agile Software Development. (Springer): 86–103. 
  5. "Data Breaches | Privacy Rights Clearinghouse". https://privacyrights.org/data-breaches. 
  6. 6.0 6.1 Parizi, R. M.; Qian, K.; Shahriar, H.; Wu, F.; Tao, L. (July 2018). "Benchmark Requirements for Assessing Software Security Vulnerability Testing Tools". 2018 IEEE 42nd Annual Computer Software and Applications Conference (COMPSAC). IEEE. pp. 825–826. doi:10.1109/COMPSAC.2018.00139. ISBN 978-1-5386-2666-5. 
  7. Chess, B.; McGraw, G. (December 2004). "Static analysis for security". IEEE Security & Privacy (IEEE) 2 (6): 76–79. doi:10.1109/MSP.2004.111. 
  8. Chess, B.; McGraw, G. (October 2004). "Risk Analysis in Software Design". IEEE Security & Privacy (IEEE) 2 (4): 76–84. doi:10.1109/MSP.2004.55. 
  9. Yamaguchi, Fabian; Lottmann, Markus; Rieck, Konrad (December 2012). "Generalized vulnerability extrapolation using abstract syntax trees". Proceedings of the 28th Annual Computer Security Applications Conference. 2. IEEE. pp. 359–368. doi:10.1145/2420950.2421003. ISBN 9781450313124. 
  10. Booch, Grady; Kozaczynski, Wojtek (September 1998). "Component-Based Software Engineering". IEEE Software 15 (5): 34–36. doi:10.1109/MS.1998.714621. 
  11. Meso, Peter; Jain, Radhika (December 2006). "Agile Software Development: Adaptive Systems Principles and Best Practices". Information Systems Management 23 (3): 19–30. doi:10.1201/1078.10580530/46108.23.3.20060601/93704.3. 
  12. Livshits, V.B.; Lam, M.S. (May 2006). "Finding Security Vulnerabilities in Java Applications with Static Analysis". USENIX Security Symposium 14: 18. 
  13. Jovanovic, N.; Kruegel, C.; Kirda, E. (May 2006). "Pixy: A static analysis tool for detecting Web application vulnerabilities". 2006 IEEE Symposium on Security and Privacy (S&P'06). IEEE. pp. 359–368. doi:10.1109/SP.2006.29. ISBN 0-7695-2574-1. 
  14. "2016 Data Breach Investigations Report". Verizon. 2016. https://www.verizon.com/business/resources/Ta80/reports/DBIR_2016_Report.pdf. 
  15. "Clearswift report: 40 percent of firms expect a data breach in the Next Year". Endeavor Business Media. 20 November 2015. https://www.securityinfowatch.com/cybersecurity/information-security/press-release/12141612/clearview-clearswift-report-40-percent-of-firms-expect-a-data-breach-in-the-next-year. 
  16. "The Ticking Time Bomb: 40% of Firms Expect an Insider Data Breach in the Next 12 Months". Fortra. 18 November 2015. https://www.clearswift.com/resources/press-releases/ticking-time-bomb-40-firms-expect-insider-data-breach-next-12-months. 
  17. Xianyong, Meng; Qian, Kai; Lo, Dan; Bhattacharya, Prabir; Wu, Fan (June 2018). "Secure Mobile Software Development with Vulnerability Detectors in Static Code Analysis". 2018 International Symposium on Networks, Computers and Communications (ISNCC). pp. 1–4. doi:10.1109/ISNCC.2018.8531071. ISBN 978-1-5386-3779-1. 
  18. Hossain, Shahadat (October 2018). "Rework and Reuse Effects in Software Economy". Global Journal of Computer Science and Technology 18 (C4): 35–50. https://computerresearch.org/index.php/computer/article/view/1780. 
  19. Okun, V.; Guthrie, W. F.; Gaucher, H.; Black, P. E. (October 2007). "Effect of static analysis tools on software security: preliminary investigation". Proceedings of the 2007 ACM Workshop on Quality of Protection (ACM): 1–5. doi:10.1145/1314257.1314260. https://samate.nist.gov/docs/SA_tool_effect_QoP.pdf. 
  20. Siavvas, M.; Tsoukalas, D.; Janković, M.; Kehagias, D.; Chatzigeorgiou, A.; Tzovaras, D.; Aničić, N.; Gelenbe, E. (August 2019). "An Empirical Evaluation of the Relationship between Technical Debt and Software Security". in Konjović, Z. 1. pp. 199–203. doi:10.5281/zenodo.3374712. 
  21. Tahaei, Mohammad; Vaniea, Kami; Beznosov, Konstantin (Kosta); Wolters, Maria K (6 May 2021). Security Notifications in Static Analysis Tools: Developers' Attitudes, Comprehension, and Ability to Act on Them. pp. 1–17. doi:10.1145/3411764.3445616. ISBN 9781450380966. https://www.research.ed.ac.uk/en/publications/e1bc04ef-ae83-4e82-8ade-ca572bc503d2. 
  22. Arreaza, Gustavo Jose Nieves (June 2019). "Methodology for Developing Secure Apps in the Clouds. (MDSAC) for IEEECS Confererences". 2019 6th IEEE International Conference on Cyber Security and Cloud Computing (CSCloud)/ 2019 5th IEEE International Conference on Edge Computing and Scalable Cloud (EdgeCom). IEEE. pp. 102–106. doi:10.1109/CSCloud/EdgeCom.2019.00-11. ISBN 978-1-7281-1661-7.