Linus's law
In software development, Linus's law is the assertion that "given enough eyeballs, all bugs are shallow". The law was formulated by Eric S. Raymond in his essay and book The Cathedral and the Bazaar (1999), and was named in honor of Linus Torvalds.[1][2]
A more formal statement is: "Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone." Presenting code to multiple developers in order to reach consensus about its acceptance is a simple form of software review. Researchers and practitioners have repeatedly shown the effectiveness of reviewing processes in finding bugs and security issues.[3]
Validity
In Facts and Fallacies of Software Engineering, Robert Glass refers to the law as a "mantra" of the open source movement, but calls it a fallacy due to the lack of supporting evidence and because research indicates that the rate at which additional bugs are uncovered does not scale linearly with the number of reviewers; rather, there is a small maximum number of useful reviewers, between two and four, and additional reviewers above this number uncover bugs at a much lower rate.[4] While closed-source practitioners also promote stringent, independent code analysis during a software project's development, they focus on in-depth review by a few rather than primarily on the number of "eyeballs".[5]
The persistence of the Heartbleed security bug in a critical piece of code for two years has been considered a refutation of Raymond's dictum.[6][7][8][9] Larry Seltzer suspects that the availability of source code may cause some developers and researchers to perform less extensive testing than they would with closed-source software, making it easier for bugs to remain.[9] In 2015, the Linux Foundation's executive director Jim Zemlin argued that the complexity of modern software has increased to such levels that specific resource allocation is desirable to improve its security. Regarding some of 2014's largest global open source software vulnerabilities, he said, "In these cases, the eyeballs weren't really looking".[8] Large-scale experiments or peer-reviewed surveys to test how well the mantra holds in practice have not been performed.[10]
Empirical support for the validity of Linus's law[11] was obtained by comparing popular and unpopular projects of the same organization. Popular projects were defined as those in the top 5% by GitHub stars (7,481 stars or more). Bug identification was measured using the corrective commit probability, the ratio of commits determined to be related to fixing bugs. The analysis showed that popular projects had a higher ratio of bug fixes (e.g., Google's popular projects had a 27% higher bug-fix rate than its less popular projects). Since it is unlikely that Google lowered its code quality standards in more popular projects, this indicates increased bug detection efficiency in popular projects.
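The corrective commit probability can be illustrated with a minimal sketch. The snippet below estimates the ratio of bug-fixing commits in a local Git repository using a naive keyword heuristic over commit subject lines; the pattern, function name, and classification rule are illustrative assumptions only, and the cited study uses a more elaborate linguistic model to label corrective commits.

```python
import re
import subprocess

# Hypothetical keyword heuristic; the cited study uses a richer linguistic model.
BUG_FIX_PATTERN = re.compile(r"\b(fix(es|ed)?|bug|defect|fault|patch)\b", re.IGNORECASE)

def corrective_commit_probability(repo_path: str) -> float:
    """Estimate the share of commits whose messages suggest a bug fix."""
    # Read one subject line per commit from the repository's history.
    subjects = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=%s"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()
    if not subjects:
        return 0.0
    corrective = sum(1 for msg in subjects if BUG_FIX_PATTERN.search(msg))
    return corrective / len(subjects)

if __name__ == "__main__":
    print(f"Estimated corrective commit probability: {corrective_commit_probability('.'):.2%}")
```

Under this sketch, comparing the resulting ratios across a popular and an unpopular project of the same organization would mirror the comparison described above.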
References
- ↑ Raymond, Eric S. "The Cathedral and the Bazaar". catb.org. http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/ar01s04.html.
- ↑ Raymond, Eric S. (1999). The Cathedral and the Bazaar. O'Reilly Media. p. 30. ISBN 1-56592-724-9. https://books.google.com/books?id=F6qgFtLwpJgC&pg=PA30.
- ↑ Pfleeger, Charles P.; Pfleeger, Shari Lawrence (2003). Security in Computing, 4th ed. Prentice Hall PTR. pp. 154–157. ISBN 0-13-239077-9. https://books.google.com/books?id=O3VB-zspJo4C&pg=PA154.
- ↑ Glass, Robert L. (2003). Facts and Fallacies of Software Engineering. Addison-Wesley. p. 174. ISBN 0-321-11742-5. https://books.google.com/books?id=3Ntz-UJzZN0C&pg=PA174.
- ↑ Howard, Michael; LeBlanc, David (2003). Writing Secure Code, 2nd ed. Microsoft Press. pp. 44–45, 615, 726. ISBN 0-7356-1722-8. https://books.google.com/books?id=Uafp7m2wPcMC&q=Writing+Secure+Code.
- ↑ Byfield, Bruce (April 14, 2014). "Does Heartbleed Disprove 'Open Source is Safer'?". Datamation. https://www.datamation.com/open-source/does-heartbleed-disprove-open-source-is-safer-1.html.
- ↑ Felten, Edward W.; Kroll, Joshua A. (2014). "Help Wanted on Internet Security". Scientific American 311 (1): 14. doi:10.1038/scientificamerican0714-14. PMID 24974688. Bibcode: 2014SciAm.311a..14F.
- ↑ Kerner, Sean Michael (February 20, 2015). "Why All Linux (Security) Bugs Aren't Shallow". eSecurity Planet. http://www.esecurityplanet.com/open-source-security/why-all-linux-security-bugs-arent-shallow.html.
- ↑ Seltzer, Larry (April 14, 2014). "Did open source matter for Heartbleed?". ZDNet. https://www.zdnet.com/article/did-open-source-matter-for-heartbleed/.
- ↑ Arceneaux, Kevin; Gerber, Alan S.; Green, Donald P. (January 2006). "Comparing Experimental and Matching Methods Using a Large-Scale Voter Mobilization Experiment". Political Analysis 14 (1): 37–62. doi:10.1093/pan/mpj001. ISSN 1047-1987. https://www.cambridge.org/core/journals/political-analysis/article/abs/comparing-experimental-and-matching-methods-using-a-largescale-voter-mobilization-experiment/E7B43806BEE0FB3000EE6627A9C03720.
- ↑ Amit, Idan; Feitelson, Dror G. (2020). "The Corrective Commit Probability Code Quality Metric". arXiv:2007.10912 [cs.SE].
Further reading
- Jing Wang; J.M. Carroll (2011-05-27). "Behind Linus's law: A preliminary analysis of open source software peer review practices in Mozilla". Int. Conf. on Collaboration Technologies and Systems (CTS), Philadelphia, PA. IEEE Xplore Digital Library. pp. 117–124. doi:10.1109/CTS.2011.5928673.
Original source: https://en.wikipedia.org/wiki/Linus's law.