Scientific integrity
Research integrity or scientific integrity is the branch of scientific ethics concerned with "best practice" and the rules of professional conduct of researchers.
First introduced in the 19th century by Charles Babbage, the concept of research integrity came to the fore in the late 1970s. A series of publicized scandals in the United States led to heightened debate on the ethical norms of science and the limitations of the self-regulation processes implemented by scientific communities and institutions. Formalized definitions of scientific misconduct, and codes of conduct, became the main policy response after 1990. In the 21st century, codes of conduct on research integrity are widespread. Along with codes of conduct at institutional and national levels, major international texts include the European Charter for Researchers (2005), the Singapore Statement on Research Integrity (2010), the European Code of Conduct for Research Integrity (2011 & 2017) and the Hong Kong Principles for Assessing Researchers (2020).
Scientific literature on research integrity falls mostly into two categories: on the one hand, mappings of definitions and categories, especially in regard to scientific misconduct; on the other, empirical surveys of the attitudes and practices of scientists.[1] Following the development of codes of conduct, taxonomies of unethical practices have been significantly expanded beyond the long-established forms of scientific fraud (plagiarism, falsification and fabrication of results). Definitions of "questionable research practices" and the debate over reproducibility also target a grey area of dubious scientific results, which may not be the outcome of deliberate manipulation.
The concrete impact of codes of conduct and other measures put in place to ensure research integrity remains uncertain. Several case studies have highlighted that, while the principles of the codes adhere to common scientific ideals, they are seen as remote from actual work practices and their effectiveness has been criticized.
After 2010, debates on research integrity have been increasingly linked to open science. International codes of conduct and national legislation on research integrity have officially endorsed the open sharing of scientific outputs (publications, data and code) as a way to limit questionable research practices and to enhance reproducibility. References to open science have incidentally opened up the debate over scientific integrity beyond academic communities, as it increasingly concerns a wider audience of scientific readers.
Definition and history
Research integrity or scientific integrity became an autonomous concept within scientific ethics in the late 1970s. In contrast with other forms of ethical misconduct, the debate over research integrity is focused on a "victimless offence" that only hurts "the robustness of scientific record and public trust in science".[2] Infractions of research integrity chiefly include "data fabrication, falsification, or plagiarism".[2] In that sense, research integrity mostly deals with the internal process of science. It can be treated as a community issue that need not involve external observers: "research integrity is more autonomously defined and regulated by the community, while research ethics (again, a narrow definition) has closer links to legislation".[2]
Emergence of the issue (1970–1980)
Before the 1970s, ethical issues were largely focused on the conduct of medical experiments, especially in regard to tests on humans. In 1803, the "code" of Thomas Percival created a moral foundation for experimental treatments that "was built upon fairly regularly" throughout the next two centuries, notably by Walter Reed in 1898 and by the Berlin code in 1900.[3] After the Second World War, Nazi human experimentation motivated the development of international, widely acknowledged codes of research ethics, such as the Nuremberg Code (1947) and the World Medical Association's Declaration of Helsinki.[4]
According to Kenneth Pimple, Charles Babbage was the first author to single out the specific issue of scientific integrity.[5] In Reflections on the Decline of Science in England, and on Some of its Causes, first published in 1830, Babbage identified four classes of scientific frauds,[6] ranging from outright forgery to varying degrees of arrangement and "cooking" of the data or the methods.
Research integrity became a major debated topic in biological sciences after 1970, due to a combination of factors: the development of advanced data analysis methods, the growing commercial relevance of fundamental research,[7] and the increased focus of federal funding agencies in the context of big science.[8] In 1974, the "painted mouse incident" attracted unprecedented media attention: William Summerlin had darkened patches of skin on white mice with a felt-tip pen to make it appear that skin grafts from black mice had been successfully transplanted.[9] Between 1979 and 1981, several major cases of scientific fraud and plagiarism drew a greater focus on the issue from researchers and policymakers in the United States:[7] as many as four important frauds came to light in the summer of 1980.
At the time, the "scientific community responded to reports of 'scientific fraud' (as it was often called) by asserting that such cases are rare and that neither errors nor deception can be hidden for long because of science's self-correcting nature".[10] A journalist at Science, William Broad, took the opposite position and made an influential contribution to the issue of research integrity. In an answer to the US House of Representatives Science and Technology subcommittee, he highlighted that "cheating in science was nothing new" but, until recently, "had been handled as an internal affair". In a detailed investigation co-authored with Nicholas Wade, Betrayers of the Truth, Broad described scientific fraud as a structural problem: "As more cases of frauds broke into public view (…) we wondered if fraud wasn't a quite regular minor feature of the scientific landscape (…) Logic, replication, peer review — all had been successfully defied by scientific forgers, often for extended periods of time."[11] Other early assessments of the extent of scientific fraud presented a more nuanced picture.[12] For Patricia Woolf, along with a few obvious manipulations, there was a wide range of grey areas, which were due to the complexity of fundamental research: "the boundaries between egregious self-deception, culpable carelessness, fraud, and just plain error, can be very blurred indeed".[13] Characteristically, the debate led to a re-evaluation of past scientific practices. In 1913, a well-known experiment on the electron charge by Robert Millikan had explicitly relied on discarding results that did not agree with the underlying theory: while well received at the time, by the 1980s this work had come to be considered a textbook example of scientific manipulation.[14]
Formalization of research integrity (1990–2020)
By the end of the 1980s, the amplification of misconduct scandals and the heightened political and public scrutiny put scientists in a difficult position in the United States and elsewhere: "The tone of the 1988 US congressional oversight hearings, chaired by Rep. John Dingell (D-MI), that investigated how research institutions were responding to misconduct allegations reinforced many scientists’ view that both they and scientific research itself were under siege."[15] The main answer was procedural: research integrity has "been codified into numerous codes of conduct field specific, national, and international alike."[16] This policy response largely stemmed from research communities, funders and scientific administrators. In the United States, the United States Public Health Service and the National Science Foundation adopted "similar definitions of misconduct in science" in 1989 and 1991.[17] The concepts of research integrity and its reverse, scientific misconduct, were especially relevant from the perspective of funding bodies, since they made it possible to "delineate the research-related practices that merit intervention":[18] a lack of integrity led not only to unethical but also to inefficient research, and funds were better allocated elsewhere.
After 1990, there was a "veritable explosion of scientific codes of conduct".[19] In 2007, the OECD Global Science Forum published a report on best practices for promoting scientific integrity and preventing misconduct in science. Major international texts include:
- European Charter for Researchers (2005)
- the Singapore Statement on Research Integrity (2010)[20]
- the European Code of Conduct for Research Integrity of All European Academies (ALLEA) and the European Science Foundation (ESF) (2011, revised in 2017[21]).
There are no global estimates of the total number of codes of conduct related to research integrity.[22] A UNESCO project, the Global Ethics Observatory (no longer accessible after 2021), referenced 155 codes of conduct,[23] but "this is probably just a fraction of the total number of codes produced in recent years."[24] Codes have been created in highly diverse settings and show a wide variation in scale and ambition: along with national-scale codes, there are codes for scientific societies, institutions and R&D services.[22] While these normative texts frequently share a core of common principles, concerns have grown "over fragmentation, lack of interoperability and varying understandings of central terms".[25]
Taxonomy and classification
In codes of conduct, the definition of research integrity is usually negative: the collection of norms aims to single out different forms of unethical research and scientific misconduct with varying degrees of gravity.
The multiplication of codes of conduct has also corresponded with an expansion of scope. While the initial debate was focused on the "three deadly sins of scientific and scholarly research: fabrication, falsification and plagiarism", attention has since shifted "to the lesser breaches of research integrity".[26] In 1830, Charles Babbage introduced the first taxonomy of scientific frauds, which already covered some forms of questionable research practices: hoaxing (a voluntary fraud "far from justifiable"[27]), forging ("whereas the forger is one who, wishing to acquire a reputation for science, records observations which he has never made"[28]), trimming (which "consists in clipping off little bits here and there from those observations which differ most in excess from the mean"[29]) and cooking. Cooking is Babbage's main focus, an "art of various forms, the object of which is to give to ordinary observations the appearance and character of those of the highest degree of accuracy".[29] It falls under several sub-cases such as data selection ("if a hundred observations are made, the cook must be very unlucky if he cannot pick out fifty or twenty to do the serving up"[30]), model/algorithm selection ("another approved receipt (…) is to calculate them by two different formulae"[30]) or the use of different constants.[31]
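Babbage's "cooking" by data selection can be made concrete with a short numerical sketch. The example below is a modern, hypothetical illustration (all values are invented and Python is used purely for demonstration): out of 100 noisy observations, keeping only the 20 that lie closest to a desired value both pulls the reported mean toward that value and gives the result the "appearance and character" of high accuracy.

```python
# Hypothetical illustration of Babbage's "cooking" by data selection (not from his text):
# from 100 noisy measurements, the "cook" keeps only the 20 that fall closest to the
# value he wishes to report, shrinking the apparent scatter without improving the data.
import random
import statistics

random.seed(1)
true_value = 10.0
observations = [random.gauss(true_value, 2.0) for _ in range(100)]  # honest but noisy data

desired_value = 11.0  # the result the "cook" wants to report
cooked = sorted(observations, key=lambda x: abs(x - desired_value))[:20]

for label, data in [("All 100 observations", observations),
                    ("20 'cooked' observations", cooked)]:
    print(f"{label}: mean = {statistics.mean(data):.2f}, "
          f"standard deviation = {statistics.stdev(data):.2f}")
```

Running the sketch, the full sample scatters around the true value with a standard deviation near 2, while the selected subset clusters tightly around the desired value, looking both more precise and closer to the hoped-for result.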
In the late 20th century, this classification was greatly expanded and came to encompass a wider range of deficiencies than intentional fraud. The formalization of research integrity entailed a structural change in the vocabulary and concepts associated with it.[32] By the end of the 1990s, use of the expression "scientific fraud" was discouraged in the United States in favor of a "semi-legal term": scientific misconduct. The scope of scientific misconduct is expansive: along with data fabrication, falsification and plagiarism, it includes "other serious deviations" that are demonstrably done in bad faith.[33] The associated concept of questionable research practices, first introduced in a 1992 report of the Committee on Science, Engineering, and Public Policy, has an even broader scope, as it also encompasses potentially non-intentional research failures (such as inadequacies in the research data management process).[34] In 2016, a study identified as many as 34 questionable research practices or "degrees of freedom" that can occur at every step of a project: the initial hypothesis, the design of the study, the collection of the data, the analysis and the reporting.[35]
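The statistical mechanism behind such "degrees of freedom" can be illustrated with a short simulation. The sketch below is a hypothetical example (not drawn from the cited studies) that exercises a single degree of freedom, selective outcome reporting: when several unrelated outcomes are measured and whichever crosses the conventional p < 0.05 threshold is reported, the false-positive rate rises well above the nominal 5% even though no real effect exists in the simulated data.

```python
# Hypothetical Monte Carlo sketch of one "researcher degree of freedom": selective
# outcome reporting. All data are simulated noise, so every "significant" result
# is a false positive by construction.
import random
import statistics
import math

def welch_t_p_value(a, b):
    """Two-sided p-value for Welch's t statistic, using a normal approximation
    (adequate here because each group has 50 observations)."""
    t = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(
        statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
    return math.erfc(abs(t) / math.sqrt(2))

random.seed(0)
n_simulations, n_outcomes, n_per_group = 2000, 5, 50
false_positives_single, false_positives_any = 0, 0

for _ in range(n_simulations):
    p_values = []
    for _ in range(n_outcomes):
        control = [random.gauss(0, 1) for _ in range(n_per_group)]
        treatment = [random.gauss(0, 1) for _ in range(n_per_group)]  # no real effect
        p_values.append(welch_t_p_value(control, treatment))
    false_positives_single += p_values[0] < 0.05  # honest: one pre-specified outcome
    false_positives_any += min(p_values) < 0.05   # QRP: report whichever outcome "worked"

print(f"False-positive rate, single pre-specified outcome: "
      f"{false_positives_single / n_simulations:.1%}")
print(f"False-positive rate, best of {n_outcomes} outcomes: "
      f"{false_positives_any / n_simulations:.1%}")
```

With five independent outcomes, the chance of at least one spurious "significant" result is about 1 − 0.95^5 ≈ 23%, which is roughly what the simulation reports for the "best of five" strategy, against the nominal 5% for a single pre-specified outcome.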
After 2005, research integrity was additionally redefined through the perspective of research reproducibility and, more specifically, of the "reproducibility crisis". Studies of reproducibility suggest that there is a continuum between irreproducibility, questionable research practices and scientific misconduct: "Reproducibility is not just a scientific issue; it is also an ethical one. When scientists cannot reproduce a research result, they may suspect data fabrication or falsification."[36] In this context, ethical debates are less focused on a few highly publicized scandals and more on the suspicion that the standard scientific process is broken and fails to meet its own standards.
Current landscape and issues
Prevalence of ethical issues
In 2009, a meta-analysis of 18 surveys estimated that less than 2% of scientists "admitted to have fabricated, falsified or modified data or results at least once". The real prevalence may be underestimated due to self-reporting: regarding "the behaviour of colleagues admission rates were 14.12%".[37] Questionable research practices are more widespread, with more than one third of respondents admitting to having engaged in them at least once.[38] A large 2021 survey of 6,813 respondents in the Netherlands found significantly higher estimates, with 4% of respondents engaging in data fabrication and more than half engaging in questionable research practices.[39] Higher rates can be attributed either to a deterioration of ethical norms or to "the increased awareness of research integrity in recent years".[40] The highest rates of self-declared scientific misconduct are found in the medical and life sciences, with as much as 10.4% of the respondents surveyed in the Netherlands admitting to scientific fraud (either fabrication or falsification of data).[41]
Other forms of scientific misconduct or questionable research practices are both less problematic and much more widespread. A 2012 survey of 2,000 psychologists found that "the percentage of respondents who have engaged in questionable practices was surprisingly high",[42] especially in regard to selective reporting.[42] A 2018 survey of 807 researchers in ecology and evolutionary biology showed that 64% "did not report results because they were not statistically significant", 42% had decided to collect additional data "after inspecting whether results were statistically significant" and 51% "reported an unexpected finding as though it had been hypothesised from the start".[43] As they come from self-reported surveys, these figures are likely underestimates, and questionable research practices may be even more mainstream.[44]
Implementation and assessment of codes of conduct
Several case studies and retrospective analyses have been devoted to the reception of codes of conduct in scientific communities. They frequently highlight a discrepancy between the theoretical norms and the "lived morality of researchers".[45]
In 2004, Caroline Whitbeck underlined that the enforcement of a few formal rules had overall failed to address a structural "erosion or neglect" of scientific trust.[46] In 2009, Schuurbiers, Osseweijer and Kinderlerer conducted a series of interviews following the introduction of the Dutch code of conduct on research integrity in 2005. Overall, most respondents were unaware of the code and of complementary ethical recommendations.[47] While the principles "were seen to reflect the norms and values within science rather well", they seemed isolated from actual work practices, which "may lead to morally complex situations".[48] Respondents were also critical of the individualist philosophy underlying the code, which shifted the entire blame onto individual researchers without taking into account institutional or community-wide issues.[49] In 2015, a survey of "64 faculty members at a large southwestern university" in the United States "yielded similar results":[50] many of the respondents were not aware of the existing ethical guidelines, and the communication process remained poor.[51] In 2019, a case study on Italian universities noted that the proliferation of research codes "has a reactive nature because codes of ethics are drawn up in response to scandals and as a result are punitive and negative, with lists of prohibitions".[52]
Codes of conduct on research integrity may have a more significant impact on professional identity. The development of research codes has been equated with an internalization of research integrity issues within scientific circles, and their close association with disputed results has made them a typical form of "knowledge club" governance. In contrast to a wider range of ethical issues that may overlap with more general social debates (such as gender equality), research integrity belongs to a form of professional ethics analogous to the ethical standards applied by journalists or medical practitioners.[53] As such, not only does it create a common moral framework but also, incidentally, "justifies the existence of the profession as separate from other professions".[53] While the impact of codes on actual ethical practices remains difficult to assess, they have a more measurable impact on the professionalization of research, by transforming informal norms and customs into a set of predefined principles: "codes in general are supported both by those pursuing them as a vehicle to encourage the greater professionalization of biologists (e.g., an initial stage to introducing professional licensing) and those seeking them to forestall any further regulation."[54]
Research integrity and open science
In the 2000s and 2010s, scientific integrity was gradually reframed in the context of open science and of the increased accessibility of scientific publications. The debate on research reproducibility has significantly contributed to this evolution.
Ethics of open science
The underlying ethical principles of open science predate the development of an organized open science movement. In 1973, Robert K. Merton theorized a normative "ethos of science" structured around a "norm of disclosure". This norm "was far from universally accepted" in the early development of scientific communities and has remained "one of the many ambivalent precepts contained in the institution of science."[55] Disclosure was counterbalanced by the limitations of the publication and evaluation process, which tended to slow down the dissemination of research results.[55] In the early 1990s, this norm of disclosure was reframed as a norm of "openness" or "open science".[56]
The early open access and open science movements emerged partly as a reaction against the large corporate model that had come to dominate scientific publishing since the Second World War.[57] Open science was not framed as a radical transformation of scientific communication but as a realization of core underlying principles, already visible at the start of the scientific revolution of the 17th and 18th centuries: the autonomy and self-governance of scientific communities and the dissemination of research results.[58]
Since 2000, the open science movement has expanded beyond access to scientific outputs (publications, data or software) to encompass the entire process of scientific production. The reproducibility crisis has been an instrumental factor in this development, as it moved the debates over the definition of open science further away from scientific publishing. In 2018, Vicente-Saez and Martinez-Fuentes attempted to map the common values shared by the standard definitions of open science in the English-speaking scientific literature indexed in Scopus and the Web of Science.[59] Access is no longer the main dimension of open science, as it has been extended by more recent commitments toward transparency, collaborative work and social impact.[60] These diverse conceptual dimensions encompass "the emerging trends on Open Science such as open code […] open notebooks, open lab books, science blogs, collaborative bibliographies, citizen science, open peer review, or pre-registration".[61]
Through this process, open science has been increasingly structured around a consistent set of ethical principles: "novel open science practices have developed in tandem with novel organising forms of conducting and sharing research through open repositories, open physical labs, and transdisciplinary research platforms. Together, these novel practices and organising forms are expanding the ethos of science at universities."[62]
Codification of open science ethics
The translation of the ethical values of open science into applied recommendations was mostly undertaken by institutional and community initiatives until the 2010s. The TOP guidelines were elaborated in 2014 by a committee for Transparency and Openness Promotion that included "disciplinary leaders, journal editors, funding agency representatives, and disciplinary experts largely from the social and behavioral sciences".[63] The guidelines rely on eight standards, with different levels of compliance. While the standards are modular, they also aim to articulate a consistent ethos of science, as "they also complement each other, in that commitment to one standard may facilitate adoption of others."[63] The highest levels of compliance for each standard include the following requirements:
Standard | Contents |
---|---|
Citation standards (1) | providing "appropriate citation for data and materials" for each publication.[64] |
Data transparency (2), analytic methods transparency (3), research materials transparency (4) | with all the relevant data, code and research materials stored in a "trusted repository" and all analyses independently reproduced prior to publication.[64] |
Design and analysis transparency (5) | with dedicated standards for "review and publication".[64] |
Preregistration of studies (6), preregistration of analysis plans (7) | with publications providing "link and badge in article to meeting requirements".[64] |
Replication (8) | with the journal using "Registered Reports as a submission option for replication studies with peer review".[64] |
In 2018, Heidi Laine attempted to establish a nearly-exhaustive list of "ethical principles associated with open science":[65]
Scientific activity | Open science principles | Singapore Statement (2010) | Montreal Statement (2013) | Finnish Code of conduct (2012) | European Code of conduct (2017) |
---|---|---|---|---|---|
(Full titles given below) | |||||
Publication | Open access | Full requirements | Full requirements | Full requirements | Full requirements |
Research Data | Open scientific data | Partial requirements | Mention/encouragement | Mention/encouragement | Full requirements |
Research Methods | Reproducibility | Mention/encouragement | Mention/encouragement | Mention/encouragement | Full requirements |
Evaluation | Open Evaluation | No mention | No mention | No mention | No mention |
Collaboration | Citizen Science, open collaboration | No mention | No mention | No mention | No mention |
Communication | Citizen Science, science communication | Mention/encouragement | No mention | Mention/encouragement | Mention/encouragement |
This categorization has to contend with the diversity of approaches and values associated with the open science movement and their ongoing evolution, as the "term will likely remain as fluid as any other attempt to coin a complex system of practices, values and ideologies in one term".[66] Laine identified significant variation in the way open science principles have been embedded in four major codes of conduct and statements on research integrity: the Singapore Statement on Research Integrity (2010), the Montreal Statement on Research Integrity in Cross-Boundary Research Collaborations (2013), the Responsible Conduct of Research and Procedures for Handling Allegations of Misconduct in Finland (2012) and the European Code of Conduct for Research Integrity (2017). Access to research publications is recommended in all four codes. The integration of data sharing and reproducibility practices is less obvious and varies from tacit approval to detailed support, in the case of the most recent European Code of Conduct: "The European code pays data management almost an equal amount of attention as publishing and is also in this sense the most advanced of the four CoCs."[67] Yet important areas of open science are consistently ignored, especially the development of open science infrastructure, increased transparency of evaluation, and support for citizen science and wider social impact. Overall, Laine found "none of the evaluated CoCs to be in blatant contradiction with the ethical principles of open science, but only the European code of conduct can be said to actively support and give guidance on open science."
After 2020, new forms of open science code of conduct have explicitly claimed to "foster the ethos of open scientific practices".[68] First adopted in July 2020, the Hong Kong principles for assessing researchers acknowledge open science as one of the five pillars of scientific integrity: "It seems clear that the various modalities of open science need to be rewarded in the assessment of researchers because these behaviors strongly increase transparency, which is a core principle of research integrity."[69]
Research integrity and society
While there is still a continuum between the procedural norms of the codes of conduct and the range of values encompassed by open science, open science has significantly altered the setting and the context of the ethical debate. Open scientific productions can, in theory, be universally shared: their dissemination is not constrained by the classic membership model of the "knowledge club". The implications are wider as well, as the potential misuse of scientific publications is no longer limited to professional scientists. This discrepancy was already visible in the late 2000s, although it was framed under "different buzzwords":[70] in a case study on the implementation of the Dutch code of conduct, Schuurbiers, Osseweijer and Kinderlerer already identified a "shift in practices" that "goes by many names like Mode 2 science, post-normal science, or post-academic science", driven by a diverse array of transformations such as technological evolution in the management of research, increased involvement of private actors, open innovation or open access.[71] These structural trends were not well covered by the existing codes of conduct.[71]
In the 1990s and 2000s, discussions about research integrity became increasingly professionalized and detached from the public domain. The shift toward open science may contradict this trend, as the range of interested parties and potential reusers of scientific production has expanded well beyond professional academic circles. In 2018, Heidi Laine underlined that established codes of conduct had not yet taken this decisive step: "The one aspect where even the European code falls short of a full recognition of open science is in crossing the traditional professional borders of the research community, i.e. citizen science, open collaboration and science communication."[72] By not taking this new framework into account, existing codes of conduct risk becoming increasingly out of touch with the reality of scientific practices:
If the ethical aspects of open science continue to be left out of RCR (Responsible Code of Research) guidance and ponderings, the research community risks losses on both fronts: open science as well as RI (Research integrity). Open science is just as much about values and ethics as it is about technology. Most of all it is about the role of science in society. It is perhaps the most all-encompassing value discussion that the research community has ever known, and the research integrity angle and community of experts risks being side-lined.[73]
The broadened discussion about scientific integrity has led to increased involvement of political institutions and representatives, beyond specialized scientific committees and funders. In 2021, the French government passed a decree on scientific integrity which called for the generalization of open science practices.[74]
Initiatives
In Europe
The European Code of Conduct for Research Integrity, published in 2011 and revised in 2017, develops the concept of scientific integrity along four main lines:
- Reliability: concerns the quality and reproducibility of research.
- Honesty: concerns the transparency and objectivity of research.
- Respect: for the human, cultural, and ecological environment of research.
- Accountability: concerns the implications of publishing the research.
In the USA
US Department of Health and Human Services
The US Department of Health and Human Services (HHS) adopted the definition of scientific integrity stated below.[77] As of 2023, the policy was under review, with official publication expected in early 2024.[78]
" Scientific integrity is the adherence to professional practices, ethical behavior, and the principles of honesty and objectivity when conducting, managing, using the results of, and communicating about science and scientific activities. Inclusivity, transparency, and protection from inappropriate influence are hallmarks of scientific integrity.”-HHS
To promote a culture of scientific integrity, HHS has outlined its policy in seven specific areas:[77]
- Protecting Scientific Processes
- Ensuring the Free Flow of Scientific Information
- Supporting Policymaking Processes
- Ensuring Accountability
- Protecting Scientists
- Professional Development for Government Scientists
- Federal Advisory Committees
Within these areas, open science practices can be promoted to protect against bias, plagiarism, data fabrication and falsification, as well as inappropriate influence, political interference, and censorship.[79]
National Institutes of Health
The National Institutes of Health (NIH) is an agency within HHS. It acts as the nation's medical research agency, focusing on making important discoveries that improve health and save lives.[80] The mission of the NIH is to provide a fundamental understanding of the nature and behavior of living systems and to apply that understanding to improve health, extend life, and reduce illness and disability.[81] The NIH adopts the definition of scientific integrity from the HHS Scientific Integrity Policy draft to ensure its scientific findings are objective, credible, transparent, and readily available to the public. All NIH staff are expected to:
- Foster an organizational Culture of Scientific Integrity
- Protect the Integrity of the Research Process
- Communicate Science with Integrity
- Safeguard Scientific Integrity
References
- ↑ Laine 2018, p. 52
- ↑ 2.0 2.1 2.2 Laine 2018, p. 50
- ↑ Pimple 2017, p. XV
- ↑ Pimple 2017, p. XVI
- ↑ Pimple 2017, p. XVI
- ↑ Babbage 1830, p. 176
- ↑ 7.0 7.1 Löppönen & Vuorio 2013, p. 3
- ↑ Pimple 2017, p. XVIII
- ↑ Woolf 1988, p. 69
- ↑ Pimple 2017, p. XVIII
- ↑ Broad & Wade 1983, p. 8
- ↑ Pimple 2017, p. XIX
- ↑ Woolf 1988, p. 80
- ↑ Whitbeck 2004, p. 49
- ↑ Whitbeck 2004, p. 50
- ↑ Laine 2018, p. 49
- ↑ Pimple 2017, p. XIX
- ↑ Pimple 2017, p. XX
- ↑ Schuurbiers et al. 2009
- ↑ "Singapore Statement on Research Integrity". 2010. https://www.jsps.go.jp/english/e-kousei/data/singapore_statement_EN.pdf.
- ↑ "ALLEA publishes revised edition of The European Code of Conduct for Research Integrity". All European Academies (ALLEA). 2017. https://allea.org/allea-publishes-revised-edition-european-code-conduct-research-integrity/.
- ↑ 22.0 22.1 Laine 2018, p. 53
- ↑ Database 5: code of conduct, Unesco, archived in 2021 by Internet Archive
- ↑ Schuurbiers et al. 2009
- ↑ Laine 2018, p. 52
- ↑ Bouter 2020, p. 2364
- ↑ Babbage 1830, p. 176
- ↑ Babbage 1830, p. 177
- ↑ 29.0 29.1 Babbage 1830, p. 178
- ↑ 30.0 30.1 Babbage 1830, p. 179
- ↑ Babbage 1830, p. 180
- ↑ Pimple 2002, p. 199
- ↑ Pimple 2002, p. 200
- ↑ Pimple 2002, p. 202
- ↑ Wicherts et al. 2016
- ↑ Resnik & Shamoo 2017
- ↑ Fanelli 2009
- ↑ Fanelli 2009
- ↑ Gopalakrishna et al. 2021
- ↑ Gopalakrishna et al. 2021, p. 5
- ↑ Gopalakrishna et al. 2021, p. 5
- ↑ 42.0 42.1 John, Loewenstein & Prelec 2012, p. 525.
- ↑ Fraser et al. 2018, p. 1.
- ↑ Fraser et al. 2018, p. 12.
- ↑ Laine 2018, p. 54
- ↑ Whitbeck 2004, p. 48
- ↑ Schuurbiers et al. 2009, p. 218
- ↑ Schuurbiers et al. 2009, p. 222
- ↑ Schuurbiers et al. 2009, p. 224
- ↑ Laine 2018, p. 54
- ↑ Giorgini et al. 2015, p. 10
- ↑ Mion et al. 2019
- ↑ 53.0 53.1 Laine 2018, p. 51
- ↑ Rappert 2007, p. 8
- ↑ 55.0 55.1 Merton 1973, p. 337.
- ↑ Partha & David 1994.
- ↑ Suber 2012, p. 29
- ↑ Rentier 2019, p. 19.
- ↑ Vicente-Saez & Martinez-Fuentes 2018.
- ↑ Vicente-Saez & Martinez-Fuentes 2018, p. 2.
- ↑ Vicente-Saez & Martinez-Fuentes 2018, p. 7.
- ↑ Vicente-Saez, Gustafsson & Van den Brande 2020, p. 1.
- ↑ 63.0 63.1 Nosek et al. 2015, p. 1423.
- ↑ 64.0 64.1 64.2 64.3 64.4 Nosek et al. 2015, p. 1424.
- ↑ Laine 2018, p. 58
- ↑ Laine 2018, p. 56
- ↑ Laine 2018, p. 65
- ↑ Pauly 2021, p. 5
- ↑ Moher et al. 2020, p. 6
- ↑ Laine 2018, p. 54
- ↑ 71.0 71.1 Schuurbiers et al. 2009, p. 229.
- ↑ Laine 2018, p. 68
- ↑ Laine 2018, p. 69
- ↑ Décret n° 2021-1572 du 3 décembre 2021 relatif au respect des exigences de l'intégrité scientifique
- ↑ "Singapore Statement on Research Integrity". 2010. https://www.jsps.go.jp/english/e-kousei/data/singapore_statement_EN.pdf.
- ↑ "ALLEA publishes revised edition of The European Code of Conduct for Research Integrity". All European Academies (ALLEA). 2017. https://allea.org/allea-publishes-revised-edition-european-code-conduct-research-integrity/.
- ↑ 77.0 77.1 https://www.hhs.gov/sites/default/files/draft-hhs-scientific-integrity-policy.pdf
- ↑ Evaluation (ASPE), Assistant Secretary for Planning and (2023-06-27). "HHS Scientific Integrity" (in en). https://www.hhs.gov/programs/research/scientificintegrity/index.html.
- ↑ "Scientific Integrity and Research Misconduct" (in en). https://www.usda.gov/our-agency/staff-offices/office-chief-scientist-ocs/scientific-integrity-and-research-misconduct.
- ↑ "Who We Are" (in EN). https://www.nih.gov/about-nih/who-we-are.
- ↑ https://osp.od.nih.gov/wp-content/uploads/2023/09/SI_Compendium-2022Update.pdf
Bibliography
Books & Theses
- Babbage, Charles (1830). Reflections on the Decline of Science in England: And on Some of Its Causes, by Charles Babbage (1830). To which is Added On the Alleged Decline of Science in England, by a Foreigner (Gerard Moll) with a Foreword by Michael Faraday (1831).. B. Fellowes.
- Merton, Robert K. (1973). The Sociology of Science: Theoretical and Empirical Investigations. University of Chicago Press. ISBN 978-0-226-52092-6.
- Broad, William J.; Wade, Nicholas (1983). Betrayers of the Truth. Simon and Schuster. ISBN 978-0-671-44769-4.
- Suber, Peter (2012-07-20). Open Access. Cambridge, Mass: The MIT Press. ISBN 978-0-262-51763-8.
- Rentier, Bernard (2019-03-08). Open Science, the challenge of transparency (1st ed.). Académie royale de Belgique.
- Pimple, Kenneth D., ed (2017-05-15). Research Ethics. Routledge. ISBN 978-1-351-90400-1.
Reports
- Pauly, Gerhard (2021). OSCAR open science code of conduct (Report). Fraunhofer-Gesellschaft zur Förderung der Angewandten Forschung e.V..
- Henriet, Pierre; Ouzoulias, Pierre (2021). Promouvoir et protéger une culture partagée de l'intégrité scientifique (Report). Assemblée nationale.
Journal articles
- Woolf, Patricia K. (1988). "Deception in Scientific Research". Jurimetrics 29 (1): 67–95. ISSN 0897-1277. PMID 11654908. http://www.jstor.org/stable/29762108. Retrieved 2022-02-12.
- Partha, Dasgupta; David, Paul A. (1994-09-01). "Toward a new economics of science". Research Policy. Special Issue in Honor of Nathan Rosenberg 23 (5): 487–521. doi:10.1016/0048-7333(94)01002-1. ISSN 0048-7333. https://dx.doi.org/10.1016/0048-7333%2894%2901002-1. Retrieved 2022-04-18.
- Pimple, Kenneth D. (2002-06-01). "Six domains of research ethics". Science and Engineering Ethics 8 (2): 191–205. doi:10.1007/s11948-002-0018-1. ISSN 1471-5546. PMID 12092490. https://doi.org/10.1007/s11948-002-0018-1. Retrieved 2022-02-19.
- Whitbeck, Caroline (2004). "Trust and the Future of Research". Physics Today 57 (11): 48–53. doi:10.1063/1.1839377. ISSN 0031-9228. Bibcode: 2004PhT....57k..48W. https://physicstoday.scitation.org/doi/10.1063/1.1839377. Retrieved 2022-02-12.
- Ioannidis, John P. A. (2005). "Why Most Published Research Findings Are False". PLOS Medicine 2 (8): e124. doi:10.1371/journal.pmed.0020124. ISSN 1549-1676. PMID 16060722.
- Rappert, Brian (2007). "Codes of conduct and biological weapons: an in-process assessment". Biosecurity and Bioterrorism: Biodefense Strategy, Practice, and Science 5 (2): 145–154. doi:10.1089/bsp.2007.0003. ISSN 1538-7135. PMID 17608600.
- David, Paul A. (2008-10-24). "The Historical Origins of 'Open Science': An Essay on Patronage, Reputation and Common Agency Contracting in the Scientific Revolution". Capitalism and Society 3 (2). doi:10.2202/1932-0213.1040. ISSN 1932-0213. https://www.degruyter.com/document/doi/10.2202/1932-0213.1040/html. Retrieved 2021-11-11.
- Schuurbiers, Daan; Osseweijer, Patricia; Kinderlerer, Julian (2009). "Implementing the Netherlands code of conduct for scientific practice-a case study". Science and Engineering Ethics 15 (2): 213–231. doi:10.1007/s11948-009-9114-9. ISSN 1353-3452. PMID 19156537.
- Fanelli, Daniele (2009). "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data". PLOS ONE 4 (5): e5738. doi:10.1371/journal.pone.0005738. ISSN 1932-6203. PMID 19478950. Bibcode: 2009PLoSO...4.5738F.
- John, Leslie K.; Loewenstein, George; Prelec, Drazen (2012-05-01). "Measuring the Prevalence of Questionable Research Practices With Incentives for Truth Telling". Psychological Science 23 (5): 524–532. doi:10.1177/0956797611430953. ISSN 0956-7976. PMID 22508865. https://doi.org/10.1177/0956797611430953. Retrieved 2022-09-13.
- Löppönen, Paavo; Vuorio, Eero (2013-02-21). "Tutkimusetiikka Suomessa 1980-luvulta tähän päivään". Tieteessä tapahtuu 31 (1). ISSN 1239-6540. https://journal.fi/tt/article/view/7704. Retrieved 2022-02-12.
- Resnik, David B.; Rasmussen, Lisa M.; Kissling, Grace E. (2015-09-03). "An International Study of Research Misconduct Policies". Accountability in Research 22 (5): 249–266. doi:10.1080/08989621.2014.958218. ISSN 0898-9621. PMID 25928177. PMC 4449617. https://doi.org/10.1080/08989621.2014.958218. Retrieved 2021-11-11.
- Giorgini, Vincent; Mecca, Jensen T.; Gibson, Carter; Medeiros, Kelsey; Mumford, Michael D.; Connelly, Shane; Devenport, Lynn D. (2015). "Researcher Perceptions of Ethical Guidelines and Codes of Conduct". Accountability in Research 22 (3): 123–138. doi:10.1080/08989621.2014.955607. ISSN 0898-9621. PMID 25635845. PMC 4313573. https://doi.org/10.1080/08989621.2014.955607. Retrieved 2022-02-13.
- Nosek, B. A.; Alter, G.; Banks, G. C.; Borsboom, D.; Bowman, S. D.; Breckler, S. J.; Buck, S.; Chambers, C. D. et al. (2015-06-26). "Promoting an open research culture". Science 348 (6242): 1422–1425. doi:10.1126/science.aab2374. ISSN 0036-8075. PMID 26113702. Bibcode: 2015Sci...348.1422N.
- Wicherts, Jelte M.; Veldkamp, Coosje L. S.; Augusteijn, Hilde E. M.; Bakker, Marjan; van Aert, Robbie C. M.; van Assen, Marcel A. L. M. (2016). "Degrees of Freedom in Planning, Running, Analyzing, and Reporting Psychological Studies: A Checklist to Avoid p-Hacking". Frontiers in Psychology 7: 1832. doi:10.3389/fpsyg.2016.01832. ISSN 1664-1078. PMID 27933012.
- Baker, Monya (2016-05-26). "1,500 scientists lift the lid on reproducibility". Nature News 533 (7604): 452–454. doi:10.1038/533452a. PMID 27225100. Bibcode: 2016Natur.533..452B. http://www.nature.com/news/1-500-scientists-lift-the-lid-on-reproducibility-1.19970. Retrieved 2020-02-08.
- Resnik, David B.; Shamoo, Adil E. (2017). "Reproducibility and Research Integrity". Accountability in Research 24 (2): 116–123. doi:10.1080/08989621.2016.1257387. ISSN 0898-9621. PMID 27820655.
- Fraser, Hannah; Parker, Tim; Nakagawa, Shinichi; Barnett, Ashley; Fidler, Fiona (2018). "Questionable research practices in ecology and evolution". PLOS ONE 13 (7): e0200303. doi:10.1371/journal.pone.0200303. ISSN 1932-6203. PMID 30011289. Bibcode: 2018PLoSO..1300303F.
- Vicente-Saez, Ruben; Martinez-Fuentes, Clara (2018-07-01). "Open Science now: A systematic literature review for an integrated definition". Journal of Business Research 88: 428–436. doi:10.1016/j.jbusres.2017.12.043. ISSN 0148-2963. https://www.sciencedirect.com/science/article/pii/S0148296317305441. Retrieved 2022-04-18.
- Laine, Heidi (2018-12-31). "Open science and codes of conduct on research integrity". Informaatiotutkimus 37 (4). doi:10.23978/inf.77414. ISSN 1797-9129. https://journal.fi/inf/article/view/77414. Retrieved 2021-11-11.
- Fanelli, Daniele (2018-03-13). "Opinion: Is science really facing a reproducibility crisis, and do we need it to?". Proceedings of the National Academy of Sciences 115 (11): 2628–2631. doi:10.1073/pnas.1708272114. ISSN 0027-8424. PMID 29531051.
- Mion, Giorgio; Broglia, Angela; Bonfanti, Angelo (2019). "Do Codes of Ethics Reveal a University's Commitment to Sustainable Development? Evidence from Italy". Sustainability 11 (4): 1134. doi:10.3390/su11041134. ISSN 2071-1050.
- Moher, David; Bouter, Lex; Kleinert, Sabine; Glasziou, Paul; Sham, Mai Har; Barbour, Virginia; Coriat, Anne-Marie; Foeger, Nicole et al. (2020). "The Hong Kong Principles for assessing researchers: Fostering research integrity". PLOS Biology 18 (7): e3000737. doi:10.1371/journal.pbio.3000737. ISSN 1545-7885. PMID 32673304.
- Vicente-Saez, Ruben; Gustafsson, Robin; Van den Brande, Lieve (2020-07-01). "The dawn of an open exploration era: Emergent principles and practices of open science and innovation of university research teams in a digital world". Technological Forecasting and Social Change 156: 120037. doi:10.1016/j.techfore.2020.120037. ISSN 0040-1625. https://www.sciencedirect.com/science/article/pii/S0040162518316378. Retrieved 2022-04-18.
- Bouter, Lex (2020-08-01). "What Research Institutions Can Do to Foster Research Integrity". Science and Engineering Ethics 26 (4): 2363–2369. doi:10.1007/s11948-020-00178-5. ISSN 1471-5546. PMID 31965429. PMC 7417389. https://doi.org/10.1007/s11948-020-00178-5. Retrieved 2022-02-14.
- Gopalakrishna, Gowri; Riet, Gerben ter; Vink, Gerko; Stoop, Ineke; Wicherts, Jelte; Bouter, Lex (2021-07-06). "Prevalence of questionable research practices, research misconduct and their potential explanatory factors: a survey among academic researchers in The Netherlands". https://osf.io/preprints/metaarxiv/vk9yt/. Retrieved 2022-02-18.
Other sources
- Pew Research Center (2015-01-29). "Public and Scientists' Views on Science and Society". Pew Research Center Science & Society. https://www.pewresearch.org/science/2015/01/29/public-and-scientists-views-on-science-and-society/. Retrieved 2021-11-11.
Original source: https://en.wikipedia.org/wiki/Scientific integrity.