Do German university medical centres promote robust and transparent research? A cross-sectional study of institutional policies

Abstract

Background: In light of replication and translational failures, biomedical research practices have recently come under scrutiny. Experts have pointed out that the current incentive structures at research institutions do not sufficiently incentivise researchers to invest in robustness and transparency and instead incentivise them to optimise their fitness in the struggle for publications and grants. This cross-sectional study aimed to describe whether and how relevant policies of university medical centres in Germany support the robust and transparent conduct of research and how prevalent traditional metrics are.

Methods: For 38 German university medical centres, we searched for institutional policies for academic degrees and academic appointments as well as websites for their core facilities and research in general between December 2020 and February 2021. We screened the documents for mentions of indicators of robust and transparent research (study registration; reporting of results; sharing of research data, code and protocols; open access; and measures to increase robustness) and for mentions of more traditional metrics of career progression (number of publications; number and value of awarded grants; impact factors; and authorship order).

Results: While open access was mentioned in 16% of PhD regulations, other indicators of robust and transparent research were mentioned in less than 10% of institutional policies for academic degrees and academic appointments. These indicators were more frequently mentioned on the core facility and general research websites. Institutional policies for academic degrees and academic appointments had frequent mentions of traditional metrics.

Conclusions: References to robust and transparent research practices are, with a few exceptions, generally uncommon in institutional policies at German university medical centres, while traditional criteria for academic promotion and tenure still prevail.

Keywords: Science policy, Incentives, Robustness, Transparency, Open science, University medical centre

*Correspondence: martin.holst@bih-charite.de. Berlin Institute of Health at Charité – Universitätsmedizin Berlin, QUEST Center for Responsible Research, Charitéplatz 1, 10117 Berlin, Germany. Full list of author information is available at the end of the article.

Background

In recent years, the field of biomedicine has seen broad and increasing reflection on its research practices. Various authors have pointed out that flaws in the choice of research questions and in the conduct of biomedical research lead to research waste [1]. These statements have been accompanied by findings that biomedical research often fails to reproduce [2–6], which ultimately hampers the goal of biomedical research: translating findings into medical practice and thereby improving healthcare [1].

Concretely, while authors have discussed a possible low base rate of true hypotheses [7], and others have pointed to necessary changes in how research is funded [8] and regulated [9], much of the discussion has focused on the design, conduct, dissemination and reporting of biomedical research.
It has been argued that the field fundamentally lacks transparency, with study reports, research protocols or participant data often not publicly accessible, and many research findings not being published at all [10]. If findings are published, they often lack sufficient detail and suffer from selective reporting of outcomes or limitations [12]. In addition, authors have pointed to flaws in biomedical study design and statistical analyses [7, 13, 14]. A recent survey from the Netherlands found that some of these so-called questionable research practices are more prevalent in the biomedical field than in other fields [11].

Several solutions have been proposed to address the flaws in the design, conduct, dissemination and reporting of biomedical research. One of the most widely discussed proposals is the call for more transparent, or "open", science along all steps of biomedical research. One of these steps is study registration, that is, registering study protocols before data collection, which is supposed to disclose flexibility in data analysis that might lead to false-positive results [15–17].
There have been calls to increase the robustness of science, for example, by asking and supporting researchers in choosing adequately large samples, appropriately randomising participants and performing blinding of subjects, experimenters and outcome assessors [3, 4, 18, 19]. Researchers have been urged to share their data, code and protocols to increase transparency and reproducibility of biomedical research [20], and to report all research results in a timely manner, in line with established reporting guidelines, and ideally without paywalls (open access). This is supposed to tackle the prevalent publication bias in which only positive results are reported in journals [21], which distorts the evidence base and thus leads to research waste, for example, by encouraging follow-up studies that would have been considered futile if all research had been reported. To aid in this, new publication formats, namely, preprints and registered reports [22], have been established. All of these procedures are, in the long run, supposed to increase trust in science and lead to more reproducible research [23]. Additionally, more emphasis has been put on actual replication of studies [24], and there have also been calls to abandon [25], redefine [26] or better justify [27] statistical significance thresholds; however, these suggestions have been subject to debate.

To date, the uptake of the aforementioned robust and transparent practices has been slow [28–33]. Many have pointed out that the current incentive structures for researchers do not sufficiently incentivise them to invest in robustness and transparency and instead incentivise them to optimise their fitness in the struggle for publications and grants [34–37]. To receive promotion and ultimately tenure, researchers are evaluated based primarily on how many journal articles (with high impact factors) they publish and how much grant money they secure [35]. The negative influence of the so-called publication pressure on research quality has been shown by mathematical simulations [35, 36] as well as empirical surveys indicating that it is both positively associated with questionable research practices and negatively associated with responsible research practices [11, 38]. It has been said that all stakeholder groups, including funders and journals, must contribute [9, 12] to an incentive system that actually does reward robust and transparent research practices; in the case of funders, for example, by awarding grants based not only on publication numbers, but on the adoption of open practices, and, in the case of publishers, by providing peer review that embraces open practices (allowing peer reviewers to better serve as quality control instances and detect questionable research practices [11]) and not publishing only positive findings, but instead basing editorial decisions just on the soundness of the research. This is, as some studies show, currently not always the case [39, 40].

The role and influence of the research institutions has thus far been less prominently discussed [3]. Since research institutions define the requirements for academic degrees, academic appointments and available intramural funding, their policies and regulations could, and do [11, 38], have a strong impact on researchers' capability, opportunity and motivation to apply robust and transparent research practices in their work. With regard to university policies, some changes have already been proposed. One of these changes is abandoning the current dysfunctional incentive systems of promotion [35, 36]. Another is an increased focus on transparent practices: the signers of the San Francisco Declaration on Research Assessment (DORA) call for institutions to clearly highlight "that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published" [41]. More specifically, Moher et al. [42] suggest that rewards, incentives and performance metrics at institutions should align with the full dissemination of research, reuse of original datasets and more complete reporting, namely, the sharing of protocols, code and data, as well as preregistration of research (see also the publications by the League of European Research Universities [43] and others [12, 44–47]). Mejlgaard et al. [48] propose that institutions should incentivise making data findable, accessible, interoperable and reusable (FAIR) [49]. Begley et al. [3] suggest similar rules for academic degrees and academic appointments but with regard to the robustness of the research. These authors also demand that the use of reporting guidelines, such as the ARRIVE (Animal Research: Reporting of In Vivo Experiments) guidelines [50] or the CONSORT (Consolidated Standards of Reporting Trials) guidelines [51], be mandated by institutions.
Additionally, core facilities such as clinical research units and animal research facilities provide centralised services for the conduct of clinical or animal studies (this includes animal protection and research according to the so-called 3R principles: replace, reduce, refine [52]). These core facilities could have additional influence [53], for example, by recommending that researchers report their results in a timely and nonselective way or by requiring researchers to adhere to established reporting guidelines.

Studying the uptake of the aforementioned recommendations in institutional policies could inform areas for improvement in policy-making at universities. To our knowledge, however, only one study [54] has dealt with this issue, sampling biomedical faculties of 170 universities worldwide and searching criteria for promotion and tenure. The authors report that mentions of traditional criteria of research evaluation were very frequent, while mentions of robust and transparent research practices were rare.

In this cross-sectional study, we aim to describe whether and how relevant policies of university medical centres (UMCs) in Germany support the robust and transparent conduct of research and how prevalent traditional metrics of career progression are. We choose to investigate only German UMCs, as this ensures better comparability of the institutions, since different countries have different regulatory environments (for example, German UMCs are currently in the process of implementing new good scientific practice regulations, mandated by the German Research Foundation [Deutsche Forschungsgemeinschaft, DFG]), different curricula for medical studies and different frameworks for postgraduate degrees. The focus on Germany also allows us to perform in-depth data collection of German-language documents.

Methods

A detailed methodology is described in our preregistered study protocol, which is available here: https://osf.io/wu69s/ (including a list of protocol amendments and deviations). The following section provides a summary of the methods, which are reported in accordance with the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) [55] guidelines.

Sampling and search strategy

We obtained a list of all German medical faculties from the website of the German medical faculty council (Medizinischer Fakultätentag). For each of the 38 faculties (as of December 2020), we performed a manual search of their websites between 14 December 2020 and 12 February 2021. The search terms and strategy were based on discussions in our full research team after piloting; they have been presented in detail in our protocol. The search was done by the first author (MH), who searched the websites of both the medical faculties and the adjacent university hospitals, looking for the sources presented in Table 1.

Table 1 Data sources that were screened in this study
(1) PhD regulations (for every different type of PhD awarded by the medical faculty)
(2) Habilitation regulations (habilitation is an academic degree which is awarded after the PhD, and which involves a second, larger research thesis and additional teaching; it historically was and often still is considered a prerequisite for obtaining tenure or securing third-party funding in Germany [74, 75])
(3) Application forms for tenured professorships
(4) Procedural guidelines for the tenure process (Berufungsordnungen)
(5) Websites of clinical research units
(6) Animal research websites, including for animal facilities, 3R centres and animal protection offices
(7) General research websites of the medical faculties or university hospitals

Regarding the PhD and habilitation regulations and the application forms and procedural guidelines for tenure, we saved all related policy documents. Regarding the websites of clinical research units, websites of animal research facilities, 3R centres and animal protection offices, and the general research websites, we first went through each website in detail (including all subpages), saving only those websites and documents that contained any mention of one of the indicators summarised in Table 2 (see Additional file 1: Table S1 for a more fine-grained terminology with subcategories).
tions, mandated by the German Research Foundation We chose both the indicators of robust and transparent [Deutsche Forschungsgemeinschaft, [DFG]), different research and the traditional metrics of career progres- curricula for medical studies and different frameworks sion based on their frequent discussion in the literature for postgraduate degrees. The focus on Germany also as either cornerstones of more robust and transpar- allows us to perform in-depth data collection of Ger- ent biomedical research or as incentives leading to the man-language documents. opposite [3, 39, 41, 45, 48]. We also chose them for their Table 1 Data sources that were screened in this study Sources (1) PhD regulations (for every different type of PhD awarded by the medical faculty) (2) Habilitation regulations (habilitation is an academic degree which is awarded after the PhD, and which involves a second, larger research thesis and additional teaching; it historically was and often still is considered a prerequisite for obtaining tenure or securing third-party funding in Germany [74, 75]) (3) Application forms for tenured professorships (4) Procedural guidelines for the tenure process (Berufungsordnungen) (5) Websites of clinical research units (6) Animal research websites, including for animal facilities, 3R centres and animal protection offices (7) General research websites of the medical faculties or university hospitals Holst et al. Health Research Policy and Systems (2022) 20:39 Page 4 of 14 Table 2 Indicators that were chosen for inclusion in this study Indicators of robustness and transparency Traditional metrics (1) Study registration in publicly accessible registries (e.g. ClinicalTrials.gov, DRKS [German Clinical (1) Number of publications Trials Register], Open Science Framework, German Animal Study Registry) (2) Reporting of results (2) Number and monetary value of awarded grants (3) Sharing of research data, code or protocol (3) Impact factor of journals in which research has been published (4) Open access (4) Authorship order (5) Measures to improve robustness – consistency with previous research works [54] and publi- which distinguishes between capability, opportunity and cations from our institute [32, 37]. motivation to change behaviour, and lists education, per- suasion, incentivisation, coercion, training, restriction, Data extraction environmental restructuring, modelling and enablement All documents were imported into qualitative research as potential interventions. We defined anything that software (MAXQDA 2020, Release 20.3.0, VERBI GmbH, could increase capability, opportunity or motivation to Germany). We applied deductive content analysis [56]. engage in that behaviour as “incentivised” or “required”. One rater (MRH) went through all of the documents A second, independent rater (AF) went through the and coded whether there was any mention of the pre- documents of 10 of the 38 UMCs. specified indicators of robust and transparent research, as well as the traditional indicators of metrics for career Results progression. While we searched all documents for the The datasets generated and analysed during the current indicators of robust and transparent research, we only study are available in a repository on the Open Science searched the PhD and habilitation regulations and appli- Framework (https:// osf. io/ 4pzjg/). 
Results

The datasets generated and analysed during the current study are available in a repository on the Open Science Framework (https://osf.io/4pzjg/). The code for calculations of inter-rater reliability, which also includes robustness checks, is available on GitHub (https://github.com/Martin-R-H/umc-policy-review). The inter-rater reliability in our sample of 10 UMCs, measured by Cohen's kappa, was κ = 0.806. Thus, we deemed further double-coding unnecessary.
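As an illustration of the reliability measure reported above, a minimal example of computing Cohen's kappa from two raters' binary codings could look as follows. The authors' actual computation, including robustness checks, is in the GitHub repository linked above; the codings shown here are invented for demonstration.

```python
# Minimal illustration only: the study's own inter-rater reliability code (with
# robustness checks) is at https://github.com/Martin-R-H/umc-policy-review.
# The codings below are invented example data, not the study's data.
from sklearn.metrics import cohen_kappa_score

# Binary decisions (1 = indicator mentioned, 0 = not mentioned) assigned by the
# two raters to the same document/indicator pairs from the 10 double-coded UMCs.
rater_1 = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1]
rater_2 = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1]

kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa: {kappa:.3f}")  # agreement beyond chance; ~0.8 is usually read as strong
```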
Overall, the web searches of the 38 German UMCs yielded 339 documents. We found PhD regulations for 37 UMCs (97%), habilitation regulations for 35 UMCs (92%), tenure application forms for 25 UMCs (66%) and procedural guidelines for tenure for 11 UMCs (29%). We found 38 general research websites (100%), 32 websites of clinical research units (84%) and 23 animal research websites (61%; see Table 3). Additional file 1: Table S2 shows numbers for each UMC.

Table 3 Number of documents we included for each university and document type (number of UMCs with at least one document or website found)
PhD regulation: 37 (97%)
Habilitation regulation: 35 (92%)
Tenure (application form): 25 (66%)
Tenure (procedural guidelines): 11 (29%)
Website of clinical research unit: 32 (84%)
Animal research website: 23 (61%)
General research website: 38 (100%)
For the criteria for promotion and tenure, we counted every UMC with at least one policy found. For the core facility websites, we counted every UMC at which we at least found a website—even if we did not save any documents due to lack of mentions of any of the relevant procedures.

The results are presented in detail in Tables 4 and 5, divided by each procedure and each type of document or website. Additional file 1: Tables S3 and S4 provide more detailed data on the subcategories of the indicators of robust and transparent science. Tables 6 and 7 provide example quotes.

Table 4 Number of university medical centres that mention indicators of robust and transparent science and traditional indicators of career progression in each of the included sources. For each source, the first value is any mention of the indicator and the second value is mentions that are incentivised or required.
Indicators of robust/transparent science:
Study registration: PhD regulation (n = 37) 0% (0) / 0% (0); habilitation regulation (n = 35) 0% (0) / 0% (0); tenure application form (n = 25) 0% (0) / 0% (0); tenure procedural guideline (n = 11) 0% (0) / 0% (0)
Reporting of results: PhD regulation 8% (3) / 3% (1); habilitation regulation 3% (1) / 3% (1); tenure application form 0% (0) / 0% (0); tenure procedural guideline 0% (0) / 0% (0)
Sharing of data/code/protocol: PhD regulation 3% (1) / 3% (1); habilitation regulation 0% (0) / 0% (0); tenure application form 0% (0) / 0% (0); tenure procedural guideline 0% (0) / 0% (0)
Open access: PhD regulation 16% (6) / 14% (5); habilitation regulation 3% (1) / 0% (0); tenure application form 0% (0) / 0% (0); tenure procedural guideline 0% (0) / 0% (0)
Robustness: PhD regulation 3% (1) / 3% (1); habilitation regulation 0% (0) / 0% (0); tenure application form 0% (0) / 0% (0); tenure procedural guideline 0% (0) / 0% (0)
Traditional indicators of career progression:
Number of publications: PhD regulation 100% (37) / 100% (37); habilitation regulation 91% (32) / 80% (28); tenure application form 0% (0) / 0% (0); tenure procedural guideline 27% (3) / 9% (1)
Grant money: PhD regulation 0% (0) / 0% (0); habilitation regulation 11% (4) / 3% (1); tenure application form 84% (21) / 0% (0); tenure procedural guideline 27% (3) / 9% (1)
Impact factor: PhD regulation 16% (6) / 14% (5); habilitation regulation 63% (24) / 54% (19); tenure application form 72% (18) / 0% (0); tenure procedural guideline 0% (0) / 0% (0)
Authorship order: PhD regulation 97% (36) / 97% (36); habilitation regulation 80% (28) / 80% (28); tenure application form 68% (17) / 0% (0); tenure procedural guideline 0% (0) / 0% (0)

Table 5 Number of university medical centres that mention indicators of robust and transparent science in each of the included sources. For each source, the first value is any mention of the indicator and the second value is mentions that are incentivised or required.
Study registration: clinical research units (n = 32) 34% (11) / 31% (10); animal research websites (n = 23) 4% (1) / 4% (1); general research websites (n = 38) 5% (2) / 3% (1)
Reporting of results: clinical research units 9% (3) / 3% (1); animal research websites 4% (1) / 0% (0); general research websites 21% (8) / 11% (4)
Sharing of data/code/protocol: clinical research units 0% (0) / 0% (0); animal research websites 4% (1) / 0% (0); general research websites 21% (8) / 11% (4)
Open access: clinical research units 0% (0) / 0% (0); animal research websites 4% (1) / 0% (0); general research websites 34% (13) / 24% (9)
Robustness: clinical research units 81% (26) / 75% (24); animal research websites 26% (6) / 17% (4); general research websites 0% (0) / 0% (0)

Indicators of robust and transparent science

Study registration
The issue or relevance of registering studies was not mentioned in any (0%) of the documents regarding academic promotion and tenure. Thirty-four percent of websites of clinical research units mentioned registration, with 31% also incentivising or requiring the practice. This appeared mostly in the form of clinical research units offering support with registering clinical studies. Only 4% of animal research websites and 5% of general research websites mentioned registration. The animal facility provided a link to an animal study register, while the two research webpages generally endorsed the practice.

Reporting of results
Eight percent of the PhD regulations and 3% of habilitation regulations mentioned the issue of results reporting; these mentions included general requirements that the respective thesis be published. The habilitation regulation also referred to timely publication, asking individuals to publish their thesis no later than 2 years after receiving the degree. Results reporting was also mentioned by 9% of clinical research units, 4% of animal research websites and 21% of general research websites. All mentions expressed general endorsements or highlighted education regarding the publication of all results. One of the clinical research units further offered help with the publication process. The animal research facility that mentioned results reporting provided a tool to identify publication formats that fit the characteristics of the respective datasets. When the general research websites mentioned reporting results, they usually referred to statements in the university's or the DFG's good scientific practice guidelines for publishing research.
Table 6 Examples of mentions of each practice in the policies for academic degrees and appointments (a dash indicates that no mention was found; no examples were found in either type of tenure-related document, so only the PhD regulations and habilitation requirements are shown)

Study registration
PhD regulation: –
Habilitation requirement: –

Reporting of results
PhD regulation: "Doctoral theses must meet high quality standards; after peer review, the written doctoral work should be made accessible to the public and the scientific community as openly as possible through publication." / "Accordingly, the following requirements arise: […] 3. they should lead to a publication in a professional journal or to another kind of publication with a high scientific standard" / "According to the recommendations of the DFG (German Research Foundation) […] the following general principles apply to good scientific practice: […] Publication of results"
Habilitation requirement: "The habilitation thesis, or at least its essential parts, are to be published by the habilitated person. The publication should take place within 2 years after awarding of the teaching qualification."

Data/code sharing
PhD regulation: "If possible, supervisors should address the following points: […] publication of the full original dataset (e.g., via Figshare, Dryad) of all figures (graphs, tables, in-text data, etc.) in the article."
Habilitation requirement: –

Open access
PhD regulation: "If possible, supervisors should address the following points: […] Open Access" / "The work can be published: […] 2. As an electronic Open Access publication in the university repository operated by the university library." / "The University Library shall be granted the right to make and distribute further copies of the dissertation as well as to make the dissertation publicly accessible in data networks within the scope of the legal duties of the University Library"
Habilitation requirement: "In the event of publication in accordance with sentence 3 no. 4, the university library shall be granted the right to produce and distribute further copies of the habilitation thesis within the scope of the university library's statutory duties, and to make the habilitation thesis publicly accessible in data networks."

Robustness
PhD regulation: "If possible, supervisors should address the following points: […] Reduction of bias by appropriate measures (blinding, randomisation, a priori definition of inclusion and exclusion criteria, etc.), a priori power calculations."
Habilitation requirement: –

Data/code/protocol sharing
Data, code, or protocol sharing was only mentioned in one PhD regulation (3%). In this mention, supervisors were asked to consider data sharing in the evaluation of the thesis. No habilitation regulations, tenure application forms or procedural guidelines for tenure mentioned this indicator (0%). Likewise, no clinical research unit website mentioned sharing of data/protocols (0%). Four percent of animal research websites and 21% of research websites mentioned data, code or protocol sharing. In the case of the animal facility, the mention was a general introduction to the FAIR principles [49] of data sharing. The general research websites included endorsements of data and code sharing, mostly within the university's good scientific practice guidelines.
Open access
Sixteen percent of PhD regulations and 3% of habilitation requirements mentioned open access. In one PhD regulation, PhD supervisors were asked to also keep in mind whether the work was published with open access. In the other cases, the PhD regulation mentioned that the university library had the right to publish the submitted thesis in a repository (green open access). No clinical research unit (0%) and 4% of animal research websites mentioned open access. In the case of the animal facility, it was a link to an interview in which an "open access culture" was announced. Thirty-four percent of general research websites mentioned open access; these websites either generally recommended open access or referred to the university's open access publishing funds.

Measures to improve robustness
Robustness was mentioned in 3% of PhD regulations but in none (0%) of the habilitation regulations, tenure application forms or procedural guidelines for tenure. Robustness was mentioned by 81% of websites of clinical research units and 26% of the animal research websites. The clinical research units usually offered services to help with power calculations and randomisation (and, in a few cases, blinding). In the case of animal research websites, the mentions pointed to documents recommending power calculation as part of an effort to protect animals, courses on robust animal research and general informational material on these issues. None (0%) of the general research webpages mentioned the issue of robustness.

Table 7 Examples of mentions of each practice on the websites of clinical research units, animal research facilities and general research websites (a dash indicates that no mention was found)

Study registration
Clinical research units: "We offer you the entry of your clinical study both prospectively and retrospectively, i.e., after the study has already started, or support with the entry by yourself." / "The Center for Clinical Trials provides free access to clinicaltrials.gov for scientists of the <institution>. Via this access, investigator-initiated trials from the local area of responsibility can be entered in the registry." / "Since the beginning of this year, all new clinical studies conducted at the University Medical Center have been centrally recorded in the Research Registry University Medicine <institution>" / "The analyses will be carried out according to statistical analysis plans that were already designed at the time of study planning"
Animal research facilities: "The Animal Study Registry provides a platform for pre-registration of an accurate study plan for animal experimental studies prior to the start of experiments, to avoid selective reporting. […] <Institution> 3R sponsors first entries with 500 euros each for research" / "The <institution's> 3Rs toolbox is structured along the lines of the <other institution's> 6Rs model […], which adds robustness, registration, and reporting to replacement, reduction, and refinement."
General research websites: "When should a study project be listed in a public study registry? The International Committee of Medical Journal Editors (ICMJE) requires prospective registration in an ICMJE-recognised registry for all clinical trials to be published by one of the participating journals. The new version of the Declaration of Helsinki also requires this: Article 19 Every clinical trial shall be registered in a publicly accessible database before recruitment of the first subject. The Ethics Committee of the <institution> supports this requirement." / "Recommendation for registration in the DRKS. The International Committee of Medical Journal Editors (ICMJE) requires prospective registration in an ICMJE-recognised registry for all clinical trials to be published by one of the participating journals. The current version of the Declaration of Helsinki also calls for this (Article 35: "Every research project involving human subjects shall be registered in a publicly accessible database before recruitment of the first subject."). The Ethics Committee of the <institution> supports this call."
The current version of the Declara- tion of Helsinki also calls for this (Article 35: “Every research project involving human subjects shall be registered in a publicly accessible database before recruitment of the first subject.”). The Ethics Committee of the <institution> sup- ports this call.” Reporting of results “ <Institution>: 92% of clinical studies published in EU “Fiddle is a tool developed by <institution> to combat “As a matter of principle, contributors to research projects [European Union] database” publication bias. This ‘match-making’ tool helps research- are required to actively seek, or at least not refuse, publica- “The CRU [clinical research unit] at <institution> offers ers to identify alternative ways of publishing information tion of the results.” support with the following tasks, among others: […] Sup- from well-designed experiments, which is often difficult “As a rule, scientists contribute all results to the scientific port in writing publications” to publish in traditional journals (i.e., null or neutral discourse. In individual cases, however, there may be rea- “In principle, a publication of the results should be aimed results, datasets, etc.).” sons not to make results publicly available (in the narrower at…” sense in the form of publications but also in the broader sense via other communication channels); this decision must not depend on third parties.” “Findings that do not support the authors’ hypothesis should also be reported.” “Scientific results are to be communicated to the scientific public in the form of publications; the scientific publica- tions are thus—like the scientific observation or the scientific experiment itself—the product of the work of scientists.” “Rules for the publication of results: publication in principle of results obtained with public funds (principle of publicity of basic research), publication also of falsified hypotheses in an appropriate manner and admission of errors (princi- ple of a scientific culture open to error).” Holst et al. Health Research Policy and Systems (2022) 20:39 Page 8 of 14 Table 7 (continued) Clinical research units Animal research facilities General research website Data/Code sharing – “The FAIR Guiding Principles for scientific data manage - “ The <city> Open Science programme of <city> has set ment and stewardship state that data must be Findable, itself the goal of making the research results and data of Accessible, Interoperable, and Reusable ( Wilkinson research institutions in <city>, which were created with et al. [49]). 
General research websites: "The <city> Open Science programme of <city> has set itself the goal of making the research results and data of research institutions in <city>, which were created with funds from government research funding, freely accessible and easy to find together with other information on science in <city>." / "Scientists at <university> are encouraged to publish and store raw research data that served as the basis for publications, together with the associated materials and information, in recognised open-access subject repositories in accordance with the FAIR principles (Findable, Accessible, Interoperable, Reusable), insofar as this is in conformity with the applicable legal provisions on data protection and copyright and with planned patent applications." / "Self-programmed software is made publicly available with indication of the source code."

Open access
Clinical research units: –
Animal research facilities: "<Institutions> will […] establish an "Open access" and "Open Data" culture […]."
General research websites: "To promote the OA [open access] publishing activities of scientists affiliated with the <university>, the Dean's Office of the Faculty of Medicine has been providing a publication fund from central funds since the beginning of 2020, from which the APCs [article processing charges] for OA publications (original work, review articles) in journals with an impact factor (IF) above 10 are financed." / "Publications in high-ranking open-access journals are supported by the faculty by covering 50% of the costs. Publications can be made free of charge with individual Open Access publishers." / "Therefore, the Presidential Board recommends […] to archive their scientific publications as a matter of principle as pre- or post-print on a subject repository or on the institutional repository of <university>, and/or to publish them in peer-reviewed open-access journals, and to reserve the right to publish or archive their research results electronically in publishing contracts, if possible. In doing so, the freedom of research and teaching is preserved. Discipline-specific practices and rights of use of publishers are to be taken into account in a differentiated manner." / "For a scientific qualification work, e.g., cumulative dissertations or post-doctoral theses, which have appeared in a journal as a published article, permission for second publication must be obtained in any case"

Robustness
Clinical research units: "Biometric study planning and analysis includes the following: […] power calculation […] implementation of randomisation" / "The biometric consultation hour is an internal service of the CRU for members of the hospital and the medical faculty of the <university>. The biometric consultation hour provides information on questions of study design, sample size planning and the choice of suitable evaluation methods." / "Important tasks of biometrics in clinical trials are power calculation, randomisation" / "Online randomisation service"
Animal research facilities: "The course is aimed at postdocs and scientists who write and/or plan their own animal research applications. In particular, the course will consider what requirements a biometric report must meet in order to satisfy the authorities' specifications and how this can be achieved. Special attention will be given to power analysis, study design, and sample size calculation." / "Experimental animal science links and information resources: […] G*Power (freeware for the analysis of test power)" / "Reduction: The number of test animals is reduced to a minimum. This is achieved by an optimised experimental design, in which it is clarified statistically and methodically in advance how many animals are necessary for an evaluable result."
General research websites: –
Traditional indicators

Number of publications
References to publication numbers were made by 100% of PhD regulations and 91% of habilitation regulations. No tenure application documents referred to the number of publications, aside from requirements to provide a complete list of publications. Procedural guidelines for tenure had references to the number of publications in 27% of cases. The PhD regulations and habilitation requirements listed a certain number of publications as a requirement to obtain a PhD or habilitation, respectively.

Number and value of grants
None (0%) of the PhD regulations mentioned grant money. Among the habilitation regulations, 11% mentioned grant money, while 84% of the tenure application forms mentioned grant money, in which case there were requirements to provide a complete list of grants awarded. Twenty-seven percent of the procedural guidelines for tenure also mentioned grants. These passages stated that experience with grants was expected or that people were required to provide a list of grants they received.

Impact factor
Sixteen percent of the PhD regulations and 63% of the habilitation requirements mentioned an impact factor, with most of them establishing concrete incentives or requirements. These two types of regulations contained passages that asked doctoral students or habilitation candidates to publish in high-impact journals to achieve the highest grade (summa cum laude) or regulations that allowed PhD students to publish only one paper instead of three if that paper was in a sufficiently "good" journal. Tenure application forms mentioned impact factors in 72% of cases, mostly requiring applicants to provide a list of impact factors of each journal they published in. None (0%) of the procedural guidelines for tenure mentioned impact factors.

Authorship order
Ninety-seven percent of the PhD regulations mentioned the authorship order, always as an incentive/requirement. The same applied to 80% of habilitation regulations, all of which incentivised or required it. These were regulations requiring PhD students and habilitation candidates to publish a portion of their articles as the first or last author (e.g. a very common regulation for German PhD students is to publish three papers, one of which with first/last authorship). Sixty-eight percent of tenure application forms also mentioned this requirement, noting that applicants should provide a list of publications divided by authorship. None (0%) of the procedural guidelines for tenure had a related section.
Discussion

In this study, we aimed to assess how and to what extent the 38 German UMCs promote robust and transparent research in their publicly available institutional policies for academic degrees, academic appointments, core facilities and research in general. We also investigated the presence of traditional metrics of researcher evaluation. Our results show that current UMC policies on academic degrees (e.g. PhD regulations) or appointments (e.g. tenure application forms) contain very few (less than 10%) references to our chosen indicators for robust and transparent research, such as study registration, reporting of results, data/code/protocol sharing or measures to improve robustness (e.g. sample size calculation, randomisation, blinding). An exception is open access, which was mentioned in 16% (6 out of 37) of PhD regulations, in most cases referring to a repository to which the thesis could be publicly uploaded. In contrast, the number of publications and the authorship order were frequently mentioned in UMC policies on academic degrees and appointments, particularly PhD and habilitation regulations (more than 80%). The majority of application forms for tenure further mentioned impact factors and secured grant money (more than 70%).

The UMCs' websites for clinical and animal research included more frequent mentions of robust and transparent research, but these differed based on the type of website. Clinical research unit websites frequently mentioned study registration and measures to improve robustness, while animal research websites only had frequent mentions of measures to improve robustness. These mentions were mostly related to sample size calculations and randomisation. The general research websites had the most frequent mentions of open access, reporting of results, and data, code or protocol sharing. In most of these cases, these indicators were mentioned in the good scientific practice guidelines. In the case of open access, some websites also featured references to a university-wide open access publishing fund.

Our findings are in line with a similar study that collected data from an international sample [54]. The authors found very frequent mentions of traditional criteria for research evaluation, while mentions of robust and transparent research practices were less frequent than in our study, with none of the documents mentioning publishing in open access mediums, registering research or adhering to reporting guidelines, and only one mentioning data sharing. The results are unsurprising, given recent findings that practices for robust and transparent research are only very slowly becoming more prevalent [30, 32]; however, they stand in stark contrast to the various experts and institutions that have called for institutions to align their promotion criteria with robust and transparent research [3, 41–43, 47, 48, 58, 59]. While we focused exclusively on a full sample of all German UMCs, our approach could also be applied to other countries.

It is important to keep in mind that policies and incentives are constantly changing. As mentioned in the introduction, a major German funder, the DFG, recently reworked their good scientific practice guidelines [60], expecting universities to ratify them in their own good scientific practice guidelines by July 2022. For the first time, these guidelines state that measures to avoid bias in research, such as blinding, should be used and that researchers should document all information and generally should publish all results, including those that do not support the hypothesis. They also recommend open sharing of data and materials in accordance with the FAIR principles and suggest that authors consider alternative publication platforms, such as academic repositories. Some German UMCs might have already changed their internal good scientific practice guidelines by the time the data collection of this study was conducted, which is the reason why we did not explicitly include these guidelines in our web search (we included them, however, if we found them on the general research websites).

One limitation of our study is that the raters were not blinded, which was not possible due to the ability to identify the policies from context. Another limitation is that we only searched for publicly available policies and did not survey relevant representatives of the 38 UMCs personally to identify further policies. For the two types of tenure-related policies in particular, we found relevant policies for only 66% (application forms) and 29% (procedural guidelines) of all UMCs. We refrained from this additional step, however, because the results across the available tenure policies showed a very homogeneous pattern of no mentions (0%) of measures for robust and transparent research, and we assumed that this pattern did not differ across policies that were not publicly available.

While our study focused on reviewing policies for academic degrees and academic appointments, as well as research and core facility websites, for references to robust and transparent research, there are other ways for institutions to promote these practices. An example is the performance-based allocation of intramural resources, the so-called Leistungsorientierte Mittelvergabe (LOM). The LOM might also have a strong influence on researcher behaviour, and it has been proposed that it should be based on transparency of research [61]. Another example would be education on robust and transparent research practices, which has already become a target of reform in Germany. These reforms aim explicitly at training for medical students, who normally do not receive any training in research methodology, to allow them to better understand the evidence base of biomedical research [62–64]. Education aimed at postgraduates might mostly be organised and announced via internal channels of a university and thus not be visible to our web search-based methodology. Third, robustness and transparency might be improved by better supervision or better actions against research misconduct, including better whistleblowing systems [48]. Nevertheless, we are convinced that our approach was able to find policies that cover many institutional incentives, especially policies for promotion and tenure, which have a strong influence on researcher behaviour.

Additionally, initiatives for transparent research exist at the federal and national levels (e.g. Projekt DEAL for open access). While universities remain obliged to include these national incentives and policies in their own regulations, future research might focus on these other incentives or policies in the biomedical field.

More generally, there is discussion about how academic institutions—or the academic system in general—need to change to facilitate better research. People have argued that new regulations for open and transparent research might not lead to genuine change for the better, but rather to box-ticking, for example, by arguing that reporting guidelines are not really of help [65] or by showing that study registrations sometimes lack specificity [66]. Additionally, questions have been raised whether assessing individual researchers is the right strategy after all [67]. Criticism has been directed at the general work structures in academia, with some arguing that short-term, non-permanent contracts [68] and a general overreliance on third-party funding [69, 70] lead to an unhealthy amount of competition and power imbalances in academia, which in turn facilitate the use of questionable research practices. Research institutions and academia at large are complex systems, with many layers of incentives, and it is yet unclear which measures will lead to a change for the better.

Thus, future research should also address the effects of policies and other institutional activities to increase robust and transparent research practices [71]. Thus far, only a few studies have addressed this. For example, Keyes et al. [72] evaluated the effect of a clinical trial registration and reporting programme, which turned out to be a success. More generally, there is a lack of research on interventions on organisational climate and culture in academia [73].
Conclusion

In summary, current UMC policies do not promote procedures for robust and transparent research, especially in terms of policies for academic degrees and academic appointments. In contrast, the number of publications and the authorship order play a dominant role in almost all UMC policies on academic degrees and appointments, and most of the tenure- and appointment-related policies further promote impact factors and grant money secured. This stands in stark contrast to the various experts and institutions that have called for institutions to align their promotion criteria with robust and transparent research.

Abbreviations
ARRIVE: Animal Research: Reporting of In Vivo Experiments; CONSORT: Consolidated Standards of Reporting Trials; DFG: Deutsche Forschungsgemeinschaft (German Research Foundation); DORA: San Francisco Declaration on Research Assessment; DRKS: Deutsches Register Klinischer Studien (German Clinical Trials Register); FAIR: Findable, accessible, interoperable and reusable; LOM: Leistungsorientierte Mittelvergabe (performance-based allocation of funding); PhD: Doctor of philosophy; STROBE: Strengthening the Reporting of Observational Studies in Epidemiology; UMC: University medical centre; 3R: Replace, reduce, refine.

Supplementary Information
The online version contains supplementary material available at https://doi.org/10.1186/s12961-022-00841-2.
Additional file 1: Table S1. Indicators that were chosen for inclusion in this study (detailed view). Table S2. Number of documents we included for each university and document type. Table S3. Number of university medical centres that mention indicators of robust and transparent science (fine-grained structure) for career progression in each of the included sources. Table S4. Number of university medical centres that mention indicators of robust and transparent science (fine-grained structure) for career progression in each of the included sources.

Acknowledgements
The authors would like to acknowledge Danielle Rice, Miriam Kip, Tim Neumann, Delwen Franzen and Tamarinde Haven for their helpful comments on the first drafts of the protocol. They would also like to thank the two reviewers and the editor for their feedback on the first submitted manuscript drafts.

Authors' contributions
MH: conceptualisation, data curation, formal analysis, investigation, methodology, writing—original draft. AF: formal analysis, investigation, writing—review & editing. DS: conceptualisation, funding acquisition, methodology, project administration, resources, supervision, writing—review & editing. All authors read and approved the final manuscript.

Funding
Open Access funding enabled and organised by Projekt DEAL. This work was funded by the German Federal Ministry of Education and Research (BMBF 01PW18012). The funder had no role in the study design, data collection and analysis, decision to publish or preparation of the manuscript.

Availability of data and materials
The datasets generated and analysed during the current study are available in a repository on the Open Science Framework (https://osf.io/4pzjg/). The code for inter-rater reliability calculations, which also includes robustness checks, is available on GitHub (https://github.com/Martin-R-H/umc-policy-review).

Declarations
Ethics approval and consent to participate: Not applicable.
Consent for publication: Not applicable.
Competing interests: All authors are affiliated with German university medical centres. They declare no further conflicts of interest.

Author details
Berlin Institute of Health at Charité – Universitätsmedizin Berlin, QUEST Center for Responsible Research, Charitéplatz 1, 10117 Berlin, Germany. Medizinische Hochschule Hannover, Institute of Ethics, History and Philosophy of Medicine, Carl-Neuberg-Str. 1, 30625 Hannover, Germany.

Received: 4 November 2021. Accepted: 21 March 2022.

References
1. Macleod MR, Michie S, Roberts I, Dirnagl U, Chalmers I, Ioannidis JPA, et al. Biomedical research: increasing value, reducing waste. Lancet. 2014;383(9912):101–4.
2. Begley CG, Ellis LM. Raise standards for preclinical cancer research. Nature. 2012;483(7391):531–3.
3. Begley CG, Buchan AM, Dirnagl U. Robust research: institutions must do their part for reproducibility. Nature. 2015;525(7567):25–7.
4. Prinz F, Schlange T, Asadullah K. Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011;10(9):712.
5. Errington TM, Denis A, Perfito N, Iorns E, Nosek BA. Challenges for assessing replicability in preclinical cancer biology. eLife. 2021;10:e67995.
6. Errington TM, Mathur M, Soderberg CK, Denis A, Perfito N, Iorns E, et al. Investigating the replicability of preclinical cancer biology. eLife. 2021;10:e71601.
7. Ioannidis JPA. Why most published research findings are false. PLoS Med. 2005;2(8):e124.
8. Chalmers I, Bracken MB, Djulbegovic B, Garattini S, Grant J, Gülmezoglu AM, et al. How to increase value and reduce waste when research priorities are set. Lancet. 2014;383(9912):156–65.
9. Salman RA-S, Beller E, Kagan J, Hemminki E, Phillips RS, Savulescu J, et al. Increasing value and reducing waste in biomedical research regulation and management. Lancet. 2014;383(9912):176–85.
10. Chan A-W, Song F, Vickers A, Jefferson T, Dickersin K, Gøtzsche PC, et al. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.
11. Gopalakrishna G, ter Riet G, Vink G, Stoop I, Wicherts JM, Bouter L. Prevalence of questionable research practices, research misconduct and their potential explanatory factors: a survey among academic researchers in The Netherlands. MetaArXiv; 2021 Jul [cited 2022 Jan 16]. Available from: https://osf.io/vk9yt.
12. Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383(9913):267–76.
13. Ioannidis JPA, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, et al. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.
14. Macleod MR, Lawson McLean A, Kyriakopoulou A, Serghiou S, de Wilde A, Sherratt N, et al. Risk of bias in reports of in vivo research: a focus for improvement. PLoS Biol. 2015;13(10):e1002273.
15. Simmons JP, Nelson LD, Simonsohn U. False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol Sci. 2011;22(11):1359–66.
16. Botvinik-Nezer R, Holzmeister F, Camerer CF, Dreber A, Huber J, Johannesson M, et al. Variability in the analysis of a single neuroimaging dataset by many teams. Nature. 2020;582:84–8.
17. Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. The preregistration revolution. Proc Natl Acad Sci USA. 2018;115(11):2600–6.
18. Nature. Announcement: towards greater reproducibility for life-sciences research in Nature. Nature. 2017;546(7656):8.
19. Percie du Sert N, Bamsey I, Bate ST, Berdoy M, Clark RA, Cuthill I, et al. The experimental design assistant. PLoS Biol. 2017;15(9):e2003779.
20. Munafò MR, Nosek BA, Bishop DVM, Button KS, Chambers CD, Percie du Sert N, et al. A manifesto for reproducible science. Nat Hum Behav. 2017;1(1):0021.
21. DeVito NJ, Goldacre B. Catalogue of bias: publication bias. BMJ Evid Based Med. 2019;24(2):53–4.
22. Chambers C. What's next for registered reports? Nature. 2019;573(7773):187–9.
23. Nosek BA, Spies JR, Motyl M. Scientific Utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspect Psychol Sci. 2012;7(6):615–31.
24. Drude NI, Martinez Gamboa L, Danziger M, Dirnagl U, Toelch U. Improving preclinical studies through replications. eLife. 2021;10:e62101.
25. Amrhein V, Greenland S, McShane B. Retire statistical significance. Nature. 2019;567:305–7.
26. Benjamin DJ, Berger JO, Johannesson M, Nosek BA, Wagenmakers E-J, Berk R, et al. Redefine statistical significance. Nat Hum Behav. 2018;2(1):6–10.
27. Lakens D, Adolfi FG, Albers CJ, Anvari F, Apps MAJ, Argamon SE, et al. Justify your alpha. Nat Hum Behav. 2018;2(3):168–71.
28. Hobert A, Jahn N, Mayr P, Schmidt B, Taubert N. Open access uptake in Germany 2010–2018: adoption in a diverse research landscape. Scientometrics. 2021. https://doi.org/10.1007/s11192-021-04002-0.
29. Keyes A, Mayo-Wilson E, Atri N, Lalji A, Nuamah PS, Tetteh O, et al. Time from submission of Johns Hopkins University trial results to posting on ClinicalTrials.gov. JAMA Intern Med. 2020;180(2):317.
30. Wallach JD, Boyack KW, Ioannidis JPA. Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017. PLoS Biol. 2018;16(11):e2006930.
31. Wieschowski S, Biernot S, Deutsch S, Glage S, Bleich A, Tolba R, et al. Publication rates in animal research. Extent and characteristics of published and non-published animal studies followed up at two German university medical centres. PLoS ONE. 2019;14(11):e0223758.
32. Wieschowski S, Riedel N, Wollmann K, Kahrass H, Müller-Ohlraun S, Schürmann C, et al. Result dissemination from clinical trials conducted at German university medical centers was delayed and incomplete. J Clin Epidemiol. 2019;115:37–45.
33. Scheliga K, Friesike S. Putting open science into practice: a social dilemma? First Monday. 2014 Aug 24 [cited 2022 Feb 1]. Available from: https://journals.uic.edu/ojs/index.php/fm/article/view/5381.
34. Flier J. Faculty promotion must assess reproducibility. Nature. 2017;549(7671):133.
35. Higginson AD, Munafò MR. Current incentives for scientists lead to underpowered studies with erroneous conclusions. PLoS Biol. 2016;14(11):e2000995.
36. Smaldino PE, McElreath R. The natural selection of bad science. R Soc Open Sci. 2016;3:160384.
37. Strech D, Weissgerber T, Dirnagl U. Improving the trustworthiness, usefulness, and ethics of biomedical research through an innovative and comprehensive institutional initiative. PLoS Biol. 2020;18(2):e3000576.
38. Gopalakrishna G, Wicherts JM, Vink G, Stoop I. Prevalence of responsible research practices among academics in the Netherlands. MetaArXiv. 2021. Available from: https://osf.io/preprints/metaarxiv/xsn94/.
39. Robson SG, Baum MA, Beaudry JL, Beitner J, Brohmer H, Chin JM, et al. Promoting open science: a holistic approach to changing behaviour. Collabra Psychol. 2021;7(1):30137.
40. Minnerup J, Wersching H, Diederich K, Schilling M, Ringelstein EB, Wellmann J, et al. Methodological quality of preclinical stroke studies is not required for publication in high-impact journals. J Cereb Blood Flow Metab. 2010;30(9):1619–24.
41. San Francisco Declaration on Research Assessment. 2013 [cited 2021 Jul 26]. Available from: https://sfdora.org/read/.
42. Moher D, Glasziou P, Chalmers I, Nasser M, Bossuyt PMM, Korevaar DA, et al. Increasing value and reducing waste in biomedical research: who's listening? Lancet. 2016;387(10027):1573–86.
43. Lerouge I, Hol T. Towards a research integrity culture at universities: from recommendations to implementation. 2020 Jan. Available from: https://www.leru.org/publications/towards-a-research-integrity-culture-at-universities-from-recommendations-to-implementation.
44. Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN. Assessing scientists for hiring, promotion, and tenure. PLoS Biol. 2018;16(3):e2004089.
45. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong Principles for assessing researchers: fostering research integrity. PLoS Biol. 2020;18(7):e3000737.
46. McKiernan EC. Imagining the "open" university: sharing scholarship to improve research and education. PLoS Biol. 2017;15(10):e1002614.
47. Wissenschaftsrat. Perspektiven der Universitätsmedizin. 2016 [cited 2021 Aug 3]. Available from: https://www.wissenschaftsrat.de/download/archiv/5663-16.pdf?__blob=publicationFile&v=1.
Lakens D, Adolfi FG, Albers CJ, Anvari F, Apps MAJ, Argamon SE, et al. stewardship. Sci Data. 2016;3(1):160018. Justify your alpha. Nat Hum Behav. 2018;2(3):168–71. 50. Percie du Sert N, Ahluwalia A, Alam S, Avey MT, Baker M, Browne WJ, et al. 28. Hobert A, Jahn N, Mayr P, Schmidt B, Taubert N. Open access uptake Reporting animal research: explanation and elaboration for the ARRIVE in Germany 2010–2018: adoption in a diverse research landscape. guidelines 2.0. Boutron I, editor. PLoS Biol. 2020;18(7):e3000411. Scientometrics. 2021. https:// doi. org/ 10. 1007/ s11192- 021- 04002-0. 51. Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated 29. Keyes A, Mayo-Wilson E, Atri N, Lalji A, Nuamah PS, Tetteh O, et al. Time guidelines for reporting parallel group randomised trials. BMJ. 2010. from submission of Johns Hopkins University trial results to posting on https:// doi. org/ 10. 1136/ bmj. c332. ClinicalTrials.gov. JAMA Intern Med. 2020;180(2):317. 52. Russell WMS, Burch RL. The principles of humane experimental tech- 30. Wallach JD, Boyack KW, Ioannidis JPA. Reproducible research practices, nique. London: Methuen; 1959. transparency, and open access data in the biomedical literature, 53. Kos-Braun IC, Gerlach B, Pitzer C. A survey of research quality in core facili- 2015–2017. Dirnagl U, editor. PLoS Biol. 2018;16(11):e2006930. ties. eLife. 2020;9:e62212. 31. Wieschowski S, Biernot S, Deutsch S, Glage S, Bleich A, Tolba R, et al. 54. Rice DB, Raffoul H, Ioannidis JPA, Moher D. Academic criteria for promo - Publication rates in animal research. Extent and characteristics of tion and tenure in biomedical sciences faculties: cross sectional analysis published and non-published animal studies followed up at two of international sample of universities. BMJ. 2020. https:// doi. org/ 10. German university medical centres. Lopes LC, editor. PLoS ONE. 1136/ bmj. m2081. 2019;14(11):e0223758. 55. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke 32. Wieschowski S, Riedel N, Wollmann K, Kahrass H, Müller-Ohlraun S, JP. The Strengthening the Reporting of Observational Studies in Epidemi- Schürmann C, et al. Result dissemination from clinical trials conducted ology (STROBE) statement: guidelines for reporting observational studies. Lancet. 2007;370(9596):1453–7. Holst et al. Health Research Policy and Systems (2022) 20:39 Page 14 of 14 56. Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–15. 57. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interven- tions. Implement Sci. 2011;6:42. 58. Ioannidis JPA, Khoury MJ. Assessing value in biomedical research: the PQRST of appraisal and reward. JAMA. 2014;312(5):483. 59. Deutsche Forschungsgemeinschaft. Replizierbarkeit von Ergebnissen in der Medizin und Biomedizin. Stellungnahme der Arbeitsgruppe “Qualität in der Klinischen Forschung” der DFG-Senatskommission für Grundsatz- fragen in der Klinischen Forschung [Internet]. DFG; 2018. Available from: https:// www. dfg. de/ downl oad/ pdf/ dfg_ im_ profil/ reden_ stell ungna hmen/ 2018/ 180507_ stell ungna hme_ repli zierb arkeit_ sgkf. pdf. 60. Deutsche Forschungsgemeinschaft. Guidelines for safeguarding good research practice. Code of conduct. 2019 Sep 15 [cited 2021 May 20]; Available from: https:// zenodo. org/ record/ 39236 02. 61. Kip M, Bobrov E, Riedel N, Scheithauer H, Gazlig T, Dirnagl U. 
Einführung von Open Data als zusätzlicher Indikator für die Leistungsorientierte Mit- telvergabe (LOM)-Forschung an der Charité—Universitätsmedizin Berlin. 2019; p. 1. 62. Ratte A, Drees S, Schmidt-Ott T. The importance of scientific competen- cies in German medical curricula—the student perspective. BMC Med Educ. 2018;18(1):146. 63. Medizinischer Fakultätentag. Positionspapier Vermittlung von Wis- senschaftskompetenz im Medizinstudium [Internet]. Medizinischer Fakultätentag; 2017 [cited 2022 Jan 16]. Available from: https:// mediz inisc he- fakul taeten. de/ medien/ stell ungna hmen/ posit ionsp apier- vermi ttlung- von- wisse nscha ftsko mpete nz- im- mediz instu dium/. 64. Bundesministerium für Bildung und Forschung. Masterplan Medizin- studium 2020 [Internet]. Bundesministerium für Bildung und Forschung; 2017 [cited 2022 Jan 16]. Available from: https:// www. bmbf. de/ bmbf/ share ddocs/ kurzm eldun gen/ de/ maste rplan- mediz instu dium- 2020. html. 65. Barbour RS. Checklists for improving rigour in qualitative research: a case of the tail wagging the dog? BMJ. 2001;322(7294):1115–7. 66. Bakker M, Veldkamp CLS, van Assen MALM, Crompvoets EAV, Ong HH, Nosek BA, et al. Ensuring the quality and specificity of preregistrations. Bero L, editor. PLoS Biol. 2020;18(12):e3000937. 67. Tiokhin L, Panchanathan K, Smaldino PE, Lakens D. Shifting the level of selection in science [Internet]. MetaArXiv; 2021 Oct [cited 2022 Jan 30]. Available from: https:// osf. io/ juwck. 68. Dirnagl U. #IchbinHannah and the fight for permanent jobs for postdocs: how a fictitious postdoc (almost) triggered a fundamental reform of Ger - man academia. EMBO Rep. 2022. https:// doi. org/ 10. 15252/ embr. 20225 69. Barbutev A-S. Wir brauchen einen Systemwechsel. ZEIT Campus [Inter- net]. 2021 Oct 30 [cited 2022 Feb 1]; Available from: https:// www. zeit. de/ campus/ 2021- 10/ ichbi nhanna- hochs chule- sabine- kunst- birgi tt- riegr af- pader born- befri stete- stell en- mitte lbau 70. Janotta L, Lukman C. Wer gut betreut, schadet seiner Karriere. FAZ.NET [Internet]. 2021 Nov 20 [cited 2022 Feb 1]; Available from: https:// www. faz. net/ aktue ll/ wirts chaft/ arm- und- reich/ ichbi nhanna- aerger- ueber- arbei tsver haelt nisse- in- der- wisse nscha ft- 17644 369. html 71. Bouter L. What research institutions can do to foster research integrity. Sci Eng Ethics. 2020;26(4):2363–9. 72. Keyes A, Mayo-Wilson E, Nuamah P, Lalji A, Tetteh O, Ford DE. Creating a program to support registering and reporting clinical trials at Johns Hopkins University. Acad Med. 2021;96(4):529–33. Re Read ady y to to submit y submit your our re researc search h ? Choose BMC and benefit fr ? Choose BMC and benefit from om: : 73. Viđak M, Barać L, Tokalić R, Buljan I, Marušić A. Interventions for organiza- tional climate and culture in academia: a scoping review. Sci Eng Ethics. fast, convenient online submission 2021;27(2):24. thorough peer review by experienced researchers in your field 74. Strauss M, Ehlers J, Gerß J, Klotz L, Reinecke H, Leischik R. Status Quo— Die Anforderungen an die medizinische Habilitation in Deutschland. rapid publication on acceptance DMW. 2020;145(23):e130–6. support for research data, including large and complex data types 75. Schiermeier Q. Breaking the Habilitation habit. Nature. • gold Open Access which fosters wider collaboration and increased citations 2002;415(6869):257–8. maximum visibility for your research: over 100M website views per year Publisher’s Note At BMC, research is always in progress. 
Springer Nature remains neutral with regard to jurisdictional claims in pub- lished maps and institutional affiliations. Learn more biomedcentral.com/submissions http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Health Research Policy and Systems Springer Journals

Do German university medical centres promote robust and transparent research? A cross-sectional study of institutional policies

Publisher
Springer Journals
Copyright
Copyright © The Author(s) 2022
eISSN
1478-4505
DOI
10.1186/s12961-022-00841-2

Abstract

Background: In light of replication and translational failures, biomedical research practices have recently come under scrutiny. Experts have pointed out that the current incentive structures at research institutions do not suf- ficiently incentivise researchers to invest in robustness and transparency and instead incentivise them to optimize their fitness in the struggle for publications and grants. This cross-sectional study aimed to describe whether and how relevant policies of university medical centres in Germany support the robust and transparent conduct of research and how prevalent traditional metrics are. Methods: For 38 German university medical centres, we searched for institutional policies for academic degrees and academic appointments as well as websites for their core facilities and research in general between December 2020 and February 2021. We screened the documents for mentions of indicators of robust and transparent research (study registration; reporting of results; sharing of research data, code and protocols; open access; and measures to increase robustness) and for mentions of more traditional metrics of career progression (number of publications; number and value of awarded grants; impact factors; and authorship order). Results: While open access was mentioned in 16% of PhD regulations, other indicators of robust and transparent research were mentioned in less than 10% of institutional policies for academic degrees and academic appointments. These indicators were more frequently mentioned on the core facility and general research websites. Institutional policies for academic degrees and academic appointments had frequent mentions of traditional metrics. Conclusions: References to robust and transparent research practices are, with a few exceptions, generally uncom- mon in institutional policies at German university medical centres, while traditional criteria for academic promotion and tenure still prevail. Keywords: Science policy, Incentives, Robustness, Transparency, Open science, University medical centre research lead to research waste [1]. These statements Background have been accompanied by findings that biomedical In recent years, the field of biomedicine has seen broad research often fails to reproduce [2–6], which ultimately and increasing reflection on its research practices. Vari - hampers the goal of biomedical research, which is trans- ous authors have pointed out that flaws in the choice lation of findings into medical practice, and ultimately of research questions and in the conduct of biomedical improving healthcare [1]. Concretely, while authors have discussed a possible low *Correspondence: martin.holst@bih-charite.de base rate of true hypotheses [7], and others have pointed Berlin Institute of Health at Charité – Universitätsmedizin Berlin, QUEST to necessary changes in how research is funded [8] and Center for Responsible Research, Charitéplatz 1, 10117 Berlin, Germany Full list of author information is available at the end of the article regulated [9], much of the discussion has focused on the © The Author(s) 2022. Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. 
The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http:// creat iveco mmons. org/ licen ses/ by/4. 0/. The Creative Commons Public Domain Dedication waiver (http:// creat iveco mmons. org/ publi cdoma in/ zero/1. 0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data. Holst et al. Health Research Policy and Systems (2022) 20:39 Page 2 of 14 design, conduct, dissemination and reporting of bio- ultimately tenure, researchers are evaluated based pri- medical research. It has been argued that the field funda - marily on how many journal articles (with high impact mentally lacks transparency, with study reports, research factors) they publish and how much grant money they protocols or participant data often not publicly accessi- secure [35]. The negative influence of the so-called pub - ble, and many research findings not being published at all lication pressure on research quality has been shown [10]. If findings are published, they often lack sufficient by mathematical simulations [35, 36] as well as empiri- detail and suffer from selective reporting of outcomes cal surveys indicating that it is both positively associated or limitations [12]. In addition, authors have pointed to with questionable research practices, and negatively asso- flaws in biomedical study design and statistical analyses ciated with responsible research practices [11, 38]. It has [7, 13, 14]. A recent survey from the Netherlands found been said that all stakeholder groups, including funders that some of these so-called questionable research prac- and journals, must contribute [9, 12] to an incentive sys- tices are more prevalent in the biomedical field than in tem that actually does reward robust and transparent other fields [11]. research practices; in the case of funders, for example, by Several solutions have been proposed to address the awarding grants based not only on publication numbers, flaws in the design, conduct, dissemination and reporting but on the adoption of open practices and, in the case of of biomedical research. One of the most widely discussed publishers, by providing peer review that embraces open proposals is the call for more transparent, or “open,” practices (allowing peer reviewers to better serve as qual- science along all steps of biomedical research. One of ity control instances and detect questionable research these steps is study registration, that is, registering study practices [11]) and not publishing only positive findings, protocols before data collection, which is supposed to but instead basing editorial decisions just on the sound- disclose flexibility in data analysis that might lead to ness of the research. This is, as some studies show, cur - false-positive results [15–17]. There have been calls to rently not always the case [39, 40]. increase the robustness of science, for example, by ask- The role and influence of the research institutions ing and supporting researchers in choosing adequately has thus far been less prominently discussed [3]. 
large samples, appropriately randomising participants and performing blinding of subjects, experimenters and outcome assessors [3, 4, 18, 19]. Researchers have been urged to share their data, code and protocols to increase transparency and reproducibility of biomedical research [20], and to report all research results in a timely manner, in line with established reporting guidelines, and ideally without paywalls (open access). This is supposed to tackle the prevalent publication bias in which only positive results are reported in journals [21], which distorts the evidence base and thus leads to research waste, for example, by encouraging follow-up studies that would have been considered futile if all research had been reported. To aid in this, new publication formats, namely, preprints and registered reports [22], have been established. All of these procedures are, in the long run, supposed to increase trust in science and lead to more reproducible research [23]. Additionally, more emphasis has been put on actual replication of studies [24], and there have also been calls to abandon [25], redefine [26] or better justify [27] statistical significance thresholds; however, these suggestions have been subject to debate.

To date, the uptake of the aforementioned robust and transparent practices has been slow [28–33]. Many have pointed out that the current incentive structures for researchers do not sufficiently incentivise them to invest in robustness and transparency and instead incentivise them to optimise their fitness in the struggle for publications and grants [34–37].

Since research institutions define the requirements for academic degrees, academic appointments and available intramural funding, their policies and regulations could, and do [11, 38], have a strong impact on researchers' capability, opportunity and motivation to apply robust and transparent research practices in their work. With regard to university policies, some changes have already been proposed. One of these changes is abandoning the current dysfunctional incentive systems of promotion [35, 36]. Another is an increased focus on transparent practices: the signers of the San Francisco Declaration on Research Assessment (DORA) call for institutions to clearly highlight "that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published" [41]. More specifically, Moher et al. [42] suggest that rewards, incentives and performance metrics at institutions should align with the full dissemination of research, reuse of original datasets and more complete reporting, namely, the sharing of protocols, code and data, as well as preregistration of research (see also the publications by the League of European Research Universities [43] and others [12, 44–47]). Mejlgaard et al. [48] propose that institutions should incentivise making data findable, accessible, interoperable and reusable (FAIR) [49]. Begley et al. [3] suggest similar rules for academic degrees and academic appointments but with regard to the robustness of the research. These authors also demand that the use of reporting guidelines, such
as the ARRIVE (Animal Research: Reporting of In Vivo Experiments) guidelines [50] or the CONSORT (Consolidated Standards of Reporting Trials) guidelines [51], be mandated by institutions. Additionally, core facilities such as clinical research units and animal research facilities provide centralised services for the conduct of clinical or animal studies (this includes animal protection and research according to the so-called 3R principles: replace, reduce, refine [52]). These core facilities could have additional influence [53], for example, by recommending that researchers report their results in a timely and nonselective way or by requiring researchers to adhere to established reporting guidelines.

Studying the uptake of the aforementioned recommendations in institutional policies could inform areas for improvement in policy-making at universities. To our knowledge, however, only one study [54] has dealt with this issue, sampling biomedical faculties of 170 universities worldwide and searching criteria for promotion and tenure. The authors report that mentions of traditional criteria of research evaluation were very frequent, while mentions of robust and transparent research practices were rare.

In this cross-sectional study, we aim to describe whether and how relevant policies of university medical centres (UMCs) in Germany support the robust and transparent conduct of research and how prevalent traditional metrics of career progression are. We chose to investigate only German UMCs, as this ensures better comparability of the institutions, since different countries have different regulatory environments (for example, German UMCs are currently in the process of implementing new good scientific practice regulations, mandated by the German Research Foundation [Deutsche Forschungsgemeinschaft, DFG]), different curricula for medical studies and different frameworks for postgraduate degrees. The focus on Germany also allows us to perform in-depth data collection of German-language documents.

Methods

A detailed methodology is described in our preregistered study protocol, which is available here: https://osf.io/wu69s/ (including a list of protocol amendments and deviations). The following section provides a summary of the methods, which are reported in accordance with the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) guidelines [55].

Sampling and search strategy

We obtained a list of all German medical faculties from the website of the German medical faculty council (Medizinischer Fakultätentag). For each of the 38 faculties (as of December 2020), we performed a manual search of their websites between 14 December 2020 and 12 February 2021. The search terms and strategy were based on discussions in our full research team after piloting; they have been presented in detail in our protocol. The search was done by the first author (MH), who searched the websites of both the medical faculties and the adjacent university hospitals, looking for the sources presented in Table 1.

Regarding the PhD and habilitation regulations and the application forms and procedural guidelines for tenure, we saved all related policy documents. Regarding the websites of clinical research units, the websites of animal research facilities, 3R centres and animal protection offices, and the general research websites, we first went through each website in detail (including all subpages), saving only those websites and documents that contained any mention of one of the indicators summarised in Table 2 (see Additional file 1: Table S1 for a more fine-grained terminology with subcategories).
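To make the screening step more concrete, the sketch below shows how a keyword-based pre-screen for the indicator categories in Table 2 could look in Python. It is purely illustrative: the search described above was performed manually, and the category names, term lists and function names here are invented for the example rather than taken from the study protocol.

```python
# Illustrative only: a keyword pre-screen for the indicator categories of Table 2.
# The actual screening in this study was done manually; these terms are invented examples.
INDICATOR_TERMS = {
    "study_registration": ["clinicaltrials.gov", "drks", "study registry", "preregistration"],
    "reporting_of_results": ["summary results", "publication of results", "reporting guideline"],
    "data_code_protocol_sharing": ["data sharing", "fair principles", "source code", "study protocol"],
    "open_access": ["open access", "repository", "article processing charge"],
    "robustness": ["sample size calculation", "randomisation", "blinding", "power analysis"],
}

def prescreen(document_text: str) -> dict:
    """Return, per indicator category, whether at least one candidate term occurs in the text."""
    lowered = document_text.lower()
    return {category: any(term in lowered for term in terms)
            for category, terms in INDICATOR_TERMS.items()}

if __name__ == "__main__":
    sample = "The CRU offers support with sample size calculation and registration in the DRKS."
    print(prescreen(sample))  # flags 'study_registration' and 'robustness' for manual review
```

Such a screen could at most support, not replace, the manual reading described above, since mentions can be phrased in many ways and German-language documents would need their own term list.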
Table 1 Data sources that were screened in this study

(1) PhD regulations (for every different type of PhD awarded by the medical faculty)
(2) Habilitation regulations (habilitation is an academic degree which is awarded after the PhD, and which involves a second, larger research thesis and additional teaching; it historically was and often still is considered a prerequisite for obtaining tenure or securing third-party funding in Germany [74, 75])
(3) Application forms for tenured professorships
(4) Procedural guidelines for the tenure process (Berufungsordnungen)
(5) Websites of clinical research units
(6) Animal research websites, including for animal facilities, 3R centres and animal protection offices
(7) General research websites of the medical faculties or university hospitals

Table 2 Indicators that were chosen for inclusion in this study

Indicators of robustness and transparency:
(1) Study registration in publicly accessible registries (e.g. ClinicalTrials.gov, DRKS [German Clinical Trials Register], Open Science Framework, German Animal Study Registry)
(2) Reporting of results
(3) Sharing of research data, code or protocol
(4) Open access
(5) Measures to improve robustness

Traditional metrics:
(1) Number of publications
(2) Number and monetary value of awarded grants
(3) Impact factor of journals in which research has been published
(4) Authorship order

We chose both the indicators of robust and transparent research and the traditional metrics of career progression based on their frequent discussion in the literature as either cornerstones of more robust and transparent biomedical research or as incentives leading to the opposite [3, 39, 41, 45, 48]. We also chose them for their consistency with previous research works [54] and publications from our institute [32, 37].

Data extraction

All documents were imported into qualitative research software (MAXQDA 2020, Release 20.3.0, VERBI GmbH, Germany). We applied deductive content analysis [56]. One rater (MRH) went through all of the documents and coded whether there was any mention of the prespecified indicators of robust and transparent research, as well as of the traditional metrics for career progression. While we searched all documents for the indicators of robust and transparent research, we only searched the PhD and habilitation regulations and the application forms and procedural guidelines for tenure for the traditional metrics, as these related specifically to career progression.

If a certain indicator was found, the rater decided whether it was just mentioned (e.g. a university explaining what open access is, or a clinical research unit stating that 60% of clinical trial results were published) or whether that procedure was incentivised/required (e.g. a university specifically requiring a certain impact factor to receive top marks in the PhD, or a clinical research unit offering support with summary results reporting of clinical trials). Thus, while we refer to the traditional indicators as "metrics" based on their frequent usage as such, there is no actual difference between indicators and metrics in the sense that both can incentivise or require behaviour. We based our assessment of incentivised/required on the COM-B model of behaviour change [57], which distinguishes between capability, opportunity and motivation to change behaviour, and lists education, persuasion, incentivisation, coercion, training, restriction, environmental restructuring, modelling and enablement as potential interventions. We defined anything that could increase the capability, opportunity or motivation to engage in a given behaviour as "incentivised" or "required". A second, independent rater (AF) went through the documents of 10 of the 38 UMCs.
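As a minimal illustration of the coding scheme just described, one record per coded mention might look as follows. This is a sketch under the assumption of a simple tabular data model; the actual coding was done in MAXQDA, and the field names are invented.

```python
# Sketch of one coded mention, mirroring the distinction between "any mention" and
# "incentivised/required" described above. Invented field names; the study used MAXQDA.
from dataclasses import dataclass

@dataclass
class CodedMention:
    umc: str                      # university medical centre identifier
    source: str                   # e.g. "PhD regulation" or "clinical research unit website"
    indicator: str                # e.g. "open_access", "study_registration", "impact_factor"
    any_mention: bool             # the indicator is mentioned at all
    incentivised_required: bool   # the mention increases capability, opportunity or motivation (COM-B)

# Example: a PhD regulation that lets candidates deposit their thesis in an open access repository
example = CodedMention(
    umc="UMC-07",
    source="PhD regulation",
    indicator="open_access",
    any_mention=True,
    incentivised_required=True,
)
```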
Results

The datasets generated and analysed during the current study are available in a repository on the Open Science Framework (https://osf.io/4pzjg/). The code for calculations of inter-rater reliability, which also includes robustness checks, is available on GitHub (https://github.com/Martin-R-H/umc-policy-review). The inter-rater reliability in our sample of 10 UMCs, measured by Cohen's kappa, was κ = 0.806. Thus, we deemed further double-coding unnecessary.

Overall, the web searches of the 38 German UMCs yielded 339 documents. We found PhD regulations for 37 UMCs (97%), habilitation regulations for 35 UMCs (92%), tenure application forms for 25 UMCs (66%) and procedural guidelines for tenure for 11 UMCs (29%). We found 38 general research websites (100%), 32 websites of clinical research units (84%) and 23 animal research websites (61%; see Table 3). Additional file 1: Table S2 shows the numbers for each UMC.

Table 3 Number of documents we included for each university and document type

UMCs with document or website found:
- PhD regulation: 37 (97%)
- Habilitation regulation: 35 (92%)
- Tenure (application form): 25 (66%)
- Tenure (procedural guidelines): 11 (29%)
- Website of clinical research unit: 32 (84%)
- Animal research website: 23 (61%)
- General research website: 38 (100%)

For the criteria for promotion and tenure, we counted every UMC with at least one policy found. For the core facility websites, we counted every UMC at which we at least found a website, even if we did not save any documents due to a lack of mentions of any of the relevant procedures.
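The inter-rater reliability reported above was computed with the analysis code linked on GitHub; as a rough illustration of the underlying statistic, a two-rater Cohen's kappa on binary codes can be sketched as follows. This is not the published analysis code, and the toy data are invented.

```python
# Minimal two-category Cohen's kappa for two raters coding the same items (0 = no mention, 1 = mention).
# Illustrative only; the authors' GitHub repository contains the authoritative calculation.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n      # observed agreement
    p_a1, p_b1 = sum(rater_a) / n, sum(rater_b) / n                   # share of "1" codes per rater
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)                  # agreement expected by chance
    return (observed - expected) / (1 - expected)

# Toy example with invented codes for six documents:
print(round(cohens_kappa([1, 0, 1, 1, 0, 0], [1, 0, 1, 0, 0, 0]), 3))  # 0.667
```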
The results are presented in detail in Tables 4 and 5, divided by each procedure and each type of document or website. Additional file 1: Tables S3 and S4 provide more detailed data on the subcategories of the indicators of robust and transparent science. Tables 6 and 7 provide example quotes.

Table 4 Number of university medical centres that mention indicators of robust and transparent science and traditional indicators of career progression in each of the included sources. For each source, the first value indicates any mention of the indicator and the second value indicates mentions that are incentivised or required.

Indicators of robust/transparent science:
- Study registration — PhD regulation (n = 37): 0% (0) / 0% (0); habilitation regulation (n = 35): 0% (0) / 0% (0); tenure application form (n = 25): 0% (0) / 0% (0); tenure procedural guideline (n = 11): 0% (0) / 0% (0)
- Reporting of results — PhD regulation: 8% (3) / 3% (1); habilitation regulation: 3% (1) / 3% (1); tenure application form: 0% (0) / 0% (0); tenure procedural guideline: 0% (0) / 0% (0)
- Sharing of data/code/protocol — PhD regulation: 3% (1) / 3% (1); habilitation regulation: 0% (0) / 0% (0); tenure application form: 0% (0) / 0% (0); tenure procedural guideline: 0% (0) / 0% (0)
- Open access — PhD regulation: 16% (6) / 14% (5); habilitation regulation: 3% (1) / 0% (0); tenure application form: 0% (0) / 0% (0); tenure procedural guideline: 0% (0) / 0% (0)
- Robustness — PhD regulation: 3% (1) / 3% (1); habilitation regulation: 0% (0) / 0% (0); tenure application form: 0% (0) / 0% (0); tenure procedural guideline: 0% (0) / 0% (0)

Traditional indicators of career progression:
- Number of publications — PhD regulation: 100% (37) / 100% (37); habilitation regulation: 91% (32) / 80% (28); tenure application form: 0% (0) / 0% (0); tenure procedural guideline: 27% (3) / 9% (1)
- Grant money — PhD regulation: 0% (0) / 0% (0); habilitation regulation: 11% (4) / 3% (1); tenure application form: 84% (21) / 0% (0); tenure procedural guideline: 27% (3) / 9% (1)
- Impact factor — PhD regulation: 16% (6) / 14% (5); habilitation regulation: 63% (24) / 54% (19); tenure application form: 72% (18) / 0% (0); tenure procedural guideline: 0% (0) / 0% (0)
- Authorship order — PhD regulation: 97% (36) / 97% (36); habilitation regulation: 80% (28) / 80% (28); tenure application form: 68% (17) / 0% (0); tenure procedural guideline: 0% (0) / 0% (0)

Table 5 Number of university medical centres that mention indicators of robust and transparent science in each of the included sources. For each source, the first value indicates any mention and the second value indicates mentions that are incentivised or required.

- Study registration — clinical research units (n = 32): 34% (11) / 31% (10); animal research websites (n = 23): 4% (1) / 4% (1); general research websites (n = 38): 5% (2) / 3% (1)
- Reporting of results — clinical research units: 9% (3) / 3% (1); animal research websites: 4% (1) / 0% (0); general research websites: 21% (8) / 11% (4)
- Sharing of data/code/protocol — clinical research units: 0% (0) / 0% (0); animal research websites: 4% (1) / 0% (0); general research websites: 21% (8) / 11% (4)
- Open access — clinical research units: 0% (0) / 0% (0); animal research websites: 4% (1) / 0% (0); general research websites: 34% (13) / 24% (9)
- Robustness — clinical research units: 81% (26) / 75% (24); animal research websites: 26% (6) / 17% (4); general research websites: 0% (0) / 0% (0)
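For readers who want to reproduce counts of this kind from coded data, the sketch below shows one way the percentages in Tables 4 and 5 could be tabulated. It assumes the illustrative CodedMention records from the Methods sketch above and is not the authors' actual analysis code.

```python
# Illustrative aggregation from coded mentions to "any mention" and "incentivised/required"
# percentages per source and indicator, as reported in Tables 4 and 5. Not the authors' code.
from collections import defaultdict

def tabulate(coded_mentions, n_umcs_per_source):
    """coded_mentions: iterable of CodedMention; n_umcs_per_source: dict of source -> number of UMCs."""
    umcs = defaultdict(set)  # (source, indicator, kind) -> UMCs with at least one such mention
    for m in coded_mentions:
        if m.any_mention:
            umcs[(m.source, m.indicator, "any")].add(m.umc)
        if m.incentivised_required:
            umcs[(m.source, m.indicator, "required")].add(m.umc)
    return {key: (len(found), round(100 * len(found) / n_umcs_per_source[key[0]]))
            for key, found in umcs.items()}

# For example, 6 of 37 PhD regulations mentioning open access would yield (6, 16),
# matching the "16% (6)" cell for open access in Table 4.
```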
Indicators of robust and transparent science

Study registration
The issue or relevance of registering studies was not mentioned in any (0%) of the documents regarding academic promotion and tenure. Thirty-four percent of websites of clinical research units mentioned registration, with 31% of those also incentivising or requiring the practice. This appeared mostly in the form of clinical research units offering support with registering clinical studies. Only 4% of animal research websites and 5% of general research websites mentioned registration. The animal facility provided a link to an animal study register, while the two research webpages generally endorsed the practice.

Reporting of results
Eight percent of the PhD regulations and 3% of habilitation regulations mentioned the issue of results reporting; these mentions included general requirements that the respective thesis be published. The habilitation regulation also referred to timely publication, asking individuals to publish their thesis no later than 2 years after receiving the degree. Results reporting was also mentioned by 9% of clinical research units, 4% of animal research websites and 21% of general research websites. All mentions expressed general endorsements or highlighted education regarding the publication of all results. One of the clinical research units further offered help with the publication process. The animal research facility that mentioned results reporting provided a tool to identify publication formats that fit the characteristics of the respective datasets. When the general research websites mentioned reporting results, they usually referred to statements in the university's or the DFG's good scientific practice guidelines for publishing research.

Table 6 Examples of mentions of each practice in the policies for academic degrees and appointments. No examples were found in the tenure application forms or the procedural guidelines for tenure, nor for study registration in any of the four policy types.

Reporting of results
- PhD regulation: "Doctoral theses must meet high quality standards; after peer review, the written doctoral work should be made accessible to the public and the scientific community as openly as possible through publication." / "Accordingly, the following requirements arise: […] 3. they should lead to a publication in a professional journal or to another kind of publication with a high scientific standard" / "According to the recommendations of the DFG (German Research Foundation) […] the following general principles apply to good scientific practice: […] Publication of results"
- Habilitation regulation: "The habilitation thesis, or at least its essential parts, are to be published by the habilitated person. The publication should take place within 2 years after awarding of the teaching qualification."

Data/code sharing
- PhD regulation: "If possible, supervisors should address the following points: […] publication of the full original dataset (e.g., via Figshare, Dryad) of all figures (graphs, tables, in-text data, etc.) in the article."

Open access
- PhD regulation: "If possible, supervisors should address the following points: […] Open Access" / "The work can be published: […] 2. As an electronic Open Access publication in the university repository operated by the university library." / "The University Library shall be granted the right to make and distribute further copies of the dissertation as well as to make the dissertation publicly accessible in data networks within the scope of the legal duties of the University Library"
- Habilitation regulation: "In the event of publication in accordance with sentence 3 no. 4, the university library shall be granted the right to produce and distribute further copies of the habilitation thesis within the scope of the university library's statutory duties, and to make the habilitation thesis publicly accessible in data networks."

Robustness
- PhD regulation: "If possible, supervisors should address the following points: […] Reduction of bias by appropriate measures (blinding, randomisation, a priori definition of inclusion and exclusion criteria, etc.), a priori power calculations."
Data/code/protocol sharing
Data, code, or protocol sharing was only mentioned in one PhD regulation (3%). In this mention, supervisors were asked to consider data sharing in the evaluation of the thesis. No habilitation regulations, tenure application forms or procedural guidelines for tenure mentioned this indicator (0%). Likewise, no clinical research unit website mentioned sharing of data/protocols (0%). Four percent of animal research websites and 21% of research websites mentioned data, code or protocol sharing. In the case of the animal facility, the mention was a general introduction to the FAIR principles [49] of data sharing. The general research websites included endorsements of data and code sharing, mostly within the university's good scientific practice guidelines.

Open access
Sixteen percent of PhD regulations and 3% of habilitation requirements mentioned open access. In one PhD regulation, PhD supervisors were asked to also keep in mind whether the work was published with open access. In the other cases, the PhD regulation mentioned that the university library had the right to publish the submitted thesis in a repository (green open access). No clinical research unit (0%) and 4% of animal research websites mentioned open access. In the case of the animal facility, it was a link to an interview in which an "open access culture" was announced. Thirty-four percent of general research websites mentioned open access; these websites either generally recommended open access or referred to the university's open access publishing funds.

Measures to improve robustness
Robustness was mentioned in 3% of PhD regulations but in none (0%) of the habilitation regulations, tenure application forms or procedural guidelines for tenure. Robustness was mentioned by 81% of websites of clinical research units and 26% of the animal research websites. The clinical research units usually offered services to help with power calculations and randomisation (and, in a few cases, blinding). In the case of animal research websites, the mentions pointed to documents recommending power calculation as part of an effort to protect animals, courses on robust animal research and general informational material on these issues. None (0%) of the general research webpages mentioned the issue of robustness.

Table 7 Examples of mentions of each practice on the core facility and general research websites. No examples were found where a source type is not listed for a practice.

Study registration
- Clinical research units: "We offer you the entry of your clinical study both prospectively and retrospectively, i.e., after the study has already started, or support with the entry by yourself." / "The Center for Clinical Trials provides free access to clinicaltrials.gov for scientists of the <institution>. Via this access, investigator-initiated trials from the local area of responsibility can be entered in the registry." / "Since the beginning of this year, all new clinical studies conducted at the University Medical Center have been centrally recorded in the Research Registry University Medicine <institution>" / "The analyses will be carried out according to statistical analysis plans that were already designed at the time of study planning"
- Animal research facilities: "The Animal Study Registry provides a platform for pre-registration of an accurate study plan for animal experimental studies prior to the start of experiments, to avoid selective reporting. […] <Institution> 3R sponsors first entries with 500 euros each for research" / "The <institution's> 3Rs toolbox is structured along the lines of the <other institution's> 6Rs model […], which adds robustness, registration, and reporting to replacement, reduction, and refinement."
- General research websites: "When should a study project be listed in a public study registry? The International Committee of Medical Journal Editors (ICMJE) requires prospective registration in an ICMJE-recognised registry for all clinical trials to be published by one of the participating journals. The new version of the Declaration of Helsinki also requires this: Article 19 Every clinical trial shall be registered in a publicly accessible database before recruitment of the first subject. The Ethics Committee of the <institution> supports this requirement." / "Recommendation for registration in the DRKS. The International Committee of Medical Journal Editors (ICMJE) requires prospective registration in an ICMJE-recognised registry for all clinical trials to be published by one of the participating journals. The current version of the Declaration of Helsinki also calls for this (Article 35: "Every research project involving human subjects shall be registered in a publicly accessible database before recruitment of the first subject."). The Ethics Committee of the <institution> supports this call."

Reporting of results
- Clinical research units: "<Institution>: 92% of clinical studies published in EU [European Union] database" / "The CRU [clinical research unit] at <institution> offers support with the following tasks, among others: […] Support in writing publications" / "In principle, a publication of the results should be aimed at…"
- Animal research facilities: "Fiddle is a tool developed by <institution> to combat publication bias. This 'match-making' tool helps researchers to identify alternative ways of publishing information from well-designed experiments, which is often difficult to publish in traditional journals (i.e., null or neutral results, datasets, etc.)."
- General research websites: "As a matter of principle, contributors to research projects are required to actively seek, or at least not refuse, publication of the results." / "As a rule, scientists contribute all results to the scientific discourse. In individual cases, however, there may be reasons not to make results publicly available (in the narrower sense in the form of publications but also in the broader sense via other communication channels); this decision must not depend on third parties." / "Findings that do not support the authors' hypothesis should also be reported." / "Scientific results are to be communicated to the scientific public in the form of publications; the scientific publications are thus—like the scientific observation or the scientific experiment itself—the product of the work of scientists." / "Rules for the publication of results: publication in principle of results obtained with public funds (principle of publicity of basic research), publication also of falsified hypotheses in an appropriate manner and admission of errors (principle of a scientific culture open to error)."

Data/code sharing
- Animal research facilities: "The FAIR Guiding Principles for scientific data management and stewardship state that data must be Findable, Accessible, Interoperable, and Reusable (Wilkinson et al. [49]). These guidelines put specific emphasis on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by individuals."
- General research websites: "The <city> Open Science programme of <city> has set itself the goal of making the research results and data of research institutions in <city>, which were created with funds from government research funding, freely accessible and easy to find together with other information on science in <city>." / "Scientists at <university> are encouraged to publish and store raw research data that served as the basis for publications, together with the associated materials and information, in recognised open-access subject repositories in accordance with the FAIR principles (Findable, Accessible, Interoperable, Reusable), insofar as this is in conformity with the applicable legal provisions on data protection and copyright and with planned patent applications." / "Self-programmed software is made publicly available with indication of the source code."

Open access
- Animal research facilities: "<Institutions> will […] establish an "Open access" and "Open Data" culture […]."
- General research websites: "To promote the OA [open access] publishing activities of scientists affiliated with the <university>, the Dean's Office of the Faculty of Medicine has been providing a publication fund from central funds since the beginning of 2020, from which the APCs [article processing charges] for OA publications (original work, review articles) in journals with an impact factor (IF) above 10 are financed." / "Publications in high-ranking open-access journals are supported by the faculty by covering 50% of the costs. Publications can be made free of charge with individual Open Access publishers." / "Therefore, the Presidential Board recommends […] to archive their scientific publications as a matter of principle as pre- or post-print on a subject repository or on the institutional repository of <university>, and/or to publish them in peer-reviewed open-access journals, and to reserve the right to publish or archive their research results electronically in publishing contracts, if possible. In doing so, the freedom of research and teaching is preserved. Discipline-specific practices and rights of use of publishers are to be taken into account in a differentiated manner." / "For a scientific qualification work, e.g., cumulative dissertations or post-doctoral theses, which have appeared in a journal as a published article, permission for second publication must be obtained in any case"

Robustness
- Clinical research units: "Biometric study planning and analysis includes the following: […] power calculation […] implementation of randomisation" / "The biometric consultation hour is an internal service of the CRU for members of the hospital and the medical faculty of the <university>. The biometric consultation hour provides information on questions of study design, sample size planning and the choice of suitable evaluation methods." / "Important tasks of biometrics in clinical trials are power calculation, randomisation" / "Online randomisation service"
- Animal research facilities: "The course is aimed at postdocs and scientists who write and/or plan their own animal research applications. In particular, the course will consider what requirements a biometric report must meet in order to satisfy the authorities' specifications and how this can be achieved. Special attention will be given to power analysis, study design, and sample size calculation." / "Experimental animal science links and information resources: […] G*Power (freeware for the analysis of test power)" / "Reduction: The number of test animals is reduced to a minimum. This is achieved by an optimised experimental design, in which it is clarified statistically and methodically in advance how many animals are necessary for an evaluable result."
Procedural guidelines for tenure cies for academic degrees, academic appointments, core had references to the number of publications in 27% of facilities and research in general. We also investigated cases. The PhD regulations and habilitation requirements the presence of traditional metrics of researcher evalu- listed a certain number of publications as a requirement ation. Our results show that current UMC policies on to obtain a PhD or habilitation, respectively. academic degrees (e.g. PhD regulations) or appointments (e.g. tenure application forms) contain very few (less than Number and value of grants 10%) references to our chosen indicators for robust and None (0%) of the PhD regulations mentioned grant transparent research, such as study registration, report- money. Among the habilitation regulations, 11% men- ing of results, data/code/protocol sharing or measures tioned grant money, while 84% of the tenure applica- to improve robustness (e.g. sample size calculation, ran- tion forms mentioned grant money, in which case there domisation, blinding). An exception is open access, which were requirements to provide a complete list of grants was mentioned in 16% (6 out of 37) PhD regulations, in awarded. Twenty-seven percent of the procedural guide- most cases referring to a repository to which the thesis lines for tenure regulations also mentioned grants. These could be publicly uploaded. In contrast, the number of passages stated that experience with grants was expected publications and the authorship order were frequently or that people were required to provide a list of grants mentioned in UMC policies on academic degrees and they received. appointments, particularly PhD and habilitation regula- tions (more than 80%). The majority of application forms Impact factor for tenure further mentioned impact factors and secured Sixteen percent of the PhD regulations and 63% of the grant money (more than 70%). habilitation requirements mentioned an impact factor, The UMCs’ websites for clinical and animal research with most of them establishing concrete incentives or included more frequent mentions of robust and transpar- requirements. These two types of regulations contained ent research, but these differed based on the type of web - passages that asked doctoral students or habilitation can- site. Clinical research unit websites frequently mentioned didates to publish in high-impact journals to achieve the study registration and measures to improve robust- highest grade (summa cum laude) or regulations that ness, while animal research websites only had frequent Holst  et al. Health Research Policy and Systems (2022) 20:39 Page 11 of 14 mentions of measures to improve robustness. These policies for only 66% (application forms) and 29% (pro- mentions were mostly related to sample size calculations cedural guidelines) of all UMCs. We refrained from and randomization. The general research websites had this additional step, however, because the results across the most frequent mentions of open access, reporting the available tenure policies showed a very homogene- of results, and data, code or protocol sharing. In most of ous pattern of no mentions (0%) of measures for robust these cases, these indicators were mentioned in the good and transparent research, and we assumed that this pat- scientific practice guidelines. In the case of open access, tern did not differ across policies that were not publicly some websites also featured references to a university- available. 
wide open access publishing fund. While our study focused on reviewing policies for Our findings are in line with a similar study that col - robust and transparent research in policies for academic lected data from an international sample [54]. The degrees and academic appointments, as well as their authors found very frequent mentions of traditional crite- research and core facility websites, there are other ways ria for research evaluation, while mentions of robust and for institutions to promote these practices. An exam- transparent research practices were less frequent than ple is the performance-based allocation of intramural in our study, with none of the documents mentioning resources, the so-called Leistungsorientierte Mittelver- publishing in open access mediums, registering research gabe (LOM). The LOM might also have a strong influ - or adhering to reporting guidelines, and only one men- ence on researcher behaviour, and it has been proposed tioning data sharing. The results are unsurprising, given that it should be based on transparency of research [61]. recent findings that practices for robust and transparent Another example would be education on robust and research are only very slowly becoming more prevalent transparent research practices, which has already become [30, 32]; however, they stand in stark contrast to the vari- a target of reform in Germany. These reforms aim explic - ous experts and institutions that have called for institu- itly at training for medical students, who normally do not tions to align their promotion criteria with robust and receive any training in research methodology, to allow transparent research [3, 41–43, 47, 48, 58, 59]. While we them to better understand the evidence base of biomedi- focused exclusively on a full sample of all German UMCs, cal research [62–64]. Education aimed at postgraduates our approach could also be applied to other countries. might mostly be organised and announced via internal It is important to keep in mind that policies and incen- channels of a university and thus not visible for our web tives are constantly changing. As mentioned in the intro- search-based methodology. Third, robustness and trans - duction, a major German funder, the DFG, recently parency might be improved by better supervision or bet- reworked their good scientific practice guidelines [60], ter actions against research misconduct, including better expecting universities to ratify them in their own good whistleblowing systems [48]. Nevertheless, we are con- scientific practice guidelines by July 2022. For the first vinced that our approach was able to find policies that time, these guidelines state that measures to avoid bias cover many institutional incentives, especially policies for in research, such as blinding, should be used and that promotion and tenure, which have a strong influence on researchers should document all information and gener- researcher behaviour. ally should publish all results, including those that do not Additionally, initiatives for transparent research exist support the hypothesis. They also recommend open shar - at the federal and national levels (e.g. Projekt DEAL ing of data and materials in accordance with the FAIR for open access). While universities remain obliged to principles and suggest that authors consider alternative include these national incentives and policies in their publication platforms, such as academic repositories. 
own regulations, future research might focus on these Some German UMCs might have already changed their other incentives or policies in the biomedical field. internal good scientific practice guidelines by the time More generally, there is discussion about how aca- the data collection of this study was conducted, which is demic institutions—or the academic system in general— the reason why we did not explicitly include these guide- need to change to facilitate better research. People have lines in our web search (we included them, however, if we argued that new regulations for open and transparent found them on the general research websites). research might not lead to genuine change for the bet- One limitation of our study is that the raters were not ter, but rather to box-ticking, for example, by arguing blinded, which was not possible due to the ability to iden- that reporting guidelines are not really of help [65] or by tify the policies from context. Another limitation is that showing that study registrations sometimes lack specific - we only searched for publicly available policies and did ity [66]. Additionally, questions have been raised whether not survey relevant representatives of the 38 UMCs per- assessing individual researchers is the right strategy sonally to identify further policies. For the two types of after all [67]. Criticism has been directed at the general tenure-related policies in particular, we found relevant work structures in academia, with some arguing that Holst et al. Health Research Policy and Systems (2022) 20:39 Page 12 of 14 Acknowledgements short-term, non-permanent contracts [68] and a gen- The authors would like to acknowledge Danielle Rice, Miriam Kip, Tim Neu- eral overweight of third-party funding [69, 70] lead to an mann, Delwen Franzen and Tamarinde Haven for their helpful comments on unhealthy amount of competition and power imbalances the first drafts of the protocol. They would also like to thank the two reviewers and the editor for their feedback on the first submitted manuscript drafts. in academia, which in turn facilitate the use of question- able research practices. Research institutions and aca- Authors’ contributions demia at large are complex systems, with many layers of MH: conceptualisation, data curation, formal analysis, investigation, methodol- ogy, writing—original draft. AF: formal analysis, investigation, writing—review incentives, and it is yet unclear which measures will lead & editing. DS: conceptualisation, funding acquisition, methodology, project to a change for the better. administration, resources, supervision, writing—review & editing. All authors u Th s, future research should also address the effects read and approved the final manuscript. of policies and other institutional activities to increase Funding robust and transparent research practices [71]. Thus Open Access funding enabled and organised by Projekt DEAL. This work was far, only a few studies have addressed this. For example, funded by the German Federal Ministry of Education and Research (BMBF 01PW18012). The funder had no role in the study design, data collection and Keyes et al. [72] evaluated the effect of a clinical trial reg - analysis, decision to publish or preparation of the manuscript. istration and reporting programme, which turned out to be a success. 
More generally, there is a lack of research on interventions targeting organisational climate and culture in academia [73].

Conclusion
In summary, current UMC policies do not promote procedures for robust and transparent research, especially the policies for academic degrees and academic appointments. In contrast, the number of publications and the authorship order play a dominant role in almost all UMC policies on academic degrees and appointments, and most of the tenure- and appointment-related policies additionally reward impact factors and the amount of grant money secured. This stands in stark contrast to the various experts and institutions that have called on institutions to align their promotion criteria with robust and transparent research.

Abbreviations
ARRIVE: Animal Research: Reporting of In Vivo Experiments; CONSORT: Consolidated Standards of Reporting Trials; DFG: Deutsche Forschungsgemeinschaft (German Research Foundation); DORA: San Francisco Declaration on Research Assessment; DRKS: Deutsches Register Klinischer Studien (German Clinical Trials Register); FAIR: Findable, accessible, interoperable and reusable; LOM: Leistungsorientierte Mittelvergabe (performance-based allocation of funding); PhD: Doctor of philosophy; STROBE: Strengthening the Reporting of Observational Studies in Epidemiology; UMC: University medical centre; 3R: Replace, reduce, refine.

Supplementary Information
The online version contains supplementary material available at https://doi.org/10.1186/s12961-022-00841-2.
Additional file 1: Table S1. Indicators that were chosen for inclusion in this study (detailed view). Table S2. Number of documents we included for each university and document type. Table S3. Number of university medical centres that mention indicators of robust and transparent science (fine-grained structure) for career progression in each of the included sources. Table S4. Number of university medical centres that mention indicators of robust and transparent science (fine-grained structure) for career progression in each of the included sources.

Acknowledgements
The authors would like to acknowledge Danielle Rice, Miriam Kip, Tim Neumann, Delwen Franzen and Tamarinde Haven for their helpful comments on the first drafts of the protocol. They would also like to thank the two reviewers and the editor for their feedback on the first submitted manuscript drafts.

Authors' contributions
MH: conceptualisation, data curation, formal analysis, investigation, methodology, writing—original draft. AF: formal analysis, investigation, writing—review & editing. DS: conceptualisation, funding acquisition, methodology, project administration, resources, supervision, writing—review & editing. All authors read and approved the final manuscript.

Funding
Open Access funding enabled and organised by Projekt DEAL. This work was funded by the German Federal Ministry of Education and Research (BMBF 01PW18012). The funder had no role in the study design, data collection and analysis, decision to publish or preparation of the manuscript.

Availability of data and materials
The datasets generated and analysed during the current study are available in a repository on the Open Science Framework (https://osf.io/4pzjg/). The code for the inter-rater reliability calculations, which also includes robustness checks, is available on GitHub (https://github.com/Martin-R-H/umc-policy-review).
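The inter-rater reliability calculation mentioned in the availability statement is, at its core, an agreement statistic computed over two raters' codings of the same documents. As a purely illustrative aid, and not the authors' published code (which is in the GitHub repository linked above), the sketch below computes per cent agreement and Cohen's kappa in Python for two hypothetical raters coding ten policy documents for a single indicator; the choice of statistic, the language and the data are assumptions made only for this example.

```python
# Purely illustrative sketch (not the authors' published analysis code):
# per cent agreement and Cohen's kappa for two raters who each coded the
# same policy documents as mentioning (1) or not mentioning (0) an indicator.
from collections import Counter


def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length sequences of categorical codes."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("both raters must code the same, non-empty set of documents")
    n = len(rater_a)
    # Observed agreement: share of documents coded identically by both raters.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal coding frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in set(rater_a) | set(rater_b)) / n ** 2
    if expected == 1.0:  # both raters always used the same single category
        return 1.0
    return (observed - expected) / (1.0 - expected)


if __name__ == "__main__":
    # Hypothetical codings of ten PhD regulations for one indicator (e.g. open access).
    rater_1 = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
    rater_2 = [1, 0, 0, 1, 0, 0, 1, 1, 0, 0]
    agreement = sum(a == b for a, b in zip(rater_1, rater_2)) / len(rater_1)
    print(f"per cent agreement: {agreement:.0%}")
    print(f"Cohen's kappa: {cohens_kappa(rater_1, rater_2):.2f}")
```

In this made-up example the raters disagree on one of ten documents, giving 90% raw agreement and a kappa of about 0.78; chance-corrected statistics such as kappa are generally preferred over raw agreement because raters who mostly assign the same category will often agree by chance alone.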
Declarations

Ethics approval and consent to participate
Not applicable.

Consent for publication
Not applicable.

Competing interests
All authors are affiliated with German university medical centres. They declare no further conflicts of interest.

Author details
Berlin Institute of Health at Charité – Universitätsmedizin Berlin, QUEST Center for Responsible Research, Charitéplatz 1, 10117 Berlin, Germany.
Medizinische Hochschule Hannover, Institute of Ethics, History and Philosophy of Medicine, Carl-Neuberg-Str. 1, 30625 Hannover, Germany.

Received: 4 November 2021   Accepted: 21 March 2022   Published: 12 April 2022

References
1. Macleod MR, Michie S, Roberts I, Dirnagl U, Chalmers I, Ioannidis JPA, et al. Biomedical research: increasing value, reducing waste. Lancet. 2014;383(9912):101–4.
2. Begley CG, Ellis LM. Raise standards for preclinical cancer research. Nature. 2012;483(7391):531–3.
3. Begley CG, Buchan AM, Dirnagl U. Robust research: institutions must do their part for reproducibility. Nature. 2015;525(7567):25–7.
4. Prinz F, Schlange T, Asadullah K. Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011;10(9):712.
5. Errington TM, Denis A, Perfito N, Iorns E, Nosek BA. Challenges for assessing replicability in preclinical cancer biology. eLife. 2021;10:e67995.
6. Errington TM, Mathur M, Soderberg CK, Denis A, Perfito N, Iorns E, et al. Investigating the replicability of preclinical cancer biology. eLife. 2021;10:e71601.
7. Ioannidis JPA. Why most published research findings are false. PLoS Med. 2005;2(8):6.
8. Chalmers I, Bracken MB, Djulbegovic B, Garattini S, Grant J, Gülmezoglu AM, et al. How to increase value and reduce waste when research priorities are set. Lancet. 2014;383(9912):156–65.
9. Salman RA-S, Beller E, Kagan J, Hemminki E, Phillips RS, Savulescu J, et al. Increasing value and reducing waste in biomedical research regulation and management. Lancet. 2014;383(9912):176–85.
10. Chan A-W, Song F, Vickers A, Jefferson T, Dickersin K, Gøtzsche PC, et al. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.
11. Gopalakrishna G, ter Riet G, Vink G, Stoop I, Wicherts JM, Bouter L. Prevalence of questionable research practices, research misconduct and their potential explanatory factors: a survey among academic researchers in The Netherlands [Internet]. MetaArXiv; 2021 Jul [cited 2022 Jan 16]. Available from: https://osf.io/vk9yt.
12. Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383(9913):267–76.
13. Ioannidis JPA, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, et al. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.
14. Macleod MR, Lawson McLean A, Kyriakopoulou A, Serghiou S, de Wilde A, Sherratt N, et al. Risk of bias in reports of in vivo research: a focus for improvement. PLoS Biol. 2015;13(10):e1002273.
15. Simmons JP, Nelson LD, Simonsohn U. False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol Sci. 2011;22(11):1359–66.
16. Botvinik-Nezer R, Holzmeister F, Camerer CF, Dreber A, Huber J, Johannesson M, et al. Variability in the analysis of a single neuroimaging dataset by many teams. Nature. 2020;582:84–8.
17. Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. The preregistration revolution. Proc Natl Acad Sci USA. 2018;115(11):2600–6.
18. Nature. Announcement: towards greater reproducibility for life-sciences research in Nature. Nature. 2017;546(7656):8.
19. Percie du Sert N, Bamsey I, Bate ST, Berdoy M, Clark RA, Cuthill I, et al. The experimental design assistant. PLoS Biol. 2017;15(9):e2003779.
20. Munafò MR, Nosek BA, Bishop DVM, Button KS, Chambers CD, Percie du Sert N, et al. A manifesto for reproducible science. Nat Hum Behav. 2017;1(1):0021.
21. DeVito NJ, Goldacre B. Catalogue of bias: publication bias. BMJ Evid Based Med. 2019;24(2):53–4.
22. Chambers C. What's next for registered reports? Nature. 2019;573(7773):187–9.
23. Nosek BA, Spies JR, Motyl M. Scientific Utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspect Psychol Sci. 2012;7(6):615–31.
24. Drude NI, Martinez Gamboa L, Danziger M, Dirnagl U, Toelch U. Improving preclinical studies through replications. eLife. 2021;10:e62101.
25. Amrhein V, Greenland S, McShane B. Retire statistical significance. Nature. 2019;567:305–7.
26. Benjamin DJ, Berger JO, Johannesson M, Nosek BA, Wagenmakers E-J, Berk R, et al. Redefine statistical significance. Nat Hum Behav. 2018;2(1):6–10.
27. Lakens D, Adolfi FG, Albers CJ, Anvari F, Apps MAJ, Argamon SE, et al. Justify your alpha. Nat Hum Behav. 2018;2(3):168–71.
28. Hobert A, Jahn N, Mayr P, Schmidt B, Taubert N. Open access uptake in Germany 2010–2018: adoption in a diverse research landscape. Scientometrics. 2021. https://doi.org/10.1007/s11192-021-04002-0.
29. Keyes A, Mayo-Wilson E, Atri N, Lalji A, Nuamah PS, Tetteh O, et al. Time from submission of Johns Hopkins University trial results to posting on ClinicalTrials.gov. JAMA Intern Med. 2020;180(2):317.
30. Wallach JD, Boyack KW, Ioannidis JPA. Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017. Dirnagl U, editor. PLoS Biol. 2018;16(11):e2006930.
31. Wieschowski S, Biernot S, Deutsch S, Glage S, Bleich A, Tolba R, et al. Publication rates in animal research. Extent and characteristics of published and non-published animal studies followed up at two German university medical centres. Lopes LC, editor. PLoS ONE. 2019;14(11):e0223758.
32. Wieschowski S, Riedel N, Wollmann K, Kahrass H, Müller-Ohlraun S, Schürmann C, et al. Result dissemination from clinical trials conducted at German university medical centers was delayed and incomplete. J Clin Epidemiol. 2019;115:37–45.
33. Scheliga K, Friesike S. Putting open science into practice: a social dilemma? First Monday [Internet]. 2014 Aug 24 [cited 2022 Feb 1]. Available from: https://journals.uic.edu/ojs/index.php/fm/article/view/5381.
34. Flier J. Faculty promotion must assess reproducibility. Nature. 2017;549(7671):133.
35. Higginson AD, Munafò MR. Current incentives for scientists lead to underpowered studies with erroneous conclusions. PLoS Biol. 2016;14(11):e2000995.
36. Smaldino PE, McElreath R. The natural selection of bad science. R Soc Open Sci. 2016;3:106384.
37. Strech D, Weissgerber T, Dirnagl U. Improving the trustworthiness, usefulness, and ethics of biomedical research through an innovative and comprehensive institutional initiative. PLoS Biol. 2020;18(2):e3000576.
38. Gopalakrishna G, Wicherts JM, Vink G, Stoop I. Prevalence of responsible research practices among academics in the Netherlands [Internet]. MetaArXiv; 2021. p. 27. Available from: https://osf.io/preprints/metaarxiv/xsn94/.
39. Robson SG, Baum MA, Beaudry JL, Beitner J, Brohmer H, Chin JM, et al. Promoting open science: a holistic approach to changing behaviour. Collabra Psychol. 2021;7(1):30137.
40. Minnerup J, Wersching H, Diederich K, Schilling M, Ringelstein EB, Wellmann J, et al. Methodological quality of preclinical stroke studies is not required for publication in high-impact journals. J Cereb Blood Flow Metab. 2010;30(9):1619–24.
41. San Francisco Declaration on Research Assessment [Internet]. 2013 [cited 2021 Jul 26]. Available from: https://sfdora.org/read/.
42. Moher D, Glasziou P, Chalmers I, Nasser M, Bossuyt PMM, Korevaar DA, et al. Increasing value and reducing waste in biomedical research: who's listening? Lancet. 2016;387(10027):1573–86.
43. Lerouge I, Hol T. Towards a research integrity culture at universities: from recommendations to implementation [Internet]. 2020 Jan. Available from: https://www.leru.org/publications/towards-a-research-integrity-culture-at-universities-from-recommendations-to-implementation.
44. Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN. Assessing scientists for hiring, promotion, and tenure. PLoS Biol. 2018;16(3):e2004089.
45. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong Principles for assessing researchers: fostering research integrity. PLoS Biol. 2020;18(7):e3000737.
46. McKiernan EC. Imagining the "open" university: sharing scholarship to improve research and education. PLoS Biol. 2017;15(10):e1002614.
47. Wissenschaftsrat. Perspektiven der Universitätsmedizin [Internet]. 2016 [cited 2021 Aug 3]. Available from: https://www.wissenschaftsrat.de/download/archiv/5663-16.pdf?__blob=publicationFile&v=1.
48. Mejlgaard N, Bouter LM, Gaskell G, Kavouras P, Allum N, Bendtsen A-K, et al. Research integrity: nine ways to move from talk to walk. Nature. 2020;586(7829):358–60.
49. Wilkinson MD, Dumontier M, Aalbersberg IJ, Appleton G, Axton M, Baak A, et al. The FAIR guiding principles for scientific data management and stewardship. Sci Data. 2016;3(1):160018.
50. Percie du Sert N, Ahluwalia A, Alam S, Avey MT, Baker M, Browne WJ, et al. Reporting animal research: explanation and elaboration for the ARRIVE guidelines 2.0. Boutron I, editor. PLoS Biol. 2020;18(7):e3000411.
51. Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMJ. 2010. https://doi.org/10.1136/bmj.c332.
52. Russell WMS, Burch RL. The principles of humane experimental technique. London: Methuen; 1959.
53. Kos-Braun IC, Gerlach B, Pitzer C. A survey of research quality in core facilities. eLife. 2020;9:e62212.
54. Rice DB, Raffoul H, Ioannidis JPA, Moher D. Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities. BMJ. 2020. https://doi.org/10.1136/bmj.m2081.
55. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Lancet. 2007;370(9596):1453–7.
56. Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–15.
57. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6:42.
58. Ioannidis JPA, Khoury MJ. Assessing value in biomedical research: the PQRST of appraisal and reward. JAMA. 2014;312(5):483.
59. Deutsche Forschungsgemeinschaft. Replizierbarkeit von Ergebnissen in der Medizin und Biomedizin. Stellungnahme der Arbeitsgruppe "Qualität in der Klinischen Forschung" der DFG-Senatskommission für Grundsatzfragen in der Klinischen Forschung [Internet]. DFG; 2018. Available from: https://www.dfg.de/download/pdf/dfg_im_profil/reden_stellungnahmen/2018/180507_stellungnahme_replizierbarkeit_sgkf.pdf.
60. Deutsche Forschungsgemeinschaft. Guidelines for safeguarding good research practice. Code of conduct. 2019 Sep 15 [cited 2021 May 20]. Available from: https://zenodo.org/record/3923602.
61. Kip M, Bobrov E, Riedel N, Scheithauer H, Gazlig T, Dirnagl U. Einführung von Open Data als zusätzlicher Indikator für die Leistungsorientierte Mittelvergabe (LOM)-Forschung an der Charité – Universitätsmedizin Berlin. 2019. p. 1.
62. Ratte A, Drees S, Schmidt-Ott T. The importance of scientific competencies in German medical curricula—the student perspective. BMC Med Educ. 2018;18(1):146.
63. Medizinischer Fakultätentag. Positionspapier Vermittlung von Wissenschaftskompetenz im Medizinstudium [Internet]. Medizinischer Fakultätentag; 2017 [cited 2022 Jan 16]. Available from: https://medizinische-fakultaeten.de/medien/stellungnahmen/positionspapier-vermittlung-von-wissenschaftskompetenz-im-medizinstudium/.
64. Bundesministerium für Bildung und Forschung. Masterplan Medizinstudium 2020 [Internet]. Bundesministerium für Bildung und Forschung; 2017 [cited 2022 Jan 16]. Available from: https://www.bmbf.de/bmbf/shareddocs/kurzmeldungen/de/masterplan-medizinstudium-2020.html.
65. Barbour RS. Checklists for improving rigour in qualitative research: a case of the tail wagging the dog? BMJ. 2001;322(7294):1115–7.
66. Bakker M, Veldkamp CLS, van Assen MALM, Crompvoets EAV, Ong HH, Nosek BA, et al. Ensuring the quality and specificity of preregistrations. Bero L, editor. PLoS Biol. 2020;18(12):e3000937.
67. Tiokhin L, Panchanathan K, Smaldino PE, Lakens D. Shifting the level of selection in science [Internet]. MetaArXiv; 2021 Oct [cited 2022 Jan 30]. Available from: https://osf.io/juwck.
68. Dirnagl U. #IchbinHannah and the fight for permanent jobs for postdocs: how a fictitious postdoc (almost) triggered a fundamental reform of German academia. EMBO Rep. 2022. https://doi.org/10.15252/embr.20225
69. Barbutev A-S. Wir brauchen einen Systemwechsel. ZEIT Campus [Internet]. 2021 Oct 30 [cited 2022 Feb 1]. Available from: https://www.zeit.de/campus/2021-10/ichbinhanna-hochschule-sabine-kunst-birgitt-riegraf-paderborn-befristete-stellen-mittelbau.
70. Janotta L, Lukman C. Wer gut betreut, schadet seiner Karriere. FAZ.NET [Internet]. 2021 Nov 20 [cited 2022 Feb 1]. Available from: https://www.faz.net/aktuell/wirtschaft/arm-und-reich/ichbinhanna-aerger-ueber-arbeitsverhaeltnisse-in-der-wissenschaft-17644369.html.
71. Bouter L. What research institutions can do to foster research integrity. Sci Eng Ethics. 2020;26(4):2363–9.
72. Keyes A, Mayo-Wilson E, Nuamah P, Lalji A, Tetteh O, Ford DE. Creating a program to support registering and reporting clinical trials at Johns Hopkins University. Acad Med. 2021;96(4):529–33.
73. Viđak M, Barać L, Tokalić R, Buljan I, Marušić A. Interventions for organizational climate and culture in academia: a scoping review. Sci Eng Ethics. 2021;27(2):24.
74. Strauss M, Ehlers J, Gerß J, Klotz L, Reinecke H, Leischik R. Status Quo—Die Anforderungen an die medizinische Habilitation in Deutschland. DMW. 2020;145(23):e130–6.
75. Schiermeier Q. Breaking the Habilitation habit. Nature. 2002;415(6869):257–8.

Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
