Get 20M+ Full-Text Papers For Less Than $1.50/day. Start a 14-Day Trial for You or Your Team.

Learn More →

A Standards Organization for Open and FAIR Neuroscience: the International Neuroinformatics Coordinating Facility

A Standards Organization for Open and FAIR Neuroscience: the International Neuroinformatics... There is great need for coordination around standards and best practices in neuroscience to support efforts to make neuroscience a data-centric discipline. Major brain initiatives launched around the world are poised to generate huge stores of neuroscience data. At the same time, neuroscience, like many domains in biomedicine, is confronting the issues of transparency, rigor, and reproducibility. Widely used, validated standards and best practices are key to addressing the challenges in both big and small data science, as they are essential for integrating diverse data and for developing a robust, effective, and sustainable infrastructure to support open and reproducible neuroscience. However, developing community standards and gaining their adoption is difficult. The current landscape is characterized both by a lack of robust, validated standards and a plethora of overlapping, underdeveloped, untested and underutilized standards and best practices. The International Neuroinformatics Coordinating Facility (INCF), an independent organization dedicated to promoting data sharing through the coordination of infrastructure and standards, has recently implemented a formal procedure for evaluating and endorsing community standards and best practices in support of the FAIR principles. By formally serving as a standards organization dedicated to open and FAIR neuroscience, INCF helps evaluate, promulgate, and coordinate standards and best practices across neuroscience. Here, we provide an overview of the process and discuss how neuroscience can benefit from having a dedicated standards body. . . . . . . Keywords Neuroinformatics Standards and best practices FAIR principles Standards organization Neuroscience INCF INCF endorsement process Introduction routine data sharing lead to difficulty in relying on published results (Button et al. 2013). With major brain initiatives across Asia, North America, and Common to both the large brain projects and individual Europe committing significant resources to large-scale, multi- investigator led research is the recognition that neuroscience faceted efforts to understand the nervous system, we are likely as a whole needs to converge towards a more open and col- entering a golden age for neuroscience. At the same time, laborative enterprise with neuroscientists around the globe neuroscience, like many domains in biomedicine, is undergo- committed to open sharing of data and tools. The ing a reproducibility crisis, where small, underpowered stud- Declaration of Intent of the International Brain Initiative, an ies, problems in experimental design and analysis, and lack of alliance of large national brain projects, states: “Researchers working on brain initiatives from around the world recognise Dr. Martone has an equity interest in SciCrunch, a tech start up that that they are engaged in an effort so large and complex that provides software services to support research resource identifiers. even with the unprecedented efforts and resources from public * Mathew Birdsall Abrams mathew@incf.org https://www.internationalbraininitiative.org/sites/default/files/declaration- Extended author information available on the last page of the article of-intent-september-2018.pdf 26 Neuroinform (2022) 20:25–36 and private enterprise, no single initiative will be able to tackle et al. 2014)) metadata standards such as minimal information the challenge to fully understand the brain”. models (e.g., COBIDAS, (Nichols et al. 2017)), protocols and Effective resource sharing means not just that data, process- machine-readable “FAIR” vocabularies (e.g., NIFSTD ontol- ing methods, workflows, and tools are made available, but that ogy, (Bug et al. 2008). For neuroscience, with its diverse data theycan bediscovered andaremadeavailableinawaythat types, dynamics and scales, such standards need to include the ensures that published findings can be reproduced. Currently, it necessary information for understanding what areas of the has been estimated that over 80% of the time spent in handling nervous system were studied and from which structures data data goes not to the analysis, but to data preparation: 60% of were acquired under which conditions. time for cleaning and organizing data and 19% of time spent As in many disciplines, standards in neuroscience have collecting datasets (Gil Press 2016); and curation for dataset been developed on an “as needed” basis with many different integration requires more resources than generation of the data starting points. For instance, the Connectivity File Formats (Palsson and Zengler 2010). Of equal importance, in the age of Documentation (cifti) format was developed internally in the machine learning and artificial intelligence, data should be pub- Human Connectome Project as a standard for storing both lished with integration and reuse in mind, so they can be surface and volumetric imaging data, tailored to the specific interpreted in new ways and leveraged so that new knowledge needs of the project. The Neuroimaging Informatics can be extracted (Ferguson et al. 2014). For that to happen, Technology Initiative (Nifti) image format was developed un- neuroscience as a discipline needs to adopt the FAIR principles der the umbrella of the US National Institutes of Health (NIH) (Wilkinson et al. 2016), ensuring that the results of science are which acted as a broker. Adoption of the format was ensured Findable, Accessible, Interoperable and Reusable, to both by involving developers of all the major brain imaging anal- humans and machines. FAIR neuroscience means that neuro- ysis tools and their commitment to implement the standard. scientists world-wide, working in big team projects or individ- Similarly, a joint effort by neurophysiology data acquisition ual laboratories acquire, manage, and share digital resources so systems vendors to define a common format led to the that they can be reliably compared, aggregated, and reused. As neuroshare standard (neuroshare.org); while being seen as neuroscience becomes a FAIR discipline, the grand challenge far fromideal,cifti,Nifti, and the neurosharestandardhave of piecing together a more comprehensive understanding of been in wide use by the community and undoubtedly have nervous system structure and function from multiple data sets enabled re-use of data to an extent that otherwise would not should become more feasible. have been possible. The FAIR principles were formulated in a collective effort Beyond clinical standards such as FHIR, convergence on by several international groups, based on practical experience disease-specific standards for data collection, Common Data of the roadblocks encountered when trying to reuse data, par- Elements (CDEs ), is resulting in some early successes where ticularly public data. The high level principles are summarised data collected across different centers and even countries is com- into a set of 15 attributes that represent best practices for parable. For example, a cross-European study of traumatic brain FAIR. Some recommendations are domain independent, injury, CENTER-TBI has used CDEs and other data collection e.g., proper licenses, use of persistent identifiers. Other rec- standards to integrate data from 21 European countries and 3 ommendations, however, particularly those that address inter- countries beyond Europe (Maas et al. 2017). However, harmo- operability and reusability, delegate the specifics to individual nizing CDEs and other clinical data standards across broader scientific communities, who are required to define the relevant international boundaries remains a challenge, although recent standards and best practices for their specialized data types progress has been made in the form of the guidelines for Data and protocols. So how does neuroscience with its vast number Acquisition, Quality, and Curation for Observational Research of subdisciplines, techniques, data types, and model systems Designs (DAQCORD; Ercole et al. 2020). become a FAIR discipline? Issues in the development and use of standards fall into First, FAIR requires that the necessary infrastructure in the several broad technical and sociological categories. At the form of web-accessible repositories is available to neurosci- forefront is the paradoxical nature of the standards landscape entists for publishing research objects: data, code, and where the availability of too many overlapping standards leads workflows. These repositories should support FAIR and im- to too few being adopted, as a well known cartoon illustrates. plement basics such as persistent identifiers, programmatic It is common in scientific domains, where researchers are access, and clear licenses. Second, neuroscience needs the generally rewarded for novelty, that research funding ends means to define and support “community-relevant” standards up producing multiple potential standards, many of which both for data and metadata. Such standards include common formats (e.g., NifTI; (Cox et al. 2004), file structures (e.g., https://www.hl7.org/fhir/summary.html BIDs, (Gorgolewski et al. 2016)), data elements (Sheehan https://www.commondataelements.ninds.nih.gov/ et al. 2016), markup languages (e.g., odML, NeuroML, www.center-tbi.eu/ NineML (Grewe et al. 2011); (Cannon et al. 2014); (Raikov https://goo.gl/images/KaYDbJ Neuroinform (2022) 20:25–36 27 lack the required documentation, tooling, or community sup- Neuroscience, whether basic, clinical or computational, port for wide adoption and long term sustainability. As an similarly will benefit from having a dedicated standards orga- example in genomics, FAIRsharing.org , a database that nization to help support the ambitious goals of international keeps track of standards for biomedical science, lists 38 brain projects and the needs of individual investigators, in- standards for “gene expression data” of which 24 have a cluding the necessity to formally publish data and tools in an publication associated. Seventeen of these have a maintainer effective manner. The International Neuroinformatics listed, but only three are recommended (by Biomed central, Coordinating Facility (INCF) has been actively working in EMBO, Giga Science, or Scientific data). Only one has all the area of standards and infrastructure for neuroscience over three: publications, a maintainer, and evidence of use. the past decade. Here, we outline how INCF is evolving its The overhead of having to account for multiple standards operations to promote open and FAIR neuroscience across in neuroscience research is very high. With multiple compet- international boundaries. In particular, INCF is taking on a ing standards, those developing tools may need to implement more formal role as a standards organization for neuroscience, and maintain several input/output interfaces or develop format by extending their work in standards to include the evaluation, conversion routines, draining time and money away from coordination, and endorsement of community standards. more critical tasks. For example, Neo, a Python package for Through this process, neuroscientists and big brain projects representing electrophysiology data, provides IO modules for will have uniform, unbiased and independent analysis of neu- ~20 different electrophysiology formats. With poorly docu- roscience standards and best practices, to ensure that standards mented or out of date standards, projects may invest in a are robust, well supported and documented. standard to accommodate immediate needs, only to find that it hasn’t achieved widespread uptake and therefore outputs are not FAIR. INCF as a Standards Organization In areas that benefit from well documented and validated standards, standards organizations or standards bodies play a The International Neuroinformatics Coordinating Facility central role in the adoption and promotion of standards and (INCF) was launched in 2005 as an independent international best practices. Standards organizations like the W3C and organization dedicated to promoting the sharing of neurosci- IEEE have as their primary activity the development, coordi- ence data, data reuse and reproducibility, through the coordi- nation, promulgation, and upkeep of technical standards that nation of infrastructures and standards. Based on recommen- are intended to address the needs of a group of affected dations from the Organisation for Economic Co-operation and adopters (e.g., Web browser developers, hardware developers; Development (OECD), an international agency of over 30 (Wikipedia contributors 2018b). They establish criteria by countries comprising the world’s leading economies, the which standards and best practices can be evaluated and a INCF instituted a national membership model, whereby indi- means for community vetting to ensure that the standard is vidual nations establish a national neuroinformatics Node and needed and functions appropriately. Such criteria include the is represented in INCF governance structures. Since 2016, the availability of proper validation tools and implementations. governance framework has consisted of the Governing Board, Standards efforts in basic science are also propelled by comprising national-level funding representation from those dedicated organizations such as the Research Data Alliance Nodes that financially sustain the organisation (Governing (rd-alliance.org) to provide a substrate whereby communities Nodes), and an additional Council for Training, Science and can come together to define a needed standard, or to provide Infrastructure (CTSI) which comprises scientific and infra- coordination among different standards’ efforts to ensure structural representation from all INCF Nodes (Governing interoperation. For example, the Computational Modeling in and Associate Nodes), as well additional appointed interna- Biology Network (COMBINE ), is an initiative composed of tional experts. The CTSI recommends INCF’s scientific, in- those developing standards and tools for computational frastructural and training direction and appoints specialist sub- modeling, whose goal is to “coordinate the development of committees such as Training & Education, Infrastructure, the various community standards and formats for Standards and Best Practices, and FAIR. A Secretariat based computational models. By doing so, it is expected that the at the Karolinska Institute in Sweden manages the coordina- federated projects will develop a set of interoperable and tion operations of the organization. non-overlapping standards covering all aspects of modeling From 2007 to 2016, INCF operated scientific programs on in biology.” topics requiring coordination and cooperation across national boundaries. Community needs and requirements were defined through topical international scientific workshops. Building on these identified areas, the Governing Board instantiated a https://fairsharing.org http://neuralensemble.org/neo/ 8 9 http://co.mbine.org/ https://www.incf.org/about-us/history/incf-scientific-workshops 28 Neuroinform (2022) 20:25–36 steering committee comprising international experts in the novo development of standards, serving as a broker for field to have oversight of each scientific program. Working standards development across stakeholder groups. with the Secretariat, the steering committee initiated actions However, this earlier INCF model for standards develop- (top-down) which included launching one or more task forces ment was subject to limitations and criticisms. The process to address the issues, develop technical specifications, make was expensive to maintain and often too slow to keep pace recommendations and develop appropriate tools or infrastruc- with the launch of new projects or development of new tech- ture. The INCF task forces each operated for a few years to nologies. It lacked a formal means for evaluation of resulting deliver these technical solutions, many outreaching also to the standards and for community input into the process. Also, it broader international community. had no formal mechanism for promoting and encouraging the Under this model, the INCF yielded a number of suc- use of already existing standards and best practices, nor a cesses, e.g., the Waxholm space atlas interoperability formal governance procedure to help adjudicate among com- framework ((Johnson et al. 2010); (Hawrylycz et al. peting interests. 2011); (Papp et al. 2014)), the neuroimaging data model: The INCF has undergone a significant reorganization over NIDM (Sochat and Nichols 2016) and others listed in the past 4 years to allow it to be more responsive to the needs Table 1. In these initial efforts and early days in of the global neuroscience community and more transparent in neuroinformatics, the INCF focused most heavily on de its operations. Rather than a top down governance model Table 1 A partial list of standards developed by INCF Task Forces or with INCF support. Active/inactive designations indicate whether the code base is being actively developed as of the writing of this manuscript Standard/Best Description INCF Available from: Status Practices contribution WaxholmSpace A coordinate-based reference space for the mapping and Task Force NITRC Active Mouse Atlas registration of neuroanatomical data in the mouse brain. WaxholmSpace An open access volumetric atlas based on high resolution Task Force NITRC Active Rat Atlas MRI and DTI, with Waxholm Space and stereotaxic space defined, shared in ITK-SNAP and MBAT- ready formats. Brain Imaging A standard for organizing neuroimaging and behavioral Meeting BIDS Active Data Structure data support Neurodata A unified, extensible, open-source data format for Task Force NWB.org Active Without cellular-based neurophysiology data (initial work), Borders Meeting support NIX A data model and file format to store annotated scientific Task Force GitHub Active datasets Neuroimaging A collection of specification documents and examples Task Force NIDM-NIDASH.org Active Data Model that describe an extension to the W3C PROV standard for the domain of human brain mapping. NineML A simulator independent language for unambiguous Task Force GitHub Somewhat active; also description of spiking neuronal network models that SpineML, a community-led aims to facilitate model sharing, portability, and extension of NineML, is re-usability. active NeuroML An XML-based description language that provides a Support neuroml.org Active common data format for defining and exchanging descriptions of neuronal cell and network models. Computational A controlled vocabulary of terms used in Computational Task Force BioPortal Not active Neuroscience Neurosciences to describe models of the nervous Ontology system. Ontology for A controlled vocabulary of terms used to describe Task Force GitHub Not active Experimental neurophysiology experiments Neurophysiol- ogy Common A reference framework for classifying general mammal Task Force Terms available Not active Mammalian nervous system structures. from InterLex Upper Brain Ontology Neuroinform (2022) 20:25–36 29 Table 2 Version 1.0 of the INCF endorsement criteria. These criteria were used to evaluate the SBPs indicated in Table 3. For the FAIR criteria, the relevant FAIR principle for each question is provided in the parentheses Area Criteria 1: Open 1.1 Is the SBP covered under an open license so that it is free to implement and reuse by all interested parties (including commercial)? 1.2 What license is used? 1.3 Does the SBP follow open development practices? 1.4 Where and how are the code/documents managed? 2: FAIR 2.1 SBP uses/permits persistent identifiers where appropriate (F1) 2.2 SBP allows addition of rich metadata to research objects (F2) 2.3 SBP uses/permits addition of appropriate PIDs to metadata (F3) 2.4 The protocol allows for an authentication and authorization when required (A1.2) 2.5 SBP uses or allows the use of vocabularies that follow the FAIR principles (I2) 2.6 SBP includes/allows qualified links to other identifiers (I3) 2.7 Does the standard interoperate with other relevant standards in the same domain? (I) 2.8 Does the SBP provide citation metadata so its use can be documented and tracked? (R1.2) 3: Testing and 3.1 Does the SBP have a reference implementation? implementation 3.2 What tools are available for the SBP? 3.3 Are the tools and implementations covered under an open source license? 3.4 What is your assessment of the quality of the code/document? 4: Governance 4.1 Does the SBP have a clear description of how decisions regarding its development are made? 4.2 Is the governing model document for maintenance and updates compatible with the INCF project governing model document and the open standards principles? 4.3 Is the SBP actively supported by the community? If so, what is the evidence? 4.4 Does the SBP provide tools for community feedback and support? 5: Adoption and use 5.1 Is there evidence of community use beyond the group that developed the SBP? 5.2 Please provide some concrete examples of use, e.g., publications where the use of the SBP is cited; databases or other projects that have adopted the SBP 5.3 Is there evidence of international use? 6: Stability and support 6.1 Does the SBP have a clear description of who is maintaining the SBP and 6.2 How is it currently supported? 6.3 What is the plan for long term support? 6.4 Are training and other supporting materials available? 7: Comparison 7.1 Are there other similar SBP’savailable? 7.2 If yes, how do they compare on key INCF criteria? https://space.incf.org/index.php/s/Ypig2tfHOU4no8C#pdfviewer where a steering committee sets priorities, INCF adopted suc- considered and endorsed. The process includes a pathway for cessful models from other community organizations like both community nomination and committee invited submis- FORCE11 (www.force11.org) and the Research Data sions of SBPs spanning data collection to publication, evalu- Alliance (RDA; www.rd-alliance.org/)toincrease ation against a consistent set of criteria, and active solicitation community participation and a sense of ownership over the of community feedback. An important change for INCF is that process. INCF has launched a new system of community- these standards and best practices need not have been devel- driven scientific interest groups, where groups of neuroscien- oped by INCF sanctioned groups or even be specific to neu- tists can come together to work on an issue of particular inter- roscience. Indeed, one of the goals is to ensure that neurosci- est in the area of neuroinformatics. Oversight and guidance is ence can benefit from the work that has gone on in other provided by the CTSI with its international scientific repre- biomedical or scientific domains around FAIR data. For ex- sentation from INCF member Nodes and external expertise. ample, INCF may choose to endorse standards such as the As part of this reorganization, INCF has developed a for- ORCID, the unique identifier for researchers, or the FAIR mal and community-focused process whereby standards are principles themselves. In this way, INCF can promote 30 Neuroinform (2022) 20:25–36 Fig. 1 A schematic representation of the INCF SBP submission, review and endorsement process initiatives emerging in different scientific domains that bring reference their data to a standard brain atlas when reporting neuroscience data into alignment with widely accepted stan- on location. dards and principles. This approach also allows INCF to fulfill A call went out in spring of 2018 for nominations of SBPs its coordinating role by offering sets of endorsed practices, from the community and a standing committee was formed to and to select, prioritize, and possibly stimulate further devel- establish the necessary procedures and infrastructure for re- opment and convergence of overlapping standards. As an in- view and voting. The SBP Committee operates under the aus- dependent organization with broad international reach and pices of the CTSI and is composed of a representative from neuroinformatics expertise, INCF is uniquely positioned and each of the INCF Governing Nodes, and members from two experienced to act as a standards endorsing authority for of the Associate Nodes (currently the US and Germany). neuroscience. Since 2019, a more formal procedure for committee member- ship has been implemented to ensure broad community par- ticipation in the process. As a first step, the SBP committee established a more de- The INCF Standards and Best Practices tailed set of criteria for evaluation based on seven key areas: Endorsement Process 1. Open: Is the SBP open according to the Open Through a series of community meetings and interactions with Definition and does it follow open development representatives from national standards organizations like the practices? Standard and Industrial Research Institute of Malaysia 2. FAIR: Considers the SBP from the point of view of rel- (SIRIM) and the US National Information Standards evant FAIR criteria (Wilkinson et al. 2016). Is the SBP Organization (NISO), the CTSI developed a set of criteria itself FAIR? Does it result in the production of FAIR and an initial process for evaluating standards and best prac- research objects? Some of these criteria may not apply tices (SBPs) against criteria that support open and FAIR neu- in all cases. roscience (Table 2). The term “best practices” was added in 3. Testing and implementation:Is the SBP supported by recognition that many of the requirements for open and FAIR appropriate software, that is open, well designed, imple- neuroscience may not involve an actual technical standard, mented, validated, documented and available for use? such as a file format. Rather best practices involve practices 4. Governance: Does the SBP have a governance structure that are accepted as producing better results than those that makes it clear how decisions are made and how griev- achieved by other means (Wikipedia contributors 2018a), ances are handled? and that should become standard operating procedure for ex- 5. Adoption and use: The SBP must have substantive evi- perimental neuroscience, e.g., making sure that researchers dence of use outside of the group or individual that https://opendefinition.org/od/2.1/en/ Neuroinform (2022) 20:25–36 31 Table 3 SBP’s that have been submitted for consideration for INCF endorsement and their status as of 12/19/2020 Standard or Best Practice Description Date Endorsement Status Similar Standards Nominated and by whom Neurodata without Borders: Neurophysiology (NWB:N). A 3/8/2018 by Endorsed (Martone et al. 2020a) on 4/3/2020 NIX/odML BIDS unified, extensible, open-source data format for cellular-based Ben EEG extension neurophysiology data Dichter The FAIR Data Principles. A set of guiding principles to make 3/8/2018 by In pipeline data and metadata Findable, Accessible, Interoperable, and Jeffrey Reusable Grethe NeuroML. An XML-based description language that provides a 3/20/2018 by Endorsed (Martone et al. 2019b)on PyNN common data format for defining and exchanging descriptions Padraig 3/20/2019 NineML of neuronal cell and network models. Gleeson SpineML Brain Imaging Data Structure (BIDS). Astandard for 4/15/2018 by Endorsed (Martone et al. 2018) on 11/1/2018 OpenfMRI schema organizing neuroimaging and behavioral data Chris NIDM Experiment Gorgolew- EEG Study Schema ski XCEDE NeuroImaging Data Model (NIDM)-Results. A standard that 4/17/2018 by Identified as a candidate standard, but not An extension of BIDS provides a representation of mass univariate neuroimaging Camille ready for endorsement after community currently analysis results, unified across analysis software packages Maumet review on 11/9/2020 underdevelopment PyNN. A simulator-independent language for building neuronal 4/17/2018 by Endorsed (Martone et al. 2019b)on NeuroML network models Andrew 3/20/2019 SpineML Davison NineML Neo. Python objects for neurophysiology data that could serve as 4/17/2018 by In progress SpikeInterface a common object model for neurophysiology. Andrew NiBabel Davison open metadata mark-up language (odML). A standard 4/17/2018 by In progress BIDS-EEG metadata format for data annotation in electrophysiology Thomas Wachtler Neuroscience information Exchange (NIX). A data model and 4/17/2018 by Endorsed (Martone et al. 2020b)on NEO file format to store annotated scientific datasets Thomas 11/9/2020 NWB:N Wachtler NSDF (Neuroscience Simulation Data Format) develops and maintains it. Because INCF is an interna- community nomination or in response to an invitation from tional organization, evidence of international use is a the committee to submit an SBP. From the first SBP nomina- requirement. tions, BIDS (the Brain Imaging Data Structure; http://bids. 6. Stability and support: Who is actively maintaining and org), a standard for organizing and naming files generated supporting the SBP and what are the plans for long term during a neuroimaging experiment, was chosen as the initial sustainability? test case. The current procedure is shown schematically in 7. Comparison with other SBP’s: Competing standards Fig. 1 and comprises the following steps: add extra burden to the community. The INCF seeks to endorse only a single standard per area, unless the sug- 1. SBP is received by the INCF through an on-line submis- gested approach is complementary as further discussed sion form. SBP submissions are received as the result of below. direct submission, in response to a broad call for submis- sions, or in response to direct invitation from the Under each of these areas, a set of questions were devel- committee. oped to aid reviewers in evaluating how well an SBP com- 2. If the SBP is determined to be in scope, the developer/ plied with each criteria. Version 1 of the review criteria steward of the SBP is contacted and asked to provide (Standards and Best Practices Committee 2019a) are shown some details about the SBP according to the criteria in Table 2. outlined in Table 2. Once the criteria were established, the committee devel- 3. The Committee assigns 2–3 reviewers, committee mem- oped a basic procedure for the evaluation, starting with bers or external experts, to review the materials and 32 Neuroinform (2022) 20:25–36 conduct an independent analysis. Reviewers should the SBP. Any work performed by INCF-supported groups have no conflicts of interest that would preclude an im- will be subjected to the same type of rigorous review as partial analysis of the SBP. outside SBP’s to achieve INCF endorsement. We expect 4. After initial review, the full committee votes on whether the INCF endorsement process to further evolve over time to accept the SBP for consideration or to reject it. to confront the challenges inherent in a dynamic and dis- 5. If accepted, a write up of the SBP is prepared and posted tributed research landscape. Some of the known chal- for community input. For BIDS, the text was posted on lenges involve establishing open and transparent gover- the INCF’s F1000 channel (Martone et al. 2018)and on nance for the endorsement process that recognizes and Google Docs. seeks to balance the competing needs of different stake- 6. Feedback is solicited through announcements via the holder groups. Another key issue is the extension and INCF and the Node Network’s social and media chan- evolution of SBPs over time. nels. The comment period is 60 days from posting. 7. After the commenting period, the reviewers review the Governance feedback and decide whether the comments require fur- ther review. The INCF SBP committee operates in a transparent manner 8. Once the review is complete, the committee votes on and seeks to avoid at all times any type of bias or appearance whether to endorse the SBP. of bias. The process should be fair to those who are develop- 9. If endorsed, the stewards/authors are allowed to display ing SBP’s, but also in the best interests of the broader neuro- the “Endorsed by INCF logo” on their website. science community that we seek to serve. Although the pro- 10. Endorsed standards are displayed on the INCF website cess is still being refined, it was designed to be open, collegial, and actively promulgated through INCF training and transparent. Reviewers are not anonymous and are re- activities. quired to clearly state whether they have a conflict of interest. 11. Endorsed standards are re-evaluated every 2 years to Committee members with conflicts do not participate in the ensure that they are still relevant or need to be replaced. reviewing or voting process. At each step—preparation of review documents, posting of the review for community feed- back, and post-feedback synthesis—reviewers are encouraged As of this writing, INCF has completed the reviews of 6 to contact the SBP provider for additional information and to standards, endorsed 5, and is in the process of reviewing an provide feedback on issues that might be addressable, e.g., additional 2 submitted standards (Table 3). We are using this indicating a clear license on their website, providing a clear initial round of submissions to develop and test the review description of their governance procedures, making sure that process, including both the criteria used and the governance help materials are easy to find. The SBP committee strives at of the process itself, e.g., how does the SBP committee handle all times to reach consensus among the members, the provider conflicts of interests within the committee. and the broader community. As in any human endeavor, con- INCF is also developing additional materials and tools flicts may arise when seeking to balance the interests of all to help the neuroscience community identify and use ap- parties. The committee therefore felt it important to document propriate standards, e.g., a catalog to navigate and assess formal procedures for dealing with any issues that might arise relevance of endorsed SBP’s for their work, and training (Standards and Best Practices Committee 2019b). materials and workshops designed to guide neuroscien- tists and tool developers in their use. To fulfill its coordi- Competing Standards and Best Practices nating role, those working on SBP’s ranging from data collection to publication can request support to form a The SBP process was initiated to help those who need to use working group to develop a standard in an area in need SBP’s in neuroscience to navigate the current options and to of standardization and address issues such as extension of promote interoperability among neuroscience tools. One issue endorsed standards to cover different domains and harmo- that must be addressed carefully is the issue of competing nization of existing standards. INCF actively solicits input standards. Competing SBP’s should ideally be identified dur- from the community on areas in neuroscience in need of ing the review process, either by the submitter, the review standardization through its thematic workshops and a sub- committee, or during the period of community comment. mission form on the INCF website where community When competing SBP’s are identified, the committee deter- members can recommend an area in neuroscience in need mines whether having competing standards in a domain will of standardization (e.g. methods standardization) whether be a significant impediment to further progress or if the field they are willing to work on it or not; under this frame- can support multiple standards without negative conse- work, INCF hosts thematic workshops to determine re- quences. For example, during the reviews of PyNN and quirements and supports working groups to develop to NeuroML, both standards for sharing computational models, Neuroinform (2022) 20:25–36 33 the committee deemed that the field could support multiple into the hands of the researcher that can propel discovery standards without negative consequences; so they are viewed science. When a well defined standard becomes widely ac- as complementary rather than competing, in that they are op- cepted, it provides the necessary uniformity and stability to timized for different conditions(Gleeson and Davison 2020). reduce the overhead of tool development and to promote in- During the review of NWB:N 2.0, a standard for neurophys- teroperability among tools so that researchers have a more iology data, the committee determined that it overlapped with powerful tool arsenal at their disposal. For example, well de- other standards for neurophysiology data, NIX and fined API’s can pass metadata and data between tools to avoid BIDS:EEG, and recommended that groups form an INCF extra steps and so that provenance is maintained. A simple working group so that they remain up to date on each groups’ example is using ORCIDs for account management. As neu- efforts and work towards interoperability. When the commit- roscience adopts ORCIDs, users should be able to log into a tee determines that having competing standards constitutes a resource like a data repository with their ORCIDs. The repos- significant impediment to further progress in the field, the itory can automatically extract required details, e.g., affilia- committee will invite the maintainers of the competing stan- tions, emails, from the ORCID database. At the same time, dards form a working group through INCF to work towards the repository can push information about data sets deposited harmonization of the competing standards. by that researcher into their ORCID profile, much as ORCID is currently linked to databases such as PubMed. On the data side, we often hear that “Data is the new oil”.But Evolution of Evaluation Criteria the extended metaphor goes on to state that “It’s valuable, but if unrefined it cannot really be used.” (Rotella 2012). We expect that our understanding of what constitutes an effec- Operationalizing FAIR for neuroscience is one of the key ways tive standard will evolve as neuroscience continues to move to ensure that data produced by the neuroscience community can towards collaborative, open, and FAIR-neuroscience. Indeed, be put to work, and community standards are essential for FAIR. there is an active effort in many domains to develop metrics for While it is too early to measure the impact of the INCF endorse- how to interpret FAIR (e.g., (Mons et al. 2017). Therefore, the ment process on community adoption, standards developed by SBP criteria themselves should have a clearly documented and the INCF network are having an impact on data quality and community-based process for extension and updates. interoperability. For example, BIDS, the first standard endorsed The criteria listed in Table 2 were used for the reviews com- by INCF, has a community of 136 credited contributors (22 pleted and underway (Table 3). However, not surprisingly, during female, as of October 3, 2020), with ~10,000 users visiting the the preparation of this manuscript, omissions were noted and website, and ~ 7000 users exploring the BIDS Specification, over modifications suggested. For example, Version 1 of the review the past 6 months. Over 404 journal articles have cited BIDS or any of its extensions. Currently, 10 reported centers, institutes criteria did not explicitly include extensibility as a criterion. What happens when new data types, hardware, tool, technology, or use- and databases around the world that have implemented BIDS case are introduced, as neuroscience evolves? It is common prac- as their organizational structure. Furthermore, INCF has served tice, given the diverse use cases and experimental landscape of as a convener of the standards developers and the large-scale neuroscience,totakeanexistingstandardandextendormodifyit brain initiatives which has resulted in harmonization/ for other use cases. BIDS, for example, has over 23 proposals for interoperability of the ontologies and metadata standards adopted creating extensions to the core specification. The INCF and the by HBP and BRAIN Initiative infrastructure projects. More and SBP process are in a good position to provide a community-wide more funders and journals are requiring that individual re- platform for discussions and consensus building about when a searchers publish their data so that it can be inspected and reused. new standard is necessary vs extending an existing one. We are starting to see good examples where pooling of smaller data sets leads to better powered studies and more reliable results (Ferguson et al. 2013;Lefebvreetal. 2015). Such studies suggest How Does the SBP Endorsement Process Help that publishing FAIR data will be of equal importance to pub- Neuroscience? lishing articles about findings derived from these data. Today, INCF is well positioned to assume the role of a stan- Why should an individual neuroscientist care? The adoption dards organization for neuroscience. Originally formed in 2005 of clear and robust standards should also lead to a dramatic to help neuroscientists to coordinate data and computational ac- increase in the number, quality, interoperability and sustain- tivities across international borders, INCF facilitated global co- ability of tools and infrastructures. Our current model of operation for brain science in the very early days of funding tools and infrastructures through research grants leads neuroinformatics. The landscape has changed dramatically, as to a lot of innovative ideas, but often less than useful or in- has the push towards open and FAIR neuroscience with INCF complete implementations. They advance the field of actively internalizing and adapting to those changes. As such, INCF has implemented a model for community standards neuroinformatics, but they don’t always deliver working tools 34 Neuroinform (2022) 20:25–36 Open Access This article is licensed under a Creative Commons development and adoption that empowers the broader neurosci- Attribution 4.0 International License, which permits use, sharing, adap- ence community to develop, evaluate, and endorse standards. tation, distribution and reproduction in any medium or format, as long as Three important policies have been implemented to accomplish you give appropriate credit to the original author(s) and the source, pro- these goals: 1. SBP’s need not have been developed by INCF vide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included working groups to be considered, 2. the endorsement process in the article's Creative Commons licence, unless indicated otherwise in a includes community feedback, and 3. INCF does not just list credit line to the material. If material is not included in the article's SBP’s but actively evaluates them and works with standards Creative Commons licence and your intended use is not permitted by providers to improve them when possible. The endorsement pro- statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this cess is part of INCF’s strategy to develop a FAIR roadmap for licence, visit http://creativecommons.org/licenses/by/4.0/. neuroscience that provides researchers, infrastructure providers, tool developers, publishers, and funders with practical solutions for implementing the FAIR Principles in neuroscience. In addi- tion to the endorsement process, the strategy also includes: 1. a portfolio of INCF endorsed SBPs that provides guidance on the References appropriate use, implementation, and links to tutorials and tools/ infrastructure that have implemented the SBPs, 2. Training and Bug, W. J., Ascoli, G. A., Grethe, J. S., Gupta, A., Fennema-Notestine, dissemination activities to promote community adoption, 3. a C., Laird, A. R., Larson, S. D., et al. (2008). The NIFSTD and BIRNLex vocabularies: Building comprehensive ontologies for framework to identify areas in need of standardization, and 4. a neuroscience. Neuroinformatics, 6(3), 175–194. framework for developing, extending, and harmonizing existing Button, K. S., Ioannidis, J. P. A., Mokrysz, C., Nosek, B. A., Flint, J., community standards. Robinson, E. S. J., & Munafò, M. R. (2013). Power failure: why Thus, INCF can serve as a neutral broker and coordination small sample size undermines the reliability of neuroscience. Nature center on behalf of the wider neuroscience community to help Reviews Neuroscience, 14(5), 365–376. Cannon, R. C., Gleeson, P., Crook, S., Ganapathy, G., Marin, B., Piasini, coordinate and disseminate SBPs relevant for neuroscience. An E., & Angus Silver, R. (2014). LEMS: A language for expressing INCF endorsement seal means that researchers, project man- complex biological models in concise and hierarchical form and its agers, developers and funders can be confident in their choices. use in underpinning NeuroML 2. Frontiers in Neuroinformatics, The community building experience and expertise with identify- 8(September), 79. ing and evaluating standards available in the INCF network also Cox, R. W., Ashburner, J., Breman, H., Fissell, K., Haselgrove, C., Holmes, C. J., Lancaster, J. L., et al. (2004). A (sort of) new image provides important expertise for those who are new to the prac- data format standard: Nifti-1: We 150. NeuroImage, e1440,22. tices of collaborative, open and FAIR neuroscience. As the pro- Ercole, A., Brinck, V., George, P., Hicks, R., Huijben, J., Jarrett, M., cess becomes better established, INCF can also provide a conduit Vassar, M., Wilson, L., & the DAQCORD Collaborators. (2020). for neuroscience-specific specifications to make their way into Guidelines for data acquisition, quality and Curation for observa- tional research designs (DAQCORD). Journal of Clinical and national and international standards organizations, to promote Translational Science, 4(4), 354–359. https://doi.org/10.1017/cts. deployment in instruments and other commercial products 2020.24. supporting science. The training component of INCF will in- Ferguson, A. R., Irvine, K.-A., Gensel, J. C., Nielson, J. L., Lin, A., Ly, J., creasingly engage in training the communities to the use of the Segal, M. R., Ratan, R. R., Bresnahan, J. C., & Beattie, M. S. (2013). endorsed standards. Derivation of multivariate Syndromic outcome metrics for consis- tent testing across multiple models of cervical spinal cord injury in We encourage the neuroscience community to utilize the rats. PLoS One, 8(3), e59712. INCF network and expertise in identifying and evaluating Ferguson, A. R., Nielson, J. L., Cragin, M. H., Bandrowski, A. E., & additional standards, and to actively participate in this process Martone, M. E. (2014). Big data from small data: Data-sharing in the through proposing SBP’s, providing feedback and joining or ‘long tail’ of neuroscience. Nature Neuroscience, 17(11), 1442– initiating INCF special interest groups (visit: https://www. Gil Press. (2016). Cleaning big data: Most time-consuming, least enjoy- incf.org/). As the amount of neuroscience data continues to able data science task, survey says. Forbes Magazine, March 23, grow, knowing how to make them open, FAIR and citable is 2016. https://www.forbes.com/sites/gilpress/2016/03/23/data- an important skill and requirement to propel neuroscientific preparation-most-time-consuming-least-enjoyable-data-science- task-survey-says/. discovery in the twenty-first century. Gleeson, P., & Davison, A. (2020). Relationship between NeuroML and PyNN. F1000 Research, 621(document), 9. Funding JBP was partially funded by the National Institutes of Health Gorgolewski, K. J., Auer, T., Calhoun, V. D., Cameron Craddock, R., (NIH) NIH-NIBIB P41 EB019936 (ReproNim) NIH-NIMH R01 Das, S., Duff, E. P., Flandin, G., et al. (2016). The brain imaging MH083320 (CANDIShare) and NIH RF1 MH120021 (NIDM), the data structure, a format for organizing and describing outputs of National Institute Of Mental Health under Award Number neuroimaging experiments. Scientific Data, 3, 160044(June). R01MH096906 (Neurosynth), as well as the Canada First Research Grewe, J., Wachtler, T., & Benda, J. (2011). A bottom-up approach to Excellence Fund, awarded to McGill University for the Healthy Brains data annotation in neurophysiology. Frontiers in Neuroinformatics, for Healthy Lives initiative and the Brain Canada Foundation with sup- 5(January), 16. port from Health Canada. Neuroinform (2022) 20:25–36 35 Hawrylycz, M., Baldock, R. A., Burger, A., Hashikawa, T., Johnson, G. Nichols, T. E., Das, S., Eickhoff, S. B., Evans, A. C., Glatard, T., Hanke, M., Kriegeskorte, N., Milham, M. P., Poldrack, R. A., Poline, J.-B., A., Martone, M. E., Ng, L., et al. (2011). Digital Atlasing and stan- dardization in the mouse brain. PLoS Comput Biol, 7(2), e1001065. Proal, E., Thirion, B., van Essen, D. C., White, T., & Yeo, B. T. T. (2017). Best practices in data analysis and sharing in neuroimaging Johnson, G. A., Badea, A., Brandenburg, J., Cofer, G., Fubara, B., Liu, S., using MRI. Nature Neuroscience, 20(3), 299–303. & Nissanov, J. (2010). Waxholm space: An image-based reference Palsson, B., & Zengler, K. (2010). The challenges of integrating multi- for coordinating mouse brain research. NeuroImage, 53(2), 365– omic data sets. Nature Chemical Biology, 6,787–789. Papp, E. A., Leergaard, T. B., Evan, C., Allan Johnson, G., & Bjaalie, J. Lefebvre, A., Beggiato, A., Bourgeron, T., & Toro, R. (2015). G. (2014). Waxholm space atlas of the Sprague Dawley rat brain. Neuroanatomical diversity of Corpus callosum and brain volume NeuroImage, 97(August), 374–386. in autism: Meta-analysis, analysis of the autism brain imaging data Raikov, I., Kumar, S. S., Torben-Nielsen, B., & De Schutter, E. (2014). A exchange project, and simulation. Biological Psychiatry, 78(2), NineML-based domain-specific language for computational explo- 126–134. ration of connectivity in the cerebellar granular layer. BMC Maas, A. I. R., Menon, D. K., David Adelson, P., Andelic, N., Bell, M. J., Neurosci, 15(1), P176. Belli, A., Bragge, P., et al. (2017). Traumatic brain injury: Integrated Rotella, P., (2012). Is data the new oil? Forbes. April 2. https://ana.blogs. approaches to improve prevention, clinical care, and research. com/maestros/2006/11/data_is_the_new.html. Lancet Neurol, 16(12), 987–1048. Sheehan, J., Hirschfeld, S., Foster, E., Ghitza, U., Goetz, K., Karpinski, J., Martone, M., Gerkin, R., Moucek, R., et al. (2020a). NIX- Neuroscience Lang, L., Moser, R. P., Odenkirchen, J., Reeves, D., Rubinstein, Y., Information Exchange Format [version 1; not peer reviewed]. Werner, E., & Huerta, M. (2016). Improving the value of clinical F1000Research 2020, 9:358 (document). research through the use of common data elements. Clinical Trials, Martone, M., Gerkin, R., Moucek, R., et al. (2020b). Call for community 13(6), 671–676. review of Neurodata Without Borders: Neurophysiology (NWB:N) Sochat, V., & Nichols, B. N. (2016). The Neuroimaging Data Model 2.0–a data standard for neurophysiology [version 1; not peer (NIDM) API. GigaScience, 5(1), 23–24. reviewed].” F1000Research 8:1731 (document). https://doi.org/10. Standards and Best Practices Committee. (2019a). International 7490/f1000research.1117538.1. neuroinformatics coordinating facility review criteria for endorse- Martone, M., Das, S., Goscinski, W., et al. (2019a). Call for community ment of standards and best practices. https://doi.org/10.5281/ review of NeuroML — A Model Description Language for zenodo.2535741. Computational Neuroscience [version 1; not peer reviewed]. Standards and Best Practices Committee. (2019b). International F1000Research, 8:75 (document). https://doi.org/10.7490/ neuroinformatics coordinating facility vetting and endorsement pro- f1000research.1116398.1. cess for standards and best practices. https://doi.org/10.5281/ Martone, M., Das, S., Goscinski, W., et al. (2019b). Call for community zenodo.2535784. review of PyNN — A simulator-independent language for building Wikipedia Contributors. (2018a). “Best Practice.” Wikipedia, The Free neuronal network models [version 1; not peer reviewed]. Encyclopedia. November 1, 2018. https://en.wikipedia.org/w/index. F1000Research 8:74 (document). https://doi.org/10.7490/ php?title=Best_practice&oldid=866773529. f1000research.1116399.1. Wikipedia Contributors (2018b). Standards organization. Wikipedia, The Martone, M., Goscinski, W., Das, S., Yamaguchi, Y., Ho, E.T.W., Free Encyclopedia. November 14, 2018. https://en.wikipedia.org/ Leergaard, T., Hellgren-Kotaleski, J., Wachtler, T., Kennedy, D., w/index.php?title=Standards_organization&oldid=868762976. & Abrams, M., (2018). Call for Community Review of the brain Wilkinson, M. D., Michel, D., Aalbersberg, I. J. J., Appleton, G., Axton, imaging data structure – a standard for organizing and describing M., Baak, A., Blomberg, N., et al. (2016). The FAIR guiding prin- MRI Data Sets. F1000Research 7 (August). https://doi.org/10.7490/ ciples for scientific data management and stewardship. Scientific f1000research.1115998.1. Data 3 (March), 160018. Mons, B., Neylon, C., Velterop, J., Dumontier, M., da Silva Santos, L. O. B., & Wilkinson, M. D. (2017). Cloudy, increasingly FAIR; revisiting the FAIR data guiding principles for the European Open Publisher’sNote Springer Nature remains neutral with regard to jurisdic- Science cloud. Information Services & Use, 37(1), 49–56. tional claims in published maps and institutional affiliations. Affiliations 1 2 3 4 5,6 Mathew Birdsall Abrams & Jan G. Bjaalie & Samir Das & Gary F. Egan & Satrajit S. Ghosh & 7 8 9 10 Wojtek J. Goscinski & Jeffrey S. Grethe & Jeanette Hellgren Kotaleski & Eric Tatt Wei Ho & 11 12 2 13 14 David N. Kennedy & Linda J. Lanyon & Trygve B. Leergaard & Helen S. Mayberg & Luciano Milanesi & 15 16 17 18 19 Roman Mouček & J. B. Poline & Prasun K. Roy & Stephen C. Strother & Tong Boon Tang & 20 21 22 8 Paul Tiesinga & Thomas Wachtler & Daniel K. Wójcik & Maryann E. Martone 1 4 INCF Secretariat, Karolinska Institutet, Stockholm, Sweden Monash Biomedical Imaging, Monash University, Clayton, VIC, Australia Institute of Basic Medical Sciences, University of Oslo, Oslo, Norway McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA McGill Centre for Integrative Neuroscience, McGill University, Montreal, QC, Canada Department of Otolaryngology - Head and Neck Surgery Harvard Medical School Boston, Boston, MA, USA 36 Neuroinform (2022) 20:25–36 7 16 Monash eResearch Centre, Monash University, Melbourne, VIC, Montreal Neurological Institute, Faculty of Medicine and Health Australia Sciences, McGill University, Montreal, Canada 8 17 Department of Neuroscience, School of Medicine, University of Computational Neuroscience & Neuroimaging Laboratory, School California, San Diego, La Jolla, CA, USA of Bio-Medical Engineering, Indian Institute of Technology (BHU), Varanasi, UP, India KTH Royal Institute of Technology, School of Electrical Engineering and Computer Science, Stockholm, Sweden Rotman Research Institute, Baycrest Centre, Department of Medical Biophysics, University of Toronto, Ontario, ON, Canada Centre for Intelligent Signal and Imaging Research, Institute of Health and Analytics, Universiti Teknologi PETRONAS, Centre for Intelligent Signal and Imaging Research, Institute of Perak, Malaysia Health and Analytics, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, Malaysia Department of Psychiatry, University of Massachusetts Medical School, Worchester, MA, USA Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands Serendipitea.World, Hasselby, Sweden Department of Biology II, Ludwig-Maximilians-Universität Nash Family Center for Advanced Circuit Therapeutics, Icahn München, Martinsried, Planegg, Germany School of Medicine, New York, NY, USA Laboratory of Neuroinformatics, Nencki Institute of Experimental Institute of Biomedical Technologies, National Research Council Biology of Polish Academy of Sciences, Warsaw, Poland (CNR), Milan, Italy Department of Computer Science and Engineering, Faculty of Applied Sciences, University of West Bohemia, Pilsen, Czech Republic http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Neuroinformatics Springer Journals

Loading next page...
 
/lp/springer-journals/a-standards-organization-for-open-and-fair-neuroscience-the-cTb67HCWJW
Publisher
Springer Journals
Copyright
Copyright © The Author(s) 2021. corrected publication 2021
ISSN
1539-2791
eISSN
1559-0089
DOI
10.1007/s12021-020-09509-0
Publisher site
See Article on Publisher Site

Abstract

There is great need for coordination around standards and best practices in neuroscience to support efforts to make neuroscience a data-centric discipline. Major brain initiatives launched around the world are poised to generate huge stores of neuroscience data. At the same time, neuroscience, like many domains in biomedicine, is confronting the issues of transparency, rigor, and reproducibility. Widely used, validated standards and best practices are key to addressing the challenges in both big and small data science, as they are essential for integrating diverse data and for developing a robust, effective, and sustainable infrastructure to support open and reproducible neuroscience. However, developing community standards and gaining their adoption is difficult. The current landscape is characterized both by a lack of robust, validated standards and a plethora of overlapping, underdeveloped, untested and underutilized standards and best practices. The International Neuroinformatics Coordinating Facility (INCF), an independent organization dedicated to promoting data sharing through the coordination of infrastructure and standards, has recently implemented a formal procedure for evaluating and endorsing community standards and best practices in support of the FAIR principles. By formally serving as a standards organization dedicated to open and FAIR neuroscience, INCF helps evaluate, promulgate, and coordinate standards and best practices across neuroscience. Here, we provide an overview of the process and discuss how neuroscience can benefit from having a dedicated standards body. . . . . . . Keywords Neuroinformatics Standards and best practices FAIR principles Standards organization Neuroscience INCF INCF endorsement process Introduction routine data sharing lead to difficulty in relying on published results (Button et al. 2013). With major brain initiatives across Asia, North America, and Common to both the large brain projects and individual Europe committing significant resources to large-scale, multi- investigator led research is the recognition that neuroscience faceted efforts to understand the nervous system, we are likely as a whole needs to converge towards a more open and col- entering a golden age for neuroscience. At the same time, laborative enterprise with neuroscientists around the globe neuroscience, like many domains in biomedicine, is undergo- committed to open sharing of data and tools. The ing a reproducibility crisis, where small, underpowered stud- Declaration of Intent of the International Brain Initiative, an ies, problems in experimental design and analysis, and lack of alliance of large national brain projects, states: “Researchers working on brain initiatives from around the world recognise Dr. Martone has an equity interest in SciCrunch, a tech start up that that they are engaged in an effort so large and complex that provides software services to support research resource identifiers. even with the unprecedented efforts and resources from public * Mathew Birdsall Abrams mathew@incf.org https://www.internationalbraininitiative.org/sites/default/files/declaration- Extended author information available on the last page of the article of-intent-september-2018.pdf 26 Neuroinform (2022) 20:25–36 and private enterprise, no single initiative will be able to tackle et al. 2014)) metadata standards such as minimal information the challenge to fully understand the brain”. models (e.g., COBIDAS, (Nichols et al. 2017)), protocols and Effective resource sharing means not just that data, process- machine-readable “FAIR” vocabularies (e.g., NIFSTD ontol- ing methods, workflows, and tools are made available, but that ogy, (Bug et al. 2008). For neuroscience, with its diverse data theycan bediscovered andaremadeavailableinawaythat types, dynamics and scales, such standards need to include the ensures that published findings can be reproduced. Currently, it necessary information for understanding what areas of the has been estimated that over 80% of the time spent in handling nervous system were studied and from which structures data data goes not to the analysis, but to data preparation: 60% of were acquired under which conditions. time for cleaning and organizing data and 19% of time spent As in many disciplines, standards in neuroscience have collecting datasets (Gil Press 2016); and curation for dataset been developed on an “as needed” basis with many different integration requires more resources than generation of the data starting points. For instance, the Connectivity File Formats (Palsson and Zengler 2010). Of equal importance, in the age of Documentation (cifti) format was developed internally in the machine learning and artificial intelligence, data should be pub- Human Connectome Project as a standard for storing both lished with integration and reuse in mind, so they can be surface and volumetric imaging data, tailored to the specific interpreted in new ways and leveraged so that new knowledge needs of the project. The Neuroimaging Informatics can be extracted (Ferguson et al. 2014). For that to happen, Technology Initiative (Nifti) image format was developed un- neuroscience as a discipline needs to adopt the FAIR principles der the umbrella of the US National Institutes of Health (NIH) (Wilkinson et al. 2016), ensuring that the results of science are which acted as a broker. Adoption of the format was ensured Findable, Accessible, Interoperable and Reusable, to both by involving developers of all the major brain imaging anal- humans and machines. FAIR neuroscience means that neuro- ysis tools and their commitment to implement the standard. scientists world-wide, working in big team projects or individ- Similarly, a joint effort by neurophysiology data acquisition ual laboratories acquire, manage, and share digital resources so systems vendors to define a common format led to the that they can be reliably compared, aggregated, and reused. As neuroshare standard (neuroshare.org); while being seen as neuroscience becomes a FAIR discipline, the grand challenge far fromideal,cifti,Nifti, and the neurosharestandardhave of piecing together a more comprehensive understanding of been in wide use by the community and undoubtedly have nervous system structure and function from multiple data sets enabled re-use of data to an extent that otherwise would not should become more feasible. have been possible. The FAIR principles were formulated in a collective effort Beyond clinical standards such as FHIR, convergence on by several international groups, based on practical experience disease-specific standards for data collection, Common Data of the roadblocks encountered when trying to reuse data, par- Elements (CDEs ), is resulting in some early successes where ticularly public data. The high level principles are summarised data collected across different centers and even countries is com- into a set of 15 attributes that represent best practices for parable. For example, a cross-European study of traumatic brain FAIR. Some recommendations are domain independent, injury, CENTER-TBI has used CDEs and other data collection e.g., proper licenses, use of persistent identifiers. Other rec- standards to integrate data from 21 European countries and 3 ommendations, however, particularly those that address inter- countries beyond Europe (Maas et al. 2017). However, harmo- operability and reusability, delegate the specifics to individual nizing CDEs and other clinical data standards across broader scientific communities, who are required to define the relevant international boundaries remains a challenge, although recent standards and best practices for their specialized data types progress has been made in the form of the guidelines for Data and protocols. So how does neuroscience with its vast number Acquisition, Quality, and Curation for Observational Research of subdisciplines, techniques, data types, and model systems Designs (DAQCORD; Ercole et al. 2020). become a FAIR discipline? Issues in the development and use of standards fall into First, FAIR requires that the necessary infrastructure in the several broad technical and sociological categories. At the form of web-accessible repositories is available to neurosci- forefront is the paradoxical nature of the standards landscape entists for publishing research objects: data, code, and where the availability of too many overlapping standards leads workflows. These repositories should support FAIR and im- to too few being adopted, as a well known cartoon illustrates. plement basics such as persistent identifiers, programmatic It is common in scientific domains, where researchers are access, and clear licenses. Second, neuroscience needs the generally rewarded for novelty, that research funding ends means to define and support “community-relevant” standards up producing multiple potential standards, many of which both for data and metadata. Such standards include common formats (e.g., NifTI; (Cox et al. 2004), file structures (e.g., https://www.hl7.org/fhir/summary.html BIDs, (Gorgolewski et al. 2016)), data elements (Sheehan https://www.commondataelements.ninds.nih.gov/ et al. 2016), markup languages (e.g., odML, NeuroML, www.center-tbi.eu/ NineML (Grewe et al. 2011); (Cannon et al. 2014); (Raikov https://goo.gl/images/KaYDbJ Neuroinform (2022) 20:25–36 27 lack the required documentation, tooling, or community sup- Neuroscience, whether basic, clinical or computational, port for wide adoption and long term sustainability. As an similarly will benefit from having a dedicated standards orga- example in genomics, FAIRsharing.org , a database that nization to help support the ambitious goals of international keeps track of standards for biomedical science, lists 38 brain projects and the needs of individual investigators, in- standards for “gene expression data” of which 24 have a cluding the necessity to formally publish data and tools in an publication associated. Seventeen of these have a maintainer effective manner. The International Neuroinformatics listed, but only three are recommended (by Biomed central, Coordinating Facility (INCF) has been actively working in EMBO, Giga Science, or Scientific data). Only one has all the area of standards and infrastructure for neuroscience over three: publications, a maintainer, and evidence of use. the past decade. Here, we outline how INCF is evolving its The overhead of having to account for multiple standards operations to promote open and FAIR neuroscience across in neuroscience research is very high. With multiple compet- international boundaries. In particular, INCF is taking on a ing standards, those developing tools may need to implement more formal role as a standards organization for neuroscience, and maintain several input/output interfaces or develop format by extending their work in standards to include the evaluation, conversion routines, draining time and money away from coordination, and endorsement of community standards. more critical tasks. For example, Neo, a Python package for Through this process, neuroscientists and big brain projects representing electrophysiology data, provides IO modules for will have uniform, unbiased and independent analysis of neu- ~20 different electrophysiology formats. With poorly docu- roscience standards and best practices, to ensure that standards mented or out of date standards, projects may invest in a are robust, well supported and documented. standard to accommodate immediate needs, only to find that it hasn’t achieved widespread uptake and therefore outputs are not FAIR. INCF as a Standards Organization In areas that benefit from well documented and validated standards, standards organizations or standards bodies play a The International Neuroinformatics Coordinating Facility central role in the adoption and promotion of standards and (INCF) was launched in 2005 as an independent international best practices. Standards organizations like the W3C and organization dedicated to promoting the sharing of neurosci- IEEE have as their primary activity the development, coordi- ence data, data reuse and reproducibility, through the coordi- nation, promulgation, and upkeep of technical standards that nation of infrastructures and standards. Based on recommen- are intended to address the needs of a group of affected dations from the Organisation for Economic Co-operation and adopters (e.g., Web browser developers, hardware developers; Development (OECD), an international agency of over 30 (Wikipedia contributors 2018b). They establish criteria by countries comprising the world’s leading economies, the which standards and best practices can be evaluated and a INCF instituted a national membership model, whereby indi- means for community vetting to ensure that the standard is vidual nations establish a national neuroinformatics Node and needed and functions appropriately. Such criteria include the is represented in INCF governance structures. Since 2016, the availability of proper validation tools and implementations. governance framework has consisted of the Governing Board, Standards efforts in basic science are also propelled by comprising national-level funding representation from those dedicated organizations such as the Research Data Alliance Nodes that financially sustain the organisation (Governing (rd-alliance.org) to provide a substrate whereby communities Nodes), and an additional Council for Training, Science and can come together to define a needed standard, or to provide Infrastructure (CTSI) which comprises scientific and infra- coordination among different standards’ efforts to ensure structural representation from all INCF Nodes (Governing interoperation. For example, the Computational Modeling in and Associate Nodes), as well additional appointed interna- Biology Network (COMBINE ), is an initiative composed of tional experts. The CTSI recommends INCF’s scientific, in- those developing standards and tools for computational frastructural and training direction and appoints specialist sub- modeling, whose goal is to “coordinate the development of committees such as Training & Education, Infrastructure, the various community standards and formats for Standards and Best Practices, and FAIR. A Secretariat based computational models. By doing so, it is expected that the at the Karolinska Institute in Sweden manages the coordina- federated projects will develop a set of interoperable and tion operations of the organization. non-overlapping standards covering all aspects of modeling From 2007 to 2016, INCF operated scientific programs on in biology.” topics requiring coordination and cooperation across national boundaries. Community needs and requirements were defined through topical international scientific workshops. Building on these identified areas, the Governing Board instantiated a https://fairsharing.org http://neuralensemble.org/neo/ 8 9 http://co.mbine.org/ https://www.incf.org/about-us/history/incf-scientific-workshops 28 Neuroinform (2022) 20:25–36 steering committee comprising international experts in the novo development of standards, serving as a broker for field to have oversight of each scientific program. Working standards development across stakeholder groups. with the Secretariat, the steering committee initiated actions However, this earlier INCF model for standards develop- (top-down) which included launching one or more task forces ment was subject to limitations and criticisms. The process to address the issues, develop technical specifications, make was expensive to maintain and often too slow to keep pace recommendations and develop appropriate tools or infrastruc- with the launch of new projects or development of new tech- ture. The INCF task forces each operated for a few years to nologies. It lacked a formal means for evaluation of resulting deliver these technical solutions, many outreaching also to the standards and for community input into the process. Also, it broader international community. had no formal mechanism for promoting and encouraging the Under this model, the INCF yielded a number of suc- use of already existing standards and best practices, nor a cesses, e.g., the Waxholm space atlas interoperability formal governance procedure to help adjudicate among com- framework ((Johnson et al. 2010); (Hawrylycz et al. peting interests. 2011); (Papp et al. 2014)), the neuroimaging data model: The INCF has undergone a significant reorganization over NIDM (Sochat and Nichols 2016) and others listed in the past 4 years to allow it to be more responsive to the needs Table 1. In these initial efforts and early days in of the global neuroscience community and more transparent in neuroinformatics, the INCF focused most heavily on de its operations. Rather than a top down governance model Table 1 A partial list of standards developed by INCF Task Forces or with INCF support. Active/inactive designations indicate whether the code base is being actively developed as of the writing of this manuscript Standard/Best Description INCF Available from: Status Practices contribution WaxholmSpace A coordinate-based reference space for the mapping and Task Force NITRC Active Mouse Atlas registration of neuroanatomical data in the mouse brain. WaxholmSpace An open access volumetric atlas based on high resolution Task Force NITRC Active Rat Atlas MRI and DTI, with Waxholm Space and stereotaxic space defined, shared in ITK-SNAP and MBAT- ready formats. Brain Imaging A standard for organizing neuroimaging and behavioral Meeting BIDS Active Data Structure data support Neurodata A unified, extensible, open-source data format for Task Force NWB.org Active Without cellular-based neurophysiology data (initial work), Borders Meeting support NIX A data model and file format to store annotated scientific Task Force GitHub Active datasets Neuroimaging A collection of specification documents and examples Task Force NIDM-NIDASH.org Active Data Model that describe an extension to the W3C PROV standard for the domain of human brain mapping. NineML A simulator independent language for unambiguous Task Force GitHub Somewhat active; also description of spiking neuronal network models that SpineML, a community-led aims to facilitate model sharing, portability, and extension of NineML, is re-usability. active NeuroML An XML-based description language that provides a Support neuroml.org Active common data format for defining and exchanging descriptions of neuronal cell and network models. Computational A controlled vocabulary of terms used in Computational Task Force BioPortal Not active Neuroscience Neurosciences to describe models of the nervous Ontology system. Ontology for A controlled vocabulary of terms used to describe Task Force GitHub Not active Experimental neurophysiology experiments Neurophysiol- ogy Common A reference framework for classifying general mammal Task Force Terms available Not active Mammalian nervous system structures. from InterLex Upper Brain Ontology Neuroinform (2022) 20:25–36 29 Table 2 Version 1.0 of the INCF endorsement criteria. These criteria were used to evaluate the SBPs indicated in Table 3. For the FAIR criteria, the relevant FAIR principle for each question is provided in the parentheses Area Criteria 1: Open 1.1 Is the SBP covered under an open license so that it is free to implement and reuse by all interested parties (including commercial)? 1.2 What license is used? 1.3 Does the SBP follow open development practices? 1.4 Where and how are the code/documents managed? 2: FAIR 2.1 SBP uses/permits persistent identifiers where appropriate (F1) 2.2 SBP allows addition of rich metadata to research objects (F2) 2.3 SBP uses/permits addition of appropriate PIDs to metadata (F3) 2.4 The protocol allows for an authentication and authorization when required (A1.2) 2.5 SBP uses or allows the use of vocabularies that follow the FAIR principles (I2) 2.6 SBP includes/allows qualified links to other identifiers (I3) 2.7 Does the standard interoperate with other relevant standards in the same domain? (I) 2.8 Does the SBP provide citation metadata so its use can be documented and tracked? (R1.2) 3: Testing and 3.1 Does the SBP have a reference implementation? implementation 3.2 What tools are available for the SBP? 3.3 Are the tools and implementations covered under an open source license? 3.4 What is your assessment of the quality of the code/document? 4: Governance 4.1 Does the SBP have a clear description of how decisions regarding its development are made? 4.2 Is the governing model document for maintenance and updates compatible with the INCF project governing model document and the open standards principles? 4.3 Is the SBP actively supported by the community? If so, what is the evidence? 4.4 Does the SBP provide tools for community feedback and support? 5: Adoption and use 5.1 Is there evidence of community use beyond the group that developed the SBP? 5.2 Please provide some concrete examples of use, e.g., publications where the use of the SBP is cited; databases or other projects that have adopted the SBP 5.3 Is there evidence of international use? 6: Stability and support 6.1 Does the SBP have a clear description of who is maintaining the SBP and 6.2 How is it currently supported? 6.3 What is the plan for long term support? 6.4 Are training and other supporting materials available? 7: Comparison 7.1 Are there other similar SBP’savailable? 7.2 If yes, how do they compare on key INCF criteria? https://space.incf.org/index.php/s/Ypig2tfHOU4no8C#pdfviewer where a steering committee sets priorities, INCF adopted suc- considered and endorsed. The process includes a pathway for cessful models from other community organizations like both community nomination and committee invited submis- FORCE11 (www.force11.org) and the Research Data sions of SBPs spanning data collection to publication, evalu- Alliance (RDA; www.rd-alliance.org/)toincrease ation against a consistent set of criteria, and active solicitation community participation and a sense of ownership over the of community feedback. An important change for INCF is that process. INCF has launched a new system of community- these standards and best practices need not have been devel- driven scientific interest groups, where groups of neuroscien- oped by INCF sanctioned groups or even be specific to neu- tists can come together to work on an issue of particular inter- roscience. Indeed, one of the goals is to ensure that neurosci- est in the area of neuroinformatics. Oversight and guidance is ence can benefit from the work that has gone on in other provided by the CTSI with its international scientific repre- biomedical or scientific domains around FAIR data. For ex- sentation from INCF member Nodes and external expertise. ample, INCF may choose to endorse standards such as the As part of this reorganization, INCF has developed a for- ORCID, the unique identifier for researchers, or the FAIR mal and community-focused process whereby standards are principles themselves. In this way, INCF can promote 30 Neuroinform (2022) 20:25–36 Fig. 1 A schematic representation of the INCF SBP submission, review and endorsement process initiatives emerging in different scientific domains that bring reference their data to a standard brain atlas when reporting neuroscience data into alignment with widely accepted stan- on location. dards and principles. This approach also allows INCF to fulfill A call went out in spring of 2018 for nominations of SBPs its coordinating role by offering sets of endorsed practices, from the community and a standing committee was formed to and to select, prioritize, and possibly stimulate further devel- establish the necessary procedures and infrastructure for re- opment and convergence of overlapping standards. As an in- view and voting. The SBP Committee operates under the aus- dependent organization with broad international reach and pices of the CTSI and is composed of a representative from neuroinformatics expertise, INCF is uniquely positioned and each of the INCF Governing Nodes, and members from two experienced to act as a standards endorsing authority for of the Associate Nodes (currently the US and Germany). neuroscience. Since 2019, a more formal procedure for committee member- ship has been implemented to ensure broad community par- ticipation in the process. As a first step, the SBP committee established a more de- The INCF Standards and Best Practices tailed set of criteria for evaluation based on seven key areas: Endorsement Process 1. Open: Is the SBP open according to the Open Through a series of community meetings and interactions with Definition and does it follow open development representatives from national standards organizations like the practices? Standard and Industrial Research Institute of Malaysia 2. FAIR: Considers the SBP from the point of view of rel- (SIRIM) and the US National Information Standards evant FAIR criteria (Wilkinson et al. 2016). Is the SBP Organization (NISO), the CTSI developed a set of criteria itself FAIR? Does it result in the production of FAIR and an initial process for evaluating standards and best prac- research objects? Some of these criteria may not apply tices (SBPs) against criteria that support open and FAIR neu- in all cases. roscience (Table 2). The term “best practices” was added in 3. Testing and implementation:Is the SBP supported by recognition that many of the requirements for open and FAIR appropriate software, that is open, well designed, imple- neuroscience may not involve an actual technical standard, mented, validated, documented and available for use? such as a file format. Rather best practices involve practices 4. Governance: Does the SBP have a governance structure that are accepted as producing better results than those that makes it clear how decisions are made and how griev- achieved by other means (Wikipedia contributors 2018a), ances are handled? and that should become standard operating procedure for ex- 5. Adoption and use: The SBP must have substantive evi- perimental neuroscience, e.g., making sure that researchers dence of use outside of the group or individual that https://opendefinition.org/od/2.1/en/ Neuroinform (2022) 20:25–36 31 Table 3 SBP’s that have been submitted for consideration for INCF endorsement and their status as of 12/19/2020 Standard or Best Practice Description Date Endorsement Status Similar Standards Nominated and by whom Neurodata without Borders: Neurophysiology (NWB:N). A 3/8/2018 by Endorsed (Martone et al. 2020a) on 4/3/2020 NIX/odML BIDS unified, extensible, open-source data format for cellular-based Ben EEG extension neurophysiology data Dichter The FAIR Data Principles. A set of guiding principles to make 3/8/2018 by In pipeline data and metadata Findable, Accessible, Interoperable, and Jeffrey Reusable Grethe NeuroML. An XML-based description language that provides a 3/20/2018 by Endorsed (Martone et al. 2019b)on PyNN common data format for defining and exchanging descriptions Padraig 3/20/2019 NineML of neuronal cell and network models. Gleeson SpineML Brain Imaging Data Structure (BIDS). Astandard for 4/15/2018 by Endorsed (Martone et al. 2018) on 11/1/2018 OpenfMRI schema organizing neuroimaging and behavioral data Chris NIDM Experiment Gorgolew- EEG Study Schema ski XCEDE NeuroImaging Data Model (NIDM)-Results. A standard that 4/17/2018 by Identified as a candidate standard, but not An extension of BIDS provides a representation of mass univariate neuroimaging Camille ready for endorsement after community currently analysis results, unified across analysis software packages Maumet review on 11/9/2020 underdevelopment PyNN. A simulator-independent language for building neuronal 4/17/2018 by Endorsed (Martone et al. 2019b)on NeuroML network models Andrew 3/20/2019 SpineML Davison NineML Neo. Python objects for neurophysiology data that could serve as 4/17/2018 by In progress SpikeInterface a common object model for neurophysiology. Andrew NiBabel Davison open metadata mark-up language (odML). A standard 4/17/2018 by In progress BIDS-EEG metadata format for data annotation in electrophysiology Thomas Wachtler Neuroscience information Exchange (NIX). A data model and 4/17/2018 by Endorsed (Martone et al. 2020b)on NEO file format to store annotated scientific datasets Thomas 11/9/2020 NWB:N Wachtler NSDF (Neuroscience Simulation Data Format) develops and maintains it. Because INCF is an interna- community nomination or in response to an invitation from tional organization, evidence of international use is a the committee to submit an SBP. From the first SBP nomina- requirement. tions, BIDS (the Brain Imaging Data Structure; http://bids. 6. Stability and support: Who is actively maintaining and org), a standard for organizing and naming files generated supporting the SBP and what are the plans for long term during a neuroimaging experiment, was chosen as the initial sustainability? test case. The current procedure is shown schematically in 7. Comparison with other SBP’s: Competing standards Fig. 1 and comprises the following steps: add extra burden to the community. The INCF seeks to endorse only a single standard per area, unless the sug- 1. SBP is received by the INCF through an on-line submis- gested approach is complementary as further discussed sion form. SBP submissions are received as the result of below. direct submission, in response to a broad call for submis- sions, or in response to direct invitation from the Under each of these areas, a set of questions were devel- committee. oped to aid reviewers in evaluating how well an SBP com- 2. If the SBP is determined to be in scope, the developer/ plied with each criteria. Version 1 of the review criteria steward of the SBP is contacted and asked to provide (Standards and Best Practices Committee 2019a) are shown some details about the SBP according to the criteria in Table 2. outlined in Table 2. Once the criteria were established, the committee devel- 3. The Committee assigns 2–3 reviewers, committee mem- oped a basic procedure for the evaluation, starting with bers or external experts, to review the materials and 32 Neuroinform (2022) 20:25–36 conduct an independent analysis. Reviewers should the SBP. Any work performed by INCF-supported groups have no conflicts of interest that would preclude an im- will be subjected to the same type of rigorous review as partial analysis of the SBP. outside SBP’s to achieve INCF endorsement. We expect 4. After initial review, the full committee votes on whether the INCF endorsement process to further evolve over time to accept the SBP for consideration or to reject it. to confront the challenges inherent in a dynamic and dis- 5. If accepted, a write up of the SBP is prepared and posted tributed research landscape. Some of the known chal- for community input. For BIDS, the text was posted on lenges involve establishing open and transparent gover- the INCF’s F1000 channel (Martone et al. 2018)and on nance for the endorsement process that recognizes and Google Docs. seeks to balance the competing needs of different stake- 6. Feedback is solicited through announcements via the holder groups. Another key issue is the extension and INCF and the Node Network’s social and media chan- evolution of SBPs over time. nels. The comment period is 60 days from posting. 7. After the commenting period, the reviewers review the Governance feedback and decide whether the comments require fur- ther review. The INCF SBP committee operates in a transparent manner 8. Once the review is complete, the committee votes on and seeks to avoid at all times any type of bias or appearance whether to endorse the SBP. of bias. The process should be fair to those who are develop- 9. If endorsed, the stewards/authors are allowed to display ing SBP’s, but also in the best interests of the broader neuro- the “Endorsed by INCF logo” on their website. science community that we seek to serve. Although the pro- 10. Endorsed standards are displayed on the INCF website cess is still being refined, it was designed to be open, collegial, and actively promulgated through INCF training and transparent. Reviewers are not anonymous and are re- activities. quired to clearly state whether they have a conflict of interest. 11. Endorsed standards are re-evaluated every 2 years to Committee members with conflicts do not participate in the ensure that they are still relevant or need to be replaced. reviewing or voting process. At each step—preparation of review documents, posting of the review for community feed- back, and post-feedback synthesis—reviewers are encouraged As of this writing, INCF has completed the reviews of 6 to contact the SBP provider for additional information and to standards, endorsed 5, and is in the process of reviewing an provide feedback on issues that might be addressable, e.g., additional 2 submitted standards (Table 3). We are using this indicating a clear license on their website, providing a clear initial round of submissions to develop and test the review description of their governance procedures, making sure that process, including both the criteria used and the governance help materials are easy to find. The SBP committee strives at of the process itself, e.g., how does the SBP committee handle all times to reach consensus among the members, the provider conflicts of interests within the committee. and the broader community. As in any human endeavor, con- INCF is also developing additional materials and tools flicts may arise when seeking to balance the interests of all to help the neuroscience community identify and use ap- parties. The committee therefore felt it important to document propriate standards, e.g., a catalog to navigate and assess formal procedures for dealing with any issues that might arise relevance of endorsed SBP’s for their work, and training (Standards and Best Practices Committee 2019b). materials and workshops designed to guide neuroscien- tists and tool developers in their use. To fulfill its coordi- Competing Standards and Best Practices nating role, those working on SBP’s ranging from data collection to publication can request support to form a The SBP process was initiated to help those who need to use working group to develop a standard in an area in need SBP’s in neuroscience to navigate the current options and to of standardization and address issues such as extension of promote interoperability among neuroscience tools. One issue endorsed standards to cover different domains and harmo- that must be addressed carefully is the issue of competing nization of existing standards. INCF actively solicits input standards. Competing SBP’s should ideally be identified dur- from the community on areas in neuroscience in need of ing the review process, either by the submitter, the review standardization through its thematic workshops and a sub- committee, or during the period of community comment. mission form on the INCF website where community When competing SBP’s are identified, the committee deter- members can recommend an area in neuroscience in need mines whether having competing standards in a domain will of standardization (e.g. methods standardization) whether be a significant impediment to further progress or if the field they are willing to work on it or not; under this frame- can support multiple standards without negative conse- work, INCF hosts thematic workshops to determine re- quences. For example, during the reviews of PyNN and quirements and supports working groups to develop to NeuroML, both standards for sharing computational models, Neuroinform (2022) 20:25–36 33 the committee deemed that the field could support multiple into the hands of the researcher that can propel discovery standards without negative consequences; so they are viewed science. When a well defined standard becomes widely ac- as complementary rather than competing, in that they are op- cepted, it provides the necessary uniformity and stability to timized for different conditions(Gleeson and Davison 2020). reduce the overhead of tool development and to promote in- During the review of NWB:N 2.0, a standard for neurophys- teroperability among tools so that researchers have a more iology data, the committee determined that it overlapped with powerful tool arsenal at their disposal. For example, well de- other standards for neurophysiology data, NIX and fined API’s can pass metadata and data between tools to avoid BIDS:EEG, and recommended that groups form an INCF extra steps and so that provenance is maintained. A simple working group so that they remain up to date on each groups’ example is using ORCIDs for account management. As neu- efforts and work towards interoperability. When the commit- roscience adopts ORCIDs, users should be able to log into a tee determines that having competing standards constitutes a resource like a data repository with their ORCIDs. The repos- significant impediment to further progress in the field, the itory can automatically extract required details, e.g., affilia- committee will invite the maintainers of the competing stan- tions, emails, from the ORCID database. At the same time, dards form a working group through INCF to work towards the repository can push information about data sets deposited harmonization of the competing standards. by that researcher into their ORCID profile, much as ORCID is currently linked to databases such as PubMed. On the data side, we often hear that “Data is the new oil”.But Evolution of Evaluation Criteria the extended metaphor goes on to state that “It’s valuable, but if unrefined it cannot really be used.” (Rotella 2012). We expect that our understanding of what constitutes an effec- Operationalizing FAIR for neuroscience is one of the key ways tive standard will evolve as neuroscience continues to move to ensure that data produced by the neuroscience community can towards collaborative, open, and FAIR-neuroscience. Indeed, be put to work, and community standards are essential for FAIR. there is an active effort in many domains to develop metrics for While it is too early to measure the impact of the INCF endorse- how to interpret FAIR (e.g., (Mons et al. 2017). Therefore, the ment process on community adoption, standards developed by SBP criteria themselves should have a clearly documented and the INCF network are having an impact on data quality and community-based process for extension and updates. interoperability. For example, BIDS, the first standard endorsed The criteria listed in Table 2 were used for the reviews com- by INCF, has a community of 136 credited contributors (22 pleted and underway (Table 3). However, not surprisingly, during female, as of October 3, 2020), with ~10,000 users visiting the the preparation of this manuscript, omissions were noted and website, and ~ 7000 users exploring the BIDS Specification, over modifications suggested. For example, Version 1 of the review the past 6 months. Over 404 journal articles have cited BIDS or any of its extensions. Currently, 10 reported centers, institutes criteria did not explicitly include extensibility as a criterion. What happens when new data types, hardware, tool, technology, or use- and databases around the world that have implemented BIDS case are introduced, as neuroscience evolves? It is common prac- as their organizational structure. Furthermore, INCF has served tice, given the diverse use cases and experimental landscape of as a convener of the standards developers and the large-scale neuroscience,totakeanexistingstandardandextendormodifyit brain initiatives which has resulted in harmonization/ for other use cases. BIDS, for example, has over 23 proposals for interoperability of the ontologies and metadata standards adopted creating extensions to the core specification. The INCF and the by HBP and BRAIN Initiative infrastructure projects. More and SBP process are in a good position to provide a community-wide more funders and journals are requiring that individual re- platform for discussions and consensus building about when a searchers publish their data so that it can be inspected and reused. new standard is necessary vs extending an existing one. We are starting to see good examples where pooling of smaller data sets leads to better powered studies and more reliable results (Ferguson et al. 2013;Lefebvreetal. 2015). Such studies suggest How Does the SBP Endorsement Process Help that publishing FAIR data will be of equal importance to pub- Neuroscience? lishing articles about findings derived from these data. Today, INCF is well positioned to assume the role of a stan- Why should an individual neuroscientist care? The adoption dards organization for neuroscience. Originally formed in 2005 of clear and robust standards should also lead to a dramatic to help neuroscientists to coordinate data and computational ac- increase in the number, quality, interoperability and sustain- tivities across international borders, INCF facilitated global co- ability of tools and infrastructures. Our current model of operation for brain science in the very early days of funding tools and infrastructures through research grants leads neuroinformatics. The landscape has changed dramatically, as to a lot of innovative ideas, but often less than useful or in- has the push towards open and FAIR neuroscience with INCF complete implementations. They advance the field of actively internalizing and adapting to those changes. As such, INCF has implemented a model for community standards neuroinformatics, but they don’t always deliver working tools 34 Neuroinform (2022) 20:25–36 Open Access This article is licensed under a Creative Commons development and adoption that empowers the broader neurosci- Attribution 4.0 International License, which permits use, sharing, adap- ence community to develop, evaluate, and endorse standards. tation, distribution and reproduction in any medium or format, as long as Three important policies have been implemented to accomplish you give appropriate credit to the original author(s) and the source, pro- these goals: 1. SBP’s need not have been developed by INCF vide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included working groups to be considered, 2. the endorsement process in the article's Creative Commons licence, unless indicated otherwise in a includes community feedback, and 3. INCF does not just list credit line to the material. If material is not included in the article's SBP’s but actively evaluates them and works with standards Creative Commons licence and your intended use is not permitted by providers to improve them when possible. The endorsement pro- statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this cess is part of INCF’s strategy to develop a FAIR roadmap for licence, visit http://creativecommons.org/licenses/by/4.0/. neuroscience that provides researchers, infrastructure providers, tool developers, publishers, and funders with practical solutions for implementing the FAIR Principles in neuroscience. In addi- tion to the endorsement process, the strategy also includes: 1. a portfolio of INCF endorsed SBPs that provides guidance on the References appropriate use, implementation, and links to tutorials and tools/ infrastructure that have implemented the SBPs, 2. Training and Bug, W. J., Ascoli, G. A., Grethe, J. S., Gupta, A., Fennema-Notestine, dissemination activities to promote community adoption, 3. a C., Laird, A. R., Larson, S. D., et al. (2008). The NIFSTD and BIRNLex vocabularies: Building comprehensive ontologies for framework to identify areas in need of standardization, and 4. a neuroscience. Neuroinformatics, 6(3), 175–194. framework for developing, extending, and harmonizing existing Button, K. S., Ioannidis, J. P. A., Mokrysz, C., Nosek, B. A., Flint, J., community standards. Robinson, E. S. J., & Munafò, M. R. (2013). Power failure: why Thus, INCF can serve as a neutral broker and coordination small sample size undermines the reliability of neuroscience. Nature center on behalf of the wider neuroscience community to help Reviews Neuroscience, 14(5), 365–376. Cannon, R. C., Gleeson, P., Crook, S., Ganapathy, G., Marin, B., Piasini, coordinate and disseminate SBPs relevant for neuroscience. An E., & Angus Silver, R. (2014). LEMS: A language for expressing INCF endorsement seal means that researchers, project man- complex biological models in concise and hierarchical form and its agers, developers and funders can be confident in their choices. use in underpinning NeuroML 2. Frontiers in Neuroinformatics, The community building experience and expertise with identify- 8(September), 79. ing and evaluating standards available in the INCF network also Cox, R. W., Ashburner, J., Breman, H., Fissell, K., Haselgrove, C., Holmes, C. J., Lancaster, J. L., et al. (2004). A (sort of) new image provides important expertise for those who are new to the prac- data format standard: Nifti-1: We 150. NeuroImage, e1440,22. tices of collaborative, open and FAIR neuroscience. As the pro- Ercole, A., Brinck, V., George, P., Hicks, R., Huijben, J., Jarrett, M., cess becomes better established, INCF can also provide a conduit Vassar, M., Wilson, L., & the DAQCORD Collaborators. (2020). for neuroscience-specific specifications to make their way into Guidelines for data acquisition, quality and Curation for observa- tional research designs (DAQCORD). Journal of Clinical and national and international standards organizations, to promote Translational Science, 4(4), 354–359. https://doi.org/10.1017/cts. deployment in instruments and other commercial products 2020.24. supporting science. The training component of INCF will in- Ferguson, A. R., Irvine, K.-A., Gensel, J. C., Nielson, J. L., Lin, A., Ly, J., creasingly engage in training the communities to the use of the Segal, M. R., Ratan, R. R., Bresnahan, J. C., & Beattie, M. S. (2013). endorsed standards. Derivation of multivariate Syndromic outcome metrics for consis- tent testing across multiple models of cervical spinal cord injury in We encourage the neuroscience community to utilize the rats. PLoS One, 8(3), e59712. INCF network and expertise in identifying and evaluating Ferguson, A. R., Nielson, J. L., Cragin, M. H., Bandrowski, A. E., & additional standards, and to actively participate in this process Martone, M. E. (2014). Big data from small data: Data-sharing in the through proposing SBP’s, providing feedback and joining or ‘long tail’ of neuroscience. Nature Neuroscience, 17(11), 1442– initiating INCF special interest groups (visit: https://www. Gil Press. (2016). Cleaning big data: Most time-consuming, least enjoy- incf.org/). As the amount of neuroscience data continues to able data science task, survey says. Forbes Magazine, March 23, grow, knowing how to make them open, FAIR and citable is 2016. https://www.forbes.com/sites/gilpress/2016/03/23/data- an important skill and requirement to propel neuroscientific preparation-most-time-consuming-least-enjoyable-data-science- task-survey-says/. discovery in the twenty-first century. Gleeson, P., & Davison, A. (2020). Relationship between NeuroML and PyNN. F1000 Research, 621(document), 9. Funding JBP was partially funded by the National Institutes of Health Gorgolewski, K. J., Auer, T., Calhoun, V. D., Cameron Craddock, R., (NIH) NIH-NIBIB P41 EB019936 (ReproNim) NIH-NIMH R01 Das, S., Duff, E. P., Flandin, G., et al. (2016). The brain imaging MH083320 (CANDIShare) and NIH RF1 MH120021 (NIDM), the data structure, a format for organizing and describing outputs of National Institute Of Mental Health under Award Number neuroimaging experiments. Scientific Data, 3, 160044(June). R01MH096906 (Neurosynth), as well as the Canada First Research Grewe, J., Wachtler, T., & Benda, J. (2011). A bottom-up approach to Excellence Fund, awarded to McGill University for the Healthy Brains data annotation in neurophysiology. Frontiers in Neuroinformatics, for Healthy Lives initiative and the Brain Canada Foundation with sup- 5(January), 16. port from Health Canada. Neuroinform (2022) 20:25–36 35 Hawrylycz, M., Baldock, R. A., Burger, A., Hashikawa, T., Johnson, G. Nichols, T. E., Das, S., Eickhoff, S. B., Evans, A. C., Glatard, T., Hanke, M., Kriegeskorte, N., Milham, M. P., Poldrack, R. A., Poline, J.-B., A., Martone, M. E., Ng, L., et al. (2011). Digital Atlasing and stan- dardization in the mouse brain. PLoS Comput Biol, 7(2), e1001065. Proal, E., Thirion, B., van Essen, D. C., White, T., & Yeo, B. T. T. (2017). Best practices in data analysis and sharing in neuroimaging Johnson, G. A., Badea, A., Brandenburg, J., Cofer, G., Fubara, B., Liu, S., using MRI. Nature Neuroscience, 20(3), 299–303. & Nissanov, J. (2010). Waxholm space: An image-based reference Palsson, B., & Zengler, K. (2010). The challenges of integrating multi- for coordinating mouse brain research. NeuroImage, 53(2), 365– omic data sets. Nature Chemical Biology, 6,787–789. Papp, E. A., Leergaard, T. B., Evan, C., Allan Johnson, G., & Bjaalie, J. Lefebvre, A., Beggiato, A., Bourgeron, T., & Toro, R. (2015). G. (2014). Waxholm space atlas of the Sprague Dawley rat brain. Neuroanatomical diversity of Corpus callosum and brain volume NeuroImage, 97(August), 374–386. in autism: Meta-analysis, analysis of the autism brain imaging data Raikov, I., Kumar, S. S., Torben-Nielsen, B., & De Schutter, E. (2014). A exchange project, and simulation. Biological Psychiatry, 78(2), NineML-based domain-specific language for computational explo- 126–134. ration of connectivity in the cerebellar granular layer. BMC Maas, A. I. R., Menon, D. K., David Adelson, P., Andelic, N., Bell, M. J., Neurosci, 15(1), P176. Belli, A., Bragge, P., et al. (2017). Traumatic brain injury: Integrated Rotella, P., (2012). Is data the new oil? Forbes. April 2. https://ana.blogs. approaches to improve prevention, clinical care, and research. com/maestros/2006/11/data_is_the_new.html. Lancet Neurol, 16(12), 987–1048. Sheehan, J., Hirschfeld, S., Foster, E., Ghitza, U., Goetz, K., Karpinski, J., Martone, M., Gerkin, R., Moucek, R., et al. (2020a). NIX- Neuroscience Lang, L., Moser, R. P., Odenkirchen, J., Reeves, D., Rubinstein, Y., Information Exchange Format [version 1; not peer reviewed]. Werner, E., & Huerta, M. (2016). Improving the value of clinical F1000Research 2020, 9:358 (document). research through the use of common data elements. Clinical Trials, Martone, M., Gerkin, R., Moucek, R., et al. (2020b). Call for community 13(6), 671–676. review of Neurodata Without Borders: Neurophysiology (NWB:N) Sochat, V., & Nichols, B. N. (2016). The Neuroimaging Data Model 2.0–a data standard for neurophysiology [version 1; not peer (NIDM) API. GigaScience, 5(1), 23–24. reviewed].” F1000Research 8:1731 (document). https://doi.org/10. Standards and Best Practices Committee. (2019a). International 7490/f1000research.1117538.1. neuroinformatics coordinating facility review criteria for endorse- Martone, M., Das, S., Goscinski, W., et al. (2019a). Call for community ment of standards and best practices. https://doi.org/10.5281/ review of NeuroML — A Model Description Language for zenodo.2535741. Computational Neuroscience [version 1; not peer reviewed]. Standards and Best Practices Committee. (2019b). International F1000Research, 8:75 (document). https://doi.org/10.7490/ neuroinformatics coordinating facility vetting and endorsement pro- f1000research.1116398.1. cess for standards and best practices. https://doi.org/10.5281/ Martone, M., Das, S., Goscinski, W., et al. (2019b). Call for community zenodo.2535784. review of PyNN — A simulator-independent language for building Wikipedia Contributors. (2018a). “Best Practice.” Wikipedia, The Free neuronal network models [version 1; not peer reviewed]. Encyclopedia. November 1, 2018. https://en.wikipedia.org/w/index. F1000Research 8:74 (document). https://doi.org/10.7490/ php?title=Best_practice&oldid=866773529. f1000research.1116399.1. Wikipedia Contributors (2018b). Standards organization. Wikipedia, The Martone, M., Goscinski, W., Das, S., Yamaguchi, Y., Ho, E.T.W., Free Encyclopedia. November 14, 2018. https://en.wikipedia.org/ Leergaard, T., Hellgren-Kotaleski, J., Wachtler, T., Kennedy, D., w/index.php?title=Standards_organization&oldid=868762976. & Abrams, M., (2018). Call for Community Review of the brain Wilkinson, M. D., Michel, D., Aalbersberg, I. J. J., Appleton, G., Axton, imaging data structure – a standard for organizing and describing M., Baak, A., Blomberg, N., et al. (2016). The FAIR guiding prin- MRI Data Sets. F1000Research 7 (August). https://doi.org/10.7490/ ciples for scientific data management and stewardship. Scientific f1000research.1115998.1. Data 3 (March), 160018. Mons, B., Neylon, C., Velterop, J., Dumontier, M., da Silva Santos, L. O. B., & Wilkinson, M. D. (2017). Cloudy, increasingly FAIR; revisiting the FAIR data guiding principles for the European Open Publisher’sNote Springer Nature remains neutral with regard to jurisdic- Science cloud. Information Services & Use, 37(1), 49–56. tional claims in published maps and institutional affiliations. Affiliations 1 2 3 4 5,6 Mathew Birdsall Abrams & Jan G. Bjaalie & Samir Das & Gary F. Egan & Satrajit S. Ghosh & 7 8 9 10 Wojtek J. Goscinski & Jeffrey S. Grethe & Jeanette Hellgren Kotaleski & Eric Tatt Wei Ho & 11 12 2 13 14 David N. Kennedy & Linda J. Lanyon & Trygve B. Leergaard & Helen S. Mayberg & Luciano Milanesi & 15 16 17 18 19 Roman Mouček & J. B. Poline & Prasun K. Roy & Stephen C. Strother & Tong Boon Tang & 20 21 22 8 Paul Tiesinga & Thomas Wachtler & Daniel K. Wójcik & Maryann E. Martone 1 4 INCF Secretariat, Karolinska Institutet, Stockholm, Sweden Monash Biomedical Imaging, Monash University, Clayton, VIC, Australia Institute of Basic Medical Sciences, University of Oslo, Oslo, Norway McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA McGill Centre for Integrative Neuroscience, McGill University, Montreal, QC, Canada Department of Otolaryngology - Head and Neck Surgery Harvard Medical School Boston, Boston, MA, USA 36 Neuroinform (2022) 20:25–36 7 16 Monash eResearch Centre, Monash University, Melbourne, VIC, Montreal Neurological Institute, Faculty of Medicine and Health Australia Sciences, McGill University, Montreal, Canada 8 17 Department of Neuroscience, School of Medicine, University of Computational Neuroscience & Neuroimaging Laboratory, School California, San Diego, La Jolla, CA, USA of Bio-Medical Engineering, Indian Institute of Technology (BHU), Varanasi, UP, India KTH Royal Institute of Technology, School of Electrical Engineering and Computer Science, Stockholm, Sweden Rotman Research Institute, Baycrest Centre, Department of Medical Biophysics, University of Toronto, Ontario, ON, Canada Centre for Intelligent Signal and Imaging Research, Institute of Health and Analytics, Universiti Teknologi PETRONAS, Centre for Intelligent Signal and Imaging Research, Institute of Perak, Malaysia Health and Analytics, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, Malaysia Department of Psychiatry, University of Massachusetts Medical School, Worchester, MA, USA Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands Serendipitea.World, Hasselby, Sweden Department of Biology II, Ludwig-Maximilians-Universität Nash Family Center for Advanced Circuit Therapeutics, Icahn München, Martinsried, Planegg, Germany School of Medicine, New York, NY, USA Laboratory of Neuroinformatics, Nencki Institute of Experimental Institute of Biomedical Technologies, National Research Council Biology of Polish Academy of Sciences, Warsaw, Poland (CNR), Milan, Italy Department of Computer Science and Engineering, Faculty of Applied Sciences, University of West Bohemia, Pilsen, Czech Republic

Journal

NeuroinformaticsSpringer Journals

Published: Jan 1, 2022

Keywords: Neuroinformatics; Standards and best practices; FAIR principles; Standards organization; Neuroscience; INCF; INCF endorsement process

References