In this brief editorial in the journal Diagnostics, Mashamba-Thompson et al. state their case for a potentially ideal solution to COVID-19 testing in resource-poor communities: using mobile health (mHealth) solutions in conjunction with a blockchain- and AI-driven data acquisition and transfer system. They add that the “AI component of this technology enables powerful data collection (patient information, geographic location of the patient, and test results), security, analysis, and curation of disparate and clinical data sets from federated blockchain platforms to derive triangulated data at very high degrees of confidence and speed.” Unfortunately, what isn’t clear in their argument is how the self-testing step would actually work, let alone how the system would realistically be implemented in resource-poor settings. One could argue, however, that despite the relatively few details, sharing ideas on how to address COVID-19 testing in various environments still has value.
Wrangling environmental exposure data: Guidance for getting the best information from your laboratory measurements
In this 2019 paper published in Environmental Health, Udesky et al. share their organization’s guidance for addressing quality in environmental exposure testing and data. In particular, the authors note the importance of reporting quality control (QC) data along with chemical measurements in published research, stating that its absence leaves “readers uncertain about the level of confidence in the reported data.” Their objective was to provide full guidance on the implementation, interpretation, and reporting of QC data. After explaining their approach to study design, study implementation, and data interpretation, the authors conclude that their guidance, visualizations, and supplementary content provide “a useful set of tools for getting the best information from valuable environmental exposure datasets, and enabling valid comparison and synthesis of exposure data across studies.”
2019 novel coronavirus disease (COVID-19): Paving the road for rapid detection and point-of-care diagnostics
In this March 2020 article published in the journal Micromachines, Nguyen, Bang, and Wolff give their professional take on point-of-care diagnostics for combating the COVID-19 epidemic. In particular, the authors discuss the loop-mediated isothermal amplification (LAMP) nucleic acid amplification technique and its potential to make COVID-19 testing faster, more portable, more flexible, more stable, and more sensitive. After discussing the then-recent events (while noting that information was changing rapidly), the authors compare real-time reverse transcription polymerase chain reaction (rRT-PCR) with LAMP methods and discuss other important factors affecting society during the pandemic.
In this brief editorial article published in the journal Diagnostics, Taiwanese researchers Yang et al. provide their insights into a more convenient at-home point-of-care (POC) testing method for the COVID-19 illness. Noting the urgency of the associated pandemic and the various costs associated with traditional testing methodologies such as lateral flow immunoassays and molecular-based assays, the authors suggest blending several tools: a paper-based test and common mobile devices. A patient at home could take a nasal swab and use a POC device to return a colorimetric result, which could then be captured by mobile phone and rapidly sent to a clinician for analysis. They conclude that such a method could “provide new insights into designing POC COVID-19 diagnostics and ultimately improve the health care system to combat this and similar diseases.”
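As an illustration of the image-analysis step such a workflow implies, the sketch below classifies a colorimetric readout by mean pixel intensity. The threshold value, channel averaging, and decision rule are assumptions for illustration only, not details from Yang et al.

```python
# Hedged sketch: a phone app could average pixel intensities in the
# test-strip region of a photo and compare against a calibrated
# threshold. The threshold (120) and simple RGB averaging here are
# hypothetical choices, not the authors' actual method.

def classify_colorimetric(pixels, threshold=120):
    """Classify a colorimetric readout from a list of (R, G, B) pixels.

    A strong color reaction darkens the test region, lowering mean
    intensity; below the calibrated threshold we flag it positive.
    """
    mean_intensity = sum(sum(p) / 3 for p in pixels) / len(pixels)
    return "positive" if mean_intensity < threshold else "negative"

# Two toy patches: one strongly colored (dark), one near-blank (light).
dark_patch = [(60, 40, 80)] * 10
light_patch = [(230, 225, 240)] * 10
```

In practice the threshold would come from calibrating the device's camera against known control strips, which is part of why the authors pair the paper test with a clinician review step.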
This is the third of a series of interim guidance documents, this one originating from the U.S. Food and Drug Administration (FDA). Issued on March 16, 2020, this guidance document provides insight into FDA policy for the development, validation, and use of diagnostic tests for Coronavirus Disease 2019 (COVID-19), particularly those tests issued an Emergency Use Authorization (EUA). The guidance applies not only to manufacturers developing and distributing test kits but also to CLIA laboratories certified to perform high-complexity testing that develop and validate their own tests. The FDA provides background on the topic and then discusses the policy in detail, before closing with its validation study recommendations.
Laboratory biosafety guidance related to coronavirus disease (COVID-19): Interim Guidance, 19 March 2020
Again, not a journal article; rather, we have another interim guidance document, this one by the World Health Organization (WHO), this time regarding laboratory biosafety related to COVID-19 testing. Effective March 19, 2020, this guidance document, complete with references, addresses laboratory biosafety principles to consider when handling and testing specimens believed to contain the COVID-19 virus. The WHO provides a bit of background and an overview of the key points, then discusses those points in further detail. They also include two annex items: the core requirements of good microbiological practice and procedure (GMPP) and a risk assessment template for laboratories to borrow from.
Laboratory testing for coronavirus disease (COVID-19) in suspected human cases: Interim Guidance, 19 March 2020
Breaking from the norm, this document is not a journal article but rather official interim guidance on laboratory testing for COVID-19 from the World Health Organization (WHO), effective March 19, 2020. This guidance document, complete with references, addresses laboratory testing principles for patients who meet the case definition for infection with the COVID-19 virus. The WHO provides a bit of background, then addresses specimen collection and shipment, laboratory testing procedures, and reporting. They also note where further research is needed and include an ISO 15189:2012-compliant COVID-19 laboratory test request form. The WHO closes by encouraging “the sharing of data to better understand and thus manage the COVID-19 outbreak, and to develop countermeasures.”
In this 2019 journal article published in the African Journal of Laboratory Medicine, Mtonga et al. describe their efforts to develop a laboratory information system (LIS) for resource-poor regions, as well as the benefits of other informatics interventions in those settings. Using the open-source C4G BLIS as their base, the researchers turned to “user stories” as a recommended method for determining end users' system requirements, then updated the system with those requirements, as well as critical functionality suitable for resource-poor hospitals and labs. The authors then deployed the resulting LIS at Kamuzu Central Hospital in Lilongwe, Malawi. They conclude their experience “highlights the potential of using informatics interventions to address systemic problems in the laboratory testing process in low-resource settings,” though the implementation of those interventions “may require innovation of new hardware to address various contextual issues.”
Ultra-high-performance liquid chromatography coupled with quadrupole-Orbitrap high-resolution mass spectrometry for multi-residue analysis of mycotoxins and pesticides in botanical nutraceuticals
In this 2020 paper published in the journal Toxins, Narváez et al. of Spain and Italy present the results of using a specific combination of chromatography and spectrometry to analyze the levels of mycotoxins and pesticides in nutraceuticals made from Cannabis sativa and other botanicals. The authors developed specific procedures associated with a “methodology using ultra-high-performance liquid chromatography coupled with quadrupole-Orbitrap high-resolution mass spectrometry (UHPLC-Q-Orbitrap HRMS).” After testing their methodology on 10 cannabidiol (CBD) supplements, they found numerous mycotoxins and dozens of pesticides. They conclude that their “results highlight the necessity of monitoring contaminants in food supplements in order to ensure safe consumption,” while also demonstrating the effectiveness of their methodology in doing so.
In this 2019 paper published in the Journal of Medical Biochemistry, Lippi and Mattiuzzi examine the primary drivers and outcomes of implementing project management strategies in the field of laboratory medicine. Citing funding and personnel shortages in healthcare services, as well as the increasing volume and complexity of analytical tests in the lab, the authors point out the importance of proper project management across the clinical laboratory's organization and operation, in order to better ensure rapid, accurate diagnoses and improved clinical outcomes. They lay out four primary steps for implementing project management in laboratory planning and operations: defining the laboratory environment, planning technical resources, managing staff, and interacting with hospital administrators and other important stakeholders. They also address additional drivers to consider. They conclude that through these activities, as well as a few final steps, clinical laboratories—and by extension the patients they serve—can benefit greatly from incorporating thoughtful project planning into their activities.
Information technology and medical technology personnel’s perception regarding segmentation of medical devices: A focus group study
Cybersecurity is an increasingly relevant and critical topic for most industries, including healthcare. In hospitals and other clinical settings, patient monitoring systems, medical imaging systems, and laboratory instruments are increasingly networked to get the most out of data management and operational efficiency. However, these devices are at risk of cyberattack, with potentially significant consequences. Johansson et al. acknowledge this problem and examine the perceptions of information technology and medical technology experts on securing in-house medical devices through the practice of network segmentation. The authors present the results of their focus group discussions with 18 stakeholders from Sweden’s Region Skåne, concluding that the stakeholders “were positively receptive to the increase in cybersecurity provided by network segmentation but concerned about the increase in the administration that it will entail for medical devices.” They note that these and other opinions stated in the research are vital for healthcare networks to take into account when considering improving their cybersecurity.
Ontologies are important to biological and biomedical researchers working with multiple data sources because, when implemented well, they make heterogeneous, multimodal data across multiple domains interoperable and aid more effective data mining and discovery. Ontologies also help improve data quality. In this 2019 paper, Jackson et al. highlight the importance of ontologies with their open-source ROBOT library, which aids in the automation of ontology development. They describe the various ways in which ROBOT “provides ontology processing commands for a variety of tasks, including commands for converting formats, running a reasoner, creating import modules, running reports, and various other tasks.” They conclude that such automation allows developers to easily “configure, combine, and execute individual tasks in comprehensive, automated workflows.” The authors also demonstrate how ROBOT can help catch logical errors and provide quality control mechanisms for ontology creation and updating.
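As a sketch of what such command-driven automation looks like, the short Python helper below composes ROBOT invocations into a three-step pipeline. The subcommand names (convert, reason, report) correspond to tasks quoted above; the specific flags and file names are illustrative assumptions rather than a pipeline from the paper.

```python
# Sketch: composing a ROBOT ontology-release workflow as CLI commands.
# File names (edit.obo, etc.) are hypothetical.

def robot_cmd(subcommand, **options):
    """Build a `robot` CLI invocation as an argument list."""
    cmd = ["robot", subcommand]
    for key, value in options.items():
        cmd += ["--" + key.replace("_", "-"), value]
    return cmd

# A minimal three-step workflow: convert a source file to OWL,
# classify it with a reasoner, then run quality-control reports.
pipeline = [
    robot_cmd("convert", input="edit.obo", output="edit.owl"),
    robot_cmd("reason", reasoner="ELK", input="edit.owl", output="reasoned.owl"),
    robot_cmd("report", input="reasoned.owl", output="report.tsv"),
]
```

In real use, steps like these are typically chained in a Makefile or shell script so a whole ontology release can be rebuilt and checked with one command, which is exactly the "comprehensive, automated workflows" benefit the authors describe.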
In this brief article by Schmieder et al., the authors provide an example of applying laboratory informatics to research activities for drug discovery and personalized medicine. In particular, the authors look at the in vitro nephrotoxicity assay and the use of transepithelial electrical resistance (TEER) to evaluate the barrier function of cellular layers. Citing issues such as the significant multi-step processes involved with this type of testing, the authors attempt to demonstrate how a laboratory information management system (LIMS), laboratory execution system (LES), and automation standards such as SiLA can be used together to decrease the workflow complexity of such cell-based assays.
Researchers in a wide variety of fields are gradually realizing the benefits of open-source hardware (OSH) in their scientific activities, though not without difficulty. Projects associated with OSH often fail for lack of technological know-how, long-term funding, and strong organization. In this 2019 article, Hill et al. propose “a provisional framework for developing and sustaining the life cycle of not‐for‐profit OSH,” in this case applying it to biological and conservation science. The authors describe their six-phase framework spread across two management teams, one for engineering and one for logistics (with both sharing social media and outreach duties). They present as a case study the AudioMoth project, the largely crowd-sourced development of a low-cost, full-spectrum acoustic logger. They conclude by emphasizing the importance of frameworks such as theirs, stating that without them “existing OSH will continue to have short life spans, and remain out of reach for the majority of conservation biologists.” More broadly, they also note the need “to push access to OSH toward communities outside of the pockets of wealth and high opportunity that the framework may initially serve.”
In this 2018 article published in The Journal of Supercomputing, Ibrahim et al. present the results of applying the National Institute of Standards and Technology’s (NIST) Cybersecurity Framework to a custom assessment tool for gauging an organization’s cybersecurity preparedness. In this paper, the authors used their assessment tool to evaluate a Western Australian government organization. After discussing the formulas behind the assessment tool and the visualization of results using Microsoft Power BI, the authors conclude that with their tool and the NIST Cybersecurity Framework, organizations can better “identify the specific people, processes, and technology areas that require improvement” and improve threat mitigation. The authors also cite the user-friendly nature of the framework.
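Although Ibrahim et al.'s actual formula isn't reproduced here, the general shape of such an assessment can be sketched: per-control maturity ratings rolled up into a percentage score for each of the framework's five functions, so weak areas stand out. The 0-4 rating scale and all the scores below are hypothetical.

```python
# Hypothetical sketch of NIST CSF-style preparedness scoring; the scale
# and numbers are illustrative, not the paper's actual formula.

MAX_RATING = 4  # e.g., 0 = not performed ... 4 = adaptive

def function_scores(ratings):
    """Map each CSF function to the mean of its ratings as a percentage."""
    return {
        func: round(100 * sum(vals) / (MAX_RATING * len(vals)), 1)
        for func, vals in ratings.items()
    }

# One rating per assessed control, grouped by the five CSF functions.
assessment = {
    "Identify": [3, 2, 4],
    "Protect":  [2, 2, 3, 1],
    "Detect":   [1, 2],
    "Respond":  [3, 3],
    "Recover":  [1, 1, 2],
}
scores = function_scores(assessment)
```

A per-function breakdown like this is also what makes the Power BI visualization step useful: each function's percentage can be charted so decision makers see at a glance where to invest.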
In this early 2019 journal article published in the journal Database, Grand et al. of the Candiolo Cancer Institute present the fine details of their laboratory information management system (LIMS), the Laboratory Assistant Suite (LAS), for cancer and other genomic research. Citing “a substantial mismatch between the LIMS solutions on offer and the functional requirements dictated by research practice,” the authors describe the requirements they had for a LIMS in their institution and how they went about creating it. After describing the data models, functionalities, modular architecture, and its usage, the authors conclude that their LAS, in conjunction with a custom data management module, allows researchers to “execute complex queries without any knowledge of query languages or database structures, and easily integrate heterogeneous data stored in multiple databases,” while also improving data quality, reducing data entry and retrieval effort, and yielding new insights through the enabled data interconnections.
While the definitions surrounding “open-source” software have largely sufficed for such software, the term as applied to tangible hardware products has been insufficiently defined, argue Bonvoisin et al. in this 2017 paper published in the Journal of Open Hardware. Their work analyzes 132 proclaimed open-source “non-electronic and complex open-source hardware products.” After lengthy background information and discussion of their methods and results, the authors conclude: “The empirical results strongly indicate the existence of two main usages of open-source principles in the context of tangible products: publication of product-related documentation as a means to support community-based product development and to disseminate privately developed innovations. It also underlines the high variety of interpretations and even misuses of the concept of open-source hardware.”
The topic of making bioinformatics applications more approachable to researchers and students has been discussed off and on for years, and some efforts have even been made in that regard. Another step forward is offered by Joppich and Zimmer of Ludwig-Maximilians-Universität München with their open-source bioGUI. The software attempts to address two problems of bioinformatics applications that rely heavily on the command line: many of them work on Unix-based systems but not Microsoft Windows, and researchers tend to shy away from complex command-line apps despite their utility. The authors present their framework and its use cases in detail, showing how a graphical user interface (GUI) can make many such command-line apps more approachable. They conclude that providing a GUI and easy-to-use install modules for command line-based bioinformatics apps makes “execution and usage of these tools more comfortable” while allowing scientists to better analyze their data.
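The general idea behind such GUI front ends can be sketched as a mapping from a declarative description of a tool's parameters (rendered as form fields) back to an argument vector. The template format, tool name, and flags below are hypothetical illustrations, not bioGUI's actual install-module format.

```python
# Sketch: translating GUI form values into a CLI invocation.
# "aligner" and its flags are hypothetical stand-ins for a real tool.

TOOL_TEMPLATE = {
    "program": "aligner",
    "params": [
        {"name": "input", "flag": "-i", "type": "file"},
        {"name": "threads", "flag": "-t", "type": "int", "default": 4},
        {"name": "verbose", "flag": "-v", "type": "bool", "default": False},
    ],
}

def build_argv(template, form_values):
    """Translate values gathered from a GUI form into an argument list."""
    argv = [template["program"]]
    for param in template["params"]:
        value = form_values.get(param["name"], param.get("default"))
        if param["type"] == "bool":
            if value:  # boolean flags take no argument
                argv.append(param["flag"])
        elif value is not None:
            argv += [param["flag"], str(value)]
    return argv

argv = build_argv(TOOL_TEMPLATE, {"input": "reads.fastq", "verbose": True})
# argv -> ["aligner", "-i", "reads.fastq", "-t", "4", "-v"]
```

Because the template is declarative, one front end can wrap many tools without per-tool GUI code, which is the same portability argument the authors make for their install modules.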
ChromaWizard: An open-source image analysis software for multicolor fluorescence in situ hybridization analysis
In this brief article published in Cytometry Part A, researchers at the Austrian Centre for Industrial Biotechnology present their open-source multiplex fluorescence in situ hybridization (M‐FISH) software for chromosome painting. The tool—ChromaWizard—acts as a free and open-source option for hybridization analysis, integrating “image processing, multicolor integration, chromosome separation, and visualization with false color assignments.” The software can handle images in TIFF, PNG, and JPEG formats and provides robust visualization tools. The authors conclude that ChromaWizard “allows direct inspection of the original hybridization signals and enables either manual or automatic assignment of colors, making it a functional and versatile tool that can also be used for other multicolor applications.”
Open-source software has been a topic of discussion for decades, both as a model of software development and distribution and for its broader potential. But the concept of open-source hardware, particularly in the field of science, has been more challenging to address. In this brief perspective article, the University of Tübingen’s André Maia Chagas discusses the benefits of open science hardware in addressing the growing divide between “haves and have nots” in the scientific research community. Citing high prices, the overall closed-source nature of equipment, businesses that can close without warning, and poor customer support, Chagas attempts to demonstrate that advances in modern design—such as the smartphone—and organizational efforts to implement and promote open hardware philosophies provide opportunities for more people to engage in scientific endeavors. He concludes that we “need to reassess our relationship to knowledge and technology, how it determines our role in society, and how we want to spend grant money entrusted to us by the people,” and that by focusing on making hardware more open and accessible, we will shrink the divide and improve scientific research as a result.