As a natural extension of the paper lab notebook, the electronic laboratory notebook (ELN) is an important component of modern research activities. However, as calls for reproducibility and data provenance in research grow, the veracity and semantic documentation of research data and experiment protocols logged in ELNs grow more important. In this 2022 paper from the Journal of Biomedical Semantics, Schröder et al. address this sort of semantic documentation of experiment protocols, taking their own structure-based approach to it, using a biomedical wet lab experiment in an ELN as a test case. The authors present background information, related work, and information about their use case and approach to the task. After presenting and discussing the results of their semantic documentation efforts, they conclude that their "structure-based approach, in combination with RO-Crate bundling, can be used to successfully document research data based on the description in the form of ELN protocols." They add that bundling the information into an RO-Crate better enables "the sharing, publication, and archiving of the research data in terms of the FAIR principles."
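As a rough illustration of what RO-Crate bundling involves (a sketch following the RO-Crate 1.1 specification, not the authors' actual crate; the file names are purely illustrative), a minimal `ro-crate-metadata.json` might be generated like this:

```python
import json

# A minimal ro-crate-metadata.json per the RO-Crate 1.1 spec: the root
# dataset entity describes the bundle, and each bundled file (e.g., an
# exported ELN protocol) is listed with basic descriptive metadata.
crate = {
    "@context": "https://w3id.org/ro/crate/1.1/context",
    "@graph": [
        {
            "@id": "ro-crate-metadata.json",
            "@type": "CreativeWork",
            "conformsTo": {"@id": "https://w3id.org/ro/crate/1.1"},
            "about": {"@id": "./"},
        },
        {
            "@id": "./",
            "@type": "Dataset",
            "name": "Wet lab experiment bundle",  # illustrative name
            "hasPart": [{"@id": "protocol.pdf"}],
        },
        {
            "@id": "protocol.pdf",  # hypothetical ELN protocol export
            "@type": "File",
            "name": "ELN experiment protocol",
            "encodingFormat": "application/pdf",
        },
    ],
}

print(json.dumps(crate, indent=2))
```

Dropping this JSON file next to the described data files is what makes the bundle a crate; FAIR-oriented tooling can then discover the dataset description without any bespoke parsing.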
Informatics, the study of information processing and management, has moved into many industries over the past few decades, and the food manufacturing and testing field is no different. Food informatics has evolved as a sub-branch of informatics that assesses how food-related activities can be improved for not only manufacturers but also consumers, using information technologies. This 2021 paper by Krupitzer and Stein examines the state of food informatics and proposes an updated definition that encompasses the state-of-the-art in food technologies and how they are used in the manufacturing and research landscape. After discussing numerous concepts important to food informatics, the authors propose their own definition and discuss how it applies to the practical use of tools like autonomous robotics, artificial intelligence (AI), machine learning (ML), internet of things (IoT), predictive maintenance, and more. They conclude that while "various research streams focus on different aspects of the production process ... they miss the methods and approaches that can be applied across several steps along the food production process." They add that the mindful addition of technologies like AI and IoT has high potential "to optimize the various aspects and processes concerning food production, consumption, and security."
In this 2022 paper published in the journal Learning Health Systems, Kohn et al. of Wake Forest School of Medicine's Center for Biomedical Informatics describe the evolution of an academic biomedical informatics program in the scope of a "learning health system" (LHS) and improving decision making in the clinical field. The authors first provide background information on topics such as artificial intelligence (AI) and machine learning (ML) and how they are trained upon and accepted within the clinical community. They then describe the concept of the LHS and what is required to understand and utilize it well, followed by full details of their biomedical informatics program and its incorporation of LHS at Wake Forest. The authors conclude that programs such as their Clinical Scholars in Informatics (CSI) program "can be broadly applied to most internal medicine programs across the United States" and that such programs clearly provide "bidirectional net benefits for the resident physicians and for the health system as a whole." They add that enabling clinical students in "understanding the role of AI and predictive analytics, and how to apply them, will become progressively more important" as the era of big data continues to grow.
Most know that the COVID-19 pandemic turned a solid chunk of the workforce—and their work itself—on its heels. More than two years later, this change has both positively and negatively affected many a workplace, including the laboratory. In this March 2022 work, laboratory veteran Joe Liscouski discusses the impact of disruptions such as COVID-19, flooding, and power failures on laboratories, particularly within the scope of implementing automation to better limit those disruptions. However, as Liscouski notes, it's not as simple as "let's implement laboratory automation"; many nuances to its implementation and use exist within the context of laboratory work. The author first discusses the nature of work itself, followed by a brief look at laboratory work. He then examines eight talking points about using automation to prevent disruptions in on-site laboratory activities, as well as a few critical points about the laboratory work that can be done remotely. He finishes by discussing other external disruptions to laboratory work, including meteorological issues, natural disasters, power disruptions, and supply chain issues. Liscouski concludes that while laboratory automation is here to stay, improving workflows and limiting disruptions, we must implement it with care and deliberate planning in order to make the most of it.
Artificial intelligence (AI) has been discussed in many healthcare contexts over the years, including within the medical imaging field. But how aware are imaging specialists of AI and machine learning (ML) methods in imaging informatics, and what knowledge gaps must be filled to address concerns about the misuse of AI and ML in medical imaging and its implications for patient safety? This survey-based research by Eiroa et al. examines the responses of Spanish radiologists to numerous questions related to AI and ML and suggests there are several information gaps that must be addressed. After a brief introduction and a discussion of the survey methods, the authors present their results in numerous tables and images, followed by a discussion of that data in the scope of current and new radiologists entering the field. They conclude that "there is a general lack of knowledge about AI, ML, and related topics among Spanish radiologists, including both members in training and attending physicians," though there was an eagerness to learn and little fear of such automated methods taking away radiology jobs. They add that "there is no doubt that a common consensus is needed to change the current training curriculum to prepare new radiologists for a future world in which AI will undoubtedly shape the profession."
Mycotoxins, toxic secondary metabolites produced by fungi, are a significant concern in cannabis testing, particularly for medical cannabis patients. Plant material must certainly be tested for mycotoxins, but what about extractions created from that plant material? Do mycotoxins in the plant transfer to the extracts? This journal article by Serafimovska et al. provides the results of research examining that issue. The authors first provide background about mycotoxins and cannabis, and then they discuss the materials and methods (as well as their validation) used in their research. In their discussion and concluding remarks, they note that making extracts from plant materials spiked with the maximum allowed amount of mycotoxins produced extracts with mycotoxins "in an amount much higher than the amount in which we added them." They add: "With this experiment, we have shown that aflatoxins as extremely toxic secondary metabolites, can reach critical values in cannabis extracts obtained from dry cannabis flowers with the maximum allowed quantity of aflatoxins ... [and] can pose a great risk to consumers and their health, especially to those with compromised immune systems."
In this brief article published in the journal Advances in Radiation Oncology, Joyce et al. state the case for the journal's move to develop a special manuscript category dedicated to cybersecurity issues as they relate to the field of radiation oncology. They review numerous cybersecurity incidents that occurred in 2021, which disrupted healthcare efforts, sometimes severely. In one case, the authors find the effects of a ransomware attack on a cancer treatment center in New Zealand to be worse than those of the COVID-19 pandemic. Additionally, the authors address the problems that come with not being able to rapidly access patient medical records, "causing delayed treatment for thousands of patients" in some cases. The discussion of these and other similar incidents leads to a non-exhaustive list of what oncology departments can do to improve preparedness for cybersecurity incidents, including maintaining offline, encrypted backups and implementing a user awareness training program.
In this 2021 paper published in the journal Frontiers in Digital Health, Maury et al. of the University of Lausanne and Lausanne University Hospital describe their process of developing a COVID-19 dashboard for improving their health care efforts. Noting the usefulness of rapidly gathering, integrating, and using data during epidemics, the authors decided to pull information from their laboratory information system (LIS), as well as other systems, and try to streamline COVID-19 care. After describing their reverse transcription polymerase chain reaction (RT-PCR) workflows and information systems, they discuss their dashboard—developed in R Shiny—and how its various components aided in their hospital system's goals. The authors conclude that a "dashboard promises the potential gain of time and productivity in a public hospital context, where the resources are scarce and the staff is under the day-to-day task pressure."
In this 2022 article published in Journal of Cannabis Research, Kruger and Kruger share the results of an exploratory study about delta-8-tetrahydrocannabinol (Δ8-THC) "to inform policy discussions and provide directions for future systematic research." Noting the lack of research on the cannabinoid, the duo developed a survey and recruited Δ8-THC users to report their experiences with it. After discussing their methodology and sharing their results from the survey, the authors found participant reports to be "overall supportive of the use of Δ8-THC" and to contain "a wealth of other information that can inform hypothesis testing and research questions in future studies." They conclude that their research should further drive additional collaborative research about, for example, substituting Δ8-THC for Δ9-THC, and inform further policy discussions about laboratory testing, user safety (i.e., harm reduction), and legalization efforts.
In this 2021 journal article published in Advances in Laboratory Medicine, Yeste et al. discuss their recommendations in regard to implementing ISO 15189 in Spanish laboratories. Based on the Entidad Nacional de Acreditación (ENAC) and its accreditation requirements, the authors provide context for clinical laboratorians involved with post-analytical processes who seek to meet ISO 15189 and ENAC requirements. The authors discuss specimen storage, retention, and disposal; quality assurance and continuous improvement; laboratory information management; and the use of ENAC accreditation labeling as part of their set of recommendations. They conclude by highlighting that "with ISO 15189 being the most specific standard for demonstrating technical performance, a clear understanding of its requirements is essential for proper implementation."
Back in December we posted an article by Fraggetta et al. on recommended best practices when implementing a digital pathology workflow. However, a month before that article was published in the journal Diagnostics, Fraggetta and a different group of authors published a paper on digital pathology implementation, from the perspective of implementing it in the Gravina Hospital system in Sicily. This paper examines the hospital's transition step by step, demonstrating "the digital transition of analog, non-tracked pathology laboratories" to an improved digital workflow. After describing the hospital's prior situation and explaining their methods, the authors discuss the results of their implementation in detail, from initial accessioning to final archiving. After a lengthy discussion, the authors conclude that "following the step-by-step instructions, the implementation of a paperless routine with more standardized and safe processes, the possibility to manage the priority of the cases and to implement AI-based tools is no more a utopia for every analog pathology department."
In this 2021 paper published in the Journal of Translational Medicine, Asiimwe et al. discuss the challenge of moving "vast amounts of research and clinical data" from their silos to more accessible platforms that biobanks and their researchers can better take advantage of. In particular, they discuss the steps British Columbia's Gynecological Cancer Research Program (OVCARE) took to move from a traditional system of data silos to an integrated data commons in an effort to standardize and encourage collaborative data sharing and governance. After providing a bit of background on their situation, the authors discuss matching their research community's needs to an integrated domain-specific system infrastructure. After reflecting on their implementation, the authors discuss their results and conclude that a "seamless data environment for clinical and research data can be achieved through shared policies and technologies, and privacy-preserving open computer architectures and storage platforms."
As with preparing samples in other fields of science, the processing method used to measure cannabis analytes can vary, sometimes significantly. While some standardized processes exist for, e.g., extraction protocols, one process may still differ slightly from another. According to Bowen et al. of Colorado State University and Charlotte's Web, Inc., a knowledge gap exists concerning how varying extraction protocols affect the resulting composition of cannabis extracts. In this 2021 paper, the authors seek to address that knowledge gap, using a single proprietary cannabis cultivar and 20 commercial extraction procedures. The authors conducted principal component analysis (PCA) using a variety of gas chromatography–mass spectrometry (GC–MS), ultra high-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS), and inductively coupled plasma-mass spectrometry (ICP-MS) techniques. They found discrepancies in the bioactive chemical profiles across the different extraction protocols, leading them to conclude a definitive "need for further research regarding the influence of processing on therapeutic efficacy, as well as the importance of labeling in the marketing of multi-component cannabis products."
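For readers unfamiliar with the statistical approach, PCA of the general kind used to compare chemical profiles can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: the data below is synthetic, standing in for a matrix of extracts by measured analyte concentrations.

```python
import numpy as np

# Synthetic stand-in data: rows = 20 extraction protocols,
# columns = 8 measured analyte concentrations.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 8))

# PCA via SVD: mean-center each analyte, then decompose.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = U * S                    # each extract's coordinates on the PCs
explained = S**2 / np.sum(S**2)   # fraction of variance per component

# A scatter of scores[:, 0] vs. scores[:, 1] is the usual PCA plot;
# extracts from similar protocols would cluster together.
print(explained[:2])
```

Components are ordered by variance explained, so the first two scores columns capture the dominant differences between extraction protocols.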
In this 2021 paper published in the journal One Health, Kachuwaire et al. discuss the results of an Armenian effort to strengthen human and veterinary laboratory quality management system (QMS) programs across 15 laboratories. This QMS strengthening, based on World Health Organization and World Organization for Animal Health guidelines, occurred between 2017 and 2020 and was followed by an assessment of the program's success. After reviewing their assessment methods, the authors discuss the results, noting that in both human and veterinary labs, improvements were seen in the "areas of organizational structure, human resources, equipment management, supply chain, and data management," though some laboratories still showed poor results afterwards. Their findings indicated that the labs that tended to do poorly post-implementation lacked a supportive long-term quality manager, "demonstrating the importance of a full-time or substantive quality manager."
For those studying plant biology and how the environment affects it, recognition of that study as multidisciplinary, challenging, and vital is growing. However, as Krantz et al. point out in their 2021 paper published in Frontiers in Plant Science, that recognition alone is not enough. Given multi-omics research into plant biology and the environmental sciences over recent decades, as well as disparate data formats and management strategies across the many related disciplines, bringing this varied multidisciplinary data together for research is difficult, slowing the modern research process down. The authors review these inconsistencies and discuss why "a quantitative model of plant-environment interactions" is required to advance such research. After their introduction, the authors discuss important components of this model, including genome-scale metabolic network reconstruction, quantitative large-scale experiments on integrative platforms, quantitative analysis methods, and research data management systems. They conclude that these elements, among others, will help lead to more "efficient use of experimental findings."
In this 2021 journal article published in the Data Science Journal, Damerow et al. emphasize that while sample-based research is critical to a wide range of ecosystem sciences, the increasingly multidisciplinary approach to those sciences requires a better, more coordinated practice of sample identification and data management. "While there are widely adopted conventions within certain domains to describe sample data," they say, "these have gaps when applied in a multidisciplinary context." With this paper, the authors present a more practical approach to sample identification and management that takes into account the multidisciplinary requirements of ecosystems research. After a brief review of the literature and existing sample identification methods, guidance, and standards, they describe their pilot program for standardizing sample metadata and propose several benefits of the program. They conclude that "user-friendly guidance and sample metadata templates are an essential step in promoting standard practices that make data publishing, integration, and reuse easier," though proper training, legacy data management tools, and information management systems are also important components toward those goals.
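To make the idea of a sample metadata template concrete, here is a hypothetical sketch of the flat, spreadsheet-style template such guidance typically produces. The field names and values are illustrative only, not the authors' actual template.

```python
import csv
import io

# Illustrative template fields: a persistent sample ID, a link to any
# parent sample, and basic descriptive/collection metadata that remains
# meaningful across disciplines.
fields = ["sample_id", "parent_sample_id", "material",
          "collection_date", "latitude", "longitude", "collector"]

# One hypothetical record; real templates would carry many rows.
rows = [
    {"sample_id": "SOIL-0001", "parent_sample_id": "",
     "material": "soil", "collection_date": "2021-06-01",
     "latitude": "37.8715", "longitude": "-122.2730",
     "collector": "J. Doe"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fields)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

The value of such a template is less in the code than in the convention: every collaborating lab fills the same columns, so downstream integration and reuse need no per-lab translation.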
The implementation of digital pathology workflows has seen an uptick in interest in recent years, though as Fraggetta et al. point out in this 2021 journal article, only a minority of pathology laboratories have fully embraced these workflows. Wanting to increase the adoption rate of digital pathology, the authors present their experiences with digital pathology and focus on four critical considerations to improve adoption rates. After presenting a brief introduction to digital pathology, the authors discuss various aspects of involvement, optimization, and automation required to get digital pathology projects initially started. They then commit a significant portion of their paper to discussing the quality control program that must be implemented as part of a digital pathology workflow. They also address whole slide imaging and its validation, as well as retention policies, results evaluation, and the necessary maintenance of workflows after implementation. They conclude with 10 basic principles (classified as recommendations or suggestions) to address when transitioning from "a classic, 'analog' to a completely digital workflow," adding that "the present document represents a practical, handy reference for the correct implementation of a digital workflow in Europe."
Veteran laboratory automation/computing professional Joe Liscouski is at it again, this time releasing a perspective piece that draws on more than 15 years of writings and presentations to paint a nuanced approach to planning for the use of computer systems in the laboratory. In particular, this November 2021 work continues to expand on the importance of laboratory systems engineering in the laboratory of the future. After providing a full introduction, Liscouski examines both the past and present of laboratory computing and how the automation aspects of that computing affect laboratory personnel. He then goes on to espouse the benefits of a more industry-wide approach to addressing the technological and educational needs of laboratories of all types, particularly in regard to how standardization plays an important role. He then addresses laboratory work itself, how automation can move that work forward, and how to effectively apply that automation to the laboratory. Finally, Liscouski closes by emphasizing the importance of a "center for laboratory systems engineering" to help centralize the efforts mentioned in the guide. A sizeable appendix is included, providing more historical perspective to the work and its conclusions.
In this 2021 paper published in Frontiers in Plant Science, Feder et al. present the analytical results of Cannabis inflorescences that have been pollinated and fertilized, in an attempt to show that fertilization can affect phytocannabinoid accumulation, among other goals. Noting that growers have recently shifted to using seeds, which despite feminization can turn out to be male five to ten percent of the time, the authors note that this type of research—rarely conducted—is particularly critical to answering questions about how pollination affects phytocannabinoid and terpenoid expression. After providing background information and reviewing the materials and methods for their analyses, the authors discuss the results, noting in particular that "phytocannabinoid quantity predominately decreases after fertilization," that terpenoid quantity can vary based upon the type of female plant, and that "individual terpenoid concentrations are differentially affected by fertilization." After further discussion, they not only reiterate those three findings but also suggest, by extension, that their findings indicate the "functional roles" of phytocannabinoids and terpenoids in Cannabis's plant life cycle.
In this 2021 journal article published in the Journal of Medical Biochemistry, Arifin and Yusof of Universiti Kebangsaan Malaysia examine the error factors that come with using a laboratory information system (LIS) and propose the "total testing process for laboratory information systems" (TTP-LIS). This process leans on a variety of existing frameworks and lean quality improvement methods to meet the authors' needs and is applied to two large hospitals in Malaysia. After examining human, technology, and organizational factors, the authors discuss their findings, noting that their "findings showed the practicality of the TTP-LIS framework as an evaluation tool in identifying errors and their causal factors. The use of lean tools—namely, VSM, A3, and 5 Why—enabled us to analyze and visualize the root cause of problems in an objective and structured manner." Those root causes could be categorized in three ways: "as a latent failure in system development, as poor error management, and as unsatisfactory lab testing processes and LIS use."
, Arifin and Yusof of Universiti Kebangsaan Malaysia examine the error factors that come with using a laboratory information system (LIS) and propose the "total testing process for laboratory information systems" (TTP-LIS). This process leans on a variety of existing frameworks and lean quality improvement methods to meet the authors' needs and is applied to two large hospitals in Malaysia. After examining human, technology, and organizational factors, the authors discuss their findings, noting that their "findings showed the practicality of the TTP-LIS framework as an evaluation tool in identifying errors and their causal factors. The use of lean tools—namely, VSM, A3, and 5 Why—enabled us to analyze and visualize the root cause of problems in an objective and structured manner. " Those root causes were able to be categorized in three ways: "as a latent failure in system development, as poor error management, and as unsatisfactory lab testing processes and LIS use."