Infrastructure tools to support an effective radiation oncology learning health system

In this 2023 article published in the Journal of Applied Clinical Medical Physics, Kapoor et al. of Virginia Commonwealth University present their approach to managing radiation oncology research data in a FAIR (findable, accessible, interoperable, and reusable) way. The authors focus on the advantages of a knowledge graph-based database combined with ontology-based keyword search, synonym-based term matching, and a variety of visualization tools. After presenting background on the topic, Kapoor et al. describe their approach using 1,660 patient clinical and dosimetry records, standardized terminologies, data dictionaries, and Semantic Web technologies. They then discuss the results of their mapping efforts, as well as the discovered benefits of a knowledge graph-based solution over a traditional relational database approach. They conclude that their approach "successfully demonstrates the procedures of gathering data from multiple clinical systems and using ontology-based data integration," with the radiation oncology datasets being more FAIR "using open semantic ontology-based formats," which in turn helps "facilitate interoperability and execution of large scientific studies."
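The synonym-based term matching the authors combine with their knowledge graph can be illustrated with a minimal sketch: triples are searched not only for a literal keyword but for any member of its synonym set. This is a simplified stand-in, not the authors' implementation; all terms, synonyms, and identifiers below are hypothetical examples.

```python
# Hypothetical synonym sets; in practice these would come from an ontology.
SYNONYMS = {
    "radiotherapy": {"radiotherapy", "radiation therapy", "rt"},
    "dose": {"dose", "dosage", "absorbed dose"},
}

# Toy subject-predicate-object triples standing in for a knowledge graph.
TRIPLES = [
    ("patient:001", "received", "radiation therapy"),
    ("patient:001", "prescribed", "absorbed dose"),
    ("patient:002", "received", "chemotherapy"),
]

def expand(term):
    """Return the synonym set containing the term, else the term alone."""
    t = term.lower()
    for canon, syns in SYNONYMS.items():
        if t == canon or t in syns:
            return syns
    return {t}

def search(keyword):
    """Return triples whose object matches any synonym of the keyword."""
    terms = expand(keyword)
    return [t for t in TRIPLES if t[2].lower() in terms]

print(search("radiotherapy"))  # matches the "radiation therapy" triple
```

A real deployment would express this as SPARQL queries over an RDF store, but the core idea — expanding a query term through an ontology before matching — is the same.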

Comprehensive improvements in the emergency laboratory test process based on information technology

Can the thoughtful application and optimization of laboratory informatics solutions lead to a variety of improvements in a hospital-based emergency medicine laboratory? Zhang et al. sought to answer this question in a contemporary tertiary second-class general hospital in China. After analyzing their emergency laboratory workflows, identifying problems within those workflows, and identifying other areas of optimization, the authors applied a laboratory information management system (LIMS) to their operations, upgraded instrumentation, and finessed other aspects of their IT infrastructure. After discussing their results, the authors concluded that "the automation and standardization of most stages in emergency laboratory testing can be realized by IT, which reduces the workload of employees and improves emergency laboratory test quality," further noting that "emergency laboratory test report time can be shortened, emergency laboratory test quality can be enhanced, and employee and patient satisfaction can also be improved."

Restricted data management: The current practice and the future

In this brief article in the Journal of Privacy and Confidentiality, Jang et al. of the Inter-university Consortium for Political and Social Research (ICPSR) at the University of Michigan review the current approach of researchers towards restricted data management and how it may need to evolve given certain challenges. They offer this review in the scope of increasing demands for FAIR (findable, accessible, interoperable, and reusable) data in research and journal communities. After a brief introduction, the authors discuss three main elements of managing restricted data: data use agreements, disclosure review, and training. While the Five Safes framework is effective under this scope, the authors conclude that such "safeguards could generate unintended challenges to certain groups of individuals (e.g., institutional approval that could exclude researchers without institutional affiliation) or in different areas (e.g., rigorous output checking that requires extensive insights from experts)." They add that organizations may need to develop more thoughtful and standardized data management policies that still remain flexible under the scope of FAIR data.

Digitalization concepts in academic bioprocess development

In this 2024 paper published in the journal Engineering in Life Sciences, Habich and Beutel review the potential technological transformation processes that await the academic bioprocess laboratory, moving from paper-based workflows to electronic ones. Noting multiple challenges facing these academic labs, the authors look at multiple laboratory informatics and automation technologies in the scope of the FAIR (findable, accessible, interoperable, and reusable) data principles, standardized communication protocols, digital twins, and more. After going into detail about these and other related technologies (including electronic laboratory notebooks and laboratory information management systems), the authors present digitalization strategies for academic bioprocess labs. They conclude that "when supported by digital assistants, using connected devices, more intuitive user interfaces, and data gathering according to FAIR data principles, academic research will become less time consuming and more effective and accurate, as well as less prone to (human) error." They add that experiment reproducibility, instrument integration, and other aspects also benefit, though no one-size-fits-all approach exists for these labs.

Organizational Memory and Laboratory Knowledge Management: Its Impact on Laboratory Information Flow and Electronic Notebooks

In this 2024 work by laboratory informatics veteran Joe Liscouski, the concept of "organizational memory" within the laboratory realm and what it means for informatics systems like electronic notebooks—including electronic laboratory notebooks (ELNs)—is addressed. Liscouski argues that as artificial intelligence (AI) and large language models (LLMs) become more viable to laboratories, those laboratories stand to benefit from LLM technology in attempts to retain and act upon all sorts of data and information, from raw instrument data and test results to reports and hazardous material records. After addressing what organizational memory is and what it means to the lab, the author discusses the flow of scientific data and information and how it relates to informatics and AI-driven systems. He then discusses the implications as they relate specifically to ELNs and other electronic notebook activities, looking specifically at a layered approach built upon open-source solutions such as Nextcloud "that can feed the [LLM] database as well as support research, development, and other laboratory activities."

Conversion of a classical microbiology laboratory to a total automation laboratory enhanced by the application of lean principles

In this 2024 article published in the journal Microbiology Spectrum, Trigueiro et al. "explore the relative impact of laboratory automation and continuous improvement events (CIEs) on productivity and [turnaround times (TATs)]" in the clinical microbiology laboratory. Noting numerous advantages to applying the concept of total laboratory automation (TLA), the authors implemented WASPLab and VITEK MS and compared pre-conversion workflow results with post-conversion workflow results. The group also incorporated change management, CIEs, and key performance indicators (KPIs). They concluded that "conversion resulted in substantial improvements in KPIs. While automation alone substantially improved TAT and productivity, the subsequent implementation of lean management further unlocked the potential of laboratory automation through the streamlining of the processes involved." However, they also noted that while effective, some cost is involved with TLA, and labs need "to find ways of maximizing the benefits of the use of [laboratory automation] technology."

OptiGUI DataCollector: A graphical user interface for automating the data collecting process in optical and photonics labs

The desire to integrate instruments and electronic systems in the laboratory is nothing new, and it is in fact growing. The optics and photonics lab is no exception. In this 2023 article published in the journal SoftwareX, Soto-Perdomo et al. address this desire with their OptiGUI DataCollector, a modular system that integrates optical and photonic devices, simplifies experiment workflows, and automates numerous workflow aspects in the lab, which in turn reduces human error. After a brief introduction to their reasoning for the system, the authors fully describe it and provide illustrative examples of its use. They conclude that the "software's automation capabilities accelerate data acquisition, processing, and analysis, saving researchers time and effort," while also improving the reproducibility of research experiments.

Ten simple rules for managing laboratory information

For many years, the journal PLOS Computational Biology has been publishing a series of "Ten Simple Rules" articles as they relate to data management and sharing in laboratories and research institutes. In this latest installment of the series, Berezin et al. tap into their collective experience to address the management of laboratory data and information, particularly leveraging systems like a laboratory information management system (LIMS). In these rules, the authors examine aspects such as common culture development, inventory management, project management, sample management, labeling, and more. They conclude that apart from "[i]mparting a strong organizational structure for your lab information ... the goal of these rules is also to spur conversation about lab management systems both between and within labs as there is no one-size-fits-all solution for lab management." However, the application of these rules isn't exactly a straightforward process; their success "relies on the successful integration of effective software tools, training programs, lab management policies, and the will to abide by these policies."

Hierarchical AI enables global interpretation of culture plates in the era of digital microbiology

In this 2023 paper published in Nature Communications, Signoroni et al. present the results of their efforts towards a system designed for "the global interpretation of diagnostic bacterial culture plates" using deep learning architecture. Noting the many challenges of human-based culture interpretation, the authors present the results of DeepColony, "a hierarchical multi-network capable of handling all identification, quantitation, and interpretation stages, from the single colony to the whole plate level." After reviewing the results, the authors conclude that given the "high level of agreement" between DeepColony's results and the interpretation of humans, their system holds significant promise. They add that DeepColony can be viewed as "a unique framework for improving the efficiency and quality of massive routine activities and high-volume decisional procedures in a microbiological laboratory, with great potential to refine and reinforce the critical role of the microbiologist."

Critical analysis of the impact of AI on the patient–physician relationship: A multi-stakeholder qualitative study

In this 2023 paper published in the journal Digital Health, Čartolovni et al. present the results of their qualitative, multi-stakeholder study regarding the "aspirations, expectations, and critical analysis of the potential for artificial intelligence (AI) to transform the patient–physician relationship." Pulling from hours of semi-structured interviews, the authors developed a number of themes from the interviews and discussed how to reach a consensus on their interpretation and use. The authors focused on four main themes and a variety of subthemes and present the results of their work in the framework of these themes and subthemes. From their work, they conclude there is a definitive "need to use a critical awareness approach to the implementation of AI in healthcare by applying critical thinking and reasoning, rather than simply relying upon the recommendation of the algorithm." Additionally, they highlight how "it is important not to neglect clinical reasoning and consideration of best clinical practices, while avoiding a negative impact on the existing patient–physician relationship by preserving its core values," strongly urging for the preservation of patient–physician trust in the face of increased AI use.

Judgements of research co-created by generative AI: Experimental evidence

As discussion of artificial intelligence (AI) continues to ramp up in not only academia but also industry and culture, questions abound concerning its long-term potential and perils. Attitudes about AI and what it means for humanity are diverse, and sometimes controversial. In this 2023 paper published in Economics and Business Review, Niszczota and Conway contribute to the discussion, particularly in regards to how people view research co-created by generative AI. Recruiting more than 440 individuals, the duo conducted a mixed-design experiment using multiple research processes. All told, their "results suggest that people have clear, strong negative views of scientists delegating any aspect of the research process" to generative AI, denoting it "as immoral, untrustworthy, and scientifically unsound." They conclude that "researchers should employ caution when considering whether to incorporate ChatGPT or other [large language models] into their research."

Geochemical biodegraded oil classification using a machine learning approach

The use of chromatography methods in characterizing and interpreting phenomena related to oil has been around for a long time. However, reading chromatograms—particularly large quantities of them—can be time-consuming. Bispo-Silva et al. propose a more rapid process using modern deep learning and artificial intelligence (AI) techniques, which are increasingly being adopted by large companies. The authors land on using convolutional neural networks (CNNs) for this work, particularly for discriminating biodegraded oils from non-biodegraded oils. After presenting background on the topic, as well as their materials and methods, the authors present the results of using CNNs and a variety of algorithms to classify chromatograms with both accurate and misleading training materials. The authors conclude that "CNN can open a new horizon for geochemistry when it comes to analysis by" a variety of gas chromatography techniques, as well as "identification of contaminants (as well as environmental pollutants), identification of analysis defects, and, finally, identification and characterization of origin and oil maturation."

Knowledge of internal quality control for laboratory tests among laboratory personnel working in a biochemistry department of a tertiary care center: A descriptive cross-sectional study

In this brief study by Mishra et al. in the Journal of Nepal Medical Association, the level of knowledge of internal quality control (IQC) in the Department of Biochemistry, B.P. Koirala Institute of Health Sciences (BPKIHS), a tertiary care center, is explored. Noting the importance of laboratory quality control and knowledge of IQC to patient outcomes, the authors conducted a descriptive cross-sectional study of its laboratory staff (n=20), asking questions related to "the understanding of the purpose of IQC, the types of control materials, various control charts, how and when IQC should be performed, and interpretations of the Levey-Jennings Chart using the Westgard rule." The authors concluded that while their results were in line with other studies conducted in similar environments (25% had adequate knowledge of IQC), their facility had work to do in improving IQC knowledge and quality management systems more broadly. They add: "Hence, providing training opportunities on laboratory IQC can be reflected as a necessity in our current laboratory set-up. This could add value to the knowledge of IQC on laboratory personnel to ensure that the reports generated within the laboratory are accurate, reliable, and reproducible."
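The Levey-Jennings/Westgard interpretation the survey asked about can be sketched briefly: control results are converted to standard-deviation (z) distances from the target mean, then checked against multirule criteria. The sketch below covers only two common rules (1-3s and 2-2s), with purely illustrative mean, SD, and control values.

```python
# Convert measured control values to z-scores against the assigned mean/SD,
# as plotted on a Levey-Jennings chart.
def z_scores(values, mean, sd):
    return [(v - mean) / sd for v in values]

def rule_1_3s(z):
    """1-3s: reject the run if any single control exceeds +/-3 SD."""
    return any(abs(x) > 3 for x in z)

def rule_2_2s(z):
    """2-2s: reject if two consecutive controls exceed 2 SD on the same side."""
    return any((z[i] > 2 and z[i + 1] > 2) or (z[i] < -2 and z[i + 1] < -2)
               for i in range(len(z) - 1))

controls = [101.0, 104.5, 105.0, 99.0]        # illustrative control results
z = z_scores(controls, mean=100.0, sd=2.0)    # -> [0.5, 2.25, 2.5, -0.5]
print(rule_1_3s(z), rule_2_2s(z))             # False True (2-2s violation)
```

A full Westgard scheme adds further rules (1-2s warning, R-4s, 4-1s, 10x), but the pattern of evaluating z-score sequences is the same.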

Sigma metrics as a valuable tool for effective analytical performance and quality control planning in the clinical laboratory: A retrospective study

In this 2023 article published in the National Journal of Laboratory Medicine, Karaattuthazhathu et al. of KMCT Medical College present the results of a performance assessment of its clinical laboratories' analyte testing parameters using a Six Sigma approach. Examining a six-month period in 2022, the authors looked at 26 parameters in biochemistry and hematology using both internal quality control (IQC) and external quality assurance (EQAS) analyses. After presenting the materials and methods used, as well as the results, the authors reviewed their results and concluded that according to their sigma metrics analysis, their laboratories are "able to achieve satisfactory results, with world-class performance of many analytes," though recognizing some deficiencies, which were corrected mid-study.
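The sigma metric used in this kind of assessment follows the standard formula: sigma = (total allowable error − |bias|) / CV, with all terms in percent. A minimal sketch, with illustrative numbers that are not taken from the study:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Standard sigma-metric formula: (TEa - |bias|) / CV, all in percent.

    TEa  = total allowable error (from quality specifications)
    bias = systematic error, typically estimated from EQAS results
    CV   = imprecision, typically estimated from IQC results
    """
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative values only: TEa 10%, bias 1%, CV 1.5% -> sigma = 6.0,
# conventionally read as "world-class" performance (sigma >= 6).
print(sigma_metric(10.0, 1.0, 1.5))  # 6.0
```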

Why do we need food systems informatics? Introduction to this special collection on smart and connected regional food systems

In this 2023 paper published in Sustainability, Tomich et al. describe the concept of food systems informatics (FSI) within the context of a special collection of papers, including this one, published in the journal. Noting many challenges to improving and transforming food systems, as well as the potential for informatics applications to play an important role, the authors describe five use cases of FSI and discuss the potential outcomes and impacts of developing and implementing FSI platforms in these and other use cases. Finally, the authors draw six major conclusions from their work, as well as several caveats about FSI implementation going forward. They argue that FSI definitely has potential as "a tool to enhance equity, sustainability, and resilience of food systems through collaborative, user-driven interaction, negotiation, experimentation, and innovation within food systems." However, the scope of FSI must be expanded to include "food systems security, privacy, and intellectual property considerations" in order to have the greatest impact.

Data management challenges for artificial intelligence in plant and agricultural research

In this 2023 article published in the journal F1000Research, Williamson et al. identify and discuss "eight key challenges in data management that must be addressed to further unlock the potential of [artificial intelligence] in crop and agronomic research." Noting the state of the agricultural research landscape and the growing potential of artificial intelligence and machine learning in the field, the authors perform a literature review that better shapes the nuances of those eight key challenges: data heterogeneity, data selection and digitization, data linkages, standardization and curation of data, sufficient training and ground truth data, system access, data access, and engagement. After examining these challenges in detail, the authors conclude there's a definitive "need for a more systemic change in how research in this domain is conducted, incentivized, supported, and regulated," a change encompassing stronger collaboration, more efficient machine learning methods, improved data curation, and improved data management methods.

A blockchain-driven IoT-based food quality traceability system for dairy products using a deep learning model

In this 2023 paper published in the journal High-Confidence Computing, Manisha and Jagadeeshwar present their custom food quality traceability system, which combines various tools like blockchain, internet of things (IoT) mechanisms, and deep learning architecture for improving traceability in perishable food supply chains (PFSCs). Noting previous works that incorporated some but not all of these elements, each with their own downsides, the authors chose a system model that incorporates blockchain-enabled RFID scans that, upon verification, get added to the overall secure ledger, along with associated IoT-based metadata from sensors gauging humidity and temperature. The duo turns to milk manufacturing and distribution in their case study, applied to their blockchain-enabled deep residual network (BC-DRN) methodology. After comparing their results with other prevalent methodologies used in supply chain management (SCM), the authors conclude that their BC-DRN traceability system, when gauged on metrics like sensitivity, response time, and testing accuracy, beat out other methods. They add that "the performance of the devised scheme can be improved by considering better feature extraction techniques."
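The ledger idea behind such a system — RFID scan events plus sensor metadata, appended as hash-linked blocks so tampering is detectable — can be sketched minimally. This is a generic stand-in for the blockchain component, not the authors' BC-DRN implementation; the field names (tag_id, temp_c, humidity) are hypothetical.

```python
import hashlib
import json

def add_block(chain, event):
    """Append an event as a block whose hash covers the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(event, sort_keys=True)
    block_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    chain.append({"event": event, "prev": prev_hash, "hash": block_hash})

def verify(chain):
    """Recompute every hash in order; any altered event breaks the chain."""
    prev = "0" * 64
    for block in chain:
        payload = json.dumps(block["event"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"tag_id": "MILK-001", "temp_c": 4.1, "humidity": 62})
add_block(chain, {"tag_id": "MILK-001", "temp_c": 4.3, "humidity": 60})
print(verify(chain))                  # True
chain[0]["event"]["temp_c"] = 9.9     # simulated tampering
print(verify(chain))                  # False
```

A production system would add distributed consensus and signed transactions; the sketch only shows why an append-only hash chain makes recorded scan events tamper-evident.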

Effect of good clinical laboratory practices (GCLP) quality training on knowledge, attitude, and practice among laboratory professionals: Quasi-experimental study

In this brief 2023 article published in the Journal of Clinical and Diagnostic Research, Patel et al. provide the results of their survey-based analysis of laboratorians' knowledge, attitude, and practice (KAP) towards laboratory quality through good clinical laboratory practice (GCLP) training. Noting the "ethical obligation to provide accurate and precise results that are cost- and time-effective," the authors state that it's imperative for clinical laboratory personnel to adhere to quality planning and system implementation, while also possessing an understanding of quality management principles as they apply to the lab. The authors describe their survey format and present its results, concluding that while no statistically significant differences in staff attitudes towards quality in the lab could be found after GCLP training, laboratorians in their survey acknowledged the benefits of GCLP guidelines and accreditation, as well as the importance of training on such matters. "Such training and assessments would also aid in evaluating the performance of laboratory staff, contributing to improved learning, execution of GLPs, and consistent patient care services."

GitHub as an open electronic laboratory notebook for real-time sharing of knowledge and collaboration

In this 2023 paper published in the journal Digital Discovery, Scroggie et al. of the University of Sydney present their efforts towards utilizing the open developer platform GitHub as an electronic laboratory notebook (ELN) for chemistry research. Noting a lack of open-source ELNs with a focus on non-organic chemistry that have a wealth of collaboration tools, as well as problems with expensive and inflexible commercial ELNs, the authors turned to the many open facets of GitHub to repurpose its workings for synthetic chemistry projects. After a brief discussion of GitHub, the authors explain how they used the various facets of GitHub for ELN-related tasks, including notebooks, data and metadata management, and collaborative tools. They also acknowledged several shortcomings of their approach, including learning Markdown, dealing with data storage limitations, and integrating discipline-specific applications. The authors conclude that while some of GitHub's features "are undeniably more oriented towards coders, such as the Actions tab in which users can set up workflows using code," these features "do not detract from GitHub's usefulness as an ELN, which lies mainly in its adaptability and capacity for knowledge-sharing and collaboration."

SODAR: Managing multiomics study data and metadata

In this 2023 paper published in the journal GigaScience, Nieminen et al. of the Berlin Institute of Health at Charité–Universitätsmedizin Berlin present SODAR (System for Omics Data Access and Retrieval), an open-source scientific data management system (SDMS) with a focus on omics data management during multiassay research studies. Noting numerous data management challenges and a dearth of open-source options, the authors describe the software framework, features, and limitations of the system. Highlighting SODAR's "programmable application programming interfaces (APIs) and command-line access for metadata and file storage," the authors conclude that SODAR can readily "support multiple technologies such as whole genome sequencing, single-cell sequencing, proteomics, and mass spectrometry," though some aspects such as automated data export and "data commons" access are currently not available.