Analysis of Cannabis
and its constituents is seemingly always in a state of improvement and revision as we continue to learn more about the plant. In this 2021 paper published in SN Applied Sciences
, Morehouse et al.
propose another improvement to determining cannabinoid concentrations for the purposes of differentiating hemp from marijuana and other potency testing measures. In their paper, the authors turn to an "active grinding media" using a bead mill for homogenizing the plant and improving cannabinoid recovery for testing. After explaining their methodology, presenting their results, and discussing their findings, the authors conclude that their method results in "increased cannabinoid recovery when compared to larger particle size homogenates, and no alterations in the carboxylation profiles of CBDA or THCA during processing." This proves to be a boon to those who need "to maintain the highest of potency and purity standards when moving forward with their molecular analysis."
In this 2021 paper published in the journal Applied Sciences
, Jofre et al.
discuss the cybersecurity and privacy requirements associated with point-of-care (POC) and health information systems in the field of healthcare. In particular, the authors "propose a use-case approach to assess specifications of cybersecurity and privacy requirements of POC systems in a structured and self-contained form." After introducing the concept of cybersecurity in POC and other medical devices, they discuss their use case in the scope of the European Union's Secure and Private Health Data Exchange (CUREX) software platform for delivering better trust and security to applications in the healthcare domain. They provide in-depth details about their use case, its technical development, and its implementation in CUREX. They then discuss their results, concluding that as cybersecurity and privacy protection requirements continue to be implemented in healthcare, validated use cases such as their own will be vital for healthcare facilities adopting POC and related technologies, as they will further ensure "the highest quality for what is actually being delivered on the ground or showing promise in terms of developments in the pipeline."
In this brief commentary article published in Health Services Research and Managerial Epidemiology
, Matthews and Proctor of the University of Central Florida argue that simply building out public health informatics infrastructure isn't enough to ensure better public health data and improved health outcomes in the populace. The duo notes that when considering the build-out of such technologies, it's vital the implementing organization(s) ask "who are we building this product for, and do we have the right information to back up our theories on implementation and use?" Improved technology skills within the public health workforce aren't enough, they argue, and important human factors research is being neglected when planning for systems implementations. After discussing the importance of human factors research in public health practice and what it should look like, they conclude with four critical recommendations for new and existing public health informatics implementations, including stakeholder collaboration, theoretically informed and validated data collection, proper documentation, and malleable but well-informed technology requirements statements for the future.
In this 2021 perspectives article published in the journal iScience
, Hauschild et al.
of Philipps University of Marburg lay out what they believe to be the first guidelines for an academic organization needing to incorporate a manageable and approachable quality management system (QMS) to support academic research, particularly that involving software developed in-house. Noting the challenges of academic scientific software development in health informatics and bioinformatics environments, the authors propose flexible quality management mechanisms as a means to "facilitate technology transfer as much as possible while keeping the overhead for researchers and developers in a well manageable range." After presenting the foundational guidelines for academic organizations to implement to not only promote reproducibility and reusability, but also support FAIR principles, the authors conclude that their guidelines effectively lower "the hurdle for research organizations to set up quality management" and "significantly facilitate reproducibility and reusability of scientific software and speed up technology transfer in a controlled and predictable way."
Clinical informatics continues to be a vital component for helping to ensure quality clinical outcomes for patients. But what of those who must learn about clinical informatics and what it entails? What of its core competencies for learning? Davies et al.
explore this topic in the scope of the United Kingdom and the NHS, noting that no national core competency framework for clinical informatics exists in that country. The authors provide an introduction to clinical informatics and a background review of building their Core Competencies Project in an attempt to develop a U.K.-based competency framework. Their methods—heavily leaning on interviews and a survey—are then described, along with the resulting final competency framework. After a discussion of their results and their limitations, the authors conclude that their mixed-methods approach allowed them to "take a systematic and structured approach to co-designing the framework with no major issues," adding that the "combination of qualitative and quantitative methods contributed to the robustness of the final output." However, the application of the framework still requires professional considerations and attentiveness to the needs of all stakeholders to ensure it's not too unwieldy, which can lead to knowledge "burnout."
This 2021 article published in the journal Practical Laboratory Medicine
examines laboratory testing in the scope of quality management and laboratory stewardship programs and the long-term benefits they bring to an organization. White et al.
first discuss the evolution of quality in the laboratory and then introduce the concept of "laboratory stewardship" and its ability to improve patient care. The rest of the work focuses on implementing such a program and presenting a hypothetical case study to show how a laboratory stewardship program can have a practical impact on a healthcare system. The authors conclude that a well-implemented stewardship program "presents a valuable opportunity for laboratory professionals to engage with clinical colleagues and drive change." They also distill their work down to five key elements required for such programs to have maximum impact: "1) a clear vision and organizational alignment; 2) appropriate skills for program execution and management; 3) resources to support the program; 4) incentives to motivate participation; and, 5) a plan of action that articulates program objectives and metrics."
In this 2021 journal article published in Molecules
, Pieracci et al.
present the results of a two-year analysis and comparison of cannabis constituents in the essential oil of 11 hemp genotypes using gas chromatography–mass spectrometry (GC-MS). Noting prior researchers' difficulties in comparing "the chemical composition of the essential oils extracted from different Cannabis sativa
L. genotypes," the researchers strove to conduct a thorough examination, demonstrating the ability of GC-MS methods to perform the analyses necessary to determine most constituents, as well as "promote their employment as value-added by-products based on their peculiar characteristics." Through their efforts, the researchers identified 116 compounds, representing 90.6–99.4% of the total composition of the essential oils (EO) gathered. After analyzing their results, the authors concluded "that both the EO chemical profile and extraction yield were significantly influenced by the genotype of the starting material, the year of cultivation, and the interaction between these two factors."
This brief journal article published in Scientific Bulletin
examines how organizations can mitigate the risks of implementing artificial intelligence (AI) and other Industry 4.0 technologies in manufacturing centers, particularly with regard to cybersecurity. Maurice Dawson of the Illinois Institute of Technology first gives a brief introduction to AI and cybersecurity, highlighting that "data science management" incorporates important technology skills, while noting that the cybersecurity component of data science management is perhaps under-recognized. Dawson then discusses the trend of Industry 4.0 and the promise it holds for manufacturers, while also addressing the ever-changing environment and the risks that must be highlighted within it. Noting the importance of manufacturing to the U.S. in particular, the author concludes that "as AI becomes increasingly prominent in critical industries such as manufacturing, it is essential to ensure that proper security controls are in place to thwart any possible threat, whether internal or external."
In this English review article published in the journal Rivista Italiana di Informatica e Diritto
, University of Macerata's Yuan Li "aims to systematically and chronologically describe Chinese regulations for cross-border data exchange." Given the growth of cloud-based systems and data management of clients around the world, doing business with Chinese businesses and individuals requires a better understanding of transfer regulations in the country. Li first explains the concepts behind global data transfer and some of the agreements and soft laws that have sprung up as a result of increasing cross-border transfers, as well as the problems that arise. Li then goes into a detailed review of China's data protection laws, as well as how they are enforced and who enforces them. Then the author examines the data export regulations that have evolved in China. The conclusion is that there are still limitations to China's approaches to building a regulatory framework for data protection and transfer; however, "there are various positive dynamic developments in the framing of China’s cross-border data regulation." Yet further agreements with other nations will still depend on future developments in "multilateral trade and investment negotiations."
In this 2021 article published in Cosmetics
, Jairoun et al.
describe the details of reportedly the first study in the United Arab Emirates (U.A.E.) "to measure to what extent topical cannabinoid-based consumer products contain undeclared tetrahydrocannabinol" (THC). Noting a lack of such studies in the U.A.E. as well as other parts of the world, the researchers used gas chromatography–mass spectrometry (GC-MS) to analyze 18 cannabinoid-based cosmetics manufactured in the U.S. and E.U. and sold in the U.A.E. to determine their undeclared THC and tetrahydrocannabinolic acid (THCA) content. After discussing their methodology, materials, and results, the authors conclude that with the discovery of "high levels of undeclared tetrahydrocannabinol" during their study, there is a real need for "cannabinoid-based cosmetic product producers to issue quality certificates ... [and provide] stricter monitoring and control regarding their [products'] safety and quality."
In this mini review published in Frontiers in Water
, Carriço and Ferreira draw from their personal experiences in the Portuguese water industry to present a case for better data and information management in assessing the condition of existing urban water infrastructure. The authors turn to a collection of 16 performance indicators for assessing water supply systems (WSS) or district metering areas (DMA), as well as the poor state of data management overall in water utilities, as an impetus for improving the information systems that handle those indicators and related data. They further explain the data requirements for condition assessment of water utility assets, the importance of data integration and interoperability, and the existing barriers to getting those things correct. They close by suggesting future trends, noting however that "many small and medium size utilities worldwide are yet resistant" to such trends, while recommending that "utilities should start to investigate the data they collect and to rethink existing data models."
This mini review article published in the journal Frontiers in Digital Health
outlines the field of "diagnostic informatics," one that uses "digital health tools to facilitate the accuracy and timeliness of health information transfer and enhance the effectiveness of the decision-making processes." Georgiou et al.
focus on three primary aspects of diagnostic informatics: diagnostic errors and associated research, the safety and efficacy of test result and follow-up management, and the enhancement of clinical decision support systems. The authors succinctly discuss these three aspects and then briefly discuss their findings. They close by noting that for the digital health tools of diagnostic informatics to prove most useful, they must account for the importance of patient-centered care, the diagnostic processes already in place, and, of course, the critical organizational communication processes in use.
Noting a dearth of published research on "cannabinoid and terpenoid profiles in different hemp phenotypes within the same variety," Eržen et al.
of the Slovenian Institute of Hop Research and Brewing and the University of Ljubljana took matters into their own hands and conducted such a study. The researchers examined 11 phenotypes from three different Cannabis
varieties: Carmagnola Selected (CS), Tiborszallasi (TS), and Finola Selection (FS). Their objective? They intended "to establish a connection between the chemical composition and morphological characteristics of hemp plants and to identify phenotypes with an interesting ratio between cannabinoids for further pharmaceutical applications." After providing background and reasoning for selecting their varieties, the authors discuss the results of their chemical analyses as well as the materials and methods they used. The authors conclude by summarizing their findings, including specific phenotypes that could be prime targets for further cannabis pharmaceutical research.
In this 2021 paper published in BMC Medical Informatics and Decision Making
, Jin et al.
discuss an autoverification and validation system they implemented in their laboratory information system (LIS) to decrease validation workloads and reduce reporting risks in their healthcare system. Using a human–machine dialog approach, the authors developed and implemented their autoverification system so that it could record personnel review steps and determine whether the human–machine review results are consistent, while also allowing laboratory personnel to tweak the system further for improved autoverification accuracy. After describing their system in detail, the authors present the results of two years of use, noting that "in the two years that our online validation has been in use, there have never been any defects or reporting risks due to autoverification," while laboratory verification also became more efficient.
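Rule-based checks are the usual backbone of such autoverification. The following is only an illustrative sketch of that general idea; the function, rule names, and limits here are hypothetical and are not drawn from Jin et al.'s system.

```python
# Minimal rule-based autoverification sketch (hypothetical rules, not Jin et al.'s).
# A result passes only if every rule passes; otherwise it is routed to human review.

def autoverify(result, reference_range, delta_limit, previous=None):
    """Return (verified, reasons); any failed rule sends the result to manual review."""
    reasons = []
    low, high = reference_range
    if not (low <= result <= high):
        reasons.append("outside reference range")
    # Delta check: flag implausibly large changes from the patient's prior result.
    if previous is not None and abs(result - previous) > delta_limit:
        reasons.append("delta check failed")
    return (len(reasons) == 0, reasons)

# An in-range result with a small change from the prior value passes;
# a large jump fails the delta check and gets flagged for review.
print(autoverify(5.2, (3.5, 9.5), delta_limit=2.0, previous=5.0))
print(autoverify(5.2, (3.5, 9.5), delta_limit=2.0, previous=9.0))
```

Real systems layer many more rules (critical values, instrument flags, specimen integrity), but the pass/flag structure is the same.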
In this 2021 article published in Frontiers in Digital Health
, Spanakis et al.
describe their research efforts towards the development of a more advanced way of securely transmitting and sharing sensitive healthcare data, including through the use of blockchain and cloud computing. They first provide background on interoperability and data sharing in healthcare and the challenges that face such activities. This background also includes discussion of an interoperability framework in the European context and how it could be applied to their research. The authors then discuss the current status of standards-based health data exchange and how blockchain fits into that picture. They then propose their Innovative Secure Information Sharing Platform (InSISP), which is "able to support a fast and efficient medical information sharing at both national and cross-national levels, taking into account sharing constraints, including those imposed by the General Data Protection Regulation (GDPR)." Afterwards, they summarize their recommendations based on their InSISP, concluding that their platform has the potential to improve healthcare data sharing and transmission, though not without challenges in addressing blockchain-stored healthcare data under the GDPR, as well as the overall complexity and heterogeneity of big healthcare data and the problems that arise.
Electronic health (eHealth) solutions are increasingly used in healthcare, including in cloud computing settings. The cloud adds benefits to the use of those eHealth systems, but it also brings with it a number of security challenges, as Sivan and Zukarnain highlight in this 2021 paper. The duo turns to a literature review of eHealth technologies in relation to cloud computing and uses that literature to show the strengths and weaknesses of a cloud-based eHealth systems approach. After providing background on cloud computing and the advantages it brings to eHealth systems, the authors identify security issues introduced by moving to the cloud and what security methods can help resolve those issues. They close with a recap of the proposed solutions, future directions for cloud-based eHealth security, and the conclusion that "the future of cloud-based eHealth services will be the integration of file-based and cloud-based applications that integrate a computer-based hybrid IT solution that measures the flexibility and scalability associated with cloud management and healthcare data security."
In this 2020 paper published in PLOS Computational Biology
, Davies et al.
of the University of Manchester discuss their experiences implementing Jupyter Notebooks in bioinformatics and health informatics education. Through these digital notebooks, the researchers were able to help teach future laboratorians with little coding experience how to perform basic coding of health informatics applications, as well as more advanced postgraduates how to apply the notebook to bioinformatics activities. After discussing the inner workings of the Jupyter Notebook, the authors explain how such notebooks are able to enhance collaboration and reproducibility in laboratories, as well as improve learning and student assessment in the classroom. They then describe two education-based case studies of using the notebooks, and they describe the end results. The authors conclude that, given their experience as well as their students' experience, such notebooks can act "as a useful resource for learning to code and communicate research findings and analysis" in higher education, with the experience being transferable to professional work in the future.
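Part of the notebooks' appeal is that a single short, self-documenting cell can mix narrative, code, and output. The cell below is a purely hypothetical illustration of that style, with invented sample data; it is not taken from Davies et al.'s course materials.

```python
# Hypothetical notebook-style cell: flag samples whose blood pH falls outside
# an (invented) acceptable window. Data and thresholds are illustrative only.
samples = {"S001": 7.41, "S002": 7.28, "S003": 7.52}  # sample ID -> measured pH

# A normal arterial blood pH is commonly cited as roughly 7.35-7.45.
flagged = [sid for sid, ph in samples.items() if not 7.35 <= ph <= 7.45]
print(flagged)  # → ['S002', 'S003']
```

Because the data, logic, and result live together in one cell, a student (or reviewer) can rerun and verify the analysis end to end, which is the reproducibility benefit the authors emphasize.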
Informatics expert Joe Liscouski is back with another short laboratory informatics guide, this time geared towards those who are relatively new to the concept. In this 2021 guide, Liscouski approaches the various concepts surrounding laboratory informatics by first addressing the laboratory itself. What types of scientific and laboratory work are conducted? What's the difference between a research laboratory and a testing or "service" laboratory? How are the workflows different between the two? Liscouski notes that documenting data is critical to both types of laboratories, and historically, the paper-based laboratory notebook has been that tool. From there, he makes the logical leap to electronically documenting the same data, first by word processor and then by the electronic laboratory notebook (ELN). He then addresses the integration of these electronic tools, as well as how other informatics tools like a laboratory information management system (LIMS) and scientific data management system (SDMS) shape and fit into laboratory workflows. He concludes by discussing the actual planning necessary for implementing these and other laboratory informatics solutions in the lab.
What do the internet of things (IoT), cloud computing, and blockchain all bring to the table of healthcare, particularly during a pandemic? Celesti et al.
demonstrate in this 2020 article published in Sensors
that the thoughtful combination of these technologies can lead to a "scenario where nurses, technicians, and medical doctors belonging to different hospitals cooperate through their federated hospital clouds to form a virtual health team able to carry out a healthcare workflow in secure fashion." The authors show how an IoT-connected laboratory that feeds its instrument data into a federated hospital cloud integrated with a blockchain engine can lead to less patient movement and better security of patient data, while also allowing nurses, doctors, and other practitioners from any of the connected hospitals to review results and issue treatments from a distance. They show the results of several experiments with such a system and conclude by promoting its benefits, as well as the possibility of extending the system to the pharmacy world.
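The tamper evidence a blockchain engine lends to shared laboratory records comes from hash chaining, which can be sketched in a few lines. This is a toy illustration of the general principle, not Celesti et al.'s implementation; a real federated hospital cloud would add consensus, digital signatures, and access control, and all field names here are hypothetical.

```python
# Toy hash chain illustrating blockchain-style tamper evidence for lab records.
# Hypothetical record fields; not the authors' actual data model.
import hashlib
import json

def add_block(chain, record):
    """Append a record, linking it to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps({"record": record, "prev": prev_hash}, sort_keys=True).encode()
    ).hexdigest()
    chain.append(body)
    return chain

def verify(chain):
    """Recompute every hash; altering any earlier record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        expected = hashlib.sha256(
            json.dumps({"record": block["record"], "prev": block["prev"]},
                       sort_keys=True).encode()
        ).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"patient": "P01", "test": "CBC", "hospital": "A"})
add_block(chain, {"patient": "P01", "test": "CRP", "hospital": "B"})
print(verify(chain))  # → True; editing any stored record makes verify() return False
```

Because each block's hash covers the previous block's hash, a hospital in the federation can detect after-the-fact edits to any shared result without trusting the peer that stored it.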
In this brief 2020 paper published in JAMIA Open
, Seifert et al.
of the University of Florida health system share their experiences with taking their Beaker LIS implementation and improving its data tracking capabilities, which in turn improved "trainee education, slide logistics, staffing and instrumentation lobbying, and task tracking" within their anatomic pathology laboratories. After a brief introduction, the authors demonstrate Beaker's status board and its weaknesses, and then show how they used Beaker's MyReports module to develop an improved status board. They then present six significant challenges and how they used the adapted LIS tools to solve them. They conclude that "the technical and/or functionality framework that we demonstrated in this manuscript could be adapted by other institutions to address common problems encountered by anatomic pathology laboratories."