In this 2017 paper by Plebani and Sciacovelli of the University Hospital of Padova, the duo offer their insights into the benefits and challenges of a clinical laboratory getting ISO 15189 accredited. Noting that in the European theater “major differences affect the approaches to accreditation promoted by the national bodies,” the authors discuss the quality management approach that ISO 15189 prescribes and why it's worth following. They conclude that while laboratories can realize “world-class quality and the need for a rigorous process of quality assurance,” accreditation still requires a high level of awareness among staff of the importance of ISO 15189 accreditation, an internal assessment plan, and well-defined, “suitable and user-friendly operating procedures.”
Compliance culture or culture change? The role of funders in improving data management and sharing practice amongst researchers
In this 2017 paper by Curtin University’s Cameron Neylon, the concepts of research data management (RDM) and research data sharing (RDS) are defined and identified as increasingly common practices for funded research enterprises, particularly public-facing ones. But what of the implementation of these practices and the challenges associated with them? Neylon finds more than expected in his study, identifying a sharp contrast between “those who saw [data management] requirements as a necessary part of raising issues among researchers and those concerned that this was leading to a compliance culture, where data management was seen as a required administrative exercise rather than an integral part of research practice.” Neylon concludes with three key recommendations for researchers to follow in further shaping how RDM and RDS are approached in research communities.
This brief review article from early 2017 looks at the basic elements of public health informatics and addresses how they’re implemented. Aziz compares paper-based surveillance systems with electronic systems, noting the various improvements and challenges that have come with transitioning to electronic surveillance. He also reviews how public health informatics is applied in the U.S. and other parts of the world, including Saudi Arabia. Aziz concludes that “[p]atients, healthcare professionals, and public health officials can all help in reshaping public health through the adoption of new information systems, the use of electronic methods for disease surveillance, and the reformation of outmoded processes.”
Preferred names, preferred pronouns, and gender identity in the electronic medical record and laboratory information system: Is pathology ready?
In this technical note from the University of Iowa Hospitals and Clinics, Imborek et al. walk through the benefits and challenges of evaluating, implementing, and assessing the integration of preferred name and gender identity into their various laboratory systems and workflows. As preferred name has been an important point “in providing inclusion toward a class of patients that have historically been disenfranchised from the healthcare system” — primarily transgender patients — awareness is increasing at many levels about not only the inclusion of preferred name but also the complications that arise from it. From customizing software to billing and coding issues, from blood donor issues to laboratory test challenges, the group concludes that despite the “major undertaking [that] required the combined effort of many people from different teams,” the efforts provide new opportunities for the field of pathology.
Experimental application of business process management technology to manage clinical pathways: A pediatric kidney transplantation follow-up case
Sure, there’s plenty of talk about big data in the research and clinical realms, but what of smarter data, gained from using business process management (BPM) tools? That’s the direction Andellini et al. went in after looking at the kidney transplantation cases at Bambino Gesù Children’s Hospital, realizing the potential for improving clinical decisions and the scheduling of patient appointments, as well as reducing human errors in the workflow. The group concluded “that the usage of BPM in the management of kidney transplantation leads to real benefits in terms of resource optimization and quality improvement,” and that a “combination of a BPM-based platform with a service-oriented architecture could represent a revolution in clinical pathway management.”
Expert search strategies: The information retrieval practices of healthcare information professionals
In this 2017 article published in JMIR Medical Informatics, Russell-Rose and Chamberlain wished to take a closer look at the needs, goals, and requirements of healthcare information professionals performing complex searches of medical literature databases. Using a survey, they asked members of professional healthcare associations questions such as “how do you formulate search strategies?” and “what functionality do you value in a literature search system?” The researchers concluded that “current literature search systems offer only limited support for their requirements” and that “there is a need for improved functionality regarding the management of search strategies and the ability to search across multiple databases.”
Unhappy with the electronic laboratory notebook (ELN) offerings for organic and inorganic chemists, a group of German researchers decided to take development of such an ELN into their own hands. This 2017 paper by Tremouilhac et al. details the process of developing and implementing their own open-source solution, Chemotion ELN. After over a half year of use, the researchers state the ELN now serves “as a common infrastructure for chemistry research and [enables] chemistry researchers to build their own databases of digital information as a prerequisite for the detailed, systematic investigation and evaluation of chemical reactions and mechanisms.” The group is hoping to add more functionality in time, including document generation and management, additional database query support, and an API for chemistry repositories.
In this brief commentary article, University Corporation for Atmospheric Research’s Matthew Mayernik discusses the concepts of “accountability” and “transparency” in regard to open data, conceptualizing open data “as the result of ongoing achievements, not one-time acts.” In his conclusion, Mayernik notes that “good data management can happen even without sanctions,” and that provision of access alone doesn’t make something transparent; licenses and other complications make transparency difficult and require researchers “to account for changing expectations and requirements” of open data.
The use of publicly available data repositories is increasingly encouraged by organizations handling academic researchers’ publications. Funding agencies, data organizations, and academic publishers alike are providing their researchers with lists of recommended repositories, though many aren’t certified (as few as six percent of recommended repositories are certified). Husen et al. show that the landscape of recommended and certified data repositories is varied, and they conclude that “common ground for the standardization of data repository requirements” should be a common goal.
In this 2017 paper by Mathews and Marc, a picture is painted of laboratory information system (LIS) usability, based on a structured survey with more than 250 qualifying responses: LIS design isn’t particularly user-friendly. From ad hoc laboratory reporting to quality control reporting, users consistently found several key LIS tasks either difficult or very difficult, with Orchard Harvest as the demonstrable exception. The authors conclude “that usability of an LIS is quite poor compared to various systems that have been evaluated using the [System Usability Scale],” though “[f]urther research is warranted to determine the root cause(s) of the difference in perceived usability” between Orchard Harvest and the other discussed LIS.
As calls for open scientific data get louder — particularly in Europe — research institutes, organizations, and data management stakeholders are being forced to consider their data management policies and research workflow in an attempt to better answer those calls. Martin et al. of the National Research Institute of Science and Technology for Environment and Agriculture (Irstea) in France present their institute’s efforts towards those goals in this 2017 paper published in LIBER Quarterly. They refer frequently to the scientific and technical information (STI) professionals who must take on new skills, develop new policies, and implement new tools to better meet the goals of open data. Martin et al. conclude with five points, foremost that these changes don’t happen overnight; the necessary change “requires adaptation to technological developments and changes in scientific practices” in order to be successful.
Comprehending the health informatics spectrum: Grappling with system entropy and advancing quality clinical research
In this 2017 paper published in ICT Express, Huang et al., associated with a variety of power research institutes in China, provide an overview of energy informatics and the standardization efforts that have occurred around it and the concept of the “smart grid.” After describing the various energy systems, technical fundamentals, and standardization efforts, the group concludes that “[l]earning from the successful concepts and technologies pioneered by the internet, and establishing an open interconnection protocol are basic methods for the successful development of the future energy system.” They also add that “[i]t is essential to build an intelligent information system (or platform) to promote the interoperability and coordination of different energy systems and [information and communication technology] planes.”
The electronic laboratory notebook (ELN) is implemented in a wide variety of research environments, but what are the special requirements of a public-private partnership? In this 2016 paper published in PeerJ Computer Science, members of the TRANSLOCATION consortium — “a multinational and multisite public–private partnership (PPP) with 15 academic partners, five pharmaceutical companies and seven small- and medium-sized enterprises” — carefully present the process they took in selecting, installing, supporting, and re-evaluating an ELN for their scientific research. The group concludes that selection, implementation, change management, user buy-in, and value-added ability are all vital to the adoption and use of an ELN in a PPP.
This paper published in Journal of Medical Biochemistry looks back on the benefits and errors associated with the implementation of a laboratory information system (LIS) in the Railway Healthcare Institute of Belgrade, Serbia. Author Vera Lukić explains how their first implementation went wrong and how a new LIS quickly helped improve the workflow of Railway’s lab. Lukić finds that system flexibility and the ability to customize to user needs were most important in their implementation. She concludes that their LIS benefited them through the “increased pace of patient admission, prevention of sample identification errors, prevention of test translation errors, permanent results storage in electronic form, prevention of billing errors, improved time savings and better staff organization,” resulting in “a step forward towards optimization of the total testing process.”
This brief viewpoint article by Deliberato et al. looks at the state of note taking in electronic health records (EHRs) and proposes that EHR developers look to artificial intelligence (AI) components to improve note taking and other tasks in their software. Not only would AI improve note taking, they argue, but “AI would provide helpful suggestions to the user about what information is available and how it might influence the next course of action. AI could also function to emphasize or deemphasize certain elements of the record, based on previous results, external databases, and knowledge networks.”
In this 2017 paper published in the Journal of Biomedical Informatics, Panahiazar et al. “propose a novel metadata prediction framework to learn associations from existing metadata that can be used to predict metadata values.” They tested their framework using experimental metadata contained in the Gene Expression Omnibus (GEO), an international public repository for community-contributed genomic datasets. The framework itself uses association rule mining (ARM) algorithms to predict structured metadata, and they found that indeed ARM is a strong tool towards that goal, though with several limitations. They conclude that with tools like theirs, the discovered “[p]redictive metadata can be used both prospectively to facilitate metadata authoring, and retrospectively to improve, correct and augment existing metadata in biomedical databases.”
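The ARM-based approach the authors describe can be illustrated in miniature. The sketch below is not the authors' framework — it is a minimal, self-contained example of the underlying idea, assuming metadata records are simple dictionaries of field–value pairs (the field names and GEO-style values here are hypothetical): mine single-antecedent association rules from complete records, then use the highest-confidence matching rules to suggest values for missing fields.

```python
from itertools import combinations
from collections import Counter

def mine_rules(records, min_support=0.3, min_confidence=0.8):
    """Mine single-antecedent association rules (A -> B) from metadata records.

    Each record is a dict of field -> value; an item is a (field, value) pair.
    Returns a list of (antecedent, consequent, support, confidence) tuples.
    """
    n = len(records)
    counts = Counter()
    for record in records:
        items = sorted(record.items())
        for item in items:                       # count single items
            counts[item] += 1
        for pair in combinations(items, 2):      # count co-occurring pairs
            counts[frozenset(pair)] += 1
    rules = []
    for key, c in counts.items():
        if isinstance(key, frozenset) and len(key) == 2:
            a, b = sorted(key)
            support = c / n
            if support < min_support:
                continue
            for ant, con in ((a, b), (b, a)):    # try both rule directions
                confidence = c / counts[ant]
                if confidence >= min_confidence:
                    rules.append((ant, con, support, confidence))
    return rules

def predict(partial, rules):
    """Suggest values for missing fields; the highest-confidence rule wins."""
    known = set(partial.items())
    suggestions = {}
    for ant, con, _support, _conf in sorted(rules, key=lambda r: -r[3]):
        field, value = con
        if ant in known and field not in partial and field not in suggestions:
            suggestions[field] = value
    return suggestions
```

For example, given GEO-like sample records in which mouse samples always use a particular array platform, `predict({"organism": "Mus musculus"}, rules)` would suggest that platform for a record missing the field — the prospective "metadata authoring" use the authors describe; the same rules applied to complete records can flag retrospective inconsistencies.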
Rapid development of entity-based data models for bioinformatics with persistence object-oriented design and structured interfaces
When it comes to bioinformatics data, databases are among the more important behind-the-scenes workhorses. Yet inherent challenges of data heterogeneity and context-dependent interconnection in database design have driven the creation of specialized databases, which has, as a byproduct, caused additional problems in their creation. In this paper, Jerusalem College of Technology’s Ezra Tsur proposes “an open-source framework for the curation of specialized databases,” one that demonstrates “integration of the most relevant technologies to OO-based database design in a single framework” as well as extensibility to function with many other bioinformatics tools.
What are the links between pathology in the clinical setting and bioinformatics? Are residents in pathology gaining enough training in bioinformatics? And why should they learn about it in the first place? Clay and Fisher provide their take on these questions in this 2017 review paper published in Cancer Informatics. From their point-of-view, bioinformatics training is vital to practicing pathology “in the ‘information age’ of diagnostic medicine,” and training should be taken more seriously at the residency and fellowship levels. They conclude that “in order for bioinformatics education to firmly integrate into the fabric of resident education, its importance and broad application to the practice of pathology must be recognized and given a prominent seat at the education table.”
This 2015 paper by Faria-Campos et al. of the Brazilian Universidade Federal de Minas Gerais presents the reader with an overview of FluxCTTX, essentially a cytotoxicity module for the Flux laboratory information management system (LIMS). Citing a lack of laboratory informatics tools that can handle the specifics of cytotoxicity assays, the group developed FluxCTTX and tested it in five different laboratory environments, concluding that it can better “guarantee the quality of activities in the process of cytotoxicity tests and enforce the use of good laboratory practices (GLP).”