Since 1995, Astrix Technology Group has been serving the scientific community by helping our customers improve laboratory business processes, source the right talent, and harmonize quality processes by employing the right technology. Our core mission is to build value and trust with the clients we serve, and ultimately help our customers do the important scientific work that improves the world we live in.
2019 was an exceptional year for Astrix. Over the past year, we doubled our practice servicing federal agencies, reached an important milestone of 750 total employees, and grew by an impressive 34%. Along the way, we added new services, conducted industry surveys, produced helpful white papers and blog posts, sponsored and presented at several important industry conferences, and provided a number of information-packed webinars to support the scientific community. Let’s take a closer look at some of the key contributions and achievements that helped make 2019 such an exciting year for the Astrix family.
So far in our LIMS master data best practices series, we have discussed how to define master data and create a Master Data Plan, how to effectively extrapolate master data from current records to configure your system, and how to configure your master data so it will be easy to maintain and scale as your organization grows and the system matures.
The Master Data Plan, along with the other documents we have discussed in previous blogs in this series, is part of an overall quality control process.
In today’s global economy, mergers and acquisitions have become a dominant strategy to improve profitability, maintain competitive edge, and expand services and reach. This practice is common in several industries such as pharmaceutical, biotech, food and beverage, oil and gas, and others. While corporate mergers certainly can provide several benefits for the organizations involved, they can also present significant challenges, not the least of which is harmonization and optimization of the laboratory environment. This often leads to the need for integrating multiple LIMS applications to support a global enterprise.
Scientific organizations that have recently undergone a merger, and oftentimes even those that have not, frequently find themselves in a situation where different labs in different locations are using different LIMS technologies and solutions. This scenario inhibits process efficiency, cross-organization data reporting, and regulatory compliance, and can result in high IT demand.
Given the advanced capabilities of modern LIMS, and the competitive advantages gained through establishing digital continuity across the product lifecycle, there is a strong incentive for modern scientific organizations with disparate LIMS to harmonize their laboratory environment by integrating the multiple LIMS into a single system. In this blog, we will discuss best practices for a project of this nature.
People have been asking, “What is a LIMS?,” practically since LIMS was invented. (We’ve asked it ourselves!) In 1998, Dr. Alan McLelland of the Institute of Biochemistry, Royal Infirmary, Glasgow, penned his famous—at least in informatics circles—essay offering four viewpoints on this question; those of the analytical staff, the laboratory manager, the IT group, and the finance team.
Whether clinical, analytical, or research, all laboratories operate under a time constraint: information must be provided to a customer, client, or management so that business decisions can be made based on it. Making such information available quickly is nearly impossible without a LIMS; traditional record-keeping methods using Excel or paper-based files make it very difficult to source the required data.
The pace of change in the pharmaceutical and biotech industries means that most companies eventually must face the daunting task of relocating their laboratories. Whether due to mergers, acquisitions, funding changes or simply organic growth, a laboratory relocation is an extraordinarily complex undertaking that will impact your laboratory’s scientists, research and business goals.
A laboratory relocation involves moving high-end analytical instrumentation, hazardous materials, products and samples, and sometimes even live animals. It will require shutting down laboratory equipment, along with safely packing and shipping instruments, samples, materials, devices, computer hardware and potentially data to the new location. Once all necessary items have been moved, they will need to be unpacked, and equipment will need to be requalified and/or revalidated.
Whether you are moving your lab across the hall, street or country, a laboratory relocation is never a routine exercise. The reality is that no two laboratories are alike – each will have a set of unique challenges that will need to be addressed with care. In this blog, we will discuss critical best practices that should be followed in all cases in order to make sure your laboratory relocation is a smooth, safe and efficient process that minimizes downtime and disruption for your business.
Master data design has a major impact over the lifecycle of a LIMS, as nearly every piece of functionality in the system revolves around it. One of the most important aspects of any LIMS implementation is designing the master data so that it is easy to maintain and scale as the organization grows and business needs change. Some of the key benefits of configuring your master data to be maintainable and scalable include:
- Easier to add and/or modify master data down the road
- Increased system efficiency and reliability
- Future system enhancements are less resource intensive
- Better management for large volumes of data
- Increased user acceptance
- Increased ROI
In short, focusing on maintainability and scalability when configuring your master data really helps improve the lifespan and usability of your LIMS. In this blog, we will provide some best practice tips on how to set up master data so it will be easy to maintain and scale as your organization grows and the system matures.
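To make the maintainability idea concrete, here is a minimal sketch (not from any specific LIMS product; all names are hypothetical) of the difference a scalable design makes: test definitions are modeled as shared, reusable templates that many products reference, so a change to a test is made in one place rather than copied into every product specification.

```python
# Illustrative sketch of maintainable master data design: shared test
# templates referenced by products, rather than duplicated per product.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class TestTemplate:
    """A reusable analytical test definition shared across products."""
    name: str
    analytes: tuple   # e.g. ("pH",) or ("Pb", "Cd", "As")
    units: str

@dataclass
class ProductSpec:
    """A product references shared templates instead of copying them."""
    product: str
    tests: list = field(default_factory=list)

    def add_test(self, template: TestTemplate, low: float, high: float):
        # Only the product-specific limits live here; the test definition
        # itself is maintained in one place.
        self.tests.append({"test": template, "low": low, "high": high})

# One shared template...
ph_test = TestTemplate(name="pH", analytes=("pH",), units="pH units")

# ...reused by many products with different acceptance limits.
tablet = ProductSpec("Tablet A")
tablet.add_test(ph_test, low=6.5, high=7.5)

syrup = ProductSpec("Syrup B")
syrup.add_test(ph_test, low=4.0, high=5.0)
```

Because both products point at the same template object, updating the shared test definition propagates everywhere it is referenced, which is exactly the property that keeps master data easy to maintain as the system grows.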
Imagine…historic rainfalls flood your production facility and its lab, while knocking out power for miles around. Your analytical equipment is ruined. Your server is under water. Or, wildfires ignite suddenly after a lightning strike or car accident. The fires spread rapidly. Lab personnel have just enough time to escape with what they can carry. Your facility is burned to the ground. These are just two scenarios that have occurred in the last couple of years. And all of us have experienced a network failure or the dreaded blue screen of death at some point. System downtime is going to happen. You may even have a total loss on your hands in the future. How you plan for it is the key.
Do you know how to protect your LIMS or ELN in case of a catastrophic failure? It’s important to have a detailed plan in place in case a disaster happens.
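One small but essential element of any such plan is verifiable off-site backups. The sketch below is purely illustrative (the paths, file names, and backup mechanism are placeholders, not a prescription): it copies a database dump off-site under a timestamped name and verifies the copy by checksum before trusting it.

```python
# Hypothetical disaster-recovery building block: copy a LIMS database
# dump off-site and verify the copy with a SHA-256 checksum.
import hashlib
import shutil
from datetime import datetime, timezone
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of a backup file, recorded so restores can be verified."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def archive_backup(dump: Path, offsite_dir: Path) -> Path:
    """Copy a backup off-site under a timestamped name and verify it."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    dest = offsite_dir / f"{dump.stem}_{stamp}{dump.suffix}"
    offsite_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(dump, dest)
    # A copy that cannot be verified is not a backup.
    if checksum(dump) != checksum(dest):
        raise RuntimeError(f"Off-site copy of {dump} failed verification")
    return dest
```

The verification step matters as much as the copy: a disaster-recovery plan is only as good as the last restore you have actually proven works.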
Due to technological advances in laboratory instruments and higher throughput processes, data volumes in modern analytical laboratories have increased dramatically over the last several decades. While this increased data volume presents the opportunity to improve innovation and enable timely and effective business decisions, it also presents significant data management and processing challenges. In order to meet the challenge of turning this data into knowledge, laboratories are looking to automate and integrate laboratory operations and processes as much as possible in order to provide digital continuity throughout the product lifecycle.
Integrating laboratory instruments with Laboratory Information Management Systems (LIMS) is one of the best ways to automate laboratory processes. Instruments that are commonly integrated with LIMS in laboratories include:
- Particle Counters
- DNA Sequencers
- AA Analyzers
Unfortunately, many LIMS implementation projects either run out of time or lose momentum before they are able to accomplish their initial instrument integration goals. In this blog, we will discuss both the benefits of instrument integration and best practices that help ensure instruments are integrated effectively during a LIMS implementation.
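In its simplest form, instrument integration is often file-based: the instrument writes a result export, and middleware parses it into structured records for the LIMS. The sketch below illustrates that pattern; the file layout and field names are invented for illustration, since every instrument and LIMS has its own formats and interfaces.

```python
# Minimal illustration of file-based instrument integration: parsing a
# delimited instrument export into LIMS-ready result records.
import csv
import io

def parse_instrument_export(text: str) -> list:
    """Turn a delimited instrument export into structured result rows."""
    rows = []
    reader = csv.DictReader(io.StringIO(text))
    for rec in reader:
        rows.append({
            "sample_id": rec["SampleID"].strip(),
            "analyte": rec["Analyte"].strip(),
            # Convert text to a number once, at the boundary, so bad
            # values are caught before they ever reach the LIMS.
            "result": float(rec["Result"]),
            "units": rec["Units"].strip(),
        })
    return rows

# A hypothetical export from an AA analyzer:
export = """SampleID,Analyte,Result,Units
S-1001,Lead,0.012,ppm
S-1001,Cadmium,0.003,ppm
"""

results = parse_instrument_export(export)
```

Validating and typing the data at the integration boundary, rather than inside the LIMS, is what eliminates the transcription errors that manual result entry introduces.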
Whether due to increases in data volume, regulatory constraints, M&As, globalization, outsourcing, or a myriad of other reasons, legacy LIMS are becoming extremely costly to manage, and many companies are finding themselves in a situation where their legacy system is not able to adequately keep up with changing business requirements.
Organizations looking to address deficiencies in their legacy LIMS have two options – re-architect/re-engineer their legacy system or purchase and migrate to a new LIMS. In part 1 of this blog series, we discussed the steps involved in re-architecting your legacy system, as well as the benefits of doing so. In this article, we’ll explore the other option – purchasing and migrating to a new LIMS.
Agile development is one of those buzzwords that you’ve heard a lot if you’ve spent any time at all in the business world. Like Lean Six Sigma, it’s one of those secret handshake-type terms that will have everyone nodding in recognition, even if they’re not really sure what it’s about. But in laboratory informatics—particularly in…
The first laboratory information management systems (LIMS) were designed as simple tracking tools that enabled systematic control of workflows in regulated laboratories. More recently, LIMS software has evolved into something more akin to an enterprise resource planning tool for the lab that has the ability to manage many different aspects of operations across the full data lifecycle – resource management/scheduling, assay data management, data mining, data analysis, case-centric clinical data, and electronic laboratory notebook (ELN) integration.
Because of today’s rapidly changing business environment, many scientific and R&D-oriented companies find themselves in a situation where their legacy LIMS software struggles to meet business needs. Whether due to increases in data volume, regulatory constraints, M&As, globalization, outsourcing, or a myriad of other reasons, the reality is that many legacy systems are becoming extremely costly to manage and are not able to adequately address changing business requirements. The improved functionality and flexibility inherent in modern COTS systems creates a powerful incentive for these companies to explore the possibility of replacing a legacy system with a new commercial LIMS.
Replacing aging or legacy LIMS software is no small matter…
December 4, 2019 - LIMS Master Data Best Practices Part 2: Extrapolation of Master Data from Your Current Records
As we mentioned in part 1 of our LIMS Master Data Best Practices series, master data is the core information that needs to be in place in the system in order for a LIMS to function as intended. It’s essential for all parts of the organization to agree on both the meaning and usage of master data. As such, effective master data configuration is critical to the success of any LIMS implementation and migration, not to mention your organization as a whole.
Developers configuring master data in a new LIMS will need to determine what master data fields are needed and where to put them in the system. In order to do this, relevant business process workflows and data points must be identified and properly mapped in the system. In this part 2 blog of our Master Data Best Practices series, we will detail best practices for identifying and mapping your LIMS master data, along with an effective methodology to ensure your mapping exercise maximizes business value for your organization.
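A lightweight way to run that mapping exercise is to maintain an explicit mapping table from legacy record fields to LIMS master data fields, and to flag anything unmapped so gaps surface for review instead of being silently dropped. The sketch below illustrates the idea; the field names are invented, not taken from any particular LIMS.

```python
# Hedged sketch of the master data mapping exercise: a mapping table
# from legacy fields to (hypothetical) LIMS fields, with gap detection.
FIELD_MAP = {
    "Test Name":  "analysis.name",
    "Method Ref": "analysis.method_id",
    "Spec Low":   "spec.limit_low",
    "Spec High":  "spec.limit_high",
}

def map_record(legacy: dict) -> tuple:
    """Translate one legacy record; return (mapped, unmapped_fields)."""
    mapped, unmapped = {}, []
    for field, value in legacy.items():
        target = FIELD_MAP.get(field)
        if target is None:
            unmapped.append(field)   # surface gaps for review
        else:
            mapped[target] = value
    return mapped, unmapped

record = {"Test Name": "Assay", "Method Ref": "USP<905>", "Operator": "JD"}
mapped, unmapped = map_record(record)
# "Operator" has no target yet, so it is flagged rather than silently lost.
```

Reviewing the unmapped list with the business owners of each workflow is where much of the value of the mapping exercise comes from: every flagged field is either a gap in the new configuration or data that can be deliberately retired.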
Given that many organizations and their laboratories are looking to improve efficiency and reduce costs, it may seem self-evident that introducing a Laboratory Information Management System (LIMS) should help in achieving that aim. However, tangible evidence in the form of a projected Return on Investment (ROI) and business plan is frequently required. In addition, these can provide a basis against which success of the project can be measured. Autoscribe Informatics have just published an updated version of their popular white paper ‘Justifying the Purchase of a LIMS’. It is designed to help any organization develop a detailed, evidence-based justification for investing in a LIMS. This updated version not only discusses the key factors to be considered in an ROI assessment, but also provides models and examples showing how ROI may be calculated.
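The basic arithmetic behind such an assessment is straightforward. The sketch below shows one simple ROI and payback calculation with entirely invented figures; a real justification would substitute the organization's own cost and benefit estimates and will likely use more sophisticated models (e.g. discounted cash flows).

```python
# Illustrative ROI arithmetic for a LIMS business case. All figures
# are invented; this is not a model from any published white paper.
def simple_roi(investment: float, annual_benefit: float, years: int) -> dict:
    """Return total ROI (%) and payback period for a flat annual benefit."""
    total_benefit = annual_benefit * years
    roi_pct = (total_benefit - investment) / investment * 100
    payback_years = investment / annual_benefit
    return {"roi_pct": round(roi_pct, 1),
            "payback_years": round(payback_years, 1)}

# e.g. a $250k LIMS investment saving $120k/year in analyst time over 5 years
figures = simple_roi(investment=250_000, annual_benefit=120_000, years=5)
```

Even this flat-benefit version is useful as a baseline against which the project's actual savings can later be measured.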
The topic of this blog was occasioned by a recent calendar observation. November 8 is recognized in the United States as National STEM Day (or National STEAM Day). STEM is the acronym for science, technology, engineering, and math; STEAM is its extension when you include the arts. Educators around the country use the occasion to expose their students to STEM activities and to talk about potential career paths. The Society for Information Management provides information about a number of STEM-related activities specific to information technology.
There is often some confusion in the use of the terms Laboratory Information System (LIS) and Laboratory Information Management System (LIMS), and a temptation to use the terms interchangeably. Generally, however, the term LIS refers to systems used to manage clinical diagnostic testing within a hospital or healthcare environment. LIMS, on the other hand, refers to systems used in other analytical testing environments, for example those associated with pharmaceutical, food or chemical manufacturing, environmental control, and commercial non-clinical testing organizations.
November 13, 2019 - Reducing the Cost of Computer Systems Validation Efforts Through Risk Assessment
Computerized systems have been widely adopted by the pharmaceutical industry and are frequently used for instrument control and data evaluation, documentation, transmission and archiving in laboratories. The FDA requires that all computer systems in regulated environments be validated in a documented process known as computer system validation (CSV). CSV serves to verify that a computerized system does exactly what it is designed to do in a consistent and reproducible manner.
For regulatory compliance purposes, CSV confirms the accuracy and integrity of data created, modified, archived, retrieved and transmitted by a computer system in order to ensure product safety and effectiveness. Computer system validation is required when configuring a new system or making a change in a validated system (upgrades, patches, extensions, etc.).
Depending on the complexity and functionality of computer systems, CSV can be a significant undertaking. Pharmaceutical companies have even been known to delay upgrading applications to avoid the validation effort required. In this blog, we will provide some tips on performing a risk-based CSV in order to dramatically lower the costs of validation efforts.
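The core of a risk-based approach is scoring each system function and scaling the validation effort to the result. The sketch below illustrates the idea in the spirit of GAMP-style assessments: score severity, likelihood, and detectability, multiply, and classify. The scales, thresholds, and effort levels shown are illustrative assumptions, not a regulatory standard.

```python
# Illustrative risk scoring for risk-based CSV. Scales and thresholds
# are assumptions for demonstration, not regulatory requirements.
def risk_class(severity: int, likelihood: int, detectability: int) -> str:
    """Classify a system function on 1-3 scales (3 = worst case)."""
    for v in (severity, likelihood, detectability):
        if not 1 <= v <= 3:
            raise ValueError("scores must be between 1 and 3")
    score = severity * likelihood * detectability   # ranges 1..27
    if score >= 18:
        return "high"      # full validation with requirements traceability
    if score >= 6:
        return "medium"    # targeted testing of the affected functions
    return "low"           # vendor documentation plus smoke testing

# A patient-impacting result calculation scores far higher than a
# cosmetic report-header change, and so draws far more validation effort:
critical = risk_class(severity=3, likelihood=3, detectability=2)
cosmetic = risk_class(severity=1, likelihood=2, detectability=1)
```

Concentrating documented testing on the "high" bucket is precisely how the risk-based approach lowers total validation cost without weakening the compliance case for critical functionality.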
Change is clearly a dominant theme in the 21st century. With modern trends like the digital revolution, globalization, talent shortages, greater workforce mobility, and a multigenerational workforce, long-established business models are facing disruption on many different fronts, and companies seeking to maintain competitive advantage must be more agile and flexible than ever before.
One of the keys to business success in today’s volatile economic environment is attracting and retaining the best talent. As such, industry leading organizations are recognizing the importance of the role of HR leaders to their continued success. Once known for simply managing traditional personnel and administrative tasks, roles for Chief Human Resources Officers (CHROs) are evolving beyond siloed people initiatives.
The BIOVIA ELN is a crucial technology that works seamlessly with Pipeline Pilot. Modern laboratory technologies such as Electronic Laboratory Notebooks (ELNs) and LIMS have made enormous amounts of data available to scientists in life science laboratories. Scientists who record, manage and archive research data in paper notebooks and perform tasks manually simply cannot keep up with this deluge of data. These paper-based methods lead to scenarios where much collected data is siloed and never analyzed, while many experiments end up being needlessly repeated.