{"ID":95346,"post_author":"9412100","post_date":"2021-02-18 16:35:38","post_date_gmt":"2021-02-18 21:35:38","post_content":"","post_title":"LIMSjournal - Laboratory Technology Special Edition","post_excerpt":"","post_status":"publish","comment_status":"closed","ping_status":"closed","post_password":"","post_name":"limsjournal-laboratory-technology-special-edition","to_ping":"","pinged":"","post_modified":"2021-11-18 11:25:41","post_modified_gmt":"2021-11-18 16:25:41","post_content_filtered":"","post_parent":0,"guid":"https:\/\/www.limsforum.com\/?post_type=ebook&p=95346","menu_order":0,"post_type":"ebook","post_mime_type":"","comment_count":"0","filter":"","_ebook_metadata":{"enabled":"on","private":"0","guid":"E621EBAA-B37B-4883-91E2-832BF1646DFD","title":"LIMSjournal - Laboratory Technology Special Edition","subtitle":"","cover_theme":"nico_6","cover_image":"https:\/\/www.limsforum.com\/wp-content\/plugins\/rdp-ebook-builder\/pl\/cover.php?cover_style=nico_6&subtitle=&editor=Shawn+Douglas&title=LIMSjournal+-+Laboratory+Technology+Special+Edition&title_image=https%3A%2F%2Fs3.limswiki.org%2Fwww.limswiki.org%2Fimages%2F5%2F5a%2FFig4_Liscouski_NotesOnInstDataSys20.jpg&publisher=LabLynx+Press","editor":"Shawn Douglas","publisher":"LabLynx Press","author_id":"26","image_url":"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/5a\/Fig4_Liscouski_NotesOnInstDataSys20.jpg","items":{"87d7f050e0c47d7762a90382989592a1_type":"article","87d7f050e0c47d7762a90382989592a1_title":"Directions in Laboratory Systems: One Person's Perspective (Liscouski 2021)","87d7f050e0c47d7762a90382989592a1_url":"https:\/\/www.limswiki.org\/index.php\/LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective","87d7f050e0c47d7762a90382989592a1_plaintext":"\n\nLII:Directions in Laboratory Systems: One Person's PerspectiveFrom LIMSWikiJump to navigationJump to searchTitle: Directions in Laboratory Systems: One Person's Perspective\nAuthor for citation: Joe Liscouski, with editorial modifications by Shawn Douglas\nLicense for content: Creative Commons Attribution-ShareAlike 4.0 International\nPublication date: November 2021\n\nContents \n\n1 Introduction \n\n1.1 Intended audience \n1.2 About the content \n\n\n2 Looking forward and back: Where do we begin? \n\n2.1 The \"laboratory of the future\" and laboratory systems engineering \n2.2 But how did we get here? \n2.3 Moving forward \n\n\n3 Laboratory computing \n4 How automation affects people's work in the lab \n\n4.1 It can make routine work easier and more productive, reducing costs and improving ROI \n4.2 Automation can cause some jobs to end, or at least change them significantly \n\n\n5 Development of industry-wide guidelines \n\n5.1 C-1. Education for lab management and lab personnel \n5.2 C-3. Long-term usable access to lab information in databases without vendor controls \n5.3 C-4. Archived instrument data in standardized formats and standardized vendor software \n5.4 C-6. Communications \n\n5.4.1 Additional note on Bluetooth-enabled instruments \n\n\n\n\n6 Laboratory work and scientific production \n\n6.1 Different kinds of laboratory work \n6.2 Scientific production \n6.3 Demonstrating process stability: The standard sample program \n\n\n7 Where is the future of lab automation going to take us? 
8 Applying automation to lab work
8.1 More on the quality control lab and its process management
8.1.1 Early QC testing
8.1.2 Normal QC testing
8.1.3 A thought experiment
9 The creation of a "center for laboratory systems engineering"
10 Appendix 1: A very brief historical note
10.1 1.1 Collecting data from instruments
10.2 1.2 The beginning of laboratory informatics systems
10.3 1.3 Electro-mechanical robotics
10.3.1 1.3.1 Two approaches to sample processing with robotics
10.4 1.4 Sample storage organization
10.4.1 1.4.1 The nature of incoming samples
11 Abbreviations, acronyms, and initialisms
12 Footnotes
13 About the author
14 References

Introduction

The purpose of this work is to provide one person's perspective on planning for the use of computer systems in the laboratory, and with it a means of developing a direction for the future. Rather than concentrating on "science first, support systems second," it reverses that order, recommending the construction of a solid support structure before populating the lab with systems and processes that produce knowledge, information, and data (K/I/D).

Intended audience

This material is intended for those working in laboratories of all types. The biggest benefit will come to those working in startup labs, since they have a clean slate to work with, as well as those freshly entering into scientific work, as it will help them understand the roles of various systems. Those working in existing labs will also benefit by seeing a different perspective than they may be used to, giving them an alternative path for evaluating their current structure and how they might adjust it to improve operations.
However, all labs in a given industry can benefit from this guide, since one of its key points is the development of industry-wide guidelines for solving technology management and planning issues, improving personnel development, and more effectively addressing common projects in automation, instrument communications, and vendor relationships (resulting in lower costs and higher success rates). This would also provide a basis for evaluating new technologies (reducing risks to early adopters) and fostering product development with the necessary product requirements in a particular industry.

About the content

This material follows in the footsteps of more than 15 years of writing and presentation on the topic. That writing and presentation, compiled here, includes:

Are You a Laboratory Automation Engineer? (2006)
Elements of Laboratory Technology Management (2014)
A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work (2018)
Laboratory Technology Management & Planning (2019)
Notes on Instrument Data Systems (2020)
Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering (2020)
Considerations in the Automation of Laboratory Procedures (2021)
The Application of Informatics to Scientific Work: Laboratory Informatics for Newbies (2021)

While that material covers some of the "where do we go from here" discussions, I want to bring a lot of it together in one spot so that we can see what the entire picture looks like, while still leaving some of the details to the titles above. Admittedly, there have been some changes in thinking over time from what was presented in those pieces.
For example, the concept of "laboratory automation engineering" has morphed into "laboratory systems engineering," given that in the past 15 years the scope of laboratory automation and computing has broadened significantly. Additionally, references to "scientific manufacturing" are now replaced with "scientific production," since laboratories tend to produce ideas, knowledge, results, information, and data, not tangible widgets. And as the state of laboratories continues to dynamically evolve, there will likely come more changes.
Of special note are 2019's Laboratory Technology Management & Planning webinars, which provide additional useful background for what is covered in this guide.

Looking forward and back: Where do we begin?

The "laboratory of the future" and laboratory systems engineering

The "laboratory of the future" (LOF) makes for an interesting playground of concepts. People's view of the LOF is often colored by their commercial and research interests. Does the future mean tomorrow, next month, six years, or twenty years from now? In reality, it means all of those time spans, coupled with the length of a person's tenure in the lab and the legacy they want to leave behind.
However, with those varied time spans we'll need flexibility and adaptability for managing data and information while also preserving access and utility for the products of lab work, and that requires organization and planning. Laboratory equipment will change, and storage media and data formats will evolve. The instrumentation used to collect data and information will change, and so will the computers and software applications that manage that data and information. Every resource expended in executing lab work has gone toward developing knowledge, information, and data (K/I/D). How are you going to meet that management challenge and retain the expected return on investment (ROI)? Answering that question will be one of the hallmarks of the LOF. It will require a deliberate plan that touches on every aspect of lab work: people, equipment and systems choices, and relationships with vendors and information technology support groups. Some points reside within the lab, while others require coordination with corporate groups, particularly when we address long-term storage, ease of access, and security (both physical and electronic).
During discussions of the LOF, some people focus on the technology behind the instruments and techniques used in lab work, and those technologies will continue to impress us with their sophistication. However, the bottom line of those conversations is their ability to produce results: K/I/D.
Modern laboratory work is a merger of science and information technology. Some of the information technology is built into instruments and equipment; the remainder supports those devices or helps manage operations. That technology needs to be understood, planned, and engineered into smoothly functioning systems if labs are to function at a high level of performance.
Given all that, how do we prepare for the LOF, whatever that future turns out to be? One step is the development of "laboratory systems engineering" as a means of bringing structure and discipline to the use of informatics, robotics, and automation in lab systems.
But who is the laboratory systems engineer (LSE)?
The LSE is someone able to understand and be conversant in both the laboratory science and IT worlds, relating them to each other to the benefit of lab operation effectiveness while guiding IT in performing their roles. A fully dedicated LSE will understand a number of important principles:

Knowledge, information, and data should always be protected, available, and usable.
Data integrity is paramount.
Systems and their underlying components should be supportable, meaning they are proven to meet users' needs (validated), capable of being modified without causing conflicts with results produced by previous versions, documented, upgradable (without major disruption to lab operations), and able to survive upgrades in connected systems.
Systems should be integrated into lab operations and not exist as isolated entities, unless there are overriding concerns.
Systems should be portable, meaning they are able to be relocated and installed where appropriate, and not restricted to a specific combination of hardware and/or software that can't be duplicated.
There should be a smooth, reproducible (bi-directional, if appropriate), error-free (including error detection and correction) flow of results, from data generation to the point of use or need.

But how did we get here?

The primary purpose of laboratory work is developing and carrying out scientific methods and experiments, which are used to answer questions. We don't want to lose sight of that, or of the skill needed to do that work. Initially the work was done manually, which inevitably limited the amount of data and information that could be produced, and in turn the rate at which new knowledge could be developed and distributed.
However, the introduction of electronic instruments changed that, and the problem shifted from data and information production to data and information utilization (including the development of new knowledge), distribution, and management. That's where we are today.
Science plays a role in the production of data and information, as well as the development of knowledge. In between we have the tools used in data and information collection, storage, analysis, etc. That's what we'll be talking about in this document. The equipment and concepts we're concerned with here are the tools used to assist in conducting that work and working with the results. They are enablers and amplifiers of lab processes. As in almost any application, the right tools used well are an asset.

Figure 1. Databases for knowledge, information, and data (K/I/D) are represented as ovals, and the processes acting on them as arrows.[1]

One problem with the distinction between science and informatics tools is that lab personnel understand the science, but they largely don't understand the intricacies of the information and computing technologies that comprise the tools they use to facilitate their work. Laboratory personnel are educated in a variety of scientific disciplines, each having its own body of knowledge and practices, each requiring specialization. Shouldn't the same apply to the expertise needed to address the "tools"? On one hand we have chromatographers, spectroscopists, physical chemists, toxicologists, etc., and on the other roboticists, database experts, network specialists, and so on.
In today's reality, these are not IT specialists but rather LSEs who understand how to apply the IT tools to lab work with all the nuances and details needed to be successful.

Moving forward

If we are going to advance laboratory science and its practice, we need the right complement of experienced people. Moving forward to address laboratory- and organization-wide productivity needs a different perspective; rather than asking how we improve things at the bench, ask how we improve the processes and organization.
This guide is about long-term planning for lab automation and K/I/D management. The material in this guide is structured in sections, and each section starts with a summary, so you can read the summary and decide if you want more detail. However, before you toss it on the "later" pile, read the next few bits and then decide what to do.
Long-term planning is essential to organizational success. The longer you put it off, the more expensive it will be, the longer it will take to do, and the more entrenched the behaviors you'll have to overcome. Additionally, you won't be in a position to take advantage of the developing results.
It's time to get past the politics and the inertia and move forward. Someone has to take the lead on this, full-time or part-time, depending on the size of your organization (i.e., if there's more than one lab, know that it affects all of them). Leadership should come from the lab side, not the IT side, as IT people may not have the backgrounds needed and may view everything through their organization's priorities. (However, their support will be necessary for a successful outcome.)
The work conducted on the lab bench produces data and information. That is the start of realizing the benefits from research and testing work. The rest depends upon your ability to work with that data and information, which in turn depends on how well your data systems are organized and managed. This culminates in maximizing benefit at the least cost, i.e., ROI. It's important to you, and it's important to your organization.
Planning has to be done on at least four levels:

Industry-wide (e.g., biotech, mining, electronics, cosmetics, food and beverage, plastics, etc.)
Within your organization
Within your lab
Within your lab processes

One important aspect of this planning process, particularly at the top, industry-wide level, is the specification of a framework to coordinate product, process (methods), or standards research and development at the lower levels. This industry-wide framework is ideally not a "this is what you must do" but rather a common structure that can be adapted to make the work easier. It also serves as a basis for approaching vendors for products and product modifications that will benefit those in your industry, giving vendors confidence that the requests have broader market appeal. If an industry-wide approach isn't feasible, then larger companies may group together to provide the needed leadership. Note, however, that this should not be perceived as an industry/company vs. vendor effort; rather, this is an industry/company working with vendors.
The idea of a large group effort is to demonstrate a consensus viewpoint and assure vendors that their development efforts won't be in vain.
The development of this framework, among other things, should cover:

Informatics
Communications (networking, instrument control and data, informatics control and data, etc.)
Physical security (including power)
Data integrity and security
Cybersecurity
The FAIR principles (the findability, accessibility, interoperability, and reusability of data[2])
The application of cloud and virtualization technologies
Long-term usable access to lab information in databases without vendor controls (i.e., the impact of software as a service and other software subscription models)
Bi-directional data interchange between archived instrument data in standardized formats and vendor software, requiring tamper-proof formats
Instrument design for automation
Sample storage management
Guidance for automation
Education for lab management and lab personnel
The conversion of manual methods to semi- or fully automated systems

These topics affect both a lab's science personnel and its LSEs. While some topics will be of more interest to the engineers than the scientists, both groups have a stake in the results, as do any IT groups.
As digital systems become more entrenched in scientific work, we may need to restructure our thinking from "lab bench" and "informatics" to "data and information sources" and "digital tools for working with, organizing, and managing those elements." Data and information sources can extend to third-party labs and other published material. We have to move from digital systems causing incremental improvements (today's approach) to a true revolutionary restructuring of how science is conducted.

Laboratory computing

Key point: Laboratory systems are planned, designed, and engineered. They are not simply a collection of components. Laboratory computing is a transformational technology, one which has yet to fully emerge, in large part because those who work in laboratories with computing aren't sufficiently educated about it to take full advantage of it.
Laboratory computing has always been viewed as an "add-on" to traditional laboratory work. These add-ons have the potential to improve our work, make it faster, and make it more productive (see Appendix 1 for more details).
The common point of view in the discussion of lab computing has been focused on the laboratory scientist or manager, with IT providing a supporting role. That isn't the only viewpoint available to us, however. Another viewpoint is that of the laboratory systems engineer (LSE), who focuses on data and information flow. This latter viewpoint should compel us to reconsider the role of computing in the laboratory and the higher-level needs of laboratory operations.
Why is this important? Data and information generation may represent the end of the lab bench process, but it's just the beginning of its use in broader scientific work. The ability to take advantage of those elements in the scope of manufacturing and corporate research and development (R&D) is where the real value is realized. That requires planning for storage, access, and utilization over the long term.
The problem with the traditional point of view (i.e., instrumentation first, with computing in a supporting role) is that the data and information landscape is built to support the portion of lab work that is the most likely to change (Figure 2).
You wind up building an information architecture to meet the requirements of diverse data structures instead of making that architecture part of the product purchase criteria. Systems are installed as needs develop, not as part of a pre-planned information architecture.

Figure 2. Hierarchy of lab systems, noting frequency of change

Designing an informatics architecture has some things in common with building a house. You create the foundation first, a supporting structure that everything sits on. Adding the framework sets up the primary living space, which can be modified as needed without disturbing the foundation (Figure 3). If you built the living space first and then wanted to install a foundation, you'd have a mess to deal with.

Figure 3. Comparison of foundation and living/working space levels in an organization

The same holds true with laboratory informatics. Set the foundation (the laboratory information management system [LIMS], electronic laboratory notebook [ELN], scientific data management system [SDMS], etc.) first, then add the data and information generation systems. That gives you a common set of requirements for making connections and can clarify some issues in product selection and systems integration. It may seem backwards if your focus is on data and information production, but as soon as you realize that you have to organize and manage those products, the benefits will be clear.
You might wonder how you go about setting up a LIMS, ELN, etc. before the instrumentation is set. However, it isn't that much of a problem. You know why your lab is there and what kind of work you plan to do. That will guide you in setting up the foundation. The details of tests can be added as needed. Most of that depends on having your people educated in what the systems are and how to use them.
Our comparison between a building and information systems does bring up some additional points. A building's access to utilities runs through control points; water and electricity don't come in from the public supply to each room but run through central control points that include a distribution system with safety and management features. We need the same thing in labs when it comes to network access. In our current society, access to private information for profit is a fact of life. While there are desirable features of lab systems available through network access (remote checking, access to libraries, updates, etc.), they should be controlled so that those with malicious intent are prevented access, and data and information are protected. Should instrument systems and office computers have access to corporate and external networks? That's your decision, and it revolves around how you want your lab run, as well as other applicable corporate policies.
The connections layer in Figure 3 is where devices connect to each other and to the major informatics layer. This layer includes two functions: basic networking capability and application-to-application transfer. Take, for example, moving pH measurements to a LIMS or ELN; this is where things can get very messy. You need to define what that transfer is and what the standards are to ensure a well-managed system (more on that when we look at industry-wide guidelines).
To complete the analogy, people do move the living space of a house from one foundation to another, often making for an interesting experience.
Similarly, it's also possible to change the informatics foundation from one product set to another. It means exporting the contents of the database(s) to a product-independent format and then importing into the new system. If you think this is something that might be in your future, make the ability to engage in that process part of the product selection criteria. Like moving a house, it isn't going to be fun.[3][4][5] The same holds true for ELN and SDMS.

How automation affects people's work in the lab

Key point: There are two basic ways lab personnel can approach computing: it's a black box that they don't understand but use as part of their work, or they are fully aware of the equipment's capabilities and limitations and know how to use it to its fullest benefit.
While lab personnel may be fully educated in the science behind their work, the role of computing, from pH meters to multi-instrument data systems, may be viewed with a lack of understanding. That is a significant problem, because they are responsible for the results that those systems produce, and they may not be aware of what happens to the signals from the instruments, where the limitations lie, and what can turn a well-executed procedure into junk because an instrument or computer setting wasn't properly evaluated and used.
In reality, automation has both a technical impact on lab personnel's work and an impact on the personnel themselves. These are outlined below.
Technical impact on work:

It can make routine work easier and more productive, reducing costs and improving ROI (more on that below).
It can allow work to be performed that might otherwise be too expensive to entertain. There are techniques such as high-throughput screening and statistical experimental design that are useful in laboratory work but might be avoided because the effort of generating the needed data is too labor-intensive and time-consuming. Automated systems can relieve that problem and produce the volumes of data those techniques require.
It can improve accuracy and reproducibility. Automated systems, properly designed and implemented, are inherently more reproducible than a corresponding manual system.
It can increase safety by limiting people's exposure to hazardous situations and materials.
It can also be a financial hole if proper planning and engineering aren't applied to a project. "Scope creep," changes in direction, and changes in project requirements and personnel are key reasons that projects are delayed or fail.

Impact on the personnel themselves:

It can increase technical specialization, potentially improving work opportunities and people's job satisfaction. Having people move into a new technology area gives them an opportunity to grow both personally and professionally.
Full automation of a process can cause some jobs to end, or at least change them significantly (more on that below).
It can elevate routine work to more significant supervisory roles.

Most of these impacts are straightforward to understand, but several require further elaboration.

It can make routine work easier and more productive, reducing costs and improving ROI

This sounds like a standard marketing pitch; is there any evidence to support it? In the 1980s, clinical chemistry labs were faced with a problem: the cost for their services was set on an annual basis, without any adjustments permitted for rising costs during that period.
If costs rose, income dropped; the more testing they did, the worse the problem became. They addressed this problem as a community, and that was a key factor in their success. Clinical chemistry labs do testing on materials taken from people and animals and run standardized tests. This is the kind of environment that automation was created for, and they, as a community, embarked on a total laboratory automation (TLA) program. That program had a number of factors: education, standardization of equipment (the tests were standardized, so the vendors knew exactly what they needed in equipment capabilities), and the development of instrument and computer communications protocols that enabled the transfer of data and information between devices (application to application).
Organizations such as the American Society for Clinical Laboratory Science (ASCLS) and the American Association for Clinical Chemistry (AACC), as well as many others, provide members with education and industry-wide support. Other examples include the Clinical and Laboratory Standards Institute (CLSI), a non-profit organization that develops standards for laboratory automation and informatics (e.g., AUTO03-A2 on laboratory automation[6]), and Health Level Seven, Inc., a non-profit organization that provides software standards for data interoperability.
Given that broad, industry-wide effort to address automation issues, the initial response was as follows[7]:

Between 1965 and 2000, the Consumer Price Index increased by a factor of 5.5 in the United States.
During the same 36 years, at Mount Sinai Medical Center's chemistry department, productivity (indicated as the number of reported test results/employee/year) increased from 10,600 to 104,558 (nearly 10-fold).
When expressed in constant 1965 dollars, the total cost per test decreased from $0.79 to $0.15.

In addition, the following data (Tables 1 and 2) from Dr. Michael Bissell of Ohio State University provides further insight into the potential increase in labor productivity from implementing TLA in the lab[8]:

Table 1. Overall productivity of labor (FTE = full-time equivalent; TLA = total laboratory automation)

Ratio             | Pre-TLA | Post-Phase 1 | Change
Tests/FTE         | 50,813  | 64,039       | +27%
Tests/Tech FTE    | 80,058  | 89,120       | +11%
Tests/Paid hour   | 20.8    | 25.9         | +24%
Tests/Worked hour | 24.4    | 30.8         | +26%

Table 2. Productivity of labor, processing area (FTE = full-time equivalent; TLA = total laboratory automation)

Ratio                          | Pre-TLA | Post-Phase 1 | Change
Specimens/Processing FTE       | 39,899  | 68,708       | +72%
Specimens/Processing paid hour | 19.1    | 33.0         | +73%
Requests/Processing paid hour  | 12.7    | 21.5         | +69%

Through TLA, improvements can be seen in:

Improved sample throughput
Cost reduction
Less variability in the data
Reduced and more predictable consumption of materials
Improved use of people's talents

While this data is almost 20 years old, it illustrates the impact of a change from a manual system to an automated lab environment. It also gives us an idea of what might be expected if industry- or community-wide automation programs were developed.
Clinical laboratories are not unique in the potential to organize industry-wide standardized aspects of technology and work, and provide education.
The same can be done anywhere, as long as the ability to perform a particular test procedure isn't viewed as a unique competitive advantage. The emerging Cannabis testing industry represents one such opportunity, among others.
The clinical industry has provided a template for the development of laboratory systems and automation. The list of instruments that meet the clinical communications standard continues to grow (e.g., Waters' MassLynx LIMS Interface[9]). There is nothing unique about the communications standards that prevents them from being used as a basis for development in other industries, aside from the data dictionary. As such, we need to move from an every-lab-for-itself approach to lab systems development toward a more cooperative and synergistic model.

Automation can cause some jobs to end, or at least change them significantly

One of the problems that managers and the people under them get concerned about is change. No matter how beneficial the change is for the organization, it raises people's anxiety levels and can affect their job performance unless they are prepared for it. In that context, questions and concerns staff may have in relation to automating aspects of a job include:

How are these changes going to affect my job and my income? Will it cause me to lose my job? That's about as basic as it gets, and it can impact people at any organizational level.
Bringing in new technologies and products means learning new things, and that opens up the possibility that people may fail or not be as effective as they are currently. It can reshuffle the pecking order.
The process of introducing new equipment, procedures, etc. is going to disrupt the lab's workflow. The changes may be procedural, structural, or both; how are you going to deal with those issues?

Two past examples highlight different approaches. In the first, a multi-instrument automation system was being introduced. Management told the lab personnel what was going to happen and why, and that they would be part of the final acceptance process. If they weren't happy, the system would be modified to meet their needs. The system was installed, software was written to meet their needs, instruments were connected, and the system was tested. Everyone was satisfied except one technician, and that proved to be a major roadblock to putting the system into service. The matter was discussed with the lab manager, who didn't see the problem; as soon as the system was up and running, that technician would be given a new set of responsibilities, something she was interested in. But no one had told her that. As she saw it, once the system came online she was out of a job. (One of the primary methods used in that lab's work was chromatography, with the instrument output recorded on chart paper. Most measurements were done using peak height, but peak area was used for some critical analyses. Those exacting measurements, made with a planimeter, were her responsibility and, as she saw it, her unique contribution to the lab's work. The instrument system replaced the need for this work. The other technicians and chemists had no problem adapting to the data system.) However, a short discussion between her and the lab manager alleviated the concerns.
The second example was handled a lot differently, and was concerned with the implementation of a lab's first LIMS. The people in the lab knew something was going on, but not the details.
Individual meetings were held with each member of the lab team to discuss what was being considered and to learn of their impressions and concerns. (These sessions were held with an outside consultant, and the results were reported to management in summary form.) Once that was completed, the project started, with the lab manager holding a meeting of all lab personnel and IT, describing what was going to be done, why, how the project was going to proceed, and the role of those working in the lab in determining product requirements and product selection. The concerns raised in the individual sessions were addressed up front, and staff all understood that no one was going to lose their job or suffer a pay cut. Yes, some jobs would change, and where appropriate that was discussed with each individual. There was an educational course about what a LIMS was, its role in the lab's work, and how it would improve the lab's operations. When those sessions were completed, the lab's personnel looked forward to the project. They were part of the project and the process, rather than having it done without their participation. In short, instead of it happening to them, it happened with them as willing participants.
People's attitudes about automation systems and being willing participants in their development can make a big difference in a project's success or failure. You don't want people to feel that the incoming system and the questions surrounding it are threatening, or to see it as something that is potentially going to end their employment. They may not freely participate, or they may leave when you need them the most.
All of this may seem daunting for a lab to take on by itself. Large companies may have the resources to handle it, but we need more than a large company to do this right; we need an industry-wide effort.

Development of industry-wide guidelines

Key point: The discussion above about clinical labs illustrates what can be accomplished when an industry group focuses on a technology problem. We need to extend that thinking, and action, to a broader range of industries individually. The benefits of an industry-wide approach to addressing technology and education issues include:

providing a wider range of inputs to solving problems;
providing a critical marketing mass to lobby vendors to create products that fit customers' needs in a particular industry;
developing an organized educational program with emphasis on that industry's requirements;
giving labs (startup and existing) a well-structured reference point to guide them (not dictate) in making technology decisions;
reducing the cost of automation, with improved support; and
where competitive issues aren't a factor, enabling industry-funded R&D technology development and implementation for production or manufacturing quality, process control (integration of online quality information), and process management.

The proposal: each industry group should define a set of guidelines to assist labs in setting up an information infrastructure. For the most part, large sections would end up similar across multiple industries, as there isn't that much behavioral distinction between some industry sets. The real separation would come in two places: the data dictionaries (data descriptions) and the nature of the testing and automation to implement that testing.
Where there is a clear competitive edge to a test or its execution, each company may choose to go it alone, but that still leaves a lot of room for cooperation in method development, addressing both the basic science and its automated implementations, particularly where ASTM, USP, etc. methods are employed.
The benefits of this proposal are noted in the key point above. However, the three most significant ones are arguably:

the development of an organized education program with an emphasis on industry requirements;
the development of a robust communications protocol for application-to-application transfers; and
the ability to lobby vendors on an industry-wide basis for product development, modification, and support.

In the "Looking forward and back" section earlier in this guide, we showed a bulleted list of considerations for the development of such a guideline-based framework. What follows is a more organized version of those points, separated into three sections, which all need to be addressed in any industry-based framework. For the purposes of this guide, we'll focus primarily on Section C: "Issues that need concerted attention." Section A on background and IT support, and Section B on lab-specific background information, aren't discussed, as they are addressed elsewhere, particularly in the previous works referenced in the Introduction.

A. General background and routine IT support
1. Physical security (including power)
2. Cybersecurity
3. Cloud and virtualization technologies

B. Lab-specific background information
1. Informatics (could be an industry-specific version of ASTM E1578 - 18 Standard Guide for Laboratory Informatics[10])
2. Sample storage management and organization (see Appendix 1, section 1.4 of this guide)
3. Guidance for automation
4. Data integrity and security (see S. Schmitt's Assuring Data Integrity for Life Sciences[11], which has broader application outside the life sciences)
5. The FAIR principles (the findability, accessibility, interoperability, and reusability of data; see Wilkinson et al. 2016[2])

C. Issues that need concerted attention
1. Education for lab management and lab personnel
2. The conversion of manual methods to semi- or fully automated methods (see Considerations in the Automation of Laboratory Procedures for more)
3. Long-term usable access to lab information in databases without vendor controls (i.e., the impact of software as a service and other software subscription models)
4. Archived instrument data in standardized formats and standardized vendor software (requires tamper-proof formats)
5. Instrument design for automation (Most instruments and their support software are dual-use, i.e., they work as stand-alone devices via front panels or through software controls. While this is a useful selling tool, whether for manual or automated use, it means the device is larger, more complex, and more expensive than an automation-only device that uses software [e.g., a smartphone or computer] for everything. Instruments and devices designed for automation should be more compact and permit more efficient automation systems.)
6. Communications (networking, instrument control and data, informatics control and data, etc.)

We'll now go on to expand upon items C-1, C-3, C-4, and C-6.

C-1. Education for lab management and lab personnel

Laboratory work has become a merger of two disciplines: science and information technology (including robotics).
Without the first, nothing happens; without the second, work will proceed, but at a slower, more costly pace. There are different levels of education requirements. Those working at the lab bench not only need to understand the science, but also how the instrumentation and supporting computer systems (if any, including those embedded in the instrument) make and transform measurements into results. "The computer did it" is not a satisfactory answer to "how did you get that result," nor is "magic," or maintaining willful ignorance of exactly how the instrument measurements are taken. Laboratorians are responsible and accountable for the results, and they should be able to explain the process of how they were derived, including how settings on the instrument and computer software affect that work. Lab managers should understand the technologies at the functional level and how the systems interact with each other. They are accountable for the overall integrity of the systems and the data and information they contain. IT personnel should understand how lab systems differ from office applications and why business-as-usual for managing office applications doesn't work in the lab environment. The computer systems are there to support the instruments, and any changes that may affect that relationship should be initiated with care; the instrument vendor's support for computer systems upgrades is essential.
As such, we need a new personnel position, that of the laboratory systems engineer or LSE, to provide support for the informatics architecture. This isn't simply an IT person; it should be someone who is fluent in both the science and the information technology applied to lab work. (See Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering for more on this topic.)

C-3. Long-term usable access to lab information in databases without vendor controls

The data and information your lab produces, with the assistance of instrumentation and instrument data systems, is yours, and no one should put limits on your ability to work with it. There is a problem with modern software design: most lab data and information can only be viewed through application software. In some cases, the files may be used with several applications, but often it is the vendor's proprietary formats that limit access. In those instances, you have to maintain licenses for that software for as long as you need access to the data and information, even if the original application has been replaced by something else. This can happen for a variety of reasons:

Better technology is available from another vendor
A vendor sold part of its operations to another organization
Organizations merge
Completion of a consulting or research contract requires all data to be sent to the contracting organization

For these reasons, and others, organizations wind up maintaining multiple versions of similar datasets that people need access to, yet don't want to maintain licenses for into the future, even though they still must consider meeting regulatory (FDA, EPA, ISO, etc.) requirements.
All of this revolves around your ability to gain value from your data and information without having to pay for its access. The vendors don't want to give their software away for free either. What we need is something like the relationship between Adobe Acrobat Reader and the Adobe Acrobat application software.
The latter gives you the ability to create, modify, and comment on documents, while the former allows you to view them. The Reader gives anyone the ability to view the contents, just not alter them. We need a "reader" application for instrument data collected and processed by an instrument data system. We need to be able to view the reports, raw data, etc., and export the data in a useful format: everything short of acquiring and processing new data. This gives you the ability to work with your intellectual property and allows it to be viewed by regulatory agencies if that becomes necessary, without incurring unnecessary costs or depriving the vendor of justifiable income.
This has become increasingly important as vendors have shifted to a subscription model for software licenses in place of one-time payments with additional charges for voluntary upgrades. One example from another realm illustrates the point. My wife keeps all of her recipes in an application on her iPad. One day she looked for a recipe and received a message that read, roughly, "No access unless you upgrade the app <not free>." As it turned out, a Google search recommended re-installing the current app instead. It worked, but she upgraded anyhow, just to be safe. It's your content, but who "owns" it if the software vendor can impose controls on how it is used?
As more of our work depends on software, we find ourselves in a constant upgrade loop of new hardware, new operating systems, and new applications just to maintain the status quo. We need more control over what happens to our data. Industry-wide guidelines, backed by the buying power of an industry, could create vendor policies that would mitigate that. Earlier in this document we noted that the future is going to extend industry change for a long time, with hardware and software evolving in ways we can't imagine. Hardware changes (anyone remember floppy disks?) inevitably make almost everything obsolete, so how do we protect our access to data and information? Floppy disks were the go-to media 40 years ago, and since then cassette tapes, magnetic tape, Zip drives, CD-ROMs, DVDs, SyQuest drives, and other types of storage media have come and gone. Networked systems, at the moment, are the only consistent and reliable means of exchange and storage as datasets increase in size.
One point we have to take into account is that versions of applications will only function on certain versions of operating systems and databases. All three elements are going to evolve, and at some point we'll have to deal with "old" generations while new ones are coming online. One good answer to that is emulation. Some software systems, like VMware's virtualization products, allow packages of operating systems, databases, and applications to operate on computers regardless of their age, with each collection residing in a "container"; we can have several generations of those containers residing on a computer and execute them at will, as if they were still running on the original hardware. If you are looking for the data for a particular sample, go to the container that covers its time period and access it.
Emulation packages are powerful tools; using them, you can even run Atari 2600 games on a Windows or OS X system.
Addressing this issue is generally bigger than what a basic laboratory-based organization can handle, involving policy making and support from information technology support groups and corporate legal and financial management. The policies have to take into account physical locations of servers, support, financing, regulatory support groups, and international laws, even if your company isn't a multinational (third-party contracting organizations may be). Given this complexity, and the fact that most companies in a given industry will be affected, industry-wide guidelines would be useful.

C-4. Archived instrument data in standardized formats and standardized vendor software

This has been an area of interest for over 25 years, beginning with the Analytical Instrument Association's work that resulted in a set of ASTM standard guides:

ASTM E2078 - 00(2016) Standard Guide for Analytical Data Interchange Protocol for Mass Spectrometric Data[12]
ASTM E1948 - 98(2014) Standard Guide for Analytical Data Interchange Protocol for Chromatographic Data[13]

MilliporeSigma continues to invest in solutions based on the Analytical Information Markup Language (AnIML) standard (an outgrowth of work done at NIST).[14] There have also been a variety of standards programs, all of which have a goal of moving instrument data into a neutral data format that is free of proprietary interests, allowing it to be used and analyzed as the analyst needs (e.g., JCAMP-DX[15] and GAML[16]).
Data interchange standards can help address issues in two aspects of data analysis: qualitative and quantitative work. In qualitative applications, the exported data can be imported into other packages that provide facilities not found in the original data acquisition system. Examining an infrared spectrum or nuclear magnetic resonance (NMR) scan depends upon peak amplitude, shape, and position to provide useful information, and some software (including user-developed software) may provide facilities that the original data acquisition system didn't; or it might be a matter of having a file to send to someone for review or inclusion in another project.
Quantitative analysis is a different matter. Techniques such as chromatography, inductively coupled plasma mass spectrometry (ICP-MS), and atomic absorption spectroscopy, among others, rely on the measurement of peak characteristics or single-wavelength absorbance measurements in comparison with measurements made on standards. A single chromatogram is useless (for quantitative work) unless the standards are available. (However, it may be useful for qualitative work if retention time references to appropriate known materials are available.) If you are going to export the data for any of the techniques noted, and others as well, you need the full collection of standards and samples for the data to be of any value.
Yet there is a problem with some, if not all, of these programs: they trust the integrity of the analyst to use the data honestly. It is possible for people to use these exported formats in ways that circumvent current data integrity practices and falsify results.
There are good reasons to want vendor-neutral data formats so that data sets can be examined by user-developed software, to put the data into a form where the analysis is not limited by a vendor's product design.
It also holds the potential for separating data acquisition from data analysis, as long as all pertinent data and information (e.g., standards and samples) are held together in a package that cannot be altered without detection. It may be that something akin to blockchain technology could be used to register and manage access to datasets on a company-by-company basis (each company having its own registry that would become part of the lab's data architecture).
These standard formats are potentially very useful to labs, created by people with a passion for doing something useful and beneficial to the practice of science, sometimes at their own expense. This is another area where a coordinated, industry-wide statement of requirements and support would lead to some significant advancements in systems development, and enhanced capability for those practicing instrumental science.
If these capabilities are important to you, then addressing that need has to be part of an industry-wide conversation and consensus to provide the marketing support to have the work done.

C-6. Communications

Most computers and electronic devices in the laboratory have communications capability built in: RS-232 (or similar), digital I/O, Ethernet port, Bluetooth, Wi-Fi, IEEE-488, etc. What these connection types do is provide the potential for communications; the realization of that potential depends on message formatting, the structure of the contents, and moving beyond proprietary interests to those expressed by an industry-wide community. There are two types of devices that need to be addressed: computer systems that service instruments (instrument data systems), and devices that don't require an external computer to function (pH meters, balances, etc.), which may have Ethernet ports, serial ASCII, digital I/O, or IEEE-488 connections. In either case, the typical situation is one in which the vendor has determined a communications structure and the user has to adapt their systems to it, often using custom programming to parse messages and take action.
The user community needs to determine its needs and make them known, using community-wide buying power as justification for asking for vendor development efforts. As noted earlier, this is not an adversarial situation, but rather a maturation of industry-wide communities working with vendors; the "buying power" comments simply give the vendors the confidence that their development efforts won't be in vain.
In both cases, we need application-to-application message structures that meet several needs. They should be able to handle test method identifications, so that the receiving application knows what the message is addressing, as well as whether the content is a list of samples to be processed or samples that have already been processed, with results. Additionally, the worklist message should ideally contain, in attribute-value pairs, the number of samples, with a list of sample IDs to be processed. As for the completed-work message, the attribute-value pairs would also ideally contain the number of samples processed, the sample IDs, and the results of the analysis (which could be one or more elements, depending on the procedure). Of course, there may be other elements required of these message structures; these were just examples.
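As a purely illustrative sketch of those attribute-value pairs (the field names, layout, and functions below are assumptions invented for this example, not part of any published standard), the worklist and completed-work messages might be assembled and read with a few lines of Python:

    import json

    def build_worklist(method_id, sample_ids):
        """Build a worklist message telling an instrument data system
        which samples to process with a given test method."""
        return json.dumps({
            "message_type": "worklist",      # vs. "results"
            "method_id": method_id,          # test method identification
            "sample_count": len(sample_ids),
            "sample_ids": sample_ids,
        })

    def build_results(method_id, results):
        """Build a completed-work message; 'results' maps each sample ID
        to one or more result values, depending on the procedure."""
        return json.dumps({
            "message_type": "results",
            "method_id": method_id,
            "sample_count": len(results),
            "results": results,
        })

    # A pH worklist sent to the instrument, and the results message
    # coming back to the LIMS or ELN:
    wl = build_worklist("pH-2.1", ["S-001", "S-002", "S-003"])
    rs = build_results("pH-2.1", {"S-001": 7.2, "S-002": 6.9, "S-003": 7.4})
    print(json.loads(rs)["results"]["S-002"])   # -> 6.9

JSON is used here only for convenience; like the HTML comparison that follows, the point is a plain-text representation that either end of the link can parse without custom programming.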
Ultimately, the final framework could be implemented in a format similar to that used in HTML files: plain text that is easily read and machine- and operating-system-independent.
Mapping the message content to the database system structure (LIMS or ELN) could be done using a simple built-in application (within the LIMS or ELN) that would graphically display the received message content on one side of a window, with the database's available fields on the other side. The two sides would then graphically be mapped to one another, as shown in Figure 4 below. (Note: Since the format of the messages is standardized, we wouldn't need a separate mapping function to accommodate different vendors.)

Figure 4. Mapping IDS message contents to database system

The second case, devices like pH meters, etc., is a little more interesting, since the devices don't have the same facilities that a computer system provides. However, in the consumer marketplace this is well-trod ground, using a smartphone or tablet as an interface, and translation mechanisms between small, fixed-function devices and more extensive application platforms. The only non-standard element is a single-point Bluetooth connection, as shown in Figure 5.

Figure 5. Using a smartphone or tablet as an interface to a LIMS or ELN. The procedure takes a series of pH measurements for a series of samples.

A single-point Bluetooth connection is used to exchange information between the measuring device and the smartphone or tablet. There are multi-point connection devices (the smartphone, for example), but we want to restrict the device to avoid confusion about who is in control of the measuring unit. A setup screen (not shown) would set up the worklist and populate the sample IDs. The "Take Reading" button would read the device and enter the value into the corresponding sample position, taking them in order, and enable the next reading until all samples had been read. "Send" would transmit the formatted message to the LIMS or ELN. Depending on the nature of the device and the procedure being used, the application can take any form; this is just a simple example taking pH measurements for a series of samples. In essence, the device-smartphone combination becomes an instrument data system, and the database setup would be the same as described above.
The bottom line is this: standardized message formats can greatly simplify the interfacing of instrument data systems, LIMS, ELNs, and other laboratory informatics applications. The clinical chemistry community created communications standards that permit meaningful messages to be sent between an instrument data system and a database system, structured so that either end of the link will recognize the message and be able to extract and use the message content without the custom programming common to most labs today. There is no reason why the same thing can't be done in any other industry; it may even be possible to adapt the clinical chemistry protocols. The data dictionary (list of test names and attributes) would have to be adjusted, but that is a minor point that can be handled on an industry-by-industry basis and be incorporated as part of a system installation.
What is needed is people coming together as a group and organizing and defining the effort.
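To make the mapping idea in Figure 4 concrete, here is a minimal, hypothetical sketch (the message fields, database columns, and function names are all invented for illustration): the graphical tool described above would ultimately just produce a field-to-column table like the dictionary below, which the LIMS or ELN then applies to every incoming message of that type.

    # Hypothetical mapping produced by the graphical tool:
    # message field -> LIMS database column
    FIELD_MAP = {
        "method_id": "test_code",
        "sample_id": "sample_id",
        "result": "result_value",
    }

    def map_results_to_rows(message, field_map):
        """Translate a standardized results message into rows for a
        LIMS results table, one row per sample."""
        rows = []
        for sample_id, value in message["results"].items():
            rows.append({
                field_map["method_id"]: message["method_id"],
                field_map["sample_id"]: sample_id,
                field_map["result"]: value,
            })
        return rows

    message = {"message_type": "results", "method_id": "pH-2.1",
               "sample_count": 2, "results": {"S-001": 7.2, "S-002": 6.9}}
    print(map_results_to_rows(message, FIELD_MAP))
    # -> [{'test_code': 'pH-2.1', 'sample_id': 'S-001', 'result_value': 7.2},
    #     {'test_code': 'pH-2.1', 'sample_id': 'S-002', 'result_value': 6.9}]

Because the message format is standardized, the mapping logic itself never changes from vendor to vendor; only the column names on the database side differ from lab to lab.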
How important is it to you to streamline the effort in getting systems up and running without custom programming, to work toward the plug-and-play capability we encounter in consumer systems (an environment where vendors know easy integration is a must or their products won't sell)?

Additional note on Bluetooth-enabled instruments
The addition of Bluetooth to a device can result in much more compact systems, making the footprint smaller and reducing the cost of the unit. By using a smartphone to replace the front panel controls, the programming can become much more sophisticated. Imagine a Bluetooth-enabled pH meter and a separate Bluetooth-enabled, electronically controlled titrator. That combination would permit optimized[a] delivery of titrant, making the process faster and more accurate, while also providing a graphical display of the resulting titration curve, putting the full data processing capability of the smartphone and its communications at the service of the experiment. Think about what the addition of a simple clock did for thermostats: it opened the door to programmable thermostats and better controls. What would smartphones controlling simple devices like balances, pH meters, titrators, etc. facilitate?

Laboratory work and scientific production
Key point: Laboratory work is a collection of processes and procedures that have to be carried out for research to progress, or to support product manufacturing and production processes. In order to meet "productivity" goals, get enough work done to make progress, or provide a good ROI, we need labs to transition from manual to automated (partial or full) execution as much as possible. We also need to ensure and demonstrate process stability, so that lab work itself doesn't create variability and unreliable results.

Different kinds of laboratory work
There are different kinds of laboratory work. Some consist of a series of experiments or observations that may not be repeated, or are repeated with expected or unplanned variations. Other experiments may be repeated over and over, either without variation or with controlled variations based on gained experience. It is that second set that concerns this writing, because repetition provides a basis and a reason for automation.
Today we are used to automation in lab work, enacted in the form of computers and robotics ranging from mechanical arms to auto-samplers, auto-titration systems, and more. Some of the technology development we've discussed began many decades before, and there's utility in looking back that far to note how technologies have developed. However, while there are definitely forms of automation and robotics available today, they are not always interconnected, integrated, or compatible with each other; they were produced as products either without an underlying integration framework, or with one that was limited to a few cooperating vendors. This can be problematic.
There are two major elements to repetitive laboratory methods: the underlying science and how the procedures are executed. At these methods' core, however, is the underlying scientific method, be it a chemical, biological, or physical sequence. We automate processes, not things. We don't, for example, automate an instrument, but rather the process of using it to accomplish something.
If you are automating the use of a telescope to obtain the spectra of a series of stars, you build a control algorithm to look at each star in a list, have the instrument position itself properly, record the data using sensors, process it, and then move on to the next star on the list. (You would also build in error detection with messages like "there isn't any star there," "cover on telescope," and "it isn't dark yet," along with relevant correction routines.) The control algorithm is the automation, while the telescope and sensors are just tools being used.
When building a repetitive laboratory method, the method should be completely described and include aspects such as:

the underlying science
a complete list of equipment and materials when implemented as a manual process
possible interferences
critical facets of the method
variables that have to be controlled and their operating ranges (e.g., temperature, pH of solutions, etc.)
special characteristics of instrumentation
safety considerations
recommended sources of equipment and sources to be avoided
software considerations such as possible products, desirable characteristics, and things to be avoided
potentially dangerous, hazardous situations
an end-to-end walkthrough of the methods used
Of course, the method has to be validated. You need to have complete confidence in its ability to produce the results necessary given the input into the process. At this point, the scientific aspects of the work are complete. They may be revisited if problems arise during process automation, but no further changes should be entertained unless you are willing to absorb additional costs and alterations to the schedule. This is a serious matter; one of the leading causes of project failures is "scope creep," which occurs when incremental changes are made to a process while it is under development. This results in the project becoming a moving target, with seemingly minor changes able to cause a major disruption in the project's design.

Scientific production
At this point, we have a proven procedure that we need to execute repeatedly on a set of inputs (samples). We aren't carrying out "experiments"; that word suggests that something may be changing in the nature of the procedure, and at this point it shouldn't. Changes to the underlying process invite a host of problems and may forfeit any chance at a successful automation effort.
However, somewhere in the (distant) future something in the process is likely to change. A new piece of technology may be introduced, equipment (including software) may need to be upgraded, or the timing of a step may need to be adjusted. That's life. How that change is made is very important. Before it is implemented, the process has to be in place long enough to establish that it works reliably, that it produces useful results, and that there is a history of running the same reference sample(s) over and over again in the mix of samples to show that the process is under control with acceptable variability in results (i.e., statistical process control). In manufacturing and production language, this is referred to as "evolutionary operations" (EVOP). But we can pull it all together under the heading "scientific production."[b]
If your reaction to the previous paragraph is "this is a science lab, not a commercial production operation," you're getting close to the point.
This is a production operation (commercial or not, depending on the type of lab, contract testing, and other factors) going on within a science lab. It's just a matter of scale.

Demonstrating process stability: The standard sample program
One of the characteristics of a successfully implemented stable process is consistency of the results with the same set of inputs. Basically, if everything is working properly, the same set of samples introduced at the beginning should yield the same results time after time, regardless of whether the implementation is manual or automated. A standard sample program (Std.SP) is a means of demonstrating the stability and operational integrity of a process; it can also tell you if a process is getting out of control. Introduced early in my career, the Std.SP was used to show the consistency of results between analysts in a lab, and to compare the performance of our lab and the quality control labs in the production facilities. The samples were submitted and marked like any similar sample and became part of the workflow. The lab managers maintained control charts of the lab's and individuals' performance on the samples. The combined results would show whether the lab and those working there had things under control or if problems were developing. Those problems might be with an individual, a change in equipment performance, or a change in incoming raw materials used in the analysis.
The Std.SP was the answer to the typical "how do you know you can trust these results?" question, which often came up when a research program or production line was having issues. This becomes more important when automation is used and the sampling and testing throughput increases. If an automated system is having issues, you are just producing bad data faster. The Std.SP is a high-level test of the system's performance. A deviation in the results trend line or a widening of the variability is an indication that something is wrong, which should lead to a detailed evaluation of the system to account for the deviations. A troubleshooting guide should be part of the original method description (containing aspects such as "if something starts going wrong, here's where to look for problems" or "the test method is particularly sensitive to ..."), along with notes made during the implementation process.
Reference samples are another matter, however. They have to be stable over a long enough period of use to establish a meaningful trend. If the reference samples are stable over a long period of time, you may only need two or three so that the method can be evaluated over different ranges (you may see higher variability at the low end of a range than at the high end). If the reference samples are not stable, then their useful lifetime needs to be established and older ones swapped out periodically. There should be some overlap between samples near the end of their useful life and the introduction of new ones so that laboratorians can differentiate between system variation and changes in reference samples. You may need to use more reference samples in these situations.
This inevitably becomes a lot of work, and if your lab utilizes many procedures, it can be time-consuming to manage a Std.SP. But before you decide it is too much work, ask yourself how valuable it is to have a solid, documented answer to the "how do you know you can trust these results?" question.
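To give a rough sense of the control charting a Std.SP implies, the sketch below flags out-of-limit reference-sample results and widening variability. The three-sigma limits, the 1.5x widening factor, and the data are illustrative choices, not a prescription.

```python
# A minimal control-chart check on standard (reference) sample results.
# Baseline data, 3-sigma limits, and the widening factor are illustrative.
from statistics import mean, stdev

def check_control(baseline, recent, sigma_limit=3.0):
    """Flag recent results outside control limits and widening variability."""
    center = mean(baseline)
    sigma = stdev(baseline)
    lower = center - sigma_limit * sigma
    upper = center + sigma_limit * sigma
    out_of_control = [(i, v) for i, v in enumerate(recent)
                      if not (lower <= v <= upper)]
    widening = stdev(recent) > 1.5 * sigma  # arbitrary threshold
    return out_of_control, widening

baseline = [5.02, 4.98, 5.01, 5.00, 4.99, 5.03, 4.97, 5.00]
recent = [5.01, 5.06, 4.90, 5.18, 4.80]
print(check_control(baseline, recent))
```

In practice, the lab would chart these values over time and investigate any trend or widening before trusting further results.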
Having a validated process or method is a start, but that only holds at the start of a method's implementation. If all of this is new to you, find someone who understands process engineering and statistical process control and learn more from them.

Where is the future of lab automation going to take us?
Key point: There is no single "future" for laboratories; each lab charts its own path. However, there are things they can all do to be better prepared to meet the next day, month, or decade. Plan for flexibility and adaptability. Keep your data and information accessible, educate your personnel to recognize and take advantage of opportunities, and don't be afraid of taking risks.
Where is the future of lab automation going to take us? The simple answer is that the future is in your hands. Lab personnel are going to be making their own choices, and that is both a reflection of reality and part of the problem. Aside from a few industries, labs have approached the subject of automation individually, making their own plans and charting their own courses of action. That approach is both inefficient and wasteful of resources (e.g., funding, time, effort).
Earlier we saw how the clinical chemistry industry handled the problem of containing costs through automation: it made the automation work better. Within any given industry, it's unlikely that the same type of testing is going to differ markedly; there will be unique demands and opportunities, but they will be the exception. Why not pool resources and solve common problems? It would benefit the end-user in the lab as well as the vendor, encouraging vendors to build products for a specific market rather than meeting the needs of a single customer. Among the biggest issues in automation and lab computing is communications between devices; the clinical chemistry industry has solved that. Their solution has elements that are specific to their work, but those should be easily adjustable to meet the needs of other groups.
We're in a phase where we have many technologies that work well independently for their intended purpose, but they don't work well together. We've been there before, with two examples to be found in computer networking and computer graphics.
Computer networks began to emerge commercially in the early 1970s. Each computer vendor had their own approach to networking (similar hardware but different communications protocols). Within each vendor's ecosystem things worked well, yet bridging across ecosystems was another matter. When network installations occurred at customer sites, the best hardware and software people were involved in planning, laying out the systems, installing software, and getting things to work. Installation time was measured in weeks. Today, a reasonably sophisticated home or business network can be installed in an hour or two, maybe longer depending on the number of machines and components involved, and when you turn it on, you'd be surprised if it didn't work. What made the difference? Standards were developed for communications protocols. Instead of each vendor having their own protocol, the world adopted TCP/IP and everything changed.
The path of computer graphics development provides further insight. In the 1970s and 80s, when computer graphics hardware began seeing widespread use, every vendor in the market had their own graphics display hardware and software.
If you liked one vendor's graphics library, you had to buy their hardware. If you liked their hardware, you had to live with their software. We saw this play out in the early days of the PC, when there were multiple graphics processor card adapters (e.g., enhanced graphics adapter [EGA], color graphics adapter [CGA], video graphics array [VGA], Hercules, etc.), and if you had the wrong card installed, the software wouldn't work (although combination cards allowed the user to switch modes and reduce the number of boards used in the computer). This continued to frustrate users and developers until a standardized, device-independent graphics architecture was developed. Problem solved.
This is a common theme that comes up over and over. A new technology develops, and vendors build products with unique features that don't play well with each other (i.e., each attempting to achieve a strong market position). Customers get frustrated, standards are developed that convert "don't play well" to "easy integration," and the market takes off.
How the future of lab automation is going to play out for you is in your hands, but there are things we can do to make it easier. As we noted earlier, few if any of the developments in lab automation were initially done according to a plan, though we've also seen what planning can actually do in at least one setting, the field of clinical chemistry. This highlights the fact that not only do you need a plan for your laboratory, but a plan for automation in your industry would also be worthwhile; it would avoid spending resources individually where pooling is a better approach, and it would give you a stronger voice to work with vendors to create products that meet your industry's needs. A set of industry-wide strategies (e.g., for mining, viniculture, cosmetics, materials research, healthcare, biotech, electronics, etc.) would give you a solid starting point for the use of computing and automation technologies in your lab. There would still be plenty of room for unique applications.
Industry working groups are one need, while getting people educated is another. Determining how much you are willing to rely on automation is an additional point, which we'll illustrate with a discussion of quality control labs in the next section.
Before we move on to the next section, however, there are two terms that haven't been mentioned yet that need to be noted: artificial intelligence (AI) and big data. I'm not sure I know what AI is (the definition and examples change frequently), or that I'd trust what is available today for serious work. For every article that talks about the wonders it will bring, there is one that talks about the pitfalls of design, built-in bias, and concerns about misapplication. In science, aside from using it as a "did you know about this?" advisory tool, I'd be concerned about using it. If you don't know how the AI arrived at its answer (one of AI's characteristics), why would you trust it? Big data, on the other hand, is on a more solid technical footing and should prove useful, though only if your data and information are structured to take advantage of it.
You can't just say "here are all my datasets"; you have to organize them to be usable.

Applying automation to lab work
Key point: Today, people's attitudes about work have changed, and with them the world has changed; the COVID-19 pandemic has forced us to re-evaluate models of work. If nothing else, even though we're talking about science, the use of technology to conduct scientific work is going to take on a new importance. This won't be solved one lab at a time, but on an industry-by-industry basis. We need to think about key concepts and their implementation as a community if truly effective solutions are to be found and put into practice.
In Considerations in the Automation of Laboratory Procedures, the criteria for "automation readiness" were described. Two significant points were that a laboratory procedure must both be stable and have a duration of use sufficiently long to justify the cost of automation. Beyond that, additional concerns such as safety may come into play, but those two points were the most critical. From those points we can roughly derive the automation needs of research, testing, quality control, and contract laboratories.
Research laboratory: Automation in research is going to be predominantly at the task level, i.e., portions of a process being done by automated instruments and devices rather than end-to-end automation. This is due to the two key points noted above: research labs often change procedures, with some being very short-lived, and the cost of full automation (including validation) may not be worth it unless the equipment is designed to be modular, interconnected, and easily programmed. One exception can be seen with Bayer.[17] They are using robotics and microplate technology to automate a high-throughput screening system that, in their words, couldn't practically be done by people; it's too slow and too expensive. Could something like this become more commonplace?
Testing laboratory: Non-routine testing laboratories are going to have a mix of task-level and end-to-end automation. It will depend on the stability of the test process, whether there is sufficient work to justify the cost of automation, and what level of control over the form of the incoming samples is available (see Appendix 1, section 1.3 of this guide). Often samples are submitted for multiple tests, which raises an additional concern: is the last step in a procedure destructive to the sample? Samples may be prepared so that they can be used in more than one test procedure. If the test process is destructive, then the initial preparation has to be divided so there is enough material for all testing; this can complicate the automation process.
Quality control laboratory: Quality control (QC) labs should be the ideal place for automation to take hold. The procedures are well defined, stable, and entrenched. The benefits in faster throughput and cost reduction should be clear. The biggest source of problems is sample preparation and getting material ready for analysis. Given that the form of the samples is known and predictable, it is worthwhile to spend time analyzing and designing a means of working with them. There are going to be some preparation steps that may be too difficult to justify automating; this can be seen with tensile testing of polymers and fibers. The fabrication of sample parts and the need to wait for the material to stabilize may make parts of the process difficult to deal with, or not financially justifiable.
Insertion into the test apparatus may also be an issue. Testing that can be done by spectroscopic, chromatographic, or other easily automated instrumentation may be migrated from the laboratory to in-line testing, raising questions about the nature of the QC process.
Contract or independent laboratory: The ability to apply automation beyond the task level depends on the kind of work being done; is it specialized or highly flexible? The greater the flexibility (e.g., a number of different procedures with a small number of samples submitted at a time), the more task-level automation will apply. However, the justification for automation may not be there. Specialized labs (e.g., clinical chemistry labs) or labs that specialize in a technique with high demand and throughput should benefit from a move to end-to-end automation since it reduces costs and improves accuracy and turn-around times (TATs).

More on the quality control lab and its process management
The evolution of QC labs may change our perspective of how QC testing is accomplished, as well as the nature of QC lab operations. Our current view, and that of the last few decades, may change significantly.

Early QC testing
Some form of quality testing was done during the production process in prior decades. For example, I've seen production station operators bite plastic materials to get an idea of how well the production process, and the expansion of polystyrene beads, was going in the production of Styrofoam cups (though this was not encouraged by management, as butane and pentane, used as expansion agents, could cause health problems). If the cup was too stiff or soft, it would indicate a problem in the molding process. There were other "quick and dirty" production floor tests that were run because the TAT in the lab was producing an unacceptable delay (if you're making adjustments, you need measurements quickly). People may also have wanted to avoid having a record of production problems and changes.
In another application, the calendering[c] of plastic-coated paper (anecdotally reported in a course on plastics manufacturing), the production operator would strike the laminated paper with a slender bamboo rod and listen to the sound it made to determine the product quality. According to the course instructor, he was adept at this, and no other test could replace this in-process procedure.
As such, it's clear not all "quality" testing went on in the laboratory.

Normal QC testing
Traditionally, QC lab operations became separate from production, partly because of the physical conditions needed to conduct the work, and partly to avoid bias in the results. The conclusion: it was important that the reporting lines of communication be separate and independent from production. QC labs would, and do, perform testing on incoming materials to certify them as suitable for use, and on produced products to see if they meet product specifications, as well as perform in-process testing. They would also certify products as qualified for shipment, acting as an unbiased evaluator of product quality.
Quality tests were manually implemented until the early 1960s. Then we saw the advent of in-process instrumentation and process chromatographs by Fisher Controls and Mine Safety Appliances, for example. While the instruments themselves were on the production floor, their management and analysis were under the control of the QC lab, at least in the facility I worked in.
The instruments' maintenance, calibration, peak measurements, and sample calculations were all done by lab personnel. Since that time, we've seen a continued growth in in-line testing for production processes. That said, what's the logical conclusion of increasing automation?

A thought experiment
Let's posit that we have a production process whose raw materials are fluids and the end product is a fluid. We'd be concerned with certifying incoming raw materials as suitable for use, while monitoring the production process for product composition and potential contaminants. The end product would have to be certified for shipment.
If all the testing were chromatographic with in-process instruments and a chromatography data system (CDS), and/or in-line spectrophotometric and other in-line or in-process tests, with the results becoming part of the process control system (sample IDs would be a hash of sensor type, location, and timestamp), what would the role of the QC lab become? Assuring that the equipment was running properly and regularly calibrated, with periodic verification of test results (off-line testing), is one set of possibilities. Is this a desirable developmental direction? We need to look at the benefits and issues that result from this design.
Benefits:

It provides an integrated system where all process sensor and test result data are immediately available to management. This allows them to detect issues faster and provide a more timely response (better process control), reducing downtime and off-spec product. If we want to enter the world of science fiction, we can even imagine a combination AI-machine learning solution providing closed-loop process monitoring and control.
It signifies a potential cost reduction resulting from smaller labs and lower personnel costs.
Issues:

There's a loss of independent product evaluation. Basically, you are trusting the system to honestly monitor, report, and reject off-spec incoming and outgoing material. In the movie version, this is where the soundtrack becomes ominous. This loss of independent checking may reduce customer confidence in product quality.
The veracity of statistical process control and correction may suffer.
System validation could be challenging, as the production process has to be validated, each sensor and instrument data system has to be validated, and the integrated system has to be validated, including the statistical process control.
Assuming such a system were built, how far are we willing to trust automated systems to function without external oversight? The control room would still be populated with people managing the process and exerting a higher level of scrutiny, but you are still trusting the designers of the system to do very sophisticated work, not only in process design but also in integrating testing as part of the effort. Ideally, you'd want the CDS and other instrument data processing equipment in the control room, where it is more easily used and maintained, rather than on the process floor. The role of the QC lab would then change to that of an overarching quality manager of the entire system, ensuring that equipment functioned properly, testing was accurate, the process and testing were operating within control limits, and the data logs were correct.
Some organizations may be past this point, while for others this may be interesting, bordering on science fiction. The point of this thought experiment is to see what could happen and where your comfort level is with it.
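The parenthetical above suggests deriving sample IDs from a hash of sensor type, location, and timestamp. A minimal sketch of how such an ID might be generated follows; the separator, the choice of SHA-256, and the 12-character truncation are arbitrary assumptions.

```python
# Illustrative sketch: a sample ID derived from sensor type, location,
# and timestamp. The separator, SHA-256, and truncation are assumptions.
import hashlib
from datetime import datetime, timezone

def sample_id(sensor_type, location, timestamp):
    """Return a short, reproducible ID for an in-process measurement."""
    key = f"{sensor_type}|{location}|{timestamp.isoformat(timespec='seconds')}"
    return hashlib.sha256(key.encode()).hexdigest()[:12]

ts = datetime(2021, 11, 1, 14, 30, tzinfo=timezone.utc)
print(sample_id("GC-01", "reactor-3-outlet", ts))
```

An ID generated this way makes every in-line result traceable to a specific sensor and moment in the process, which bears directly on the oversight questions that follow.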
How much control do you want to give an automated system, and how much do you want to retain? What are the consequences of not providing sufficient oversight? How much bad product could be made?
Also note that this isn't an all-or-nothing proposition; give it some room to work, see what happens, and if everything is good, give it more. Just build in a big red button that allows you to reboot and revert to manual operations; in other words, don't self-destruct, just remove some critical controls. A lot depends on the nature of the finished product. If the end product is something critical (e.g., a medical device or therapeutic), you'll want to be cautious about phasing in automated control systems.
All that said, two additional points should be made. First, be willing to play with the ideas. Turn it into a science fiction project ("sci fi" is just a playground for "what ifs"), removed from reality enough that people can look at it from different perspectives and see what might work and what might not. Then let people play with those ideas; you might learn something. What are all the things that could go right, and what could go wrong (and what can you do about it)? You probably won't have to worry about alien robots, but malware interference is certainly a factor, making a network air-gap worth considering. There is always the possibility of someone causing a problem; the question, of course, is how you detect and correct it. Second, be willing to model the system. There are a number of modeling packages ideal for this purpose. You can model the behavior and see how different control methods react.
(While this thought experiment used a process involving fluids only, as they are relatively easy to work with, it's worth noting that solid materials become more of an issue, complicating the automation process [see Appendix 1, section 1.4 of this guide]. In some cases sample preparation for testing may be too cumbersome for automation. This would shift the automated-manual testing balance more toward the latter in those cases, introducing delays and disrupting the timing of results to process control.)

The creation of a "center for laboratory systems engineering"
Key point: Throughout this piece, the need for education has been a consistent theme. Developing and using the technologies in lab work, both scientific and informatics-related, will require people who know what they are doing, specifically educated to carry out the work noted above. We also need a means of pulling things together so that there is a centralized resource to start a learning process and continue development from there.
Let's propose a "center for laboratory systems engineering." This center would first prepare people to be effective planners, designers, implementers, supporters, and users of laboratory informatics and automation systems in scientific applications. Additionally, the center would ideally drive innovation and provide assistance to scientific personnel and IT groups seeking to apply and manage such laboratory technologies.
Those goals would be accomplished by:

Developing and delivering courses for LSEs, lab personnel, and IT support (These courses would cover technical science topics as well as skills in working with people, conflict resolution, and communications.
They would be presented both in-person and online or on-demand to reach a broad audience; an intensive summer course with hands-on experience should also be considered.)
Creating an LSE certification program
Carrying out research on the application of informatics, robotics, and computer-assisted data collection and processing
Documenting the best practices for an LSE
Aggregating and publishing material on the roles and requirements of the LSE
Ideally, this center would be part of a university setting so that it could interact with both science and computer science departments, contribute to their programs, and in turn gain from that association.

Appendix 1: A very brief historical note
It would be useful to understand how we arrived at our current state with regard to informatics and automation in science. That will make it easier to understand what we need to do to make advancements. There is one key point to take away from this: in the history of lab automation, products weren't developed according to a grand plan[d] but rather to meet perceived needs and opportunities. Thought processes in this vein have likely included:

"Here's a problem that needs to be solved."
"If I can figure out how to do X, then I can accomplish Y."
"There's a business opportunity in building product concept X, which will help people do Y and Z."
Sometimes these ideas were voiced by lab personnel, but most of the time they were the result of someone seeing a problem or opportunity and taking the initiative to address it.

1.1 Collecting data from instruments
In the late 1960s and early 1970s, instrument companies recognized that connecting a computer to the analog output of an instrument would help lab personnel capture and process data. The form that this product development took depended on how the developer saw the problem. We're going to look at chromatography[e] as an example for several reasons: it received the most attention for automation, it's a data-rich technique that took considerable manual effort to analyze, and it was and still is one of the most popular instrumental techniques in chemical labs. The product solutions provided by the vendors reflected the technology available and their views of the problem that needed to be solved.
The analysis (Figure A1) depends on recognizing and quantifying a series of peaks, each of which represents the amount of a component in a mixture of materials. The size of the peak (measured by area, which is better, or by height with respect to a baseline) helps quantify the amount, and the time it takes the peak to appear can be used to help identify the component.

Figure A1. Illustration of peaks from a chromatograph. Source: public domain

Reference standards are prepared and run along with the samples. The peak response of the standards and their corresponding concentrations are used to draw a calibration curve, and the samples are quantified by comparing peak sizes against that curve (Figure A2).

Figure A2. Samples are quantified by comparison to a calibration curve

The solutions reflected both the user's input and the vendor's observations, sometimes hampered by being too close to the problem and not seeing the whole picture. In that same timeframe, computing was expensive, and you had to have a lot of processing done to justify the costs. Otherwise you dropped down to microprocessors and scaled back the size of the problem you could tackle.
The microprocessor of choice was the Intel 4004, which was superseded by the Intel 8008 in 1972.
With computing, the chromatographer could detect peaks and quantify the peak height and area, later printing the results on a strip of paper. This was a big help to the chromatographer, since determining peak size, and area in particular, was a major hassle. Prior to computerized methods, chromatographers were using:

mechanical integrators built into the strip chart recorder (which recorded the chromatograph output), which were hard to read and didn't provide for baseline corrections (critical for accurate results);
a planimeter, which was time-consuming and demanded careful attention;
a cut-and-weigh method, where the chromatographer literally cut out a copy of the peak and weighed it on a balance, also cutting out a reference area, and comparing the two against a calibration chart constructed by a similar procedure; and
a method that had the chromatographer counting the boxes the peak enclosed on the chart paper, which was not very accurate, though better than peak height in some cases.
Having the quantified peaks printed on a piece of paper meant that the analyst could move quickly to drawing the calibration curve and evaluating the results. However, from the standpoint of being a laborious, time-consuming task, this is only a piece of the problem (a major one, but only a part). Some users connected the output of the integrators (RS-232 serial ASCII) to minicomputers, transferred the integrator information, and completed the analytical process in the larger machines. Several integrators could be connected to a minicomputer, so the cost per instrument was reduced. This was a step toward a better solution, but only a step.
Computer vendors wanted to participate in the same market, but the cost of minicomputers put them at a disadvantage unless they could come at it from a different perspective. They looked at the entire data processing problem, including the points mentioned above, plus getting the computer to compute the calibration curve, complete the analysis, print out a report, and store the report and the data for later retrieval (integrators didn't have that capability). They could also store the raw digitized instrument output for later analysis and display. The cost per instrument dropped when computer vendors began connecting multiple instruments to one system, some from different labs. Nelson Analytical used a local-to-the-instrument box for data collection and control and then forwarded those information packets to a central system for processing. This bigger-picture view of the problem greatly reduced the workload on the analyst. As computing costs dropped and the power of microchips increased, several different approaches emerged from vendors with varied perspectives on computing. Most took the big-picture view but worked on a one-instrument-one-computer-station approach, which benefited small labs since they didn't have to invest in a minicomputer.
The low cost of microprocessors more readily allowed digital systems to join the lab, to the point where almost every lab device had a computer chip in it ("digital" was a strong marketing point).
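To picture the calibration computation the computer vendors folded in, here is a minimal sketch: a least-squares line is fit to reference standards (peak area versus concentration), then inverted to quantify unknowns. All of the numbers are made up for illustration.

```python
# Minimal sketch: build a linear calibration curve from reference
# standards (concentration vs. peak area) and quantify unknown samples.
# The standard and sample values below are made-up numbers.

def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    return a, mean_y - a * mean_x

conc = [0.5, 1.0, 2.0, 4.0]            # standard concentrations (mg/mL)
area = [105.0, 212.0, 421.0, 838.0]    # measured peak areas
a, b = fit_line(conc, area)

# Quantify unknowns by inverting the curve: conc = (area - b) / a
for sample_area in [150.0, 600.0]:
    print(f"peak area {sample_area:6.1f} -> {(sample_area - b) / a:.2f} mg/mL")
```

Integrators stopped at peak sizes; folding in this step, along with reporting and storage, is what distinguished the computer vendors' approach.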
Now that we had lots of digital data sources, what was the lab going to do with them?

1.2 The beginning of laboratory informatics systems
In the early 1970s, PerkinElmer described its instrument-supporting computers as "data stations." Then it announced the "laboratory information management system," or "LIMS," and the next level of informatics hit the lab market. However, "laboratory information management system" was a problematic name for a product that did sample and test tracking. Customers thought it would take into account all of a lab's information, including personnel records, scheduling, budgets, documents, and anything else that "information" could be attached to, ultimately promising more than it could deliver. It took some time, but eventually something like that happened. From a marketing standpoint, it got people's attention.
Several other vendors, consulting companies, startups, and computer vendors began LIMS development projects (the computer vendors felt that database systems were their turf). This was viewed as a strategic offering: the testing lab's operations would revolve around the LIMS, and that gave the LIMS vendor whose product was chosen a strong marketing position in that company.
The introduction of the LIMS eventually opened the door to other informatics applications. The major classes and functions of said applications are shown in Figure A3.

Figure A3. The starting point for major classes of lab informatics systems and the lab functions they addressed

Missing from the chart are user-developer tools like LabVIEW from National Instruments that enabled users to develop data acquisition and control applications via a graphical user interface.
The term "electronic laboratory notebook," or "ELN," has had an interesting history. It was applied to at least three types of software before its most current iteration. Laboratory Technologies, Inc. first created LABTECH Notebook, a PC-based software package designed to assist users with the communication between a computer and its lab instruments via RS-232 connections. Then there was Wolfram's Mathematica software, an electronic notebook for advanced mathematics. And finally there was Velquest's SmartLAB, an ELN for conducting analyses and recording information from laboratory procedures, which became the first laboratory execution system (LES).
Figure A3 showed a nice, clean, somewhat idealized starting point for the introduction of lab informatics. That didn't last long. Vendors saw what their competition was doing and the opportunities to expand their products' capabilities (and market acceptance). What were clean, neat functional silos became more complex products designed to attract the attention of scientists and laboratorians (Figure A4).

Figure A4. Lab informatics capability expansion

These expanded capabilities meant that a single-vendor solution could address more of a lab's needs, simplifying support, installation, and so on through the use of a single software package. It did, however, make product and vendor selection more of a challenge; you really had to know what you needed. It also raised questions of cross-product compatibility: what instruments connected easily to what LIMS, ELN, or SDMS? If it wasn't easy, what did you do?
Third-party intermediate systems were eventually developed that translated instrument communications into forms the database systems could use.
While all this was going on in the expanding digital laboratory space, people still had things to do on the lab bench.

1.3 Electro-mechanical robotics
During the 1970s and 80s, vendors noted the amount of time spent doing repetitive tasks at the lab bench. This resulted in two different approaches to product development:

Dedicated-function robotics: The development of auto-samplers, auto-titrators, and similar devices that were used to carry out a specific task
General-purpose robotics: The development of an elaborate kit the user had to assemble to robotically complete a task, not too different from a computer programming language: the components are there, you just have to organize them to make something useful happen
Although each of these approaches was potentially useful, they each presented the user community with different sets of problems.
The dedicated-function robotics devices generally worked well; that wasn't the issue. The problem arose when they had to connect to other components and instruments. The use of an auto-sampler, for example, required the user to adjust their sample preparation so that the material to be analyzed wound up in the appropriate sample vials. This may have required adjusting the procedure to accommodate a different set of mechanics than they were used to, e.g., sealing the vials or choosing the proper vial for an application. Auto-samplers feed samples into instruments, so there is the matter of setting up control signals for proper coordination with instruments and instrument data systems.
Other devices such as auto-titrators required a reservoir of samples (in the proper vial formats), and the vials had to be organized so that an analysis recorded for sample ID456 was in fact performed on that sample and the order didn't get mixed up. The data output could also be a problem. The default for many vendors was a spreadsheet-compatible file, though it was up to the user to make sense of it and integrate it into the workflow. This was probably the best compromise available until more formalized data interchange and communications standards became available. The vendor had no idea how a particular device was going to fit into a procedure's workflow or what was going to happen to the data further downstream, so a CSV file format as a solution was simple, flexible, and easy to work with. It also meant more work on the user's part on the integration front, representing another place where changes may have to be made if a device is replaced in the future.
With the dedicated-function device, which could be as simple as a balance (reading a force exerted on a sensor and converting it to a weight is an automated process), we have potentially isolated elements of automation that are interconnected either by programming or by someone reading data and re-entering it. However, there is no "plug-and-go" capability.
As for general-purpose robotics devices, they could be broken down into two categories: those that were successful in the broad marketplace and those that weren't. In the 1980s, three vendors were competing for attention in the laboratory robotics market: Zymark, Hewlett-Packard, and PerkinElmer. Each of their general-purpose systems had a movable arm that could grip items and be programmed to carry out lab procedures.
Yet they faced a daunting task: the process of gripping items was problematic. The robot didn't work with just one gripper or "hand"; it required multiple interchangeable hands that had to be swapped depending on the next sequence of actions and what items had to be grasped. Those items were typically things designed for human use, including flasks, test tubes, and a variety of other equipment. This problem was non-trivial, and it stemmed from having the robot work with equipment designed for humans; that choice meant the lab didn't have to buy duplicate components, which would have raised problems of cost and of the ability to carry on work if the robot didn't function. Another issue with the grippers and their holders was that they took up space. The Zymark system, for example, had a central robotic arm that could reach items within the shell of a hemispherical volume; grippers took up space that could be used by other devices. Some people were successful in building systems, but not enough to make the products economically viable. In many respects, the core robotics technologies should have been successful, but the peripheral devices were not up to the necessary operational levels; good for humans, lacking for automation.
Another problem was education. The vendors would run courses to train people to use their systems. However, successful implementations required engineering, expertise, and experience, not just experimentation. Further, robust systems, those capable of running for days on end with built-in error detection and correction, were rare but necessary to avoid processing samples into junk. People had the need and the desire to make working systems, but not the process engineering skills to create successful implementations.
The life sciences market, and biotechnology in particular, took a different approach to robotics: they standardized the sample carrier format in the form of the microplate. The physical dimensions of the microplate were constant, while the number of sample cells could vary. The image on the right of Figure A5, for example, shows plates with 96, 384, and 1,536 wells, with even higher densities available.

Figure A5. A microplate sample holder (left); examples of 96, 384, and 1,536 well holders, care of WikiMedia Commons (right)

What did this mean for robotics? Every device that interacted with a microplate could be reliably designed with a single gripper size and a single aperture size for plates to be inserted. Systems would "know" where in the sample space the sample wells were. In short, standardization made it easier to build equipment for a marketplace. And a lot of equipment was built and put into successful use.[f]
The combination of robotics and microplates also worked well in biotechnology because of the prevalence of liquid samples; this point cannot be stressed enough. We are good at working with fluids, including measuring amounts, transferring them, dispensing them, and so on. Solids can be a problem because of cross-contamination if equipment isn't cleaned, if the solid is tacky, if the particle size isn't right for the equipment, if it has a tendency to roll, and so on. Solids also have the potential for problems with homogeneity. (You can get layering in liquids, but that can be solved in "two shakes.")

1.3.1 Two approaches to sample processing with robotics
Broadly speaking, there are two approaches to sample processing with robotics: batching and sample-at-a-time runs.
The microplate is an example of batch processing. Each tray is handled (dilution, sealing, reading, etc.) at a station, and you don't have an opportunity to evaluate the results until the analysis for a tray is completed and reported. Other methods such as ICP, mass spectrometry, chromatography, etc. that use auto-injectors can behave in a similar fashion, or be evaluated on a sample-at-a-time basis. It depends on how the process is planned and implemented, and at what stage the results are evaluated.
The sample-at-a-time procedure offers an interesting alternative pathway for automated analysis. This process can include auto-titrators, in addition to the techniques already noted, as well as others. Here each sample is processed either one at a time or in overlapping stages. While one sample is being processed in stage 3 (adding a solvent, for example), another sample in stage 4 is being injected into an instrument. This means that the results for one sample can be known before the next one is processed. The benefit is that systematic errors can be caught and corrected before all samples are processed. This would reduce waste and improve overall efficiency.

1.4 Sample storage organization
Before we leave the topic of samples, we need to address the subject of sample storage organization. This is important, as poor organization and management can nullify any gains from automation. There are two aspects to consider: the organization process itself and the physical nature of the samples.
In the life sciences, biobanking and refrigerated storage management have been actively discussed topics. For example, a white paper commissioned by Titian Software Ltd., titled The Essential Guide to Managing Laboratory Samples, goes into a fair amount of depth on the subject.[18] And if you Google "laboratory sample storage management," you'll get a sizeable listing of material, including peer-reviewed work by Redrup et al. titled "Sample Management: Recommendation for Best Practices and Harmonization from the Global Bioanalysis Consortium Harmonization Team."[19] The abstract of the work of Redrup et al. reads in part[19]:

The importance of appropriate sample management in regulated bioanalysis is undeniable for clinical and non-clinical study support due to the fact that if the samples are compromised at any stage prior to analysis, the study results may be affected. Health authority regulations do not contain specific guidance on sample management; therefore, as part of the Global Bioanalysis Consortium (GBC), the A5 team was established to discuss sample management requirements and to put forward recommendations.
In short, you have to have control of your samples and be able to ensure their integrity.

1.4.1 The nature of incoming samples
Considerations about the nature of incoming samples don't get as much press as they deserve. For example, if you are in quality control, the nature of your incoming samples is going to be consistent and determined by the production process. In fact, sample preparation is likely to be integrated with the test procedure's automation. If the samples are fluids, the impact on an automation process may be small compared to working with solids. One complicating factor with fluids is the need to remove extraneous material so that downstream problems aren't created.
That removal may be accomplished by filtering, settling, centrifugal separation, batch centrifugation, or other means, depending on the amount of material and its composition.
Working with solids raises several other issues. First, does the material have to be reduced to a coarse or fine powder before processing? This may be needed to permit precise weighing of amounts or to provide a large surface area for solvent extractions. Second, is fabrication of a sample piece required? Some mechanical properties testing of plastics requires molded test bars. Other tests may require pressing films for spectral analysis. Other issues exist as well. Some are industry-specific, for example working with hazardous materials (including toxic substances and radioactive samples), and those requiring careful environmental and/or security controls.
In many labs, automation is viewed as something that happens after the samples are logged in. Yet that doesn't have to be the case. The following paragraphs focus on testing labs because they are the most likely to benefit from automation. They meet the two key criteria for automation implementation: stable procedures and sufficient workload to justify the cost of automation development. That can also happen in research labs, though; in the end, it's simply a matter of the nature of the work.
Testing labs (e.g., quality control and non-routine internal and contract labs) can take steps to streamline their sample handling operations, though unfortunately at the expense of someone else's labor (just make it worth their effort). For example, those submitting samples can be required to pre-log their samples. This can be accomplished by giving them access to restricted accounts on a LIMS that let them log samples in, and little more. Sample containers can also be standardized with barcodes. The barcodes can then be required as part of the logging process and are critical to identifying samples that have reached the lab, as well as tracking their physical location. Additionally, sample container sizes and related container forms can be standardized. These should match the requirements for sample handling in automated systems, if possible. (Unless you supply them, your submitters may not have the tools for sealing sample vials, etc.) Finally, all this cooperative effort to standardize sample reception can be incentivized with price breaks and is likely to lead to faster TATs. In other words, give submitters an incentive to work with you, the lab; they are supplying labor that could impact their own productivity, so give them a good reason to cooperate.
Testing operations can, as a result, see further benefits:

Sample storage management and access is improved.
You'll be more informed of incoming work.
Automation and scheduling are enhanced when they begin with the requester instead of post-login.
My first professional lab experience was in polymers, in an analytical research group (some routine work, a lot of non-routine work). Samples would arrive in a variety of containers (e.g., bags, jars, test tubes, envelopes, fabricated parts taped to sample request forms). Sample matrices would range from pellets, waxes, and powders to liquids, gas cylinders, rolls of film, and more. Classifying those samples, figuring out where to put them, locating them, and preparing them for work (which often involved grinding them in a Wiley mill) was a major time sink, sufficiently so that the actual analysis was a smaller part of the overall workflow.
As such, anything you can do to streamline that process will help productivity and contribute to a successful automation project.

Abbreviations, acronyms, and initialisms
CDS: Chromatography data system
ELN: Electronic laboratory notebook
EVOP: Evolutionary operations
ICP: Inductively coupled plasma (spectrometry)
K/D/I: Knowledge, data, and information
LIMS: Laboratory information management system
LOF: Laboratory of the future
LSE: Laboratory systems engineer
NMR: Nuclear magnetic resonance
QC: Quality control
ROI: Return on investment
SDMS: Scientific data management system
Std.SP: Standard sample program
TLA: Total laboratory automation
TAT: Turn-around time

Footnotes
a. Where the amount of titrant added is adjusted based on the response to the previous addition. This should yield faster titrations with increased accuracy.
b. In prior writings, the term "scientific manufacturing" was used. The problem with that term is that we're not making products but instead producing results. Plus, "manufacturing results" has some negative connotations.
c. Wikipedia says this of calendering: "A calender is a series of hard pressure rollers used to finish or smooth a sheet of material such as paper, textiles, or plastics. Calender rolls are also used to form some types of plastic films and to apply coatings."
d. See the discussion on clinical chemistry in the main text.
e. If you're not familiar with the method, Wikipedia's chromatography page is a good starting point.
f. See Wikipedia's microplate article for more.

About the author
Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked and consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.

References
1. Liscouski, J. (2015). Computerized Systems in the Modern Laboratory: A Practical Guide. PDA/DHI. pp. 432. ASIN B010EWO06S. ISBN 978-1933722863.
2. Wilkinson, M.D.; Dumontier, M.; Aalbersberg, I.J. et al. (2016). "The FAIR Guiding Principles for scientific data management and stewardship". Scientific Data 3: 160018. doi:10.1038/sdata.2016.18. PMC 4792175. PMID 26978244.
3. Fish, M.; Minicuci, D. (2005). "Overcoming the Challenges of a LIMS Migration" (PDF). Research & Development 47 (2). http://apps.thermoscientific.com/media/SID/Informatics/PDF/Article-Overcoming-the-Challanges.pdf
4. Fish, M.; Minicuci, D. (1 April 2013). "Overcoming daunting business challenges of a LIMS migration". Scientist Live.
https:\/\/www.scientistlive.com\/content\/overcoming-daunting-business-challenges-lims-migration . Retrieved 17 November 2021.\n5. \"Overcoming the Challenges of Legacy Data Migration\". FreeLIMS.org. CloudLIMS.com, LLC. 29 June 2018. https:\/\/freelims.org\/blog\/legacy-data-migration-to-lims.html . Retrieved 17 November 2021.\n6. \"AUTO03 Laboratory Automation: Communications With Automated Clinical Laboratory Systems, Instruments, Devices, and Information Systems, 2nd Edition\". Clinical and Laboratory Standards Institute. 30 September 2009. https:\/\/clsi.org\/standards\/products\/automation-and-informatics\/documents\/auto03\/ . Retrieved 17 November 2021.\n7. Sarkozi, L.; Simson, E.; Ramanathan, L. (2003). \"The effects of total laboratory automation on the management of a clinical chemistry laboratory. Retrospective analysis of 36 years\". Clinica Chimica Acta 329 (1\u20132): 89\u201394. doi:10.1016\/S0009-8981(03)00020-2.\n8. \"Total Laboratory Automation - Michael Bissell, MD, Ph.D\". YouTube. University of Washington. 15 July 2014. https:\/\/www.youtube.com\/watch?v=RdwFZyYE_4Q . Retrieved 17 November 2021.\n9. \"MassLynx LIMS Interface\" (PDF). Waters Corporation. November 2016. https:\/\/www.waters.com\/webassets\/cms\/library\/docs\/720005731en%20Masslynx%20LIMS%20Interface.pdf . Retrieved 17 November 2021.\n10. \"ASTM E1578 - 18 Standard Guide for Laboratory Informatics\". ASTM International. 2018. https:\/\/www.astm.org\/Standards\/E1578.htm . Retrieved 17 November 2021.\n11. Schmitt, S., ed. Assuring Data Integrity for Life Sciences. DHI Publishing. pp. 385. ISBN 9781933722979. https:\/\/www.dhibooks.com\/assuring-data-integrity-for-life-sciences\n12. \"ASTM E2078 - 00(2016) Standard Guide for Analytical Data Interchange Protocol for Mass Spectrometric Data\". ASTM International. 2016. https:\/\/www.astm.org\/Standards\/E2078.htm . Retrieved 17 November 2021.\n13. \"ASTM E1948 - 98(2014) Standard Guide for Analytical Data Interchange Protocol for Chromatographic Data\". ASTM International. 2014. https:\/\/www.astm.org\/Standards\/E1948.htm . Retrieved 17 November 2021.\n14. \"MilliporeSigma Acquires BSSN Software to Accelerate Customers' Digital Transformation in the Lab\". Merck KGaA. 6 August 2019. https:\/\/www.emdgroup.com\/en\/news\/bssn-software-06-08-2019.html . Retrieved 17 November 2021.\n15. McDonald, Robert S.; Wilks, Paul A. (1 January 1988). \"JCAMP-DX: A Standard Form for Exchange of Infrared Spectra in Computer Readable Form\". Applied Spectroscopy 42 (1): 151\u2013162. doi:10.1366\/0003702884428734. ISSN 0003-7028. http:\/\/journals.sagepub.com\/doi\/10.1366\/0003702884428734\n16. \"Welcome to GAML.org\". GAML.org. 22 June 2007. http:\/\/www.gaml.org\/default.asp . Retrieved 17 November 2021.\n17. CNN Business (17 April 2015). \"Micro robots drive Bayer's high-tech vision\". YouTube. https:\/\/www.youtube.com\/watch?v=_PuTKH7143c . Retrieved 17 November 2021.\n18. Oxer, M.; Stroud, T. (2017). \"The Essential Guide to Managing Laboratory Samples\". Titian Software Ltd. https:\/\/www.titian.co.uk\/the-essential-guide-to-managing-laboratory-samples-web . Retrieved 17 November 2021.\n19. Redrup, Michael J.; Igarashi, Harue; Schaefgen, Jay; Lin, Jenny; Geisler, Lisa; Ben M'Barek, Mohamed; Ramachandran, Subramanian; Cardoso, Thales et al. (1 March 2016). 
\"Sample Management: Recommendation for Best Practices and Harmonization from the Global Bioanalysis Consortium Harmonization Team\" (in en). The AAPS Journal 18 (2): 290\u2013293. doi:10.1208\/s12248-016-9869-2. ISSN 1550-7416. PMC PMC4779093. PMID 26821803. http:\/\/link.springer.com\/10.1208\/s12248-016-9869-2 .   \n \n\n\n\n\n\n\nSource: <a rel=\"external_link\" class=\"external\" href=\"https:\/\/www.limswiki.org\/index.php\/LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\">https:\/\/www.limswiki.org\/index.php\/LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective<\/a>\nNavigation menuPage actionsLIIDiscussionView sourceHistoryPage actionsLIIDiscussionMoreToolsIn other languagesPersonal toolsLog inRequest accountNavigationMain pageRecent changesRandom pageHelp about MediaWikiSearch\u00a0 ToolsWhat links hereRelated changesSpecial pagesPermanent linkPage informationSponsors \r\n\n\t\r\n\n\t\r\n\n\t\r\n\n\t\r\n\n\t\n\t\n\t\r\n\n\t\r\n\n\t\n\t\r\n\n\t\r\n\n \n\t\n\t\n\t\r\n\n\t\r\n\n \n\t\n\t\r\n\n\t\r\n\n\t\n\t\n\t\r\n\n\t\n\t\n\t\r\n\n\t\r\n\n\t\n\t\n\t\r\n\n\t\r\n\n\t\n\t\r\nPrint\/exportCreate a bookDownload as PDFDownload as PDFDownload as Plain textPrintable version This page was last edited on 17 November 2021, at 23:40.Content is available under a Creative Commons Attribution-ShareAlike 4.0 International License unless otherwise noted.This page has been accessed 121 times.Privacy policyAbout LIMSWikiDisclaimers\n\n\n\n","87d7f050e0c47d7762a90382989592a1_html":"<body class=\"mediawiki ltr sitedir-ltr mw-hide-empty-elt ns-202 ns-subject page-LII_Directions_in_Laboratory_Systems_One_Person_s_Perspective rootpage-LII_Directions_in_Laboratory_Systems_One_Person_s_Perspective skin-monobook action-view skin--responsive\"><div id=\"rdp-ebb-globalWrapper\"><div id=\"rdp-ebb-column-content\"><div id=\"rdp-ebb-content\" class=\"mw-body\" role=\"main\"><a id=\"rdp-ebb-top\"><\/a>\n<h1 id=\"rdp-ebb-firstHeading\" class=\"firstHeading\" lang=\"en\">LII:Directions in Laboratory Systems: One Person's Perspective<\/h1><div id=\"rdp-ebb-bodyContent\" class=\"mw-body-content\"><!-- start content --><div id=\"rdp-ebb-mw-content-text\" lang=\"en\" dir=\"ltr\" class=\"mw-content-ltr\"><div class=\"mw-parser-output\"><p><b>Title<\/b>: <i>Directions in Laboratory Systems: One Person's Perspective<\/i>\n<\/p><p><b>Author for citation<\/b>: Joe Liscouski, with editorial modifications by Shawn Douglas\n<\/p><p><b>License for content<\/b>: <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/creativecommons.org\/licenses\/by-sa\/4.0\/\" target=\"_blank\">Creative Commons Attribution-ShareAlike 4.0 International<\/a>\n<\/p><p><b>Publication date<\/b>: November 2021\n<\/p>\n\n\n<h2><span class=\"mw-headline\" id=\"Introduction\">Introduction<\/span><\/h2>\n<p>The purpose of this work is to provide one person's perspective on planning for the use of computer systems in the <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory\" title=\"Laboratory\" class=\"wiki-link\" data-key=\"c57fc5aac9e4abf31dccae81df664c33\">laboratory<\/a>, and with it a means of developing a direction for the future. 
Rather than concentrating on \u201cscience first, support systems second,\u201d it reverses that order, recommending the construction of a solid support structure before populating the lab with systems and processes that produce knowledge, information, and data (K\/I\/D).\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Intended_audience\">Intended audience<\/span><\/h3>\n<p>This material is intended for those working in laboratories of all types. The biggest benefit will come to those working in startup labs since they have a clean slate to work with, as well as those freshly entering into scientific work as it will help them understand the roles of various systems. Those working in existing labs will also benefit by seeing a different perspective than they may be used to, giving them an alternative path for evaluating their current structure and how they might adjust it to improve operations. \n<\/p><p>However, all labs in a given industry can benefit from this guide since one of its key points is the development of industry-wide guidelines to solving technology management and planning issues, improving personnel development, and more effectively addressing common projects in automation, instrument communications, and vendor relationships (resulting in lower costs and higher success rates). This would also provide a basis for evaluating new technologies (reducing risks to early adopters) and fostering product development with the necessary product requirements in a particular industry.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"About_the_content\">About the content<\/span><\/h3>\n<p>This material follows in the footsteps of more than 15 years of writing and presentation on the topic. That writing and presentation\u2014<a href=\"https:\/\/www.limswiki.org\/index.php\/Book:LIMSjournal_-_Laboratory_Technology_Special_Edition\" title=\"Book:LIMSjournal - Laboratory Technology Special Edition\" class=\"wiki-link\" data-key=\"ca03292ab5af765e5a5a97438216ed5c\">compiled here<\/a>\u2014includes:\n<\/p>\n<ul><li><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Are_You_a_Laboratory_Automation_Engineer%3F\" title=\"LII:Are You a Laboratory Automation Engineer?\" class=\"wiki-link\" data-key=\"67df76407d0807e78d9cde61bb3f82c9\"><i>Are You a Laboratory Automation Engineer?<\/i> (2006)<\/a><\/li>\n<li><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Elements_of_Laboratory_Technology_Management\" title=\"LII:Elements of Laboratory Technology Management\" class=\"wiki-link\" data-key=\"2000eea677bcd5ee1fcecdab32743800\"><i>Elements of Laboratory Technology Management<\/i> (2014)<\/a><\/li>\n<li><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:A_Guide_for_Management:_Successfully_Applying_Laboratory_Systems_to_Your_Organization%27s_Work\" title=\"LII:A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work\" class=\"wiki-link\" data-key=\"00b300565027cb0518bcb0410d6df360\"><i>A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work<\/i> (2018)<\/a><\/li>\n<li><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Management_%26_Planning\" title=\"LII:Laboratory Technology Management & Planning\" class=\"wiki-link\" data-key=\"2016524e3c4b551c982fcfc23e33220d\"><i>Laboratory Technology Management & Planning<\/i> (2019)<\/a><\/li>\n<li><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Notes_on_Instrument_Data_Systems\" title=\"LII:Notes on Instrument Data Systems\" class=\"wiki-link\" 
data-key=\"1b7330228fd59158aab6fab82ad0e7cc\"><i>Notes on Instrument Data Systems<\/i> (2020)<\/a><\/li>\n<li><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering\" title=\"LII:Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering\" class=\"wiki-link\" data-key=\"655f7d48a642e9b45533745af73f0d59\"><i>Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering<\/i> (2020)<\/a><\/li>\n<li><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Considerations_in_the_Automation_of_Laboratory_Procedures\" title=\"LII:Considerations in the Automation of Laboratory Procedures\" class=\"wiki-link\" data-key=\"e0147011cc1eb892e1a35e821657a6d9\"><i>Considerations in the Automation of Laboratory Procedures<\/i> (2021)<\/a><\/li>\n<li><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\" title=\"LII:The Application of Informatics to Scientific Work: Laboratory Informatics for Newbies\" class=\"wiki-link\" data-key=\"d8b467af534a70312a21f63b61be26cd\"><i>The Application of Informatics to Scientific Work: Laboratory Informatics for Newbies<\/i> (2021)<\/a><\/li><\/ul>\n<p>While that material covers some of the \u201cwhere do we go from here\u201d discussions, I want to bring a lot of it together in one spot so that we can see what the entire picture looks like, while still leaving some of the details to the titles above. Admittedly, there have been some changes in thinking over time from what was presented in those pieces. For example, the concept of \"laboratory automation engineering\" has morphed into \"laboratory systems engineering,\" given that in the past 15 years the scope of laboratory automation and computing has broadened significantly. Additionally, references to \"scientific manufacturing\" are now replaced with \"scientific production,\" since laboratories tend to produce ideas, knowledge, results, information, and data, not tangible widgets. And as the state of laboratories continues to dynamically evolve, there will likely come more changes.\n<\/p><p>Of special note is 2019's <i>Laboratory Technology Management & Planning<\/i> webinars. They provide additional useful background towards what is covered in this guide.\n<\/p>\n<h2><span id=\"rdp-ebb-Looking_forward_and_back:_Where_do_we_begin?\"><\/span><span class=\"mw-headline\" id=\"Looking_forward_and_back:_Where_do_we_begin.3F\">Looking forward and back: Where do we begin?<\/span><\/h2>\n<h3><span id=\"rdp-ebb-The_"laboratory_of_the_future"_and_laboratory_systems_engineering\"><\/span><span class=\"mw-headline\" id=\"The_.22laboratory_of_the_future.22_and_laboratory_systems_engineering\">The \"laboratory of the future\" and laboratory systems engineering<\/span><\/h3>\n<p>The \u201claboratory of the future\u201d (LOF) makes for an interesting playground of concepts. People's view of the LOF is often colored by their commercial and research interests. Does the future mean tomorrow, next month, six years, or twenty years from now? 
In reality, it means all of those time spans coupled with the length of a person's tenure in the lab and the legacy they want to leave behind.\n<\/p><p>However, with those varied time spans we'll need flexibility and adaptability for <a href=\"https:\/\/www.limswiki.org\/index.php\/Information_management\" title=\"Information management\" class=\"wiki-link\" data-key=\"f8672d270c0750a858ed940158ca0a73\">managing data<\/a> and <a href=\"https:\/\/www.limswiki.org\/index.php\/Information\" title=\"Information\" class=\"wiki-link\" data-key=\"6300a14d9c2776dcca0999b5ed940e7d\">information<\/a> while also preserving the access to, and utility of, the products of lab work; that requires organization and planning. Laboratory equipment will change, and storage media and data formats will evolve. The instrumentation used to collect data and information will change, and so will the computers and software applications that manage that data and information. Every resource expended in executing lab work has been expended to develop knowledge, information, and data (K\/I\/D). How are you going to meet that management challenge and retain the expected return on investment (ROI)? Answering that question will be one of the hallmarks of the LOF. It will require a deliberate plan that touches on every aspect of lab work: people, equipment and systems choices, and relationships with vendors and information technology support groups. Some points reside within the lab while others require coordination with corporate groups, particularly when we address long-term storage, ease of access, and security (both physical and electronic).\n<\/p><p>During discussions of the LOF, some people focus on the technology behind the instruments and techniques used in lab work, and those technologies will continue to impress us with their sophistication. The bottom line of those conversations, however, is the ability to produce results: K\/I\/D.\n<\/p><p>Modern laboratory work is a merger of science and information technology. Some of the information technology is built into instruments and equipment; the remainder supports those devices or helps manage operations. That technology needs to be understood, planned, and engineered into smoothly functioning systems if labs are to function at a high level of performance.\n<\/p><p>Given all that, how do we prepare for the LOF, whatever that future turns out to be? One step is the development of \u201claboratory systems engineering\u201d as a means of bringing structure and discipline to the use of <a href=\"https:\/\/www.limswiki.org\/index.php\/Informatics_(academic_field)\" title=\"Informatics (academic field)\" class=\"wiki-link\" data-key=\"0391318826a5d9f9a1a1bcc88394739f\">informatics<\/a>, robotics, and <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_automation\" title=\"Laboratory automation\" class=\"wiki-link\" data-key=\"0061880849aeaca05f8aa27ae171f331\">automation<\/a> in lab systems.\n<\/p><p>But who is the laboratory systems engineer (LSE)? The LSE is someone who understands and is conversant in both the laboratory science and IT worlds, relating each to the other for the benefit of lab operations while guiding IT in its supporting role. 
A fully dedicated LSE will understand a number of important principles:\n<\/p>\n<ol><li>Knowledge, information, and data should always be protected, available, and usable.<\/li>\n<li><a href=\"https:\/\/www.limswiki.org\/index.php\/Data_integrity\" title=\"Data integrity\" class=\"wiki-link\" data-key=\"382a9bb77ee3e36bb3b37c79ed813167\">Data integrity<\/a> is paramount.<\/li>\n<li>Systems and their underlying components should be supportable, meaning they are proven to meet users' needs (validated), capable of being modified without causing conflicts with results produced by previous versions, documented, upgradable (without major disruption to lab operations), and able to survive upgrades in connected systems.<\/li>\n<li>Systems should be integrated into lab operations and not exist as isolated entities, unless there are overriding concerns.<\/li>\n<li>Systems should be portable, meaning they are able to be relocated and installed where appropriate, and not restricted to a specific combination of hardware and\/or software that can\u2019t be duplicated.<\/li>\n<li>There should be a smooth, reproducible (bi-directional, if appropriate), error-free (including error detection and correction) flow of results, from data generation to the point of use or need.<\/li><\/ol>\n<h3><span id=\"rdp-ebb-But_how_did_we_get_here?\"><\/span><span class=\"mw-headline\" id=\"But_how_did_we_get_here.3F\">But how did we get here?<\/span><\/h3>\n<p>The primary purpose of laboratory work is developing and carrying out scientific methods and experiments, which are used to answer questions. We don\u2019t want to lose sight of that, or the skill needed to do that work. Initially the work was done manually, which inevitably limited the amount of data and information that could be produced, and in turn the rate at which new knowledge could be developed and distributed.\n<\/p><p>However, the introduction of electronic instruments changed that, and the problem shifted from data and information production to data and information utilization (including the development of new knowledge), distribution, and management. That\u2019s where we are today.\n<\/p><p>Science plays a role in the production of data and information, as well as the development of knowledge. In between we have the tools used in data and information collection, storage, <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_analysis\" title=\"Data analysis\" class=\"wiki-link\" data-key=\"545c95e40ca67c9e63cd0a16042a5bd1\">analysis<\/a>, etc. That\u2019s what we\u2019ll be talking about in this document. The equipment and concepts we\u2019re concerned with here are the tools used to assist in conducting that work and working with the results. They are enablers and amplifiers of lab processes. 
As in almost any application, the right tools used well are an asset.\n<\/p><p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig1_Liscouski_DirectLabSysOnePerPersp21.png\" class=\"image wiki-link\" data-key=\"518c9d8873fabfb9564ead7d65137cd7\"><img alt=\"Fig1 Liscouski DirectLabSysOnePerPersp21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/d\/dc\/Fig1_Liscouski_DirectLabSysOnePerPersp21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 1.<\/b> Databases for knowledge, information, and data (K\/I\/D) are represented as ovals, and the processes acting on them as arrows.<sup id=\"rdp-ebb-cite_ref-1\" class=\"reference\"><a href=\"#cite_note-1\">[1]<\/a><\/sup><\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>One problem with the distinction between science and informatics tools is that lab personnel understand the science, but they largely don't understand the intricacies of information and computing technologies that comprise the tools they use to facilitate their work. Laboratory personnel are educated in a variety of scientific disciplines, each having its own body of knowledge and practices, each requiring specialization. Shouldn't the same apply to the expertise needed to address the \u201ctools\u201d? On one hand we have <a href=\"https:\/\/www.limswiki.org\/index.php\/Chromatography\" title=\"Chromatography\" class=\"wiki-link\" data-key=\"2615535d1f14c6cffdfad7285999ad9d\">chromatographers<\/a>, <a href=\"https:\/\/www.limswiki.org\/index.php\/Spectroscopy\" title=\"Spectroscopy\" class=\"wiki-link\" data-key=\"2babfd09e1f6d00d86ad7032cbb60d91\">spectroscopists<\/a>, physical chemists, toxicologists, etc., and on the other roboticists, database experts, network specialists, and so on. In today's reality these are not IT specialists but rather LSEs who understand how to apply the IT tools to lab work with all the nuances and details needed to be successful.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Moving_forward\">Moving forward<\/span><\/h3>\n<p>If we are going to advance laboratory science and its practice, we need the right complement of experienced people. Moving forward to address laboratory- and organization-wide productivity needs a different perspective; rather than ask how we improve things at the bench, ask how we improve the processes and organization.\n<\/p><p>This guide is about long-term planning for lab automation and K\/I\/D management. The material in this guide is structured in sections, and each section starts with a summary, so you can read the summary and decide if you want more detail. However, before you toss it on the \u201clater\u201d pile, read the next few bits and then decide what to do.\n<\/p><p>Long-term planning is essential to organizational success. The longer you put it off, the more expensive it will be, the longer it will take to do it, and the more entrenched behaviors you'll have to overcome. Additionally, you won\u2019t be in a position to take advantage of the developing results.\n<\/p><p>It\u2019s time to get past the politics and the inertia and move forward. 
Someone has to take the lead on this, full-time or part-time, depending on the size of your organization (i.e., if there's more than one lab, know that it affects all of them). Leadership should be from the lab side, not the IT side, as IT people may not have the backgrounds needed and may view everything through their organization's priorities. (However, their support will be necessary for a successful outcome.)\n<\/p><p>The work conducted on the lab bench produces data and information. That is the start of realizing the benefits from research and testing work. The rest depends upon your ability to work with that data and information, which in turn depends on how well your data systems are organized and managed. This culminates in maximizing benefit at the least cost, i.e., ROI. It's important to you, and it's important to your organization.\n<\/p><p>Planning has to be done on at least four levels:\n<\/p>\n<ol><li>Industry-wide (e.g., biotech, mining, electronics, cosmetics, food and beverage, plastics, etc.)<\/li>\n<li>Within your organization<\/li>\n<li>Within your lab<\/li>\n<li>Within your lab processes<\/li><\/ol>\n<p>One important aspect of this planning process\u2014particularly at the top, industry-wide level\u2014is the specification of a framework to coordinate product, process (methods), or standards research and development at the lower levels. This industry-wide framework is ideally not a \u201cthis is what you must do\u201d mandate but rather a common structure that can be adapted to make the work easier. It also serves as a basis for approaching vendors for products and product modifications that will benefit those in your industry, giving them confidence that the requests have broader market appeal. If an industry-wide approach isn't feasible, then larger companies may group together to provide the needed leadership. Note, however, that this should not be perceived as an industry\/company vs. vendor effort; rather, this is an industry\/company working with vendors. 
The idea of a large group effort is to demonstrate a consensus viewpoint and that vendors' development efforts won\u2019t be in vain.\n<\/p><p>The development of this framework, among other things, should cover:\n<\/p>\n<ul><li>Informatics<\/li>\n<li>Communications (networking, instrument control and data, informatics control and data, etc.)<\/li>\n<li>Physical security (including power)<\/li>\n<li>Data integrity and security<\/li>\n<li><a href=\"https:\/\/www.limswiki.org\/index.php\/Cybersecurity\" class=\"mw-redirect wiki-link\" title=\"Cybersecurity\" data-key=\"ba653dc2a1384e5f9f6ac9dc1a740109\">Cybersecurity<\/a><\/li>\n<li>The FAIR principles (the findability, accessibility, interoperability, and reusability of data<sup id=\"rdp-ebb-cite_ref-WilkinsonTheFAIR16_2-0\" class=\"reference\"><a href=\"#cite_note-WilkinsonTheFAIR16-2\">[2]<\/a><\/sup>)<\/li>\n<li>The application of <a href=\"https:\/\/www.limswiki.org\/index.php\/Cloud_computing\" title=\"Cloud computing\" class=\"wiki-link\" data-key=\"fcfe5882eaa018d920cedb88398b604f\">cloud<\/a> and <a href=\"https:\/\/www.limswiki.org\/index.php\/Virtualization\" title=\"Virtualization\" class=\"wiki-link\" data-key=\"e6ef1e0497ceb6545c0948d436eba29c\">virtualization<\/a> technologies<\/li>\n<li>Long-term usable access to lab information in databases without vendor controls (i.e., the impact of <a href=\"https:\/\/www.limswiki.org\/index.php\/Software_as_a_service\" title=\"Software as a service\" class=\"wiki-link\" data-key=\"ae8c8a7cd5ee1a264f4f0bbd4a4caedd\">software as a service<\/a> and other software subscription models)<\/li>\n<li>Bi-directional <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_exchange\" title=\"Data exchange\" class=\"wiki-link\" data-key=\"7f41f95d4835a37b958fa9e870357f66\">data interchange<\/a> between archived instrument data in standardized formats and vendor software, requiring tamper-proof formats<\/li>\n<li>Instrument design for automation<\/li>\n<li>Sample storage management<\/li>\n<li>Guidance for automation<\/li>\n<li>Education for lab management and lab personnel<\/li>\n<li>The conversion of manual methods to semi- or fully automated systems<\/li><\/ul>\n<p>These topics affect both a lab's science personnel and its LSEs. While some topics will be of more interest to the engineers than the scientists, both groups have a stake in the results, as do any IT groups.\n<\/p><p>As digital systems become more entrenched in scientific work, we may need to restructure our thinking from \u201clab bench\u201d and \u201cinformatics\u201d to \u201cdata and information sources\u201d and \u201cdigital tools for working, organizing, and managing those elements.\" Data and information sources can extend to third-party labs and other published material. We have to move from digital systems causing incremental improvements (today\u2019s approach), to a true revolutionary restructuring of how science is conducted.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Laboratory_computing\">Laboratory computing<\/span><\/h2>\n<p><b>Key point<\/b>: <i>Laboratory systems are planned, designed, and engineered. They are not simply a collection of components. Laboratory computing is a transformational technology, one which has yet to fully emerge in large part because those who work in laboratories with computing aren\u2019t fully educated about it to take advantage of it.<\/i>\n<\/p><p>Laboratory computing has always been viewed as an \"add-on\" to traditional laboratory work. 
These add-ons have the potential to improve our work, make it faster, and make it more productive (see Appendix 1 for more details).\n<\/p><p>The common point-of-view in the discussion of lab computing has been focused on the laboratory scientist or manager, with IT providing a supporting role. That isn\u2019t the only viewpoint available to us, however. Another viewpoint is from that of the laboratory systems engineer (LSE), who focuses on data and information flow. This latter viewpoint should compel us to reconsider the role of computing in the laboratory and the higher level needs of laboratory operations.\n<\/p><p>Why is this important? Data and information generation may represent the end of the lab bench process, but it\u2019s just the beginning of its use in broader scientific work. The ability to take advantage of those elements in the scope of manufacturing and corporate research and development (R&D) is where the real value is realized. That requires planning for storage, access, and utilization over the long term.\n<\/p><p>The problem with the traditional point-of-view (i.e., instrumentation first, with computing in a supporting role) is that the data and information landscape is built supporting the portion of lab work that is the most likely to change (Figure 2). You wind up building an information architecture to meet the requirements of diverse data structures instead of making that architecture part of the product purchase criteria. Systems are installed as needs develop, not as part of a pre-planned information architecture.\n<\/p><p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig2_Liscouski_DirectLabSysOnePerPersp21.png\" class=\"image wiki-link\" data-key=\"55245fa5ec8c8de01e309fff006f043f\"><img alt=\"Fig2 Liscouski DirectLabSysOnePerPersp21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/7\/70\/Fig2_Liscouski_DirectLabSysOnePerPersp21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 2.<\/b> Hierarchy of lab systems, noting frequency of change<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Designing an informatics architecture has some things in common with building a house. You create the foundation first, a supporting structure that everything sits on. Adding the framework sets up the primary living space, which can be modified as needed without disturbing the foundation (Figure 3). 
If you built the living space first and then wanted to install a foundation, you'd have a mess to deal with.\n<\/p><p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig3_Liscouski_DirectLabSysOnePerPersp21.png\" class=\"image wiki-link\" data-key=\"8fa3e98d742ed53067982cbcabdfb29a\"><img alt=\"Fig3 Liscouski DirectLabSysOnePerPersp21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/3e\/Fig3_Liscouski_DirectLabSysOnePerPersp21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 3.<\/b> Comparison of foundation and living\/working space levels in an organization<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The same holds true with <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_informatics\" title=\"Laboratory informatics\" class=\"wiki-link\" data-key=\"00edfa43edcde538a695f6d429280301\">laboratory informatics<\/a>. Set the foundation\u2014the <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_information_management_system\" title=\"Laboratory information management system\" class=\"wiki-link\" data-key=\"8ff56a51d34c9b1806fcebdcde634d00\">laboratory information management system<\/a> (LIMS), <a href=\"https:\/\/www.limswiki.org\/index.php\/Electronic_laboratory_notebook\" title=\"Electronic laboratory notebook\" class=\"wiki-link\" data-key=\"a9fbbd5e0807980106763fab31f1e72f\">electronic laboratory notebook<\/a> (ELN), <a href=\"https:\/\/www.limswiki.org\/index.php\/Scientific_data_management_system\" title=\"Scientific data management system\" class=\"wiki-link\" data-key=\"9f38d322b743f578fef487b6f3d7c253\">scientific data management system<\/a> (SDMS), etc.\u2014first, then add the data and information generation systems. That gives you a common set of requirements for making connections and can clarify some issues in product selection and systems integration. It may seem backwards if your focus is on data and information production, but as soon as you realize that you have to organize and manage those products, the benefits will be clear.\n<\/p><p>You might wonder how you go about setting up a LIMS, ELN, etc. before the instrumentation is set. However, it isn't that much of a problem. You know why your lab is there and what kind of work you plan to do. That will guide you in setting up the foundation. The details of tests can be added as needed. Most of that depends on having your people educated in what the systems are and how to use them.\n<\/p><p>Our comparison between a building and information systems does bring up some additional points. A building's access to utilities runs through control points; water and electricity don't come in from the public supply to each room but run through central points that include a distribution system with safety and management features. We need the same thing in labs when it comes to network access. In our current society, access to private information for profit is a fact of life. While there are desirable features of lab systems available through network access (remote checking, access to libraries, updates, etc.), that access should be controlled so that those with malicious intent are kept out and data and information are protected. Should instrument systems and office computers have access to corporate and external networks? That's your decision; it revolves around how you want your lab run, as well as other applicable corporate policies.\n<\/p><p>The connections layer in Figure 3 is where devices connect to each other and the major informatics layer. This layer includes two functions: basic networking capability and application-to-application transfer. Take for example moving pH measurements to a LIMS or ELN; this is where things can get very messy. You need to define what that transfer involves and what the standards are to ensure a well-managed system (more on that when we look at industry-wide guidelines).\n<\/p>
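\n<p>To make that application-to-application transfer concrete, the following minimal sketch (in Python) wraps a single pH reading in a self-describing record with a checksum so the receiving LIMS or ELN can detect corruption in transit. The record layout and function names are illustrative assumptions, not any vendor's protocol; a real interface would also carry acknowledgments and retries.<\/p>\n<pre>
import hashlib
import json
from datetime import datetime, timezone

def package_ph_result(sample_barcode: str, ph: float, meter_id: str) -> str:
    """Wrap one pH reading in a self-describing envelope with a checksum."""
    record = {
        "sample": sample_barcode,
        "quantity": "pH",
        "value": round(ph, 2),
        "instrument": meter_id,
        "acquired_at": datetime.now(timezone.utc).isoformat(),
    }
    payload = json.dumps(record, sort_keys=True)
    checksum = hashlib.sha256(payload.encode("utf-8")).hexdigest()
    return json.dumps({"record": record, "sha256": checksum})

def receive(message: str) -> dict:
    """Receiving side (LIMS or ELN): verify the checksum before accepting."""
    envelope = json.loads(message)
    payload = json.dumps(envelope["record"], sort_keys=True)
    if hashlib.sha256(payload.encode("utf-8")).hexdigest() != envelope["sha256"]:
        raise ValueError("checksum mismatch; reject and request retransmission")
    return envelope["record"]

# e.g., accepted = receive(package_ph_result("S-2021-0418", 7.02, "pH-meter-03"))
<\/pre>\n<p>The point of the sketch is the discipline, not the format: every transferred result identifies its sample, instrument, quantity, and acquisition time, and the receiver verifies the payload before accepting it. That is exactly the kind of behavior an industry-wide standard would pin down.<\/p>\n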
<p>To complete the analogy, people do move the living space of a house from one foundation to another, often making for an interesting experience. Similarly, it's also possible to change the informatics foundation from one product set to another. It means exporting the contents of the database(s) to a product-independent format and then importing them into the new system. If you think this is something that might be in your future, make the ability to carry out that process part of the product selection criteria. Like moving a house, it isn't going to be fun.<sup id=\"rdp-ebb-cite_ref-FishOvercoming05_3-0\" class=\"reference\"><a href=\"#cite_note-FishOvercoming05-3\">[3]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-FishOvercoming13_4-0\" class=\"reference\"><a href=\"#cite_note-FishOvercoming13-4\">[4]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-FreeLIMSWhat18_5-0\" class=\"reference\"><a href=\"#cite_note-FreeLIMSWhat18-5\">[5]<\/a><\/sup> The same holds true for ELN and SDMS.\n<\/p>
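\n<p>A minimal sketch of the export half of such a migration, assuming (hypothetically) that the outgoing system's tables can be reached through SQL: each table is dumped to plain CSV with a header row, a product-independent format a successor system can import. Real migrations also have to carry audit trails, attachments, and validation evidence, which is part of why the text warns this won't be fun.<\/p>\n<pre>
import csv
import sqlite3

def export_table(db_path: str, table: str, out_csv: str) -> None:
    """Dump one table to CSV with a header row: a plain, product-independent
    format that a successor system (or an auditor) can read on its own."""
    con = sqlite3.connect(db_path)
    try:
        # The table name must come from a vetted list, never from user input.
        cur = con.execute("SELECT * FROM " + table)
        headers = [col[0] for col in cur.description]
        with open(out_csv, "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            writer.writerow(headers)
            writer.writerows(cur)
    finally:
        con.close()

# e.g., export_table("lims.db", "samples", "samples.csv")
<\/pre>\n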
\n<h2><span id=\"rdp-ebb-How_automation_affects_people's_work_in_the_lab\"><\/span><span class=\"mw-headline\" id=\"How_automation_affects_people.27s_work_in_the_lab\">How automation affects people's work in the lab<\/span><\/h2>\n<p><b>Key point<\/b>: <i>There are two basic ways lab personnel can approach computing: it's a black box that they don't understand but is used as part of their work, or they are fully aware of the equipment's capabilities and limitations and know how to use it to its fullest benefit.<\/i>\n<\/p><p>While lab personnel may be fully educated in the science behind their work, the role of computing\u2014from pH meters to multi-instrument data systems\u2014may be viewed with a lack of understanding. That is a significant problem: they are responsible for the results those systems produce, yet they may not be aware of what happens to the signals from the instruments, where the limitations lie, or what can turn a well-executed procedure into junk because an instrument or computer setting wasn't properly evaluated and used. \n<\/p><p>In reality, automation has both a technical impact on the work and an impact on the people doing it. Both are outlined below.\n<\/p><p>Technical impact on work:\n<\/p>\n<ul><li>It can make routine work easier and more productive, reducing costs and improving ROI (more on that below).<\/li>\n<li>It can allow work to be performed that might otherwise be too expensive to entertain. There are techniques, such as high-throughput screening and statistical experimental design, that are useful in laboratory work but might be avoided because the effort of generating the needed data is too labor-intensive and time-consuming. Automated systems can relieve that problem and produce the volumes of data those techniques require.<\/li>\n<li>It can improve accuracy and reproducibility. Automated systems, properly designed and implemented, are inherently more reproducible than a corresponding manual system.<\/li>\n<li>It can increase safety by limiting people's exposure to hazardous situations and materials.<\/li>\n<li>It can also be a financial hole if planning and engineering aren't properly applied to a project. \u201cScope creep,\u201d changes in direction, and changes in project requirements and personnel are key reasons that projects are delayed or fail.<\/li><\/ul>\n<p>Impact on the personnel themselves:\n<\/p>\n<ul><li>It can increase technical specialization, potentially improving work opportunities and people's job satisfaction. Having people move into a new technology area gives them an opportunity to grow both personally and professionally.<\/li>\n<li>Full automation of a process can cause some jobs to end, or at least change them significantly (more on that below).<\/li>\n<li>It can elevate routine work to more significant supervisory roles.<\/li><\/ul>\n<p>Most of these impacts are straightforward to understand, but several require further elaboration.\n<\/p>\n<h3><span id=\"rdp-ebb-It_can_make_routine_work_easier_and_more_productive,_reducing_costs_and_improving_ROI\"><\/span><span class=\"mw-headline\" id=\"It_can_make_routine_work_easier_and_more_productive.2C_reducing_costs_and_improving_ROI\">It can make routine work easier and more productive, reducing costs and improving ROI<\/span><\/h3>\n<p>This sounds like a standard marketing pitch; is there any evidence to support it? In the 1980s, clinical chemistry labs were faced with a problem: the cost for their services was set on an annual basis, without any adjustments permitted for rising costs during that period. If costs rose, income dropped; the more testing they did, the worse the problem became. They addressed this problem as a community, and that was a key factor in their success. Clinical chemistry labs test materials taken from people and animals and run standardized tests. This is the kind of environment that automation was created for, and they, as a community, embarked on a total laboratory automation (TLA) program. That program had a number of factors: education, standardization of equipment (the tests were standardized, so the vendors knew exactly what they needed in equipment capabilities), and the development of instrument and computer communications protocols that enabled the transfer of data and information between devices (application to application).\n<\/p><p>Organizations such as the American Society for Clinical Laboratory Science (ASCLS) and the American Association for Clinical Chemistry (AACC), as well as many others, provide members with education and industry-wide support. 
Other examples include the Clinical and Laboratory Standards Institute (CLSI), a non-profit organization that develops standards for laboratory automation and informatics (e.g., AUTO03-A2 on laboratory automation<sup id=\"rdp-ebb-cite_ref-CLSIAUTO03_6-0\" class=\"reference\"><a href=\"#cite_note-CLSIAUTO03-6\">[6]<\/a><\/sup>), and <a href=\"https:\/\/www.limswiki.org\/index.php\/Health_Level_7\" title=\"Health Level 7\" class=\"wiki-link\" data-key=\"e0bf845fb58d2bae05a846b47629e86f\">Health Level Seven, Inc.<\/a>, a non-profit organization that provides software standards for <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_integration\" title=\"Data integration\" class=\"wiki-link\" data-key=\"fd01c635859e1d5b9583e43e31ef6718\">data interoperability<\/a>. \n<\/p><p>Given that broad, industry-wide effort to address automation issues, the initial results were as follows<sup id=\"rdp-ebb-cite_ref-SarkoziTheEff03_7-0\" class=\"reference\"><a href=\"#cite_note-SarkoziTheEff03-7\">[7]<\/a><\/sup>:\n<\/p>\n<ul><li>Between 1965 and 2000, the Consumer Price Index increased by a factor of 5.5 in the United States.<\/li>\n<li>During the same 36 years, at Mount Sinai Medical Center's chemistry department, the productivity (indicated as the number of reported test results\/employee\/year) increased from 10,600 to 104,558 (nearly 10-fold).<\/li>\n<li>When expressed in constant 1965 dollars, the total cost per test decreased from $0.79 to $0.15.<\/li><\/ul>\n<p>In addition, the following data (Tables 1 and 2) from Dr. Michael Bissell of Ohio State University provides further insight into the potential increase in labor productivity from implementing TLA in the lab<sup id=\"rdp-ebb-cite_ref-BissellTotal14_8-0\" class=\"reference\"><a href=\"#cite_note-BissellTotal14-8\">[8]<\/a><\/sup>:\n<\/p>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table class=\"wikitable\" border=\"1\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td colspan=\"4\" style=\"background-color:white; padding-left:10px; padding-right:10px;\"><b>Table 1.<\/b> Overall productivity of labor<br \/> <br \/>FTE = Full-time equivalent; TLA = Total laboratory automation\n<\/td><\/tr>\n<tr>\n<th style=\"padding-left:10px; padding-right:10px;\">Ratio\n<\/th>\n<th style=\"padding-left:10px; padding-right:10px;\">Pre-TLA\n<\/th>\n<th style=\"padding-left:10px; padding-right:10px;\">Post-Phase 1\n<\/th>\n<th style=\"padding-left:10px; padding-right:10px;\">Change\n<\/th><\/tr>\n<tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">Tests\/FTE\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">50,813\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">64,039\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">+27%\n<\/td><\/tr>\n<tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">Tests\/Tech FTE\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">80,058\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">89,120\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">+11%\n<\/td><\/tr>\n<tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">Tests\/Paid hour\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">20.8\n<\/td>\n<td style=\"background-color:white; padding-left:10px; 
padding-right:10px;\">52.9\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">+24%\n<\/td><\/tr>\n<tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">Tests\/Worked hour\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">24.4\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">30.8\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">+26%\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table class=\"wikitable\" border=\"1\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td colspan=\"4\" style=\"background-color:white; padding-left:10px; padding-right:10px;\"><b>Table 2.<\/b> Productivity of labor-processing area<br \/> <br \/>FTE = Full-time equivalent; TLA = Total laboratory\n<\/td><\/tr>\n<tr>\n<th style=\"padding-left:10px; padding-right:10px;\">Ratio\n<\/th>\n<th style=\"padding-left:10px; padding-right:10px;\">Pre-TLA\n<\/th>\n<th style=\"padding-left:10px; padding-right:10px;\">Post-Phase 1\n<\/th>\n<th style=\"padding-left:10px; padding-right:10px;\">Change\n<\/th><\/tr>\n<tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">Specimens\/Processing FTE\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">39,899\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">68,708\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">+72%\n<\/td><\/tr>\n<tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">Specimens\/Processing paid hour\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">19.1\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">33.0\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">+73%\n<\/td><\/tr>\n<tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">Requests\/Processing paid hour\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">12.7\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">21.5\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">+69%\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Through TLA, improvements can be seen in:\n<\/p>\n<ul><li>Improved sample throughput<\/li>\n<li>Cost reduction<\/li>\n<li>Less variability in the data<\/li>\n<li>Reduced and more predictable consumption of materials<\/li>\n<li>Improved use of people's talents<\/li><\/ul>\n<p>While this data is almost 20 years old, it illustrates the impact in a change from a manual system to an automated lab environment. It also gives us an idea of what might be expected if industry- ir community-wide automation programs were developed.\n<\/p><p>Clinical laboratories are not unique in the potential to organize industry-wide standardized aspects of technology and work, and provide education. The same can be done anywhere as long as the ability to perform a particular test procedure isn\u2019t viewed as a unique competitive advantage. 
<p>The clinical industry has provided a template for the development of laboratory systems and automation. The list of instruments that meet the clinical communications standard continues to grow (e.g., Waters' MassLynx LIMS Interface<sup id=\"rdp-ebb-cite_ref-WatersMassLynx16_9-0\" class=\"reference\"><a href=\"#cite_note-WatersMassLynx16-9\">[9]<\/a><\/sup>). There is nothing unique about the communications standards that prevents them from being used as a basis for development in other industries, aside from the data dictionary. As such, we need to move from an every-lab-for-itself approach to lab systems development toward a more cooperative and synergistic model.\n<\/p>\n<h3><span id=\"rdp-ebb-Automation_can_cause_some_jobs_to_end,_or_at_least_change_them_significantly\"><\/span><span class=\"mw-headline\" id=\"Automation_can_cause_some_jobs_to_end.2C_or_at_least_change_them_significantly\">Automation can cause some jobs to end, or at least change them significantly<\/span><\/h3>\n<p>Change is one of the things managers and the people under them get concerned about. No matter how beneficial the change is for the organization, it raises people's anxiety levels and can affect their job performance unless they are prepared for it. In that context, questions and concerns staff may have in relation to automating aspects of a job include:\n<\/p>\n<ul><li>How are these changes going to affect my job and my income? Will they cause me to lose my job? That's about as basic as it gets, and it can impact people at any organizational level.<\/li>\n<li>Bringing in new technologies and products means learning new things, and that opens up the possibility that people may fail or not be as effective as they are currently. It can reshuffle the pecking order.<\/li>\n<li>The process of introducing new equipment, procedures, etc. is going to disrupt the lab's <a href=\"https:\/\/www.limswiki.org\/index.php\/Workflow\" title=\"Workflow\" class=\"wiki-link\" data-key=\"92bd8748272e20d891008dcb8243e8a8\">workflow<\/a>. The changes may be procedural, structural, or both; how are you going to deal with those issues?<\/li><\/ul>\n<p>Two past examples highlight different approaches. In the first, a multi-instrument automation system was being introduced. Management told the lab personnel what was going to happen and why, and that they would be part of the final acceptance process. If they weren't happy, the system would be modified to meet their needs. The system was installed, software was written to meet their needs, instruments were connected, and the system was tested. Everyone was satisfied except one technician, and that proved to be a major roadblock to putting the system into service. The matter was discussed with the lab manager, who didn't see the problem; as soon as the system was up and running, that technician would be given a new set of responsibilities, something she was interested in. But no one told her that. As she saw it, once the system came on line she was out of a job. (One of the primary methods used in that lab's work was chromatography, with the instrument output recorded on chart paper. Most measurements were done using peak height, but peak area was used for some critical analyses. 
Those exacting measurements, made with a planimeter, were her responsibility and her unique\u2014as she saw it\u2014contribution to the lab's work. The instrument system replaced the need for this work. The other technicians and chemists had no problem adapting to the data system.) However, a short discussion between her and the lab manager alleviated the concerns.\n<\/p><p>The second example, concerning the implementation of a lab's first LIMS, was handled a lot differently. The people in the lab knew something was going on but not the details. Individual meetings were held with each member of the lab team to discuss what was being considered and to learn of their impressions and concerns (these sessions were held with an outside consultant, and the results were reported to management in summary form). Once that was completed, the project started, with the lab manager holding a meeting of all lab personnel and IT, describing what was going to be done, why, how the project was going to proceed, and the role of those working in the lab in determining product requirements and product selection. The concerns raised in the individual sessions were addressed up-front, and staff all understood that no one was going to lose their job or suffer a pay cut. Yes, some jobs would change, and where appropriate that was discussed with each individual. There was an educational course about what a LIMS was, its role in the lab's work, and how it would improve the lab's operations. When those sessions were completed, the lab's personnel looked forward to the project. In short, instead of the project happening to them, it happened with them as willing participants.\n<\/p><p>People's attitudes about automation systems and being willing participants in their development can make a big difference in a project's success or failure. You don't want people to see the incoming system and the questions surrounding it as threats, or as something that could end their employment. They may not freely participate, or they may leave when you need them the most.\n<\/p><p>All of this may seem daunting for a lab to take on by itself. Large companies may have the resources to handle it, but we need more than a large company to do this right; we need an industry-wide effort.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Development_of_industry-wide_guidelines\">Development of industry-wide guidelines<\/span><\/h2>\n<p><b>Key point<\/b>: <i>The discussion above about clinical labs illustrates what can be accomplished when an industry group focuses on a technology problem. We need to extend that thinking\u2014and action\u2014to a broader range of industries individually. 
The benefits of an industry-wide approach to addressing technology and education issues include:<\/i>\n<\/p>\n<ul><li><i>providing a wider range of inputs to solving problems;<\/i><\/li>\n<li><i>providing a critical marketing mass to lobby vendors to create products that fit customers\u2019 needs in a particular industry;<\/i><\/li>\n<li><i>developing an organized educational program with emphasis on that industry's requirements;<\/i><\/li>\n<li><i>giving labs (startup and existing) a well-structured reference point to guide them (not dictate) in making technology decisions;<\/i><\/li>\n<li><i>reducing the cost of automation, with improved support; and<\/i><\/li>\n<li><i>where competitive issues aren\u2019t a factor, enabling industry-funded R&D technology development and implementation for production or manufacturing quality, process control (integration of online quality information), and process management.<\/i><\/li><\/ul>\n<p>The proposal: each industry group should define a set of guidelines to assist labs in setting up an information infrastructure. For the most part, large sections would end up similar across multiple industries, as there isn\u2019t that much behavioral distinction between some industry sets. The real separation would come in two places: the data dictionaries (data descriptions) and the nature of the testing and automation to implement that testing. Where there is a clear competitive edge to a test or its execution, each company may choose to go it alone, but that still leaves a lot of room for cooperation in method development, addressing both the basic science and its automated implementations, particularly where ASTM, USP, etc. methods are employed.\n<\/p><p>The benefits of this proposal are noted in the key point above. However, the three most significant ones are arguably:\n<\/p>\n<ul><li>the development of an organized education program with an emphasis on industry requirements;<\/li>\n<li>the development of a robust communications protocol for application-to-application transfers; and,<\/li>\n<li>the ability to lobby vendors from an industry-wide basis for product development, modification, and support.<\/li><\/ul>\n<p>In the \"Looking forward and back\" section earlier in this guide, we showed a bulleted list of considerations for the development of such a guideline-based framework. What follows is a more organized version of those points, separated into three sections, which all need to be addressed in any industry-based framework. For the purposes of this guide, we'll focus primarily on Section C: \"Issues that need concerted attention.\" Section A on background and IT support, and Section B on lab-specific background information, aren't discussed as they are addressed elsewhere, particularly in the previous works referenced in the Introduction. \n<\/p>\n<dl><dd><b>A.<\/b> General background and routine IT support\n<dl><dd>1. Physical security (including power)<\/dd>\n<dd>2. Cybersecurity<\/dd>\n<dd>3. Cloud and virtualization technologies<\/dd><\/dl><\/dd><\/dl>\n<dl><dd><b>B.<\/b> Lab-specific background information\n<dl><dd>1. Informatics (could be an industry-specific version of <i>ASTM E1578 - 18 Standard Guide for Laboratory Informatics<\/i><sup id=\"rdp-ebb-cite_ref-ASTME1578_18_10-0\" class=\"reference\"><a href=\"#cite_note-ASTME1578_18-10\">[10]<\/a><\/sup>)<\/dd>\n<dd>2. Sample storage management and organization (see Appendix 1, section 1.4 of this guide)<\/dd>\n<dd>3. Guidance for automation<\/dd>\n<dd>4: Data integrity and security (see S. 
Schmitt's <i>Assuring Data Integrity for Life Sciences<\/i><sup id=\"rdp-ebb-cite_ref-11\" class=\"reference\"><a href=\"#cite_note-11\">[11]<\/a><\/sup>, which has broader application outside the life sciences)<\/dd>\n<dd>5. The FAIR principles (the findability, accessibility, interoperability, and reusability of data; see Wilkinson <i>et al.<\/i> 2016<sup id=\"rdp-ebb-cite_ref-WilkinsonTheFAIR16_2-1\" class=\"reference\"><a href=\"#cite_note-WilkinsonTheFAIR16-2\">[2]<\/a><\/sup>)<\/dd><\/dl><\/dd><\/dl>\n<dl><dd><b>C.<\/b> Issues that need concerted attention:\n<dl><dd>1. Education for lab management and lab personnel<\/dd>\n<dd>2. The conversion of manual methods to semi- or fully automated methods (see <a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Considerations_in_the_Automation_of_Laboratory_Procedures\" title=\"LII:Considerations in the Automation of Laboratory Procedures\" class=\"wiki-link\" data-key=\"e0147011cc1eb892e1a35e821657a6d9\"><i>Considerations in the Automation of Laboratory Procedures<\/i><\/a> for more)<\/dd>\n<dd>3. Long-term usable access to lab information in databases without vendor controls (i.e., the impact of <a href=\"https:\/\/www.limswiki.org\/index.php\/Software_as_a_service\" title=\"Software as a service\" class=\"wiki-link\" data-key=\"ae8c8a7cd5ee1a264f4f0bbd4a4caedd\">software as a service<\/a> and other software subscription models)<\/dd>\n<dd>4. Archived instrument data in standardized formats and standardized vendor software (requires tamper-proof formats)<\/dd>\n<dd>5. Instrument design for automation (Most instruments and their support software are dual-use, i.e., they work as stand-alone devices via front panels or through software controls. While this is a useful selling tool\u2014supporting either manual or automated use\u2014it means the device is larger, more complex, and more expensive than an automation-only device that uses software [e.g., a smartphone or computer] for everything. Instruments and devices designed for automation should be more compact and permit more efficient automation systems.)<\/dd>\n<dd>6. Communications (networking, instrument control and data, informatics control and data, etc.)<\/dd><\/dl><\/dd><\/dl>\n<p>We'll now go on to expand upon items C-1, C-3, C-4, and C-6.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"C-1._Education_for_lab_management_and_lab_personnel\">C-1. Education for lab management and lab personnel<\/span><\/h3>\n<p>Laboratory work has become a merger of two disciplines: science and information technology (including robotics). Without the first, nothing happens; without the second, work will proceed, but at a slower, more costly pace. There are different levels of educational requirements. Those working at the lab bench not only need to understand the science, but also how the instrumentation and supporting computer systems (if any, including those embedded in the instrument) make and transform measurements into results. \u201cThe computer did it\u201d is not a satisfactory answer to \u201chow did you get that result,\u201d nor is \u201cmagic,\u201d nor is maintaining willful ignorance of exactly how the instrument measurements are taken. Laboratorians are responsible and accountable for the results, and they should be able to explain the process of how they were derived, including how settings on the instrument and computer software affect that work. Lab managers should understand the technologies at the functional level and how the systems interact with each other. 
They are accountable for the overall integrity of the systems and the data and information they contain. IT personnel should understand how lab systems differ from office applications and why business-as-usual for managing office applications doesn\u2019t work in the lab environment. The computer systems are there to support the instruments, and any changes that may affect that relationship should be initiated with care; the instrument vendor\u2019s support for computer system upgrades is essential.\n<\/p><p>As such, we need a new personnel position, that of the laboratory systems engineer or LSE, to provide support for the informatics architecture. This isn\u2019t simply an IT person; it should be someone who is fluent in both the science and the information technology applied to lab work. (See <a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering\" title=\"LII:Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering\" class=\"wiki-link\" data-key=\"655f7d48a642e9b45533745af73f0d59\"><i>Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering<\/i><\/a> for more on this topic.)\n<\/p>\n<h3><span class=\"mw-headline\" id=\"C-3._Long-term_usable_access_to_lab_information_in_databases_without_vendor_controls\">C-3. Long-term usable access to lab information in databases without vendor controls<\/span><\/h3>\n<p>The data and information your lab produces, with the assistance of instrumentation and instrument data systems, are yours, and no one should put limits on your ability to work with them. There is a problem with modern software design, however: most lab data and information can only be viewed through application software. In some cases, the files may be used with several applications, but often it is the vendor's proprietary formats that limit access. In those instances, you have to maintain licenses for that software for as long as you need access to the data and information, even if the original application has been replaced by something else. This can happen for a variety of reasons:\n<\/p>\n<ul><li>Better technology is available from another vendor<\/li>\n<li>A vendor sold part of its operations to another organization<\/li>\n<li>Organizations merge<\/li>\n<li>Completion of a consulting or research contract requires all data to be sent to the contracting organization<\/li><\/ul>\n<p>All of these, and more, can leave people maintaining multiple versions of similar datasets that they need continued access to, without wanting to maintain software licenses indefinitely, even as they must still meet regulatory (FDA, EPA, ISO, etc.) requirements.\n<\/p><p>All of this revolves around your ability to gain value from your data and information without having to pay for its access. The vendors don\u2019t want to give their software away for free either. What we need is something like the relationship between Adobe Acrobat Reader and the Adobe Acrobat application software. The latter gives you the ability to create, modify, and comment on documents, while the former allows you to view them. The Reader gives anyone the ability to view the contents, just not alter them. We need a \u201creader\u201d application for instrument data collected and processed by an instrument data system. We need to be able to view the reports, raw data, etc., and export the data in a useful format, everything short of acquiring and processing new data. 
This gives you the ability to work with your intellectual property and allows it to be viewed by regulatory agencies if that becomes necessary, without incurring unnecessary costs or depriving the vendor of justifiable income.\n<\/p><p>This has become increasingly important as vendors have shifted to a subscription model for software licenses in place of one-time payments with additional charges for voluntary upgrades. One example from another realm illustrates the point. My wife keeps all of her recipes in an application on her iPad. One day she looked for a recipe and received a message that read, roughly, \u201cNo access unless you upgrade the app <not free>.\u201d As it turned out, a Google search recommended re-installing the current app instead. It worked, but she upgraded anyhow, just to be safe. It\u2019s your content, but who \u201cowns\u201d it if the software vendor can impose controls on how it is used?\n<\/p><p>As more of our work depends on software, we find ourselves in a constant upgrade loop of new hardware, new operating systems, and new applications just to maintain the status quo. We need more control over what happens to our data. Industry-wide guidelines backed by the buying power of an industry could foster vendor policies that would mitigate that. Earlier in this document we noted that the future is going to extend industry change for a long time, with hardware and software evolving in ways we can\u2019t imagine. Hardware changes (anyone remember floppy disks?) inevitably make almost everything obsolete, so how do we protect our access to data and information? Floppy disks were the go-to media 40 years ago, and since then cassette tapes, magnetic tape, Zip drives, CD-ROMs, DVDs, SyQuest drives, and other types of storage media have come and gone. Networked systems, at the moment, are the only consistent and reliable means of exchange and storage as datasets increase in size.\n<\/p><p>One point we have to take into account is that versions of applications will only function on certain versions of operating systems and databases. All three elements are going to evolve, and at some point we\u2019ll have to deal with \u201cold\u201d generations while new ones are coming online. One good answer to that is virtualization. Software such as VMware's virtualization products allows packages of operating systems, databases, and applications to run on computers regardless of their age, with each collection residing in a \u201ccontainer\u201d; we can have several generations of those containers residing on a computer and execute them at will, as if they were still running on the original hardware. If you are looking for the data for a particular sample, go to the container that covers its time period and access it. Emulation packages are similarly powerful tools; using them you can even run Atari 2600 games on a Windows or OS X system.\n<\/p><p>Addressing this issue is generally bigger than what a basic laboratory-based organization can handle, involving policy making and support from information technology support groups and corporate legal and financial management. The policies have to take into account the physical locations of servers, support, financing, regulatory support groups, and international laws, even if your company isn\u2019t a multinational (third-party contracting organizations may be). 
Given this complexity, and the fact that most companies in a given industry will be affected, industry-wide guidelines would be useful.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"C-4._Archived_instrument_data_in_standardized_formats_and_standardized_vendor_software\">C-4. Archived instrument data in standardized formats and standardized vendor software<\/span><\/h3>\n<p>This has been an area of interest for over 25 years, beginning with the Analytical Instrument Association's work that resulted in a set of ASTM Standard Guides:\n<\/p>\n<ul><li><i>ASTM E2078 - 00(2016) Standard Guide for Analytical Data Interchange Protocol for Mass Spectrometric Data<\/i><sup id=\"rdp-ebb-cite_ref-ASTMEE2078_16_12-0\" class=\"reference\"><a href=\"#cite_note-ASTMEE2078_16-12\">[12]<\/a><\/sup><\/li>\n<li><i>ASTM E1948 - 98(2014) Standard Guide for Analytical Data Interchange Protocol for Chromatographic Data<\/i><sup id=\"rdp-ebb-cite_ref-ASTME1948_14_13-0\" class=\"reference\"><a href=\"#cite_note-ASTME1948_14-13\">[13]<\/a><\/sup><\/li><\/ul>\n<p>MilliporeSigma continues to invest in solutions based on the Analytical Information Markup Language (AnIML) standard (an outgrowth of work done at NIST).<sup id=\"rdp-ebb-cite_ref-MerckMilli19_14-0\" class=\"reference\"><a href=\"#cite_note-MerckMilli19-14\">[14]<\/a><\/sup> There have also been a variety of standards programs, all of which have a goal of moving instrument data into a neutral data format that is free of proprietary interests, allowing it to be used and analyzed as the analyst needs (e.g., JCAMP-DX<sup id=\"rdp-ebb-cite_ref-15\" class=\"reference\"><a href=\"#cite_note-15\">[15]<\/a><\/sup> and GAML<sup id=\"rdp-ebb-cite_ref-GAML_16-0\" class=\"reference\"><a href=\"#cite_note-GAML-16\">[16]<\/a><\/sup>).\n<\/p><p>Data interchange standards can help address issues in two aspects of data analysis: qualitative and quantitative work. In qualitative applications, the exported data can be imported into other packages that provide facilities not found in the original data acquisition system. Examining an infrared spectrum or <a href=\"https:\/\/www.limswiki.org\/index.php\/Nuclear_magnetic_resonance_spectroscopy\" title=\"Nuclear magnetic resonance spectroscopy\" class=\"wiki-link\" data-key=\"a05c6a4eb8775761248c099371cdb82f\">nuclear magnetic resonance<\/a> (NMR) scan depends upon peak amplitudes, shapes, and positions to provide useful information, and some software (including user-developed software) may provide facilities that the original data acquisition system didn\u2019t; or it might simply be a matter of having a file to send to someone for review or inclusion in another project.\n<\/p><p>Quantitative analysis is a different matter. Techniques such as chromatography, <a href=\"https:\/\/www.limswiki.org\/index.php\/Inductively_coupled_plasma_mass_spectrometry\" title=\"Inductively coupled plasma mass spectrometry\" class=\"wiki-link\" data-key=\"56b77f19f090a156bf0d78c71c76da21\">inductively coupled plasma mass spectrometry<\/a> (ICP-MS), and <a href=\"https:\/\/www.limswiki.org\/index.php\/Atomic_absorption_spectroscopy\" title=\"Atomic absorption spectroscopy\" class=\"wiki-link\" data-key=\"e68cc5571310e82b3cec4ab67ebb68f5\">atomic absorption spectroscopy<\/a>, among others, rely on the measurement of peak characteristics or single-wavelength absorbance measurements in comparison with measurements made on standards. A single chromatogram is useless (for quantitative work) unless the standards are available. 
(However, it may be useful for qualitative work if retention time references to appropriate known materials are available.) If you are going to export the data for any of the techniques noted, and others as well, you need the full collection of standards and samples for the data to be of any value.\n<\/p><p>Yet there is a problem with some, if not all, of these programs: they trust the integrity of the analyst to use the data honestly. It is possible for people to use these exported formats in ways that circumvent current data integrity practices and falsify results.\n<\/p><p>There are good reasons to want vendor-neutral data formats: datasets can be examined by user-developed software, in a form where the analysis is not limited by a vendor's product design. Such formats also hold the potential for separating data acquisition from <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_analysis\" title=\"Data analysis\" class=\"wiki-link\" data-key=\"545c95e40ca67c9e63cd0a16042a5bd1\">data analysis<\/a>, as long as all pertinent data and information (e.g., standards and <a href=\"https:\/\/www.limswiki.org\/index.php\/Sample_(material)\" title=\"Sample (material)\" class=\"wiki-link\" data-key=\"7f8cd41a077a88d02370c02a3ba3d9d6\">samples<\/a>) are held together in a package that cannot be altered without detection. It may be that something akin to <a href=\"https:\/\/www.limswiki.org\/index.php\/Blockchain\" title=\"Blockchain\" class=\"wiki-link\" data-key=\"ae8b186c311716aca561aaee91944f8e\">blockchain<\/a> technology could be used to register and manage access to datasets on a company-by-company basis (each company having its own registry that would become part of the lab's data architecture).\n<\/p><p>These standard formats, often created by people with a passion for doing something useful and beneficial to the practice of science (sometimes at their own expense), are potentially very useful to labs. This is another area where a coordinated, industry-wide statement of requirements and support would lead to significant advancements in systems development, and enhanced capability for those practicing instrumental science.\n<\/p><p>If these capabilities are important to you, then addressing that need has to be part of an industry-wide conversation and consensus to provide the marketing support to have the work done.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"C-6._Communications\">C-6. Communications<\/span><\/h3>\n<p>Most computers and electronic devices in the laboratory have communications capability built in: RS-232 (or similar), digital I\/O, Ethernet, Bluetooth, Wi-Fi, IEEE-488, etc. What these connection types provide is the potential for communications; the realization of that potential depends on message formatting, the structure of the contents, and moving beyond proprietary interests to those expressed by an industry-wide community. There are two types of devices that need to be addressed: computer systems that service instruments (instrument data systems), and devices that don\u2019t require an external computer to function (pH meters, balances, etc.) but may have Ethernet ports, serial ASCII, digital I\/O, or IEEE-488 connections. In either case, the typical situation is one in which the vendor has determined a communications structure and the user has to adapt their systems to it, often using custom programming to parse messages and take action. 
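For instance (a hypothetical illustration), reading a balance that reports a stable weight as a line like \"S S 12.3456 g\" over a serial port typically means writing a small parser by hand. A minimal Python sketch, assuming that reply layout and using the common pyserial package, might look like this:\n<\/p>\n<pre>\nimport serial  # the pyserial package\n\ndef read_weight_grams(port='\/dev\/ttyUSB0'):\n    # Poll a hypothetical balance; assumed stable-weight reply: 'S S 12.3456 g'\n    with serial.Serial(port, baudrate=9600, timeout=2) as conn:\n        fields = conn.readline().decode('ascii', errors='replace').split()\n        if len(fields) != 4 or fields[3] != 'g':\n            raise ValueError('unrecognized balance reply: ' + ' '.join(fields))\n        return float(fields[2])\n<\/pre>\n<p>Multiply that small, vendor-specific parser by every instrument in the lab, and the cost of ad hoc integration becomes clear.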
\n<\/p><p>The user community needs to define its requirements and make them known, using community-wide buying power as justification for asking for vendor development efforts. As noted earlier, this is not an adversarial situation, but rather a maturation of industry-wide communities working with vendors; the \u201cbuying power\u201d comments simply give the vendors the confidence that their development efforts won\u2019t be in vain.\n<\/p><p>In both cases we need application-to-application message structures that meet several needs. They should be able to handle test method identifications, so that the receiving application knows what the message is addressing, as well as whether the content is a list of samples to be processed or samples that have already been processed, with results. Additionally, the worklist message should ideally contain, in attribute-value pairs, the number of samples, with a list of sample IDs to be processed. As for the completed-work message, the attribute-value pairs would also ideally contain the number of samples processed, the sample IDs, and the results of the analysis (which could be one or more elements depending on the procedure). Of course, there may be other elements required of these message structures; these are just examples. Ultimately, the final framework could be implemented in a format similar to that used in HTML files: plain text that is easily read and machine- and operating-system-independent.\n<\/p><p>Mapping the message content to the database system structure (LIMS or ELN) could be done using a simple built-in application (within the LIMS or ELN) that would graphically display the received message content on one side of a window, with the database\u2019s available fields on the other side. The two sides would then be graphically mapped to one another, as shown in Figure 4 below. (Note: Since the format of the messages is standardized, we wouldn\u2019t need a separate mapping function to accommodate different vendors.)\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig4_Liscouski_DirectLabSysOnePerPersp21.png\" class=\"image wiki-link\" data-key=\"5e23312fb3d39315bd902b3f62450a52\"><img alt=\"Fig4 Liscouski DirectLabSysOnePerPersp21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/b\/b2\/Fig4_Liscouski_DirectLabSysOnePerPersp21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 4.<\/b> Mapping IDS message contents to a database system<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The second case\u2014devices like pH meters, etc.\u2014is a little more interesting, since the devices don\u2019t have the same facilities that a computer system provides. However, in the consumer marketplace this is well-trod ground, using a smartphone or tablet as an interface, with translation mechanisms between small, fixed-function devices and more extensive application platforms. 
The only non-standard element is a single-point Bluetooth connection, as shown in Figure 5.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig5_Liscouski_DirectLabSysOnePerPersp21.png\" class=\"image wiki-link\" data-key=\"7597192728a3c14de36813edad03a6d1\"><img alt=\"Fig5 Liscouski DirectLabSysOnePerPersp21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/c\/cc\/Fig5_Liscouski_DirectLabSysOnePerPersp21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 5.<\/b> Using a smartphone or tablet as an interface to a LIMS or ELN. The procedure takes a series of pH measurements for a series of samples.<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>A single-point Bluetooth connection is used to exchange information between the measuring device and the smartphone or tablet. There are multi-point connection devices (the smartphone, for example), but we want to restrict the device to avoid confusion about who is in control of the measuring unit. A setup screen (not shown) would define the worklist and populate the sample IDs. The \u201cTake Reading\u201d button would read the device and enter the value into the corresponding sample position\u2014taking them in order\u2014and enable the next reading until all samples had been read. \u201cSend\u201d would transmit the formatted message to the LIMS or ELN. Depending on the nature of the device and the procedure being used, the application can take any form; this is just a simple example taking pH measurements for a series of samples. In essence, the device-smartphone combination becomes an instrument data system, and the database setup would be the same as described above.\n<\/p><p>The bottom line is this: standardized message formats can greatly simplify the interfacing of instrument data systems, LIMS, ELNs, and other laboratory informatics applications. The clinical chemistry community created communications standards that permit meaningful messages to be sent between an instrument data system and a database system, structured so that either end of the link would recognize the message and be able to extract and use the message content without the custom programming common to most labs today. There is no reason why the same thing can\u2019t be done in any other industry; it may even be possible to adapt the clinical chemistry protocols. The data dictionary (list of test names and attributes) would have to be adjusted, but that is a minor point that can be handled on an industry-by-industry basis and be incorporated as part of a system installation. \n<\/p><p>What is needed is people coming together as a group and organizing and defining the effort. 
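As a purely illustrative sketch of what such a group might standardize (the attribute names below are assumptions, not part of any published protocol), a worklist message and its companion results message could be as simple as:\n<\/p>\n<pre>\nmessage_type:  worklist\ntest_method:   pH-basic-rev3      (assumed method identifier)\nsample_count:  3\nsample_ids:    S-0101, S-0102, S-0103\n\nmessage_type:  results\ntest_method:   pH-basic-rev3\nsample_count:  3\nresults:       S-0101=7.02, S-0102=6.87, S-0103=7.15\n<\/pre>\n<p>Any LIMS, ELN, or instrument data system that understands the agreed-upon attribute names could generate and parse such messages with no vendor-specific glue code.\n<\/p><p>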
How important is it to you to streamline the effort in getting systems up and running without custom programming, to work toward the plug-and-play capability that we encounter in consumer systems (an environment where vendors know easy integration is a must or their products won\u2019t sell)?\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Additional_note_on_Bluetooth-enabled_instruments\">Additional note on Bluetooth-enabled instruments<\/span><\/h4>\n<p>The addition of Bluetooth to a device can result in much more compact systems, making the footprint smaller and reducing the cost of the unit. By using a smartphone to replace the front panel controls, the programming can become much more sophisticated. Imagine a Bluetooth-enabled pH meter and a separate Bluetooth-enabled, electronically controlled titrator. That combination would permit optimized<sup id=\"rdp-ebb-cite_ref-17\" class=\"reference\"><a href=\"#cite_note-17\">[a]<\/a><\/sup> delivery of titrant, making the process faster and more accurate, while also providing a graphical display of the resulting titration curve, putting the full data processing capability of the smartphone and its communications at the service of the experiment. Think about what the addition of a simple clock did for thermostats: it opened the door to programmable thermostats and better controls. What would smartphones controlling simple devices like balances, pH meters, titrators, etc. facilitate?\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Laboratory_work_and_scientific_production\">Laboratory work and scientific production<\/span><\/h2>\n<p><b>Key point<\/b>: <i>Laboratory work is a collection of processes and procedures that have to be carried out for research to progress, or to support product manufacturing and production processes. In order to meet \u201cproductivity\u201d goals, get enough work done to make progress, or provide a good ROI, we need labs to transition from manual to automated (partial or full) execution as much as possible. We also need to ensure and demonstrate process stability, so that lab work itself doesn\u2019t create variability and unreliable results.<\/i>\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Different_kinds_of_laboratory_work\">Different kinds of laboratory work<\/span><\/h3>\n<p>There are different kinds of laboratory work. Some consist of a series of experiments or observations that may not be repeated, or are repeated with expected or unplanned variations. Other experiments may be repeated over and over, either without variation or with controlled variations based on gained experience. It is that second set that concerns this writing, because repetition provides a basis and a reason for automation.\n<\/p><p>Today we are used to automation in lab work, enacted in the form of computers and robotics ranging from mechanical arms to auto-samplers, auto-titration systems, and more. Some of the technology development we've discussed began many decades ago, and there's utility in looking back that far to note how technologies have developed. However, while there are definitely forms of automation and robotics available today, they are not always interconnected, integrated, or compatible with each other; they were produced as products either without an underlying integration framework, or with one that was limited to a few cooperating vendors. This can be problematic.\n<\/p><p>There are two major elements to repetitive laboratory methods: the underlying science and how the procedures are executed. 
At these methods\u2019 core, however, is the underlying scientific method, be it a chemical, biological, or physical sequence. We automate processes, not things. We don\u2019t, for example, automate an instrument, but rather the process of using it to accomplish something. If you are automating the use of a telescope to obtain the spectra of a series of stars, you build a control algorithm to look at each star in a list, have the instrument position itself properly, record the data using sensors, process it, and then move on to the next star on the list. (You would also build in error detection with messages like \"there isn\u2019t any star there,\" \"cover on telescope,\" and \"it isn\u2019t dark yet,\" along with relevant correction routines.) The control algorithm is the automation, while the telescope and sensors are just tools being used.\n<\/p><p>When building a repetitive laboratory method, the method should be completely described and include aspects such as:\n<\/p>\n<ul><li>the underlying science<\/li>\n<li>a complete list of equipment and materials when implemented as a manual process<\/li>\n<li>possible interferences<\/li>\n<li>critical facets of the method<\/li>\n<li>variables that have to be controlled and their operating ranges (e.g., temperature, pH of solutions, etc.)<\/li>\n<li>special characteristics of instrumentation<\/li>\n<li>safety considerations<\/li>\n<li>recommended sources of equipment and sources to be avoided<\/li>\n<li>software considerations such as possible products, desirable characteristics, and things to be avoided<\/li>\n<li>potentially dangerous or hazardous situations<\/li>\n<li>an end-to-end walkthrough of the methods used<\/li><\/ul>\n<p>Of course, the method has to be validated. You need to have complete confidence in its ability to produce the results necessary given the input into the process. At this point, the scientific aspects of the work are complete. They may be revisited if problems arise during process automation, but no further changes should be entertained unless you are willing to absorb additional costs and alterations to the schedule. This is a serious matter; one of the leading causes of project failures is \u201cscope creep,\u201d which occurs when incremental changes are made to a process while it is under development. This results in the project becoming a moving target, with seemingly minor changes able to cause a major disruption in the project's design.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Scientific_production\">Scientific production<\/span><\/h3>\n<p>At this point, we have a proven procedure that we need to execute repeatedly on a set of inputs (samples). We aren\u2019t carrying out \"experiments\"; that word suggests that something may be changing in the nature of the procedure, and at this point it shouldn\u2019t. Changes to the underlying process invite a host of problems and may forfeit any chance at a successful automation effort.\n<\/p><p>However, somewhere in the (distant) future something in the process is likely to change. A new piece of technology may be introduced, equipment (including software) may need to be upgraded, or the timing of a step may need to be adjusted. That\u2019s life. How that change is made is very important. 
Before it is implemented, the process has to be in place long enough to establish that it works reliably, that it produces useful results, and that there is a history of running the same reference sample(s) over and over again in the mix of samples to show that the process is under control, with acceptable variability in results (i.e., statistical process control). In manufacturing and production language, this controlled approach to change is referred to as \"evolutionary operations\" (EVOP). But we can pull it all together under the heading \u201cscientific production.\u201d<sup id=\"rdp-ebb-cite_ref-18\" class=\"reference\"><a href=\"#cite_note-18\">[b]<\/a><\/sup>\n<\/p><p>If your reaction to the previous paragraph is \u201cthis is a science lab, not a commercial production operation,\u201d you\u2019re getting close to the point. This is a production operation (commercial or not, depending on the type of lab, contract testing, and other factors) going on within a science lab. It\u2019s just a matter of scale.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Demonstrating_process_stability:_The_standard_sample_program\">Demonstrating process stability: The standard sample program<\/span><\/h3>\n<p>One of the characteristics of a successfully implemented stable process is consistency of the results with the same set of inputs. Basically, if everything is working properly, the same set of samples introduced at the beginning should yield the same results time after time, regardless of whether the implementation is manual or automated. A standard sample program (Std.SP) is a means of demonstrating the stability and operational integrity of a process; it can also tell you if a process is getting out of control. I was introduced to the Std.SP early in my career, where it was used to show the consistency of results between analysts in a lab, and to compare the performance of our lab against the quality control labs in the production facilities. The samples were submitted and marked like any similar sample and became part of the workflow. The lab managers maintained control charts of the lab's and each individual's performance on the samples. The combined results would show whether the lab and those working there had things under control or if problems were developing. Those problems might be with an individual, a change in equipment performance, or a change in incoming raw materials used in the analysis.\n<\/p><p>The Std.SP was the answer to the typical \u201chow do you know you can trust these results?\u201d question, which often came up when a research program or production line was having issues. This becomes more important when automation is used and the sampling and testing throughput increases. If an automated system is having issues, you are just producing bad data faster. The Std.SP is a high-level test of the system's performance. A deviation in the results trend line or a widening of the variability is an indication that something is wrong, and it should lead to a detailed evaluation of the system to account for the deviation. A troubleshooting guide, containing guidance such as \u201cif something starts going wrong, here\u2019s where to look for problems\u201d or \u201cthe test method is particularly sensitive to \u2026\u201d, should be part of the original method description, along with notes made during the implementation process.\n<\/p><p>Reference samples are another matter, however. They have to be stable over a long enough period of use to establish a meaningful trend. 
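To make the control-chart idea concrete before returning to reference samples, the Python sketch below shows the kind of check a Std.SP implies; the 2-sigma warning and 3-sigma action limits are conventional assumptions, not a prescription:\n<\/p>\n<pre>\nfrom statistics import mean, stdev\n\ndef check_standard_sample(history, new_result):\n    # history: past results for one reference sample; new_result: latest run\n    center, sigma = mean(history), stdev(history)\n    deviation = abs(new_result - center)\n    if deviation > 3 * sigma:\n        return 'action: stop and investigate the method and instrument'\n    if deviation > 2 * sigma:\n        return 'warning: watch the next runs closely'\n    return 'in control'\n\n# Example: repeated assays of one reference sample (made-up values)\npast = [49.8, 50.1, 50.0, 49.9, 50.2, 50.0, 49.7, 50.1]\nprint(check_standard_sample(past, 51.2))  # well outside 3 sigma: 'action'\n<\/pre>\n<p>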
If the reference samples are stable over a long period of time, you may only need two or three so that the method can be evaluated over different ranges (you may have higher variability at a lower range than at a higher one). If the reference samples are not stable, then their useful lifetime needs to be established and older ones swapped out periodically. There should be some overlap between samples near the end of their useful life and the introduction of new ones, so that laboratorians can differentiate between system variation and changes in reference samples. You may need to use more reference samples in these situations.\n<\/p><p>This inevitably becomes a lot of work, and if your lab utilizes many procedures, it can be time-consuming to manage a Std.SP. But before you decide it is too much work, ask yourself how valuable it is to have a solid, documented answer to the \u201chow do you know you can trust these results?\u201d question. Having a validated process or method is a start, but validation only describes the method as it was at the time of implementation. If all of this is new to you, find someone who understands process engineering and statistical process control and learn more from them.\n<\/p>\n<h2><span id=\"rdp-ebb-Where_is_the_future_of_lab_automation_going_to_take_us?\"><\/span><span class=\"mw-headline\" id=\"Where_is_the_future_of_lab_automation_going_to_take_us.3F\">Where is the future of lab automation going to take us?<\/span><\/h2>\n<p><b>Key point<\/b>: <i>There is no single \u201cfuture\u201d for laboratories; each lab charts its own path. However, there are things they can all do to be better prepared to meet the next day, month, or decade. Plan for flexibility and adaptability. Keep your data and information accessible, educate your personnel to recognize and take advantage of opportunities, and don\u2019t be afraid of taking risks.<\/i>\n<\/p><p>Where is the future of lab automation going to take us? The simple answer is that the future is in your hands. Lab personnel are going to be making their own choices, and that is both a reflection of reality and part of the problem. Aside from a few industries, labs have approached the subject of automation individually, making their own plans and charting their own courses of action. That approach is both inefficient and wasteful of resources (e.g., funding, time, and effort).\n<\/p><p>Earlier we saw how the clinical chemistry industry handled the problem of containing costs through automation: it made the automation work better. Within any given industry, it\u2019s unlikely that the same type of testing is going to differ markedly; there will be unique demands and opportunities, but they will be the exception. Why not pool resources and solve common problems? It would benefit the end user in the lab as well as the vendor, encouraging vendors to build products for a specific market rather than meeting the needs of a single customer. Among the biggest issues in automation and lab computing is communications between devices; the clinical chemistry industry has solved that. Its solution has elements that are specific to clinical work, but these should be easily adjusted to meet the needs of other groups.\n<\/p><p>We\u2019re in a phase where we have many technologies that work well independently for their intended purpose, but they don't work well together. We\u2019ve been there before, with two examples to be found in computer networking and computer graphics. 
\n<\/p><p>Computer networks began to emerge commercially in the early 1970s. Each computer vendor had its own approach to networking (similar hardware but different communications protocols). Within each vendor's ecosystem things worked well, yet bridging across ecosystems was another matter. When network installations occurred at customer sites, the best hardware and software people were involved in planning, laying out the systems, installing software, and getting things to work. Installation time was measured in weeks. Today, a reasonably sophisticated home or business network can be installed in an hour or two, maybe longer depending on the number of machines and components involved, and when you turn it on, you\u2019d be surprised if it didn\u2019t work. What made the difference? Standards were developed for communications protocols. Instead of each vendor having its own protocol, the world adopted TCP\/IP and everything changed.\n<\/p><p>The path of computer graphics development provides further insight. In the 1970s and 80s, when computer graphics hardware began seeing widespread use, every vendor in the market had its own graphics display hardware and software. If you liked one vendor's graphics library, you had to buy their hardware. If you liked their hardware, you had to live with their software. We saw this play out in the early days of the PC, when there were multiple graphics adapter cards (e.g., the enhanced graphics adapter [EGA], color graphics adapter [CGA], video graphics array [VGA], Hercules, etc.), and if you had the wrong card installed, the software wouldn\u2019t work (although combination cards allowed the user to switch modes and reduce the number of boards used in the computer). This continued to frustrate users and developers until a standardized, device-independent graphics architecture was developed. Problem solved.\n<\/p><p>This is a common theme that comes up over and over. A new technology develops, vendors build products with unique features, and those products don\u2019t play well with each other (i.e., each vendor attempts to achieve a strong market position). Customers get frustrated, standards are developed that convert \u201cdon\u2019t play well\u201d to \"easy integration,\" and the market takes off.\n<\/p><p>How the future of lab automation is going to play out for you is in your hands, but there are things we can do to make it easier. As we noted earlier, few if any of the developments in lab automation were initially done according to a plan, though we\u2019ve also seen what planning can actually do in at least one setting, the field of clinical chemistry. This highlights the fact that not only do you need a plan for your laboratory, but a plan for automation in your industry would also be worthwhile; it would avoid spending resources individually where pooling is a better approach, and it would give you a stronger voice to work with vendors to create products that meet your industry's needs. A set of industry-wide strategies (e.g., for mining, viniculture, cosmetics, materials research, healthcare, biotech, electronics, etc.) would give you a solid starting point for the use of computing and automation technologies in your lab. There still would be plenty of room for unique applications.\n<\/p><p>Industry working groups are one need, while getting people educated is another. 
Determining how much you are willing to rely on automation is an additional point, which we\u2019ll illustrate with a discussion of quality control labs in the next section.\n<\/p><p>Before we move on to the next section, however, there are two terms that haven\u2019t been mentioned yet that need to be noted: <a href=\"https:\/\/www.limswiki.org\/index.php\/Artificial_intelligence\" title=\"Artificial intelligence\" class=\"wiki-link\" data-key=\"0c45a597361ca47e1cd8112af676276e\">artificial intelligence<\/a> (AI) and big data. I\u2019m not sure I know what AI is (the definition and examples change frequently), or that I\u2019d trust what is available today for serious work. For every article that talks about the wonders it will bring, there is one that talks about the pitfalls of design, built-in bias, and concerns about misapplication. In science, aside from using it as a \u201cdid you know about this?\u201d advisory tool, I\u2019d be concerned about using it. If you don\u2019t know how the AI arrived at its answer (one of AI\u2019s characteristics), why would you trust it? Big data, on the other hand, is on a more solid technical footing and should prove useful, though only if your data and information are structured to take advantage of it. You can\u2019t just say \u201chere are all my datasets\u201d; you have to organize them to be usable.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Applying_automation_to_lab_work\">Applying automation to lab work<\/span><\/h2>\n<p><b>Key point<\/b>: <i>Today, people\u2019s attitudes about work have changed, and with them the world has changed; the <a href=\"https:\/\/www.limswiki.org\/index.php\/COVID-19\" class=\"mw-redirect wiki-link\" title=\"COVID-19\" data-key=\"da9bd20c492b2a17074ad66c2fe25652\">COVID-19<\/a> <a href=\"https:\/\/www.limswiki.org\/index.php\/Pandemic\" title=\"Pandemic\" class=\"wiki-link\" data-key=\"bd9a48e6c6e41b6d603ee703836b01f1\">pandemic<\/a> has forced us to re-evaluate models of work. If nothing else, even though we\u2019re talking about science, the use of technology to conduct scientific work is going to take on a new importance. This won\u2019t be solved one lab at a time, but on an industry-by-industry basis. We need to think about key concepts and their implementation as a community if truly effective solutions are to be found and put into practice.<\/i>\n<\/p><p>In <a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Considerations_in_the_Automation_of_Laboratory_Procedures\" title=\"LII:Considerations in the Automation of Laboratory Procedures\" class=\"wiki-link\" data-key=\"e0147011cc1eb892e1a35e821657a6d9\"><i>Considerations in the Automation of Laboratory Procedures<\/i><\/a>, the criteria for \"automation readiness\" were described. Two significant points were that a laboratory procedure must both be stable and have a duration of use sufficiently long to justify the cost of automation. Beyond that, additional concerns such as safety may come into play, but those two points were the most critical. From those points we can roughly derive the automation needs of research, testing, quality control, and contract laboratories.\n<\/p><p><b>Research laboratory<\/b>: Automation in research is going to be predominantly at the task level, i.e., portions of a process being done by automated instruments and devices rather than end-to-end automation. 
This is due to the two key points noted above: research labs often change procedures, with some being very short-lived, and the cost of full automation (including validation) may not be worth it unless the equipment is designed to be modular, interconnected, and easily programmed. One exception can be seen with Bayer.<sup id=\"rdp-ebb-cite_ref-CNNMicro15_19-0\" class=\"reference\"><a href=\"#cite_note-CNNMicro15-19\">[17]<\/a><\/sup> They are using robotics and microplate technology to automate a high-throughput screening system that couldn\u2019t\u2014in their words\u2014be done practically by people; it's too slow and too expensive. Could something like this become more commonplace?\n<\/p><p><b>Testing laboratory<\/b>: Non-routine testing laboratories are going to have a mix of task-level and end-to-end automation. It will depend on the stability of the test process, whether there is sufficient work to justify the cost of automation, and what level of control over the form of the incoming samples is available (see Appendix 1, section 1.3 of this guide). Often samples are submitted for multiple tests, which raises an additional concern: is the last step in a procedure destructive to the sample? Samples may be prepared so that they can be used in more than one test procedure. If the test process is destructive, then the initial preparation has to be divided so there is enough material for all testing; this can complicate the automation process.\n<\/p><p><b>Quality control laboratory<\/b>: <a href=\"https:\/\/www.limswiki.org\/index.php\/Quality_control\" title=\"Quality control\" class=\"wiki-link\" data-key=\"1e0e0c2eb3e45aff02f5d61799821f0f\">Quality control<\/a> (QC) labs should be the ideal place for automation to take hold. The procedures are well defined, stable, and entrenched. The benefits in faster throughput and cost reduction should be clear. The biggest place for problems is in sample preparation and getting material ready for analysis. Given that the form of the samples is known and predictable, it is worthwhile to spend time analyzing and designing a means of working with them. Some preparation steps may be too difficult to justify automating; this can be seen with tensile testing of polymers and fibers, where fabricating the sample or parts, and needing to wait for the material to stabilize, may make parts of the process difficult to deal with, or not financially justifiable. Insertion into the test apparatus may also be an issue. Testing that can be done by spectroscopic, chromatographic, or other easily automated instrumentation may be migrated from the laboratory to in-line testing, raising questions about the nature of the QC process.\n<\/p><p><b>Contract or independent laboratory<\/b>: The ability to apply automation beyond the task level depends on the kind of work being done; is it specialized or highly flexible? The greater the flexibility (e.g., a number of different procedures with a small number of samples submitted at a time), the more task-level automation will apply. However, the justification for automation may not be there. 
Specialized labs (e.g., clinical chemistry labs), or a lab that specializes in a technique with high demand and throughput, should benefit from a move to end-to-end automation, since it reduces costs and improves accuracy and turnaround times (TATs).\n<\/p>\n<h3><span class=\"mw-headline\" id=\"More_on_the_quality_control_lab_and_its_process_management\">More on the quality control lab and its process management<\/span><\/h3>\n<p>The evolution of QC labs may change our perspective on how QC testing is accomplished, as well as the nature of QC lab operations. Our current view, and that of the last few decades, may change significantly.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Early_QC_testing\">Early QC testing<\/span><\/h4>\n<p>Some form of quality testing was done during the production process in prior decades. For example, I\u2019ve seen production station operators bite plastic materials to get an idea of how well the production process (the expansion of polystyrene beads) was going in the production of Styrofoam cups, though this was not encouraged by management, as butane and pentane, used as expansion agents, could cause health problems. If the cup was too stiff or soft, it would indicate a problem in the molding process. There were other \u201cquick and dirty\u201d production floor tests that were run because the TAT in the lab was producing an unacceptable delay (if you\u2019re making adjustments, you need measurements quickly). People may also have wanted to avoid having a record of production problems and changes.\n<\/p><p>In another application (anecdotal reporting from a course on plastics manufacturing)\u2014the calendering<sup id=\"rdp-ebb-cite_ref-20\" class=\"reference\"><a href=\"#cite_note-20\">[c]<\/a><\/sup> of plastic-coated paper\u2014the production operator would strike the laminated paper with a slender bamboo rod and listen to the sound it made to determine the product quality. According to the course instructor, he was adept at this, and no other test could replace this in-process procedure.\n<\/p><p>As such, it's clear not all \"quality\" testing went on in the laboratory.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Normal_QC_testing\">Normal QC testing<\/span><\/h4>\n<p>Traditionally, QC lab operations became separate from production, partly because of the physical conditions needed to conduct the work, and partly to avoid bias in the results. The conclusion: it was important that the reporting lines of communication be separate and independent from production. QC labs would, and do, perform testing on incoming materials to certify them suitable for use and on produced products to see if they meet product specifications, as well as perform in-process testing. They would also certify products as qualified for shipment, acting as an unbiased evaluator of product quality.\n<\/p><p>Quality tests were manually implemented until the early 1960s. Then we saw the advent of in-process instrumentation and process chromatographs from Fisher Controls and Mine Safety Appliances, for example. While the instruments themselves were on the production floor, their management and analysis were under the control of the QC lab, at least in the facility I worked in. The instruments' maintenance, calibration, peak measurements, and sample calculations were all handled by lab personnel. Since that time, we\u2019ve seen continued growth in in-line testing for production processes. 
That said, what\u2019s the logical conclusion of increasing automation?\n<\/p>\n<h4><span class=\"mw-headline\" id=\"A_thought_experiment\">A thought experiment<\/span><\/h4>\n<p>Let\u2019s posit that we have a production process whose raw materials are fluids and whose end product is a fluid. We\u2019d be concerned with certifying incoming raw materials as suitable for use, while monitoring the production process for product composition and potential contaminants. The end product would have to be certified for shipment.\n<\/p><p>If all the testing were chromatographic with in-process instruments and a <a href=\"https:\/\/www.limswiki.org\/index.php\/Chromatography_data_system\" title=\"Chromatography data system\" class=\"wiki-link\" data-key=\"a424bb889d8507b7e8912f2faf2570c6\">chromatography data system<\/a> (CDS), and\/or in-line spectrophotometric and other in-line or in-process tests, with the results becoming part of the process control system (sample IDs would be a hash of sensor type, location, and timestamp), what would the role of the QC lab become? Ensuring that the equipment is running properly and regularly calibrated, with periodic verification of test results (off-line testing), is one set of possibilities. Is this a desirable developmental direction? We need to look at the benefits and issues that result from this design.\n<\/p><p><b>Benefits<\/b>:\n<\/p>\n<ul><li>It provides an integrated system where all process sensor and test result data are immediately available to management. This allows them to detect issues faster and provide a more timely response (better process control), reducing downtime and off-spec product. If we want to enter the world of science fiction, we can even imagine a combination AI-<a href=\"https:\/\/www.limswiki.org\/index.php\/Machine_learning\" title=\"Machine learning\" class=\"wiki-link\" data-key=\"79aab39cfa124c958cd1dbcab3dde122\">machine learning<\/a> solution providing closed-loop process monitoring and control.<\/li>\n<li>It offers a potential cost reduction resulting from smaller labs and lower personnel costs.<\/li><\/ul>\n<p><b>Issues<\/b>:\n<\/p>\n<ul><li>There's a loss of independent product evaluation. Basically, you are trusting the system to honestly monitor, report, and reject off-spec incoming and outgoing material. In the movie version, this is where the soundtrack becomes ominous. This loss of independent checking may reduce customer confidence in product quality.<\/li>\n<li>The veracity of statistical process control and correction may suffer.<\/li>\n<li>System validation could be challenging, as the production process has to be validated, each sensor and instrument data system has to be validated, and the integrated system has to be validated, including the statistical process control.<\/li><\/ul>\n<p>Assuming such a system were built, how far are we willing to trust automated systems to function without external oversight? The control room would still be populated with people managing the process and exerting a higher level of scrutiny, but you are still trusting the designers of the system to do very sophisticated work, not only in process design but also in integrating testing as part of the effort. Ideally you\u2019d want the CDS and other instrument data processing equipment in the control room, where it is more easily used and maintained than on the process floor. 
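As an aside, the sample ID scheme mentioned above (a hash of sensor type, location, and timestamp) is simple to implement; this Python sketch, with made-up field values, is one possibility:\n<\/p>\n<pre>\nimport hashlib\n\ndef sample_id(sensor_type, location, timestamp):\n    # Derive a short, reproducible ID from the three fields\n    raw = '|'.join((sensor_type, location, timestamp))\n    return hashlib.sha256(raw.encode('utf-8')).hexdigest()[:12]\n\nprint(sample_id('inline-GC', 'reactor-3-outlet', '2021-11-04T10:32:00Z'))\n<\/pre>\n<p>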
The role of the QC lab would then change to that of an overarching quality manager of the entire system, ensuring that equipment functioned properly, testing was accurate, the process and testing were operating within control limits, and the data logs were correct.\n<\/p><p>Some organizations may be past this point, while for others this may be interesting, bordering on science fiction. The point of this thought experiment is to see what could happen and where your comfort level is with it. How much control do you want to give an automated system, and how much do you want to retain? What are the consequences of not providing sufficient oversight? How much bad product could be made?\n<\/p><p>Also note that this isn\u2019t an all-or-nothing proposition; give it some room to work, see what happens, and if everything is good, give it more. Just build in a big red button that allows you to reboot and revert to manual operations; in other words, don't self-destruct, just remove some critical controls. A lot depends on the nature of the finished product. If the end product is something critical (e.g., a <a href=\"https:\/\/www.limswiki.org\/index.php\/Medical_device\" title=\"Medical device\" class=\"wiki-link\" data-key=\"8e821122daa731f0fa8782fae57831fa\">medical device<\/a> or therapeutic), you\u2019ll want to be cautious about phasing in automated control systems.\n<\/p><p>All that said, two additional points should be made. First, be willing to play with the ideas. Turn it into a science fiction project (\u201csci fi\u201d is just a playground for \"what ifs\"), removed from reality enough that people can look at it from different perspectives and see what might work and what might not. Then let people play with those ideas; you might learn something. What are all the things that could go right, and what could go wrong (and what can you do about it)? You probably won't have to worry about alien robots, but malware interference is certainly a factor, as is the question of whether to air-gap the network. There is always the possibility of someone causing a problem; the question, of course, is how do you detect and correct it. Second, be willing to model the system. There are a number of modeling packages ideal for this purpose. You can model the behavior and see how different control methods react.\n<\/p><p>(While this thought experiment used a process involving fluids only, as they are relatively easy to work with, it\u2019s worth noting that solid materials become more of an issue, complicating the automation process [see Appendix 1, section 1.4 of this guide]. In some cases sample preparation for testing may be too cumbersome for automation. This would shift the automated-manual testing balance more toward the latter in those cases, introducing delays and disrupting the timing of results to process control.)\n<\/p>\n<h2><span id=\"rdp-ebb-The_creation_of_a_&quot;center_for_laboratory_systems_engineering&quot;\"><\/span><span class=\"mw-headline\" id=\"The_creation_of_a_.22center_for_laboratory_systems_engineering.22\">The creation of a \"center for laboratory systems engineering\"<\/span><\/h2>\n<p><b>Key point<\/b>: <i>Throughout this piece, the need for education has been a consistent theme. Developing and using the technologies in lab work, both scientific and informatics-related, will require people who know what they are doing, specifically educated to carry out the work noted above. 
We also need a means of pulling things together so that there is a centralized resource to start a learning process and continue development from there.<\/i>\n<\/p><p>Let\u2019s propose a \"center for laboratory systems engineering.\" This center would first prepare people to be effective planners, designers, implementers, supporters, and users of laboratory informatics and automation systems in scientific applications. Additionally, the center would ideally drive innovation and provide assistance to scientific personnel and IT groups seeking to apply and manage such laboratory technologies.\n<\/p><p>Those goals would be accomplished by:\n<\/p>\n<ul><li>Developing and delivering courses for LSEs, lab personnel, and IT support (These courses would cover technical science topics as well as skills in working with people, conflict resolution, and communications. They would be presented both in-person and online or on-demand to reach a broad audience; an intensive summer course with hands-on experience should also be considered.)<\/li>\n<li>Creating an LSE certification program<\/li>\n<li>Carrying out research on the application of informatics, robotics, and computer-assisted data collection and processing<\/li>\n<li>Documenting the best practices for an LSE<\/li>\n<li>Aggregating and publishing material on the roles and requirements of the LSE<\/li><\/ul>\n<p>Ideally, this center would be part of a university setting so that it could interact with both science and computer science departments, contribute to their programs, and in turn gain from that association.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Appendix_1:_A_very_brief_historical_note\">Appendix 1: A very brief historical note<\/span><\/h2>\n<p>It would be useful to understand how we arrived at our current state with regard to informatics and automation in science. That will make it easier to understand what we need to do to make advancements. There is one key point to take away from this: in the history of lab automation, products weren\u2019t developed according to a grand plan<sup id=\"rdp-ebb-cite_ref-21\" class=\"reference\"><a href=\"#cite_note-21\">[d]<\/a><\/sup> but rather to meet perceived needs and opportunities. Thought processes in this vein have likely included:\n<\/p>\n<ul><li>\u201cHere\u2019s a problem that needs to be solved.\u201d<\/li>\n<li>\u201cIf I can figure out how to do X, then I can accomplish Y.\u201d<\/li>\n<li>\u201cThere\u2019s a business opportunity in building product concept X, which will help people do Y and Z.\u201d<\/li><\/ul>\n<p>Sometimes these ideas were voiced by lab personnel, but most of the time they were the result of someone seeing a problem or opportunity and taking the initiative to address it.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"1.1_Collecting_data_from_instruments\">1.1 Collecting data from instruments<\/span><\/h3>\n<p>In the late 1960s and early 1970s, instrument companies recognized that connecting a computer to the analog output of an instrument would help lab personnel capture and process data. The form that this product development took depended on how the developer saw the problem. We\u2019re going to look at chromatography<sup id=\"rdp-ebb-cite_ref-22\" class=\"reference\"><a href=\"#cite_note-22\">[e]<\/a><\/sup> as an example for several reasons: it received the most attention for automation, it\u2019s a data-rich technique that took considerable manual effort to analyze, and it was and still is one of the most popular instrumental techniques in chemical labs.
The product solutions provided by vendors reflected the available technology and their view of the problem that needed to be solved.\n<\/p><p>The analysis (Figure A1) depends on recognizing and quantifying a series of peaks, each of which represents the amount of a component in a mixture of materials. The size of the peak (measured, preferably, by area, or by height with respect to a baseline) helps quantify the amount, and the time it takes the peak to appear can be used to help identify the component.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:FigA1_Liscouski_DirectLabSysOnePerPersp21.png\" class=\"image wiki-link\" data-key=\"9398087969f28a1dd4d6f0e6b8182414\"><img alt=\"FigA1 Liscouski DirectLabSysOnePerPersp21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/0\/05\/FigA1_Liscouski_DirectLabSysOnePerPersp21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure A1.<\/b> Illustration of peaks from a chromatograph. Source: public domain<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Reference standards are prepared and run along with the samples. The peak response of the standards and their corresponding concentrations are used to draw a calibration curve, and the samples are quantified by comparing peak sizes against that curve (Figure A2).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:FigA2_Liscouski_DirectLabSysOnePerPersp21.png\" class=\"image wiki-link\" data-key=\"6b9878402119e7fc21a56039292a6ba8\"><img alt=\"FigA2 Liscouski DirectLabSysOnePerPersp21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/c\/c8\/FigA2_Liscouski_DirectLabSysOnePerPersp21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure A2.<\/b> Samples are quantified by comparison to a calibration curve<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The solutions reflected both the users\u2019 input and the vendors\u2019 observations, along with the limitations of being too close to the problem and not seeing the whole picture. In that same timeframe, computing was expensive, and you had to have a lot of processing done to justify the costs. Otherwise, you dropped down to microprocessors and scaled back the size of the problem you could tackle. The microprocessor of choice was the Intel 4004, which was superseded by the Intel 8008 in 1972.\n<\/p><p>With computing, the chromatographer could detect peaks, quantify the peak height and area, and print the results on a strip of paper. This was a big help to the chromatographer since determining peak size, and area in particular, was a major hassle.
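<\/p><p>To make that concrete, below is a minimal sketch, in Python, of the two calculations those early data systems automated: a baseline-corrected peak area and quantitation against a linear calibration curve. The data values and the simple straight-line baseline are illustrative assumptions.\n<\/p>\n<pre>\n# Sketch of the core calculations early chromatography data systems automated.\n# The standards and the sample value below are made up for illustration.\n\ndef peak_area(times, signal, i0, i1):\n    # Trapezoidal area between samples i0 and i1, minus a straight baseline\n    # interpolated between the peak start and end points.\n    area = 0.0\n    for i in range(i0, i1):\n        f_a = (i - i0) \/ (i1 - i0)\n        f_b = (i + 1 - i0) \/ (i1 - i0)\n        base_a = signal[i0] + (signal[i1] - signal[i0]) * f_a\n        base_b = signal[i0] + (signal[i1] - signal[i0]) * f_b\n        dt = times[i + 1] - times[i]\n        area += 0.5 * ((signal[i] - base_a) + (signal[i + 1] - base_b)) * dt\n    return area\n\ndef calibrate(areas, concentrations):\n    # Least-squares fit: concentration = slope * area + intercept.\n    n = len(areas)\n    mean_x = sum(areas) \/ n\n    mean_y = sum(concentrations) \/ n\n    slope = (sum((x - mean_x) * (y - mean_y)\n                 for x, y in zip(areas, concentrations))\n             \/ sum((x - mean_x) ** 2 for x in areas))\n    return slope, mean_y - slope * mean_x\n\n# Three reference standards: measured peak areas vs. known concentrations.\nslope, intercept = calibrate([120.0, 240.0, 365.0], [1.0, 2.0, 3.0])\n# Quantify an unknown sample from its measured peak area.\nprint(intercept + slope * 180.0)\n<\/pre>\n<p>Real systems add peak detection, overlapping-peak handling, and curved baselines, but this is the bookkeeping that once had to be done by hand.\n<\/p><p>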
Prior to computerized methods, chromatographers were using:\n<\/p>\n<ul><li>mechanical integrators built into the strip chart recorder (which recorded the chromatograph output), which were hard to read and didn\u2019t provide for baseline corrections (critical for accurate results);<\/li>\n<li>a planimeter, which was time-consuming and demanded careful attention;<\/li>\n<li>a cut-and-weigh method, where the chromatographer literally cut out a copy of the peak and weighed it on a balance, then cut out a reference area and compared the two against a calibration chart constructed by a similar procedure; and<\/li>\n<li>a method that had the chromatographer counting the boxes the peak enclosed on the chart paper, which was not very accurate, though better than peak height in some cases.<\/li><\/ul>\n<p>Having the quantified peaks printed on a piece of paper meant that the analyst could move quickly to drawing the calibration curve and evaluating the results. However, peak measurement was only a piece of the laborious, time-consuming overall problem (a major one, but only a part). Some users connected the output of the integrators (RS-232 serial ASCII) to minicomputers, transferred the integrator information, and completed the analytical process in the larger machines. Several integrators could be connected to a minicomputer, reducing the cost per instrument. This was a step toward a better solution, but only a step.\n<\/p><p>Computer vendors wanted to participate in the same market, but the cost of minicomputers put them at a disadvantage unless they could come at it from a different perspective. They looked at the entire data processing problem, including the points mentioned above, plus getting the computer to compute the calibration curve, complete the analysis, print out a report, and store the report and the data for later retrieval (integrators didn\u2019t have that capability). They could also store the raw digitized instrument output for later analysis and display. The cost per instrument dropped when computer vendors began connecting multiple instruments to one system, some from different labs. Nelson Analytical used a local-to-the-instrument box for data collection and control and then forwarded those information packets to a central system for processing. This bigger-picture view of the problem greatly reduced the workload on the analyst. As computing costs dropped and the power of microchips increased, several different approaches emerged from vendors with varied perspectives on computing. Most took the big-picture view but worked on a one-instrument, one-computer-station approach, which benefited small labs since they didn\u2019t have to invest in a minicomputer.\n<\/p><p>The low cost of microprocessors more readily allowed digital systems to join the lab, to the point where almost every lab device had a computer chip in it (\u201cdigital\u201d was a strong marketing point).
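<\/p><p>Getting data out of those devices often came down to reading ASCII records over an RS-232 serial line, as with the integrator-to-minicomputer transfers described earlier. Below is a minimal sketch using the pyserial package; the port name, baud rate, and one-record-per-line format are assumptions for illustration.\n<\/p>\n<pre>\n# Sketch: collecting ASCII result records from an integrator or instrument\n# over RS-232. Requires the pyserial package; the port name, baud rate, and\n# record format are illustrative assumptions.\nimport serial\n\ndef read_records(port='\/dev\/ttyS0', baud=9600):\n    records = []\n    with serial.Serial(port, baudrate=baud, timeout=5) as link:\n        while True:\n            raw = link.readline()\n            if not raw:  # timeout with no data; assume the run is finished\n                break\n            # e.g., one peak per line: retention time, height, area\n            records.append(raw.decode('ascii', errors='replace').strip())\n    return records\n<\/pre>\n<p>Everything after that point (parsing, calibration, reporting, storage) was where the computer vendors differentiated themselves.\n<\/p><p>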
Now that we had lots of digital data sources, what was the lab going to do with them?\n<\/p>\n<h3><span class=\"mw-headline\" id=\"1.2_The_beginning_of_laboratory_informatics_systems\">1.2 The beginning of laboratory informatics systems<\/span><\/h3>\n<p>In the early 1970s, <a href=\"https:\/\/www.limswiki.org\/index.php\/PerkinElmer_Inc.\" title=\"PerkinElmer Inc.\" class=\"wiki-link\" data-key=\"dabda40785b60866d056709e611512f8\">PerkinElmer<\/a> described its instrument-supporting computers as \u201cdata stations.\u201d Then it announced the \u201claboratory information management system,\u201d or \u201cLIMS,\u201d and the next level of informatics hit the lab market. However, \u201claboratory information management system\u201d was a problematic name for a product that did sample and test tracking. Customers thought it would take into account all of a lab\u2019s information, including personnel records, scheduling, budgets, documents, anything that \u201cinformation\u201d could be attached to; ultimately, the name promised more than the product could deliver. (It took some time, but eventually something like that happened.) From a marketing standpoint, it got people\u2019s attention.\n<\/p><p>Several other vendors, consulting companies, startups, and computer vendors began LIMS development projects (computer vendors felt that database systems were their turf). This was viewed as a strategic offering: the testing lab\u2019s operations would revolve around the LIMS, and that gave the LIMS vendor whose product was chosen a strong marketing position in that company.\n<\/p><p>The introduction of the LIMS eventually opened the door to other informatics applications. The major classes and functions of said applications are shown in Figure A3.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:FigA3_Liscouski_DirectLabSysOnePerPersp21.png\" class=\"image wiki-link\" data-key=\"45606351182bb832ffb420441f14f9a9\"><img alt=\"FigA3 Liscouski DirectLabSysOnePerPersp21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/1c\/FigA3_Liscouski_DirectLabSysOnePerPersp21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure A3.<\/b> The starting point for major classes of lab informatics systems and the lab functions they addressed<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Missing from the chart are user-developer tools like LabVIEW from National Instruments that enabled users to develop data acquisition and control applications via a graphical user interface.\n<\/p><p>The term \u201celectronic laboratory notebook\u201d or \u201cELN\u201d has had an interesting history. It was applied to at least three types of software before its most current iteration. Laboratory Technologies, Inc. first created LABTECH Notebook, a PC-based software package designed to assist users with the communication between a computer and its lab instruments via RS-232 connections. Then there was Wolfram\u2019s Mathematica software, an electronic notebook for advanced mathematics.
And finally there was Velquest\u2019s SmartLAB, an ELN for conducting analyses and recording information from laboratory procedures, which became the first <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_execution_system\" title=\"Laboratory execution system\" class=\"wiki-link\" data-key=\"774bdcab852f4d09565f0486bfafc26a\">laboratory execution system<\/a> (LES).\n<\/p><p>Figure A3 showed a nice, clean, somewhat idealized starting point for the introduction of lab informatics. That didn\u2019t last long. Vendors saw what their competition was doing and the opportunities to expand their products\u2019 capabilities (and market acceptance). What were clean, neat functionality silos became more complex products to attract the attention of scientists and laboratorians (Figure A4).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:FigA4_Liscouski_DirectLabSysOnePerPersp21.png\" class=\"image wiki-link\" data-key=\"409e9dd3124c8da588d921b0eada249b\"><img alt=\"FigA4 Liscouski DirectLabSysOnePerPersp21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/b\/b3\/FigA4_Liscouski_DirectLabSysOnePerPersp21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure A4.<\/b> Lab informatics capability expansion<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>These expanded capabilities meant that a single vendor solution could address more of a lab\u2019s needs, simplifying support, installation, and so on through the use of a single software package. It did, however, make product and vendor selection more of a challenge; you really had to know what you needed. It also raised questions of cross-product compatibility: what instruments connected easily to what LIMS, ELN, or SDMS? If it wasn\u2019t easy, what did you do? Third-party intermediate systems that translated instrument communications were eventually developed, much as similar intermediaries had been developed for moving data between database systems.\n<\/p><p>While all this was going on in the expanding digital laboratory space, people still had things to do on the lab bench.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"1.3_Electro-mechanical_robotics\">1.3 Electro-mechanical robotics<\/span><\/h3>\n<p>During the 1970s and 80s, vendors noted the amount of time spent doing repetitive tasks at the lab bench. This resulted in two different approaches to product development:\n<\/p>\n<ul><li>Dedicated-function robotics: The development of auto-samplers, auto-titrators, and similar devices that were used to carry out a specific task<\/li>\n<li>General-purpose robotics: The development of an elaborate kit the user had to assemble to robotically complete a task; not too different from a computer programming language, where the components are there and you just have to organize them to make something useful happen<\/li><\/ul>\n<p>Although each of these approaches was potentially useful, each presented the user community with a different set of problems.\n<\/p><p>Dedicated-function robotics devices generally worked well; that wasn\u2019t the issue. The problem arose when they had to connect to other components and instruments.
The use of an auto-sampler, for example, required the user to adjust their sample preparation so that the material to be analyzed wound up in the appropriate sample vials. This may have required adjusting the procedure to accommodate a different set of mechanics than they were used to, e.g., sealing the vials or choosing the proper vial for an application. Auto-samplers feed samples into instruments, so there is also the matter of setting up control signals for proper coordination with instruments and instrument data systems.\n<\/p><p>Other devices such as auto-titrators required a reservoir of samples (in the proper vial formats), and the vials had to be organized so that an analysis recorded for sample ID456 was in fact performed on that sample and the order didn\u2019t get mixed up. The data output could also be a problem. The default for many vendors was a spreadsheet-compatible file, though it was up to the user to make sense of it and integrate it into the workflow. This was probably the best compromise available until more formalized data interchange and communications standards became available. The vendor had no idea how a particular device was going to fit into a procedure\u2019s workflow or what was going to happen to the data further downstream, so a CSV file format as a solution was simple, flexible, and easy to work with. It also meant more work on the user\u2019s part on the integration front, representing another place where changes might have to be made if a device were replaced in the future.\n<\/p><p>With the dedicated-function device, which could be as simple as a balance (reading a force exerted on a sensor and converting it to a weight is an automated process), we have potentially isolated elements of automation that are interconnected either by programming or by someone reading data and re-entering it. However, there is no \u201cplug-and-go\u201d capability.\n<\/p><p>As for general-purpose robotics devices, they could be broken down into two categories: those that were successful in the broad marketplace and those that weren\u2019t. In the 1980s, three vendors were competing for attention in the laboratory robotics market: Zymark, Hewlett-Packard, and PerkinElmer. Each of their general-purpose systems had a moveable arm that could grip items and be programmed to carry out lab procedures. Yet they faced a daunting task: the process of gripping items was problematic. The robot didn\u2019t work with just one gripper or \u201chand\u201d; it required multiple interchangeable hands that had to be swapped depending on the next sequence of actions and what items had to be grasped. Those items were typically things designed for human use, including flasks, test tubes, and a variety of other equipment. This problem was non-trivial, a consequence of having the robot work with equipment designed for humans so that the lab didn\u2019t have to buy duplicate components, which would have raised costs, and so that work could carry on if the robot didn\u2019t function. Another issue with the grippers and their holders was that they took up space. The Zymark system, for example, had a central robotic arm that could reach items within the shell of a hemispherical volume; grippers took up space that could be used by other devices. Some people were successful in building systems, but not enough to make the products economically viable.
In many respects, the core robotics technologies should have been successful, but the peripheral devices were not up to the necessary operational levels: good for humans, lacking for automation.\n<\/p><p>Another problem was education. The vendors would run courses to train people to use their systems. However, successful implementations required engineering, expertise, and experience, not just experimentation. Further, robust systems (those capable of running for days on end with built-in error detection and correction) were rare but necessary to avoid processing samples into junk. People had the need and the desire to make working systems, but not the process engineering skills to create successful implementations.\n<\/p><p>The life sciences market, and biotechnology in particular, took a different approach to robotics: it standardized the sample carrier format in the form of the microplate. The physical dimensions of the microplate were constant while the number of sample wells could vary. The image on the right of Figure A5, for example, shows plates with 96, 384, and 1,536 wells, with even higher densities available.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:FigA5_Liscouski_DirectLabSysOnePerPersp21.png\" class=\"image wiki-link\" data-key=\"46e8551c1035853039fb9360aeae72d7\"><img alt=\"FigA5 Liscouski DirectLabSysOnePerPersp21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/f\/f1\/FigA5_Liscouski_DirectLabSysOnePerPersp21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure A5.<\/b> A microplate sample holder (<b>left<\/b>); examples of 96, 384, and 1,536 well holders, care of <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/commons.wikimedia.org\/wiki\/File:Microplates.jpg\" target=\"_blank\">Wikimedia Commons<\/a> (<b>right<\/b>)<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>What did this mean for robotics? Every device that interacted with a microplate could be reliably designed with a single gripper size and a single aperture size for plates to be inserted. Systems would \u201cknow\u201d where in the sample space the sample wells were. In short, standardization made it easier to build equipment for a marketplace. And a lot of equipment was built and put into successful use.<sup id=\"rdp-ebb-cite_ref-23\" class=\"reference\"><a href=\"#cite_note-23\">[f]<\/a><\/sup>\n<\/p><p>The combination of robotics and microplates also worked well in biotechnology because of the prevalence of liquid samples; this point cannot be stressed enough. We are good at working with fluids, including measuring amounts, transferring them, dispensing them, and so on. Solids can be a problem because of cross-contamination if equipment isn\u2019t cleaned, if the solid is tacky, if the particle size isn\u2019t right for the equipment, if it has a tendency to roll, and so on. Solids also have the potential for problems with homogeneity. (You can get layering in liquids, but that can be solved in \u201ctwo shakes.\u201d)\n<\/p>
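<p>Before moving on, one reason the standardization paid off is worth making concrete: with a fixed plate geometry, software can compute every well position directly. The following is a minimal sketch in Python; the footprint numbers (9 mm well pitch, A1 offsets) are typical published values for a 96-well plate, used here as assumptions.\n<\/p>\n<pre>\n# Sketch: mapping well names to coordinates on a standardized microplate.\n# The pitch and A1 offsets below are typical published 96-well values,\n# assumed here for illustration.\nimport string\n\ndef well_positions(rows, cols, pitch_mm, a1_x_mm=14.38, a1_y_mm=11.24):\n    # Map well names (A1, A2, ...) to x,y coordinates in millimeters.\n    positions = {}\n    for r in range(rows):\n        for c in range(cols):\n            name = string.ascii_uppercase[r] + str(c + 1)\n            positions[name] = (a1_x_mm + c * pitch_mm, a1_y_mm + r * pitch_mm)\n    return positions\n\n# A 96-well plate is 8 rows x 12 columns on a 9 mm pitch; a 384-well plate\n# halves the pitch to 4.5 mm on the same footprint.\nplate96 = well_positions(8, 12, 9.0)\nprint(plate96['H12'])  # a robot can aim at any well by name\n<\/pre>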
<h4><span class=\"mw-headline\" id=\"1.3.1_Two_approaches_to_sample_processing_with_robotics\">1.3.1 Two approaches to sample processing with robotics<\/span><\/h4>\n<p>Broadly speaking, there are two approaches to sample processing with robotics: batching and sample-at-a-time runs. The microplate is an example of batch processing. Each tray is handled (dilution, sealing, reading, etc.) at a station, and you don\u2019t have an opportunity to evaluate the results until the analysis for a tray is completed and reported. Other methods that use auto-injectors, such as ICP, mass spectrometry, and chromatography, can behave in a similar fashion or be evaluated on a sample-at-a-time basis. It depends on how the process is planned and implemented, and at what stage the results are evaluated.\n<\/p><p>The sample-at-a-time procedure offers an interesting alternative pathway for automated analysis. This process can include auto-titrators as well as the techniques noted above, among others. Here each sample is processed in stages, either one at a time or in overlapping stages. While one sample is being processed in stage 3 (adding a solvent, for example), another sample in stage 4 is being injected into an instrument. This means that the results for one sample can be known before the next one is processed. The benefit is that systematic errors can be caught and corrected before all samples are processed. This would reduce waste and improve overall efficiency.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"1.4_Sample_storage_organization\">1.4 Sample storage organization<\/span><\/h3>\n<p>Before we leave the topic of samples, we need to address the subject of sample storage organization. This is important, as poor organization and management can nullify any gains from automation. There are two aspects to consider: the organization process itself and the physical nature of the samples.\n<\/p><p>In the life sciences, <a href=\"https:\/\/www.limswiki.org\/index.php\/Biobank\" title=\"Biobank\" class=\"wiki-link\" data-key=\"4e5f94a2b2036266701220c1fd724bd2\">biobanking<\/a> and refrigerated storage management have been actively discussed topics. For example, a white paper commissioned by Titian Software Ltd., titled <i>The Essential Guide to Managing Laboratory Samples<\/i>, goes into a fair amount of depth on the subject.<sup id=\"rdp-ebb-cite_ref-OxerTheEss17_24-0\" class=\"reference\"><a href=\"#cite_note-OxerTheEss17-24\">[18]<\/a><\/sup> And if you Google \u201claboratory sample storage management,\u201d you\u2019ll get a sizeable listing of material, including peer-reviewed work by Redrup <i>et al.<\/i> titled \u201cSample Management: Recommendation for Best Practices and Harmonization from the Global Bioanalysis Consortium Harmonization Team.\u201d<sup id=\"rdp-ebb-cite_ref-:0_25-0\" class=\"reference\"><a href=\"#cite_note-:0-25\">[19]<\/a><\/sup> The abstract of the work of Redrup <i>et al.<\/i> reads in part<sup id=\"rdp-ebb-cite_ref-:0_25-1\" class=\"reference\"><a href=\"#cite_note-:0-25\">[19]<\/a><\/sup>:\n<\/p>\n<blockquote><p>The importance of appropriate sample management in regulated bioanalysis is undeniable for clinical and non-clinical study support due to the fact that if the samples are compromised at any stage prior to analysis, the study results may be affected.
Health authority regulations do not contain specific guidance on sample management; therefore, as part of the Global Bioanalysis Consortium (GBC), the A5 team was established to discuss sample management requirements and to put forward recommendations.<\/p><\/blockquote>\n<p>In short, you have to have control of your samples and be able to ensure their integrity.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"1.4.1_The_nature_of_incoming_samples\">1.4.1 The nature of incoming samples<\/span><\/h4>\n<p>Considerations about the nature of incoming samples don\u2019t get as much press as they deserve. For example, if you are in quality control, the nature of your incoming samples is going to be consistent and determined by the production process. In fact, sample preparation is likely to be integrated with the test procedure\u2019s automation. If the samples are fluids, the impact on an automation process may be small compared to working with solids. One complicating factor with fluids is the need to remove extraneous material so that downstream problems aren\u2019t created. That removal may be accomplished by filtering, settling, centrifugal separation, batch centrifugation, or other means, depending on the amount of material and its composition.\n<\/p><p>Working with solids raises several other issues. First, does the material have to be reduced to a coarse or fine powder before processing? This may be needed to permit precise weighing of amounts or to provide a large surface area for solvent extractions. Second, is fabrication of a sample piece required? Some mechanical properties testing of plastics requires molding test bars. Other tests may require pressing films for spectral analysis. Other issues exist as well; some are industry-specific, for example working with hazardous materials (including toxic substances and radioactive samples) and those requiring careful environmental and\/or security controls.\n<\/p><p>In many labs, automation is viewed as something that happens after the samples are logged in. Yet that doesn\u2019t have to be the case. The following paragraphs focus on testing labs because they are the most likely to benefit from automation. They meet the two key criteria for automation implementation: stable procedures and sufficient workload to justify the cost of automation development. That can also happen in research labs, though; in the end, it\u2019s simply a matter of the nature of the work.\n<\/p><p>Testing labs (e.g., quality control and non-routine internal and contract labs) can take steps to streamline their sample handling operations, though unfortunately at the expense of someone else\u2019s labor (just make it worth their effort). For example, those submitting samples can be required to pre-log their samples. This can be accomplished by giving them access to restricted accounts on a LIMS that lets them log samples in, and little more. Sample containers can also be standardized with barcodes. The barcodes can then be required as part of the logging process and are critical to identifying samples that have reached the lab, as well as tracking their physical location. Additionally, sample container sizes and related container forms can also be standardized. These should match the requirements for sample handling in automated systems, if possible. (Unless you supply them, your submitters may not have the tools for sealing sample vials, etc.)
Finally, all this cooperative effort to standardize sample reception can be incentivized with price breaks, and it is likely to lead to faster turnaround times (TAT). In other words, give submitters an incentive to work with you, the lab. They are supplying labor that could potentially impact their own productivity, so give them a good reason to cooperate.\n<\/p><p>Testing operations can, as a result, see further benefits:\n<\/p>\n<ul><li>Sample storage management and access are improved.<\/li>\n<li>You\u2019ll be better informed of incoming work.<\/li>\n<li>Automation and scheduling are enhanced when they begin with the requester instead of post-login.<\/li><\/ul>\n<p>My first professional lab experience was in polymers in an analytical research group (some routine work, a lot of non-routine work). Samples would arrive in a variety of containers (e.g., bags, jars, test tubes, envelopes, fabricated parts taped to sample request forms). Sample matrices would range from pellets, waxes, and powders to liquids, gas cylinders, rolls of film, and more. Classifying those samples, figuring out where to put them, locating them, and preparing them for work (which often involved grinding them in a Wiley mill) was a major time sink, sufficiently so that the actual analysis was a smaller part of the overall workflow. As such, anything you can do to streamline that process will help productivity and contribute to a successful automation project.\n<\/p>\n<h2><span id=\"rdp-ebb-Abbreviations,_acronyms,_and_initialisms\"><\/span><span class=\"mw-headline\" id=\"Abbreviations.2C_acronyms.2C_and_initialisms\">Abbreviations, acronyms, and initialisms<\/span><\/h2>\n<p><b>CDS<\/b>: Chromatography data system\n<\/p><p><b>ELN<\/b>: Electronic laboratory notebook\n<\/p><p><b>EVOP<\/b>: Evolutionary operations\n<\/p><p><b>ICP<\/b>: Inductively coupled plasma spectrometry\n<\/p><p><b>K\/I\/D<\/b>: Knowledge, information, and data\n<\/p><p><b>LIMS<\/b>: Laboratory information management system\n<\/p><p><b>LOF<\/b>: Laboratory of the future\n<\/p><p><b>LSE<\/b>: Laboratory systems engineer\n<\/p><p><b>NMR<\/b>: Nuclear magnetic resonance\n<\/p><p><b>QC<\/b>: Quality control\n<\/p><p><b>ROI<\/b>: Return on investment\n<\/p><p><b>SDMS<\/b>: Scientific data management system\n<\/p><p><b>Std.SP<\/b>: Standard sample program\n<\/p><p><b>TAT<\/b>: Turn-around time\n<\/p><p><b>TLA<\/b>: Total laboratory automation\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Footnotes\">Footnotes<\/span><\/h2>\n<div class=\"reflist\" style=\"list-style-type: lower-alpha;\">\n<div class=\"mw-references-wrap\"><ol class=\"references\">\n<li id=\"cite_note-17\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-17\">\u2191<\/a><\/span> <span class=\"reference-text\">Where the amount of titrant added is adjusted based on the response to the previous addition. This should yield faster titrations with increased accuracy.<\/span>\n<\/li>\n<li id=\"cite_note-18\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-18\">\u2191<\/a><\/span> <span class=\"reference-text\">In prior writings, the term \u201cscientific manufacturing\u201d was used. The problem with that term is that we\u2019re not making products but instead producing results.
Plus \u201cmanufacturing results\u201d has some negative connotations.<\/span>\n<\/li>\n<li id=\"cite_note-20\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-20\">\u2191<\/a><\/span> <span class=\"reference-text\">Wikipedia <a href=\"https:\/\/en.wikipedia.org\/wiki\/Calender\" class=\"extiw wiki-link\" title=\"wikipedia:Calender\" data-key=\"d8d8d99ee47563a86625118bdd0b05a2\">says this<\/a> of calendering: \"A calender is a series of hard pressure rollers used to finish or smooth a sheet of material such as paper, textiles, or plastics. Calender rolls are also used to form some types of plastic films and to apply coatings.\"<\/span>\n<\/li>\n<li id=\"cite_note-21\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-21\">\u2191<\/a><\/span> <span class=\"reference-text\">See the discussion on clinical chemistry in the main text.<\/span>\n<\/li>\n<li id=\"cite_note-22\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-22\">\u2191<\/a><\/span> <span class=\"reference-text\">If you\u2019re not familiar with the method, Wikipedia's <a href=\"https:\/\/www.limswiki.org\/index.php\/Chromatography\" title=\"Chromatography\" class=\"wiki-link\" data-key=\"2615535d1f14c6cffdfad7285999ad9d\">chromatography<\/a> page is a good starting point.<\/span>\n<\/li>\n<li id=\"cite_note-23\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-23\">\u2191<\/a><\/span> <span class=\"reference-text\">See Wikipedia's <a href=\"https:\/\/en.wikipedia.org\/wiki\/Microplate\" class=\"extiw wiki-link\" title=\"wikipedia:Microplate\" data-key=\"118619c5fab5efc970417c99b85e858d\">microplate<\/a> article for more.<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<h2><span class=\"mw-headline\" id=\"About_the_author\">About the author<\/span><\/h2>\n<p>Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation\/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"References\">References<\/span><\/h2>\n<div class=\"reflist references-column-width\" style=\"-moz-column-width: 30em; -webkit-column-width: 30em; column-width: 30em; list-style-type: decimal;\">\n<div class=\"mw-references-wrap mw-references-columns\"><ol class=\"references\">\n<li id=\"cite_note-1\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-1\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation book\">Liscouski, J. (2015). <i>Computerized Systems in the Modern Laboratory: A Practical Guide<\/i>. PDA\/DHI. pp. 432. 
<a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/ASIN\" data-key=\"5cc6746513dcbbbee6f1bf9284ead699\">ASIN<\/a> <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.amazon.com\/dp\/B010EWO06S\" target=\"_blank\">B010EWO06S<\/a>. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/International_Standard_Book_Number\" data-key=\"f64947ba21e884434bd70e8d9e60bae6\">ISBN<\/a> 978-1933722863.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=book&rft.btitle=Computerized+Systems+in+the+Modern+Laboratory%3A+A+Practical+Guide&rft.aulast=Liscouski%2C+J.&rft.au=Liscouski%2C+J.&rft.date=2015&rft.pages=pp.%26nbsp%3B432&rft.pub=PDA%2FDHI&rft_id=info:asin\/B010EWO06S&rft.isbn=978-1933722863&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-WilkinsonTheFAIR16-2\"><span class=\"mw-cite-backlink\">\u2191 <sup><a href=\"#cite_ref-WilkinsonTheFAIR16_2-0\">2.0<\/a><\/sup> <sup><a href=\"#cite_ref-WilkinsonTheFAIR16_2-1\">2.1<\/a><\/sup><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Wilkinson, M.D.; Dumontier, M.; Aalbersberg, I.J. et al. (2016). <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.pubmedcentral.nih.gov\/articlerender.fcgi?tool=pmcentrez&artid=PMC4792175\" target=\"_blank\">\"The FAIR Guiding Principles for scientific data management and stewardship\"<\/a>. <i>Scientific Data<\/i> <b>3<\/b>: 160018. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1038%2Fsdata.2016.18\" target=\"_blank\">10.1038\/sdata.2016.18<\/a>. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/PubMed_Central\" data-key=\"c85bdffd69dd30e02024b9cc3d7679e2\">PMC<\/a> <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.ncbi.nlm.nih.gov\/pmc\/articles\/PMC4792175\/\" target=\"_blank\">PMC4792175<\/a>. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/PubMed_Identifier\" data-key=\"1d34e999f13d8801964a6b3e9d7b4e30\">PMID<\/a> <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.ncbi.nlm.nih.gov\/pubmed\/26978244\" target=\"_blank\">26978244<\/a><span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"http:\/\/www.pubmedcentral.nih.gov\/articlerender.fcgi?tool=pmcentrez&artid=PMC4792175\" target=\"_blank\">http:\/\/www.pubmedcentral.nih.gov\/articlerender.fcgi?tool=pmcentrez&artid=PMC4792175<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=The+FAIR+Guiding+Principles+for+scientific+data+management+and+stewardship&rft.jtitle=Scientific+Data&rft.aulast=Wilkinson%2C+M.D.%3B+Dumontier%2C+M.%3B+Aalbersberg%2C+I.J.+et+al.&rft.au=Wilkinson%2C+M.D.%3B+Dumontier%2C+M.%3B+Aalbersberg%2C+I.J.+et+al.&rft.date=2016&rft.volume=3&rft.pages=160018&rft_id=info:doi\/10.1038%2Fsdata.2016.18&rft_id=info:pmc\/PMC4792175&rft_id=info:pmid\/26978244&rft_id=http%3A%2F%2Fwww.pubmedcentral.nih.gov%2Farticlerender.fcgi%3Ftool%3Dpmcentrez%26artid%3DPMC4792175&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-FishOvercoming05-3\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-FishOvercoming05_3-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Fish, M.; Minicuci, D. (2005). <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/apps.thermoscientific.com\/media\/SID\/Informatics\/PDF\/Article-Overcoming-the-Challanges.pdf\" target=\"_blank\">\"Overcoming the Challenges of a LIMS Migration\"<\/a> (PDF). <i>Research & Development<\/i> <b>47<\/b> (2)<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"http:\/\/apps.thermoscientific.com\/media\/SID\/Informatics\/PDF\/Article-Overcoming-the-Challanges.pdf\" target=\"_blank\">http:\/\/apps.thermoscientific.com\/media\/SID\/Informatics\/PDF\/Article-Overcoming-the-Challanges.pdf<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Overcoming+the+Challenges+of+a+LIMS+Migration&rft.jtitle=Research+%26+Development&rft.aulast=Fish%2C+M.%3B+Minicuci%2C+D.&rft.au=Fish%2C+M.%3B+Minicuci%2C+D.&rft.date=2005&rft.volume=47&rft.issue=2&rft_id=http%3A%2F%2Fapps.thermoscientific.com%2Fmedia%2FSID%2FInformatics%2FPDF%2FArticle-Overcoming-the-Challanges.pdf&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-FishOvercoming13-4\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-FishOvercoming13_4-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Fish, M.; Minicuci, D. (1 April 2013). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.scientistlive.com\/content\/overcoming-daunting-business-challenges-lims-migration\" target=\"_blank\">\"Overcoming daunting business challenges of a LIMS migration\"<\/a>. <i>Scientist Live<\/i><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.scientistlive.com\/content\/overcoming-daunting-business-challenges-lims-migration\" target=\"_blank\">https:\/\/www.scientistlive.com\/content\/overcoming-daunting-business-challenges-lims-migration<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Overcoming+daunting+business+challenges+of+a+LIMS+migration&rft.atitle=Scientist+Live&rft.aulast=Fish%2C+M.%3B+Minicuci%2C+D.&rft.au=Fish%2C+M.%3B+Minicuci%2C+D.&rft.date=1+April+2013&rft_id=https%3A%2F%2Fwww.scientistlive.com%2Fcontent%2Fovercoming-daunting-business-challenges-lims-migration&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-FreeLIMSWhat18-5\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-FreeLIMSWhat18_5-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/freelims.org\/blog\/legacy-data-migration-to-lims.html\" target=\"_blank\">\"Overcoming the Challenges of Legacy Data Migration\"<\/a>. <i>FreeLIMS.org<\/i>. CloudLIMS.com, LLC. 29 June 2018<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/freelims.org\/blog\/legacy-data-migration-to-lims.html\" target=\"_blank\">https:\/\/freelims.org\/blog\/legacy-data-migration-to-lims.html<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Overcoming+the+Challenges+of+Legacy+Data+Migration&rft.atitle=FreeLIMS.org&rft.date=29+June+2018&rft.pub=CloudLIMS.com%2C+LLC&rft_id=https%3A%2F%2Ffreelims.org%2Fblog%2Flegacy-data-migration-to-lims.html&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-CLSIAUTO03-6\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-CLSIAUTO03_6-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/clsi.org\/standards\/products\/automation-and-informatics\/documents\/auto03\/\" target=\"_blank\">\"AUTO03 Laboratory Automation: Communications With Automated Clinical Laboratory Systems, Instruments, Devices, and Information Systems, 2nd Edition\"<\/a>. Clinical and Laboratory Standards Institute. 30 September 2009<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/clsi.org\/standards\/products\/automation-and-informatics\/documents\/auto03\/\" target=\"_blank\">https:\/\/clsi.org\/standards\/products\/automation-and-informatics\/documents\/auto03\/<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=AUTO03+Laboratory+Automation%3A+Communications+With+Automated+Clinical+Laboratory+Systems%2C+Instruments%2C+Devices%2C+and+Information+Systems%2C+2nd+Edition&rft.atitle=&rft.date=30+September+2009&rft.pub=Clinical+and+Laboratory+Standards+Institute&rft_id=https%3A%2F%2Fclsi.org%2Fstandards%2Fproducts%2Fautomation-and-informatics%2Fdocuments%2Fauto03%2F&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-SarkoziTheEff03-7\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-SarkoziTheEff03_7-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Sarkozi, L.; Simson, E.; Ramanathan, L. (2003). \"The effects of total laboratory automation on the management of a clinical chemistry laboratory. Retrospective analysis of 36 years\". <i>Clinica Chimica Acta<\/i> <b>329<\/b> (1\u20132): 89\u201394. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1016%2FS0009-8981%2803%2900020-2\" target=\"_blank\">10.1016\/S0009-8981(03)00020-2<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=The+effects+of+total+laboratory+automation+on+the+management+of+a+clinical+chemistry+laboratory.+Retrospective+analysis+of+36+years&rft.jtitle=Clinica+Chimica+Acta&rft.aulast=Sarkozi%2C+L.%3B+Simson%2C+E.%3B+Ramanathan%2C+L.&rft.au=Sarkozi%2C+L.%3B+Simson%2C+E.%3B+Ramanathan%2C+L.&rft.date=2003&rft.volume=329&rft.issue=1%E2%80%932&rft.pages=89%E2%80%9394&rft_id=info:doi\/10.1016%2FS0009-8981%2803%2900020-2&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-BissellTotal14-8\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BissellTotal14_8-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.youtube.com\/watch?v=RdwFZyYE_4Q\" target=\"_blank\">\"Total Laboratory Automation - Michael Bissell, MD, Ph.D\"<\/a>. <i>YouTube<\/i>. University of Washington. 15 July 2014<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.youtube.com\/watch?v=RdwFZyYE_4Q\" target=\"_blank\">https:\/\/www.youtube.com\/watch?v=RdwFZyYE_4Q<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Total+Laboratory+Automation+-+Michael+Bissell%2C+MD%2C+Ph.D&rft.atitle=YouTube&rft.date=15+July+2014&rft.pub=University+of+Washington&rft_id=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DRdwFZyYE_4Q&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-WatersMassLynx16-9\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-WatersMassLynx16_9-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.waters.com\/webassets\/cms\/library\/docs\/720005731en%20Masslynx%20LIMS%20Interface.pdf\" target=\"_blank\">\"MassLynx LIMS Interface\"<\/a> (PDF). Waters Corporation. November 2016<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.waters.com\/webassets\/cms\/library\/docs\/720005731en%20Masslynx%20LIMS%20Interface.pdf\" target=\"_blank\">https:\/\/www.waters.com\/webassets\/cms\/library\/docs\/720005731en%20Masslynx%20LIMS%20Interface.pdf<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=MassLynx+LIMS+Interface&rft.atitle=&rft.date=November+2016&rft.pub=Waters+Corporation&rft_id=https%3A%2F%2Fwww.waters.com%2Fwebassets%2Fcms%2Flibrary%2Fdocs%2F720005731en%2520Masslynx%2520LIMS%2520Interface.pdf&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-ASTME1578_18-10\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-ASTME1578_18_10-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.astm.org\/Standards\/E1578.htm\" target=\"_blank\">\"ASTM E1578 - 18 Standard Guide for Laboratory Informatics\"<\/a>. ASTM International. 2018<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.astm.org\/Standards\/E1578.htm\" target=\"_blank\">https:\/\/www.astm.org\/Standards\/E1578.htm<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=ASTM+E1578+-+18+Standard+Guide+for+Laboratory+Informatics&rft.atitle=&rft.date=2018&rft.pub=ASTM+International&rft_id=https%3A%2F%2Fwww.astm.org%2FStandards%2FE1578.htm&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-11\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-11\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation book\">Schmitt, S., ed. <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.dhibooks.com\/assuring-data-integrity-for-life-sciences\" target=\"_blank\"><i>Assuring Data Integrity for Life Sciences<\/i><\/a>. DHI Publishing. pp. 385. 
<a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/International_Standard_Book_Number\" data-key=\"f64947ba21e884434bd70e8d9e60bae6\">ISBN<\/a> 9781933722979<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.dhibooks.com\/assuring-data-integrity-for-life-sciences\" target=\"_blank\">https:\/\/www.dhibooks.com\/assuring-data-integrity-for-life-sciences<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=book&rft.btitle=Assuring+Data+Integrity+for+Life+Sciences&rft.pages=pp.%26nbsp%3B385&rft.pub=DHI+Publishing&rft.isbn=9781933722979&rft_id=https%3A%2F%2Fwww.dhibooks.com%2Fassuring-data-integrity-for-life-sciences&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-ASTMEE2078_16-12\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-ASTMEE2078_16_12-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.astm.org\/Standards\/E2078.htm\" target=\"_blank\">\"ASTM E2078 - 00(2016) Standard Guide for Analytical Data Interchange Protocol for Mass Spectrometric Data\"<\/a>. ASTM International. 2016<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.astm.org\/Standards\/E2078.htm\" target=\"_blank\">https:\/\/www.astm.org\/Standards\/E2078.htm<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=ASTM+E2078+-+00%282016%29+Standard+Guide+for+Analytical+Data+Interchange+Protocol+for+Mass+Spectrometric+Data&rft.atitle=&rft.date=2016&rft.pub=ASTM+International&rft_id=https%3A%2F%2Fwww.astm.org%2FStandards%2FE2078.htm&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-ASTME1948_14-13\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-ASTME1948_14_13-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.astm.org\/Standards\/E1948.htm\" target=\"_blank\">\"ASTM E1948 - 98(2014) Standard Guide for Analytical Data Interchange Protocol for Chromatographic Data\"<\/a>. ASTM International. 2014<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.astm.org\/Standards\/E1948.htm\" target=\"_blank\">https:\/\/www.astm.org\/Standards\/E1948.htm<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=ASTM+E1948+-+98%282014%29+Standard+Guide+for+Analytical+Data+Interchange+Protocol+for+Chromatographic+Data&rft.atitle=&rft.date=2014&rft.pub=ASTM+International&rft_id=https%3A%2F%2Fwww.astm.org%2FStandards%2FE1948.htm&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-MerckMilli19-14\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-MerckMilli19_14-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.emdgroup.com\/en\/news\/bssn-software-06-08-2019.html\" target=\"_blank\">\"MilliporeSigma Acquires BSSN Software to Accelerate Customers\u2019 Digital Transformation in the Lab\"<\/a>. Merck KGaA. 6 August 2019<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.emdgroup.com\/en\/news\/bssn-software-06-08-2019.html\" target=\"_blank\">https:\/\/www.emdgroup.com\/en\/news\/bssn-software-06-08-2019.html<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=MilliporeSigma+Acquires+BSSN+Software+to+Accelerate+Customers%E2%80%99+Digital+Transformation+in+the+Lab&rft.atitle=&rft.date=6+August+2019&rft.pub=Merck+KGaA&rft_id=https%3A%2F%2Fwww.emdgroup.com%2Fen%2Fnews%2Fbssn-software-06-08-2019.html&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-15\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-15\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">McDonald, Robert S.; Wilks, Paul A. (1 January 1988). <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/journals.sagepub.com\/doi\/10.1366\/0003702884428734\" target=\"_blank\">\"JCAMP-DX: A Standard Form for Exchange of Infrared Spectra in Computer Readable Form\"<\/a> (in en). <i>Applied Spectroscopy<\/i> <b>42<\/b> (1): 151\u2013162. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1366%2F0003702884428734\" target=\"_blank\">10.1366\/0003702884428734<\/a>. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/International_Standard_Serial_Number\" data-key=\"a5dec3e4d005e654c29ad167ab53f53a\">ISSN<\/a> <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.worldcat.org\/issn\/0003-7028\" target=\"_blank\">0003-7028<\/a><span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"http:\/\/journals.sagepub.com\/doi\/10.1366\/0003702884428734\" target=\"_blank\">http:\/\/journals.sagepub.com\/doi\/10.1366\/0003702884428734<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=JCAMP-DX%3A+A+Standard+Form+for+Exchange+of+Infrared+Spectra+in+Computer+Readable+Form&rft.jtitle=Applied+Spectroscopy&rft.aulast=McDonald&rft.aufirst=Robert+S.&rft.au=McDonald%2C%26%2332%3BRobert+S.&rft.au=Wilks%2C%26%2332%3BPaul+A.&rft.date=1+January+1988&rft.volume=42&rft.issue=1&rft.pages=151%E2%80%93162&rft_id=info:doi\/10.1366%2F0003702884428734&rft.issn=0003-7028&rft_id=http%3A%2F%2Fjournals.sagepub.com%2Fdoi%2F10.1366%2F0003702884428734&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-GAML-16\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-GAML_16-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.gaml.org\/default.asp\" target=\"_blank\">\"Welcome to GAML.org\"<\/a>. <i>GAML.org<\/i>. 22 June 2007<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"http:\/\/www.gaml.org\/default.asp\" target=\"_blank\">http:\/\/www.gaml.org\/default.asp<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Welcome+to+GAML.org&rft.atitle=GAML.org&rft.date=22+June+2007&rft_id=http%3A%2F%2Fwww.gaml.org%2Fdefault.asp&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-CNNMicro15-19\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-CNNMicro15_19-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">CNN Business (17 April 2015). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.youtube.com\/watch?v=_PuTKH7143c\" target=\"_blank\">\"Micro robots drive Bayer's high-tech vision\"<\/a>. <i>YouTube<\/i><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.youtube.com\/watch?v=_PuTKH7143c\" target=\"_blank\">https:\/\/www.youtube.com\/watch?v=_PuTKH7143c<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Micro+robots+drive+Bayer%27s+high-tech+vision&rft.atitle=YouTube&rft.aulast=CNN+Business&rft.au=CNN+Business&rft.date=17+April+2015&rft_id=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3D_PuTKH7143c&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-OxerTheEss17-24\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-OxerTheEss17_24-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Oxer, M.; Stroud, T. (2017). 
<a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.titian.co.uk\/the-essential-guide-to-managing-laboratory-samples-web\" target=\"_blank\">\"The Essential Guide to Managing Laboratory Samples\"<\/a>. Titian Software Ltd<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.titian.co.uk\/the-essential-guide-to-managing-laboratory-samples-web\" target=\"_blank\">https:\/\/www.titian.co.uk\/the-essential-guide-to-managing-laboratory-samples-web<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 17 November 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=The+Essential+Guide+to+Managing+Laboratory+Samples&rft.atitle=&rft.aulast=Oxer%2C+M.%3B+Stroud%2C+T.&rft.au=Oxer%2C+M.%3B+Stroud%2C+T.&rft.date=2017&rft.pub=Titian+Software+Ltd&rft_id=https%3A%2F%2Fwww.titian.co.uk%2Fthe-essential-guide-to-managing-laboratory-samples-web&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-:0-25\"><span class=\"mw-cite-backlink\">\u2191 <sup><a href=\"#cite_ref-:0_25-0\">19.0<\/a><\/sup> <sup><a href=\"#cite_ref-:0_25-1\">19.1<\/a><\/sup><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Redrup, Michael J.; Igarashi, Harue; Schaefgen, Jay; Lin, Jenny; Geisler, Lisa; Ben M\u2019Barek, Mohamed; Ramachandran, Subramanian; Cardoso, Thales <i>et al.<\/i> (1 March 2016). <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/link.springer.com\/10.1208\/s12248-016-9869-2\" target=\"_blank\">\"Sample Management: Recommendation for Best Practices and Harmonization from the Global Bioanalysis Consortium Harmonization Team\"<\/a> (in en). <i>The AAPS Journal<\/i> <b>18<\/b> (2): 290\u2013293. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1208%2Fs12248-016-9869-2\" target=\"_blank\">10.1208\/s12248-016-9869-2<\/a>. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/International_Standard_Serial_Number\" data-key=\"a5dec3e4d005e654c29ad167ab53f53a\">ISSN<\/a> <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.worldcat.org\/issn\/1550-7416\" target=\"_blank\">1550-7416<\/a>. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/PubMed_Central\" data-key=\"c85bdffd69dd30e02024b9cc3d7679e2\">PMC<\/a> <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.ncbi.nlm.nih.gov\/pmc\/articles\/PMC4779093\/\" target=\"_blank\">PMC4779093<\/a>. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/PubMed_Identifier\" data-key=\"1d34e999f13d8801964a6b3e9d7b4e30\">PMID<\/a> <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.ncbi.nlm.nih.gov\/pubmed\/26821803\" target=\"_blank\">26821803<\/a><span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"http:\/\/link.springer.com\/10.1208\/s12248-016-9869-2\" target=\"_blank\">http:\/\/link.springer.com\/10.1208\/s12248-016-9869-2<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Sample+Management%3A+Recommendation+for+Best+Practices+and+Harmonization+from+the+Global+Bioanalysis+Consortium+Harmonization+Team&rft.jtitle=The+AAPS+Journal&rft.aulast=Redrup&rft.aufirst=Michael+J.&rft.au=Redrup%2C%26%2332%3BMichael+J.&rft.au=Igarashi%2C%26%2332%3BHarue&rft.au=Schaefgen%2C%26%2332%3BJay&rft.au=Lin%2C%26%2332%3BJenny&rft.au=Geisler%2C%26%2332%3BLisa&rft.au=Ben+M%E2%80%99Barek%2C%26%2332%3BMohamed&rft.au=Ramachandran%2C%26%2332%3BSubramanian&rft.au=Cardoso%2C%26%2332%3BThales&rft.au=Hillewaert%2C%26%2332%3BVera&rft.date=1+March+2016&rft.volume=18&rft.issue=2&rft.pages=290%E2%80%93293&rft_id=info:doi\/10.1208%2Fs12248-016-9869-2&rft.issn=1550-7416&rft_id=info:pmc\/PMC4779093&rft_id=info:pmid\/26821803&rft_id=http%3A%2F%2Flink.springer.com%2F10.1208%2Fs12248-016-9869-2&rfr_id=info:sid\/en.wikipedia.org:LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<!-- \nNewPP limit report\nCached time: 20211117234047\nCache expiry: 86400\nDynamic content: false\nComplications: []\nCPU time usage: 0.208 seconds\nReal time usage: 0.302 seconds\nPreprocessor visited node count: 14382\/1000000\nPost\u2010expand include size: 100084\/2097152 bytes\nTemplate argument size: 34791\/2097152 bytes\nHighest expansion depth: 25\/40\nExpensive parser function count: 0\/100\nUnstrip recursion depth: 0\/20\nUnstrip post\u2010expand size: 34153\/5000000 bytes\n-->\n<!--\nTransclusion expansion time report (%,ms,calls,template)\n100.00% 157.766 1 -total\n 86.68% 136.757 2 Template:Reflist\n 66.67% 105.176 19 Template:Citation\/core\n 39.47% 62.267 12 Template:Cite_web\n 20.78% 32.779 5 Template:Cite_journal\n 15.07% 23.772 2 Template:Cite_book\n 11.50% 18.136 14 Template:Date\n 5.85% 9.228 13 Template:Citation\/identifier\n 3.96% 6.250 24 Template:Citation\/make_link\n 3.15% 4.962 6 Template:Efn\n-->\n\n<!-- Saved in parser cache with key limswiki:pcache:idhash:12787-0!canonical and timestamp 20211117234049 and revision id 45048. 
","87d7f050e0c47d7762a90382989592a1_images":["https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/d\/dc\/Fig1_Liscouski_DirectLabSysOnePerPersp21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/7\/70\/Fig2_Liscouski_DirectLabSysOnePerPersp21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/3e\/Fig3_Liscouski_DirectLabSysOnePerPersp21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/b\/b2\/Fig4_Liscouski_DirectLabSysOnePerPersp21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/c\/cc\/Fig5_Liscouski_DirectLabSysOnePerPersp21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/0\/05\/FigA1_Liscouski_DirectLabSysOnePerPersp21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/c\/c8\/FigA2_Liscouski_DirectLabSysOnePerPersp21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/1c\/FigA3_Liscouski_DirectLabSysOnePerPersp21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/b\/b3\/FigA4_Liscouski_DirectLabSysOnePerPersp21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/f\/f1\/FigA5_Liscouski_DirectLabSysOnePerPersp21.png"],"87d7f050e0c47d7762a90382989592a1_timestamp":1637252749,"d8b467af534a70312a21f63b61be26cd_type":"article","d8b467af534a70312a21f63b61be26cd_title":"The Application of Informatics to Scientific Work: Laboratory Informatics for Newbies (Liscouski 2021)","d8b467af534a70312a21f63b61be26cd_url":"https:\/\/www.limswiki.org\/index.php\/LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies","d8b467af534a70312a21f63b61be26cd_plaintext":"

LII:The Application of Informatics to Scientific Work: Laboratory Informatics for Newbies

Title: The Application of Informatics to Scientific Work: Laboratory Informatics for Newbies
Author for citation: Joe Liscouski, with editorial modifications by Shawn Douglas
License for content: Creative Commons Attribution-ShareAlike 4.0 International
Publication date: April 2021

Contents

1 Introduction
1.1 Intended audience
2 Types of scientific and laboratory work
2.1 Basic and applied research
2.1.1 The research process
3 Testing laboratories
4 The laboratory notebook
5 Using laboratory records
6 Bringing informatics into the lab
6.1 Backup strategy
6.2 Witness review or sign-off
6.3 Instrument-computer integration
6.4 Expanding the research team
7 Meeting the needs of the testing laboratory
8 Other laboratory informatics systems
8.1 Scientific data management system (SDMS)
8.2 Laboratory execution system (LES)
8.3 Instrument data system (IDS)
9 Planning for laboratory informatics
9.1 Education
9.2 Planning
9.3 Why projects fail
10 Closing
11 Footnotes
12 About the author
13 References

Introduction
The purpose of this piece is to introduce people who are not intimately familiar with laboratory work to the basics of laboratory
operations and the role that informatics can play in assisting scientists, engineers, and technicians in their efforts. The concepts are important because they provide a functional foundation for understanding lab work and how that work is done in the early part of the twenty-first century (things will change, just wait for it).

Intended audience
This material is intended for anyone who is interested in seeing how modern informatics tools can help those doing scientific work. It will provide an orientation to scientific and laboratory work, as well as the systems that have been developed to make that work more productive. It's for people coming out of school who have carried out lab experiments but not corporate research projects, for those who need to understand how testing labs work, and for IT professionals who may be faced with supporting computing systems in lab environments. It's also for those who may be tasked with managing projects to choose, install, and make informatics tools useful.
Figure 1 shows the elements we'll be discussing in this piece. The treatment of the technical material will be on the lighter side, leaving in-depth subject matter to other works. Instrument data systems in particular will be covered only lightly, as any serious discussion becomes lengthy and discipline-specific very quickly; that material has also been covered in prior works.

Figure 1. Elements we'll be covering

Types of scientific and laboratory work
Science is about seeking truthful answers to questions. Sometimes those questions are open-ended, with no idea where answering them will lead you (e.g., "Why does water ice float?"). Others are very specific, concerning material composition or properties (e.g., "How much lead is in this drinking water?", "How much does a butterfly weigh?"). Still others may take some effort before you determine the best approach to working on them. The approach someone uses to address these questions depends on the nature of the question; some are destined for research, while others are addressed using specific test methods.
There are two types of research: basic and applied. Both can include field work, observations, experiments, models (mathematical, computer, and simulation), etc. Applied research is also done in testing or service laboratories, as with, for example, the development of new methods of analysis.

Basic and applied research
Basic research is open-ended, as you are looking into something without any idea of where the work will lead. It is often funded by grants through universities or government institutions; continued support depends on the perceived value of the research. Projects can range in size from the work of a single individual to a small team to large-scale groups studying astronomy, high-energy physics, engineering, the life sciences, or any number of other fields.
Applied research, on the other hand, is directed toward a goal. That goal could be a cure for a disease, the development of a COVID-19 vaccine, or work towards artificial intelligence (AI). As with basic research, the work may begin with a single individual or a small team; once some early goals have been reached, the project scales up. The effort may be broken down into a set of more narrowly focused efforts, whose results will be combined as the development proceeds. Since applied research is goal-directed, funding will depend upon who benefits from those goals being met. Projects of national interest, including security, may be wholly or partially funded by the government. Projects with a commercial interest tend to be funded by corporate interests, including individual companies working in their own laboratories or through contract research organizations with expertise useful to the program. Where there is interest from a number of corporate and/or government groups, consortia may form to distribute the cost and share in the results.
Both basic and applied research can be found in government institutions (including military groups, research and development agencies like the Defense Advanced Research Projects Agency [DARPA], and task-specific institutions such as the National Institutes of Health [NIH]), public and private non-profit groups, corporations, consortia, and contract research organizations.

The research process
The research process begins with a question. Any question will do, including "why is the sky blue?" We'll bypass Greek mythology[a] by asking more questions and planning how to proceed to answer them. For example, "Is the sky always blue?", "When is/isn't it?", and "What other colors can it be?" Once the process begins, it can include a number of steps, the choice and direction depending upon the nature of the research and the mindset of the researcher:

Observations: This includes basic note-taking with support material (text, photos, drawings, charts, and scanned material). Research (e.g., as with basic astronomy, field biology) can be as simple as looking something up on Google or as complex as understanding how a virus works. Research is about asking questions and looking for answers, which often leads to more questions. It's a little like my granddaughter who always asks "why?" no matter how well I answer the previous question (or at least how well I think I did).
Enhanced observations: This includes interacting with items under observation, as well as non-directed interactions, preliminary data gathering, and behavioral analysis.
Experiments and information gathering: This includes organized experiments that are planned, directed, and purpose-driven, as well as data and information gathering.
Team building: This includes the creation of teams or networks of people working on the same or similar projects.
Analytics and reporting: This includes data and information analysis, data modeling (e.g., mathematical, computer algorithm, and simulation), information synthesis, and knowledge creation.
Technology acquisition: This includes gaining access to public, commercial, remote, and other types of databases to assist the research.

Pinning down a "typical" approach to research isn't possible, because the routes people follow are as individual as the researchers and their areas of work. However, this is generally not the case with testing labs.

Testing laboratories
In addition to research labs, there are also testing or "service" laboratories. Service labs carry out specific test routines on samples and specimens; you may be familiar with them as quality control labs, clinical labs, toxicology labs, forensic science labs, etc.
They are called service labs because they support other organizations, including research organizations. They have similar modes of operation and work organization, running different tests depending on their area of specialization.
Contract testing labs are another flavor of service laboratory, acting as independent labs that do testing for a fee. These labs can offer capabilities and expertise that their customer doesn't have, either because the equipment is specialized and not frequently needed or because the customer is looking for a second opinion on an analysis.
Regardless of the type of service lab, they all have one thing in common: the way they function. For a moment let's forget about science and think about something else. Take for example a company that does graphics printing as a service to graphic designers and marketing groups. The company could offer a variety of printing services:

business cards
stationery (envelopes, letterhead, etc.)
postcards
brochures
signs
graphics for trade shows (including mounting of an image on backing, lightboxes, etc.)
postal marketing services

In this scenario, customers can come into the shop and drop off work to be done or place orders online (the company website provides a good description of their services and products). One of their biggest concerns is workflow management: what work is coming in, what is in progress, what is in quality control, and what is ready for delivery. Many activities may be associated with this workflow.

Order acceptance: Log the job into a log book, the go-to reference for work orders. Add the corresponding work order to a binder of work orders; work orders can be removed from the binder as needed (for example, when people are working on the job) and returned when completed. Comments are made on the work order and referenced to an individual's notebook for details. Work orders shouldn't be duplicated, since people may not be aware of the duplicates and information may be lost. This does add some inefficiency to the process if a work order contains multiple components (e.g., brochures and trade show graphics); if someone needs to work on a task and someone else has the work order, they have to find it. Work orders contain the names of graphics files and their location. Next, a check is made to ensure all the needed information is there, notifying people if something is missing. This includes checking to see if the graphics files are available, in the correct format, etc. The priority of the work is determined with respect to other work. Then the customer is notified of the work order status and the expected completion date.
Scheduling: The work order is assigned to one or more individuals for completion.
Performing the work: This is where the actual work is performed, including task coordination if an order has multiple components.
Customer service: This includes answering customer questions about the work order and addressing inquiries on completion date.
Draft review: This involves obtaining customer sign-off at a prototype stage if required, making adjustments if needed, and then proceeding to completion.
Quality control: This is where projects are reviewed and approved for completion.
Delivery: This involves shipping the material back to the customer or notifying the customer the order is ready for pick-up.
Billing: After satisfaction with the completed work is acknowledged, the work order is billed to the customer.

When the shop has a large number of projects going on, such a manual, paper-based workflow is difficult and time-consuming to manage. Projects have to be scheduled so that they get done and don't interfere with other projects that might be on a tight deadline. And then there is inventory management, making sure you have the materials you need on hand when you need them. There is also the occasional rescheduling that occurs if equipment breaks down or someone is out sick. A simplified workflow based on the above is shown in Figure 2.

Figure 2. Simplified print shop workflow, with some details omitted for clarity

Let's say our print shop has seven people working there. The owner manages the overall operation, and an administrator logs in work orders, notes the location of files that will be used on those orders, and does the final checkout of work, shipping, and billing. The remaining five people (staff, though everyone is at the same organizational level) take care of the actual production work; everyone is cross-trained, but some are more adept at certain tasks than others.
Imagine you worked in this shop; how might your day go if you were one of the staff? The administrator will have prioritized the work depending on urgency, grouping similar work orders (or partial orders if there is a request for multiple services) together. This is just a matter of efficiency: if you are using a particular piece of equipment and it has to be set up, calibrated, and cleaned when finished, you may as well make the most of that effort and run as many similar jobs as you can. Because only one copy of each work order exists (so that notes and comments don't get lost), tracking one down is an issue if someone else is already working on part of the order. Each staff member has a notebook to keep track of work, any settings used on equipment, and comments about how the work progressed. These notebook entries are important and useful in case questions come up about a job, how it was run, and whether any issues were encountered. As one set of jobs is completed, you move on to the next set. Inventory has to be checked to make sure that the needed materials are on hand or ordered; if something is missing, work has to be rescheduled. The workflow is a continual, organized mix of tasks, with people scheduling time on equipment as needed.
You can begin to appreciate how difficult the manual, paper-based workflow in a shop like that is to manage, particularly when it depends upon people communicating clearly.
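To make that bookkeeping concrete, here is a minimal sketch (in Python; the status names and order fields are hypothetical, invented for illustration) of the kind of order tracking that an informatics system automates: one record per order, one current status, and a running history so questions about a job can be answered later.

```python
# A minimal, illustrative work order tracker. Nothing here reflects a real
# product; it only mirrors the paper process described above.
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum, auto

class Status(Enum):
    ACCEPTED = auto()         # logged in, files checked, priority set
    SCHEDULED = auto()        # assigned to one or more staff
    IN_PROGRESS = auto()
    DRAFT_REVIEW = auto()     # customer sign-off at a prototype stage
    QUALITY_CONTROL = auto()
    DELIVERED = auto()
    BILLED = auto()

@dataclass
class WorkOrder:
    order_id: str
    customer: str
    services: list[str]                          # e.g., ["brochures", "signs"]
    status: Status = Status.ACCEPTED
    history: list[str] = field(default_factory=list)

    def advance(self, new_status: Status, note: str = "") -> None:
        # Every change is recorded, mimicking the single shared paper copy
        # where notes and comments must never be lost.
        stamp = datetime.now().isoformat(timespec="seconds")
        self.history.append(f"{stamp} {self.status.name} -> {new_status.name} {note}")
        self.status = new_status

order = WorkOrder("WO-1042", "Acme Design", ["brochures", "trade show graphics"])
order.advance(Status.SCHEDULED, "assigned to two staff members")
print(order.status.name, "| history entries:", len(order.history))
```

The point of the sketch is not the code itself but the design choice it embodies: a single authoritative record per order, with status and history kept together, which is exactly what the paper system struggles to guarantee.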
It is the same workflow as any service-oriented business, from a florist to a repair shop. What differs is the size of the organization, the complexity of the work, and the education needed to perform the required tasks.
Now let's get back to the service laboratory. The print shop workflow is much like the structural workflow of such a laboratory. In the end, it's the nature of the tasks; the complexity of equipment, instrumentation, and electronic systems used; and the education needed to carry out the work that sets the service laboratory apart from other service operations. However, there is one other, critical aspect that sets it apart: most service labs have to meet federal or industry regulations (e.g., the ISO 9000 family of standards) for their operations.
As noted earlier, there are many different types of service laboratories. The basic workflow is the same (see Figure 3 for one perspective on the commonalities of research and service laboratories), but the nature of the testing separates one from another. A water testing lab uses different test procedures than a toxicology lab or a clinical lab does. Those working in different types of labs have to learn how to run different tests, and they also have to learn about the materials they work with. After all, people's observations about the material tested will differ depending upon how much experience they have with different kinds of materials.

Figure 3. This diagram represents one perspective on the relationship between laboratory types. This is a bit simplified, particularly on the roles of research labs. Large research facilities, or those in which waiting for test results impacts the progress of research work, may incorporate a "service lab" function within their operations; the same workflow, just a merger of boxes. The downside of doing that is the loss of independent verification of test results, as people sometimes see what they want to see. This can be addressed by having critical and random samples analyzed independently.

While workflows vary between research and service labs, there is one consistent factor that cuts across both: record keeping.

The laboratory notebook
The laboratory notebook has been a fixture in scientific work for centuries. It is essentially a diary and can contain text, drawings, pasted photos, illustrations, charts, and so on. Historically (at least until the mid-1970s) it was a paper document, and it has evolved as legal and regulatory considerations have developed. Figure 4 shows part of Dr. Alexander Graham Bell's notebook.

Figure 4. Pages 40-41 of Alexander Graham Bell Family Papers in the Library of Congress, Manuscript Division, Public Domain

The format of today's paper notebooks has changed somewhat, and the process of using them has become more rigorous. Take for example Scientific Bindery Productions, a modern manufacturer of professional laboratory notebooks. The description for their duplicate lined notebook includes the following elements[1]:

table of contents
instructions page, for how to use the notebook and address patent protection
headers and footers, with legally defensible language
headers that include title, project number, and book number fields, as well as a "work continued from page" section
footers that include signature, date, disclosed to, and understood by fields, as well as a "work continued to page" section

The details of some of these points are called out in Figure 5, courtesy of Dr. Raquel Cumeras.

Figure 5. A lab notebook example, courtesy of Dr. Raquel Cumeras, Science 2 Knowledge blog, 2019

Over the years, several guidelines have been published about the use of laboratory notebooks. Examples include:

Good manufacturing practice (GMP) and good laboratory practice (GLP) recordkeeping, from David West, St. Louis Community College, Center for Plant and Life Sciences[2]
NIH scientific recordkeeping guidelines, from the National Institutes of Health[3]
General laboratory notebook guidelines, from Science editor Elisabeth Pain[4]

A Google search of "guidelines for maintaining laboratory notebooks" or something similar will provide more examples, including those developed by leading universities.
At this point, you're probably wondering why we're spending so much time on this. The point: good record keeping is the foundation for documenting scientific work regardless of the medium, be it paper or electronic. Yes, the laboratory notebook has an electronic equivalent: the electronic laboratory notebook (ELN). These ELNs and other laboratory informatics systems have to support everything paper systems do, or they will fail in ensuring the integrity of documented work.

Using laboratory records
Laboratory records, whether in laboratory notebooks or some other format, can be acted upon in many ways. Laboratory personnel interact with them by:

recording observations, results, instrument data output, photos, and charts
describing research processes, goals, and results
ensuring the authenticity of laboratory work
planning and collaborating on experiments
extracting information for reporting
backing up data
querying data
sharing data
publishing data
archiving and retrieving data
securing data

Everything on that list can be done with paper records; however, those activities are easier, faster, and less error prone with electronic systems. Paper records aren't going away anytime soon; for example, they may still be needed to record comments and information that electronic systems haven't provided for. This is particularly true as a project team expands from one person to more people. However, the need for shared access to information becomes a limiting factor in productivity when we rely on paper-based systems. Paper-based systems also depend upon the proximity of people working together, something that became problematic during the COVID-19 pandemic. Social distancing requirements made sharing paper-based notebook pages more challenging, requiring scanning and emailing. This was perhaps feasible for small amounts of physical material, but less so for large projects with significant paper-based records.
That brings up another important point concerning ownership: whose data is it?
When people are handed a notebook, they are told "this is your notebook, a place to write down your work and observations, and you are responsible for it." Depending upon how employment or consulting contracts are written, the content that goes into the notebook belongs to whoever is paying for the work. When I worked in a lab, the notebook I used was referred to by others as "your notebook" (it even had my name on it) even though it wasn't mine but rather the company's property. Yet when it was filled, they took possession of it and archived it. This concept of ownership has become a stumbling block in some organizations when they decide to install an ELN or laboratory information management system (LIMS), particularly if there are people who have been working there for a long time and have ingrained behaviors. Those people become concerned that someone is going to see their work in an incomplete state before they've reviewed and completed it. It's their work, and they don't want anyone to look at it until it's done. While the true owners of the work have always had that right, they may not have exercised it, respecting people's privacy until the work is complete. If you're considering an informatics system, does it address that concern about ownership?

Bringing informatics into the lab
So far this guide has hinted at the implications of adding laboratory informatics systems into the laboratory, but now it's time to discuss it further. Deciding how and when to bring such systems into the lab depends on a number of factors.
1. What is the lab's budget? If an informatics implementation can't be properly funded, don't start that implementation until it can be. The cost of a laptop computer is trivial compared to the total cost of implementation.
2. Do we have in-house access to educated and experienced IT support? That staff should understand that laboratory operations are not just another PC- or Microsoft-dominated arena, but rather an environment with needs for informatics solutions beyond the typical office software. For example, laboratory instruments need to be connected to computers, and the resulting data stores should ideally be integrated to make the data more actionable.
3. Are laboratory staff ready to use the technologies and take responsibility for the things that go with them? Staff must be trained in more than how to operate an instrument. Can they back up data? Do they understand the security and data privacy risks associated with handling the data?
4. Are organizational policies flexible enough to allow practical use of the technology while still keeping security and data privacy risks in mind? The organization should have some sort of network access both internally and externally. Remote access should be possible, particularly given the circumstances surrounding pandemics and the like. A balanced policy on taking an organizational laptop out of the laboratory should be in place. Policies on cameras should also be reasonable, allowing researchers to capture images of samples and specimens for their notebooks. If organizational policies are too restrictive, the technology's usefulness is largely diminished.
5. What are the lab's space constraints? The size of a lab and the experiments it must conduct can affect the choice of informatics tools.
6. What is the nature of the lab's operations? Is it a research lab, service lab, or a combination of both? If you are in a service lab situation, bringing in informatics support as early as possible is essential to your workflow and sanity. You want to minimize having to deal with two separate sets of processes and procedures: the old way you did it (paper-based) and the informatics-supported way.
7. Is your lab's operation governed by external regulatory requirements? If it's going to be in the future, you may as well start as though it currently is. Note that everything should be validated, regardless of whether or not the lab is subject to regulations. Validation isn't done to satisfy regulators but rather to prove that a process works properly. If you don't have that proof, what's the point of using that process? Do you really want to trust what a process produces without proof that it works?
Most of the points above are easily understood, but let's go into further detail. Let's start by looking at a simple case of you and your project, where you are the sole contributor, without any need for regulatory oversight. Your primary need is to record your planning notes, observations, results, etc. There are tools within the world of computer software to help with this, most notably the word processor. If you have access to one of those, you probably also have access to spreadsheets and other applications that can make your efforts far easier than working with paper. If you search Google for "word processors as lab notebooks" you will find a number of references, including Dr. Martin Engel's guide to using Microsoft OneNote as an ELN[5] and Labforward's 2020 ELN selection guide.[6]
However, simply switching from paper to electronic doesn't mean you're done. There's more to consider, like developing backup policies, addressing witness review, connecting to instruments, and addressing the effects of team expansion, including expanding to more comprehensive purpose-built informatics solutions.

Backup strategy
You have the electronic documentation tools and the skills to use them, but what else do you need? A backup strategy is imperative. Imagine a scenario where you are using a desktop computer, laptop, or tablet to do your work, and it has the only copy of the document you've been working on for weeks. You press the power button one morning and nothing happens. However, you are not (completely) worried or panicked but rather largely calm because:

you have a local backup on removable media (e.g., flash drive, disk), several instances, in fact, that were backed up at least daily, with backups containing everything on the system you were using (you may have a separate backup of your project);
you have a remote backup on your organization's servers (perhaps on a virtual machine);
you have a cloud-based backup of at least your project files, and as much of the system files as the cloud storage permits (depending on bandwidth and cost), all secured with two-factor authentication; and
depending on the operating system you are using, you may have built-in backup-and-recovery abilities, e.g., as with Mac OS X's "Time Machine" functionality.

You've done these things because you've asked yourself "how much is my work worth to me and my organization?"
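As a concrete (if minimal) illustration, the sketch below shows the first layer of that strategy: a dated snapshot of a project directory copied to removable media. The paths and names are hypothetical, and the remote and cloud layers would follow the same pattern with different destinations.

```python
# A minimal, illustrative local backup: one dated snapshot per day.
# The paths below are hypothetical; adjust for your own system.
import shutil
from datetime import date
from pathlib import Path

PROJECT = Path.home() / "lab-notebook"        # the working files
LOCAL_MEDIA = Path("/Volumes/BACKUP_DRIVE")   # removable media mount point

def backup(project: Path, destination_root: Path) -> Path:
    # Keep every daily snapshot rather than overwriting the last one,
    # so an error discovered late can still be recovered from.
    target = destination_root / f"{project.name}-{date.today().isoformat()}"
    shutil.copytree(project, target)          # raises if today's snapshot exists
    return target

if LOCAL_MEDIA.exists():
    print("backed up to", backup(PROJECT, LOCAL_MEDIA))
else:
    print("backup drive not mounted; no local snapshot made today")
```

In practice you would schedule something like this to run automatically (and add the remote and cloud copies); the design point is simply that recovery depends on snapshots that exist somewhere other than the machine that failed.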
Witness review or sign-off
The need for a witness review or sign-off can occur for several reasons, including a potential patent filing or proof that the work was done and the data recorded properly, on a certain date, in case it is challenged. One of the ramifications is that you have to identify a second person to be that witness (though this would also be the case if you were using a paper notebook).
A second issue is that you would have to format the pages of your word processor document (using templates) so as to emulate a signature block and page numbering structure that meets the requirements noted earlier for paper notebooks. You would also have to provide a means for either physical signatures (on printouts) or electronic signatures (e.g., Adobe Acrobat and other applications provide notebook-style templates into which content can be pasted from a Word file). You would also have to ensure that once dated and signed, no edits could be made to that material. If you choose the printed route, then you're back to paper management. One possibility for dealing with that is to scan the signed pages and upload them to a secure server, relying on the file server's date and time stamp system, or to a document management system, to demonstrate that documents haven't been changed.
There is another possibility for time-stamping printed material that is scanned with a high-quality scanner. The concept of the machine identification code, or "tracking dots," allows a set of barely perceptible, forensically traceable colored dots to be printed onto pages, with the arrangement of the dots providing information that identifies the printer (by serial number), as well as the date and time a page was printed. Recent research has demonstrated ways to decode these dots for analysis, particularly as part of printer forensics.[7][8]

Instrument-computer integration
Using a computer to host a laboratory notebook raises another concern: how can lab instruments or separate instrument data systems connect to automatically transmit data and information to the computer? Controlled data collection will require software beyond a simple word processor, though many of today's ELN vendors provide integration components with their solutions. This connection, possibly using a spreadsheet import, will greatly improve laboratory productivity and the reliability of data transfers.

Expanding the research team
Increasing the size of your project by adding more people will have a significant impact on the complexity of using a more electronic laboratory workflow. However, while the basic issues of sharing, collaboration, etc. are no different than they would be if paper-based notebooks were in use, the electronic solutions for handling those issues are much more useful. At the core of an expanded laboratory operation is getting personnel to agree on how they are going to communicate data and information, how they are going to collaborate using that data and information, and how they will make that agreement sustainable. There can be some small allowance for individual approaches to laboratory activities, but agreed standards for critical matters such as data organization and management need to be strictly adhered to. As such, there are several issues to be mindful of.
1. How will data and information be organized and compiled? If multiple people are contributing to a project, the results of their work need to be organized in one place so that the status of the work can be ascertained, missing data can be identified, and material is easier to work with. As a project's direction evolves, the formatting and content may change, potentially requiring a complete overhaul of the data structure.
That is just a consequence of the way research programs progress.
2. How will data and information be updated, versioned, and retained? If your lab works with paper files, this isn't so difficult. The lab may have one printed file detailing experimental methods, and when that method file gets updated, the old printed document is removed and the new one added. In labs where each person has their own individual method file, the process would be repeated for each person. As such, there's no confusion as to the current version; methods would have revision histories so that you would know why they were changed. In cases where those methods are kept in electronic files, more attention has to be paid to ensuring that old method files are archived, and that everyone is working with the same version. This means clear communications procedures are essential. Additionally, the name of the file should carry the current revision information; don't rely on the computer's creation or modification dates for files.
3. How is access to common files controlled? Having people edit or add to common files without careful lockout controls is dangerous. If two people open the same document at the same time and make changes, the one who saves the file last can overwrite any changes the other individual has made. You need a document management system that prevents this; if a file is checked out by one person, others may read it but not write to it until the first person is done. Old versions of files should be archived but not deleted, so that a detailed revision history is maintained.
If an organization grows too large for consensus-based cooperation among people using office suite products, it will be time to transition to a more purpose-built solution like a multiuser ELN that is capable of allowing multiple people to contribute to a project simultaneously, providing a common organization for data and information, and allowing users to either import data from an instrument or have a direct connection between the ELN and the instrument data system (or through the use of a scientific data management system [SDMS]). But how large is too large? It may be the point when personnel become frustrated with waiting for access to documents, or when processes just don't seem to move as smoothly as they used to. While there is a significant cost factor in implementing an ELN, it should be done sooner rather than later, so your lab can benefit from the organizational structure that an ELN provides and reduce the amount of effort it will take to import data and information into that structure. Ideally, you would start your work with an ELN once the initial project startup work is taken care of.
ELNs afford a number of additional benefits, including but not limited to access to:

vendor databases for ordering supplies and managing inventory,
chemical structure databases,
reaction databases, and
external libraries.

ELNs are appropriate for both basic research and applied research, including work in testing labs where method development and special project work is being done. The prior bulleted list might give you the impression that ELNs are oriented toward chemistry; however, that is a reflection of my experience, not the industry. The ELN is used in many other industries.
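Before turning to concrete ELN examples, here is a minimal sketch of the check-out behavior described in point 3 above, with a simple lock file standing in for a real document management system. The file names are hypothetical, and a real system would handle this atomically on a server and archive prior versions rather than relying on sidecar files.

```python
# An illustrative (not production-grade) check-out control: a sidecar lock
# file records who holds a document; a second writer is refused until the
# first person checks the document back in.
from pathlib import Path

def check_out(doc: Path, user: str) -> bool:
    lock = doc.parent / (doc.name + ".lock")
    if lock.exists():
        holder = lock.read_text()
        print(f"{doc.name} is checked out by {holder}; read-only for {user}")
        return False
    lock.write_text(user)          # user now holds write access
    return True

def check_in(doc: Path, user: str) -> None:
    lock = doc.parent / (doc.name + ".lock")
    if lock.exists() and lock.read_text() == user:
        lock.unlink()              # release; others may now check it out

doc = Path("method-rev3.docx")
doc.touch()
check_out(doc, "alice")   # succeeds; alice may edit
check_out(doc, "bob")     # refused; bob gets read-only access
check_in(doc, "alice")    # alice is done; the lock is released
```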
Several examples of ELNs past and present, covering many industries, are listed below.

LabTech Notebook was released in 1986 and discontinued in 2004, and it was designed to provide communications between computers and laboratory instruments that used RS-232 serial communications. This ELN was applicable to a variety of industries and disciplines.
SmartLab from Velquest was released in the early 2000s and was the first commercial product to carry the "electronic laboratory notebook" identifier. It was designed as a platform to encode and guide people conducting lab procedures in GLP/GMP/GALP environments. Now owned by Dassault Systèmes and rebranded as a laboratory execution system (LES) as part of the BIOVIA product line, the solution's same conceptual functionality has since been incorporated into LIMS and ELNs that fit the more current expectation for an ELN.
Wolfram Research offers a series of notebook products geared toward mathematics.

There is a growing list of ELN systems in a variety of disciplines. Given that the scope of activities an ELN can cover is fairly broad, you have to be careful to define your needs before uttering the words "we're going to implement an ELN." We'll address that point later.
The organization of an ELN application is flexible. The layout is user-defined and can contain queryable text, figures, graphics, and fields for instrument-generated data. Because the ELN is inherently flexible and there usually isn't any quick-start structure, you have to know what you are looking for in a product and how you want to use it. This requires thorough due-diligence research. "If only I had known" are among the saddest words heard after product selection.
Some organizations have chosen to develop their own ELNs. That is something that should be undertaken with fear and trepidation. Part of the original justification for this route is typically the belief that you have special needs that can't be met otherwise. Another concern may be that you don't want to tie yourself to a vendor that may go out of business. Those points are more a matter of product selection criteria than a justification for a software development project. If you do choose to go with an internal or even a contracted development route, you will potentially have to live with a product that has just one customer (unless the contractor decides to productize it, which is an entirely different discussion). You will also be saddled with the support of that product for the rest of its functional life. And that doesn't even get into the "who" of software product management and development, the "when will it be done," and the inevitable scope creep (i.e., the changes and expansions of development requirements). In the initial stages of research projects, the organization of data is subject to change as needs change. This is often at the behest of "unstructured" data and the need to manage it. (Note that it isn't that the data is truly unstructured; it's just in a variable structure until the project finds its direction.) This can lead to frustration in setting up a commercial ELN system, let alone designing one from scratch. The next section will look at an organization that is more highly structured.

Meeting the needs of the testing laboratory
The earlier print shop description will give you an idea of the workflow of a testing or service lab, for example a quality control (QC) lab.
In a manually managed QC or research laboratory, operations can become overwhelming quickly. Imagine a study of 40 samples[b] with multiple tests per sample, each of which has to be individually cataloged. Actually, I don't have to imagine it; it happened frequently in our lab, which supported several research groups. When one of these large sample sets shows up (and sometimes more than one), you don't talk to the person doing the sample logging for fear of disrupting their required concentration. You can get them coffee, but no talking.
With a LIMS, this isn't so much an issue. You can log in one sample or a hundred, simply by telling the system the starting sample ID, how many samples there are, and what tests should be scheduled (see the sketch below). The LIMS then organizes everything and prints the labels for each sample. With some systems, the requestor of a test can even log samples in from a web portal, and the central LIMS automatically updates when the samples actually arrive in the lab.
A LIMS makes life easier for laboratories in a number of other ways as well. Want to find a list of samples that are pending a particular test? A quality LIMS can readily display that information, including the sample numbers, priorities, and current locations, with no need to manually check work request sheets. Does a third party want to find out the status of one or more of their in-process samples? Role-based access management means a third party can receive limited access to view that status, without seeing anyone else's sensitive data. What about verifying and approving results? The LIMS can provide some level of results checking, with final verification and approval by lab management. When approved, the reports for each set of requests can be printed, emailed, or stored for portal access. And what about integrating data and systems? The LIMS can be connected to an instrument data system (IDS). Depending on the sophistication of that system, the LIMS can generate a worklist of samples that need to be processed by that device, with the list downloaded to the IDS. When the work is completed, the results can be uploaded directly to the LIMS. This type of system interaction is one of the places where significant productivity gains can be had.
The key attributes of a LIMS are shown in Figure 6.

Figure 6. Slide detailing the core components of a LIMS, from A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work, Part 1: Laboratory Informatics Technologies, a webinar series by Joe Liscouski

These are just the highlights of what a LIMS can do. Why is a LIMS more effective out-of-the-box than an ELN? The operational behavior of testing labs is essentially the same across industries and disciplines, and as a result LIMS vendors have a more predictable and stable base of customers and user requirements to work against than ELN vendors do. It's a bit like comparing home building contractors. Some have a basic structural architecture that they can modify to meet the needs of a broad market. Others do completely custom homes from the basement up, each of which is unique. The economies of scale and a broader, more predictable customer base show up in products that are easier to work with and adapt, both for the primary vendor and those providing add-ons. Those add-ons include specialized facilities for different industries, including enology and viticulture analysis, water treatment, mineral analysis, cannabis testing, and clinical diagnostics (though the system used in the clinical setting is typically referred to as a laboratory information system, or LIS).
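The batch login described above is easy to picture in code. The sketch below uses a hypothetical interface invented for illustration (real LIMS APIs differ, and sample ID formats are lab-specific); the point is that one request generates every sample record, scheduled test, and label at once.

```python
# An illustrative batch sample login: starting ID, count, and tests in;
# sample records and labels out. No real LIMS API is implied.
def log_in_samples(start_id: int, count: int, tests: list[str]) -> list[dict]:
    samples = []
    for n in range(start_id, start_id + count):
        sample_id = f"S-{n:06d}"
        samples.append({
            "id": sample_id,
            "tests": [{"name": t, "status": "pending"} for t in tests],
        })
        print(f"label: {sample_id}")   # stand-in for printing a barcode label
    return samples

# One call replaces 40 individual, error-prone manual log entries.
batch = log_in_samples(41000, 40, ["pH", "lead", "turbidity"])
print(len(batch), "samples logged in,", len(batch[0]["tests"]), "tests each")
```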
Regardless, and as is the case with ELNs, you want to install a LIMS as soon as you are able, to avoid the issues that come with running two differing workflows, pre- and post-implementation.

Other laboratory informatics systems
We've mentioned a few other systems in passing, but here we'll provide a brief overview of a few of them.

Scientific data management system (SDMS)
The SDMS helps laboratories better solve the problem of dealing with the large number of data files being generated, basically by acting as a giant file cabinet that LIMS, ELN, and IDS can connect to. For example, you may not want to put large datasets, spectra, etc. in a LIMS or ELN, but you still have to reference those large files within other internal LIMS and ELN files. This is where the SDMS steps in, storing those large files in the "file cabinet" while maintaining references and metadata that are usable by other informatics systems.

Laboratory execution system (LES)
A lesser-known class of systems, the LES is designed to ensure that testing protocols are executed properly and that all necessary data is properly recorded. The initial incarnation arose from the previously discussed SmartLab[c], a stand-alone product that would guide an analyst through the steps of an analysis, recording each detail of the work in a file that regulatory agencies could inspect to ensure that work was done properly. It found a ready market in any lab that needed to meet GLP/GMP requirements. The functionality needed to create the same capability can be found in some LIMS and ELNs, but programming is required.

Instrument data system (IDS)
Any laboratory instrument popular in the marketplace has had a computer either attached to it or built into it as a package. That's what an IDS is. It provides automated control over an instrument, collecting and analyzing the data produced by the measuring components. Depending on the sophistication of the vendor, and the demands of the marketplace, the connections between the IDS and another laboratory informatics solution may range from user-programmable interfaces (via a network, USB, serial, or digital I/O connection) to built-in communications systems that are almost plug-and-play. The latter are most commonly found in the clinical chemistry market, where a great deal of attention has been paid to integration and systems communication via Health Level 7 (HL7) and related protocols. (The details, however, are beyond the scope of this document.)

Planning for laboratory informatics
There are two key requirements for the successful implementation of informatics products: education and planning.

Education
As far as education goes, the webinar series noted in Figure 6 is a good tool, as are documents provided by technical standards body ASTM International.
ASTM documents that may be of value to you include:

ASTM E1578-18 Standard Guide for Laboratory Informatics[9]
ASTM E1578-13 Standard Guide for Laboratory Informatics[10] (contains some information not found in the 2018 version)
ASTM E1947-98(2014) Standard Specification for Analytical Data Interchange Protocol for Chromatographic Data[11]
ASTM D8244-21 Standard Guide for Analytical Laboratory Operations Supporting the Cannabis/Hemp Industry[12]

If you search the ASTM website for "LIMS" or "informatics," you'll be surprised by the amount of other material that shows up.
Other sources of educational material include:

The Tutorial section of LIMSforum.com[13]
Laboratory Information Systems Project Management: A Guidebook for International Implementations, by the APHL
Laboratory Informatics Guide 2021, by Scientific Computing World[14]
Lab Manager magazine, published by LabX Media Group[15]
Computerized Systems in the Modern Laboratory: A Practical Guide, by Joe Liscouski[16]

Any vendor with an informatics product will send you a nearly endless stream of material.
Ultimately, however, the responsibility for informatics implementation projects doesn't rest solely on the shoulders of laboratory personnel. Everyone connected with a laboratory implementing informatics solutions should have some level of awareness regarding laboratory informatics, including upper management, though the level of knowledge required may vary depending on the role. Laboratory personnel, for example, should be fully educated on common laboratory technologies at a minimum. They should also understand why an informatics project is being considered, what the scope of the implementation will be, and what their role will be in the implementation. Upper management should remember to ask laboratory personnel for input on the project, including on the topic of product requirements, so that personnel feel like they are part of the process, not simply observers or "something that's happening to them." Finally, laboratory personnel should understand how their jobs are going to change once the implementation is complete. This needs to be addressed very early in project discussions; it is a matter of change, and change scares people, particularly if it affects their job and income. This is not a "we'll deal with that later" point. Don't start the discussion until you figure this out. Things may change, but people want security.
Of course, information technology (IT) personnel will also be involved, and they require significant knowledge about not only networking and software installations but also systems integrations. IT personnel need to understand their role in the implementation project, which can include support, project management, evaluating vendor support capabilities, and more. They should also be fully aware of and understand the technologies the organization is considering for implementation, and they will be a vital part of the project planning and vendor selection process.
Planning for laboratory informatics \nThere are two key requirements for the successful implementation of informatics products: education and planning. \n\nEducation \nAs far as education goes, the webinar series noted in Figure 6 is a good tool, as are documents provided by technical standards body ASTM International. ASTM documents that may be of value to you include:\n\nASTM E1578-18 Standard Guide for Laboratory Informatics[9]\nASTM E1578-13 Standard Guide for Laboratory Informatics[10] (contains some information not found in the 2018 version)\nASTM E1947-98(2014) Standard Specification for Analytical Data Interchange Protocol for Chromatographic Data[11]\nASTM D8244-21 Standard Guide for Analytical Laboratory Operations Supporting the Cannabis\/Hemp Industry[12]\nIf you search the ASTM website for \"LIMS\" or \"informatics,\" you'll be surprised by the amount of other material that shows up.\nOther sources of educational material include:\n\nThe Tutorial section of LIMSforum.com[13]\nLaboratory Information Systems Project Management: A Guidebook for International Implementations, by the APHL\nLaboratory Informatics Guide 2021, by Scientific Computing World[14]\nLab Manager magazine, published by LabX Media Group[15]\nComputerized Systems in the Modern Laboratory: A Practical Guide, by Joe Liscouski[16]\nAny vendor with an informatics product will send you a nearly endless stream of material.\nUltimately, however, the responsibility for informatics implementation projects doesn't rest solely on the shoulders of laboratory personnel. Everyone connected with a laboratory implementing informatics solutions should have some level of awareness regarding laboratory informatics, including upper management, though the level of knowledge required will vary by role. For example, laboratory personnel should be fully educated on common laboratory technologies at a minimum. They should also understand why an informatics project is being considered, what the scope of the implementation will be, and what their role will be in the implementation. Upper management should remember to ask laboratory personnel for input on the project, including the topic of product requirements, so that personnel feel like they are part of the process, not simply observers of \"something that's happening to them.\" Finally, laboratory personnel should understand how their jobs are going to change once the implementation is complete. This needs to be addressed very early in project discussions; it is a matter of change, and change scares people, particularly if it affects their job and income. This is not a \u201cwe\u2019ll deal with that later\u201d point; don\u2019t start the discussion until you figure this out. Things may change, but people want security.\nOf course, any information technology (IT) personnel will also be involved, requiring significant knowledge about not only networking and software installations but also systems integration. IT personnel need to understand their role in the implementation project, which can include support, project management, evaluating vendor support capabilities, and more. They should also be fully aware of and understand the technologies the organization is considering for implementation, and they will be a vital part of the project planning and vendor selection process. IT personnel will also be interested in questions about any enterprise resource planning (ERP) aspects, which may raise issues of \"build or buy.\" The organization needs to be prepared to both address these concerns and gain IT personnel as strong supporters of the project.\nFinally, upper management\u2014those who are going to approve the project and provide funding\u2014need to be educated enough to understand the benefits and risks of the proposed implementation, including the likely time scale for the work. Upper management will need to participate actively at a minimum of two critical junctures, plus at specific milestones as needed. The first is during initial project planning, where they will help the organization lay out the issues that need to be addressed, the scope of options that will be investigated, and how the organization is going to proceed. They may pose questions such as \u201ccan we use an existing system to solve the problem?\u201d, particularly if there already has been an investment in an ERP solution such as SAP. Such technology questions will also be of interest to IT personnel since they have an investment in those systems. The second is when the actual project proposal is finished and ready to be pitched. Upper management will need to ask questions about the reasoning behind the choices made, why current systems are insufficient, what kind of investment the project will require, how the implementation will be scheduled, and how the roll-out would be planned. Understanding the answers to these and other questions will be difficult if upper management doesn\u2019t understand the technology, the issues, the options, and the benefits of the proposed laboratory informatics project.\nIf the world of informatics is new to any or all of these stakeholders, the organization must consider getting outside support (not from a vendor) to guide it through the process of evaluating the lab's operations, scoping out an overall implementation program, dividing it into milestones, setting priorities, and developing user requirements.\n\nPlanning \nThe whole point of project planning is to get your organization from the starting point to the end goal, preferably using the most effective path. So, where is the organization going, and why? Everything pretty much boils down to those two questions. Once those questions are answered, more will arise concerning how the organization is actually going to get to the project's end goal.\nInitial planning will look at the lab from the standpoint of the minimum number of computers required to do the work. In some cases, perhaps just those computational systems built into instruments will be sufficient. Whatever the decision, that\u2019s the planning baseline. Then consider what kind of lab it is: a research lab, a service lab, or a blend? That helps focus attention on potential solutions; however, be sure not to completely eliminate other options just yet. For example, if your lab is a QC lab, it's probably a service lab in need of a LIMS, but even then there are still options to evaluate.\nFrom there, the organization must also think in terms of where that baseline lab is going to be in five years; it\u2019s an arbitrary number, but a starting point. Why that far out?
It will take a year or two to plan, implement, validate, and educate people to work with the new systems, and for the lab to change its behavior and settle into a new mode of working. The organization should also consider what other changes are likely to take place. What new equipment, procedures, and management changes can be anticipated? Ideally, any implemented informatics system will be in place and stable for a few years before people start asking for significant changes; minor ones will likely happen early on. Any hint of \u201cwe didn\u2019t plan for that\u201d will be viewed as poor leadership. Figure 7 shows some of the key points you need to look at during planning.\n\nFigure 7. Slide detailing considerations of laboratory informatics implementation in the lab, from A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work, Part 4: LIMS\/LIS, ELN, SDMS, IT & Education, a webinar series by Joe Liscouski\n\nNote also that the considerations shown in Figure 7 can be repeated for each lab in the organization. The project planning team also needs to consider how current laboratory workflow impacts other labs. Are there synergistic effects that can be gained by broadening the scope of what the lab is doing?\n\nWhy projects fail \nWe shouldn't finish the planning section of this guide without discussing why laboratory informatics implementation projects can and do fail. These types of projects are large software projects, and delays, cost overruns, and failures are common (but hopefully not for your organization's project). Just Google \"why IT projects fail\" and read through some anecdotes. The following are some common reasons informatics implementation projects fail.\n\nInsufficient budgeting: Projects can run short of funding, forcing an awkward meeting with management to ask for additional funding (without a project change to account for it), which shows a lack of foresight and planning. Build in a large contingency fund, because the unexpected happens. If you\u2019d like some education on the topic, watch a few episodes of any home upgrade project on HGTV, in particular Love It or List It.[d]\nInsufficient management support: Problems arise when communication with management is insufficient. Project delays, for example, are a fact of life; keep communications with upper management clear so that they, and everyone else on the project, know what is going on. Miscommunication, or a lack of communication, about other aspects of the project can doom it.\nPoor understanding of the scope and nature of the project: This is an educational issue, and a lack of education for all involved parties is almost a guarantee of failure. If you need help, bring in an independent consultant who can lend confidence to the project and its management.\nLack of quality or attention to detail: \u201cThere is always time to do it over, but not enough to do it right\u201d is a common complaint in engineering projects. If you hear it on your organization's project, you are in trouble. Basically, the complaint means that project members are cutting corners and not doing things properly, in ways that are not supportable. This never ends well; the damage may not appear quickly, but in the long run it leads to problems.\nPoor or unrealistic timelines: You may as well face reality from the start: an aggressive timeline just leads to problems (see the bullet above).
Timelines expand, but they almost never get shorter. If the project team is always rushing, something will most certainly get missed, causing problems later down the road.\nPoor project management: Well-managed projects are obvious, and so are poorly managed ones. Just watch the people working, their demeanor, and their attitude about the work; it will tell you all you need to know. Well-managed projects may not always run smoothly, but they make consistent progress. Poorly managed projects cause you to make excuses.\nClosing \nThis guide is like the directory signs in shopping malls: they tell you where you are and what shops and restaurants in the facility to consider. Once you figure out what you are looking for, you can find your way there. Hopefully in reading this you\u2019ve formed an idea of what you want to look at and what your path to finding it is.\n\nFootnotes \n\n\n\u2191 According to Greek mythology (from the E2BN Myths page): \"Long long ago, when Queen Athena (Zeus's daughter) was born, Zeus blessed her with two boons for when she came of age. After almost 15 years, Athena was told to think up two things to ask for ... 1) To have a city in Greece named after her (Athens) [and] 2) To have all the people of the world see her face every day of the year (what you are seeing are only her eyes). Thus, the sky is blue, just like the color of Athena's eyes...\" \n\n\u2191 In some life science drug screening studies, the number can be far higher, which is where robotics and automation becomes important. \n\n\u2191 A name that has been used by conference organizers after the product was sold and renamed. If you do a Google search on \u201cSmartLab,\u201d you may be surprised at what turns up. \n\n\u2191 Some season highlights can be found on the HGTV website. \n\n\nAbout the author \nInitially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation\/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.\n\nReferences \n\n\n\u2191 \"Duplicate Lined Notebook\". Scientific Bindery Productions, Inc. https:\/\/scientificbindery.com\/products\/duplicate-lined-notebook\/ . Retrieved 12 May 2021 .   \n \n\n\u2191 West, D. (18 November 2021). \"GMP\/GLP Recordkeeping\". St. Louis Community College, Center for Plant and Life Sciences. https:\/\/users.stlcc.edu\/departments\/fvbio\/Lab_Practices_GLP_STLCC.htm . Retrieved 12 May 2021 .   \n \n\n\u2191 National Institutes of Health (December 2008). \"Guidelines for Scientific Record Keeping in the Intramural Research Program at the NIH\" (PDF). Office of the Director. https:\/\/oir.nih.gov\/sites\/default\/files\/uploads\/sourcebook\/documents\/ethical_conduct\/guidelines-scientific_recordkeeping.pdf . Retrieved 12 May 2021 .   \n \n\n\u2191 Pain, E. 
(3 September 2019). \"How to keep a lab notebook\". Science. doi:10.1126\/science.caredit.aaz3678. https:\/\/www.sciencemag.org\/careers\/2019\/09\/how-keep-lab-notebook . Retrieved 12 May 2021 .   \n \n\n\u2191 Engel, M. (December 2015). \"Blog: How to use onenote as your electronic lab book\". MartinEngel.net. http:\/\/martinengel.net\/2015\/12\/how-to-use-onenote-as-your-electronic-notebook\/ . Retrieved 12 May 2021 .   \n \n\n\u2191 Bungers, S. (2020). \"The Electronic Lab Notebook in 2020: A comprehensive guide\". Labforward GmbH. https:\/\/www.labfolder.com\/electronic-lab-notebook-eln-research-guide . Retrieved 12 May 2021 .   \n \n\n\u2191 Richter, T.; Escher, S.; Sch\u00f6nfeld, D. et al. (2018). \"Forensic Analysis and Anonymisation of Printed Documents\". Proceedings of the 6th ACM Workshop on Information Hiding and Multimedia Security: 127\u201338. doi:10.1145\/3206004.3206019.   \n \n\n\u2191 Baraniuk, C. (7 June 2017). \"Why printers add secret tracking dots\". BBC Future. Archived from the original on 02 November 2019. https:\/\/web.archive.org\/web\/20191102031255\/https:\/\/www.bbc.com\/future\/article\/20170607-why-printers-add-secret-tracking-dots . Retrieved 13 May 2021 .   \n \n\n\u2191 \"ASTM E1578-18 Standard Guide for Laboratory Informatics\". ASTM International. 2018. doi:10.1520\/E1578-18. https:\/\/www.astm.org\/Standards\/E1578.htm . Retrieved 13 May 2021 .   \n \n\n\u2191 \"ASTM E1578-13 Standard Guide for Laboratory Informatics\". ASTM International. 2013. doi:10.1520\/E1578-13. https:\/\/www.astm.org\/DATABASE.CART\/HISTORICAL\/E1578-13.htm . Retrieved 13 May 2021 .   \n \n\n\u2191 \"ASTM E1947-98(2014) Standard Specification for Analytical Data Interchange Protocol for Chromatographic Data\". ASTM International. 2014. doi:10.1520\/E1947-98R14. https:\/\/www.astm.org\/Standards\/E1947.htm . Retrieved 13 May 2021 .   \n \n\n\u2191 \"ASTM D8244-21 Standard Guide for Analytical Laboratory Operations Supporting the Cannabis\/Hemp Industry\". ASTM International. 2021. doi:10.1520\/D8244-21. https:\/\/www.astm.org\/Standards\/D8244.htm . Retrieved 13 May 2021 .   \n \n\n\u2191 \"Tutorials\". LIMSforum. LabLynx, Inc. https:\/\/www.limsforum.com\/category\/education\/tutorials-education\/ . Retrieved 13 May 2021 .   \n \n\n\u2191 \"Laboratory Informatics Guide 2021\". Scientific Computing World. 2021. https:\/\/www.scientific-computing.com\/issue\/laboratory-informatics-guide-2021 . Retrieved 13 May 2021 .   \n \n\n\u2191 \"Lab Manager\". LabX Media Group. 2021. https:\/\/www.labmanager.com\/magazine . Retrieved 13 May 2021 .   \n \n\n\u2191 Liscouski, J.G. (2015). Computerized Systems in the Modern Laboratory: A Practical Guide. DHI Publishing. pp. 432. ISBN 9781933722863. https:\/\/www.dhibooks.com\/computerized-systems-in-the-modern-laboratory-a-practical-guide .   
Source: <a rel=\"external_link\" class=\"external\" href=\"https:\/\/www.limswiki.org\/index.php\/LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\">https:\/\/www.limswiki.org\/index.php\/LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies<\/a>\n","d8b467af534a70312a21f63b61be26cd_html":"<body class=\"mediawiki ltr sitedir-ltr mw-hide-empty-elt ns-202 ns-subject page-LII_The_Application_of_Informatics_to_Scientific_Work_Laboratory_Informatics_for_Newbies rootpage-LII_The_Application_of_Informatics_to_Scientific_Work_Laboratory_Informatics_for_Newbies skin-monobook action-view skin--responsive\"><div id=\"rdp-ebb-globalWrapper\"><div id=\"rdp-ebb-column-content\"><div id=\"rdp-ebb-content\" class=\"mw-body\" role=\"main\"><a id=\"rdp-ebb-top\"><\/a>\n<h1 id=\"rdp-ebb-firstHeading\" class=\"firstHeading\" lang=\"en\">LII:The Application of Informatics to Scientific Work: Laboratory Informatics for Newbies<\/h1><div id=\"rdp-ebb-bodyContent\" class=\"mw-body-content\"><!-- start content --><div id=\"rdp-ebb-mw-content-text\" lang=\"en\" dir=\"ltr\" class=\"mw-content-ltr\"><div class=\"mw-parser-output\"><p><b>Title<\/b>: <i>The Application of Informatics to Scientific Work: Laboratory Informatics for Newbies<\/i>\n<\/p><p><b>Author for citation<\/b>: Joe Liscouski, with editorial modifications by Shawn Douglas\n<\/p><p><b>License for content<\/b>: <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/creativecommons.org\/licenses\/by-sa\/4.0\/\" target=\"_blank\">Creative Commons Attribution-ShareAlike 4.0 International<\/a>\n<\/p><p><b>Publication date<\/b>: April 2021\n<\/p>\n\n\n<h2><span class=\"mw-headline\" id=\"Introduction\">Introduction<\/span><\/h2>\n<p>The purpose of this piece is to introduce people who are not intimately familiar with <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory\" title=\"Laboratory\" class=\"wiki-link\" data-key=\"c57fc5aac9e4abf31dccae81df664c33\">laboratory<\/a> work to the basics of laboratory operations and the role that <a href=\"https:\/\/www.limswiki.org\/index.php\/Informatics_(academic_field)\" title=\"Informatics (academic field)\" class=\"wiki-link\" data-key=\"0391318826a5d9f9a1a1bcc88394739f\">informatics<\/a> can play in assisting scientists, engineers, and technicians in their efforts.
The concepts are important because they provide a functional foundation for understanding lab work and how that work is done in the early part of the twenty-first century (things will change, just wait for it).\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Intended_audience\">Intended audience<\/span><\/h3>\n<p>This material is intended for anyone who is interested in seeing how modern informatics tools can help those doing scientific work. It will provide an orientation to scientific and laboratory work, as well as the systems that have been developed to make that work more productive. It\u2019s for people coming out of school who have carried out lab experiments but not corporate research projects, for those who need to understand how testing labs work, and for IT professionals who may be faced with supporting computing systems in lab environments. It\u2019s also for those who may be tasked with managing projects to choose, install, and make informatics tools useful.\n<\/p><p>Figure 1 shows the elements we\u2019ll be discussing in this piece. The treatment of the technical material will be on the lighter side, leaving in-depth subject matter to other works. Instrument data systems will be covered lightly, as any serious discussion becomes lengthy and discipline-specific very quickly; additionally, that material has been covered <a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Notes_on_Instrument_Data_Systems\" title=\"LII:Notes on Instrument Data Systems\" class=\"wiki-link\" data-key=\"1b7330228fd59158aab6fab82ad0e7cc\">in other works<\/a>.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig1_Liscouski_AppInfoSciWork21.png\" class=\"image wiki-link\" data-key=\"fd326d7208c88d3103f5af9e4a306269\"><img alt=\"Fig1 Liscouski AppInfoSciWork21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/5c\/Fig1_Liscouski_AppInfoSciWork21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 1.<\/b> Elements we\u2019ll be covering<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<h2><span class=\"mw-headline\" id=\"Types_of_scientific_and_laboratory_work\">Types of scientific and laboratory work<\/span><\/h2>\n<p>Science is about seeking truthful answers to questions. Sometimes those questions are open-ended without any idea where they will lead you in answering them (e.g. \u201cWhy does water ice float?\u201d). Others are very specific, concerning material composition or properties (e.g., \u201cHow much lead is in this drinking water?\u201d, \u201cHow much does a butterfly weigh?\u201d). Still others may take some effort before you determine the best approach to working on them. The approach someone uses to address these questions depends on the nature of the question; some are destined for research, while others are addressed using specific test methods.\n<\/p><p>There are two types of research: basic and applied. Both can include field work, observations, experiments, models (mathematical, computer, and simulation), etc. 
Applied research is also done in testing or service laboratories, as with, for example, the development of new methods of analysis.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Basic_and_applied_research\">Basic and applied research<\/span><\/h3>\n<p>Basic research is open-ended, as you are looking into something without any idea of where the work will lead. It is often funded by grants through universities or government institutions; continued support depends on the perceived value of the research. Projects can range in size from the work of a single individual to a small team to large-scale groups studying astronomy, high-energy physics, engineering, the life sciences, or a number of other fields.\n<\/p><p>Applied research, on the other hand, is directed toward a goal. That goal could be a cure for a disease, the development of a <a href=\"https:\/\/www.limswiki.org\/index.php\/COVID-19\" class=\"mw-redirect wiki-link\" title=\"COVID-19\" data-key=\"da9bd20c492b2a17074ad66c2fe25652\">COVID-19<\/a> vaccine, or work towards <a href=\"https:\/\/www.limswiki.org\/index.php\/Artificial_intelligence\" title=\"Artificial intelligence\" class=\"wiki-link\" data-key=\"0c45a597361ca47e1cd8112af676276e\">artificial intelligence<\/a> (AI). As with basic research, the work may begin with a single individual or a small team, scaling up once some early goals have been reached. The effort may be broken down into a set of more narrowly focused efforts, whose results will be combined as the development proceeds. Since applied research is goal-directed, funding will depend upon who benefits from those goals being met. Projects of national interest, including security, may be wholly or partially funded by the government. Projects with a commercial interest tend to be funded by corporate interests, including individual companies in their own laboratories or through contract research organizations with expertise useful to the program. Where there is interest from a number of corporate and\/or government groups, consortia may form to distribute the cost and share in the results.\n<\/p><p>Both basic and applied research can be found in government institutions (including military groups, research and development agencies like the Defense Advanced Research Projects Agency [DARPA], and task-specific institutions such as the <a href=\"https:\/\/www.limswiki.org\/index.php\/National_Institutes_of_Health\" title=\"National Institutes of Health\" class=\"wiki-link\" data-key=\"e5c215c48e73ae58b0695dc2af951cd0\">National Institutes of Health<\/a> [NIH]), public and private non-profit groups, corporations, consortia, and contract research organizations.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"The_research_process\">The research process<\/span><\/h4>\n<p>The research process begins with a question. Any question will do, including \u201cwhy is the sky blue?\u201d We\u2019ll bypass Greek mythology<sup id=\"rdp-ebb-cite_ref-1\" class=\"reference\"><a href=\"#cite_note-1\">[a]<\/a><\/sup> by asking more questions and planning how to proceed to answer them. For example, \u201cIs the sky always blue?\u201d, \u201cWhen is\/isn\u2019t it?\u201d, and \u201cWhat other colors can it be?\u201d Once the process begins, it can include a number of steps, the choice and direction depending upon the nature of the research and the mindset of the researcher:\n<\/p>\n<ul><li>Observations: This includes basic note-taking with support material (text, photos, drawings, charts, and scanned material).
Research (e.g., as with basic astronomy, field biology) can be as simple as looking something up on Google or as complex as understanding how a virus works. Research is about asking questions and looking for answers, which often leads to more questions. It\u2019s a little like my granddaughter who always asks \u201cwhy?\u201d no matter how well I answer the previous question (or at least how well I think I did).<\/li>\n<li>Enhanced observations: This includes interacting with items under observation, as well as non-directed interactions, preliminary data gathering, and behavioral analysis.<\/li>\n<li>Experiments and information gathering: This includes organized experiments that are planned, directed, and purpose-driven, as well as data and information gathering.<\/li>\n<li>Team building: This includes the creation of teams or networks of people working on the same or similar projects.<\/li>\n<li>Analytics and reporting: This includes data and information analysis, data modeling (e.g., mathematical, computer algorithm, and simulation), information synthesis, and knowledge creation.<\/li>\n<li>Technology acquisition: This includes gaining access to public, commercial, remote, and other types of databases to assist the research.<\/li><\/ul>\n<p>Pinning down a \u201ctypical\u201d approach to research isn\u2019t possible because the routes people follow are as individual as the researchers and their area of work are. However, this is generally not the case with testing labs.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Testing_laboratories\">Testing laboratories<\/span><\/h2>\n<p>In addition to research labs, there are also testing or \"service\" laboratories. Service labs carry out specific test routines on samples and specimens; you may be familiar with them as <a href=\"https:\/\/www.limswiki.org\/index.php\/Quality_control\" title=\"Quality control\" class=\"wiki-link\" data-key=\"1e0e0c2eb3e45aff02f5d61799821f0f\">quality control<\/a> labs, <a href=\"https:\/\/www.limswiki.org\/index.php\/Clinical_laboratory\" title=\"Clinical laboratory\" class=\"wiki-link\" data-key=\"307bcdf1bdbcd1bb167cee435b7a5463\">clinical labs<\/a>, toxicology labs, <a href=\"https:\/\/www.limswiki.org\/index.php\/Forensic_laboratory\" class=\"mw-redirect wiki-link\" title=\"Forensic laboratory\" data-key=\"dabef2566b55c1f13b395543b47ae81e\">forensic science labs<\/a>, etc. They are called service labs because they support other organizations, including research organizations, and they have similar modes of operation and work organization, running different tests depending on their area of specialization.\n<\/p><p>Contract testing labs are another flavor of service laboratories, acting as independent labs that do testing for a fee. These labs can offer capabilities and expertise that their customer doesn\u2019t have, either because the equipment is specialized and not frequently needed or because the customer is looking for a second opinion on an analysis.\n<\/p><p>Regardless of the type of service lab, they all have one thing in common: the way they function. For a moment let\u2019s forget about science and think about something else. Take for example a company that does graphics printing as a service to graphic designers and marketing groups. 
The company could offer a variety of printing services:\n<\/p>\n<ul><li>business cards<\/li>\n<li>stationery (e.g., envelopes, letterhead, etc.)<\/li>\n<li>postcards<\/li>\n<li>brochures<\/li>\n<li>signs<\/li>\n<li>graphics for trade shows (including mounting of an image on backing, lightboxes, etc.)<\/li>\n<li>postal marketing services<\/li><\/ul>\n<p>In this scenario, customers can come into the shop and drop off work to be done or place orders online (the company website provides a good description of their services and products). One of their biggest concerns is <a href=\"https:\/\/www.limswiki.org\/index.php\/Workflow\" title=\"Workflow\" class=\"wiki-link\" data-key=\"92bd8748272e20d891008dcb8243e8a8\">workflow<\/a> management: what work is coming in, what is in progress, what is in quality control, and what is ready for delivery. Many activities may be associated with this workflow.\n<\/p>\n<ul><li>Order acceptance: Log the job into a log book, the go-to reference for work orders. Add the corresponding work order to a binder of work orders; work orders can be removed from the binders as needed (for example, when people are working on the job) and returned when completed. Comments are made on the work order and referenced to an individual\u2019s notebook for details. Work orders shouldn\u2019t be duplicated since people may not be aware of the duplicates and information may be lost. This does add some inefficiency to the process if a work order contains multiple components (e.g., brochures and trade show graphics); if someone needs to work on a task and someone else has the work order, they have to find it. Work orders contain the names of graphics files and their location. Then a check is made to ensure all the needed information is there, notifying people if something is missing. This includes checking to see if the graphics files are available, in the correct format, etc. The priority of the work is determined with respect to other work. Then the customer is notified of the work order status and the expected completion date.<\/li>\n<li>Scheduling: The work order is assigned to one or more individuals for completion.<\/li>\n<li>Performing the work: This is where the actual work is performed, including task coordination if an order has multiple components.<\/li>\n<li>Customer service: This includes answering customer questions about the work order and addressing inquiries on completion date.<\/li>\n<li>Draft review: This involves obtaining customer sign-off on a prototype stage if required, making adjustments if needed, and then proceeding to completion.<\/li>\n<li>Quality control: This is where projects are reviewed and approved for completion.<\/li>\n<li>Delivery: This involves shipping the material back to the customer or notifying the customer the order is ready for pick-up.<\/li>\n<li>Billing: After satisfaction with the completed work is acknowledged, the work order is billed to the customer.<\/li><\/ul>\n<p>When the shop has a large number of projects going on, such a manual, paper-based workflow is difficult and time-consuming to manage. Projects have to be scheduled so that they get done and don\u2019t interfere with other projects that might be on a tight deadline. And then there is inventory management, making sure you have the materials you need on hand when you need them. There is also the occasional rescheduling that occurs if equipment breaks down or someone is out sick. 
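<\/p>\n<p>One way to see why software helps here is to notice that each work order is effectively a small state machine moving through the stages above. The sketch below (Python; the statuses and fields are simplified for illustration and are not a real print shop or LIMS data model) shows how recording every status change gives you the tracking, history, and accountability that a paper system struggles to provide.\n<\/p>\n<pre>
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum, auto

class Status(Enum):
    LOGGED_IN = auto()
    SCHEDULED = auto()
    IN_PROGRESS = auto()
    QC_REVIEW = auto()
    READY = auto()
    BILLED = auto()

@dataclass
class WorkOrder:
    order_id: str
    customer: str
    status: Status = Status.LOGGED_IN
    history: list = field(default_factory=list)

    def advance(self, new_status, note=''):
        # Every transition is recorded, so the order carries its own audit trail.
        self.history.append((datetime.now(), self.status, new_status, note))
        self.status = new_status

# order = WorkOrder('WO-1042', 'Acme Marketing')
# order.advance(Status.SCHEDULED, 'assigned to staff member A')
<\/pre>\n<p>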
A simplified workflow based on the above is shown in Figure 2.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig2_Liscouski_AppInfoSciWork21.png\" class=\"image wiki-link\" data-key=\"e6a7aef6488ffa2faf96dfaaa770b627\"><img alt=\"Fig2 Liscouski AppInfoSciWork21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/14\/Fig2_Liscouski_AppInfoSciWork21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 2.<\/b> Simplified print shop workflow, with some details omitted for clarity<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Let's say our print shop has seven people working there. The owner manages the overall operation, and an administrator logs in work orders, notes the location of files that will be used on those orders, and does the final checkout of work, shipping, and billing. The remaining five people\u2014staff, although everyone is at the same organizational level\u2014take care of the actual production work; everyone is cross-trained, but some are more adept at some tasks than others.\n<\/p><p>Imagine you worked in this shop; how might your day go if you were one of the staff? The administrator will have prioritized the work by urgency, grouping similar work orders (or partial orders, if there is a request for multiple services) together. This is just a matter of efficiency: if you are using a particular piece of equipment and it has to be set up, calibrated, and cleaned when finished, you may as well make the most of that effort and run as many similar jobs as you can. Tracking down a work order is an issue if someone else is already working on part of it: there is only one copy, so that notes and comments don\u2019t get lost, and you have to find whoever has it. Each staff member has a notebook to keep track of work, any settings used on equipment, and comments about how the work progressed. These notebook entries are important and useful in case questions come up about a job, how it was run, and whether any issues were encountered. As one set of jobs is completed, you move on to the next set. Inventory has to be checked to make sure that the needed materials are on-hand or ordered; if something is missing, work has to be rescheduled. The workflow is a continual, organized mix of tasks, with people scheduling time on equipment as needed.\n<\/p><p>You can begin to appreciate how difficult the manual, paper-based workflow in a shop like that is to manage, particularly when it depends upon people communicating clearly. It is the same workflow as any service-oriented business, from a florist to a repair shop. What differs is the size of the organization, the complexity of the work, and the education needed to perform the required tasks.\n<\/p><p>Now let's get back to the service laboratory. The print shop workflow is much like the structural workflow of such a laboratory. In the end, it\u2019s the nature of the tasks; the complexity of equipment, instrumentation, and electronic systems used; and the education needed to carry out the work that sets the service laboratory apart from other service operations.
However, there is one other, critical aspect that sets it apart: most service labs have to meet federal or industry regulations (e.g., the <a href=\"https:\/\/www.limswiki.org\/index.php\/ISO_9000\" title=\"ISO 9000\" class=\"wiki-link\" data-key=\"53ace2d12e80a7d890ce881bc6fe244a\">ISO 9000<\/a> family of standards) for their operations. \n<\/p><p>As noted earlier, there are many different types of service laboratories. The basic workflow is the same (see Figure 3 for one perspective on the commonalities of research and service laboratories), but the nature of the testing separates one from another. A water testing lab uses different test procedures than a toxicology lab does, or a clinical lab. Those working in different types of labs have to learn how to run different tests, and they also have to learn about the materials they work with. After all, people's observations about the material tested will differ depending upon how much experience they have with different kinds of materials. \n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig3_Liscouski_AppInfoSciWork21.png\" class=\"image wiki-link\" data-key=\"ee73859dfa7a4ff773b8022433c7c3d9\"><img alt=\"Fig3 Liscouski AppInfoSciWork21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/6\/62\/Fig3_Liscouski_AppInfoSciWork21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 3.<\/b> This diagram represents one perspective on the relationship between laboratory types. This is a bit simplified, particularly on the roles of research labs. Large research facilities, or those in which waiting for test results impacts the progress of research work, may incorporate a \u201cservice lab\u201d function within their operations; the same workflow, just a merger of boxes. The downside of doing that is the loss of independent verification of test results, as people sometimes see what they want to see. This can be addressed by having critical and random samples analyzed independently.<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>While workflows vary between research and service labs, there is one consistent factor that cuts across both: record keeping.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"The_laboratory_notebook\">The laboratory notebook<\/span><\/h2>\n<p>The <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_notebook\" title=\"Laboratory notebook\" class=\"wiki-link\" data-key=\"be60c7be96aba8e9a84537fd8835fa54\">laboratory notebook<\/a> has been a fixture in scientific work for centuries. The laboratory notebook is essentially a diary and can contain text, drawings, pasted photos, illustrations, charts, and so on. Historically, at least until the mid-1970s, it was a paper document that has evolved as legal and regulatory considerations have developed. Figure 4 shows part of Dr. 
Alexander Graham Bell\u2019s notebook.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:AGBell_Notebook.jpg\" class=\"image wiki-link\" data-key=\"ba0c4bed9e7d80a0155a84a8d79159bf\"><img alt=\"AGBell Notebook.jpg\" src=\"https:\/\/upload.wikimedia.org\/wikipedia\/commons\/0\/0c\/AGBell_Notebook.jpg\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 4.<\/b> Pages 40-41 of Alexander Graham Bell Family Papers in the Library of Congress, Manuscript Division, Public Domain<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The format of today\u2019s paper notebooks has changed somewhat, and the process of using them has become more rigorous. Take for example Scientific Bindery Productions, a modern manufacturer of professional laboratory notebooks. The description for their duplicate lined notebook includes the following elements<sup id=\"rdp-ebb-cite_ref-SBPDuplicate_2-0\" class=\"reference\"><a href=\"#cite_note-SBPDuplicate-2\">[1]<\/a><\/sup>:\n<\/p>\n<ul><li>table of contents<\/li>\n<li>instructions page, for how to use the notebook and address patent protection<\/li>\n<li>headers and footers, with legally defensible language<\/li>\n<li>headers that include title, project number, and book number fields, as well as a \"work continued from page\" section<\/li>\n<li>footers that include signature, date, disclosed to, and understood by fields, as well as a \"work continued to page\" section<\/li><\/ul>\n<p>The details of some of these points are called out in Figure 5, courtesy of Dr. Raquel Cumeras.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig5_Liscouski_AppInfoSciWork21.png\" class=\"image wiki-link\" data-key=\"2cb81832fb7493dc7d899dce7df6a005\"><img alt=\"Fig5 Liscouski AppInfoSciWork21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/6\/65\/Fig5_Liscouski_AppInfoSciWork21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 5.<\/b> <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/science2knowledge.wordpress.com\/writing-the-laboratory-notebook\/\" target=\"_blank\">A lab notebook example<\/a>, courtesy of Dr. Raquel Cumeras, <i>Science 2 Knowledge<\/i> blog, 2019<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Over the years, several guidelines have been published about the use of laboratory notebooks. Examples include:\n<\/p>\n<ul><li>Good manufacturing practice (GMP) and good laboratory practice (GLP) recordkeeping, from David West, St.
Louis Community College, Center for Plant and Life Sciences<sup id=\"rdp-ebb-cite_ref-WestGMP_3-0\" class=\"reference\"><a href=\"#cite_note-WestGMP-3\">[2]<\/a><\/sup><\/li>\n<li>NIH scientific recordkeeping guidelines, from the National Institutes of Health<sup id=\"rdp-ebb-cite_ref-NIHGuidelines08_4-0\" class=\"reference\"><a href=\"#cite_note-NIHGuidelines08-4\">[3]<\/a><\/sup><\/li>\n<li>General laboratory notebook guidelines, from <i>Science<\/i> editor Elisabeth Pain<sup id=\"rdp-ebb-cite_ref-PainBow19_5-0\" class=\"reference\"><a href=\"#cite_note-PainBow19-5\">[4]<\/a><\/sup><\/li><\/ul>\n<p>A Google search of \"guidelines for maintaining laboratory notebooks\" or something similar will provide more examples, including those developed by leading universities.\n<\/p><p>At this point, you\u2019re probably wondering why we\u2019re spending so much time on this. The point: good record keeping is the foundation for documenting scientific work regardless of the media, be it paper or electronic. Yes, the laboratory notebook has an electronic equivalent: the <a href=\"https:\/\/www.limswiki.org\/index.php\/Electronic_laboratory_notebook\" title=\"Electronic laboratory notebook\" class=\"wiki-link\" data-key=\"a9fbbd5e0807980106763fab31f1e72f\">electronic laboratory notebook<\/a> (ELN). These ELNs and other <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_informatics\" title=\"Laboratory informatics\" class=\"wiki-link\" data-key=\"00edfa43edcde538a695f6d429280301\">laboratory informatics<\/a> systems have to support everything paper systems do or they will fail in ensuring the integrity of documented work.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Using_laboratory_records\">Using laboratory records<\/span><\/h2>\n<p>Laboratory records, whether in laboratory notebooks or some other format, can be acted upon in many ways. Laboratory personnel interact with them by:\n<\/p>\n<ul><li>recording observations, results, instrument data output, photos, and charts<\/li>\n<li>describing research processes, goals, and results<\/li>\n<li>ensuring the authenticity of laboratory work<\/li>\n<li>planning and collaborating on experiments<\/li>\n<li>extracting information for reporting<\/li>\n<li>backing up data<\/li>\n<li>querying data<\/li>\n<li>sharing data<\/li>\n<li>publishing data<\/li>\n<li>archiving and retrieving data<\/li>\n<li>securing data<\/li><\/ul>\n<p>Everything on that list can be done with paper records; however, those activities are easier, faster, and less error prone with electronic systems. Paper records aren\u2019t going away anytime soon, for example when needing to record comments and information that may not have been provided for in electronic systems. This is particularly true as a project team expands from one person to more people. However, the need to have shared access to information becomes a limiting factor in productivity when we rely on paper-based systems. Paper-based systems also depend upon the proximity of people working together, something that became problematic during the <a href=\"https:\/\/www.limswiki.org\/index.php\/COVID-19\" class=\"mw-redirect wiki-link\" title=\"COVID-19\" data-key=\"da9bd20c492b2a17074ad66c2fe25652\">COVID-19<\/a> pandemic. Social distancing requirements made sharing paper-based notebook pages more challenging, requiring scanning and emailing. 
This was perhaps feasible for small amounts of physical materials, but less so for large projects with significant paper-based records.\n<\/p><p>That brings up another important point concerning ownership: whose data is it? When people are handed a notebook, they are told \u201cthis is your notebook, a place to write down your work and observations, and you are responsible for it.\u201d Depending upon how employment or consulting contracts are written, the content that goes into the notebook belongs to whoever is paying for the work. When I worked in a lab, the notebook I used as mine was referenced by others as \u201cyour notebook\u201d (it even had my name on it) even though it wasn\u2019t mine but rather the company\u2019s property. Yet when it was filled, they took possession of it and archived it. This concept of ownership has become a stumbling block in some organizations when they decide to install an ELN or <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_information_management_system\" title=\"Laboratory information management system\" class=\"wiki-link\" data-key=\"8ff56a51d34c9b1806fcebdcde634d00\">laboratory information management system<\/a> (LIMS), particularly if there are people who have been working there for a long time and have ingrained behaviors. Those people become concerned that someone is going to see their work in an incomplete state before they\u2019ve reviewed and completed it. It\u2019s their work and they don\u2019t want anyone to look at it until it\u2019s done. While the true owners of the work have always had that right, they may not have exercised it, respecting people\u2019s privacy until the work is complete. If you\u2019re considering an informatics system, does it address that concern about ownership?\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Bringing_informatics_into_the_lab\">Bringing informatics into the lab<\/span><\/h2>\n<p>So far this guide has hinted at the implications of adding laboratory informatics systems into the laboratory, but now it's time to discuss it further. Deciding how and when to bring such systems into the lab depends on a number of factors.\n<\/p><p>1. What is the lab's budget? If an informatics implementation can't be properly funded, don\u2019t start that implementation until it can be. The cost of a laptop computer is trivial compared to the total cost of implementation.\n<\/p><p>2. Do we have in-house access to educated and experienced IT support? That staff should understand that laboratory operations are not just another PC- or Microsoft-dominated arena, but rather an environment which has needs for informatics solutions beyond the typical office solutions. For example, laboratory instruments need to be connected to computers, and the resulting data stores should ideally be integrated to make the data more actionable.\n<\/p><p>3. Are laboratory staff ready to use the technologies and take responsibility for the things that go with it? Staff must be trained in more than how to operate an instrument. Can they back up data? Do they understand the security and data privacy risks associated with handling the data?\n<\/p><p>4. Are organizational policies flexible enough to allow practical use of the technology while still keeping security and data privacy risks in mind? The organization should have some sort of network access both internally and externally. Remote access should be possible, particularly given the circumstances surrounding pandemics and the like. 
A balanced policy on taking an organizational laptop out of the laboratory should be in place. Policies on cameras should also be reasonable, allowing researchers to capture images of samples and specimens for their notebooks. If organizational policies are too restrictive, the technology's usefulness is largely undermined.\n<\/p><p>5. What are the lab's space constraints? The size of a lab and the experiments it must conduct can affect the choice of informatics tools.\n<\/p><p>6. What is the nature of the lab's operations? Is it a research lab, service lab, or a combination of both? If you are in a service lab situation, bringing in informatics support as early as possible is essential to your workflow and sanity. You want to minimize having to deal with two separate processes and procedures: the old way we did it (paper-based) and the informatics-supported way.\n<\/p><p>7. Is your lab\u2019s operation governed by external regulatory requirements? If it\u2019s going to be in the future, you may as well start as though it currently is. Note that everything should be validated, regardless of whether or not the lab is subject to regulations. Validation isn't done to satisfy regulators but rather to prove that a process works properly. If you don\u2019t have that proof, what\u2019s the point of using that process? Do you really want to trust what a process produces without proof that it works?\n<\/p><p>Most of the points above are easily understood, but let's go into further detail. Let's start by looking at a simple case of you and your project, where you are the sole contributor, without any need for regulatory oversight. Your primary need is to record your planning notes, observations, results, etc. There are tools within the world of computer software to help with this, most notably the word processor. If you have access to one of those, you probably also have access to spreadsheets and other applications that can make your efforts far easier than working with paper. If you search Google for \u201cword processors as lab notebooks\u201d you will find a number of references, including Dr. Martin Engel's guide to using Microsoft OneNote as an ELN<sup id=\"rdp-ebb-cite_ref-EngelBlog15_6-0\" class=\"reference\"><a href=\"#cite_note-EngelBlog15-6\">[5]<\/a><\/sup> and <a href=\"https:\/\/www.limswiki.org\/index.php\/Labforward_GmbH\" title=\"Labforward GmbH\" class=\"wiki-link\" data-key=\"8a9842138692008e4f5896ee023849ab\">Labforward's<\/a> 2020 ELN selection guide.<sup id=\"rdp-ebb-cite_ref-BungersTheElect20_7-0\" class=\"reference\"><a href=\"#cite_note-BungersTheElect20-7\">[6]<\/a><\/sup>\n<\/p><p>However, simply switching from paper to electronic doesn't mean you're done. There's more to consider, like developing backup policies, addressing witness review, connecting to instruments, and addressing the effects of team expansion, including expanding to more comprehensive purpose-built informatics solutions.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Backup_strategy\">Backup strategy<\/span><\/h3>\n<p>You have the electronic documentation tools and the skills to use them, but what else do you need? A backup strategy is imperative. Imagine a scenario where you are using a desktop computer, laptop, or tablet to do your work and it has the only copy of the document you\u2019ve been working on for weeks. You press the power button one morning and nothing happens.
However, you are not (completely) worried or panicked but rather largely calm because:\n<\/p>\n<ul><li>you have a local backup on removable media (e.g., flash drive, disk); several instances, in fact, backed up at least daily and containing everything on the system you were using (you may also have a separate backup of your project);<\/li>\n<li>you have a remote backup on your organization's servers (perhaps on a virtual machine);<\/li>\n<li>you have a <a href=\"https:\/\/www.limswiki.org\/index.php\/Cloud_computing\" title=\"Cloud computing\" class=\"wiki-link\" data-key=\"fcfe5882eaa018d920cedb88398b604f\">cloud-based<\/a> backup of at least your project files, and as many of the system files as the cloud storage permits (depending on bandwidth and cost), all secured with two-factor authentication; and<\/li>\n<li>depending on the operating system you are using, you may have built-in backup-and-recover abilities, e.g., as with Mac OS X's \"Time Machine\" functionality.<\/li><\/ul>\n<p>You've done these things because you've asked yourself \"how much is my work worth to me and my organization?\"\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Witness_review_or_sign-off\">Witness review or sign-off<\/span><\/h3>\n<p>The need for a witness review or sign-off can occur for several reasons, including a potential patent filing or proof that the work was done and data recorded properly, on a certain date, in case it is challenged. One of the ramifications is that you have to identify a second person to be that witness (though this would also be the case if you were using a paper notebook).\n<\/p><p>A second issue is that you would have to format the pages of your word processor document (using templates) so as to emulate a signature block and page numbering structure that meets the requirements noted earlier for paper notebooks. You also have to provide a means for either physical (printouts) or <a href=\"https:\/\/www.limswiki.org\/index.php\/Electronic_signature\" title=\"Electronic signature\" class=\"wiki-link\" data-key=\"dd6997760552a80c6babaf1174c092f4\">electronic signatures<\/a> (e.g., Adobe Acrobat and other applications provide useful templates that could serve as a lab notebook in this case, with content cut-and-pasted from a Word file). You would also have to ensure that once dated and signed, no edits could be made to that material. If you choose the printed route, then you\u2019re back to paper management. One possibility for dealing with that is to scan the signed pages and upload them to a secure server using the file server's date and time stamp system, or to a document management system, to demonstrate that documents haven\u2019t been changed.\n<\/p>
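<p>A common way to support that last point is to record a cryptographic checksum of each scanned page when it is filed; if the stored file is ever altered, its checksum will no longer match the recorded value. The following is a minimal sketch in Python (the file name is hypothetical, and a real document management system would handle this for you):\n<\/p>\n<pre>
import hashlib
from pathlib import Path

def fingerprint(path):
    # SHA-256 digest of the file, read in chunks to handle large scans.
    digest = hashlib.sha256()
    with path.open('rb') as fh:
        for chunk in iter(lambda: fh.read(65536), b''):
            digest.update(chunk)
    return digest.hexdigest()

# Store this value (with a timestamp) when the page is filed; recompute
# and compare it later to show that the file has not changed.
print(fingerprint(Path('signed_page_042.pdf')))
<\/pre>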
<p>There is another possibility for time-stamping printed material that is scanned with a high-quality scanner. The concept of the machine identification code, or \"tracking dots,\" allows a set of barely perceptible, forensically traceable colored dots to be printed onto pages, with the arrangement of the dots providing information that identifies the printer (by serial number), as well as the date and time a page was printed. Recent research has demonstrated ways to decode these dots for analysis, particularly as part of printer forensics.<sup id=\"rdp-ebb-cite_ref-RichterForensic18_8-0\" class=\"reference\"><a href=\"#cite_note-RichterForensic18-8\">[7]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-BaraniukWhy17_9-0\" class=\"reference\"><a href=\"#cite_note-BaraniukWhy17-9\">[8]<\/a><\/sup>\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Instrument-computer_integration\">Instrument-computer integration<\/span><\/h3>\n<p>Using a computer to host a laboratory notebook raises another concern: how can lab instruments or separate instrument data systems connect to the computer to automatically transmit data and information to it? Controlled data collection will require software beyond a simple word processor, though many of today's ELN vendors provide integration components with their solutions. This connection, possibly using a spreadsheet import, will greatly improve laboratory productivity and the reliability of data transfers.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Expanding_the_research_team\">Expanding the research team<\/span><\/h3>\n<p>Increasing the size of your project by adding more people will have a significant impact on the complexity of using a more electronic laboratory workflow. However, while the basic issues of sharing, collaboration, etc. are no different than they would be if paper-based notebooks were in use, the electronic solutions to handling those issues are much more useful. At the core of an expanded laboratory operation is getting personnel to agree on how they are going to communicate data and information, how they are going to collaborate using that data and information, and how they will make that agreement sustainable. There is going to be some small allowance for individual approaches to laboratory activities, but agreements on critical matters such as data organization and management need to be strictly adhered to. As such, there are several issues to be mindful of.\n<\/p><p>1. How will data and information be organized and compiled? If multiple people are contributing to a project, the results of their work need to be organized in one place so that the status of the work can be ascertained, missing data can be identified, and material is easier to work with. As a project's direction evolves, the formatting and content may change, potentially requiring a complete overhaul of the data structure. That is just a consequence of the way research programs progress.\n<\/p><p>2. How will data and information be updated, versioned, and retained? If your lab works with paper files, this isn't so difficult. The lab may have one printed file detailing experimental methods, and when that method file gets updated, the old printed document is removed and the new one added. In labs where each person has their own individual method file, the process would be repeated for each person. As such, there's no confusion as to the current version; methods would have revision histories so that you would know why they were changed. In cases where those methods are kept in electronic files, more attention has to be paid to ensuring that old method files are archived, and that everyone is working with the same version. This means clear communications procedures are essential. Additionally, the name of the file should carry the current revision information; don\u2019t rely on the computer's creation or modification dates for files.\n<\/p><p>3. How is access to common files controlled?
<p>3. How is access to common files controlled? Having people edit or add to common files without careful lockout controls is dangerous. If two people open the same document at the same time and make changes, the one who saves the file last can overwrite any changes the other individual has made. You need a document management system that prevents this: if a file is checked out by one person, others may read it but not write to it until the first person is done. Old versions of files should be archived but not deleted so that a detailed revision history is maintained.\n<\/p><p>If an organization grows too large for consensus-based cooperation among people using office suite products, it will be time to transition to a more purpose-built solution like a multiuser ELN that is capable of allowing multiple people to contribute to a project simultaneously, providing a common organization for data and information, and allowing users to either import data from an instrument or have a direct connection between the ELN and the instrument data system (or through the use of a <a href=\"https:\/\/www.limswiki.org\/index.php\/Scientific_data_management_system\" title=\"Scientific data management system\" class=\"wiki-link\" data-key=\"9f38d322b743f578fef487b6f3d7c253\">scientific data management system<\/a> [SDMS]). But how large is too large? It may be the point when personnel become frustrated with waiting for access to documents, or when processes just don\u2019t seem to move as smoothly as they used to. While there is a significant cost factor in implementing an ELN, it should be done sooner rather than later so your lab can benefit from the organizational structure an ELN provides and reduce the effort it will take to import data and information into that structure. Ideally, you should start your work with an ELN once the initial project startup work is taken care of.\n<\/p><p>ELNs afford a number of additional benefits, including but not limited to access to:\n<\/p>\n<ul><li>vendor databases for ordering supplies and managing inventory,<\/li>\n<li>chemical structure databases,<\/li>\n<li>reaction databases, and<\/li>\n<li>external libraries.<\/li><\/ul>\n<p>ELNs are appropriate for both basic research and applied research, including work in testing labs where method development and special project work is being done. The prior bulleted list might give you the impression that ELNs are oriented toward chemistry; however, that is a reflection of my experience, not the industry. The ELN is used in many other industries. Several examples of ELNs past and present, covering many industries, are listed below.\n<\/p>\n<ul><li>LabTech Notebook, released in 1986 and discontinued in 2004, was designed to provide communications between computers and laboratory instruments over RS-232 serial connections, and it was applicable to a variety of industries and disciplines.<\/li>\n<li>SmartLab from Velquest, released in the early 2000s, was the first commercial product to carry the \"electronic laboratory notebook\" identifier. It was designed as a platform to encode and guide people conducting lab procedures in GLP\/GMP\/GALP environments.
Now owned by <a href=\"https:\/\/www.limswiki.org\/index.php\/Dassault_Syst%C3%A8mes_SA\" title=\"Dassault Syst\u00e8mes SA\" class=\"wiki-link\" data-key=\"1be69bd73e35bc3db0c3229284bf9416\">Dassault Syst\u00e8mes<\/a> and rebranded as a <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_execution_system\" title=\"Laboratory execution system\" class=\"wiki-link\" data-key=\"774bdcab852f4d09565f0486bfafc26a\">laboratory execution system<\/a> (LES) as part of the BIOVIA product line, its conceptual functionality has since been incorporated into LIMS and ELN products that fit the more current expectation of an ELN.<\/li>\n<li>Wolfram Research has a series of notebook products geared toward mathematics.<\/li><\/ul>\n<p>There is a growing list of ELN systems in a variety of disciplines. Given that the scope of activities an ELN can cover is fairly broad, you have to be careful to define your needs before uttering the words \u201cwe\u2019re going to implement an ELN.\u201d We\u2019ll address that point later.\n<\/p><p>The organization of an ELN application is flexible. The layout is user-defined and can contain queryable text, figures, graphics, and fields for instrument-generated data. Because the ELN is inherently flexible and there usually isn\u2019t any quick-start structure, you have to know what you are looking for in a product and how you want to use it. This requires quality due-diligence research. \u201cIf only I had known\u201d are among the saddest words heard after product selection.\n<\/p><p>Some organizations have chosen to develop their own ELNs. That is something that should be undertaken with fear and trepidation. The original justification for this route is typically a belief that you have special needs that can't be met otherwise. Another concern may be that you don\u2019t want to tie yourself to a vendor that may go out of business. Those points are more a matter of product selection criteria than a justification for a software development project. If you do choose to go with an internal or even a contracted development route, you will potentially have to live with a product that has just one customer (unless the contractor decides to productize it, which is an entirely different discussion). You will also be saddled with the support of that product for the rest of its functional life. And that doesn\u2019t even get into the \"who\" of software product management and development, the \u201cwhen will it be done,\u201d and the inevitable scope creep (i.e., the change and expansion of development requirements). In the initial stages of research projects, the organization of data is subject to change as needs change. This is often driven by \u201cunstructured\u201d data and the need to manage it. (Note that it isn\u2019t that the data is truly unstructured; it\u2019s just in a variable structure until the project finds its direction.) This can lead to frustration in setting up a commercial ELN system, let alone designing one from scratch. The next section will look at an organization that is more highly structured.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Meeting_the_needs_of_the_testing_laboratory\">Meeting the needs of the testing laboratory<\/span><\/h2>\n<p>The earlier print shop description will give you an idea of the workflow of a testing or service lab, for example a quality control (QC) lab. In a manually managed QC or research laboratory, operations can become overwhelming quickly.
Imagine a study of 40 samples<sup id=\"rdp-ebb-cite_ref-10\" class=\"reference\"><a href=\"#cite_note-10\">[b]<\/a><\/sup> with multiple tests per sample, each of which has to be individually cataloged. Actually, I don\u2019t have to imagine it; it happened frequently in our lab, which supported several research groups. When one of these large sample sets shows up\u2014and sometimes more than one\u2014you don't talk to the person doing the sample logging for fear of disrupting their required concentration. You can get them coffee, but no talking.\n<\/p><p>With a LIMS, this isn't so much an issue. You can log in one sample or a hundred, simply by telling the system the starting sample ID, how many samples there are, and what tests should be scheduled. The LIMS then organizes everything and prints the labels for each sample. With some systems, the requestor of a test can even log samples in from a web portal, and the central LIMS automatically updates when the samples actually arrive in the lab.\n<\/p><p>A LIMS makes life easier for laboratories in a number of other ways as well. Want to find a list of samples that are pending a particular test? A quality LIMS can readily display that information, including the sample numbers, priorities, and current locations, with no need to manually check work request sheets. Does a third party want to find out the status of one or more of their in-process samples? Role-based access management means a third party can receive limited access to view that status, without seeing anyone else's sensitive data. What about verifying and approving results? The LIMS can provide some level of results checking, with final verification and approval by lab management. When approved, the reports for each set of requests can be printed, emailed, or stored for portal access. And what about integrating data and systems? The LIMS can be connected to an instrument data system (IDS). Depending on the sophistication of that system, the LIMS can generate a worklist of samples that need to be processed by that device, with the list downloaded to the IDS. When the work is completed, the results can be uploaded directly to the LIMS.
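<\/p>\n<p>What such a worklist exchange looks like varies by vendor; the Python sketch below is a deliberately minimal illustration in which the LIMS side writes a CSV worklist for the IDS and later reads a results file back. The file names, field layout, and instrument and test codes are invented for the example.\n<\/p>\n<pre>
import csv

def write_worklist(path, samples):
    # Worklist the LIMS hands to the instrument data system (IDS).
    with open(path, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(['sample_id', 'test_code', 'priority'])
        writer.writerows(samples)

def read_results(path):
    # Completed results coming back from the IDS for upload into the LIMS.
    with open(path, newline='') as f:
        return [(row['sample_id'], row['test_code'], float(row['result']))
                for row in csv.DictReader(f)]

# Schedule three samples for a made-up assay on a made-up instrument 'GC-07'.
write_worklist('gc07_worklist.csv',
               [('S2021-%04d' % n, 'ASSAY-12', 'routine') for n in (1, 2, 3)])

# Simulate the IDS returning a result, then parse it as the LIMS would.
with open('gc07_results.csv', 'w', newline='') as f:
    csv.writer(f).writerows([['sample_id', 'test_code', 'result'],
                             ['S2021-0001', 'ASSAY-12', '99.2']])
print(read_results('gc07_results.csv'))
<\/pre>\n<p>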
This type of system interaction is one of the places where significant productivity gains can be had.\n<\/p><p>The key attributes of a LIMS are shown in Figure 6.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig6_Liscouski_AppInfoSciWork21.png\" class=\"image wiki-link\" data-key=\"d409444eac6274c0b9a34c2bd8e1a551\"><img alt=\"Fig6 Liscouski AppInfoSciWork21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/d\/d3\/Fig6_Liscouski_AppInfoSciWork21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 6.<\/b> Slide detailing the core components of a LIMS, from <i><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:A_Guide_for_Management:_Successfully_Applying_Laboratory_Systems_to_Your_Organization%27s_Work\" title=\"LII:A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work\" class=\"wiki-link\" data-key=\"00b300565027cb0518bcb0410d6df360\">A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work<\/a><\/i>, Part 1: Laboratory Informatics Technologies, a webinar series by Joe Liscouski<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>These are just the highlights of what a LIMS can do. Why is a LIMS more effective out-of-the-box than an ELN? The operational behavior of testing labs is essentially the same across industries and disciplines, and as a result LIMS vendors have a more predictable and stable base of customers and user requirements to work against than ELN vendors do. It\u2019s a bit like comparing home building contractors. Some have a basic structural architecture that they can modify to meet the needs of a broad market. Others do completely custom homes from the basement up, each of which is unique. The economies of scale and a broader, more predictable customer base show up in products that are easier to work with and adapt, both for the primary vendor and those providing add-ons. Those LIMS add-ons include specialized facilities for different industries, including enology and viticulture analysis, water treatment, mineral analysis, cannabis testing, and clinical diagnostics (though the system used in the clinical setting is typically referred to as a <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_information_system\" title=\"Laboratory information system\" class=\"wiki-link\" data-key=\"37add65b4d1c678b382a7d4817a9cf64\">laboratory information system<\/a> or LIS).
Regardless, as is the case with ELNs, you want to install a LIMS as soon as you are able, to avoid the issues that arise when pre-implementation and post-implementation workflows differ.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Other_laboratory_informatics_systems\">Other laboratory informatics systems<\/span><\/h2>\n<p>We've mentioned a few other systems in passing, but here we'll provide a brief overview of them.\n<\/p>\n<h3><span id=\"rdp-ebb-Scientific_data_management_system_(SDMS)\"><\/span><span class=\"mw-headline\" id=\"Scientific_data_management_system_.28SDMS.29\">Scientific data management system (SDMS)<\/span><\/h3>\n<p>The SDMS helps laboratories better solve the problem of dealing with the large number of data files being generated, basically by acting as a giant file cabinet that LIMS, ELN, and IDS can connect to. For example, you may not want to put large datasets, spectra, etc. in a LIMS or ELN, but you still have to reference those large files within other internal LIMS and ELN files. This is where the SDMS steps in, storing those large files in the \"file cabinet\" while maintaining references and metadata that are usable by other informatics systems.\n<\/p>\n<h3><span id=\"rdp-ebb-Laboratory_execution_system_(LES)\"><\/span><span class=\"mw-headline\" id=\"Laboratory_execution_system_.28LES.29\">Laboratory execution system (LES)<\/span><\/h3>\n<p>Lesser known are systems called LES, designed to ensure that testing protocols are executed properly and that all necessary data is properly recorded. The initial incarnation arose from the previously discussed SmartLab<sup id=\"rdp-ebb-cite_ref-11\" class=\"reference\"><a href=\"#cite_note-11\">[c]<\/a><\/sup>, a stand-alone product that would guide an analyst through the steps of an analysis, recording each detail of the work in a file that regulatory agencies could inspect to ensure that work was done properly. It found a ready market in any lab that needed to meet GLP\/GMP requirements. The functionality needed to create the same capability can be found in some LIMS and ELNs, but programming is required.\n<\/p>\n<h3><span id=\"rdp-ebb-Instrument_data_system_(IDS)\"><\/span><span class=\"mw-headline\" id=\"Instrument_data_system_.28IDS.29\">Instrument data system (IDS)<\/span><\/h3>\n<p>Any laboratory instrument popular in the marketplace has had a computer either attached to it or built into it as a package; that\u2019s what an IDS is. It provides automated control over an instrument, collecting and analyzing the data produced by the measuring components. Depending on the sophistication of the vendor, and the demands of the marketplace, the connections between the IDS and another laboratory informatics solution may range from user-programmable interfaces (via a network, USB, serial, or digital I\/O connection) to built-in communications systems that are almost plug-and-play. The latter are most commonly found in the clinical chemistry market, where a great deal of attention has been paid to integration and systems communication via <a href=\"https:\/\/www.limswiki.org\/index.php\/Health_Level_7\" title=\"Health Level 7\" class=\"wiki-link\" data-key=\"e0bf845fb58d2bae05a846b47629e86f\">Health Level 7<\/a> (HL7) and related protocols.
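<\/p>\n<p>To give a flavor of the message-based style these protocols use, the Python sketch below parses a minimal, fabricated HL7 v2-style result (ORU) message with plain string handling. The segment contents are invented for the example; real integrations rely on vetted HL7 libraries and site-specific message profiles.\n<\/p>\n<pre>
# A fabricated HL7 v2.x result message: segments separated by carriage
# returns, fields by '|'. All identifiers and values are illustrative.
message = '\r'.join([
    'MSH|^~\\&|ANALYZER7|LAB|LIS|HOSP|202105130830||ORU^R01|MSG0001|P|2.5.1',
    'OBR|1|ORD123||GLU^Glucose',
    'OBX|1|NM|GLU^Glucose||5.4|mmol/L|3.9-6.1|N|||F',
])

for segment in message.split('\r'):
    fields = segment.split('|')
    if fields[0] == 'OBX':              # observation/result segment
        test = fields[3].split('^')[1]  # human-readable test name
        value, units, ref_range = fields[5], fields[6], fields[7]
        print('%s: %s %s (reference %s)' % (test, value, units, ref_range))
<\/pre>\n<p>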
(The details, however, are beyond the scope of this document.)\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Planning_for_laboratory_informatics\">Planning for laboratory informatics<\/span><\/h2>\n<p>There are two key requirements to the successful implementation of informatics products: education and planning. \n<\/p>\n<h3><span class=\"mw-headline\" id=\"Education\">Education<\/span><\/h3>\n<p>As far as education goes, the webinar series noted in Figure 6 is a good tool, as are documents provided by technical standards body <a href=\"https:\/\/www.limswiki.org\/index.php\/ASTM_International\" title=\"ASTM International\" class=\"wiki-link\" data-key=\"dfeafbac63fa786e77b472c3f86d07ed\">ASTM International<\/a>. ASTM documents that may be of value to you include:\n<\/p>\n<ul><li><i>ASTM E1578-18 Standard Guide for Laboratory Informatics<\/i><sup id=\"rdp-ebb-cite_ref-ASTME1578_12-0\" class=\"reference\"><a href=\"#cite_note-ASTME1578-12\">[9]<\/a><\/sup><\/li>\n<li><i>ASTM E1578-13 Standard Guide for Laboratory Informatics<\/i><sup id=\"rdp-ebb-cite_ref-ASTME1578-13_13-0\" class=\"reference\"><a href=\"#cite_note-ASTME1578-13-13\">[10]<\/a><\/sup> (contains some information not found in 2018 version)<\/li>\n<li><i>ASTM E1947-98(2014) Standard Specification for Analytical Data Interchange Protocol for Chromatographic Data<\/i><sup id=\"rdp-ebb-cite_ref-ASTME1947_14-0\" class=\"reference\"><a href=\"#cite_note-ASTME1947-14\">[11]<\/a><\/sup><\/li>\n<li><i>ASTM D8244-21 Standard Guide for Analytical Laboratory Operations Supporting the Cannabis\/Hemp Industry<\/i><sup id=\"rdp-ebb-cite_ref-ASTMD8244_15-0\" class=\"reference\"><a href=\"#cite_note-ASTMD8244-15\">[12]<\/a><\/sup><\/li>\n<li>If you search the ASTM website for \"LIMS\" or \"informatics,\" you'll be surprised by the amount of other material that shows up.<\/li><\/ul>\n<p>Other sources of educational material include:\n<\/p>\n<ul><li>The Tutorial section of LIMSforum.com<sup id=\"rdp-ebb-cite_ref-LIMSforumTutorial_16-0\" class=\"reference\"><a href=\"#cite_note-LIMSforumTutorial-16\">[13]<\/a><\/sup><\/li>\n<li><i><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Information_Systems_Project_Management:_A_Guidebook_for_International_Implementations\" title=\"LII:Laboratory Information Systems Project Management: A Guidebook for International Implementations\" class=\"wiki-link\" data-key=\"de9b2109f3975a703634052af790b2d1\">Laboratory Information Systems Project Management: A Guidebook for International Implementations<\/a><\/i>, by the APHL<\/li>\n<li><i>Laboratory Informatics Guide 2021<\/i>, by <i>Scientific Computing World<\/i><sup id=\"rdp-ebb-cite_ref-SCWLabInfo21_17-0\" class=\"reference\"><a href=\"#cite_note-SCWLabInfo21-17\">[14]<\/a><\/sup><\/li>\n<li><i>Lab Manager<\/i> magazine, published by LabX Media Group<sup id=\"rdp-ebb-cite_ref-LabXLabManager_18-0\" class=\"reference\"><a href=\"#cite_note-LabXLabManager-18\">[15]<\/a><\/sup><\/li>\n<li><i>Computerized Systems in the Modern Laboratory: A Practical Guide<\/i>, by Joe Liscouski<sup id=\"rdp-ebb-cite_ref-LiscouskiComput15_19-0\" class=\"reference\"><a href=\"#cite_note-LiscouskiComput15-19\">[16]<\/a><\/sup><\/li>\n<li>Any vendor with an informatics product will send you a nearly endless stream of material.<\/li><\/ul>\n<p>Ultimately, however, the responsibility for informatics implementation projects doesn't exist solely on the shoulders of laboratory personnel. 
Everyone connected with a laboratory implementing informatics solutions should have some level of awareness regarding laboratory informatics, including upper management. However, the level of knowledge required may vary slightly depending on the role. For example, laboratory personnel should be fully educated on common laboratory technologies at a minimum. They should also understand why an informatics project is being considered, what the scope of the implementation will be, and what their role will be in the implementation. Upper management should remember to ask laboratory personnel for input on the project, including the topic of product requirements, in order for personnel to feel like they are part of the process, not simply observers or \"something that's happening to them.\" Finally, laboratory personnel should understand how their jobs are going to change once the implementation is complete. This needs to be addressed very early in project discussions; it is a matter of change, and change scares people, particularly if it affects their job and income. This is not a \u201cwe\u2019ll deal with that later\u201d point. Don\u2019t start the discussion until you figure this out. Things may change, but people want security.\n<\/p><p>Of course, any information technology (IT) personnel will also be involved, requiring significant knowledge about not only networking and software installations but also systems integration. IT personnel need to understand their role in the implementation project, which can include support, project management, evaluating the vendor support capabilities, and more. They should also be fully aware of and understand the technologies the organization is considering for implementation, and they will be a vital part of the project planning and vendor selection process. IT personnel also will be interested in questions about any <a href=\"https:\/\/www.limswiki.org\/index.php\/Enterprise_resource_planning\" title=\"Enterprise resource planning\" class=\"wiki-link\" data-key=\"07be791b94a208f794e38224f0c0950b\">enterprise resource planning<\/a> (ERP) aspects, which may raise issues of \"build or buy.\" The organization needs to be prepared to both address these concerns and gain IT personnel as strong supporters of the project.\n<\/p><p>Finally, upper management\u2014those who are going to approve the project and provide funding\u2014need to be educated enough to understand the benefits and risks of the proposed implementation, including the likely time scale for the work. Upper management will need to be active in the project at a minimum of two critical junctures, plus at specific milestones as needed. The first time upper management will need informed participation will be during initial project planning. They will help the organization lay out the issues that need to be addressed, the scope of options that will be investigated, and how the organization is going to proceed. They may pose questions such as \u201ccan we use an existing system to solve the problem,\u201d particularly if there already has been an investment in an ERP solution such as SAP. Such technology questions will also be of interest to IT personnel since they have an investment in those systems. The second time upper management undoubtedly needs to be involved is when the actual project proposal is finished and is ready to be pitched.
They will need to ask questions about the reasoning behind the choices made, why current systems are insufficient, what kind of investment the project will require, how the implementation will be scheduled, and how the roll-out would be planned. Understanding the answers to these and other questions will be difficult if upper management doesn\u2019t understand the technology, the issues, the options, and the benefits of the proposed laboratory informatics project.\n<\/p><p>If the world of informatics is new to any or all of these stakeholders, the organization must consider getting outside support (not from a vendor) to guide it through the process of evaluating the lab's operations, scoping out an overall implementation program, dividing it into milestones, setting priorities, and developing user requirements.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Planning\">Planning<\/span><\/h3>\n<p>The whole point of project planning is to get your organization from the starting point to the end goal, preferably using the most effective path. So, where is the organization going, and why? Everything pretty much boils down to those two questions. Once those questions are answered, more will arise concerning how the organization is actually going to get to the project's end goal.\n<\/p><p>Initial planning will look at the lab from a standpoint of the minimum number of computers required to do the work. In some cases, perhaps just those computational systems built into instruments will be sufficient. Whatever the decision, that\u2019s the planning baseline. Then consider what kind of lab it is: a research lab, a service lab, or a blend? That helps focus attention on potential solutions; however, be sure not to completely eliminate other options just yet. For example, if your lab is a QC lab, it's probably a service lab in need of a LIMS, but even then there are still options to evaluate.\n<\/p><p>From there, the organization must also think in terms of where that baseline lab is going to be in five years; it\u2019s an arbitrary number but a starting point. Why that far out? It will take a year or two to plan, implement, validate, and educate people to work with the new systems and for the lab to change its behavior and settle down to a new mode of working. The organization should also consider what other changes are likely to take place. What new equipment, procedures, and management changes can be anticipated? Ideally, any implemented informatics system will be in place and stable for a few years before people start asking for significant changes; minor ones will likely happen early on. Any hint of \u201cwe didn\u2019t plan for that\u201d will be viewed as poor leadership.
Figure 7 shows some of the key points you need to look at during planning.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig7_Liscouski_AppInfoSciWork21.png\" class=\"image wiki-link\" data-key=\"3af5b365b0632c3c57c0acfe6b912fd8\"><img alt=\"Fig7 Liscouski AppInfoSciWork21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/18\/Fig7_Liscouski_AppInfoSciWork21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 7.<\/b> Slide detailing considerations of laboratory informatics implementation in the lab, from <i><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:A_Guide_for_Management:_Successfully_Applying_Laboratory_Systems_to_Your_Organization%27s_Work\" title=\"LII:A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work\" class=\"wiki-link\" data-key=\"00b300565027cb0518bcb0410d6df360\">A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work<\/a><\/i>, Part 4: LIMS\/LIS, ELN, SDMS, IT & Education, a webinar series by Joe Liscouski<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Note also that the considerations shown in Figure 7 apply to, and can be repeated for, each lab in the organization. The project planning team also needs to consider how current laboratory workflow impacts other labs. Are there synergistic effects that can be gained by broadening the scope of what the lab is doing?\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Why_projects_fail\">Why projects fail<\/span><\/h3>\n<p>We shouldn't finish the planning section of this guide without discussing why laboratory informatics implementation projects can and do fail. These types of projects are large software projects, and delays, cost over-runs, and failures are common (but hopefully not for your organization's project). Just Google \"why IT projects fail\" and read through some anecdotes. The following are some common reasons informatics implementation projects fail.\n<\/p>\n<ul><li>Insufficient budgeting: Projects can run short of funding, requiring an awkward meeting with management asking for additional funding (without a project change to account for it), inevitably showing a lack of foresight and planning. Build in a large contingency fund because the unexpected happens. If you\u2019d like some education on the topic, watch a few episodes of any home upgrade project on HGTV, in particular <i>Love It or List It<\/i>.<sup id=\"rdp-ebb-cite_ref-20\" class=\"reference\"><a href=\"#cite_note-20\">[d]<\/a><\/sup><\/li>\n<li>Insufficient management support: If there isn\u2019t sufficient communication with management, problems may arise. For example, project delays are a fact of life. Keep clear communications with upper management so that they, and everyone else on the project, know what is going on. Miscommunication, or a lack of communication, about other aspects of the project can doom it.<\/li>\n<li>Poor understanding of the scope and nature of the project: This is an educational issue, and a lack of education for all involved parties is almost a guarantee of failure.
If you need help, bring in an independent consultant who can lend confidence to the project and its management.<\/li>\n<li>Lack of quality or attention to detail: \u201cThere is always time to do it over, but not enough to do it right\u201d is a common complaint in engineering projects. If you hear it on your organization's project, you are in trouble. Basically, the complaint is that project members are cutting corners and not doing things properly, in a way that is not supportable. This never ends well; the problems may not surface quickly, but in the long run they will.<\/li>\n<li>Poor or unrealistic timelines: You may as well face reality from the start: an aggressive timeline just leads to problems (see the bullet above). Timelines expand, but they almost never get shorter. If the project team is always rushing, something will most certainly get missed, causing problems later down the road.<\/li>\n<li>Poor project management: Well-managed projects are obvious, and so are poorly managed ones. Just watch the people working, their demeanor, and their attitude about the work; it will tell you all you need to know. Well-managed projects may not always run smoothly, but they make consistent progress. Poorly managed projects force you to make excuses.<\/li><\/ul>\n<h2><span class=\"mw-headline\" id=\"Closing\">Closing<\/span><\/h2>\n<p>This guide is like the directory signs in shopping malls: they tell you where you are and which shops and restaurants in the facility to consider. Once you figure out what you are looking for, you can find your way there. Hopefully, in reading this you\u2019ve formed an idea of what you want to look at and the path to finding it.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Footnotes\">Footnotes<\/span><\/h2>\n<div class=\"reflist\" style=\"list-style-type: lower-alpha;\">\n<div class=\"mw-references-wrap\"><ol class=\"references\">\n<li id=\"cite_note-1\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-1\">\u2191<\/a><\/span> <span class=\"reference-text\">According to Greek mythology (from the <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/myths.e2bn.org\/mythsandlegends\/userstory14957-why-is-the-sky-blue.html\" target=\"_blank\">E2BN Myths page<\/a>): \"Long long ago, when Queen Athena (Zeus's daughter) was born, Zeus blessed her with two boons for when she came of age. After almost 15 years, Athena was told to think up two things to ask for ... 1) To have a city in Greece named after her (Athens) [and] 2) To have all the people of the world see her face every day of the year (what you are seeing are only her eyes). Thus, the sky is blue, just like the color of Athena's eyes...\"<\/span>\n<\/li>\n<li id=\"cite_note-10\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-10\">\u2191<\/a><\/span> <span class=\"reference-text\">In some life science drug screening studies, the number can be far higher, which is where robotics and automation become important.<\/span>\n<\/li>\n<li id=\"cite_note-11\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-11\">\u2191<\/a><\/span> <span class=\"reference-text\">A name that has been used by conference organizers after the product was sold and renamed.
If you do a Google search on \u201cSmartLab,\u201d you may be surprised at what turns up.<\/span>\n<\/li>\n<li id=\"cite_note-20\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-20\">\u2191<\/a><\/span> <span class=\"reference-text\">Some season highlights <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.hgtv.com\/shows\/love-it-or-list-it\" target=\"_blank\">can be found<\/a> on the HGTV website.<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<h2><span class=\"mw-headline\" id=\"About_the_author\">About the author<\/span><\/h2>\n<p>Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation\/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"References\">References<\/span><\/h2>\n<div class=\"reflist references-column-width\" style=\"-moz-column-width: 30em; -webkit-column-width: 30em; column-width: 30em; list-style-type: decimal;\">\n<div class=\"mw-references-wrap mw-references-columns\"><ol class=\"references\">\n<li id=\"cite_note-SBPDuplicate-2\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-SBPDuplicate_2-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/scientificbindery.com\/products\/duplicate-lined-notebook\/\" target=\"_blank\">\"Duplicate Lined Notebook\"<\/a>. Scientific Bindery Productions, Inc<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/scientificbindery.com\/products\/duplicate-lined-notebook\/\" target=\"_blank\">https:\/\/scientificbindery.com\/products\/duplicate-lined-notebook\/<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 12 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Duplicate+Lined+Notebook&rft.atitle=&rft.pub=Scientific+Bindery+Productions%2C+Inc&rft_id=https%3A%2F%2Fscientificbindery.com%2Fproducts%2Fduplicate-lined-notebook%2F&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-WestGMP-3\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-WestGMP_3-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">West, D. (18 November 2021). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/users.stlcc.edu\/departments\/fvbio\/Lab_Practices_GLP_STLCC.htm\" target=\"_blank\">\"GMP\/GLP Recordkeeping\"<\/a>. St. Louis Community College, Center for Plant and Life Sciences<span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"https:\/\/users.stlcc.edu\/departments\/fvbio\/Lab_Practices_GLP_STLCC.htm\" target=\"_blank\">https:\/\/users.stlcc.edu\/departments\/fvbio\/Lab_Practices_GLP_STLCC.htm<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 12 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=GMP%2FGLP+Recordkeeping&rft.atitle=&rft.aulast=West%2C+D.&rft.au=West%2C+D.&rft.date=18+November+2021&rft.pub=St.+Louis+Community+College%2C+Center+for+Plant+and+Life+Sciences&rft_id=https%3A%2F%2Fusers.stlcc.edu%2Fdepartments%2Ffvbio%2FLab_Practices_GLP_STLCC.htm&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-NIHGuidelines08-4\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-NIHGuidelines08_4-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">National Institutes of Health (December 2008). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/oir.nih.gov\/sites\/default\/files\/uploads\/sourcebook\/documents\/ethical_conduct\/guidelines-scientific_recordkeeping.pdf\" target=\"_blank\">\"Guidelines for Scientific Record Keeping in the Intramural Research Program at the NIH\"<\/a> (PDF). Office of the Director<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/oir.nih.gov\/sites\/default\/files\/uploads\/sourcebook\/documents\/ethical_conduct\/guidelines-scientific_recordkeeping.pdf\" target=\"_blank\">https:\/\/oir.nih.gov\/sites\/default\/files\/uploads\/sourcebook\/documents\/ethical_conduct\/guidelines-scientific_recordkeeping.pdf<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 12 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Guidelines+for+Scientific+Record+Keeping+in+the+Intramural+Research+Program+at+the+NIH&rft.atitle=&rft.aulast=National+Institutes+of+Health&rft.au=National+Institutes+of+Health&rft.date=December+2008&rft.pub=Office+of+the+Director&rft_id=https%3A%2F%2Foir.nih.gov%2Fsites%2Fdefault%2Ffiles%2Fuploads%2Fsourcebook%2Fdocuments%2Fethical_conduct%2Fguidelines-scientific_recordkeeping.pdf&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-PainBow19-5\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-PainBow19_5-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Pain, E. (3 September 2019). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.sciencemag.org\/careers\/2019\/09\/how-keep-lab-notebook\" target=\"_blank\">\"How to keep a lab notebook\"<\/a>. <i>Science<\/i>. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1126%2Fscience.caredit.aaz3678\" target=\"_blank\">10.1126\/science.caredit.aaz3678<\/a><span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.sciencemag.org\/careers\/2019\/09\/how-keep-lab-notebook\" target=\"_blank\">https:\/\/www.sciencemag.org\/careers\/2019\/09\/how-keep-lab-notebook<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 12 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=How+to+keep+a+lab+notebook&rft.atitle=Science&rft.aulast=Pain%2C+E.&rft.au=Pain%2C+E.&rft.date=3+September+2019&rft_id=info:doi\/10.1126%2Fscience.caredit.aaz3678&rft_id=https%3A%2F%2Fwww.sciencemag.org%2Fcareers%2F2019%2F09%2Fhow-keep-lab-notebook&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-EngelBlog15-6\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-EngelBlog15_6-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Engel, M. (December 2015). <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/martinengel.net\/2015\/12\/how-to-use-onenote-as-your-electronic-notebook\/\" target=\"_blank\">\"Blog: How to use onenote as your electronic lab book\"<\/a>. <i>MartinEngel.net<\/i><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"http:\/\/martinengel.net\/2015\/12\/how-to-use-onenote-as-your-electronic-notebook\/\" target=\"_blank\">http:\/\/martinengel.net\/2015\/12\/how-to-use-onenote-as-your-electronic-notebook\/<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 12 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Blog%3A+How+to+use+onenote+as+your+electronic+lab+book&rft.atitle=MartinEngel.net&rft.aulast=Engel%2C+M.&rft.au=Engel%2C+M.&rft.date=December+2015&rft_id=http%3A%2F%2Fmartinengel.net%2F2015%2F12%2Fhow-to-use-onenote-as-your-electronic-notebook%2F&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-BungersTheElect20-7\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BungersTheElect20_7-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Bungers, S. (2020). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.labfolder.com\/electronic-lab-notebook-eln-research-guide\" target=\"_blank\">\"The Electronic Lab Notebook in 2020: A comprehensive guide\"<\/a>. Labforward GmbH<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.labfolder.com\/electronic-lab-notebook-eln-research-guide\" target=\"_blank\">https:\/\/www.labfolder.com\/electronic-lab-notebook-eln-research-guide<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 12 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=The+Electronic+Lab+Notebook+in+2020%3A+A+comprehensive+guide&rft.atitle=&rft.aulast=Bungers%2C+S.&rft.au=Bungers%2C+S.&rft.date=2020&rft.pub=Labforward+GmbH&rft_id=https%3A%2F%2Fwww.labfolder.com%2Felectronic-lab-notebook-eln-research-guide&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-RichterForensic18-8\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-RichterForensic18_8-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Richter, T.; Escher, S.; Sch\u00f6nfeld, D. et al. (2018). \"Forensic Analysis and Anonymisation of Printed Documents\". <i>Proceedings of the 6th ACM Workshop on Information Hiding and Multimedia Security<\/i>: 127\u201338. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1145%2F3206004.3206019\" target=\"_blank\">10.1145\/3206004.3206019<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Forensic+Analysis+and+Anonymisation+of+Printed+Documents&rft.jtitle=Proceedings+of+the+6th+ACM+Workshop+on+Information+Hiding+and+Multimedia+Security&rft.aulast=Richter%2C+T.%3B+Escher%2C+S.%3B+Sch%C3%B6nfeld%2C+D.+et+al.&rft.au=Richter%2C+T.%3B+Escher%2C+S.%3B+Sch%C3%B6nfeld%2C+D.+et+al.&rft.date=2018&rft.pages=127%E2%80%9338&rft_id=info:doi\/10.1145%2F3206004.3206019&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-BaraniukWhy17-9\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BaraniukWhy17_9-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Baraniuk, C. (7 June 2017). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/web.archive.org\/web\/20191102031255\/https:\/\/www.bbc.com\/future\/article\/20170607-why-printers-add-secret-tracking-dots\" target=\"_blank\">\"Why printers add secret tracking dots\"<\/a>. <i>BBC Future<\/i>. Archived from <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.bbc.com\/future\/article\/20170607-why-printers-add-secret-tracking-dots\" target=\"_blank\">the original<\/a> on 02 November 2019<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/web.archive.org\/web\/20191102031255\/https:\/\/www.bbc.com\/future\/article\/20170607-why-printers-add-secret-tracking-dots\" target=\"_blank\">https:\/\/web.archive.org\/web\/20191102031255\/https:\/\/www.bbc.com\/future\/article\/20170607-why-printers-add-secret-tracking-dots<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 13 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Why+printers+add+secret+tracking+dots&rft.atitle=BBC+Future&rft.aulast=Baraniuk%2C+C.&rft.au=Baraniuk%2C+C.&rft.date=7+June+2017&rft_id=https%3A%2F%2Fweb.archive.org%2Fweb%2F20191102031255%2Fhttps%3A%2F%2Fwww.bbc.com%2Ffuture%2Farticle%2F20170607-why-printers-add-secret-tracking-dots&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-ASTME1578-12\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-ASTME1578_12-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.astm.org\/Standards\/E1578.htm\" target=\"_blank\">\"ASTM E1578-18 Standard Guide for Laboratory Informatics\"<\/a>. ASTM International. 2018. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1520%2FE1578-18\" target=\"_blank\">10.1520\/E1578-18<\/a><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.astm.org\/Standards\/E1578.htm\" target=\"_blank\">https:\/\/www.astm.org\/Standards\/E1578.htm<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 13 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=ASTM+E1578-18+Standard+Guide+for+Laboratory+Informatics&rft.atitle=&rft.date=2018&rft.pub=ASTM+International&rft_id=info:doi\/10.1520%2FE1578-18&rft_id=https%3A%2F%2Fwww.astm.org%2FStandards%2FE1578.htm&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-ASTME1578-13-13\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-ASTME1578-13_13-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.astm.org\/DATABASE.CART\/HISTORICAL\/E1578-13.htm\" target=\"_blank\">\"ASTM E1578-13 Standard Guide for Laboratory Informatics\"<\/a>. ASTM International. 2013. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1520%2FE1578-13\" target=\"_blank\">10.1520\/E1578-13<\/a><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.astm.org\/DATABASE.CART\/HISTORICAL\/E1578-13.htm\" target=\"_blank\">https:\/\/www.astm.org\/DATABASE.CART\/HISTORICAL\/E1578-13.htm<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 13 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=ASTM+E1578-13+Standard+Guide+for+Laboratory+Informatics&rft.atitle=&rft.date=2013&rft.pub=ASTM+International&rft_id=info:doi\/10.1520%2FE1578-13&rft_id=https%3A%2F%2Fwww.astm.org%2FDATABASE.CART%2FHISTORICAL%2FE1578-13.htm&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-ASTME1947-14\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-ASTME1947_14-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.astm.org\/Standards\/E1947.htm\" target=\"_blank\">\"ASTM E1947-98(2014) Standard Specification for Analytical Data Interchange Protocol for Chromatographic Data\"<\/a>. ASTM International. 2014. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1520%2FE1947-98R14\" target=\"_blank\">10.1520\/E1947-98R14<\/a><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.astm.org\/Standards\/E1947.htm\" target=\"_blank\">https:\/\/www.astm.org\/Standards\/E1947.htm<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 13 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=ASTM+E1947-98%282014%29+Standard+Specification+for+Analytical+Data+Interchange+Protocol+for+Chromatographic+Data&rft.atitle=&rft.date=2014&rft.pub=ASTM+International&rft_id=info:doi\/10.1520%2FE1947-98R14&rft_id=https%3A%2F%2Fwww.astm.org%2FStandards%2FE1947.htm&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-ASTMD8244-15\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-ASTMD8244_15-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.astm.org\/Standards\/D8244.htm\" target=\"_blank\">\"ASTM D8244-21 Standard Guide for Analytical Laboratory Operations Supporting the Cannabis\/Hemp Industry\"<\/a>. ASTM International. 2021. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1520%2FD8244-21\" target=\"_blank\">10.1520\/D8244-21<\/a><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.astm.org\/Standards\/D8244.htm\" target=\"_blank\">https:\/\/www.astm.org\/Standards\/D8244.htm<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 13 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=ASTM+D8244-21+Standard+Guide+for+Analytical+Laboratory+Operations+Supporting+the+Cannabis%2FHemp+Industry&rft.atitle=&rft.date=2021&rft.pub=ASTM+International&rft_id=info:doi\/10.1520%2FD8244-21&rft_id=https%3A%2F%2Fwww.astm.org%2FStandards%2FD8244.htm&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-LIMSforumTutorial-16\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-LIMSforumTutorial_16-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.limsforum.com\/category\/education\/tutorials-education\/\" target=\"_blank\">\"Tutorials\"<\/a>. <i>LIMSforum<\/i>. LabLynx, Inc<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.limsforum.com\/category\/education\/tutorials-education\/\" target=\"_blank\">https:\/\/www.limsforum.com\/category\/education\/tutorials-education\/<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 13 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Tutorials&rft.atitle=LIMSforum&rft.pub=LabLynx%2C+Inc&rft_id=https%3A%2F%2Fwww.limsforum.com%2Fcategory%2Feducation%2Ftutorials-education%2F&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-SCWLabInfo21-17\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-SCWLabInfo21_17-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.scientific-computing.com\/issue\/laboratory-informatics-guide-2021\" target=\"_blank\">\"Laboratory Informatics Guide 2021\"<\/a>. Scientific Computing World. 2021<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.scientific-computing.com\/issue\/laboratory-informatics-guide-2021\" target=\"_blank\">https:\/\/www.scientific-computing.com\/issue\/laboratory-informatics-guide-2021<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 13 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Laboratory+Informatics+Guide+2021&rft.atitle=&rft.date=2021&rft.pub=Scientific+Computing+World&rft_id=https%3A%2F%2Fwww.scientific-computing.com%2Fissue%2Flaboratory-informatics-guide-2021&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-LabXLabManager-18\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-LabXLabManager_18-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.labmanager.com\/magazine\" target=\"_blank\">\"Lab Manager\"<\/a>. LabX Media Group. 2021<span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.labmanager.com\/magazine\" target=\"_blank\">https:\/\/www.labmanager.com\/magazine<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 13 May 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Lab+Manager&rft.atitle=&rft.date=2021&rft.pub=LabX+Media+Group&rft_id=https%3A%2F%2Fwww.labmanager.com%2Fmagazine&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-LiscouskiComput15-19\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-LiscouskiComput15_19-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation book\">Liscouski, J.G. (2015). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.dhibooks.com\/computerized-systems-in-the-modern-laboratory-a-practical-guide\" target=\"_blank\"><i>Computerized Systems in the Modern Laboratory: A Practical Guide<\/i><\/a>. DHI Publishing. pp. 432. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/International_Standard_Book_Number\" data-key=\"f64947ba21e884434bd70e8d9e60bae6\">ISBN<\/a> 9781933722863<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.dhibooks.com\/computerized-systems-in-the-modern-laboratory-a-practical-guide\" target=\"_blank\">https:\/\/www.dhibooks.com\/computerized-systems-in-the-modern-laboratory-a-practical-guide<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=book&rft.btitle=Computerized+Systems+in+the+Modern+Laboratory%3A+A+Practical+Guide&rft.aulast=Liscouski%2C+J.G.&rft.au=Liscouski%2C+J.G.&rft.date=2015&rft.pages=pp.%26nbsp%3B432&rft.pub=DHI+Publishing&rft.isbn=9781933722863&rft_id=https%3A%2F%2Fwww.dhibooks.com%2Fcomputerized-systems-in-the-modern-laboratory-a-practical-guide&rfr_id=info:sid\/en.wikipedia.org:LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<!-- \nNewPP limit report\nCached time: 20211118162549\nCache expiry: 86400\nDynamic content: false\nComplications: []\nCPU time usage: 0.263 seconds\nReal time usage: 0.824 seconds\nPreprocessor visited node count: 11265\/1000000\nPost\u2010expand include size: 72806\/2097152 bytes\nTemplate argument size: 27702\/2097152 bytes\nHighest expansion depth: 18\/40\nExpensive parser function count: 0\/100\nUnstrip recursion depth: 0\/20\nUnstrip post\u2010expand size: 26769\/5000000 bytes\n-->\n<!--\nTransclusion expansion time report (%,ms,calls,template)\n100.00% 139.105 1 -total\n 89.31% 124.237 2 Template:Reflist\n 66.43% 92.402 16 Template:Citation\/core\n 61.59% 85.682 14 Template:Cite_web\n 10.69% 14.875 12 Template:Date\n 8.10% 11.261 1 Template:Cite_journal\n 6.57% 9.135 1 Template:Cite_book\n 5.62% 7.822 7 Template:Citation\/identifier\n 4.68% 6.504 4 Template:Efn\n 3.64% 5.063 21 Template:Citation\/make_link\n-->\n\n<!-- Saved in parser cache with key limswiki:pcache:idhash:12527-0!canonical and timestamp 20211118162548 and revision id 42703. 
Title: Considerations in the Automation of Laboratory Procedures
Author for citation: Joe Liscouski, with editorial modifications by Shawn Douglas
License for content: Creative Commons Attribution 4.0 International
Publication date: January 2021

Contents

1 Introduction
2 How does this discussion relate to previous work?
2.1 Before we get too far into this...
3 Transitioning from typical lab operations to automated systems
3.1 What will happen to people's jobs as a result of automation?
3.2 What is the role of AI and ML in automation?
3.3 Where do we find the resources to carry out automation projects/programs?
3.4 What equipment would we need for automated processes, and will it be different from what we currently have?
3.5 What role does a LES play in laboratory automation?
3.5.1 What does this have to do with automation?
3.6 How do we go about planning for automation?
3.6.1 Justification, expectations, and goals
3.6.2 Analyzing the process
3.6.3 Scheduling automation projects
3.6.4 Budgeting
4 Build, buy, or cooperate?
5 Project planning
6 Conclusions (so far)
7 Abbreviations, acronyms, and initialisms
8 Footnotes
9 About the author
10 References

Introduction

Scientists have been dealing with the issue of laboratory automation for decades, and during that time the meaning of those words has expanded from the basics of connecting an instrument to a computer, to the possibility of a fully integrated informatics infrastructure beginning with sample preparation and continuing on to the laboratory information management system (LIMS), electronic laboratory notebook (ELN), and beyond. Throughout this evolution there has been one underlying concern: how do we go about doing this?

The answer to that question has changed from a focus on hardware and programming, to today's need for a lab-wide informatics strategy. We've moved from the bits and bytes of assembly language programming to managing terabytes of files and data structures.

The high end of the problem—the large informatics database systems—has received significant industry-wide attention in the last decade. The stuff on the lab bench, while the target of a lot of individual products, has been less organized and more experimental. Failed or incompletely met promises have to yield to planned successes. How we do it needs to change. This document is about the considerations required when making that change. The haphazard "let's try this" method has to give way to more engineered solutions and a realistic appraisal of the human issues, as well as the underlying technology management and planning.

Why is this important? Whether you are conducting intense laboratory experiments to produce data and information or making chocolate chip cookies in the kitchen, two things remain important: productivity and the quality of the products. In either case, if the productivity isn't high enough, you won't be able to justify your work; if the quality isn't there, no one will want what you produce. Conducting laboratory work and making cookies have a lot in common. Your laboratories exist to answer questions. What happens if I do this? What is the purity of this material? What is the structure of this compound? The range of laboratories asking these questions is extensive, basically covering the entire array of lab bench and scientific work, including chemistry, life sciences, physics, and electronics labs. The more efficiently we answer those questions, the more likely it is that these labs will continue operating and that you'll achieve the goals your organization has set. At some point, it comes down to performance against goals and the return on the investment organizations make in lab operations.

In addition to product quality and productivity, there are a number of other points that favor automation over manual implementations of lab processes. They include:

lower costs per test;
better control over expenditures;
a stronger basis for better workflow planning;
reproducibility;
predictability; and
tighter adherence to procedures, i.e., consistency.

Lists similar to the one above can be found in justifications for lab automation, and cookie production, without further comment. It's just assumed that everyone agrees and that the reasoning is obvious.
Since we are going to use those items to justify the cost and effort that goes into automation, we should take a closer look at them.

Let's begin with reproducibility, predictability, and consistency: very similar concerns that reflect automation's ability to produce the same product with the desired characteristics over and over. For data and information, that means that the same analysis on the same materials will yield the same results, that all the steps are documented, and that the process is under control. The variability that creeps into the execution of a process by people is eliminated. That variability in human labor can result from the quality of training, equipment setup and calibration, or readings from analog devices (e.g., meters, pipette menisci, charts, etc.); there is a long list of potential issues.

Concerns with reproducibility, predictability, and consistency are common to production environments, general lab work, manufacturing, and even food service. There are several pizza restaurants in our area using one of two methods of making the pies. Both start the preparation the same way, spreading dough and adding cheese and toppings, but the differences are in how they are cooked. One method uses standard ovens (e.g., gas, wood, or electric heating); the pizza goes in, the cook watches it, and then removes it when the cooking is completed. This leads to a lot of variability in the product, some a function of the cook's attention, some depending on requests for over- or under-cooking the crust. Some is based on "have it your way" customization. The second method uses a metal conveyor belt to move the pie through an oven. The oven temperature is set, as is the speed of the belt, and as long as the settings are the same, you get a reproducible, consistent product order after order. It's a matter of priorities: manual versus automated, consistent product quality versus how the cook feels that day. In the end, reducing variability and being able to demonstrate consistent, accurate results gives people confidence in your product.

Lower costs per test, better control over expenditures, and better workflow planning also benefit from automation. Automated processes are more cost-efficient since the sample throughput is higher and the labor cost is reduced. The cost per test and the material usage are predictable since variability in the components used in testing is reduced or eliminated, and workflow planning is improved since the time per test is known and work can be better scheduled. Additionally, process scale-up should be easier if there is high demand for particular procedures. However, there is a lot of work that has to be considered before automation is realizable, and that is where this discussion is headed.

How does this discussion relate to previous work?

This work follows on the heels of two previous works:

Computerized Systems in the Modern Laboratory: A Practical Guide (2015): This book presents the range of informatics technologies, their relationship to each other, and the role they play in laboratory work. It differentiates a LIMS from an ELN and scientific data management system (SDMS), for example, contrasting their use and how they would function in different lab working environments. In addition, it covers topics such as support and regulatory issues.

A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work (2018): This webinar series complements the above text.
It begins by introducing the major topics in informatics (e.g., LIMS, ELN, etc.) and then discusses their use from a strategic viewpoint. Where and how do you start planning? What is your return on investment? What should get implemented first, and then what are my options? The series then moves on to developing an information management strategy for the lab, taking into account budgets, support, ease of implementation, and the nature of your lab's work.

The material in this write-up picks up where the last part of the webinar series ends. The last session covers lab processes, and this piece picks up that thread and goes into more depth concerning a basic issue: how do you move from manual methods to automated systems?

Productivity has always been an issue in laboratory work. Until the 1950s, a lab had little choice but to add more people if more work needed to be done. Since then, new technologies have afforded wider options, including new instrument technologies. The execution of the work was still done by people, but the tools were better. Now we have other options. We just have to figure out when, if, and how to use them.

Before we get too far into this...

With elements such as productivity, return on investment (ROI), data quality, and data integrity as driving factors in this work, you shouldn't be surprised if a lot of the material reads like a discussion of manufacturing methodologies; we've already seen some examples. We are talking about scientific work, but the same things that drive the elements noted in labs have very close parallels in product manufacturing. The work we are describing here will be referenced as "scientific manufacturing," manufacturing or production in support of scientific programs.[a]

The key points of a productivity conversation in both lab and material production environments are almost exact overlays; the only significant difference is that the results of the efforts are data and information in one case, and a physical item you might sell in the other. Product quality and integrity are valued considerations in both. For scientists, this may require an adjustment to their perspectives when dealing with automation. On the plus side, the lessons learned in product manufacturing can be applied to lab bench work, making the path to implementation a bit easier while providing a framework for understanding what a successful automation effort looks like. People with backgrounds in product manufacturing can be a useful resource in the lab, with a bit of an adjustment in perspective on their part.

Transitioning from typical lab operations to automated systems

Transitioning a lab from its current state of operations to one that incorporates automation can raise a number of questions, and people's anxiety levels. There are several questions that should be considered to set expectations for automated systems and how they will impact jobs and the introduction of new technologies. They include:

What will happen to people's jobs as a result of automation?
What is the role of artificial intelligence (AI) and machine learning (ML) in automation?
Where do we find the resources to carry out automation projects/programs?
What equipment would we need for automated processes, and will it be different from what we currently have?
What role does a laboratory execution system (LES) play in laboratory automation?
How do we go about planning for automation?

What will happen to people's jobs as a result of automation?
Stories are appearing in print, online, and in television news reporting about the potential for automation to replace human effort in the labor force. It seems like an all-or-none situation: either people will continue working in their occupations, or automation (e.g., mechanical, software, AI, etc.) will replace them. The storyline is that people are expensive and automated work can be less costly in the long run. If commercial manufacturing is a guide, automation is a preferred option from both a productivity and an ROI perspective. In order to make productivity gains from automation similar to those seen in commercial manufacturing, there are some basic requirements and conditions that have to be met:

The process has to be well documented and understood, down to the execution of each step without variation, while error detection and recovery have to be designed in.
The process has to remain static and be expected to continue over enough execution cycles to make it economically attractive to design, build, and maintain.
Automation-compatible equipment has to be available. Custom-built components are going to be expensive and could represent a barrier to successful implementation.
There has to be a driving need to justify the cost of automation; economics, the volume of work that has to be addressed, working with hazardous materials, and a lack of educated workers are just a few of the factors that would need to be considered.

There are places in laboratory work where production-scale automation has been successfully implemented; life sciences applications for processes based on microplate technologies are one example. When we look at the broad scope of lab work across disciplines, most lab processes don't lend themselves to that level of automation, at least not yet. We'll get into this in more detail later. But that brings us back to the starting point: what happens to people's jobs?

In the early stages of manufacturing automation, as well as in fields such as mining where work was labor-intensive and repetitive, people did lose jobs when new methods of production were introduced. That shift from a human workforce to automated task execution is expanding as system designers probe markets from retail to transportation.[1] Lower-skilled occupations gave way first, and we find ourselves facing automation efforts that are moving up the skills ladder; the most recent example is automated driving, a technology that has yet to be fully embraced but is moving in that direction. The problem that leaves us with is providing displaced workers with a means of employment that gives them at least a living income, and the purpose, dignity, and self-worth that they'd like to have. This is going to require significant education, and people are going to have to come to grips with the realization that education never stops.

Due to the push for increased productivity, lab work has seen some similar developments in automation. The development of automated pipettes, titration stations, auto-injectors, computer-assisted instrumentation, and automation built to support microplate technologies represents just a few places where specific tasks have been addressed. However, these developments haven't moved people out of the workplace as has happened in manufacturing, mining, etc. In some cases they've changed the work, replacing repetitive, time-consuming tasks with equipment that allows lab personnel to take on different tasks.
In other cases the technology addresses work that couldn't be performed in a cost-effective manner with human effort; without automation, that work might just not be feasible due to the volume of work (whose delivery might be limited by the availability of the right people, equipment, and facilities) or the need to work with hazardous materials. Automation may remove the need for hiring new people while giving those currently working more challenging tasks.

As noted in the previous paragraph, much of the automation in lab work is at the task level: equipment designed to carry out a specific function such as Karl Fischer titrations. Some equipment designed around microplate formats can function at both the task level and as part of a user-integrated robotics system. This gives the planner useful options for introducing automation, making it easier for personnel to get accustomed to automation before moving into scientific manufacturing.

Overall, laboratory people shouldn't be losing their jobs as a result of lab automation, but they do have to be open to changes in their jobs, and that could require an investment in their education. Take someone whose current job is to carry out a lab procedure, someone who understands all aspects of the work, including troubleshooting equipment, reagents, and any special problems that may crop up. Someone else may have developed the procedure, but that person is the expert in its execution.

First of all, you need these experts to help plan and test the automated systems if you decide to undertake such a project. These would also be the best people to educate as automated systems managers; they know how the process is supposed to work and should be in a position to detect problems. If the system crashes, you'll need someone who can cover the work while problems are being addressed. Secondly, if lab personnel get the idea that they are watching their replacement being installed, they may leave before the automated systems are ready. In the event of a delay, you'll have a backlog and no one to handle it.

Beyond that, people will be freed from the routine of carrying out processes and be able to take on work that had been put on a back burner. As we move toward automated systems, jobs will change by expansion to accommodate typical lab work, as well as the management, planning, maintenance, and evolution of laboratory automation and computing.

Automation in lab work is not an "all or none" situation. Processes can be structured so that the routine work is done by systems, and the analyst can spend time reviewing the results, looking for anomalies and interesting patterns, while being able to make decisions about the need for and nature of follow-on efforts.

What is the role of AI and ML in automation?

When we discuss automation, what we are referencing now is basic robotics and programming. AI may, and likely will, play a role in the work, but first we have to get the foundations right before we consider the next step; we need to put in the human intelligence first. Part of the issue with AI is that we don't know what it is. Science fiction aside, many of today's applications of AI have a limited role in lab work.
Here are some examples:

Having a system that can bring up all relevant information on a research question—a sort of super Google—or a variation of IBM's Watson could have significant benefits.
Analyzing complex data or large volumes of data could be beneficial, e.g., the analysis of radio astronomy data to find fast radio bursts (FRBs). After discovering 21 FRB signals upon analyzing five hours of data, researchers at Green Bank Telescope used AI to analyze 400 terabytes of older data and detected another 100.[2]
"[A] team at Glasgow University has paired a machine-learning system with a robot that can run and analyze its own chemical reaction. The result is a system that can figure out every reaction that's possible from a given set of starting materials."[3]
HelixAI is using Amazon's Alexa as a digital assistant for laboratory work.[4]

Note that the points above are research-based applications, not routine production environments where regulatory issues are important. While there are research applications that might be more forgiving of AI systems, because the results are evaluated by human intelligence and problematic results can be made subject to further verification, data entry systems such as voice entry have to be carefully tested and the results of that data entry verified and shown to be correct.

Pharma IQ continues to publish material on advanced topics in laboratory informatics, including articles on how labs are benefiting from new technologies[5] and survey reports such as AI 2020: The Future of Drug Discovery. In that report they note[6]:

"94% of pharma professionals expect that intelligent technologies will have a noticeable impact on the pharmaceutical industry over the next two years."
"Almost one fifth of pharma professionals believe that we are on the cusp of a revolution."
"Intelligent automation and predictive analytics are expected to have the most significant impact on the industry."
"However, a lack of understanding and awareness about the benefits of AI-led technologies remain a hindrance to their implementation."

Note that these are expectations, not a reflection of current reality. That same report makes comments about the impact of AI on headcount disruption, asking, "Do you expect intelligent enterprise technologies[b] to significantly cut and/or create jobs in pharma through 2020?" Among the responses, 47 percent said they expected those technologies to do both, 40 percent said they will create new job opportunities, and 13 percent said there will be no dramatic change, with zero percent saying they expected solely job losses.[6]

While there are high levels of expectations and hopes for results, we need to approach the idea of AI in labs with some caution. We read about examples based on machine learning (ML), for example using computer systems to recognize cats in photos, to recognize faces in a crowd, etc. We don't know how they accomplish their tasks, and we can't analyze their algorithms and decision-making. That leaves us with trying to test quality in, which at best is an uncertain process with qualified results (it has worked so far). One problem with testing AI systems based on ML is that they are going to continually evolve, so testing may affect the ML processes by providing a bias. It may also cause continued, redundant testing, because something we thought was evaluated was changed by the "experiences" the AI based its learning on.
As one example, could the AI modify the science through process changes without our knowing because it didn't understand the science or the goals of the work?

AI is a black box with ever-changing contents. That shouldn't be taken as a condemnation of AI in the lab, but rather as a challenge to human intelligence in evaluating, proving, and applying the technology. That application includes defining the operating boundaries of an AI system. Rather than creating a master AI for a complete process, we may elect to divide the AI's area of operation into multiple, independent segments, with segment integration occurring in later stages once we are confident in their ability to work and show clear evidence of system stability. In all of this we need to remember that our goal is the production of high-quality data and information in a controlled, predictable environment, not gee-whiz technology. One place where AI (or clever programming) could be of use is in better workflow planning, which takes into account current workloads and assignments, factors in the inevitable panic-level testing need, and, perhaps in a QC/production environment, anticipates changes in analysis requirements based on changes in production operations.

Throughout this section I've treated "AI" as "artificial intelligence," its common meaning. There may be a better way of looking at it for lab use, as noted in this excerpt from the October 2018 issue of Wired magazine[7]:

Augmented intelligence. Not "artificial," but how Doug Engelbart[c] envisioned our relationship with computers: AI doesn't replace humans. It offers idiot-savant assistants that enable us to become the best humans we can be.

Augmented intelligence (AuI) is a better term for what we might experience in lab work, at least in the near future. It suggests something that is both more realistic and attainable, with the synergism that would make it, and automation, attractive to lab management and personnel—a tool they can work with to improve lab operations, one that doesn't carry the specter of something going on that they don't understand or control. OPUS/SEARCH from Bruker might be just such an entry in this category.[8] AuI may serve as a first-pass filter for large data sets—as noted in the radio astronomy and chemistry examples earlier—reducing those sets of data and information to smaller collections that human intelligence can and should evaluate. However, that does put a burden on the AuI to avoid excessive false positives or negatives, something that can be adjusted over time.

Beyond that there is the possibility of more cooperative work between people and AuI systems. An article in Scientific American titled "My Boss the Robot"[9] describes the advantage of a human-robot team, with the robot doing the heavy work and the human—under the robot's guidance—doing the work he was more adept at, versus a team of experts given the same task. The task, welding a Humvee frame, was completed by the human-machine pair in 10 hours at a cost of $1,150; the team of experts took 89 hours and a cost of $7,075.
That might translate into terms of laboratory work by having a robot do routine, highly repetitive tasks, with the analyst overseeing the operation and doing higher-level analysis of the results.

Certainly, AI/AuI is going to change over time as programming and software technology becomes more sophisticated and capable; today's example of AuI might be seen as tomorrow's clever software. However, a lot depends on the experience of the user.

There is something important to ask about laboratory technology development, and AI in particular: is the direction of development going to be the result of someone's innovation that people look at and embrace, or will it be the result of a deliberate choice of lab people saying "this is where we need to go, build systems that will get us there"? The difference is important, and lab managers and personnel need to be in control of the planning and implementation of systems.

Where do we find the resources to carry out automation projects/programs?

Given the potential scope of work, you may need people with skills in programming, robotics, instrumentation, and possibly mechanical or electrical engineering if off-the-shelf components aren't available. The biggest need is for people who can do the planning and optimization that is needed as you move from manual to semi- or fully-automated systems, particularly specialists in process engineering who can organize and plan the work, including the process controls and provision for statistical process control.

We need to develop people who are well versed in laboratory work and the technologies that can be applied to that work, as assets in laboratory automation development and planning. In the past, this role has been filled by lab personnel having an interest in the subject, IT people willing to extend their responsibilities, and/or outside consultants. A 2017 report by Salesforce Research states "77% of IT leaders believe IT functions as an extension/partner of business units rather than as a separate function."[10] The report makes no mention of laboratory work or manufacturing aside from those being functions within businesses surveyed. Unless a particular effort is made, IT personnel rarely have the backgrounds needed to meet the needs of lab work. In many cases, they will try to fit lab needs into software they are already familiar with, rather than extend their backgrounds into new computational environments. Office and pure database applications are easily handled, but when we get to the lab bench, it's another matter entirely.

The field is getting complex enough that we need people whose responsibilities span both science and technology. This subject is discussed in the webinar series A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work, Part 5, "Supporting Laboratory Systems."

What equipment would we need for automated processes, and will it be different from what we currently have?

This is an interesting issue, and it directly addresses the commitment labs have to automation, particularly robotics. In the early days of lab automation, when Zymark (Zymate and Benchmate), Perkin Elmer, and Hewlett Packard (ORCA) were the major players in the market, the robot had to adapt to equipment that was designed for human use: standard laboratory equipment. They did that through special modifications and the use of different grippers to handle test tubes, beakers, and flasks.
While some companies wanted to test the use of robotics in the lab, they didn't want to invest in equipment that could only be used with robots; they wanted lab workers to pick up where the robots left off in case the robots didn't work.

Since then, equipment has evolved to support automation more directly. In some cases it is a device (e.g., a balance, pH meter, etc.) that has front-panel capability for a human operator and rear connectors for computer communications. Liquid handling systems have seen the most advancement through the adoption of microplate formats and equipment designed to work with them. However, the key point is standardization of the sample containers. Vials and microplates lend themselves to a variety of automation devices, from sample processing to auto-injectors/samplers. The issue is getting the samples into those formats.

One point that labs, in any scientific discipline, have to come to grips with is the commitment to automation. That commitment isn't going to be made on a lab-wide basis, but on a procedure-by-procedure basis. Full automation may not be appropriate for all lab work, whereas partial automation may be a better choice, and in some cases no automation may be required (we'll get into that later). The point that needs to be addressed is the choice of equipment. In most cases, equipment is designed for use by people, with options for automation and electronic communications. However, if you want to maximize throughput, you may have to follow examples from manufacturing and commit to equipment that is only used by automation. That will mean a redesign of the equipment, a shared risk for both the vendors and the users. The upside is that equipment can be specifically designed for a task, be more efficient, have the links needed for integration, use less material, and, more than likely, take up less space. One example is the microplate, allowing for tens, hundreds, or thousands (depending on the plate used) of sample cells in a small space. What used to take many cubic feet of space as test tubes (the precursor to using microplates) is now a couple of cubic inches, using much less material and working space. Note, however, that while microplates are used by lab personnel, their use in automated systems provides greater efficiency and productivity.

The idea of equipment used only in an automated process isn't new. The development and commercialization of segmented flow analyzers—initially by Technicon in the form of the AutoAnalyzers for general use, and the SMA (Sequential Multiple Analyzer) and SMAC (Sequential Multiple Analyzer with Computer) in clinical markets—improved a lab's ability to process samples. These systems were phased out in favor of new equipment that consumed less material. Products like these are provided by Seal Analytical[11] for environmental work and Bran+Luebbe (a division of SPX Process Equipment in Germany).[12]

The issue in committing to automated equipment is that vendors and users will have to agree on equipment specifications and use them within procedures. One place this has been done successfully is in clinical chemistry labs. What other industry workflows could benefit? Do the vendors lead, or do the users drive the issue? Vendors need to be convinced that there is a viable market for a product before making an investment, and users need to be equally convinced that they will succeed in applying those products.
In short, procedures that are important to a particular industry have to be identified, and both users and vendors have to come together to develop automated procedure and equipment specifications for products. This has been done successfully in clinical chemistry markets, to the extent that equipment is marketed as validated for particular procedures.

What role does a LES play in laboratory automation?

Before ELNs settled into their current role in laboratory work, the initial implementations differed considerably from what we have now. LabTech Notebook was released in 1986 (discontinued in 2004) to provide communications between computers and devices that used RS-232 serial communications. In the early 2000s, SmartLab from Velquest was the first commercial product to carry the "electronic laboratory notebook" identifier. That product became a stand-alone entry in the laboratory execution system (LES) market; since its release, the same conceptual functionality has been incorporated into LIMS and ELNs that fit the more current expectation for an ELN.

At its core, an LES is a set of scripted test procedures that an analyst follows to carry out a laboratory method, essentially functioning as the programmed execution of a lab process. Each step in a process is described, followed exactly, and provision is made within the script for data collection. In addition, the LES can/will (depending on the implementation; "can" in the case of SmartLab) check to see if the analyst is qualified to carry out the work and that the equipment and reagents are current, calibrated, and suitable for use. The systems can also have access to help files that an analyst can reference if there are questions about how to carry out a step or resolve issues. Beyond that, the software had the ability to work with lab instruments and automatically acquire data, either through direct interfaces (e.g., balances, pH meters, etc.) or through parsing PDF files of instrument reports.

There are two reasons that these systems are attractive. First, they provide for a rigorous execution of a process, with each step being logged as it is done. Second, that log provides a regulatory inspector with documented evidence that the work was done properly, making it easier for the lab to meet any regulatory burden.

Since the initial development of SmartLab, that product has changed ownership and is currently in the hands of Dassault Systèmes as part of the BIOVIA product line. As noted above, LIMS and ELN vendors have incorporated similar functionality into their products. Using those features requires "scripting" (in reality, software development), but it does provide access to the database structures within those products. The SmartLab software needed programmed interfaces to other vendors' LIMS and ELNs to gain access to the same information.

What does this have to do with automation?

When we think about automated systems, particularly full automation with robotic support, we are describing a programmed process from start to finish. The samples are introduced at the start, and the process continues until the final data/information is reported and stored. These can be large-scale systems using microplate formats, including tape-based systems from Douglas Scientific[13], programmable autosamplers such as those from Agilent[14], or systems built around robotic arms from a variety of vendors that move samples from one station to another.
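The scripted-execution idea at the heart of an LES can be made concrete with a small sketch. What follows is a minimal, hypothetical illustration, not any vendor's product: the method ID, step wording, qualification list, and log format are all invented, and a real LES would add equipment and reagent checks, electronic signatures, and database-backed storage.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Step:
    instruction: str            # text shown to the analyst
    capture: str | None = None  # name of the value to record at this step, if any

@dataclass
class ProcedureRun:
    method_id: str
    analyst: str
    qualified_analysts: set[str]
    log: list[dict] = field(default_factory=list)

    def execute(self, steps: list[Step], readings: dict[str, float]) -> list[dict]:
        # An LES refuses to start if the analyst isn't qualified for the method.
        if self.analyst not in self.qualified_analysts:
            raise PermissionError(f"{self.analyst} is not qualified for {self.method_id}")
        for n, step in enumerate(steps, start=1):
            entry = {"step": n, "instruction": step.instruction,
                     "time": datetime.now(timezone.utc).isoformat()}
            if step.capture is not None:
                # In a real system this would come from an instrument interface
                # or operator entry; here it is passed in as test data.
                entry["value"] = readings[step.capture]
            self.log.append(entry)  # every executed step leaves an audit record
        return self.log

# Hypothetical usage with invented method and analyst identifiers.
run = ProcedureRun("KF-001", "jdoe", {"jdoe", "asmith"})
steps = [Step("Verify titrator calibration is current"),
         Step("Weigh 0.5 g of sample", capture="sample_mass_g"),
         Step("Run Karl Fischer titration", capture="water_pct")]
print(run.execute(steps, {"sample_mass_g": 0.5012, "water_pct": 1.37}))
```

The two behaviors discussed above are visible here: the run refuses to start for an unqualified analyst, and every executed step leaves a timestamped record that can later serve as documented evidence for an inspector.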
Both the LES and the full automation described above have the following point in common: there is a strict process that must be followed, with no provision for variation. The difference is that in one case the process is implemented completely through the use of computers, as well as electronic and mechanical equipment. In the other case, the process is being carried out by lab personnel using computers, as well as electronic and mechanical lab equipment. In essence, people take the place of mechanical robots, which conjures up all kinds of images going back to the 1927 film Metropolis.[d] Though the LES represents a step toward more sophisticated automation, both methods still require:

programming, including "scripting" (the LES methods are a script that has to be followed);
validated, proven processes; and
qualified staff, though the qualifications differ. (In both cases they have to be fully qualified to carry out the process in question. However, in the full-automation case, they will require more education on running, managing, and troubleshooting the systems.)

In the case of full automation, there has to be sufficient justification for the automation of the process, including a sufficient volume of sample processing. The LES-human implementation can be run for a single sample if needed, and the operating personnel can be trained on multiple procedures, switching tasks as needed. Electro-mechanical automation would require a change in programming, verification that the system is operating properly, and possibly equipment re-configuration. Which method is better for a particular lab depends on trade-offs between sample load, throughput requirements, cost, and flexibility. People are adaptable, easily moving between tasks, whereas equipment has to be adapted to a task.

How do we go about planning for automation?

There are three forms of automation to be considered:

No automation – Instead, the lab relies on lab personnel to carry out all steps of a procedure.
Partial automation – Automated equipment is used to carry out steps in a procedure. Given the current state of laboratory systems, this is the most prevalent form, since most lab equipment has computer components in it to facilitate its use.
Full automation – The entire process is automated. The definition of "entire" is open to each lab's interpretation and may vary from one process to another. For example, some samples may need some handling before they are suitable for use in a procedure. That might be a selection process from a freezer, grinding materials prior to a solvent extraction, and so on, representing cases where the equipment available isn't suitable for automated equipment interaction. One goal is to minimize this effort since it can put a limit on the productivity of the entire process. This is also an area where negotiation between the lab and the sample submitter can be useful. Take plastic pellets, for example, which often need to be ground into a coarse powder before they can be analyzed; having the submitter provide them in this form will reduce the time and cost of the analysis.
Standardizing on the sample container can also facilitate the analysis (having the lab provide the submitter with standard sample vials using barcodes or RFID chips can streamline the process).

One common point that these three forms share is a well-described method (procedure, process) that needs to be addressed. That method should be fully developed, tested, and validated. This is the reference point for evaluating any form of automation (Figure 1).

Figure 1. Items to be considered in automating systems

The documentation for the chosen method should include the bulleted list of items from Figure 1, as they describe the science aspects of the method. The last four points are important. The method should be validated, since the manual procedure is a reference point for determining if the automated system is producing useful results. The reproducibility metric offers a means of evaluating at least one expected improvement in an automated system; you'd expect less variability in the results. This requires a set of reference sample materials that can be repeatedly evaluated to compare the manual and automated systems, and to periodically test the methods in use to ensure that there aren't any trends developing that would compromise the method's use. Basically, this amounts to statistical quality control on the processes.
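That periodic reference-sample testing lends itself to simple control-chart logic. The sketch below is one minimal way to flag drift, assuming the reference material's target value and the method's standard deviation have already been established; the 3-sigma limit and seven-point run rule are common control-charting conventions, but each lab would choose its own.

```python
def control_check(results, target, sigma):
    """Flag reference-sample results that suggest a method is drifting.

    results: measured values for the same reference material over time
    target:  the established mean for that material
    sigma:   the established standard deviation of the method
    """
    flags = []
    run = 0
    for i, x in enumerate(results):
        # Rule 1: a point outside the 3-sigma limits.
        if abs(x - target) > 3 * sigma:
            flags.append((i, x, "outside 3-sigma limits"))
        # Rule 2: seven consecutive points on the same side of the target.
        run = run + 1 if i > 0 and (x > target) == (results[i - 1] > target) else 1
        if run >= 7:
            flags.append((i, x, "7-point run on one side of target"))
    return flags

# Illustrative history for one reference material (units arbitrary).
history = [10.02, 9.98, 10.05, 10.11, 10.09, 10.14, 10.12, 10.18, 10.16, 10.21]
print(control_check(history, target=10.00, sigma=0.05))
```

Run against a slowly drifting series like the one above, the check flags both out-of-limit points and sustained runs on one side of the target, which is exactly the kind of developing trend the reference-sample program is meant to catch.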
The next step is to decide what improvements you are looking for in an automated system: increased throughput, lower cost of operation, the ability to off-load human work, reduced variability, etc. In short, what are your goals?

That brings us to the matter of project planning. We're not going to go into a lot of depth in this piece about project planning, as there are a number of references[e] on the subject, including material produced by the former Institute for Laboratory Automation.[f] There are some aspects of the subject that we do need to touch on, however, and they include:

justifying the project and setting expectations and goals;
analyzing the process;
scheduling automation projects; and
budgeting.

Justification, expectations, and goals

Basically, why are you doing this, and what do you expect to gain? What arguments are you going to use to justify the work and expense involved in the project? How will you determine if the project is successful?

Fundamentally, automation efforts are about productivity and the bulleted items noted in the introduction of this piece, repeated below with additional commentary:

Lower costs per test, and better control over expenditures: These can result from a reduction in labor and materials costs, including more predictable and consistent reagent usage per test.
Stronger basis for better workflow planning: Informatics systems can provide better management over workloads and resource allocation, while key performance indicators can show where bottlenecks are occurring or if samples are taking too long to process. These can be triggers for procedure automation to improve throughput.
Reproducibility: The test results from automated procedures can be expected to be more reproducible by eliminating the variability that is typical of steps executed by people. Small variations in dispensing reagents, for example, could be eliminated.
Predictability: The time to completion for a given test is more predictable in automated processes; once the process starts, it keeps going without the interruptions that can be found in human-centered activities.
Tighter adherence to procedures: Automated procedures have no choice but to be consistent in procedure execution; that is what programming and automation are about.

Of these, which are important to your project? If you achieved these goals, what would it mean to your lab's operations and the organization as a whole? This is part of the justification for carrying out the project.

As noted earlier, there are several things to consider in order to justify a project. First, there has to be a growing need that supports a procedure's automation, one that can't be satisfied by other means, which could include adding people, equipment, and lab space, or outsourcing the work (with the added burden of ensuring data quality and integrity, and integrating that work with the lab's data/information). Second, the cost of the project must be balanced by its benefits. This includes any savings in cost, people (not reducing headcount, but avoiding new hires), material, and equipment, as well as the improvement of timeliness of results and overall lab operations. Third, when considering project justification, the automated process's useful lifetime has to be long enough to justify the development work. And finally, the process has to be stable so that you aren't in a constant re-development situation (this differs from periodic upgrades and performance improvements, EVOP in manufacturing terms). One common point of failure in projects is changes in underlying procedures; if the basic process model changes, you are trying to hit a moving target. That ruins schedules and causes budgets to inflate.

This may seem like a lot to think about for something that could be as simple as moving from manual pipettes to automatic units, but that just means the total effort to do the work will be small. It is still important, however, since it impacts data quality and integrity, and your ability to defend your results should they be challenged. And, by the way, the issue of automated pipettes isn't simple; there is a lot to consider in properly specifying and using these products.[g]

Analyzing the process

Assuming that you have a well-described, thoroughly tested, and validated procedure, that process has to be analyzed for optimization and suitability for automation. This is an end-to-end evaluation, not just an examination of isolated steps. This is an important point. Looking at a single step without taking into account the rest of the process may improve that portion of the process but have consequences elsewhere.

Take a common example: working in a testing environment where samples are being submitted by outside groups (Figure 2).

Figure 2. Lab sample processing, initial data entry through results

Most LIMS will permit sample submitters (with appropriate permissions) to enter the sample description information directly into the LIMS, reducing some of the clerical burden. Standardizing on sample containers, with barcodes, reduces the effort and cost in some aspects of sample handling. A barcode scanner could be used to scan samples as they arrive in the lab, letting the system know that they are ready to be tested.
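As a concrete illustration of that receipt step, the sketch below posts a scanned barcode to a LIMS. Everything specific here is assumed for the example: the endpoint URL, payload fields, and token handling are hypothetical, since each LIMS defines its own API and authentication scheme; only the general pattern (scan, update status, repeat) is the point.

```python
import requests  # assumes the third-party 'requests' package is installed

LIMS_URL = "https://lims.example.org/api/samples"  # hypothetical endpoint

def receive_sample(barcode: str) -> None:
    """Mark a scanned sample as received and ready for testing."""
    resp = requests.post(
        f"{LIMS_URL}/{barcode}/status",
        json={"status": "received", "location": "receiving-bench-1"},
        headers={"Authorization": "Bearer <token>"},  # placeholder credential
        timeout=10,
    )
    resp.raise_for_status()  # surface permission/network errors immediately

# Most USB barcode scanners act as keyboards, so each scan arrives
# as a line of input: scan, post, repeat.
if __name__ == "__main__":
    while True:
        code = input("Scan sample barcode (blank to stop): ").strip()
        if not code:
            break
        receive_sample(code)
```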
That brings us to an evaluation of the process as a whole, as well as an examination of the individual steps in the procedure. As shown in Figure 1, automation can be done in one of two ways: automating the full process or automating individual steps. Your choice depends on several factors, not the least of which is your comfort level and confidence in adopting automation as a strategy for increasing productivity. For some, concentrating on improvements in individual steps is an attractive approach. The cost and risk may be lower, and if a problem occurs you can always back up to a fully manual implementation until it is resolved.

Care does have to be taken in choosing which steps to improve. From one perspective, you'd want to do the step-wise implementation of automation as close to the end of the process as possible. The problem with doing it earlier is that you may create a backup in later stages of the process. Optimizing step 2, for example, doesn't do you much good if step 3 is overloaded and requires more people, or additional (possibly unplanned) automation to relieve a bottleneck there. In short, before you automate or improve a given step, you need to be sure that downstream processing can absorb the increase in materials flow. In addition, optimizing all the individual steps, one at a time, doesn't necessarily add up to a well-designed full-system automation. The transitions between steps may not be as effective or efficient as they would be if the system were designed and evaluated as a whole. If the end of the process is carried out by commercial instrumentation, the ability to absorb more work is greater, since most of these systems are automated with computer data acquisition and processing, and many have auto-samplers available to accumulate samples that can be processed automatically. Some of those auto-samplers have built-in robotics for common sample handling functions. If the workload builds, additional instruments can pick up the load, and equipment such as Baytek International's TurboTube[15] can accumulate sample vials in a common system and route them to individual instruments for processing.

Another consideration for partial automation is where the process is headed in the future. If the need for the process persists over a long period of time, will you eventually get to the point of needing to redo the automation as an integrated stream? If so, is it better to take the plunge early on instead of continually expending resources to upgrade it?

Other considerations include the ability to re-purpose equipment. If a process isn't used full-time (a justification for partial automation), the same components may be used in improving other processes. Ideally, if you go the full-process automation route, you'll have sufficient sample throughput to keep it running for an extended period of time, and not have to start and stop the system as samples accumulate.
A smoothly running, slower automated process is better than a faster system that lies idle for significant periods of time, particularly since startup and shutdown procedures may diminish the operational cost savings in both equipment use and people's time.

All these points become part of both the technical justification and budget requirements.

Analyzing the process: Simulation and modeling

Simulation and modeling have been part of science and engineering for decades, supported by ever more powerful computing hardware and software. Continuous systems simulations have shown us the details of how machinery works, how chemical reactions occur, and how chromatographic systems and other instrumentation behave.[16] There is another aspect to modeling and simulation that is appropriate here.

Discrete-event simulation (DES) is used to model and understand processes in business and manufacturing applications, evaluating the interactions between service providers and customers, for example. One application of DES is to determine the best way to distribute incoming customers to a limited number of servers, taking into account that not all customers have the same needs; some will tie up a service provider a lot longer than others, as represented by the classic bank teller line problem. That is one question that discrete systems can analyze. This form of simulation and modeling is appropriate to event-driven processes where the action is focused on discrete steps (like materials moving from one workstation to another) rather than on a continuous function of time (most naturally occurring systems fall into the latter category, e.g., heat flow and models using differential equations).

The processes in your lab can be described and analyzed via DES systems.[17][18][19] Those laboratory procedures are a sequence of steps, each having a precursor, a variable duration, and a following step until the end of the process is reached; this is basically the same as a manufacturing operation, where modeling and simulation have been used successfully for decades. DES can be used to evaluate those processes and ask questions that can guide you on the best paths to take in applying automation technologies and solving productivity or throughput problems. For example:

What happens if we tighten up the variability in a particular step; how will that affect the rest of the system?
What happens at the extremes of the variability in process steps; does it create a situation where samples pile up?
How much of a workload can the process handle before one step becomes saturated with work and the entire system backs up?
Can you introduce an alternate path to process those samples and avoid problems (e.g., if samples are held for too long in one stage, do they deteriorate)?
Can the output of several parallel slower procedures be merged into a feed stream for a common instrumental technique?

In complex procedures some steps may be sensitive to small delays, and DES can help test for and uncover them. Note that setting up these models will require the collection of a lot of data about the processes and their timing, so this is not something to be taken casually.

Previous research[16][17][18][19] suggests a few areas where simulation can be effective, including one case where an entire lab's operations were evaluated. Models that extensive can be used to look at not only procedures, but also the introduction of informatics systems.
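Those questions can be explored in code before any equipment is purchased. The sketch below uses SimPy, one of several open-source DES libraries for Python, to model the saturation question: samples arriving roughly every 15 minutes are prepped by two technicians and then queue for a single instrument whose runs take longer than the arrival interval. All times, capacities, and distributions are invented for illustration.

```python
import random
import simpy  # pip install simpy; one of several DES libraries

RNG = random.Random(42)

def sample(env, prep, instrument, waits):
    """One sample flowing through a two-step process: prep, then analysis."""
    with prep.request() as req:            # queue for a prep technician
        yield req
        yield env.timeout(RNG.uniform(10, 20))   # prep takes 10-20 min
    arrived = env.now
    with instrument.request() as req:      # queue for the shared instrument
        yield req
        waits.append(env.now - arrived)          # time spent waiting in line
        yield env.timeout(RNG.uniform(25, 35))   # analysis takes 25-35 min

def arrivals(env, prep, instrument, waits, interval=15):
    while True:
        env.process(sample(env, prep, instrument, waits))
        yield env.timeout(RNG.expovariate(1 / interval))  # ~1 sample/15 min

env = simpy.Environment()
prep = simpy.Resource(env, capacity=2)        # two technicians
instrument = simpy.Resource(env, capacity=1)  # one instrument
waits = []
env.process(arrivals(env, prep, instrument, waits))
env.run(until=8 * 60)  # one 8-hour shift, in minutes
print(f"{len(waits)} samples analyzed; mean wait for instrument: "
      f"{sum(waits) / len(waits):.1f} min, worst: {max(waits):.1f} min")
```

Because the instrument's 25-35 minute runs can't keep pace with ~15-minute arrivals, waiting times grow over the simulated shift. That is the "one step becomes saturated and the entire system backs up" case from the list above, demonstrated with numbers instead of intuition.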
This may appear to be a significant undertaking, and it can be, depending on the complexity of the lab processes. However, simple processes can be initially modeled on spreadsheets to see if a more significant effort is justified. Operations research, of which DES is a part, has been usefully applied in production operations to increase throughput and improve ROI. It might be successfully applied to some routine, production-oriented lab work.

Most lab processes are linear in their execution, one step following another, with the potential for loop-backs should problems be recognized with samples, reagents (e.g., out of date, doesn't look right, new materials needed), or equipment (e.g., not functioning properly, out of calibration, busy due to other work). On one level, the modeling of a manually implemented process should appear to be simple: each step takes a certain amount of time, and if you add up the times, you have a picture of the process execution through time. However, the reality is quite different if you take into account the problems (and their resolution) that can occur in each of those steps. The data collection used to model the procedure can change how that picture looks and your ability to improve it. By monitoring the process over a number of iterations, you can find out how much variation there is in the execution time for each step and whether the variation is normally distributed or skewed (e.g., if one step is skewed, how does it impact others?).

Questions to ask about potential problems that could occur at each step include:

How often do problems with reagents occur, and how much of a delay does that create?
Is instrumentation always in calibration (do you know?), are there operational problems with devices and their control systems (what are the ramifications?), are procedures delayed due to equipment being in use by someone else, and how long does it take to make changeovers in operating conditions?
What happens to the samples; do they degrade over time? What impact does this have on the accuracy of results and their reproducibility?
How often are workflows interrupted by the need to deal with high-priority samples, and what effect does that have on the processing of other samples?

Just the collection of this data can suggest useful improvements before automation is even considered, and it may even negate the need for it. The answer to a lab's productivity problem might be as simple as adding another instrument if that is the bottleneck. The data might also suggest that an underutilized device could be more productive if sample preparation for different procedures' workflows were organized differently. Underutilization might be a consequence of the amount of time needed to prepare the equipment for service: doing so for one sample might be disproportionately time-consuming (and expensive) and cause other samples to wait until there are enough of them to justify the preparation. It could also suggest that some lab processes should be outsourced to groups that have a more consistent sample flow and turn-around time (TAT) for that technique. Some of these points are illustrated in Figures 3a and 3b below.

Figure 3a. Simplified process views versus some modeling considerations. Note that the total procedure execution time is affected by the variability in each step, plus equipment and material availability delays; these can change from one day to the next in manual implementations.
How does the simulation system work? Once you have all the data set up, the simulation runs thousands of times, using random number generators to pick values for the execution time of each component in each step. For example, if there is a one-in-ten chance a piece of equipment will be in use when needed, 10% of the runs will reflect that, with each of those runs picking a delay time based on the input delay distribution function. With a large number of runs, you can see where delays exist and how they impact the overall process's behavior. You can also adjust the factors (what happens if equipment delays are cut in half?) and see the effect of doing so. By testing the system, you can make better judgments about how to apply your resources.

Some of the issues that surface may be things that lab personnel know about and just deal with; it isn't until the problems are examined that their impact on operations is fully realized and addressed. Modeling and simulation may appear to be overkill for lab process automation, something reserved for large-scale production projects. However, the physical size of the project is not the key factor; what matters is the complexity of the system and the potential for optimization.

One benefit of a well-structured simulation of lab processes is that it provides a solid basis for making recommendations for project approval and budgeting. The most significant element in modeling and simulation is the initial data collection: asking lab personnel to record the time it takes to carry out each step. This isn't likely to be popular if they don't understand why it is being done and what the benefits will be to them and the lab; accurate information is essential. This is another case where "bad data is worse than no data."
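As a minimal illustration of that mechanism, the sketch below runs a hypothetical three-step procedure many times; the step durations, the one-in-ten equipment-busy probability, and the delay distribution are all invented placeholders for a lab's own measurements.

    import random, statistics

    STEPS = [(10, 1.0), (25, 3.0), (15, 2.0)]  # (mean, std dev) minutes per step -- hypothetical
    BUSY_PROB = 0.10                            # one-in-ten chance equipment is in use
    BUSY_DELAY = (20, 5.0)                      # extra wait (mean, std dev) when it is

    def one_run(rng):
        total = 0.0
        for mean, sd in STEPS:
            total += max(0.0, rng.gauss(mean, sd))        # variable step execution time
            if rng.random() < BUSY_PROB:                  # occasional equipment conflict
                total += max(0.0, rng.gauss(*BUSY_DELAY))
        return total

    rng = random.Random(42)
    runs = sorted(one_run(rng) for _ in range(10_000))
    print(f"mean {statistics.mean(runs):.1f} min, "
          f"std dev {statistics.stdev(runs):.1f} min, "
          f"95th percentile {runs[int(0.95 * len(runs))]:.1f} min")

Halving BUSY_PROB or the BUSY_DELAY mean and re-running is exactly the "what happens if equipment delays are cut in half?" experiment described above, carried out without disturbing the real process.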
Guidelines for process automation

There are two types of guidelines that will be of interest to those conducting automation work: those that help you figure out what to do and how to do it, and those that must be met to satisfy regulatory requirements (whether evaluated by internal or external groups or organizations).

The first type is going to depend on the nature of the science and the automation being done to support it. Equipment vendors' community support groups can be of assistance, as can professional groups like the Pharmaceutical Research and Manufacturers of America (PhRMA), International Society for Pharmaceutical Engineering (ISPE), and Parenteral Drug Association (PDA) in the pharmaceutical and biotechnology industries, along with similar organizations in other industries and other countries. This may seem like a large jump from laboratory work, but it is appropriate when we consider the ramifications of full-process automation. You are essentially developing a manufacturing operation on a lab bench, and the same concerns that apply to large-scale production apply here as well; you have to ensure that the process is maintained and in control. The same is true of manual or semi-automated lab work, but it is more critical in fully automated systems because of the potentially high volume of results that can be produced.

The second type consists of regulatory guidelines from groups appropriate to your industry, such as the Food and Drug Administration (FDA), Environmental Protection Agency (EPA), and International Organization for Standardization (ISO), as well as broader guidance (e.g., GAMP, GALP). The interesting point is that we are looking at a potentially complete automation scheme for a procedure; does that come under manufacturing or laboratory? The likelihood is that laboratory guidelines will apply, since the work is being done within the lab's footprint; however, there are things that can be learned from their manufacturing counterparts that may assist in project management and documentation. One interesting consideration is what happens when fully automated testing, such as with on-line analyzers, becomes integrated with both the lab and production or process control data/information streams. Which regulatory guidelines apply? It may come down to who is responsible for managing and supporting those systems.

Scheduling automation projects

There are two parts to the scheduling issue: how long is it going to take to complete the project (dependent on the process and people), and when do you start? The second point will be addressed here.

The timing of an automated process coming online is important. If it comes online too soon, there may not be enough work to justify its use, and startup/shutdown procedures may create more work than the system saves. If it comes too late, people will be frustrated by a heavy workload while the system that was supposed to provide relief is still under development.

In Figure 4, the blue line represents the growing need for sample/material processing using a given laboratory procedure. Ideally, you'd like the automated version to be available when that blue line crosses the "automation needed on-line" level of processing requirements; this is the point where the current (manual?) implementation can no longer meet the demands of sample throughput.

Figure 4. Timing the development of an automated system

Those throughput limits are something you are going to have to evaluate and measure on a regular basis, using the results to adjust the planning process (accelerating or slowing it as appropriate). How fast is the demand growing, and at what point will your current methods be overwhelmed? Hiring more people is one option, but then the lab's operating expenses increase due to the cost of people, equipment, and lab space.

Once we have an idea of when something has to be working, we can begin the process of planning. Note that planning can begin at any point; it would be good to get the preliminaries done as soon as a manual process is finalized so that you have an idea of what you'll be getting into. Those preliminaries include looking at equipment that might be used (keeping track of its development), training requirements, developer resources, and implementation strategies, all of which would be updated as new information becomes available. The "we'll-get-to-it-when-we-need-it" approach is just going to create a lot of stress and frustration.

You need to put together a first-pass project plan so that you can detail what you know, and more importantly, what you don't know.
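As a rough, back-of-the-envelope way to locate the start of the yellow box in Figure 4, the sketch below projects when demand crosses the capacity of the current implementation; the demand, capacity, growth rate, and project lead time are all assumed numbers that a lab would replace with its own measurements.

    def weeks_until_overload(current_demand, capacity, growth_per_week):
        """Weeks until demand (samples/week, assumed linear growth) exceeds current capacity."""
        if current_demand >= capacity:
            return 0.0
        return (capacity - current_demand) / growth_per_week

    LEAD_TIME = 40  # assumed weeks to specify, build, validate, and deploy the automation
    crossover = weeks_until_overload(current_demand=300, capacity=700, growth_per_week=5)
    print(f"current methods overwhelmed in about {crossover:.0f} weeks; "
          f"project must start within {crossover - LEAD_TIME:.0f} weeks")

If demand growth is not linear, or the lead-time estimate is soft, the same calculation simply gets re-run as each new measurement arrives, which is the "updated as new information becomes available" loop described above.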
The goal is to have enough information, updated as noted above, that you can determine whether an automated solution is feasible, make an informed initial choice between full and partial automation, and have a timeline for implementation. Any time estimate is going to be subject to change as you gather information and refine your implementation approach. The point of the timeline is to figure out how long the yellow box in Figure 4 is, because that is going to tell you how much time you have to get the plan together and working; it is a matter of setting priorities and recognizing what they are. The time between now and the start of the yellow box is what you have to work with for planning, evaluating plans, and making any decisions that are needed before you begin, including corporate project management requirements and approvals.

Those plans have to include time for validation and for evaluating the new implementation against the standard implementation. Does it work? Do we know how to use and maintain it? Are people educated in its use? Is there documentation for the project?

Budgeting

At some point, all the material above and following this section comes down to budgeting: how much will it cost to implement a program, and is it worth it? Of the two points, the latter is the more important. How do you go about answering it? (Note: Some of this material is also covered in the webinar series A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work in the section on ROI.)

What a lot of this comes down to is explaining and justifying the choices you've made in your project proposal. We're not going to go into a lot of depth, but just note some of the key issues:

Did you choose full or partial automation for your process? What drove that choice? If in your view partial automation would be less expensive than full automation of the process, how long will it be until the next upgrade to another stage is needed?
How independent are the potential, sequential implementation efforts that may be undertaken in the future? Will there be a need to connect them, and if so, how will the incremental costs compare to doing it all at once?
There is a tendency in lab work to treat problems, and the products that might be used to address them, in isolation. You see the need for a LIMS or ELN, or an instrument data system, and the focus is on those issues alone. Effective decisions have to consider both the immediate and longer-term aspects of a problem. If you want access to a LIMS, have you considered how it will affect other aspects of lab work, such as connecting instruments to it?
The same holds true for partial automation as a solution to a lab process productivity problem. While you are addressing a particular step, should you be looking at the potential for synergism by addressing other concerns? Modeling and simulation of processes can help resolve that issue.
Have you factored in the cost of support and education? The support issue needs to address the needs of lab personnel in managing the equipment and the options for vendor support, as well as the impact on IT groups. Note that the IT group will require access to vendor support, as well as education on their role in any project work.
What happens if you don't automate? One way to justify the cost of a project is to help people understand what the lab's operations will be like without it. Will more people, equipment, space, or added shifts be needed? At what cost? What would the impact be on those who need the results, and how would it affect their programs?
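One hedged way to frame that last question is a simple side-by-side cost projection; every figure below is a hypothetical placeholder, standing in for the lab's own estimates of system, support, education, and staffing costs.

    def five_year_cost(upfront, annual):
        """Total cost over a five-year horizon (no discounting, for simplicity)."""
        return upfront + 5 * annual

    # Hypothetical numbers for illustration only.
    automated = five_year_cost(upfront=250_000, annual=40_000)   # system + support/education
    status_quo = five_year_cost(upfront=0, annual=2 * 90_000)    # two added full-time staff
    print(f"automate: ${automated:,}   add staff instead: ${status_quo:,}")

A real justification would refine this with discounting, equipment depreciation, lab space, and the cost of delayed results, but even a crude version makes the "what happens if you don't automate?" conversation concrete.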
Build, buy, or cooperate?

In this write-up and some of the referenced materials, we've noted several times the benefits that clinical labs have gained through automation, although crediting it all to the use of automation alone isn't fair. What the clinical laboratory industry did was recognize that automation was needed to solve problems with the operational costs of running labs, and that the industry could benefit further by coming together and cooperatively addressing lab operational problems.

It's that latter point that made the difference, resulting in standardized communications and purpose-built commercial equipment that could be used to implement automation in their labs. They also had common sample types, common procedures, and common data processing. That same commonality applies to segments of industrial and academic lab work. Take the life sciences as an example. Where possible, that industry has standardized on microplates for sample processing. The result is a wide selection of instruments and robotics built around that sample-holding format that greatly improves lab economics and throughput. While it isn't the answer to everything, it's a good answer to a lot of things.

If your industry segment came together and recognized that you used common procedures, how would you benefit from creating a common approach to automation instead of each lab doing it on its own? It would open the development of common products or product variations from vendors and relieve each lab of the need to develop its own answer. The result could be more effective and more easily supportable solutions.

Project planning

Once you've decided on the project you are going to undertake, the next stage is looking at the steps needed to manage your project (Figure 5).

Figure 5. Steps in a laboratory automation project. This diagram is modeled after the GAMP V approach to systems validation.

The planning begins with the method description from Figure 1, which describes the science behind the project and specifies how the automation is expected to be put into effect: as full-process automation, or as a specific step or steps in the process. The provider of those documents is considered the "customer," consistent with GAMP V nomenclature (Figure 6); that consistency is important due to the need for system-wide validation protocols.

Figure 6. GAMP V model showing customer and supplier roles in specifying and evaluating project components for computer hardware and software.

From there, the "supplier" (e.g., internal development group, consultant, IT services, etc.) responds with a functional specification that is reviewed by the customer. The "analysis, prototyping, and evaluation" step, represented in the third box of Figure 5, is not the same as the process analysis noted earlier in this piece. The earlier analysis was to help you determine what work needed to be done, documented in the user requirements specification; the analysis and associated tasks here are specific to the implementation of this project. The colored arrows refer to the diagram in Figure 7.
That process defines the equipment needed, dependencies, and options/technologies for automation implementations, including robotics, instrument design requirements, pre-built automation (e.g., titrators), and any custom components. The documentation and specifications are part of the validation protocol.

Figure 7. Defining dependencies and qualification of equipment

The prototyping function is an important part of the overall process. It is rare that someone will look at a project and come up with a working solution on the first pass. There are always tinkering and modifications as you move from a blank slate to a working system. You make notes along the way about what should be done differently in the final product, and about places where improvements or adjustments are needed. These all become part of the input to the system design specification that will be reviewed and approved by the customer and supplier. The prototype can be considered a proof of concept, or a demonstration of what will occur in the finished product. Remember also that prototypes would not have to be validated, since they wouldn't be used in a production environment; they are simply a test bed used prior to the development of a production system.

The component design specifications are the refined requirements for elements that will be used in the final design. Those refinements could point to updated models of components or equipment used, modifications needed, or recommendations for products with capabilities beyond those used in the prototype.

The boxes on the left side of Figure 5 are documents that go into increasing depth as the system is designed and specified. The details in those items will vary with the extent of the project. The right side of the diagram is a series of increasingly sophisticated tests and evaluations against the corresponding specifications on the left side, culminating in the final demonstration that the system works, has been validated, and is accepted by the customer. It also means that lab and support personnel are educated in their roles.

Conclusions (so far)

"Laboratory automation" has to give way to "laboratory automation engineering." From the initial need to the completion of the validation process, we have to plan, design, and implement successful systems on a routine basis. Just as the manufacturing industries transitioned from cottage industries to production lines and then to integrated production-information systems, the execution of laboratory science has to tread a similar path if the demands for laboratory results are going to be met in a financially responsible manner. The science is fundamental; however, we now need to pay attention to efficient execution.

Abbreviations, acronyms, and initialisms

AI: Artificial intelligence
AuI: Augmented intelligence
DES: Discrete-event simulation
ELN: Electronic laboratory notebook
EPA: Environmental Protection Agency
FDA: Food and Drug Administration
FRB: Fast radio bursts
GALP: Good automated laboratory practices
GAMP: Good automated manufacturing practice
ISO: International Organization for Standardization
LES: Laboratory execution system
LIMS: Laboratory information management system
ML: Machine learning
ROI: Return on investment
SDMS: Scientific data management system
TAT: Turn-around time

Footnotes

↑ The term "scientific manufacturing" was first mentioned to the author by Mr. Alberto Correia, then of Cambridge Biomedical, Boston, MA.
↑ Intelligent enterprise technologies referenced in the report include robotic process automation, machine learning, artificial intelligence, the Internet of Things, predictive analytics, and cognitive computing.

↑ Doug Engelbart founded the field of human-computer interaction and is credited with the invention of the computer mouse and with the 1968 "Mother of All Demos."

↑ See Metropolis (1927 film) on Wikipedia.

↑ See, for example, https://www.projectmanager.com/project-planning; the simplest thing to do is put "project planning" in a search engine and browse the results for something interesting.

↑ See, for example, https://theinformationdrivenlaboratory.wordpress.com/category/resources/; note that any references to the ILA should be ignored, as the original site is gone, with the domain name perhaps having been leased by another organization that has no affiliation with the original Institute for Laboratory Automation.

↑ As a starting point, view the Artel, Inc. site as one source. Also, John Bradshaw gave an informative presentation on "The Importance of Liquid Handling Details and Their Impact on your Assays" at the 2012 European Lab Automation Conference, Hamburg, Germany.

About the author

Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked and consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.

References

↑ Frey, C.B.; Osborne, M.A. (17 September 2013). "The Future of Employment: How Susceptible Are Jobs to Computerisation?" (PDF). Oxford Martin School, University of Oxford. https://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf. Retrieved 04 February 2021.

↑ Hsu, J. (24 September 2018). "Is it aliens? Scientists detect more mysterious radio signals from distant galaxy". NBC News MACH. https://www.nbcnews.com/mach/science/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586. Retrieved 04 February 2021.

↑ Timmer, J. (18 July 2018). "AI plus a chemistry robot finds all the reactions that will work". Ars Technica. https://arstechnica.com/science/2018/07/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work/5/. Retrieved 04 February 2021.

↑ "HelixAI - Voice Powered Digital Laboratory Assistants for Scientific Laboratories". HelixAI. http://www.askhelix.io/. Retrieved 04 February 2021.

↑ PharmaIQ News (20 August 2018). "Automation, IoT and the future of smarter research environments". PharmaIQ.
https://www.pharma-iq.com/pre-clinical-discovery-and-development/news/automation-iot-and-the-future-of-smarter-research-environments. Retrieved 04 February 2021.

↑ 6.0 6.1 PharmaIQ (14 November 2017). "The Future of Drug Discovery: AI 2020". PharmaIQ. https://www.pharma-iq.com/pre-clinical-discovery-and-development/whitepapers/the-future-of-drug-discovery-ai-2020. Retrieved 04 February 2021.

↑ Rossetto, L. (2018). "Fight the Dour". Wired (October): 826–7. https://www.magzter.com/stories/Science/WIRED/Fight-The-Dour.

↑ "OPUS Package: SEARCH & IDENT". Bruker Corporation. https://www.bruker.com/en/products-and-solutions/infrared-and-raman/opus-spectroscopy-software/search-identify.html. Retrieved 04 February 2021.

↑ Bourne, D. (2013). "My Boss the Robot". Scientific American 308 (5): 38–41. doi:10.1038/scientificamerican0513-38. PMID 23627215.

↑ SalesForce Research (2017). "Second Annual State of IT" (PDF). SalesForce. https://a.sfdcstatic.com/content/dam/www/ocms/assets/pdf/misc/2017-state-of-it-report-salesforce.pdf. Retrieved 04 February 2021.

↑ "Seal Analytical - Products". Seal Analytical. https://seal-analytical.com/Products/tabid/55/language/en-US/Default.aspx. Retrieved 04 February 2021.

↑ "Bran+Luebbe". SPX FLOW, Inc. https://www.spxflow.com/bran-luebbe/. Retrieved 04 February 2021.

↑ "Array Tape Advanced Consumable". Douglas Scientific. https://www.douglasscientific.com/Products/ArrayTape.aspx. Retrieved 04 February 2021.

↑ "Agilent 1200 Series Standard and Preparative Autosamplers - User Manual" (PDF). Agilent Technologies. November 2008. https://www.agilent.com/cs/library/usermanuals/Public/G1329-90012_StandPrepSamplers_ebook.pdf. Retrieved 04 February 2021.

↑ "iPRO Interface - Products". Baytek International, Inc. https://www.baytekinternational.com/products/ipro-interface/89-products. Retrieved 05 February 2021.

↑ 16.0 16.1 Joyce, J. (2018). "Computer Modeling and Simulation". Lab Manager (9): 32–35. https://www.labmanager.com/laboratory-technology/computer-modeling-and-simulation-1826.

↑ 17.0 17.1 Costigliola, A.; Ataíde, F.A.P.; Vieira, S.M. et al. (2017). "Simulation Model of a Quality Control Laboratory in Pharmaceutical Industry". IFAC-PapersOnLine 50 (1): 9014–9019. doi:10.1016/j.ifacol.2017.08.1582.

↑ 18.0 18.1 Meng, L.; Liu, R.; Essick, C. et al. (2013). "Improving Medical Laboratory Operations via Discrete-event Simulation". Proceedings of the 2013 INFORMS Healthcare Conference. https://www.researchgate.net/publication/263238201_Improving_Medical_Laboratory_Operations_via_Discrete-event_Simulation.

↑ 19.0 19.1 (1999). "Application of discrete-event simulation in health care clinics: A survey". Journal of the Operational Research Society 50: 109–23. doi:10.1057/palgrave.jors.2600669.
\n \n\n\n\n\n\n\nSource: <a rel=\"external_link\" class=\"external\" href=\"https:\/\/www.limswiki.org\/index.php\/LII:Considerations_in_the_Automation_of_Laboratory_Procedures\">https:\/\/www.limswiki.org\/index.php\/LII:Considerations_in_the_Automation_of_Laboratory_Procedures<\/a>\nNavigation menuPage actionsLIIDiscussionView sourceHistoryPage actionsLIIDiscussionMoreToolsIn other languagesPersonal toolsLog inRequest accountNavigationMain pageRecent changesRandom pageHelp about MediaWikiSearch\u00a0 ToolsWhat links hereRelated changesSpecial pagesPermanent linkPage informationSponsors \r\n\n\t\r\n\n\t\r\n\n\t\r\n\n\t\r\n\n\t\n\t\n\t\r\n\n\t\r\n\n\t\n\t\r\n\n\t\r\n\n \n\t\n\t\n\t\r\n\n\t\r\n\n \n\t\n\t\r\n\n\t\r\n\n\t\n\t\n\t\r\n\n\t\n\t\n\t\r\n\n\t\r\n\n\t\n\t\n\t\r\n\n\t\r\n\n\t\n\t\r\nPrint\/exportCreate a bookDownload as PDFDownload as PDFDownload as Plain textPrintable version This page was last edited on 18 February 2021, at 19:46.Content is available under a Creative Commons Attribution-ShareAlike 4.0 International License unless otherwise noted.This page has been accessed 1,408 times.Privacy policyAbout LIMSWikiDisclaimers\n\n\n\n","e0147011cc1eb892e1a35e821657a6d9_html":"<body class=\"mediawiki ltr sitedir-ltr mw-hide-empty-elt ns-202 ns-subject page-LII_Considerations_in_the_Automation_of_Laboratory_Procedures rootpage-LII_Considerations_in_the_Automation_of_Laboratory_Procedures skin-monobook action-view skin--responsive\"><div id=\"rdp-ebb-globalWrapper\"><div id=\"rdp-ebb-column-content\"><div id=\"rdp-ebb-content\" class=\"mw-body\" role=\"main\"><a id=\"rdp-ebb-top\"><\/a>\n<h1 id=\"rdp-ebb-firstHeading\" class=\"firstHeading\" lang=\"en\">LII:Considerations in the Automation of Laboratory Procedures<\/h1><div id=\"rdp-ebb-bodyContent\" class=\"mw-body-content\"><!-- start content --><div id=\"rdp-ebb-mw-content-text\" lang=\"en\" dir=\"ltr\" class=\"mw-content-ltr\"><div class=\"mw-parser-output\"><p><b>Title<\/b>: <i>Considerations in the Automation of Laboratory Procedures<\/i>\n<\/p><p><b>Author for citation<\/b>: Joe Liscouski, with editorial modifications by Shawn Douglas\n<\/p><p><b>License for content<\/b>: <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/creativecommons.org\/licenses\/by\/4.0\/\" target=\"_blank\">Creative Commons Attribution 4.0 International<\/a>\n<\/p><p><b>Publication date<\/b>: January 2021\n<\/p>\n\n\n<h2><span class=\"mw-headline\" id=\"Introduction\">Introduction<\/span><\/h2>\n<p>Scientists have been dealing with the issue of <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_automation\" title=\"Laboratory automation\" class=\"wiki-link\" data-key=\"0061880849aeaca05f8aa27ae171f331\">laboratory automation<\/a> for decades, and during that time the meaning of those words has expanded from the basics of connecting an instrument to a computer, to the possibility of a fully integrated <a href=\"https:\/\/www.limswiki.org\/index.php\/Informatics_(academic_field)\" title=\"Informatics (academic field)\" class=\"wiki-link\" data-key=\"0391318826a5d9f9a1a1bcc88394739f\">informatics<\/a> infrastructure beginning with <a href=\"https:\/\/www.limswiki.org\/index.php\/Sample_(material)\" title=\"Sample (material)\" class=\"wiki-link\" data-key=\"7f8cd41a077a88d02370c02a3ba3d9d6\">sample<\/a> preparation and continuing on to the <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_information_management_system\" title=\"Laboratory information management system\" class=\"wiki-link\" 
data-key=\"8ff56a51d34c9b1806fcebdcde634d00\">laboratory information management system<\/a> (LIMS), <a href=\"https:\/\/www.limswiki.org\/index.php\/Electronic_laboratory_notebook\" title=\"Electronic laboratory notebook\" class=\"wiki-link\" data-key=\"a9fbbd5e0807980106763fab31f1e72f\">electronic laboratory notebook<\/a> (ELN), and beyond. Throughout this evolution there has been one underlying concern: how do we go about doing this?\n<\/p><p>The answer to that question has changed from a focus on hardware and programming, to today\u2019s need for a lab-wide informatics strategy. We\u2019ve moved from the bits and bytes of assembly language programming to managing terabytes of files and data structures.\n<\/p><p>The high-end of the problem\u2014the large informatics database systems\u2014has received significant industry-wide attention in the last decade. The stuff on the lab bench, while the target of a lot of individual products, has been less organized and more experimental. Failed or incompletely met promises have to yield to planned successes. How we do it needs to change. This document is about the considerations required when making that change. The haphazard \"let's try this\" method has to give way to more engineered solutions and a realistic appraisal of the human issues, as well as the underlying technology management and planning.\n<\/p><p>Why is this important? Whether you are conducting intense laboratory experiments to produce data and <a href=\"https:\/\/www.limswiki.org\/index.php\/Information\" title=\"Information\" class=\"wiki-link\" data-key=\"6300a14d9c2776dcca0999b5ed940e7d\">information<\/a> or making chocolate chip cookies in the kitchen, two things remain important: productivity and the quality of the products. In either case, if the productivity isn\u2019t high enough, you won\u2019t be able to justify your work; if the quality isn\u2019t there, no one will want what you produce. Conducting laboratory work and making cookies have a lot in common. Your laboratories exist to answer questions. What happens if I do this? What is the purity of this material? What is the structure of this compound? The field of laboratories asking these questions is extensive, basically covering the entire array of lab bench and scientific work, including chemistry, life sciences, physics, and electronics labs. The more efficiently we answer those questions, the more likely it will be that these labs will continue operating and, that you\u2019ll achieve the goals your organization has set. At some point, it comes down to performance against goals and the return on the investment organizations make in lab operations.\n<\/p><p>In addition to product quality and productivity, there are a number of other points that favor automation over manual implementations of lab processes. They include:\n<\/p>\n<ul><li>lower costs per test;<\/li>\n<li>better control over expenditures;<\/li>\n<li>a stronger basis for better <a href=\"https:\/\/www.limswiki.org\/index.php\/Workflow\" title=\"Workflow\" class=\"wiki-link\" data-key=\"92bd8748272e20d891008dcb8243e8a8\">workflow<\/a> planning;<\/li>\n<li>reproducibility;<\/li>\n<li>predictably; and<\/li>\n<li>tighter adherence to procedures, i.e., consistency.<\/li><\/ul>\n<p>Lists similar to the one above can be found in justifications for lab automation, and cookie production, without further comment. It\u2019s just assumed that everyone agrees and that the reasoning is obvious. 
Since we are going to use those items to justify the cost and effort that goes into automation, we should take a closer look at them.

Let's begin with reproducibility, predictability, and consistency, very similar concerns that reflect automation's ability to produce the same product with the desired characteristics over and over. For data and information, that means that the same analysis on the same materials will yield the same results, that all the steps are documented, and that the process is under control. The variability that creeps into the execution of a process by people is eliminated. That variability in human labor can result from the quality of training, equipment setup and calibration, or readings from analog devices (e.g., meters, pipette meniscus, charts, etc.); there is a long list of potential issues.

Concerns with reproducibility, predictability, and consistency are common to production environments, general lab work, manufacturing, and even food service. There are several pizza restaurants in our area using one of two methods of making the pies. Both start the preparation the same way, spreading dough and adding cheese and toppings, but the differences are in how they are cooked. One method uses standard ovens (e.g., gas, wood, or electric heating); the pizza goes in, the cook watches it, and then removes it when the cooking is completed. This leads to a lot of variability in the product, some a function of the cook's attention, some depending on requests for over- or under-cooking the crust. Some is based on "have it your way" customization. The second method uses a metal conveyor belt to move the pie through an oven. The oven temperature is set, as is the speed of the belt, and as long as the settings are the same, you get a reproducible, consistent product order after order. It's a matter of priorities: manual versus automated, consistent product quality versus how the cook feels that day. In the end, reducing variability and being able to demonstrate consistent, accurate results gives people confidence in your product.

Lower costs per test, better control over expenditures, and better workflow planning also benefit from automation. Automated processes are more cost-efficient since the sample throughput is higher and the labor cost is reduced. The cost per test and the material usage are predictable, since variability in the components used in testing is reduced or eliminated, and workflow planning is improved: since the time per test is known, work can be better scheduled. Additionally, process scale-up should be easier if there is high demand for particular procedures. However, there is a lot of work that has to be considered before automation is realizable, and that is where this discussion is headed.

How does this discussion relate to previous work?

This work follows on the heels of two previous works:

Computerized Systems in the Modern Laboratory: A Practical Guide (2015): This book presents the range of informatics technologies, their relationship to each other, and the role they play in laboratory work.
It differentiates a LIMS from an ELN and scientific data management system (SDMS), for example, contrasting their use and how they would function in different lab working environments. In addition, it covers topics such as support and regulatory issues.

A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work (2018): This webinar series complements the above text. It begins by introducing the major topics in informatics (e.g., LIMS, ELN, etc.) and then discusses their use from a strategic viewpoint. Where and how do you start planning? What is your return on investment? What should get implemented first, and then what are my options? The series then moves on to developing an information management strategy for the lab, taking into account budgets, support, ease of implementation, and the nature of your lab's work.

The material in this write-up picks up where the last part of the webinar series ends. The last session covers lab processes, and this picks up that thread and goes into more depth concerning a basic issue: how do you move from manual methods to automated systems?

Productivity has always been an issue in laboratory work. Until the 1950s, a lab had little choice but to add more people if more work needed to be done. Since then, new technologies have afforded wider options, including new instrument technologies. The execution of the work was still done by people, but the tools were better. Now we have other options. We just have to figure out when, if, and how to use them.

Before we get too far into this...

With elements such as productivity, return on investment (ROI), data quality, and data integrity as driving factors in this work, you shouldn't be surprised if a lot of the material reads like a discussion of manufacturing methodologies; we've already seen some examples. We are talking about scientific work, but the same things that drive those elements in labs have very close parallels in product manufacturing.
The work we are describing here will be referenced as "scientific manufacturing": manufacturing or production in support of scientific programs.[a]

The key points of a productivity conversation in both lab and material production environments are almost exact overlays; the only significant difference is that the results of the efforts are data and information in one case, and a physical item you might sell in the other. Product quality and integrity are valued considerations in both. For scientists, this may require an adjustment to their perspectives when dealing with automation. On the plus side, the lessons learned in product manufacturing can be applied to lab bench work, making the path to implementation a bit easier while providing a framework for understanding what a successful automation effort looks like. People with backgrounds in product manufacturing can be a useful resource in the lab, with a bit of an adjustment in perspective on their part.

Transitioning from typical lab operations to automated systems

Transitioning a lab from its current state of operations to one that incorporates automation can raise a number of questions, and people's anxiety levels. There are several questions that should be considered to set expectations for automated systems and how they will impact jobs and the introduction of new technologies. They include:

What will happen to people's jobs as a result of automation?
What is the role of artificial intelligence (AI) and machine learning (ML) in automation?
Where do we find the resources to carry out automation projects/programs?
What equipment would we need for automated processes, and will it be different than what we currently have?
What role does a laboratory execution system (LES) play in laboratory automation?
How do we go about planning for automation?

What will happen to people's jobs as a result of automation?

Stories are appearing in print, online, and in television news reporting about the potential for automation to replace human effort in the labor force. It seems like an all-or-none situation: either people will continue working in their occupations, or automation (e.g., mechanical, software, AI, etc.) will replace them. The storyline is that people are expensive and automated work can be less costly in the long run. If commercial manufacturing is a guide, automation is a preferred option from both a productivity and an ROI perspective.
In order to make productivity gains from automation similar to those seen in commercial manufacturing, there are some basic requirements and conditions that have to be met:

The process has to be well documented and understood, down to the execution of each step without variation, while error detection and recovery have to be designed in.
The process has to remain static and be expected to continue over enough execution cycles to make it economically attractive to design, build, and maintain.
Automation-compatible equipment has to be available. Custom-built components are going to be expensive and could represent a barrier to successful implementation.
There has to be a driving need to justify the cost of automation; economics, the volume of work that has to be addressed, working with hazardous materials, and a lack of educated workers are just a few of the factors that would need to be considered.

There are places in laboratory work where production-scale automation has been successfully implemented; life sciences applications for processes based on microplate technologies are one example. When we look at the broad scope of lab work across disciplines, most lab processes don't lend themselves to that level of automation, at least not yet. We'll get into this in more detail later. But that brings us back to the starting point: what happens to people's jobs?

In the early stages of manufacturing automation, as well as in fields such as mining where work was labor-intensive and repetitive, people did lose jobs when new methods of production were introduced. That shift from a human workforce to automated task execution is expanding as system designers probe markets from retail to transportation.[1] Lower-skilled occupations gave way first, and we find ourselves facing automation efforts that are moving up the skills ladder; the most recent is the potential for automated driving, a technology that has yet to be fully embraced but is moving in that direction. The problem that leaves us with is providing displaced workers with a means of employment that gives them at least a living income, and the purpose, dignity, and self-worth that they'd like to have. This is going to require significant education, and people are going to have to come to grips with the realization that education never stops.

Due to the push for increased productivity, lab work has seen some similar developments in automation. The development of automated pipettes, titration stations, auto-injectors, computer-assisted instrumentation, and automation built to support microplate technologies represents just a few places where specific tasks have been addressed. However, these developments haven't moved people out of the workplace as has happened in manufacturing, mining, etc. In some cases they've changed the work, replacing repetitive, time-consuming tasks with equipment that allows lab personnel to take on different tasks. In other cases the technology addresses work that couldn't be performed in a cost-effective manner with human effort; without automation, that work might just not be feasible due to the volume of work (whose delivery might be limited by the availability of the right people, equipment, and facilities) or the need to work with hazardous materials.
Automation may prevent the need for hiring new people while giving those currently working more challenging tasks.

As noted in the previous paragraph, much of the automation in lab work is at the task level: equipment designed to carry out a specific function, such as Karl Fischer titrations. Some equipment designed around microplate formats can function at both the task level and as part of a user-integrated robotics system. This gives the planner useful options for the introduction of automation, making it easier for personnel to get accustomed to automation before moving into scientific manufacturing.

Overall, laboratory people shouldn't be losing their jobs as a result of lab automation, but they do have to be open to changes in their jobs, and that could require an investment in their education. Take someone whose current job is to carry out a lab procedure, someone who understands all aspects of the work, including troubleshooting equipment, reagents, and any special problems that may crop up. Someone else may have developed the procedure, but that person is the expert in its execution.

First of all, you need these experts to help plan and test the automated systems if you decide to create that project. These would also be the best people to educate as automated systems managers; they know how the process is supposed to work and should be in a position to detect problems. If it crashes, you'll need someone who can cover the work while problems are being addressed. Secondly, if lab personnel get the idea that they are watching their replacement being installed, they may leave before the automated systems are ready. In the event of a delay, you'll have a backlog and no one to handle it.

Beyond that, people will be freed from the routine of carrying out processes and be able to address work that had been put on a back burner. As we move toward automated systems, jobs will change by expanding to accommodate typical lab work, as well as the management, planning, maintenance, and evolution of laboratory automation and computing.

Automation in lab work is not an "all or none" situation. Processes can be structured so that the routine work is done by systems, and the analyst can spend time reviewing the results, looking for anomalies and interesting patterns, while being able to make decisions about the need for and nature of follow-on efforts.

What is the role of AI and ML in automation?

When we discuss automation, what we are referencing now is basic robotics and programming. AI may, and likely will, play a role in the work, but first we have to get the foundations right before we consider the next step; we need to put in the human intelligence first. Part of the issue with AI is that we don't know what it is.

Science fiction aside, many of today's applications of AI have a limited role in lab work. Here are some examples:

Having a system that can bring up all relevant information on a research question (a sort of super Google, or a variation of IBM's Watson) could have significant benefits.
Analyzing complex data or large volumes of data could be beneficial, e.g., the analysis of radio astronomy data to find fast radio bursts (FRB).
After discovering 21 FRB signals upon analyzing five hours of data, researchers at the Green Bank Telescope used AI to analyze 400 terabytes of older data and detected another 100.[2]
"[A] team at Glasgow University has paired a machine-learning system with a robot that can run and analyze its own chemical reaction. The result is a system that can figure out every reaction that's possible from a given set of starting materials."[3]
HelixAI is using Amazon's Alexa as a digital assistant for laboratory work.[4]

Note that the points above are research-based applications, not routine production environments where regulatory issues are important. While there are research applications that might be more forgiving of AI systems, because the results are evaluated by human intelligence and problematic results can be made subject to further verification, data entry systems such as voice entry have to be carefully tested, and the results of that data entry verified and shown to be correct.

Pharma IQ continues to publish material on advanced topics in laboratory informatics, including articles on how labs are benefiting from new technologies[5] and survey reports such as AI 2020: The Future of Drug Discovery. In that report they note[6]:

"94% of pharma professionals expect that intelligent technologies will have a noticeable impact on the pharmaceutical industry over the next two years."
"Almost one fifth of pharma professionals believe that we are on the cusp of a revolution."
"Intelligent automation and predictive analytics are expected to have the most significant impact on the industry."
"However, a lack of understanding and awareness about the benefits of AI-led technologies remain a hindrance to their implementation."

Note that these are expectations, not a reflection of current reality. That same report makes comments about the impact of AI on headcount disruption, asking, "Do you expect intelligent enterprise technologies[b] to significantly cut and/or create jobs in pharma through 2020?" Among the responses, 47 percent said they expected those technologies to do both, 40 percent said they will create new job opportunities, and 13 percent said there will be no dramatic change, with zero percent saying they expected solely job losses.[6]

While there are high levels of expectations and hopes for results, we need to approach the idea of AI in labs with some caution. We read about examples based on machine learning (ML), for example using computer systems to recognize cats in photos, to recognize faces in a crowd, etc.
We don't know how they accomplish their tasks, and we can't analyze their algorithms and decision-making. That leaves us with trying to test quality in, which at best is an uncertain process with qualified results (it has worked so far). One problem with testing AI systems based on ML is that they are going to continually evolve, so testing may affect the ML processes by providing a bias. It may also cause continued, redundant testing, because something we thought was evaluated was changed by the "experiences" the AI based its learning on. As one example, could the AI modify the science through process changes without our knowing, because it didn't understand the science or the goals of the work?

AI is a black box with ever-changing contents. That shouldn't be taken as a condemnation of AI in the lab, but rather as a challenge to human intelligence in evaluating, proving, and applying the technology. That application includes defining the operating boundaries of an AI system. Rather than creating a master AI for a complete process, we may elect to divide the AI's area of operation into multiple, independent segments, with segment integration occurring in later stages once we are confident in their ability to work and show clear evidence of systems stability. In all of this we need to remember that our goal is the production of high-quality data and information in a controlled, predictable environment, not gee-whiz technology. One place where AI (or clever programming) could be of use is in better workflow planning, which takes into account current workloads and assignments, factors in the inevitable panic-level testing need, and, perhaps in a QC/production environment, anticipates changes in analysis requirements based on changes in production operations.

Throughout this section I've treated "AI" as "artificial intelligence," its common meaning. There may be a better way of looking at it for lab use, as noted in this excerpt from the October 2018 issue of Wired magazine[7]:

Augmented intelligence. Not "artificial," but how Doug Engelbart[c] envisioned our relationship with computers: AI doesn't replace humans. It offers idiot-savant assistants that enable us to become the best humans we can be.

Augmented intelligence (AuI) is a better term for what we might experience in lab work, at least in the near future. It suggests something that is both more realistic and attainable, with the synergism that would make it, and automation, attractive to lab management and personnel: a tool they can work with to improve lab operations that doesn't carry the specter of something going on that they don't understand or control. OPUS/SEARCH from Bruker might be just such an entry in this category.[8] AuI may serve as a first-pass filter for large data sets (as in the radio astronomy and chemistry examples noted earlier), reducing those sets of data and information to smaller collections that human intelligence can and should evaluate.
However, that does put a burden on the AuI to avoid excessive false positives or negatives, something that can be adjusted over time.

Beyond that, there is the possibility of more cooperative work between people and AuI systems. An article in Scientific American titled "My Boss the Robot"[9] describes the advantage of a human-robot team, with the robot doing the heavy work and the human, under the robot's guidance, doing the work he was more adept at, versus a team of experts with the same task. The task, welding a Humvee frame, was completed by the human-machine pair in 10 hours at a cost of $1,150; the team of experts took 89 hours at a cost of $7,075. That might translate into laboratory work as having a robot do routine, highly repetitive tasks while the analyst oversees the operation and does higher-level analysis of the results.

Certainly, AI/AuI is going to change over time as programming and software technology become more sophisticated and capable; today's example of AuI might be seen as tomorrow's clever software. However, a lot depends on the experience of the user.

There is something important to ask about laboratory technology development, and AI in particular: is the direction of development going to be the result of someone's innovation that people look at and embrace, or will it be the result of a deliberate choice of lab people saying "this is where we need to go, build systems that will get us there"? The difference is important, and lab managers and personnel need to be in control of the planning and implementation of systems.

Where do we find the resources to carry out automation projects/programs?

Given the potential scope of work, you may need people with skills in programming, robotics, instrumentation, and possibly mechanical or electrical engineering if off-the-shelf components aren't available. The biggest need is for people who can do the planning and optimization needed as you move from manual to semi- or fully automated systems, particularly specialists in process engineering who can organize and plan the work, including the process controls and provision for statistical process control.

We need to develop people who are well versed in laboratory work and the technologies that can be applied to that work, as assets in laboratory automation development and planning. In the past, this role has been filled by lab personnel having an interest in the subject, IT people willing to extend their responsibilities, and/or outside consultants. A 2017 report by Salesforce Research states that "77% of IT leaders believe IT functions as an extension/partner of business units rather than as a separate function."[10] The report makes no mention of laboratory work or manufacturing aside from those being functions within the businesses surveyed. Unless a particular effort is made, IT personnel rarely have the backgrounds needed to meet the needs of lab work.
In many cases, they will try to fit lab needs into software they are already familiar with, rather than extend their backgrounds into new computational environments. Office and pure database applications are easily handled, but when we get to the lab bench, it's another matter entirely.\n<\/p><p>The field is getting complex enough that we need people whose responsibilities span both science and technology. This subject is discussed in the webinar series <i><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:A_Guide_for_Management:_Successfully_Applying_Laboratory_Systems_to_Your_Organization%27s_Work\" title=\"LII:A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work\" class=\"wiki-link\" data-key=\"00b300565027cb0518bcb0410d6df360\">A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work<\/a><\/i>, Part 5 \"Supporting Laboratory Systems.\"\n<\/p>\n<h3><span id=\"rdp-ebb-What_equipment_would_we_need_for_automated_processes,_and_will_it_be_different_that_what_we_currently_have?\"><\/span><span class=\"mw-headline\" id=\"What_equipment_would_we_need_for_automated_processes.2C_and_will_it_be_different_that_what_we_currently_have.3F\">What equipment would we need for automated processes, and will it be different from what we currently have?<\/span><\/h3>\n<p>This is an interesting issue, and it directly addresses the commitment labs have to automation, particularly robotics. In the early days of lab automation, when Zymark (Zymate and Benchmate), <a href=\"https:\/\/www.limswiki.org\/index.php\/PerkinElmer_Inc.\" title=\"PerkinElmer Inc.\" class=\"wiki-link\" data-key=\"dabda40785b60866d056709e611512f8\">Perkin Elmer<\/a>, and Hewlett Packard (ORCA) were the major players in the market, the robot had to adapt to equipment that was designed for human use: standard laboratory equipment. They did that through special modifications and the use of different grippers to handle test tubes, beakers, and flasks. While some companies wanted to test the use of robotics in the lab, they didn\u2019t want to invest in equipment that could only be used with robots; they wanted lab workers to pick up where the robots left off in case the robots didn\u2019t work.\n<\/p><p>Since then, equipment has evolved to support automation more directly. In some cases it is a device (e.g., a balance, pH meter, etc.) that has front-panel human operator capability and rear connectors for computer communications. Liquid handling systems have seen the most advancement through the adoption of microplate formats and equipment designed to work with them. However, the key point is standardization of the sample containers. Vials and microplates lend themselves to a variety of automation devices, from sample processing to auto-injectors\/samplers. The issue is getting the samples into those formats.\n<\/p>
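<p>Devices like the balances and pH meters mentioned above are typically addressed over simple serial or network links. The following is a minimal, hypothetical sketch of polling a balance over RS-232 using the pyserial library; the port name and the \u201cP\u201d (print) command are illustrative assumptions, since each instrument documents its own command set and framing.\n<\/p>\n<pre>\n# Minimal sketch: polling a bench balance over RS-232 (pyserial).\n# The port name and the 'P' print command are assumptions for\n# illustration; consult the instrument manual for its actual protocol.\nimport serial  # pip install pyserial\n\nwith serial.Serial('/dev/ttyUSB0', baudrate=9600, timeout=2) as port:\n    port.write(b'P' + bytes([13, 10]))  # 'P' plus a CR/LF terminator\n    reply = port.readline().decode('ascii', errors='replace').strip()\n    print('raw balance reply:', reply)  # e.g., '12.3456 g' on some units\n<\/pre>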
<p>One point that labs, in any scientific discipline, have to come to grips with is the commitment to automation. That commitment isn\u2019t going to be made on a lab-wide basis, but on a procedure-by-procedure basis. Full automation may not be appropriate for all lab work, whereas partial automation may be a better choice, and in some cases no automation may be required (we\u2019ll get into that later). The point that needs to be addressed is the choice of equipment. In most cases, equipment is designed for use by people, with options for automation and electronic communications. However, if you want to maximize throughput, you may have to follow examples from manufacturing and commit to equipment that is only used by automation. That will mean a redesign of the equipment, a shared risk for both the vendors and the users. The upside is that equipment can be specifically designed for a task, be more efficient, have the links needed for integration, use less material, and likely take up less space. One example is the microplate, allowing for tens, hundreds, or thousands (depending on the plate used) of sample cells in a small space. What used to take many cubic feet of space as test tubes (the precursor to using microplates) is now a couple of cubic inches, using much less material and working space. Note, however, that while microplates are used by lab personnel, their use in automated systems provides greater efficiency and productivity.\n<\/p><p>The idea of equipment used only in an automated process isn\u2019t new. The development and commercialization of segmented flow analyzers\u2014initially by Technicon in the form of the AutoAnalyzers for general use, and the SMA (Sequential Multiple Analyzer) and SMAC (Sequential Multiple Analyzer with Computer) in clinical markets\u2014improved a lab's ability to process samples. These systems were phased out in favor of new equipment that consumed less material. Products like these are provided by Seal Analytical<sup id=\"rdp-ebb-cite_ref-SealAnal_14-0\" class=\"reference\"><a href=\"#cite_note-SealAnal-14\">[11]<\/a><\/sup> for environmental work and Bran+Luebbe (a division of SPX Process Equipment in Germany).<sup id=\"rdp-ebb-cite_ref-BranLuebbe_15-0\" class=\"reference\"><a href=\"#cite_note-BranLuebbe-15\">[12]<\/a><\/sup>\n<\/p><p>The issue in committing to automated equipment is that vendors and users will have to agree on equipment specifications and use them within procedures. One place this has been done successfully is in clinical chemistry labs. What other industry workflows could benefit? Do the vendors lead, or do the users drive the issue? Vendors need to be convinced that there is a viable market for a product before making an investment, and users need to be equally convinced that they will succeed in applying those products. In short, procedures that are important to a particular industry have to be identified, and both users and vendors have to come together to develop automated procedure and equipment specifications for products. This has been done successfully in clinical chemistry markets, to the extent that equipment is marketed as validated for particular procedures.\n<\/p>\n<h3><span id=\"rdp-ebb-What_role_does_a_LES_play_in_laboratory_automation?\"><\/span><span class=\"mw-headline\" id=\"What_role_does_a_LES_play_in_laboratory_automation.3F\">What role does an LES play in laboratory automation?<\/span><\/h3>\n<p>Before ELNs settled into their current role in laboratory work, the initial implementations differed considerably from what we have now. LabTech Notebook was released in 1986 (discontinued in 2004) to provide communications between computers and devices that used RS-232 serial communications. In the early 2000s, SmartLab from Velquest was the first commercial product to carry the \"electronic laboratory notebook\" identifier. 
That product became a stand-alone entry in the laboratory execution system (LES) market; since its release, the same conceptual functionality has been incorporated into LIMS and ELNs that fit the more current expectation for an ELN.\n<\/p><p>At its core, an LES is a set of scripted test procedures that an analyst follows to carry out a laboratory method, essentially functioning as the programmed execution of a lab process. Each step in a process is described, followed exactly, and provision is made within the script for data collection. In addition, the LES can\/will (depending on the implementation; \"can\" in the case of SmartLab) check to see if the analyst is qualified to carry out the work and that the equipment and reagents are current, calibrated, and suitable for use. The systems can also have access to help files that an analyst can reference if there are questions about how to carry out a step or resolve issues. Beyond that, the software has the ability to work with lab instruments and automatically acquire data, either through direct interfaces (e.g., balances, pH meters, etc.) or through parsing PDF files of instrument reports.\n<\/p><p>There are two reasons that these systems are attractive. First, they provide for a rigorous execution of a process, with each step being logged as it is done. Second, that log provides a regulatory inspector with documented evidence that the work was done properly, making it easier for the lab to meet any regulatory burden.\n<\/p><p>Since the initial development of SmartLab, that product has changed ownership and is currently in the hands of <a href=\"https:\/\/www.limswiki.org\/index.php\/Dassault_Syst%C3%A8mes_SA\" title=\"Dassault Syst\u00e8mes SA\" class=\"wiki-link\" data-key=\"1be69bd73e35bc3db0c3229284bf9416\">Dassault Syst\u00e8mes<\/a> as part of the BIOVIA product line. As noted above, LIMS and ELN vendors have incorporated similar functionality into their products. Using those features requires \u201cscripting\u201d (in reality, software development), but it does allow access to the database structures within those products. The SmartLab software needed programmed interfaces to other vendors' LIMS and ELNs to gain access to the same information.\n<\/p>\n<h4><span id=\"rdp-ebb-What_does_this_have_to_do_with_automation?\"><\/span><span class=\"mw-headline\" id=\"What_does_this_have_to_do_with_automation.3F\">What does this have to do with automation?<\/span><\/h4>\n<p>When we think about automated systems, particularly full automation with robotic support, it is a programmed process from start to finish. The samples are introduced at the start, and the process continues until the final data\/information is reported and stored. These can be large-scale systems using microplate formats, including tape-based systems from Douglas Scientific<sup id=\"rdp-ebb-cite_ref-DouglasScientificArrayTape_16-0\" class=\"reference\"><a href=\"#cite_note-DouglasScientificArrayTape-16\">[13]<\/a><\/sup>, programmable autosamplers such as those from <a href=\"https:\/\/www.limswiki.org\/index.php\/Agilent_Technologies,_Inc.\" title=\"Agilent Technologies, Inc.\" class=\"wiki-link\" data-key=\"dcea1a676a012bcbe3af9562dd17f8a0\">Agilent<\/a><sup id=\"rdp-ebb-cite_ref-Agilent1200Series_17-0\" class=\"reference\"><a href=\"#cite_note-Agilent1200Series-17\">[14]<\/a><\/sup>, or systems built around robotic arms from a variety of vendors that move samples from one station to another.\n<\/p>
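<p>To make the scripted-execution idea concrete, here is a minimal sketch of the control loop an LES-style system enforces. The step structure, qualification check, and calibration check are hypothetical illustrations, not the SmartLab\/BIOVIA API or any vendor\u2019s implementation.\n<\/p>\n<pre>\n# Minimal sketch of LES-style scripted execution (hypothetical names;\n# not any vendor's API). Steps run in order, preconditions are verified\n# first, and every action is logged for later regulatory inspection.\nfrom datetime import datetime, timezone\n\ndef run_method(steps, analyst, instruments, audit_log):\n    for step in steps:\n        if step['qualification'] not in analyst['qualifications']:\n            raise PermissionError('analyst not qualified for ' + step['name'])\n        inst = instruments.get(step.get('instrument'))\n        now = datetime.now(timezone.utc)\n        if inst and now > inst['calibration_due']:\n            raise RuntimeError(inst['id'] + ': calibration expired')\n        result = step['action']()  # perform the scripted step\n        audit_log.append({'step': step['name'],\n                          'analyst': analyst['name'],\n                          'result': result,\n                          'time': now.isoformat()})\n<\/pre>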
<p>Both LES and the automation noted above have the following point in common: there is a strict process that must be followed, with no provision for variation. The difference is that in one case the process is implemented completely through the use of computers, as well as electronic and mechanical equipment. In the other case, the process is being carried out by lab personnel using computers, as well as electronic and mechanical lab equipment. In essence, people take the place of mechanical robots, which conjures up all kinds of images going back to the 1927 film <i>Metropolis<\/i>.<sup id=\"rdp-ebb-cite_ref-18\" class=\"reference\"><a href=\"#cite_note-18\">[d]<\/a><\/sup> Though the LES represents a step toward more sophisticated automation, both methods still require:\n<\/p>\n<ul><li>programming, including \u201cscripting\u201d (the LES methods are a script that has to be followed);<\/li>\n<li>validated, proven processes; and<\/li>\n<li>qualified staff, though the qualifications differ. (In both cases they have to be fully qualified to carry out the process in question. However, in the full-automation case, they will require more education on running, managing, and troubleshooting the systems.)<\/li><\/ul>\n<p>In the case of full automation, there has to be sufficient justification for the automation of the process, including a sufficient volume of samples to process. The LES-human implementation can be run for a single sample if needed, and the operating personnel can be trained on multiple procedures, switching tasks as needed. Electro-mechanical automation would require a change in programming, verification that the system is operating properly, and possibly equipment re-configuration. Which method is better for a particular lab depends on trade-offs between sample load, throughput requirements, cost, and flexibility. People are adaptable, easily moving between tasks, whereas equipment has to be adapted to a task.\n<\/p>\n<h3><span id=\"rdp-ebb-How_do_we_go_about_planning_for_automation?\"><\/span><span class=\"mw-headline\" id=\"How_do_we_go_about_planning_for_automation.3F\">How do we go about planning for automation?<\/span><\/h3>\n<p>There are three forms of automation to be considered:\n<\/p>\n<ol><li>No automation \u2013 Instead, the lab relies on lab personnel to carry out all steps of a procedure.<\/li>\n<li>Partial automation \u2013 Automated equipment is used to carry out steps in a procedure. Given the current state of laboratory systems, this is the most prevalent form, since most lab equipment has computer components built in to facilitate its use.<\/li>\n<li>Full automation \u2013 The entire process is automated. The definition of \u201centire\u201d is open to each lab\u2019s interpretation and may vary from one process to another. 
For example, some samples may need some handling before they are suitable for use in a procedure. That might be a selection process from a freezer, grinding materials prior to a solvent extraction, and so on, representing cases where the equipment available isn\u2019t suitable for automated equipment interaction. One goal is to minimize this effort, since it can put a limit on the productivity of the entire process. This is also an area where negotiation between the lab and the sample submitter can be useful. Take plastic pellets, for example, which often need to be ground into a coarse powder before they can be analyzed; having the submitter provide them in this form will reduce the time and cost of the analysis. Standardizing on the sample container can also facilitate the analysis (having the lab provide the submitter with standard sample vials using barcodes or RFID chips can streamline the process).<\/li><\/ol>\n<p>One common point that these three forms share is the need for a well-described method (procedure, process). That method should be fully developed, tested, and validated. This is the reference point for evaluating any form of automation (Figure 1).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig1_Liscouski_ConsidAutoLabProc21.png\" class=\"image wiki-link\" data-key=\"684c380f41ae1fa9a0e3ed8844cc8c6d\"><img alt=\"Fig1 Liscouski ConsidAutoLabProc21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/31\/Fig1_Liscouski_ConsidAutoLabProc21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 1.<\/b> Items to be considered in automating systems<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The documentation for the chosen method should include the bulleted list of items from Figure 1, as they describe the science aspects of the method. The last four points are important. The method should be validated, since the manual procedure is a reference point for determining if the automated system is producing useful results. The reproducibility metric offers a means of evaluating at least one expected improvement in an automated system; you\u2019d expect less variability in the results. This requires a set of reference sample materials that can be repeatedly evaluated to compare the manual and automated systems, and to periodically test the methods in use to ensure that there aren\u2019t any trends developing that would compromise the method\u2019s use. Basically, this amounts to statistical quality control on the processes; a minimal sketch of such a check follows below.\n<\/p><p>The next step is to decide what improvements you are looking for in an automated system: increased throughput, lower cost of operation, the ability to off-load human work, reduced variability, etc. In short, what are your goals?\n<\/p>
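<p>Here is that minimal sketch of statistical quality control on a standard\/reference sample. The assay values are hypothetical; the check is a simple Shewhart-style rule flagging results outside three standard deviations of a baseline established while the method was known to be in control.\n<\/p>\n<pre>\n# Minimal SQC sketch on a reference sample (hypothetical data).\n# Control limits come from an in-control baseline; later runs are\n# flagged when they drift outside mean +/- 3 sigma.\nfrom statistics import mean, stdev\n\nbaseline = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00]  # % assay\ncenter, sigma = mean(baseline), stdev(baseline)\n\nfor run, value in enumerate([10.01, 10.04, 10.12, 9.95], start=1):\n    status = 'OUT OF CONTROL' if abs(value - center) > 3 * sigma else 'OK'\n    print(f'run {run}: {value:.2f} ({status})')\n<\/pre>\n<p>The same check, charted over time, also documents the long-term stability of the method.\n<\/p>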
<p>That brings us to the matter of project planning. We\u2019re not going to go into a lot of depth on project planning in this piece, as there are a number of references<sup id=\"rdp-ebb-cite_ref-19\" class=\"reference\"><a href=\"#cite_note-19\">[e]<\/a><\/sup> on the subject, including material produced by the former Institute for Laboratory Automation.<sup id=\"rdp-ebb-cite_ref-20\" class=\"reference\"><a href=\"#cite_note-20\">[f]<\/a><\/sup> There are some aspects of the subject that we do need to touch on, however, and they include:\n<\/p>\n<ul><li>justifying the project and setting expectations and goals;<\/li>\n<li>analyzing the process;<\/li>\n<li>scheduling automation projects; and<\/li>\n<li>budgeting.<\/li><\/ul>\n<h4><span id=\"rdp-ebb-Justification,_expectations,_and_goals\"><\/span><span class=\"mw-headline\" id=\"Justification.2C_expectations.2C_and_goals\">Justification, expectations, and goals<\/span><\/h4>\n<p>Basically, why are you doing this, and what do you expect to gain? What arguments are you going to use to justify the work and expense involved in the project? How will you determine if the project is successful?\n<\/p><p>Fundamentally, automation efforts are about productivity and the bulleted items noted in the introduction of this piece, repeated below with additional commentary:\n<\/p>\n<ul><li>Lower costs per test, and better control over expenditures: These can result from a reduction in labor and materials costs, including more predictable and consistent reagent usage per test.<\/li>\n<li>Stronger basis for better workflow planning: Informatics systems can provide better management over workloads and resource allocation, while key performance indicators can show where bottlenecks are occurring or if samples are taking too long to process. These can be triggers for procedure automation to improve throughput.<\/li>\n<li>Reproducibility: The test results from automated procedures can be expected to be more reproducible, eliminating the variability that is typical of steps executed by people. Small variations in dispensing reagents, for example, could be eliminated.<\/li>\n<li>Predictability: The time to completion for a given test is more predictable in automated programs; once the process starts, it keeps going without the interruptions that can be found in human-centered activities.<\/li>\n<li>Tighter adherence to procedures: Automated procedures have no choice but to be consistent in procedure execution; that is what programming and automation are about.<\/li><\/ul>\n<p>Of these, which are important to your project? If you achieved these goals, what would it mean to your lab\u2019s operations and the organization as a whole? This is part of the justification for carrying out the projects.\n<\/p><p>As noted earlier, there are several things to consider in order to justify a project. First, there has to be a growing need that supports a procedure\u2019s automation, one that can\u2019t be satisfied by other means, which could include adding people, equipment, and lab space, or outsourcing the work (with the added burden of ensuring data quality and integrity, and integrating that work with the lab\u2019s data\/information). Second, the cost of the project must be balanced by its benefits. This includes any savings in cost, people (not reducing headcount, but avoiding new hires), material, and equipment, as well as the improvement of timeliness of results and overall lab operations. Third, when considering project justification, the automated process\u2019s useful lifetime has to be long enough to justify the development work. 
And finally, the process has to be stable, so that you aren\u2019t in a constant re-development situation (this differs from periodic upgrades and performance improvements, known as EVOP, or evolutionary operation, in manufacturing terms). One common point of failure in projects is changes in underlying procedures; if the basic process model changes, you are trying to hit a moving target. That ruins schedules and causes budgets to inflate.\n<\/p><p>This may seem like a lot to think about for something that could be as simple as moving from manual pipettes to automatic units, but in that case it just means the total effort to do the work will be small. However, it is still important, since it impacts data quality and integrity, and your ability to defend your results should they be challenged. And, by the way, the issue of automated pipettes isn\u2019t simple; there is a lot to consider in properly specifying and using these products.<sup id=\"rdp-ebb-cite_ref-21\" class=\"reference\"><a href=\"#cite_note-21\">[g]<\/a><\/sup>\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Analyzing_the_process\">Analyzing the process<\/span><\/h4>\n<p>Assuming that you have a well-described, thoroughly tested, and validated procedure, that process has to be analyzed for optimization and suitability for automation. This is an end-to-end evaluation, not just an examination of isolated steps. This is an important point. Looking at a single step without taking into account the rest of the process may improve that portion of the process but have consequences elsewhere.\n<\/p><p>Take a common example: working in a testing environment where samples are being submitted by outside groups (Figure 2).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig2_Liscouski_ConsidAutoLabProc21.png\" class=\"image wiki-link\" data-key=\"e1944129f376f676452d5befa53e5a78\"><img alt=\"Fig2 Liscouski ConsidAutoLabProc21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/d\/d2\/Fig2_Liscouski_ConsidAutoLabProc21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 2.<\/b> Lab sample processing, initial data entry through results<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Most LIMS will permit sample submitters (with appropriate permissions) to enter the sample description information directly into the LIMS, reducing some of the clerical burden. Standardizing on sample containers, with barcodes, reduces the effort and cost in some aspects of sample handling. A barcode scanner could be used to scan samples as they arrive in the lab, letting the system know that they are ready to be tested.\n<\/p><p>That brings us to an evaluation of the process as a whole, as well as an examination of the individual steps in the procedure. As shown in Figure 1, automation can be done in one of two ways: automating the full process or automating individual steps. Your choice depends on several factors, not the least of which is your comfort level and confidence in adopting automation as a strategy for increasing productivity. For some, concentrating on improvements in individual steps is an attractive approach. 
The cost and risk may be lower, and if a problem occurs you can always fall back to a fully manual implementation until it is resolved.\n<\/p><p>Care does have to be taken in choosing which steps to improve. From one perspective, you\u2019d want to do the step-wise implementation of automation as close to the end of the process as possible. The problem with doing it earlier is that you may create a backup in later stages of the process. Optimizing step 2, for example, doesn\u2019t do you much good if step 3 is overloaded and requires more people, or additional (possibly unplanned) automation to relieve a bottleneck there. In short, before you automate or improve a given step, you need to be sure that downstream processing can absorb the increase in materials flow. In addition, optimizing all the individual steps, one at a time, doesn\u2019t necessarily add up to a well-designed full-system automation. The transitions between steps may not be as effective or efficient as they would be if the system were designed and evaluated as a whole. If the end of the process is carried out by commercial instrumentation, the ability to absorb more work is easier to come by, since most of these systems are automated with computer data acquisition and processing, and many have auto-samplers available to accumulate samples that can be processed automatically. Some of those auto-samplers have built-in robotics for common sample handling functions. If the workload builds, additional instruments can pick up the load, and equipment such as <a href=\"https:\/\/www.limswiki.org\/index.php\/Baytek_International,_Inc.\" title=\"Baytek International, Inc.\" class=\"wiki-link\" data-key=\"92bd2781da39f29dfafa73d5f07fd530\">Baytek International\u2019s<\/a> TurboTube<sup id=\"rdp-ebb-cite_ref-BaytekiPRO_22-0\" class=\"reference\"><a href=\"#cite_note-BaytekiPRO-22\">[15]<\/a><\/sup> can accumulate sample vials in a common system and route them to individual instruments for processing.\n<\/p><p>Another consideration for partial automation is where the process is headed in the future. If the need for the process persists over a long period of time, will you eventually get to the point of needing to redo the automation as an integrated stream? If so, is it better to take the plunge early on instead of continually expending resources to upgrade it?\n<\/p><p>Other considerations include the ability to re-purpose equipment. If a process isn\u2019t used full-time (a justification for partial automation), the same components may be used in improving other processes. Ideally, if you go the full-process automation route, you\u2019ll have sufficient sample throughput to keep it running for an extended period of time, and not have to start and stop the system as samples accumulate. A smoothly running slower automation process is better than a faster system that lies idle for significant periods of time, particularly since startup and shutdown procedures may diminish the operational cost savings in both equipment use and people\u2019s time.\n<\/p><p>All these points become part of both the technical justification and budget requirements.\n<\/p><p><b>Analyzing the process: Simulation and modeling<\/b>\n<\/p><p>Simulation and modeling have been part of science and engineering for decades, supported by increasingly powerful computing hardware and software. 
Continuous systems simulations have shown us the details of how machinery works, how chemical reactions occur, and how chromatographic systems and other instrumentation behave.<sup id=\"rdp-ebb-cite_ref-JoyceComputer18_23-0\" class=\"reference\"><a href=\"#cite_note-JoyceComputer18-23\">[16]<\/a><\/sup> There is another aspect to modeling and simulation that is appropriate here.\n<\/p><p>Discrete-event simulation (DES) is used to model and understand processes in business and manufacturing applications, evaluating the interactions between service providers and customers, for example. One application of DES is to determine the best way to distribute incoming customers to a limited number of servers, taking into account that not all customers have the same needs; some will tie up a service provider a lot longer than others, as represented by the classic bank teller line problem. That is one question that discrete systems can analyze. This form of simulation and modeling is appropriate to event-driven processes where the action is focused on discrete steps (like materials moving from one workstation to another) rather than on a continuous function of time (most naturally occurring systems fall into the latter category, e.g., heat flow and models using differential equations).\n<\/p><p>The processes in your lab can be described and analyzed via DES systems.<sup id=\"rdp-ebb-cite_ref-CostigliolaSimul17_24-0\" class=\"reference\"><a href=\"#cite_note-CostigliolaSimul17-24\">[17]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-MengImprov13_25-0\" class=\"reference\"><a href=\"#cite_note-MengImprov13-25\">[18]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-JunApplic99_26-0\" class=\"reference\"><a href=\"#cite_note-JunApplic99-26\">[19]<\/a><\/sup> Those laboratory procedures are a sequence of steps, each having a precursor, a variable duration, and a following step until the end of the process is reached; this is basically the same as a manufacturing operation, where modeling and simulation have been used successfully for decades. DES can be used to evaluate those processes and ask questions that can guide you on the best paths to take in applying automation technologies and solving productivity or throughput problems. For example:\n<\/p>\n<ul><li>What happens if we tighten up the variability in a particular step; how will that affect the rest of the system?<\/li>\n<li>What happens at the extremes of the variability in process steps; does it create a situation where samples pile up?<\/li>\n<li>How much of a workload can the process handle before one step becomes saturated with work and the entire system backs up?<\/li>\n<li>Can you introduce an alternate path to process those samples and avoid problems (e.g., if samples are held for too long in one stage, do they deteriorate)?<\/li>\n<li>Can the output of several parallel slower procedures be merged into a feed stream for a common instrumental technique?<\/li><\/ul>\n<p>In complex procedures some steps may be sensitive to small delays, and DES can help test for and uncover them. 
Note that setting up these models will require the collection of a lot of data about the processes and their timing, so this is not something to be taken casually.\n<\/p><p>Previous research<sup id=\"rdp-ebb-cite_ref-JoyceComputer18_23-1\" class=\"reference\"><a href=\"#cite_note-JoyceComputer18-23\">[16]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-CostigliolaSimul17_24-1\" class=\"reference\"><a href=\"#cite_note-CostigliolaSimul17-24\">[17]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-MengImprov13_25-1\" class=\"reference\"><a href=\"#cite_note-MengImprov13-25\">[18]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-JunApplic99_26-1\" class=\"reference\"><a href=\"#cite_note-JunApplic99-26\">[19]<\/a><\/sup> offers just a few examples of where simulation can be effective, including one where an entire lab\u2019s operations were evaluated. Models that extensive can be used not only to look at procedures, but also at the introduction of informatics systems. This may appear to be a significant undertaking, and it can be, depending on the complexity of the lab processes. However, simple processes can be initially modeled on spreadsheets to see if a more significant effort is justified. Operations research, of which DES is a part, has been usefully applied in production operations to increase throughput and improve ROI. It might be successfully applied to some routine, production-oriented lab work.\n<\/p><p>Most lab processes are linear in their execution, one step following another, with the potential for loop-backs should problems be recognized with samples, reagents (e.g., out-of-date material, something that doesn\u2019t look right, the need to obtain new materials), or equipment (e.g., not functioning properly, out of calibration, busy due to other work). On one level, the modeling of a manually implemented process should appear to be simple: each step takes a certain amount of time. If you add up the times, you have a picture of the process execution through time. However, the reality is quite different if you take into account the problems (and their resolution) that can occur in each of those steps. The data collection used to model the procedure can change how that picture looks and your ability to improve it. By monitoring the process over a number of iterations, you can find out how much variation there is in the execution time for each step, and whether the variation follows a normal distribution or is skewed (e.g., if one step is skewed, how does it impact others?).\n<\/p><p>Questions to ask about potential problems that could occur at each step include:\n<\/p>\n<ul><li>How often do problems with reagents occur and how much of a delay does that create?<\/li>\n<li>Is instrumentation always in calibration (do you know?), are there operational problems with devices and their control systems (what are the ramifications?), are procedures delayed due to equipment being in use by someone else, and how long does it take to make changeovers in operating conditions?<\/li>\n<li>What happens to the samples; do they degrade over time? What impact does this have on the accuracy of results and their reproducibility?<\/li>\n<li>How often are workflows interrupted by the need to deal with high-priority samples, and what effect does it have on the processing of other samples?<\/li><\/ul>\n<p>Just the collection of data can suggest useful improvements before there are any considerations for automation, perhaps even negating the need for it. The answer to a lab\u2019s productivity problem might be as simple as adding another instrument, if that is where the bottleneck lies. 
It might also suggest that an underutilized device might be more productive if sample preparation for different procedures\u2019 workflows were organized differently. Underutilization might be a consequence of the amount of time needed to prepare the equipment for service: doing so for one sample might be disproportionately time-consuming (and expensive) and cause other samples to wait until there were enough of them to justify the preparation. It could also suggest that some lab processes should be outsourced to groups that have a more consistent sample flow and turn-around time (TAT) for that technique. Some of these points are illustrated in Figures 3a and 3b below.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig3a_Liscouski_ConsidAutoLabProc21.png\" class=\"image wiki-link\" data-key=\"b54669998fa4961ccecbfd829e32d89d\"><img alt=\"Fig3a Liscouski ConsidAutoLabProc21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/af\/Fig3a_Liscouski_ConsidAutoLabProc21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 3a.<\/b> Simplified process views versus some modeling considerations. Note that the total procedure execution time is affected by the variability in each step, plus equipment and material availability delays; these can change from one day to the next in manual implementations.<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig3b_Liscouski_ConsidAutoLabProc21.png\" class=\"image wiki-link\" data-key=\"dd6e3ce511391ee3f325577fdb4f9ca8\"><img alt=\"Fig3b Liscouski ConsidAutoLabProc21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/e\/eb\/Fig3b_Liscouski_ConsidAutoLabProc21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 3b.<\/b> The execution times of each step include the variable execution times of potential issues that can occur in each stage. Note that because each factor has a different distribution curve, the total execution time has a much wider variability than the individual factors.<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>How does the simulation system work? Once you have all the data set up, the simulation runs thousands of times, using random number generators to pick values for the execution-time variables of each component in each step. For example, if there is a one-in-ten chance a piece of equipment will be in use when needed, 10% of the runs will show that, with each one picking a delay time based on the input delay distribution function. With a large number of runs, you can see where delays exist and how they impact the behavior of the overall process. You can also adjust the factors (what happens if equipment delays are cut in half?) and see the effect of doing that. By testing the system, you can make better judgments on how to apply your resources.\n<\/p>
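<p>The following is a minimal sketch of that kind of Monte Carlo run. The three step timings, the one-in-ten equipment-busy chance, and the delay range are hypothetical stand-ins mirroring the example above; a real study would use distributions fitted to data collected in the lab.\n<\/p>\n<pre>\n# Minimal Monte Carlo sketch of a three-step lab process (hypothetical\n# timings, in minutes). Each run samples step durations plus a 10%\n# chance that shared equipment is busy; many runs show the spread of\n# total execution time.\nimport random\n\ndef one_run():\n    total = 0.0\n    for mean_min, sd_min in [(5, 1), (12, 3), (8, 2)]:  # steps 1-3\n        total += max(0.0, random.gauss(mean_min, sd_min))\n        if random.random() > 0.90:  # one-in-ten: equipment in use\n            total += random.uniform(5, 30)  # wait for it to free up\n    return total\n\ntimes = sorted(one_run() for _ in range(10_000))\nprint(f'median {times[5000]:.1f} min, 95th percentile {times[9500]:.1f} min')\n<\/pre>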
<p>Some of the issues that surface may be things that lab personnel know about and just deal with. It isn\u2019t until the problems are looked at that their impact on operations is fully realized and addressed. Modeling and simulation may appear to be overkill for lab process automation, something reserved for large-scale production projects. However, the physical size of the project is not the key factor; what matters is the complexity of the system and the potential for optimization.\n<\/p><p>One benefit of a well-structured simulation of lab processes is that it would provide a solid basis for making recommendations for project approval and budgeting. The most significant element in modeling and simulation is the initial data collection, asking lab personnel to record the time it takes to carry out steps. This isn\u2019t likely to be popular if they don\u2019t understand why it is being done and what the benefits will be to them and the lab; accurate information is essential. This is another case where \u201cbad data is worse than no data.\u201d\n<\/p><p><b>Guidelines for process automation<\/b>\n<\/p><p>There are two types of guidelines that will be of interest to those conducting automation work: those that help you figure out what to do and how to do it, and those that must be met to satisfy regulatory requirements (whether evaluated by internal or external groups).\n<\/p><p>The first type is going to depend on the nature of the science and the automation being done to support it. Equipment vendor community support groups can be of assistance, as can professional groups like the Pharmaceutical Research and Manufacturers of America (PhRMA), International Society for Pharmaceutical Engineering (ISPE), and Parenteral Drug Association (PDA) in the pharmaceutical and biotechnology industries, with similar organizations in other industries and other countries. This may seem like a large jump from laboratory work, but it is appropriate when we consider the ramifications of full-process automation. You are essentially developing a manufacturing operation on a lab bench, and the same concerns that apply to large-scale production also apply here; you have to ensure that the process is maintained and in control. 
The same is true of manual or semi-automated lab work, but it is more critical in fully automated systems because of the potentially high volume of results that can be produced.\n<\/p><p>The second set is going to consist of regulatory guidelines from groups appropriate to your industry: the <a href=\"https:\/\/www.limswiki.org\/index.php\/Food_and_Drug_Administration\" title=\"Food and Drug Administration\" class=\"wiki-link\" data-key=\"e2be8927071ac419c0929f7aa1ede7fe\">Food and Drug Administration<\/a> (FDA), <a href=\"https:\/\/www.limswiki.org\/index.php\/United_States_Environmental_Protection_Agency\" title=\"United States Environmental Protection Agency\" class=\"wiki-link\" data-key=\"877b052e12328aa52f6f7c3f2d56f99a\">Environmental Protection Agency<\/a> (EPA), and <a href=\"https:\/\/www.limswiki.org\/index.php\/International_Organization_for_Standardization\" title=\"International Organization for Standardization\" class=\"wiki-link\" data-key=\"116defc5d89c8a55f5b7c1be0790b442\">International Organization for Standardization<\/a> (ISO), as well as guidance such as <a href=\"https:\/\/www.limswiki.org\/index.php\/Good_Automated_Manufacturing_Practice\" title=\"Good Automated Manufacturing Practice\" class=\"wiki-link\" data-key=\"a0f3d9c5bc4e330dbaec137fbe7f5dbb\">GAMP<\/a> and <a href=\"https:\/\/www.limswiki.org\/index.php\/Good_Automated_Laboratory_Practices\" title=\"Good Automated Laboratory Practices\" class=\"wiki-link\" data-key=\"bef4ea1fa3e792ccf7471f9f09b804e6\">GALP<\/a>. The interesting point is that we are looking at a potentially complete automation scheme for a procedure; does that come under manufacturing or laboratory guidance? The likelihood is that laboratory guidelines will apply, since the work is being done within the lab's footprint; however, there are things that can be learned from their manufacturing counterparts that may assist in project management and documentation. One interesting consideration is what happens when fully automated testing, such as with on-line analyzers, becomes integrated with both the lab and production or process control data\/information streams. Which regulatory guidelines apply then? It may come down to who is responsible for managing and supporting those systems.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Scheduling_automation_projects\">Scheduling automation projects<\/span><\/h4>\n<p>There are two parts to the schedule issue: how long is it going to take to complete the project (dependent on the process and people), and when do you start? The second point will be addressed here.\n<\/p><p>The timing of an automated process coming online is important. If it comes on too soon, there may not be enough work to justify its use, and startup\/shutdown procedures may create more work than the system saves. If it comes on too late, people will be frustrated with a heavy workload while the system that was supposed to provide relief is under development. \n<\/p><p>In Figure 4, the blue line represents the growing need for sample\/material processing using a given laboratory procedure. Ideally, you\u2019d like the automated version to be available when that blue line crosses the \u201cautomation needed on-line\u201d level of processing requirements; this is the point where the current (likely manual) 
implementation can no longer meet the demands of sample throughput requirements.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig4_Liscouski_ConsidAutoLabProc21.png\" class=\"image wiki-link\" data-key=\"a92c905b5cb118443f1fe9176f6b38f5\"><img alt=\"Fig4 Liscouski ConsidAutoLabProc21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/a8\/Fig4_Liscouski_ConsidAutoLabProc21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 4.<\/b> Timing the development of an automated system<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Those throughput limits are something you are going to have to evaluate and measure on a regular basis, using the results to make adjustments to the planning process (accelerating or slowing it as appropriate). How fast is the demand growing, and at what point will your current methods be overwhelmed? Hiring more people is one option, but then the lab's operating expenses increase due to the cost of people, equipment, and lab space. (A worked example of this timing arithmetic follows at the end of this section.)\n<\/p><p>Once we have an idea of when something has to be working, we can begin the process of planning. Note that the planning can begin at any point; it would be good to get the preliminaries done as soon as a manual process is finalized so that you have an idea of what you\u2019ll be getting into. Those preliminaries include looking at equipment that might be used (keeping track of its development), training requirements, developer resources, and implementation strategies, all of which would be updated as new information becomes available. The \u201cwe\u2019ll-get-to-it-when-we-need-it\u201d approach is just going to create a lot of stress and frustration.\n<\/p><p>You need to put together a first-pass project plan so that you can detail what you know, and more importantly what you don\u2019t know. The goal is to have enough information, updated as noted above, so that you can determine if an automated solution is feasible, make an informed initial choice between full and partial automation, and have a timeline for implementation. Any time estimate is going to be subject to change as you gather information and refine your implementation approach. The point of the timeline is to figure out how long the yellow box in Figure 4 is, because that is going to tell you how much time you have to get the plan together and working; it is a matter of setting priorities and recognizing what they are. The time between now and the start of the yellow box is what you have to work with for planning and evaluating plans, and for any decisions that are needed before you begin, including corporate project management requirements and approvals.\n<\/p><p>Those plans have to include time for validation and the evaluation of the new implementation against the standard implementation. Does it work? Do we know how to use and maintain it? Are people educated in its use? Is there documentation for the project?\n<\/p>
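<p>As the promised worked example of the Figure 4 timing logic (all numbers hypothetical): if demand grows roughly linearly, the latest sensible project start date falls out of a simple capacity projection.\n<\/p>\n<pre>\n# Hypothetical numbers: project when demand crosses current capacity,\n# then back off the development time to find the latest safe start.\ncurrent_demand = 400  # samples per month handled today\ngrowth = 15           # additional samples per month, each month\ncapacity = 700        # most the current process can handle per month\ndev_months = 10       # estimated time to build and validate automation\n\nmonths_until_overload = (capacity - current_demand) / growth  # 20.0\nlatest_start = months_until_overload - dev_months             # 10.0\nprint(f'capacity reached in {months_until_overload:.0f} months; '\n      f'start within {latest_start:.0f} months to be ready in time')\n<\/pre>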
<h4><span class=\"mw-headline\" id=\"Budgeting\">Budgeting<\/span><\/h4>\n<p>At some point, all the material above and following this section comes down to budgeting: how much will it cost to implement a program, and is it worth it? Of the two points, the latter is the most important. How do you go about answering it? (Note: Some of this material is also covered in the webinar series <i><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:A_Guide_for_Management:_Successfully_Applying_Laboratory_Systems_to_Your_Organization%27s_Work\" title=\"LII:A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work\" class=\"wiki-link\" data-key=\"00b300565027cb0518bcb0410d6df360\">A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work<\/a><\/i> in the section on ROI.)\n<\/p><p>What a lot of this comes down to is explaining and justifying the choices you\u2019ve made in your project proposal. We\u2019re not going to go into a lot of depth, but just note some of the key issues:\n<\/p>\n<ul><li>Did you choose full or partial automation for your process?<\/li>\n<li>What drove that choice? If, in your view, partial automation would be less expensive than the full automation of a process, how long will it be until the next upgrade to another stage is needed?<\/li>\n<li>How independent are the potential, sequential implementation efforts that may be undertaken in the future? Will there be a need to connect them, and if so, how will the incremental costs compare to just doing it once and getting it over with?<\/li><\/ul>\n<p>There is a tendency in lab work to treat problems, and the products that might be used to address them, in isolation. You see the need for a LIMS or ELN, or an instrument data system, and the focus is on those issues. Effective decisions have to consider both the immediate and longer-term aspects of a problem. If you want to get access to a LIMS, have you considered how it will affect other aspects of lab work, such as connecting instruments to it?\n<\/p><p>The same holds true for partial automation as a solution to a lab process productivity problem. While you are addressing a particular step, should you also be looking at the potential for synergy in addressing other concerns? Modeling and simulation of processes can help resolve that issue.\n<\/p><p>Have you factored in the cost of support and education? The support issue needs to address the needs of lab personnel in managing the equipment and the options for vendor support, as well as the impact on IT groups. Note that the IT group will require access to vendor support, as well as education on their role in any project work.\n<\/p><p>What happens if you don\u2019t automate? One way to justify the cost of a project is to help people understand what the lab\u2019s operations will be like without it. Will more people, equipment, space, or added shifts be needed? At what cost? What would the impact be on those who need the results, and how would it affect their programs? (A toy break-even sketch follows below.)\n<\/p>
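<p>Every figure in the following break-even sketch is hypothetical; the point is the shape of the calculation, not the numbers.\n<\/p>\n<pre>\n# Hypothetical cost comparison: manual scaling vs. an automation project.\n# Break-even is the annual sample volume where automation becomes cheaper.\ncost_per_sample_manual = 18.00  # labor + materials, current process\ncost_per_sample_auto = 6.00     # materials + upkeep once automated\nautomation_capital = 150_000    # development, equipment, validation\namortization_years = 5\n\nannual_capital = automation_capital / amortization_years  # 30,000 per year\nbreak_even = annual_capital / (cost_per_sample_manual - cost_per_sample_auto)\nprint(f'automation pays for itself above {break_even:,.0f} samples per year')\n# 30,000 / 12 = 2,500 samples per year in this example\n<\/pre>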
<h2><span id=\"rdp-ebb-Build,_buy,_or_cooperate?\"><\/span><span class=\"mw-headline\" id=\"Build.2C_buy.2C_or_cooperate.3F\">Build, buy, or cooperate?<\/span><\/h2>\n<p>In this write-up and some of the referenced materials, we\u2019ve noted several times the benefits that clinical labs have gained through automation, although crediting it all to the use of automation alone isn\u2019t fair. What the clinical laboratory industry did was recognize that automation was needed to rein in the operational costs of running labs, and that its members could benefit further by coming together and cooperatively addressing lab operational problems.\n<\/p><p>It\u2019s that latter point that made the difference and resulted in standardized communications and purpose-built commercial equipment that could be used to implement automation in their labs. They also had common sample types, common procedures, and common data processing. That same commonality applies to segments of industrial and academic lab work. Take the life sciences as an example. Where possible, that industry has standardized on microplates for sample processing. The result is a wide selection of instruments and robotics built around that sample-holding format that greatly improves lab economics and throughput. While it isn\u2019t the answer to everything, it\u2019s a good answer to a lot of things.\n<\/p><p>If your industry segment came together and recognized that you used common procedures, how would you benefit by creating a common approach to automation instead of each lab doing it on its own? It would open the development of common products or product variations from vendors and relieve each lab of developing its own answer to the need. The result could be more effective and easily supportable solutions.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Project_planning\">Project planning<\/span><\/h2>\n<p>Once you\u2019ve decided on the project you are going to undertake, the next stage is looking at the steps needed to manage your project (Figure 5).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig5_Liscouski_ConsidAutoLabProc21.png\" class=\"image wiki-link\" data-key=\"63b05e9703977e362222fe94d673287c\"><img alt=\"Fig5 Liscouski ConsidAutoLabProc21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/4\/4d\/Fig5_Liscouski_ConsidAutoLabProc21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 5.<\/b> Steps in a laboratory automation project. This diagram is modeled after the GAMP V for systems validation.<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The planning begins with the method description from Figure 1, which describes the science behind the project and the specification of how the automation is expected to be put into effect: as full-process automation, a specific step, or steps in the process. 
The provider of those documents is considered the \u201ccustomer\u201d and is consistent with GAMP V nomenclature (Figure 6); that consistency is important due to the need for system-wide validation protocols.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig6_Liscouski_ConsidAutoLabProc21.png\" class=\"image wiki-link\" data-key=\"71f06c8346c550b519af5eb78dccb441\"><img alt=\"Fig6 Liscouski ConsidAutoLabProc21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/a2\/Fig6_Liscouski_ConsidAutoLabProc21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 6.<\/b> GAMP V model for showing customer and supplier roles in specifying and evaluating project components for computer hardware and software.<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>From there the \u201csupplier\u201d (e.g., internal development group, consultant, IT services, etc.) responds with a functional specification that is reviewed by the customer. The \u201canalysis, prototyping, and evaluation\u201d step, represented in the third box of Figure 5, is not the same as the process analysis noted earlier in this piece. The earlier section was to help you determine what work needed to be done and documented in the user requirements specification. The analysis and associated tasks here are specific to the implementation of this project. The colored arrows refer to the diagram in Figure 7. That process defines the equipment needed, dependencies, and options\/technologies for automation implementations, including robotics, instrument design requirements, pre-built automation (e.g., titrators, etc.) and any custom components. The documentation and specifications are part of the validation protocol.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig6_Liscouski_ConsidAutoLabProc21.png\" class=\"image wiki-link\" data-key=\"71f06c8346c550b519af5eb78dccb441\"><img alt=\"Fig6 Liscouski ConsidAutoLabProc21.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/a2\/Fig6_Liscouski_ConsidAutoLabProc21.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 7.<\/b> Defining dependencies and qualification of equipment<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The prototyping function is an important part of the overall process. It is rare that someone will look at a project and come up with a working solution on the first pass. There is always tinkering and modifications that occur as you move from a blank slate to a working system. You make notes along the way about what should be done differently in the final product, and places where improvements or adjustments are needed. These all become part of the input to the system design specification that will be reviewed and approved by the customer and supplier. 
The prototype can be considered a proof of concept or a demonstration of what will occur in the finished product. Remember also that prototypes would not have to be validated, since they wouldn\u2019t be used in a production environment; they are simply a test bed used prior to the development of a production system.\n<\/p><p>The component design specifications are the refined requirements for elements that will be used in the final design. Those refinements could point to updated models of components or equipment used, modifications needed, or recommendations for products with capabilities other than those used in the prototype.\n<\/p><p>The boxes on the left side of Figure 5 are documents that go into increasing depth as the system is designed and specified. The details in those items will vary with the extent of the project. The right side of the diagram is a series of increasingly sophisticated tests and evaluations against the corresponding steps on the left side, culminating in the final demonstration that the system works, has been validated, and is accepted by the customer. It also means that lab and support personnel are educated in their roles.\n<\/p>\n<h2><span id=\"rdp-ebb-Conclusions_(so_far)\"><\/span><span class=\"mw-headline\" id=\"Conclusions_.28so_far.29\">Conclusions (so far)<\/span><\/h2>\n<p>\u201cLaboratory automation\u201d has to give way to \u201claboratory automation engineering.\u201d From the initial need to the completion of the validation process, we have to plan, design, and implement successful systems on a routine basis. Just as the manufacturing industries transitioned from cottage industries to production lines and then to integrated production-information systems, the execution of laboratory science has to tread a similar path if the demands for laboratory results are going to be met in a financially responsible manner. The science is fundamental; however, we need to pay attention now to efficient execution.\n<\/p>\n<h2><span id=\"rdp-ebb-Abbreviations,_acronyms,_and_initialisms\"><\/span><span class=\"mw-headline\" id=\"Abbreviations.2C_acronyms.2C_and_initialisms\">Abbreviations, acronyms, and initialisms<\/span><\/h2>\n<p><b>AI<\/b>: Artificial intelligence\n<\/p><p><b>AuI<\/b>: Augmented intelligence\n<\/p><p><b>DES<\/b>: Discrete-event simulation\n<\/p><p><b>ELN<\/b>: Electronic laboratory notebook\n<\/p><p><b>EPA<\/b>: Environmental Protection Agency\n<\/p><p><b>FDA<\/b>: Food and Drug Administration\n<\/p><p><b>FRB<\/b>: Fast radio bursts\n<\/p><p><b>GALP<\/b>: Good automated laboratory practices\n<\/p><p><b>GAMP<\/b>: Good automated manufacturing practice\n<\/p><p><b>ISO<\/b>: International Organization for Standardization\n<\/p><p><b>LES<\/b>: Laboratory execution system\n<\/p><p><b>LIMS<\/b>: Laboratory information management system\n<\/p><p><b>ML<\/b>: Machine learning\n<\/p><p><b>ROI<\/b>: Return on investment\n<\/p><p><b>SDMS<\/b>: Scientific data management system\n<\/p><p><b>TAT<\/b>: Turn-around time\n<\/p><p><br \/>\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Footnotes\">Footnotes<\/span><\/h2>\n<div class=\"reflist\" style=\"list-style-type: lower-alpha;\">\n<div class=\"mw-references-wrap\"><ol class=\"references\">\n<li id=\"cite_note-1\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-1\">\u2191<\/a><\/span> <span class=\"reference-text\">The term \"scientific manufacturing\" was first mentioned to the author by Mr. 
Alberto Correia, then of Cambridge Biomedical, Boston, MA.<\/span>\n<\/li>\n<li id=\"cite_note-8\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-8\">\u2191<\/a><\/span> <span class=\"reference-text\">Intelligent enterprise technologies referenced in the report include robotic process automation, machine learning, artificial intelligence, the internet of things, predictive analysis, and cognitive computing.<\/span>\n<\/li>\n<li id=\"cite_note-10\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-10\">\u2191<\/a><\/span> <span class=\"reference-text\"><a rel=\"nofollow\" class=\"external text wiki-link\" href=\"https:\/\/en.wikipedia.org\/wiki\/Douglas_Engelbart\" data-key=\"91bc4cc7c8f10296b73c8689f9f470bb\">Doug Engelbart<\/a> founded the field of human-computer interaction and is credited with the invention of the computer mouse and the \u201cMother of All Demos\u201d in 1968.<\/span>\n<\/li>\n<li id=\"cite_note-18\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-18\">\u2191<\/a><\/span> <span class=\"reference-text\">See <a href=\"https:\/\/en.wikipedia.org\/wiki\/Metropolis_(1927_film)\" class=\"extiw wiki-link\" title=\"wikipedia:Metropolis (1927 film)\" data-key=\"0e4d0869e79f689fae15419230e0e902\"><i>Metropolis<\/i> (1927 film)<\/a> on Wikipedia.<\/span>\n<\/li>\n<li id=\"cite_note-19\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-19\">\u2191<\/a><\/span> <span class=\"reference-text\">See for example <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.projectmanager.com\/project-planning\" target=\"_blank\">https:\/\/www.projectmanager.com\/project-planning<\/a>; the simplest thing to do is put \u201cproject planning\u201d in a search engine and browse the results for something interesting.<\/span>\n<\/li>\n<li id=\"cite_note-20\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-20\">\u2191<\/a><\/span> <span class=\"reference-text\">See for example <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/theinformationdrivenlaboratory.wordpress.com\/category\/resources\/\" target=\"_blank\">https:\/\/theinformationdrivenlaboratory.wordpress.com\/category\/resources\/<\/a>; note that any references to the ILA should be ignored as the original site is gone, with the domain name perhaps having been leased by another organization that has no affiliation with the original Institute for Laboratory Automation.<\/span>\n<\/li>\n<li id=\"cite_note-21\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-21\">\u2191<\/a><\/span> <span class=\"reference-text\">As a starting point, view the <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.artel.co\/\" target=\"_blank\">Artel, Inc. site<\/a> as one source. Also, John Bradshaw gave an <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.artel.co\/learning_center\/2589\/\" target=\"_blank\">informative presentation<\/a> on \u201cThe Importance of Liquid Handling Details and Their Impact on your Assays\u201d at the 2012 European Lab Automation Conference, Hamburg, Germany.<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<h2><span class=\"mw-headline\" id=\"About_the_author\">About the author<\/span><\/h2>\n<p>Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation\/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics, and data interchange standards.
He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"References\">References<\/span><\/h2>\n<div class=\"reflist references-column-width\" style=\"-moz-column-width: 30em; -webkit-column-width: 30em; column-width: 30em; list-style-type: decimal;\">\n<div class=\"mw-references-wrap mw-references-columns\"><ol class=\"references\">\n<li id=\"cite_note-FreyTheFuture13-2\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-FreyTheFuture13_2-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Frey, C.B.; Osborne, M.A. (17 September 2013). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.oxfordmartin.ox.ac.uk\/downloads\/academic\/The_Future_of_Employment.pdf\" target=\"_blank\">\"The Future of Employment: How Susceptible Are Jobs to Computerisation?\"<\/a> (PDF). Oxford Martin School, University of Oxford<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.oxfordmartin.ox.ac.uk\/downloads\/academic\/The_Future_of_Employment.pdf\" target=\"_blank\">https:\/\/www.oxfordmartin.ox.ac.uk\/downloads\/academic\/The_Future_of_Employment.pdf<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=The+Future+of+Employment%3A+How+Susceptible+Are+Jobs+to+Computerisation%3F&rft.atitle=&rft.aulast=Frey%2C+C.B.%3B+Osborne%2C+M.A.&rft.au=Frey%2C+C.B.%3B+Osborne%2C+M.A.&rft.date=17+September+2013&rft.pub=Oxford+Martin+School%2C+University+of+Oxford&rft_id=https%3A%2F%2Fwww.oxfordmartin.ox.ac.uk%2Fdownloads%2Facademic%2FThe_Future_of_Employment.pdf&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-HsuIsIt18-3\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-HsuIsIt18_3-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Hsu, J. (24 September 2018). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.nbcnews.com\/mach\/science\/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586\" target=\"_blank\">\"Is it aliens? Scientists detect more mysterious radio signals from distant galaxy\"<\/a>. <i>NBC News MACH<\/i><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.nbcnews.com\/mach\/science\/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586\" target=\"_blank\">https:\/\/www.nbcnews.com\/mach\/science\/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Is+it+aliens%3F+Scientists+detect+more+mysterious+radio+signals+from+distant+galaxy&rft.atitle=NBC+News+MACH&rft.aulast=Hsu%2C+J.&rft.au=Hsu%2C+J.&rft.date=24+September+2018&rft_id=https%3A%2F%2Fwww.nbcnews.com%2Fmach%2Fscience%2Fit-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-TimmerAIPlus18-4\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-TimmerAIPlus18_4-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Timmer, J. (18 July 2018). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/arstechnica.com\/science\/2018\/07\/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work\/5\/\" target=\"_blank\">\"AI plus a chemistry robot finds all the reactions that will work\"<\/a>. <i>Ars Technica<\/i><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/arstechnica.com\/science\/2018\/07\/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work\/5\/\" target=\"_blank\">https:\/\/arstechnica.com\/science\/2018\/07\/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work\/5\/<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=AI+plus+a+chemistry+robot+finds+all+the+reactions+that+will+work&rft.atitle=Ars+Technica&rft.aulast=Timmer%2C+J.&rft.au=Timmer%2C+J.&rft.date=18+July+2018&rft_id=https%3A%2F%2Farstechnica.com%2Fscience%2F2018%2F07%2Fai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work%2F5%2F&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-HelixAIHome-5\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-HelixAIHome_5-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.askhelix.io\/\" target=\"_blank\">\"HelixAI - Voice Powered Digital Laboratory Assistants for Scientific Laboratories\"<\/a>. HelixAI<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"http:\/\/www.askhelix.io\/\" target=\"_blank\">http:\/\/www.askhelix.io\/<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=HelixAI+-+Voice+Powered+Digital+Laboratory+Assistants+for+Scientific+Laboratories&rft.atitle=&rft.pub=HelixAI&rft_id=http%3A%2F%2Fwww.askhelix.io%2F&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-PharmaIQNewsAutom18-6\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-PharmaIQNewsAutom18_6-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">PharmaIQ News (20 August 2018). 
<a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.pharma-iq.com\/pre-clinical-discovery-and-development\/news\/automation-iot-and-the-future-of-smarter-research-environments\" target=\"_blank\">\"Automation, IoT and the future of smarter research environments\"<\/a>. <i>PharmaIQ<\/i><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.pharma-iq.com\/pre-clinical-discovery-and-development\/news\/automation-iot-and-the-future-of-smarter-research-environments\" target=\"_blank\">https:\/\/www.pharma-iq.com\/pre-clinical-discovery-and-development\/news\/automation-iot-and-the-future-of-smarter-research-environments<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Automation%2C+IoT+and+the+future+of+smarter+research+environments&rft.atitle=PharmaIQ&rft.aulast=PharmaIQ+News&rft.au=PharmaIQ+News&rft.date=20+August+2018&rft_id=https%3A%2F%2Fwww.pharma-iq.com%2Fpre-clinical-discovery-and-development%2Fnews%2Fautomation-iot-and-the-future-of-smarter-research-environments&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-PharmaIQTheFuture17-7\"><span class=\"mw-cite-backlink\">\u2191 <sup><a href=\"#cite_ref-PharmaIQTheFuture17_7-0\">6.0<\/a><\/sup> <sup><a href=\"#cite_ref-PharmaIQTheFuture17_7-1\">6.1<\/a><\/sup><\/span> <span class=\"reference-text\"><span class=\"citation web\">PharmaIQ (14 November 2017). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.pharma-iq.com\/pre-clinical-discovery-and-development\/whitepapers\/the-future-of-drug-discovery-ai-2020\" target=\"_blank\">\"The Future of Drug Discovery: AI 2020\"<\/a>. PharmaIQ<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.pharma-iq.com\/pre-clinical-discovery-and-development\/whitepapers\/the-future-of-drug-discovery-ai-2020\" target=\"_blank\">https:\/\/www.pharma-iq.com\/pre-clinical-discovery-and-development\/whitepapers\/the-future-of-drug-discovery-ai-2020<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=The+Future+of+Drug+Discovery%3A+AI+2020&rft.atitle=&rft.aulast=PharmaIQ&rft.au=PharmaIQ&rft.date=14+November+2017&rft.pub=PharmaIQ&rft_id=https%3A%2F%2Fwww.pharma-iq.com%2Fpre-clinical-discovery-and-development%2Fwhitepapers%2Fthe-future-of-drug-discovery-ai-2020&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-RossettoFight18-9\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-RossettoFight18_9-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Rossetto, L. (2018). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.magzter.com\/stories\/Science\/WIRED\/Fight-The-Dour\" target=\"_blank\">\"Fight the Dour\"<\/a>. <i>Wired<\/i> (October): 826\u20137<span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.magzter.com\/stories\/Science\/WIRED\/Fight-The-Dour\" target=\"_blank\">https:\/\/www.magzter.com\/stories\/Science\/WIRED\/Fight-The-Dour<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Fight+the+Dour&rft.jtitle=Wired&rft.aulast=Rossetto%2C+L.&rft.au=Rossetto%2C+L.&rft.date=2018&rft.issue=October&rft.pages=826%E2%80%937&rft_id=https%3A%2F%2Fwww.magzter.com%2Fstories%2FScience%2FWIRED%2FFight-The-Dour&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-BrukerOPUS-11\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BrukerOPUS_11-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.bruker.com\/en\/products-and-solutions\/infrared-and-raman\/opus-spectroscopy-software\/search-identify.html\" target=\"_blank\">\"OPUS Package: SEARCH & IDENT\"<\/a>. Bruker Corporation<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.bruker.com\/en\/products-and-solutions\/infrared-and-raman\/opus-spectroscopy-software\/search-identify.html\" target=\"_blank\">https:\/\/www.bruker.com\/en\/products-and-solutions\/infrared-and-raman\/opus-spectroscopy-software\/search-identify.html<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=OPUS+Package%3A+SEARCH+%26+IDENT&rft.atitle=&rft.pub=Bruker+Corporation&rft_id=https%3A%2F%2Fwww.bruker.com%2Fen%2Fproducts-and-solutions%2Finfrared-and-raman%2Fopus-spectroscopy-software%2Fsearch-identify.html&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-BourneMyBoss13-12\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BourneMyBoss13_12-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Bourne, D. (2013). \"My Boss the Robot\". <i>Scientific American<\/i> <b>308<\/b> (5): 38\u201341. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1038%2Fscientificamerican0513-38\" target=\"_blank\">10.1038\/scientificamerican0513-38<\/a>. 
<a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/PubMed_Identifier\" data-key=\"1d34e999f13d8801964a6b3e9d7b4e30\">PMID<\/a> <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.ncbi.nlm.nih.gov\/pubmed\/23627215\" target=\"_blank\">23627215<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=My+Boss+the+Robot&rft.jtitle=Scientific+American&rft.aulast=Bourne%2C+D.&rft.au=Bourne%2C+D.&rft.date=2013&rft.volume=308&rft.issue=5&rft.pages=38%E2%80%9341&rft_id=info:doi\/10.1038%2Fscientificamerican0513-38&rft_id=info:pmid\/23627215&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-SalesForceSecondAnn17-13\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-SalesForceSecondAnn17_13-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">SalesForce Research (2017). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/a.sfdcstatic.com\/content\/dam\/www\/ocms\/assets\/pdf\/misc\/2017-state-of-it-report-salesforce.pdf\" target=\"_blank\">\"Second Annual State of IT\"<\/a> (PDF). SalesForce<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/a.sfdcstatic.com\/content\/dam\/www\/ocms\/assets\/pdf\/misc\/2017-state-of-it-report-salesforce.pdf\" target=\"_blank\">https:\/\/a.sfdcstatic.com\/content\/dam\/www\/ocms\/assets\/pdf\/misc\/2017-state-of-it-report-salesforce.pdf<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Second+Annual+State+of+IT&rft.atitle=&rft.aulast=SalesForce+Research&rft.au=SalesForce+Research&rft.date=2017&rft.pub=SalesForce&rft_id=https%3A%2F%2Fa.sfdcstatic.com%2Fcontent%2Fdam%2Fwww%2Focms%2Fassets%2Fpdf%2Fmisc%2F2017-state-of-it-report-salesforce.pdf&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-SealAnal-14\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-SealAnal_14-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/seal-analytical.com\/Products\/tabid\/55\/language\/en-US\/Default.aspx\" target=\"_blank\">\"Seal Analytical - Products\"<\/a>. Seal Analytical<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/seal-analytical.com\/Products\/tabid\/55\/language\/en-US\/Default.aspx\" target=\"_blank\">https:\/\/seal-analytical.com\/Products\/tabid\/55\/language\/en-US\/Default.aspx<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Seal+Analytical+-+Products&rft.atitle=&rft.pub=Seal+Analytical&rft_id=https%3A%2F%2Fseal-analytical.com%2FProducts%2Ftabid%2F55%2Flanguage%2Fen-US%2FDefault.aspx&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-BranLuebbe-15\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BranLuebbe_15-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.spxflow.com\/bran-luebbe\/\" target=\"_blank\">\"Bran+Luebbe\"<\/a>. SPX FLOW, Inc<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.spxflow.com\/bran-luebbe\/\" target=\"_blank\">https:\/\/www.spxflow.com\/bran-luebbe\/<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Bran%2BLuebbe&rft.atitle=&rft.pub=SPX+FLOW%2C+Inc&rft_id=https%3A%2F%2Fwww.spxflow.com%2Fbran-luebbe%2F&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-DouglasScientificArrayTape-16\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-DouglasScientificArrayTape_16-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.douglasscientific.com\/Products\/ArrayTape.aspx\" target=\"_blank\">\"Array Tape Advanced Consumable\"<\/a>. Douglas Scientific<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.douglasscientific.com\/Products\/ArrayTape.aspx\" target=\"_blank\">https:\/\/www.douglasscientific.com\/Products\/ArrayTape.aspx<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Array+Tape+Advanced+Consumable&rft.atitle=&rft.pub=Douglas+Scientific&rft_id=https%3A%2F%2Fwww.douglasscientific.com%2FProducts%2FArrayTape.aspx&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-Agilent1200Series-17\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-Agilent1200Series_17-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.agilent.com\/cs\/library\/usermanuals\/Public\/G1329-90012_StandPrepSamplers_ebook.pdf\" target=\"_blank\">\"Agilent 1200 Series Standard and Preparative Autosamplers - User Manual\"<\/a> (PDF). Agilent Technologies. November 2008<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.agilent.com\/cs\/library\/usermanuals\/Public\/G1329-90012_StandPrepSamplers_ebook.pdf\" target=\"_blank\">https:\/\/www.agilent.com\/cs\/library\/usermanuals\/Public\/G1329-90012_StandPrepSamplers_ebook.pdf<\/a><\/span><span class=\"reference-accessdate\">. 
Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Agilent+1200+Series+Standard+and+Preparative+Autosamplers+-+User+Manual&rft.atitle=&rft.date=November+2008&rft.pub=Agilent+Technologies&rft_id=https%3A%2F%2Fwww.agilent.com%2Fcs%2Flibrary%2Fusermanuals%2FPublic%2FG1329-90012_StandPrepSamplers_ebook.pdf&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-BaytekiPRO-22\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BaytekiPRO_22-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.baytekinternational.com\/products\/ipro-interface\/89-products\" target=\"_blank\">\"iPRO Interface - Products\"<\/a>. Baytek International, Inc<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.baytekinternational.com\/products\/ipro-interface\/89-products\" target=\"_blank\">https:\/\/www.baytekinternational.com\/products\/ipro-interface\/89-products<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 05 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=iPRO+Interface+-+Products&rft.atitle=&rft.pub=Baytek+International%2C+Inc&rft_id=https%3A%2F%2Fwww.baytekinternational.com%2Fproducts%2Fipro-interface%2F89-products&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-JoyceComputer18-23\"><span class=\"mw-cite-backlink\">\u2191 <sup><a href=\"#cite_ref-JoyceComputer18_23-0\">16.0<\/a><\/sup> <sup><a href=\"#cite_ref-JoyceComputer18_23-1\">16.1<\/a><\/sup><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Joyce, J. (2018). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.labmanager.com\/laboratory-technology\/computer-modeling-and-simulation-1826\" target=\"_blank\">\"Computer Modeling and Simulation\"<\/a>. <i>Lab Manager<\/i> (9): 32\u201335<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.labmanager.com\/laboratory-technology\/computer-modeling-and-simulation-1826\" target=\"_blank\">https:\/\/www.labmanager.com\/laboratory-technology\/computer-modeling-and-simulation-1826<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Computer+Modeling+and+Simulation&rft.jtitle=Lab+Manager&rft.aulast=Joyce%2C+J.&rft.au=Joyce%2C+J.&rft.date=2018&rft.issue=9&rft.pages=32%E2%80%9335&rft_id=https%3A%2F%2Fwww.labmanager.com%2Flaboratory-technology%2Fcomputer-modeling-and-simulation-1826&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-CostigliolaSimul17-24\"><span class=\"mw-cite-backlink\">\u2191 <sup><a href=\"#cite_ref-CostigliolaSimul17_24-0\">17.0<\/a><\/sup> <sup><a href=\"#cite_ref-CostigliolaSimul17_24-1\">17.1<\/a><\/sup><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Costigliola, A.; Ata\u00edde, F.A.P.; Vieira, S.M. et al. 
(2017). \"Simulation Model of a Quality Control Laboratory in Pharmaceutical Industry\". <i>IFAC-PapersOnLine<\/i> <b>50<\/b> (1): 9014-9019. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1016%2Fj.ifacol.2017.08.1582\" target=\"_blank\">10.1016\/j.ifacol.2017.08.1582<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Simulation+Model+of+a+Quality+Control+Laboratory+in+Pharmaceutical+Industry&rft.jtitle=IFAC-PapersOnLine&rft.aulast=Costigliola%2C+A.%3B+Ata%C3%ADde%2C+F.A.P.%3B+Vieira%2C+S.M.+et+al.&rft.au=Costigliola%2C+A.%3B+Ata%C3%ADde%2C+F.A.P.%3B+Vieira%2C+S.M.+et+al.&rft.date=2017&rft.volume=50&rft.issue=1&rft.pages=9014-9019&rft_id=info:doi\/10.1016%2Fj.ifacol.2017.08.1582&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-MengImprov13-25\"><span class=\"mw-cite-backlink\">\u2191 <sup><a href=\"#cite_ref-MengImprov13_25-0\">18.0<\/a><\/sup> <sup><a href=\"#cite_ref-MengImprov13_25-1\">18.1<\/a><\/sup><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Meng, L.; Liu, R.; Essick, C. et al. (2013). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.researchgate.net\/publication\/263238201_Improving_Medical_Laboratory_Operations_via_Discrete-event_Simulation\" target=\"_blank\">\"Improving Medical Laboratory Operations via Discrete-event Simulation\"<\/a>. <i>Proceedings of the 2013 INFORMS Healthcare Conference<\/i><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.researchgate.net\/publication\/263238201_Improving_Medical_Laboratory_Operations_via_Discrete-event_Simulation\" target=\"_blank\">https:\/\/www.researchgate.net\/publication\/263238201_Improving_Medical_Laboratory_Operations_via_Discrete-event_Simulation<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Improving+Medical+Laboratory+Operations+via+Discrete-event+Simulation&rft.jtitle=Proceedings+of+the+2013+INFORMS+Healthcare+Conference&rft.aulast=Meng%2C+L.%3B+Liu%2C+R.%3B+Essick%2C+C.+et+al.&rft.au=Meng%2C+L.%3B+Liu%2C+R.%3B+Essick%2C+C.+et+al.&rft.date=2013&rft_id=https%3A%2F%2Fwww.researchgate.net%2Fpublication%2F263238201_Improving_Medical_Laboratory_Operations_via_Discrete-event_Simulation&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-JunApplic99-26\"><span class=\"mw-cite-backlink\">\u2191 <sup><a href=\"#cite_ref-JunApplic99_26-0\">19.0<\/a><\/sup> <sup><a href=\"#cite_ref-JunApplic99_26-1\">19.1<\/a><\/sup><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">\"Application of discrete-event simulation in health care clinics: A survey\". <i>Journal of the Operational Research Society<\/i> <b>50<\/b>: 109\u201323. 1999. 
<a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1057%2Fpalgrave.jors.2600669\" target=\"_blank\">10.1057\/palgrave.jors.2600669<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Application+of+discrete-event+simulation+in+health+care+clinics%3A+A+survey&rft.jtitle=Journal+of+the+Operational+Research+Society&rft.date=1999&rft.volume=50&rft.pages=109%E2%80%9323&rft_id=info:doi\/10.1057%2Fpalgrave.jors.2600669&rfr_id=info:sid\/en.wikipedia.org:LII:Considerations_in_the_Automation_of_Laboratory_Procedures\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<!-- \nNewPP limit report\nCached time: 20211118162548\nCache expiry: 86400\nDynamic content: false\nComplications: []\nCPU time usage: 0.218 seconds\nReal time usage: 0.310 seconds\nPreprocessor visited node count: 12823\/1000000\nPost\u2010expand include size: 81176\/2097152 bytes\nTemplate argument size: 31614\/2097152 bytes\nHighest expansion depth: 18\/40\nExpensive parser function count: 0\/100\nUnstrip recursion depth: 0\/20\nUnstrip post\u2010expand size: 32635\/5000000 bytes\n-->\n<!--\nTransclusion expansion time report (%,ms,calls,template)\n100.00% 143.888 1 -total\n 84.41% 121.452 2 Template:Reflist\n 65.28% 93.929 19 Template:Citation\/core\n 49.72% 71.538 13 Template:Cite_web\n 22.32% 32.120 6 Template:Cite_journal\n 8.49% 12.209 7 Template:Date\n 7.11% 10.237 7 Template:Efn\n 4.26% 6.133 4 Template:Citation\/identifier\n 3.68% 5.290 22 Template:Citation\/make_link\n 2.08% 3.000 8 Template:Clear\n-->\n\n<!-- Saved in parser cache with key limswiki:pcache:idhash:12315-0!canonical and timestamp 20211118162548 and revision id 41798. 
<\/div><\/div><div class=\"printfooter\">Source: <a rel=\"external_link\" class=\"external\" href=\"https:\/\/www.limswiki.org\/index.php\/LII:Considerations_in_the_Automation_of_Laboratory_Procedures\">https:\/\/www.limswiki.org\/index.php\/LII:Considerations_in_the_Automation_of_Laboratory_Procedures<\/a><\/div>\n<div class=\"visualClear\"><\/div><\/div><\/div><div class=\"visualClear\"><\/div><\/div><div class=\"visualClear\"><\/div><\/div>\n\n\n\n<\/body>","e0147011cc1eb892e1a35e821657a6d9_images":["https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/31\/Fig1_Liscouski_ConsidAutoLabProc21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/d\/d2\/Fig2_Liscouski_ConsidAutoLabProc21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/af\/Fig3a_Liscouski_ConsidAutoLabProc21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/e\/eb\/Fig3b_Liscouski_ConsidAutoLabProc21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/a8\/Fig4_Liscouski_ConsidAutoLabProc21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/4\/4d\/Fig5_Liscouski_ConsidAutoLabProc21.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/a2\/Fig6_Liscouski_ConsidAutoLabProc21.png"],"e0147011cc1eb892e1a35e821657a6d9_timestamp":1637252748,"655f7d48a642e9b45533745af73f0d59_type":"article","655f7d48a642e9b45533745af73f0d59_title":"Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering (Liscouski 2020)","655f7d48a642e9b45533745af73f0d59_url":"https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering","655f7d48a642e9b45533745af73f0d59_plaintext":"\n\nLII:Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering\nTitle: Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering\nAuthor for citation: Joe Liscouski, with editorial modifications by Shawn Douglas\nLicense for content: Creative Commons Attribution 4.0 International\nPublication date: December 2020\n\nContents \n\n1 Introduction \n\n1.1 Directions in lab operations \n\n1.1.1 The lab of the future \n1.1.2 Trends in science applications \n\n\n1.2 Education \n\n\n2 Making laboratory informatics and automation work \n\n2.1 Introduction to this section \n\n2.1.1 The point of planning \n2.1.2 Who is responsible for laboratory technology planning and management (TPM)? \n2.1.3 Why put so much effort into planning and technology management? \n\n\n\n\n3 Different ways of looking at laboratories \n4 Labs in transition, from manual operation to modern facilities \n\n4.1 There's a plan for that? 
\n4.2 Thinking about a model for lab operations \n\n\n5 The seven goals of planning and managing lab technologies \n\n5.1 First goal: Support an environment that fosters productivity and innovation \n5.2 Second goal: Develop high-quality data and information \n5.3 Third goal: Manage K\/I\/D effectively, putting them in a structure that encourages use and protects value \n5.4 Fourth goal: Ensure a high level of data integrity at every step \n\n5.4.1 Definitions of data integrity \n\n\n5.5 Fifth goal: Addressing security throughout the lab \n\n5.5.1 Examine aspects of the facility itself \n5.5.2 Security and the working environment \n5.5.3 Summary \n\n\n5.6 Sixth goal: Acquiring and developing \"products\" that support regulatory requirements \n\n5.6.1 The importance of documentation \n5.6.2 Additional considerations in creating supportable systems and processes \n5.6.3 Product life cycles \n5.6.4 Summary \n\n\n5.7 Seventh goal: Addressing systems integration and harmonization \n\n5.7.1 Harmonization \n\n\n\n\n6 Laboratory systems engineers \n7 Closing \n8 Abbreviations, acronyms, and initialisms \n9 Footnotes \n10 About the author \n11 References \n\n\n\nIntroduction \nWhat separates successful advanced laboratories from all the others? It's largely their ability to meet their goals through the effective use of resources: people, time, money, equipment, data, and information. The fundamental goals of laboratory work haven\u2019t changed, but labs are under increased pressure to do more and do it faster, with a better return on investment (ROI). Laboratory managers have turned to electronic technologies (e.g., computers, networks, robotics, microprocessors, database systems, etc.) to meet those demands. However, without effective planning, technology management, and education, those technologies will only get labs part of the way to meeting their needs. We need to learn how to close the gap between getting part-way there and getting where we need to be. The practice of science has changed; we need to meet that change to be successful.\nThis document was written to get people thinking more seriously about the technologies used in laboratory work and how those technologies contribute to meeting the challenges labs are facing. There are three primary concerns:\n\nThe need for planning and management: When digital components began to be added to lab systems, it was a slow incremental process: integrators and microprocessors grew in capability as the marketplace accepted them. That development gave us the equipment we have now, equipment that can be used in isolation or in a networked, integrated system. In either case, that equipment needs attention in its application and management to protect electronic laboratory data, to ensure the data can be used effectively, and to ensure that the systems and products put in place are the right ones and fully contribute to improvements in lab operations.\nThe need for more laboratory systems engineers (LSEs): There is increasing demand for people who have the education and skills needed to accomplish the points above and provide research and testing groups with the support they need.[a]\nThe need to collaborate with vendors: In order to develop the best products needed for laboratory work, vendors should be provided more user input. Too often vendors have an idea for a product or modifications to existing products, yet they lack a fully qualified audience to bounce ideas off of. 
With the planning in the first concern in place, we should be able to approach vendors and say, with confidence, \"this is what is needed\" and explain why.\nIf the audience for this work were product manufacturing or production facilities, everything being said here would already be old news. The efficiency and productivity of production operations directly impact profitability and customer satisfaction; the effort to optimize operations would long since have been an essential goal. When it comes to laboratory operations, that same level of attention found in production operations must be in place to accelerate laboratory research and testing operations, reducing cost and improving productivity. Aside from a few lab installations in large organizations, this same level of attention isn\u2019t given, as people aren\u2019t educated as to its importance. The purpose of this work is to present ideas about which laboratory technology challenges can be addressed through planning activities, using a series of goals.\nThis material is an expansion upon two presentations:\n\n\"Laboratory Technology Management & Planning,\" 2nd Annual Lab Asset & Facility Management in Pharma 2019, San Diego, CA, October 22, 2019\n\"How Digital Technologies are Changing the Landscape of Lab Operations,\" Lab Manager webinar, April 2020\nDirections in lab operations \nThe lab of the future \nPeople often ask what the lab of the future (LOF) is going to look like, as if there were a design or model that we should be aspiring toward. There isn\u2019t. Your lab's future is in your hands to mold, a blank sheet of paper upon which you define your lab's future by setting objectives, developing a functional physical and digital architecture, planning processes and implementations, and managing technology that supports both scientific and laboratory information management. If that sounds scary, it\u2019s understandable. But you must take the time to educate yourself and bring in people (e.g., LSEs, consultants, etc.) who can assist you.\nToo often, if vendors and consultants are asked what the LOF is going to look like, the response lines up with their corporate interests. No one knows what the LOF is because there isn\u2019t a singular future, but rather different futures for different types of labs. (Just think of all the different scientific disciplines that exist; one future doesn\u2019t fit all.) Your lab's future is in your hands. What do you want it to be?\nThe material in this document isn\u2019t intended to define your LOF, but to help you realize it once the framework has been created, and you are in the best position to create it. As you create that framework, you'll be asking:\n\nAre you satisfied with your lab's operations? What works and what doesn\u2019t? What needs fixing and how shall it be prioritized?\nHas management raised any concerns?\nWhat do those working in the lab have to say?\nHow is your lab going to change in the next one to five years?\nDoes your industry have a working group for lab operations, computing, and automation?\nAdding to question five, many companies tend to keep the competition at arm's length, minimizing contact for fear of divulging confidential information. However, if practically everyone is using the same set of test procedures from a trusted neutral source (e.g., ASTM International, United States Pharmacopeia, etc.), there\u2019s nothing confidential there. 
Instead of developing automated versions of the same procedure independently, companies can join forces, spread the cost, and perhaps come up with a better solution. With that effort as a given, you collectively have something to approach the vendor community with and say \u201cwe need this modification or new product.\u201d This is particularly beneficial to the vendor when they receive a vetted product requirements document to work from.\nAgain, you don\u2019t wait for the lab of the future to happen; you create it. If you want to see the direction lab operations in the future can take, look to the manufacturing industry: it has everything from flexible manufacturing to cooperative robotics[1][2] and more.[b] This is appropriate in both basic and applied research, as well as quality control.\nBoth manufacturing and lab work are process-driven with a common goal: a high-quality product whose quality can be defended through appeal to process and data integrity.\nLab work can be broadly divided into two activities, with parallels to manufacturing: experimental procedure development (akin to manufacturing process development) and procedure execution (product production). (Note: Administrative work is part of lab operations but not an immediate concern here.) As such, we have to address the fact that lab work is part original science and part production work based on that science, e.g., as seen with quality control, clinical chemistry, and high-throughput screening labs. The routine production work of these and other labs can benefit most from automation efforts. We need to think more broadly about the use of automation technologies\u2014driving their development\u2014instead of waiting to see what vendors develop. \nWhere manufacturing and lab work differ is in the scale of the work environment, the nature of the work station equipment, the skills needed to carry out the work, and the adaptability of those doing the work to unexpected situations.\nMy hope is that this guide will get laboratory managers and other stakeholders to begin thinking more about planning and technology management, as well as the need for more education in that work.\nTrends in science applications \nIf new science isn\u2019t being developed, vendors will add digital hardware and software technology to existing equipment to improve capabilities and ease-of-use, separating themselves from the competition. However, there is still an obvious need for an independent organization to evaluate that technology (i.e., the lab version of Consumer Reports); as is, that evaluation process, done properly, would be time-consuming for individual labs and would require a consistent methodology. With the increased use of automation, we need to do this better, such that the results can be used more widely (rather than every lab doing their own thing) and with more flexibility, using specialized equipment designed for automation applications.\nArtificial intelligence (AI) and machine learning (ML) are two other trending topics, but they are not quite ready for widespread real-world applications. 
However, modern examples still exist:\n\nHaving a system that can bring up all relevant information on a research question\u2014a sort of super Google\u2014or a variation of IBM\u2019s Watson could have significant benefits.\nAnalyzing complex data or large volumes of data could be beneficial, e.g., the analysis of radio astronomy data to find fast radio bursts (FRB).[3]\n\"[A] team at Glasgow University has paired a machine-learning system with a robot that can run and analyze its own chemical reaction. The result is a system that can figure out every reaction that's possible from a given set of starting materials.\"[4]\nHelixAI is using Amazon's Alexa as a digital assistant for laboratory work.[5]\nHowever, there are problems using these technologies. ML systems have been shown to be susceptible to biases in their output depending on the nature and quality of the training materials. As for AI, at least in the public domain, we really don\u2019t know what that is, and what we think it is keeps changing as purported examples emerge. One large problem for lab use is whether you can trust an AI's output. We are used to the idea that lab systems and methods have to be validated before they are trusted, so how do you validate a system based on ML or AI?
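One partial answer is to treat an ML model the way we treat any other new method: challenge it with reference samples it has never seen and hold it to predefined acceptance criteria. The sketch below is a minimal illustration of that idea, not an established validation protocol; the model object, its predict() method, the reference set, and the five percent threshold are all assumptions for illustration.\n\n# Minimal sketch: challenge an ML model with reference samples of known value,\n# much as a new method is checked against standards. \"model\" is assumed to\n# expose a predict() method; known values are assumed to be nonzero.\ndef validate_model(model, reference_set, max_relative_error=0.05):\n    \"\"\"Return (passed, failures) after checking every reference sample.\"\"\"\n    failures = []\n    for sample, known_value in reference_set:\n        predicted = model.predict(sample)\n        relative_error = abs(predicted - known_value) \/ abs(known_value)\n        if relative_error > max_relative_error:\n            failures.append((sample, known_value, predicted))\n    return len(failures) == 0, failures\n\nRe-running such a check periodically against standard samples would also flag drift as the model or its inputs change, which is one way validating ML differs from validating fixed software.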
\n\nEducation \nThe major issue in all of this is having people educated to the point where they can successfully handle the planning and management of laboratory technology. One key point: most lab management programs focus on personnel issues, but managers also have to understand the capabilities and limitations of information technology and automation systems.\nOne result of the COVID-19 pandemic is that we are seeing the limitations of the four-year undergraduate degree program in science and engineering, as well as the state of remote learning. With the addition of information technologies, general systems thinking and modeling[c], statistical experimental design, and statistical process control, laboratory work has become multidisciplinary. We need options for continuing education throughout people\u2019s careers so they can maintain their competence and learn new material as needed.\n\nMaking laboratory informatics and automation work \nMaking laboratory informatics and automation work? \"Isn\u2019t that a job for IT or lab personnel?\" someone might ask. One of the problems in modern science is the development of specialists in disciplines. The laboratory and IT fields have many specialties, and specialists can be very good within those areas while at the same time not having an appreciation of wider operational issues. Topics like lab operations, technology management, and planning aren\u2019t covered in formal education courses, and they're often not well-covered in short courses or online programs.\n\u201cMaking it work\u201d depends on planning performed at a high enough level in the organization to encompass all affected facilities and departments, including information technology (IT) and facilities management. This wider perspective gives us the potential for synergistic operations across labs, consistent policies for facilities management and IT, and more effective use of outside resources (e.g., lab information technology support staff [LAB-IT], laboratory automation engineers [LAEs][d], equipment vendors, etc.). \nWe need to apply the same diligence to planning lab operations as we do any other critical corporate resource. Planning provides a structure for enabling effective and successful lab operations.\n\nIntroduction to this section \nThe common view of science laboratories is that of rooms filled with glassware, lab benches, and instruments being used by scientists to carry out experiments. While this is a reasonable perspective, what isn\u2019t as visually obvious is the end result of that work: the development of knowledge, information, and data.\nThe progress of laboratory work\u2014as well as the planning, documentation, and analytical results related to that work\u2014has been recorded in paper-based laboratory notebooks for generations, and people are still using them today. However, these aren't the only paper records that have existed and are still in use; scientists also depend on charts, log books, administrative records, reports, indexes, and reference material. The latter half of the twentieth century introduced electronics into the lab and with it electronic recording in the form of computers and data storage systems. Early adopters of these technologies had to extend their expertise into the information technology realm because there were few people who understood both these new devices and their application to lab work\u2014you had to be an expert in both laboratory science and computer science.\nIn the 1980s and 90s, computers became commonplace; where once you had to understand hardware, software, operating systems, programming, and application packages, you simply had to know how to turn them on. No more impressive arrays of blinking lights, just a blinking cursor waiting for you to do something.\nAs systems gained ease-of-use, however, we lost the basic understanding of what these systems were and what they did, that they had faults, and that if we didn\u2019t plan for their effective use and counter those faults, we were opening ourselves to unpleasant surprises. The consequences at times were system crashes, lost data, and a lack of a real understanding of how the output of an instrument was transformed into a set of numbers, which meant we couldn\u2019t completely account for the results we were reporting. \nWe need to step back, take control, and institute effective technology planning and management, with appropriate corresponding education, so that the data we put into laboratory informatics technologies yield the desired outcomes. We need to ensure that these technologies are providing a foundation for improving laboratory operations efficiency and a solid return on investment (ROI), while substantively advancing your business' ability to work and be productive. That's the purpose of the work we'll be discussing.\n\nThe point of planning \nThe point of planning and technology management is pretty simple: to ensure ...\n\nthat the right technologies are in people's hands when they need them, and\nthat those technologies complement each other as much as possible.\nThese are straightforward statements with a lot packed into them.\nRegarding the first point, the key words are \u201cthe right technologies.\u201d In order to define what that means, lab personnel have to understand the technologies in question and how they apply to their work. If those personnel have used or were taught about the technologies under consideration, it should be easy enough to do. However, laboratory informatics doesn\u2019t fall into that basket of things. The level of understanding has to be more than superficial. 
While personnel don\u2019t have to be software developers, they do have to understand what is happening within informatics systems, and how data processing handles their data and produces results. Determining the \u201cright technologies\u201d depends on the quality and depth of education possessed by lab personnel, and eventually by lab information technology support staff (LAB-IT?) as they become involved in the selection process.\nThe second point also has a lot buried inside it. Lab managers and personnel are used to specifying and purchasing items (e.g., instruments) as discrete tools. When it comes to laboratory informatics, we\u2019re working with things that connect to each other, in addition to performing a task. When we explore those connections, we need to assess how they are made, what we expect to gain, what compatibility issues exist, how to support them, how to upgrade them, what their life cycle is, etc. Most of the inter-connected devices people encounter in their daily lives are things that were designed to be connected using a limited set of choices; the vendors know what those choices are and make it easy to do so; otherwise their products won\u2019t sell. The laboratory technology market, on the other hand, is too open-ended. The options for physical connections might be there, but are they the right ones, and will they work? Do you have a good relationship with your IT people, and are they able to help (not a given)? Again, education is a major factor.\n\nWho is responsible for laboratory technology planning and management (TPM)? \nWhen asking who is responsible for TPM, the question really is \"who are the TPM stakeholders,\" or \"who has a vested interest in seeing TPM prove successful?\"\n\nCorporate or organizational management: These stakeholders set priorities and authorize funding, while also rationalizing and coordinating goals between groups. Unless the organization has a strong scientific base, they may not appreciate the options and benefits of TPM in lab work, or the possibilities of connecting the lab into the rest of the corporate data structure.\nLaboratory management: These stakeholders are responsible for developing and implementing plans, as well as translating corporate goals into lab priorities.\nLaboratory personnel: These stakeholders are the ones that actually do the work, and they are in the best position to understand where technologies can be applied. They would also be relied on to provide user requirements documents for new projects and meet both internal and external (e.g., Food and Drug Administration [FDA], Environmental Protection Agency [EPA], International Organization for Standardization [ISO], etc.) performance guidelines.\nIT management and their support staff: While these stakeholders' traditional role is the support of computers, connected devices (e.g., printers, etc.) and network infrastructure, they may also be the first line of support for computers connected to lab equipment. IT staff either need to be educated to meet that need and support lab personnel, or have additional resources available to them. They may also be asked to participate in planning activities as subject matter experts on computing hardware and software.\nLAB-IT specialists: These stakeholders act as the \"additional resources\" alluded to in the previous point. These are crossover specialists that span the lab and IT spaces and can provide informed support to both. 
In most organizations, aside from large science-based companies, this isn\u2019t a real \"position,\" although once stated, its role is immediately recognized. In the past, I\u2019ve also referenced these stakeholders as being \u201claboratory automation engineers.\u201d[6]\nFacility management: These stakeholders need to ensure that the facilities support the evolving state of laboratory workspace requirements as traditional formats change to support robotics, instrumentation, computers, material flow, power, and HVAC requirements.\nCarrying out this work is going to rely heavily on expanding the education of those participating in the planning work; the subject matter goes well beyond material covered in degree programs.\n\nWhy put so much effort into planning and technology management? \nEarlier we mentioned paper laboratory notebooks, the most common recording device since scientific research began (although for sheer volume, it may have been eclipsed by computer hard drives). Have you ever wondered about the economics of laboratory notebooks? Cost is easy to understand, but the value of the data and information that is recorded there requires further explanation.\nThe value of the material recorded in a notebook depends on two key factors: the quality of the work and an inherent ability to put that documented work to use. The quality of the work is a function of those doing the work, how diligent they are, and the veracity of what has been written down. The inherent ability to use it depends upon the clarity of the writing, people\u2019s ability to understand it without recourse to the author, and access to the material. That last point is extremely important. Just by glancing at Figure 1, you can figure out where this is going.\n\nFigure 1. Paper notebooks' cost vs. value, as a function of usage\n\nAs a scientist\u2019s notebook fills with entries, it gains value because of the content. Once filled, it reaches an upper limit and is placed in a library. There it takes a slight drop in value because its ease-of-access has changed; it isn\u2019t readily at hand. As library space fills, the notebooks are moved to secondary storage (in one company I worked at, secondary storage consisted of trailers in a parking lot). Costs go up due to the cost of owning or renting the secondary storage and the space the notebooks take. The object's value drops, not because of the content but due to the difficulty in retrieving that content (e.g., which trailer? which box?). Unless the project is still active, the normal turn-over of personnel (e.g., via promotions, movement around the company, leaving the company) means that institutional memory diminishes and people begin to forget the work exists. If few researchers can remember it, find it, and access it, the value drops regardless of the resources that went into the work. That is compounded by the potential for physical deterioration of the object (e.g., water damage, mice, etc.).\nPreventing the loss of access to the results of your investment in R&D projects will rely on information technology. That reliance will be built upon planning an effective informatics environment, which is precisely where this discussion is going. How is putting your lab results into a computer system any different than a paper-based laboratory notebook? 
There are obvious things like faster searching and so on, but from our previous discussion, not much is different; you still have essentially a single point of failure, unless you plan for that eventuality. That is the fundamental difference and what will drive the rest of this writing:

Planning builds in reliability, security, and protection against loss. (Oh, and it allows us to work better, too!)
You could plan for failure in a paper-based system by making copies, but those copies still represent paper that has to be physically managed. With electronic systems, we can plan for failure by using automated backup procedures that make faithful copies, as many as we'd like, at low cost. This issue isn't unique to laboratory notebooks, but it is a problem for any organization that depends on paper records.
The difference between writing on paper and using electronic systems isn't limited to how the document is realized. If you were to use a typewriter, the characters would show up on the paper and you'd be able to read them; all you needed was the ability to read (which could include braille formats) and understand what was written. However, if you were using a word processor, the keystrokes would be captured by software, displayed on the screen, placed in the computer's memory, and then written to storage. If you want to read the file, you need something—software—to retrieve it from storage, interpret the file contents, determine how to display it, and then display it. Without that software the file is useless. A complete backup process has to include the software needed to read the file, plus all the underlying components it depends upon. You could correctly argue that the hardware is required as well, but there are economic tradeoffs as well as practical ones; you could transfer the file to other hardware and read it there, for example.
That point brings us to the second subject of this writing: technology management. What do I have to do to make sure that I have the right tools to enable me to work? The problem is simple enough when all you're concerned with is writing and preparing accompanying graphics. Upon shifting the conversation to laboratory computing, it gets more complicated. Rather than being concerned with one computer and a few software packages, you have computers that acquire and process data in real time[e], transmit it to other computers for storage in databases, and systems that control sample processing and administrative work. Not only do the individual computer systems, and the equipment and people they support, have to work well, they also have to work cooperatively, and that is why we have to address planning and technology management in laboratory work.
That brings us to a consideration of what lab work is all about.

Different ways of looking at laboratories
When you think about a "laboratory," a lot depends on your perspective: are you on the outside looking in, do you work in a lab, or are you taking that high school chemistry class? When someone walks into a science laboratory, the initial impression is that of a confusing collection of stuff, unless they're familiar with the setting. "Stuff" can consist of instruments, glassware, tubing, robots, incubators, refrigerators and freezers, and even petri dishes, cages, fish tanks, and more, depending on the kind of work being pursued.
From a corporate point of view, a "laboratory" can appear differently and have different functions.
Possible corporate views of the laboratory include:

A laboratory is where questions are studied, which may support other projects or provide a source of new products, acting as basic and applied R&D. What is expected out of these labs is the development of new knowledge, usually in the form of reports or other documentation that can move a project forward.
A laboratory acts as a research testing facility (e.g., analytical, physical properties, mechanical, electronics, etc.) that supports research and manufacturing through the development of new test methods, special analysis projects, troubleshooting techniques, and both routine and non-routine testing. The laboratory's results come in the form of reports, test procedures, and other types of documented information.
A laboratory acts as a quality assurance/quality control (QA/QC) facility that provides routine testing, producing information in support of production facilities. This can include incoming materials testing, product testing, and product certification.
Typically, stakeholders outside the lab are looking for some form of result that can be used to move projects and other work forward. They want it done quickly and at low cost, but also want the work to be of high quality and reliability. Those considerations help set the goals for lab operations.
Within the laboratory there are two basic operating modes or workflows: project-driven or task-driven work. With project-driven workflows, a project goal is set, experiments are planned and carried out, the results are evaluated, and a follow-up course of action is determined. This all requires careful documentation for the planning and execution of lab work. This can also include developing and revising standard operating procedures (SOPs). Task-driven workflows, on the other hand, essentially depend on the specific steps of a process. A collection of samples needs to be processed according to an SOP, and the results recorded. Depending upon the nature of the SOP and the number of samples that have to be processed, the work can be done manually, using instruments, or with partial or full automation, including robotics. With the exception of QA/QC labs, a given laboratory can use a combination of these modes or workflows over time as work progresses and internal/external resources become available. QA/QC labs are almost exclusively task-driven; contract testing labs are as well, although they may take on project-driven work.
Within the realm of laboratory informatics, project-focused work centers on the electronic laboratory notebook (ELN), which can be described as a lab-wide diary of work and results. Task-driven work is organized around the laboratory information management system (LIMS)—or laboratory information system (LIS) in clinical lab settings—which can be viewed as a workflow manager of tests to be done, results to be recorded, and analyses to be finalized. Both of these technologies replaced the paper-based laboratory notebook discussed earlier, coming with considerable improvements in productivity. And although ELNs are considerably more expensive than paper systems, the short- and long-term benefits of an ELN overshadow that cost issue.

Labs in transition, from manual operation to modern facilities

Figure 2. Chemical laboratory at Oswego SUNY, 1893

Laboratories didn't start with lots of electronic components; they began with people, lab benches, glassware, Bunsen burners, and other equipment.
Lab operations were primarily concerned with people's ability to work. The technology was fairly simple by today's standards (Figure 2), and an individual's skills were the driving factor in producing quality results.
For the most part, the skills you learned in school were the skills you needed to be successful here as far as technical matters went; management education was another issue. That changed when electronic instrumentation became available. Analog instruments such as scanning spectrophotometers, chromatographs, mass spectrometers, differential scanning calorimeters, tensile testers, and so on introduced a new career path to laboratory work: the instrument specialist, who combined an understanding of the basic science with an understanding of the instrument's design, as well as how to use it (and modify it where needed), maintain it, troubleshoot issues, and analyze the results. Specialization created a problem for schools: they couldn't afford all the equipment, couldn't find enough knowledgeable instructors, and couldn't make room in the curriculum for the expanding subject matter. Schools were no longer able to educate people to meet the requirements of industry and graduate-level academia. And then digital electronics happened. Computers first became attached to instruments, and then incorporated into the instrumentation.[f]
The addition of computer hardware and software to an instrument increased the depth of specialization in those techniques. Not only did you have to understand the science noted above, but also the computer programs used to work with the instrument, how to collect the data, and how to perform the analysis. An entire new layer of skills was added to an already complex subject.
The latest level of complexity added to laboratory operations has been the incorporation of LIMS, ELNs, scientific data management systems (SDMS), and laboratory execution systems (LES), either as stand-alone modules or combined into more integrated packages or "platforms."

There's a plan for that?
It is rare to find a lab that has an informatics plan or strategy in place before the first computer comes through the door; those machines enter as part of an instrument-computer control system. Several computers may use that route to become part of the lab's technology base before people realize that they need to start taking lab computing seriously, including how to handle backups, maintenance, support, etc.
First computers come into the lab, and then the planning begins, often months later, as an incremental planning effort, which is the complete reverse of how things need to be developed. Planning is essential as soon as you decide that a lab space will be created. That almost never happens, in part because no one has told you it is required, let alone why or how to go about it.

Thinking about a model for lab operations
The basic purpose of laboratory work is to answer questions. "How do we make this work?" "What is it?" "What's the purity of this material?" These questions and others like them occur in chemistry, physics, and the biological sciences. Answering those questions is a matter of gathering data and information through observation and experimental work, organizing it, analyzing it, and determining the next steps needed as the work moves forward (Figure 3).
Effective organization is essential, as lab personnel will need to search data and information, extract it, move it from one data system to another for analysis, make decisions, update planning, and produce interim and ultimately final reports.

Figure 3. Simplified flow of data/information from sources to collection system

Once the planning is done, scientific work generally begins with collecting observations and measurements (Data/Information Sources 1–4, Figure 3) from a variety of sources. Lab bench work usually involves instrumentation, and many instruments have computer controls and data systems as part of them. This is the more visible part of lab work and the one that matches people's expectations for a "scientific lab." This is where most of the money is spent on equipment, materials, and people's expertise and time. All that expenditure of resources results in "the pH of the glowing liquid is 6.5," "the concentration of iron in the icky stuff is 1,500 ppm," and so on. That's the end result of all those resources, time, and effort put into the scientific workflow. That's why you built a million-dollar facility (in some spheres of science, such as astronomy, high-energy physics, and the space sciences, the cost of collection is significantly higher). So what do you do with those results? Prior to the 1970s, the collection points were paper: forms, notebooks, and other documents, all with their earlier discussed issues.
The material on those instrument data systems needs to be moved to an intermediate system for long-term storage and reference (the second step of Figure 3). This is needed because those initial data systems may fail, be replaced, or be added to as the work continues. After all, the data and information they've collected needs to be preserved, organized, and managed to support continued lab work.
The analyzed results need to be collected into a reference system that is the basis of long-term analysis, management/administration work, and reporting. This last system in the flow is the central hub of lab activities; it is also the distribution point for material sent to other parts of the organization (the third and fourth stages of Figure 3). While it is natural for scientists to focus on the production of data and information, the organization and centralized management of the results of laboratory work needs to be a primary consideration. That organization will be focused on short- and long-term data analysis and evaluation. The results get used to demonstrate the lab's performance toward meeting its goals, and to show those investing in your work that you've got your management act together, which is useful when looking for continued support.
Today, those systems come in two basic forms: LIMS and ELN. The details of those systems are the subject of a number of articles and books.[7] Without getting into too much detail:

LIMS are used to support testing labs managing sample workflows and planning, as well as cataloging results (e.g., short text and numerical information).
ELNs are usually found in research, functioning as an electronic diary of lab work for one or more scientists and technicians. The entries may contain extensive textual material, numerical entries, charts, graphics, etc.
The ELN is generally more flexible than a LIMS.
That distinction is simplistic; some labs support both activities and need both types of systems, or even a hybrid package. However, the description is sufficient to get us to the next point: the lifespan of systems varies, depending on where you are looking in Figure 3's model. Figure 4 gives a comparison.

Figure 4. The relative lifespan of laboratory systems

The experimental methods and procedures used in lab work will change over time as the needs of the lab change. Older instruments may be updated and new ones introduced. Retirement is a problem, particularly if data systems are part of the equipment: you have to have access to the data, and that need will live on long past the equipment's life. That is one reason that moving data and information to an intermediate system like an SDMS is important. However, in some circumstances, even that isn't going to be sufficient (e.g., regulated industries where the original data structures and the software that generated them need to be preserved as an operating entity). In those cases, you may have old computers stacked up just in case you need access to their contents. A better way is to virtualize the systems as containers on servers that support a virtualized environment.
Virtualization—making an electronic copy of a computer system and running it on a server—is potentially a useful technology in lab work; while it won't participate in day-to-day activities, it does have a role. Suppose you have an instrument-data system that is being replaced or retired. Maybe the computer is showing signs of aging or failing. What do you do with the files and software that are on the computer portion of the combination? You can't dispose of them because you may need access to those data files and software later. On the other hand, do you really want to collect computer systems that have to be maintained just to have access to the data if and when you need it? Instead, virtualization is a software/hardware technology that allows you to make a complete copy of everything that is on that computer—including operating system files, applications, and data files—and store it in one big file referred to as a "container." That container can be moved to a computer that acts as a virtual server and has software that emulates various operating environments, allowing the software in the container to run as if it were on its own computer hardware. A virtual server can support a lot of containers, and the operating systems in those containers can be updated as needed. The basic idea is that you don't need access to a separate physical computer; you just need the ability to run the software that was on it. If your reaction to that is one of dismay and confusion, it's time to buy your favorite IT person a cup of coffee and have a long talk. We'll get into more details when we cover data backup issues.
Why is this important to you?
While the science behind producing results is the primary reason your lab exists, gaining the most value from the results is essential to the organization overall. That value is going to be governed by the quality of the results, ease of access, the ability to find and extract needed information easily, and a well-managed K/I/D architecture. All of that addresses a key point from management's perspective: return on investment, or ROI.
If you can demonstrate that your data systems are well organized and maintained, and that you can easily find and use the results from experimental work to contribute to advancing the organization's goals, you'll make it easier to demonstrate solid ROI and gain funding for the projects, equipment, and people needed to meet your lab's goals.

The seven goals of planning and managing lab technologies
The preceding material described the need for planning and managing lab technologies, and for making sure lab personnel are qualified and educated to participate in that work. The next step is the actual planning. There are at least two key aspects to that work: planning activities that are specific and unique to your lab(s), and addressing broader-scope issues that are common to all labs. The discussion found in the rest of this guide is going to focus on the latter.
Effective planning is accomplished by setting goals and determining how you are going to achieve them. The following sections of this guide look at those goals, specifically:

Supporting an environment that fosters productivity and innovation
Developing high-quality data and information
Managing knowledge, information, and data effectively, putting them in a structure that encourages use and protects value
Ensuring a high level of data integrity at every step
Addressing security throughout the lab
Acquiring and developing "products" that support regulatory requirements
Addressing systems integration and harmonization
The material below begins the sections on goal setting. Some of these goals are obvious and understandable; others, like "harmonization," are less so. The goals are provided as an introduction rather than an in-depth discussion. The intent is to offer something suitable for the purpose of this material and a basis for a more detailed exploration at a later point. The intent of these goals is not to tell you how to do things, but rather what things need to be addressed. The content is provided as a set of questions that you need to think about. The answers aren't mine to give, but rather yours to develop and implement; it's your lab. In many cases, developing and implementing those answers will be a joint effort by all stakeholders.

First goal: Support an environment that fosters productivity and innovation
In order to successfully plan for and manage lab technologies, the business environment should ideally be committed to fostering a work environment that encourages productivity and innovation. This requires:

proven, supportable workflow methodologies;
educated personnel;
fully functional, inter-departmental cooperation;
management buy-in; and
systems that meet users' needs.
This is one of those statements that people tend to read, say "sure," and move on. But before you do that, let's take a look at a few points. Innovation may be uniquely human (we're not even going to consider AI here), and the ability to be "innovative" may not be universal.
People need to be educated, be able to separate facts from "beliefs," and question everything (which may require management support). Innovation doesn't happen in a highly structured environment; you need the freedom to question, challenge, etc. You also need the tools to work with. The inspiration that leads to innovation can happen anywhere, anytime. All of a sudden all the pieces fit. And then what?
That is where a discussion of tools and this work come together.
If a sudden burst of inspiration hits, you want to act on it now, not after traveling to an office, particularly if it is a weekend or vacation. You need access to knowledge (e.g., documents, reports), information, and data (K/I/D). In order to do that, a few things have to be in place:

Business and operations K/I/D must be accessible.
Systems security has to be such that a qualified user can gain access to K/I/D remotely, while preventing its unauthorized use.
Qualified users must have the hardware and software tools required to access the K/I/D, work with it, and transmit the results of that work to whoever needs to see it.
Qualified users must also be able to remotely initiate actions such as testing.
Those elements depend on a well-designed laboratory and corporate informatics infrastructure. Laboratory infrastructure is important because that is where the systems people need access to reside, and corporate infrastructure is important since corporate facilities have to provide access, controls, and security. Implementation of those corporate components has to be carefully thought through; they must be strong enough to frustrate unwarranted access (e.g., multi-factor logins) while allowing people to get real work done.
All of this requires flexibility and trust in people, an important part of corporate culture. This will become more important as society adjusts to new modes of working (e.g., working online due to a pandemic) and the realization that the fixed-format work week isn't the only way people can be productive. For example, working from home or off-site is increasingly commonplace. Laboratory professionals work in two modes: intellectual, which can be done anywhere, and the lab bench, where physical research tasks are performed. We need to strike a balance between those modes and the need for in-person vs. virtual contact.
Let's take another look at the previous Figure 3, which offered one possible structure for organizing lab systems:

Figure 3. Simplified flow of data/information from sources to collection system

The use of an intermediate file storage system like an SDMS and the aggregation of some instruments to a common computer (e.g., one chromatographic data system for all chromatographs vs. one per instrument) becomes more important for two reasons: first, it limits the number of systems that have to be accessed to search, organize, extract, and work with K/I/D, and second, it makes it easier to address security concerns. There are additional reasons why this organization of lab systems is advantageous, but we'll cover those in later installments. The critical point here is that a sound informatics architecture is key to supporting innovation. People need access to tools and K/I/D when they are working, regardless of where they are working from. As such, those same people need to be well versed in the capabilities of the systems available to them, how to access and use them, and how to recognize "missing technologies": capabilities they need but don't have access to, or that simply don't exist.
Imagine this. A technology expert consults for two large organizations, one tightly controlled (Company A), the other with a liberal view of trusting people to do good work (Company B). In the first case, getting work done can be difficult, with the expert fighting through numerous reviews, sign-offs, and politics.
Company A has a stated philosophy that they don't want to be first in the market with a new product, but would rather be a strong number two. They justify their position through the cost of developing markets for new products: let someone else do the heavy lifting and follow behind them. This is not a culture that spawns innovation. Company B, however, thrives on innovation. While processes and procedures are certainly in place, the company has a more relaxed philosophy about work assignments. If the expert has a realizable idea, Company B lets them run with it, as long as they complete their assigned workload in a timely fashion. This is what spurs the human side of innovation.

Second goal: Develop high-quality data and information
Asking staff to "develop high-quality data and information" seems like a pretty obvious point, but this is where professional experience and the rest of the world part company. Most of the world treats "data" and "information" as interchangeable words. Not here.
There are three key words that are going to be important in this discussion of goals: knowledge, information, and data (K/I/D). We'll start with "knowledge." The type of knowledge we will be looking at is at the laboratory/corporate level, the material that governs how a laboratory operates, including reports, administrative material, and, most importantly, standard operating procedures (SOPs). SOPs tell us how lab work is carried out via its methods, procedures, etc. (This subject parallels the topic of "data integrity," which will be covered later.) Figure 5 positions K/I/D with respect to each other within laboratory processes.

Figure 5. Simplified flow of data/information from sources to collection system

The diagram in Figure 5 is a little complicated, and we'll get into the details as the material develops. For the moment, we'll concentrate on the elements in black.
As noted above, SOPs guide activities within the lab. As work is defined—both research and testing—SOPs have to be developed so that people know how to carry out their tasks consistently. Our first concern, then, is proper management of SOPs. It sounds simple, but in practice it isn't. It's a matter of first developing and updating the procedures, documenting them, and then managing both the documents and the lab personnel using them.
When developing, updating, and documenting procedures, a lab will primarily be looking at the science it's working with and how regulatory requirements affect it, particularly in research environments. Once developed, those procedures will eventually need to be updated. But why is an update to a procedure needed? What will the effects of the update be, based on the changes that were made, and how do the results of the new version compare to the previous version? That last point is important, and to answer it you need a reference sample that has been run repeatedly under the older version so that you have a solid history of the results (i.e., a control chart) over time. You also need the ability to run that same reference sample under the new procedure to show that there are no differences, or that differences can be accounted for. If differences persist, what do you do about the previous test results generated under the old procedure?
The idea of running one or more stable reference samples periodically is a matter of instituting statistical process control over the analysis process.
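Where such a standard sample program is in place, the underlying arithmetic is simple enough to sketch in code. The following is a minimal example in Python, assuming reference-sample results are logged as plain numbers over time; it derives the mean and ±3-sigma Shewhart control limits from an established baseline and flags later results that fall outside them. The data and function names are illustrative only, not taken from any particular product.

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Mean and +/- 3-sigma control limits from an established
    baseline of reference-sample results (a list of floats)."""
    m = mean(baseline)
    s = stdev(baseline)
    return m, m - 3 * s, m + 3 * s

def check_results(baseline, new_results):
    """Flag reference-sample results outside the control limits,
    a signal that the analysis process may be drifting."""
    m, lo, hi = control_limits(baseline)
    return [(run_date, value) for run_date, value in new_results
            if not lo <= value <= hi]

# Illustrative data: repeated assays of the same reference sample (% purity)
baseline = [99.1, 98.9, 99.0, 99.2, 98.8, 99.1, 99.0, 98.9, 99.1, 99.0]
new_results = [("2021-06-01", 99.0), ("2021-06-08", 99.3), ("2021-06-15", 98.2)]

for run_date, value in check_results(baseline, new_results):
    print(f"out of control: {run_date} -> {value}")  # flags the 98.2 result
```

A production-grade program would add run rules for trends and shifts (e.g., the Western Electric rules), but even this level of checking makes drift visible to the lab before it becomes visible to the people relying on its results.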
Run consistently, such a program can show that a process is under control, detect drift in results, and demonstrate that the lab is doing its job properly. If multiple analysts are doing the same work, it can also reveal how their work compares and whether there are any problems. It is, in effect, looking over their shoulders, but that just comes with the job. If you find that the amount of reference material is running low, then phase in a replacement, running both samples in parallel to get a documented comparison and a clean transition from one reference sample to another. It's a lot of work and it's annoying, but you'll have a solid response when asked, "Are you confident in these results?" You can then say, "Yes, and here is the evidence to back it up."
After the SOPs have been documented, they must then be effectively managed and implemented. First, take note of the education and experience required for lab personnel to properly implement any SOP. Periodic evaluation (or even certification) would be useful to ensure things are working as they should. This is particularly true of procedures that aren't run often, as people may forget things.
Another issue of concern with managing SOPs is how to manage versioning. Consider two labs. Lab 1 is a well-run lab. When a new procedure is issued, the lab secretary visits each analyst, takes their copy of the old method, destroys it, provides a copy of the new one, requires the analyst to sign for receipt, and later requires a second signature after the method has been reviewed and understood. Additional education is also provided on an as-needed basis. Lab 2 has good intentions, but it's not as proactive as Lab 1. Lab 2 retains all documents on a central server. Analysts are able to copy a method to their machines and use it. However, there is no formalized method of letting people know when a new method is released. At any given time there may be several analysts running the same method using different versions of the related SOP. The end result is a mix of samples run by different people according to different SOPs.
This comparison of two labs isn't electronic versions vs. paper, but rather a formal management structure vs. a loose one. There's no problem maintaining SOPs in an electronic format, as there are many benefits, but there shouldn't be any question about the current version, and there should be a clear process for notifying people about updates while also ensuring that analysts are currently educated in the new method's use.
Managing this set of problems—analyst education, versions of SOPs, qualification of equipment, current reagents, etc.—was the foundation for one of the early original ELNs, SmartLab by VelQuest, now developed as a LES by Dassault Systèmes as part of the BIOVIA product line. And while Dassault's LES, and much of the BIOVIA product line, focuses narrowly on its intended market, the product remains suitable for any lab where careful control over procedure execution is warranted. This is important to note, as a LES is designed to guide a person through a procedure from start to finish, making it one step away from a full robotics system (robotics may play a role in stages of the process). The use of an LES doesn't mean that personnel aren't trusted or are deemed incompetent; rather, it is a mechanism for developing documented evidence that methods have been executed correctly.
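What that documented evidence looks like is easy to sketch. Below is a deliberately minimal illustration in Python (a toy, not a stand-in for a commercial LES) of step-level execution logging: each SOP step is presented in order, and who performed it, when, and any captured value are recorded. All names and steps here are hypothetical.

```python
from datetime import datetime, timezone

def run_procedure(sop_id, version, steps, analyst, capture):
    """Walk an SOP's steps in order, recording who executed each
    step, when it happened, and the value captured at that step.
    The returned record is the documented evidence of execution."""
    record = {"sop": sop_id, "version": version, "analyst": analyst,
              "steps": []}
    for step in steps:
        value = capture(step)  # in a real LES: an instrument reading or typed entry
        record["steps"].append({
            "step": step,
            "value": value,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
    return record

steps = ["Tare balance", "Weigh 1.00 g of sample", "Dilute to 100 mL", "Record pH"]
record = run_procedure("SOP-123", "4.0", steps, "jdoe",
                       capture=lambda step: input(f"{step}: "))
```

The code itself is trivial; the point is that every step leaves an attributable, time-stamped trace tied to a specific SOP version, which is exactly what an auditor, or your own quality group, will ask for.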
That evidence builds confidence in results.
LESs are available from several vendors, often as part of their LIMS or ELN offerings. Using any of these systems requires planning and scripting (a gentler way of saying "programming"), and the cost of implementation has to be balanced against the need (does the execution of a method require that level of sophistication?) and ROI.
Up to this point, we've looked at developing and managing SOPs, as well as at least one means of controlling experiment/procedure execution. However, there are other ways of going about this, including manual and full robotics systems. Figure 6 takes us farther down the K/I/D model to elaborate further on experiment/procedure execution.[g]

Figure 6. K/I/D flow diagram focusing on experiment/procedure execution

As we move from knowledge development and management (i.e., SOPs) to sample preparation (i.e., pre-experiment), the next step is usually some sort of measurement by an instrument, whether it is a pH meter or a spectrometer, yielding your result. That brings us to two words we noted earlier: "data" and "information." We'll note the differences between the two using a gas chromatography system as an example (Figure 7), as it and other chromatography systems are among the most widely used upper-tier instrumentation and are widely found in labs where chemical analysis is performed.

Figure 7. Gas chromatography "data" and "information"

As we look at Figure 7, we see that to the right of the vertical blue line is an output signal from a gas chromatograph. This is what chromatographers analyzed and measured when they carried out their work. The addition of a computer made life easier by removing the burden of calculations, but it also added complexity to the work in the form of having to manage the captured electronic data and information. An analog-to-digital (A/D) converter transformed those smooth curves into a sequence of numbers that are processed to yield parameters describing the peaks, which in turn were used to calculate the amount of substance in the sample. Everything up to that last calculation—left of the vertical blue line—is "data," a set of numerical values that, taken individually, have no meaning by themselves. It is only when we combine them with other data sets that we can calculate a meaningful result, which gives us "information."
The paragraph above describes two different types of data:

1. the digitized detector output, or "raw data," constituting a series of readings that could be plotted to show the instrument output; and
2. the processed digitized data that provides descriptors of the output, with those descriptors depending largely upon the nature of the instrument (in the case of chromatography, the descriptors would be peak height, retention time, uncorrected peak area, peak widths, etc.).
Both are useful, and neither should be discarded; the fact that you have the descriptors doesn't mean you don't need the raw data. The descriptors are processed data that depend on user-provided parameters. Changing the parameters can change the processing and the values assigned to those descriptors. If there are accuracy concerns, you need the raw data as a backup. Since storage is cheap, there really isn't any reason to discard anything, ever. (And in some regulatory environments, keeping raw data is mandated for a period of time.)
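The data-to-information progression can be made concrete with a few lines of code. The sketch below, in Python, takes a digitized detector trace (raw data), derives simple peak descriptors (processed data), and produces information only when the peak area is combined with a response factor from a reference standard. The threshold-based peak detection and rectangle-rule integration are deliberately naive; real chromatography data systems do far more sophisticated baseline correction and integration, and all numbers here are illustrative.

```python
def find_peak(trace, threshold=0.05):
    """Crude peak detection on a digitized detector trace: return
    (start, apex, end) indices of the first region above threshold."""
    start = end = None
    for i, y in enumerate(trace):
        if y > threshold and start is None:
            start = i
        elif start is not None and y <= threshold:
            end = i
            break
    end = end if end is not None else len(trace) - 1
    apex = max(range(start, end + 1), key=lambda i: trace[i])
    return start, apex, end

def descriptors(trace, dt, start, apex, end):
    """Processed data: peak height, retention time, uncorrected area."""
    area = sum(trace[start:end + 1]) * dt  # rectangle-rule integration
    return {"height": trace[apex], "retention_time": apex * dt, "area": area}

# Raw data: detector readings sampled every 0.1 s (illustrative values)
trace = [0.0, 0.01, 0.02, 0.3, 0.9, 1.4, 0.8, 0.2, 0.03, 0.0]
d = descriptors(trace, 0.1, *find_peak(trace))

# Information: the area means nothing until combined with a reference standard
response_factor = 2.0e-3  # e.g., mg of analyte per unit area, from a standard run
print(f"amount = {d['area'] * response_factor:.5f} mg")
```

Note that changing a single user-provided parameter (the threshold) changes where the peak starts and ends, and therefore the area and the reported amount, which is precisely why the raw trace has to be kept alongside the descriptors.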
If you want to study the data and how it was processed to yield a result, you need more data, specifically the reference samples (standards) used to evaluate each sample. An instrument file by itself is almost useless without the reference material run with that sample. Ideally, you'd want a file that contains all the sample and reference data analyzed in one session. That might be a series of manually analyzed samples or an entire auto-sampler tray.
Everything we've discussed here contributes positively to developing high-quality data and information. When methods are proven and you have documented evidence that they were executed by properly educated personnel using qualified reagents and instruments, you then have the instrument data to support each sample result and any other information gleaned from that data.
You might wonder what laboratorians did before computers. They dealt with stacks of spectra, rolls of chromatograms, and laboratory notebooks, all on paper. If they wanted to find the data (e.g., a pen trace on paper) for a sample, they turned to the lab's physical filing system to locate it.[h] Why does this matter? That has to do with our third goal.

Third goal: Manage K/I/D effectively, putting them in a structure that encourages use and protects value
In the previous section we introduced three key elements of laboratory work: knowledge, information, and data (K/I/D). Each of these resides in "database" structures ("data" in the general sense). We also looked at SOP management as an example of knowledge management, and distinguished "data" and "information" management as separate but related concerns. We also introduced flow diagrams (Figures 5 and 6) that show the relationship and development of each of those elements.
In order for those elements to justify the cost of their development, they have to be placed in systems that encourage utilization and thus retain their value. Modern informatics tools assist in many ways:

Document management systems support knowledge databases (and some LIMS and ELNs inherently support document management).
LIMS and ELNs provide a solid base for laboratory information, and they may also support other administrative and operational functions.
Instrument data systems and SDMS collect instrument output in the form of reports, data, and information.
You may notice there is significant functional redundancy as vendors try to create the "ultimate laboratory system." Part of lab management's responsibility is to define what the functional architecture should look like based on their current and perceived needs, rather than having it defined for them. It's a matter of knowing what is required and seeing what fits, rather than fitting requirements into someone else's idea of what's needed.
Managing large database systems is only one aspect of handling K/I/D. Another aspect involves the consideration of cloud vs. local storage systems. What option works best for your situation, is the easiest to manage, and is supported by IT? We also have to address the data held in various desktop and mobile computing devices, as well as benchtop systems like instrument data systems. There are a number of considerations here, not the least of which is product turnover (e.g., new systems, retired systems, upgrades/updates, etc.).
(Some of these points will be covered later on in other sections.)
What you should think about now is the number of computer systems and software packages that you use on a daily basis, some of which are connected to instruments. How many different vendors are involved? How big are the vendors (e.g., small companies with limited staff vs. large organizations)? How often do they upgrade their systems? What's the likelihood they'll be around in two or five years?
Also ask what data file formats the vendor uses; these formats vary widely among vendors. Some put everything in CSV files, others in proprietary formats. In the latter case, you may not be able to use the data files without the vendor's software. In order to maintain the ability to work with instrument data, you will have to manage the software needed to open the files and work with them, in addition to just making sure you have copies of the data files. In short, if you have an instrument-computer combination that does some really nice stuff and you want to preserve the ability to gain value from that instrument's data files, you have to make a backup copy of the software environment as well as the data files. This is particularly important if you're considering retiring a system that you'll still want to access data from; you may also have to maintain any underlying software license. This is where the previous conversation about virtualization and containers comes in.
If you think about a computer system, it has two parts: hardware (e.g., circuit boards, hard drive, memory, etc.) and software (e.g., the OS, applications, data files, etc.). From the standpoint of the computer's processor, everything is either data or instructions read from one big file on the hard drive, which the operating system has segmented for housing different types of files (that segmentation is done for your convenience; the processor just sees it all as a source of instructions and data). Virtualization takes everything on the hard drive, turns it into a complete file, and places that file onto a virtualization server, where it is stored as a file called a "container." That server allows you to log in, open a container, and run it as though it were still on the original computer. You may not be able to connect the original instruments to the containerized environment, but all the data processing functions will still be there. As such, a collection of physical computers can become a collection of containers. An added benefit of virtualization: if you're worried about an upgrade creating havoc with your application, make a container first as a backup.[i]
The advantage of all this is that you continue to have access to, and the ability to gain value from, all of your data and information even if the original computer has gone to the recycling bin. This of course assumes your IT group supports virtualization servers, which offer the added advantage of being easier to maintain and taking up little space. In larger organizations this may already be happening; in smaller organizations, a conversation may be needed to determine IT's stance. The potential snag in all this is whether or not the software application's vendor license will cover the operation of their software on a virtual server. That is something you may want to negotiate as part of the purchase agreement when you buy the system.
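Short of full virtualization, you can at least make the backup copy of the software environment and data files verifiable. The sketch below, a Python illustration rather than a recommended procedure, bundles a directory tree into a single archive and writes a SHA-256 manifest alongside it, so that years later you can demonstrate the archived files are bit-for-bit the ones you stored. The paths are hypothetical.

```python
import hashlib, json, tarfile
from pathlib import Path

def sha256(path):
    """Checksum a file in chunks so large data files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def archive_with_manifest(source_dir, archive_path):
    """Bundle a directory (e.g., an instrument PC's application and data
    folders) into one archive, plus a per-file checksum manifest that
    can be used to verify the archive's contents later."""
    source = Path(source_dir)
    manifest = {str(p.relative_to(source)): sha256(p)
                for p in source.rglob("*") if p.is_file()}
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(source, arcname=source.name)
    Path(f"{archive_path}.manifest.json").write_text(json.dumps(manifest, indent=2))

# Hypothetical instrument data system being retired
archive_with_manifest("C:/HPLC32/data", "hplc32_retirement.tar.gz")
```

This preserves the files, but not the ability to run the software that interprets them, which is exactly why virtualization, and license terms that permit it, still matter.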
This section has shown that effective management of K/I/D is more than just the typical consideration of database issues, system upgrades, and backups. You also have to maintain and support the entire operating system, the application, and the data file ecosystem so that you have both the files needed and the ability to work with them.

Fourth goal: Ensure a high level of data integrity at every step
"Data integrity" is an interesting couple of words. It shows up in marketing literature to get your attention, often because it's a significant regulatory concern. There are different aspects to the topic, and the attention given often depends on a vendor's product or the perspective of a particular author. In reality, it touches on all areas of laboratory work. The following is an introduction to the goal, with more detail given in later sections.

Definitions of data integrity
There are multiple definitions of "data integrity." A broad encyclopedic definition can be found at Wikipedia, described as "the maintenance of, and the assurance of, data accuracy and consistency over its entire life-cycle" and "a critical aspect to the design, implementation, and usage of any system that stores, processes, or retrieves data."[8]
Another definition to consider comes from a more regulatory perspective, that of the FDA. In its view, data integrity focuses on the completeness, consistency, accuracy, and validity of data, particularly through a mechanism called the ALCOA+ principles. This means the data should be[9]:

Attributable: You can link the creation or alteration of data to the person responsible.
Legible: The data can be read both visually and electronically.
Contemporaneous: The data was created at the same time that the activity it relates to was conducted.
Original: The source or primary documents relating to the activity the data records are available, or certified versions of those documents are available, e.g., a notebook or raw database. (This is one reason why you should collect and maintain as much data and information from an instrument as possible for each sample.)
Accurate: The data is free of errors, and any amendments or edits are documented.
Plus, the data should be:

Complete: The data must include all related analyses, repeated results, and associated metadata.
Consistent: The complete data record should maintain the full sequence of events, with date and time stamps, such that the steps can be repeated.
Enduring: The data should be able to be retrieved throughout its intended or mandated lifetime.
Available: The data is able to be accessed readily by authorized individuals when and where they need it.
Both definitions revolve around the same point: the data a lab produces has to be reliable. The term "data integrity" and its associated definitions are a bit misleading. Reading the paragraphs above, you get the impression that the focus is on the results of laboratory work, when in fact it is about every aspect of laboratory work, including the methods used and the people who carry those methods out.
In order to gain meaningful value from laboratory K/I/D, you have to be assured of its integrity; "the only thing worse than no data is data you can't trust." That is the crux of the matter. How do you build that trust? Building a sense of confidence in a lab's data integrity efforts requires addressing three areas of concern and their paired intersections: science, people, and informatics technology.
Once we have successfully managed those areas and intersection points, we are left with the intersection common to all of them: constructed confidence in a laboratory's data integrity efforts (Figure 8).

Figure 8. The three areas contributing to data integrity and how they intersect

The science
We'll begin with a look at the scientific component of the conversation (Figure 9). Regardless of the kinds of questions being addressed, the process of answering them is rooted in methods and procedures. Within the context of this guide, those methods have to be validated, or else your first step in building confidence has failed. If those methods end with electronic measurements, then that equipment (including settings, algorithms, analysis, and reporting) has to be fully understood and qualified for use in the validated process. The manufacturer's default settings should either be demonstrated as suitable or avoided.

Figure 9. The "Science" component

The people
People (Figure 10) need to be thoroughly educated and competent to meet the needs of the laboratory's operational procedures and scientific work. That education needs to extend beyond the traditional undergraduate program and include the specifics of the instrumental techniques used. A typical four-year program doesn't have the time to cover both the basic science and the practical aspects of how science is conducted in modern labs, and few schools can afford the equipment needed to meet that challenge. This broader educational emphasis is part of the intersection of people and science.

Figure 10. The "People" component

Another aspect of "people" is the development of a culture that contributes to data integrity. Lab personnel need to be educated on the organization's expectations of how lab work needs to be managed and maintained. This includes items such as records retention, dealing with erroneous results, and what constitutes original data. They should also be fully aware of corporate and regulatory guidelines and the effort needed to enforce them.[j] This is another instance where education beyond that provided in the undergraduate curriculum is needed.

The informatics technology
Laboratory informatics technology (Figure 11) is another area where data integrity can either be enhanced or lost. The lab's digital architecture needs to be designed to support relatively easy access (within the scope of necessary security considerations) to the lab's data, from the raw digitized detector output, through intermediate processed stages, to the final processed information. Unnecessary duplication of K/I/D must be avoided. You also need to ensure that the products chosen for lab work are suitable for the work and have the ability to be integrated electronically. After all, the goal is to avoid situations where the output of one system is printed and then manually entered into another.

Figure 11. The "Informatics Technology" component

The implementation and use of informatics technology should be the result of careful product selection and intentional design—from the lab bench to central database systems such as LIMS, ELN, SDMS, etc.—rather than the haphazard accumulation of an aggregate of lab computers.
Other areas of concern with informatics technology include backups, security, and product life cycles, which will be addressed in later sections.
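Several of the ALCOA+ attributes (attributable, contemporaneous, original) map directly onto familiar software techniques. As a minimal, hypothetical illustration in Python (a sketch, not a validated implementation), the fragment below appends results to a log in which each entry records who and when, and carries a hash that chains it to the previous entry, so that any later alteration of an earlier record becomes detectable.

```python
import hashlib, json
from datetime import datetime, timezone

def append_entry(log, analyst, payload):
    """Append an attributable, time-stamped entry whose hash chains to
    the previous entry; silently editing history breaks the chain."""
    entry = {"analyst": analyst,                                       # attributable
             "timestamp": datetime.now(timezone.utc).isoformat(),     # contemporaneous
             "payload": payload,                                      # the original result
             "prev": log[-1]["hash"] if log else "0" * 64}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)

def verify(log):
    """Recompute every hash; returns True only if nothing was altered."""
    prev = "0" * 64
    for e in log:
        body = {k: e[k] for k in ("analyst", "timestamp", "payload", "prev")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True

log = []
append_entry(log, "jdoe", {"sample": "S-0042", "result_ppm": 1500})
assert verify(log)
```

Commercial LIMS and ELN audit trails are, of course, far richer than this, but the principle is the same: integrity is something a system's design can demonstrate, not merely assert.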
If, as we continue onward through these goals, it appears that everything touches on data integrity, that's because it does. Data integrity can be considered an optimal result of the sum of well-executed laboratory operations.

The intersection points
Two of the three intersection points deserve minor elaboration (Figure 12). First, the intersection of people and informatics technologies has several aspects to address. The first is laboratory personnel's responsibility—which may be shared with corporate IT or LAB-IT—for the selection and management of informatics products. The second is the fact that this requires those personnel to be knowledgeable concerning the application of informatics technologies in laboratory environments. Ensure the selected personnel have the appropriate backgrounds and knowledge to consider, select, and effectively use those products and technologies.
The other intersection point to be addressed is that of science with informatics technology. Here, stakeholders are concerned with product selection, system design (for automated processes), and system integration and communication with other systems and instruments. Again, as noted above, we go into more detail in later sections. The primary point here, however, can be summed up as determining whether or not the products selected for your scientific endeavors are compatible with your data integrity goals.

Figure 12. The intersection of People and Informatics Technology (left) and Science and Informatics Technology (right)

Addressing the needs of these two intersection points requires deliberate effort and many planning questions regarding vendor support, quality of design, system interoperability, result output, and scientific support mechanisms. Questions to ask include:

Vendor support: How responsive are vendors to product issues? Do you get a fast, usable response, or are you left hanging? A product that is having problems can affect data quality and reliability.
Quality of design: How easy is the system to use? Are controls, settings, and working parameters clearly defined and easily understood? Do you know what effect changes in those points will have on your results? Has the system been tuned to your needs (not adjusted to give you the answers you want, but set to give results that truly represent the analysis)? Problems with adjusting settings properly can distort results. (This is one area where data integrity may be maintained throughout a process, and then lost because of improper or untested controls on an instrument's operation.)
System interoperability: Will there be any difficulty in integrating a software product or instrument into a workflow? Problems with sample container compatibility, operation, control software, etc. can cause errors to develop in the execution of a process flow. For example, problems with pipette tips can cause errors in fluid delivery.
Result output: Is an electronic transfer of data possible, or does the system produce printed output (which means someone typing results into another system)? How effective is the communications protocol? Is it based on a standard, or does it require custom coding, which could be error-prone or subject to interference? Is the format of the data file one that prevents changes to the original data?
For example, CSV files allow easy editing and have the potential for corruption, nullifying data integrity efforts.
Scientific support mechanisms: Does the product fully meet the intended need for functionality, reliability, and accuracy?
The underlying goal in this section goes well beyond the material that is covered in schools. Technology development in instrumentation and the application of computing and informatics is progressing rapidly, and you can't assume that everything is working as advertised, particularly for your application. Software has bugs and hardware has limitations. Applying healthy skepticism toward products and requiring proof that things work as needed protects the quality of your work.
If you're a scientist reading this material, you might wonder why you should care. The answer is simply this: it is the modern evolution of how laboratory work gets done and how results are put to use. If you don't pay attention to the points noted, data integrity may be compromised. You may also find yourself the unhappy recipient of a regulatory warning letter.
While there are some outcomes you'd prefer didn't occur, there are also positive outcomes that come from your data integrity efforts: your work will be easier and protected from loss, results will be easier to organize and analyze, and you'll have a better-functioning lab. You'll also have fewer unpleasant surprises when technology changes occur and you need to transition from one way of doing things to another. Yet there's more to protecting the integrity of your K/I/D than addressing the science, people, and information technology of your lab. The security of your lab and its information systems must also be addressed.

Fifth goal: Addressing security throughout the lab
Security is about protection, and there are two considerations in this matter: what are we protecting, and how do we enact that protection? The first is easily answered by stating that we're protecting our ability to work effectively, as well as the results of that work. This is largely tied to the laboratory's data integrity efforts. The second consideration, however, requires a few more words.
Broadly speaking, security is not a popular subject in science, as it is viewed as not advancing scientific work or the development of K/I/D. Security is often viewed as inhibiting work by imposing a behavioral structure on people's freedom to do their work how they wish. Given these perceptions, it should be a lab's goal to create a functional security system that provides the protection needed while minimizing the intrusion into people's ability to work.
This section will look at a series of topics that address the physical and electronic security of laboratory work. Those major topics are shown in Figure 13 below. The depth of the commentary will vary, with some topics discussed at length and others addressed by brief reference to others' work.

Figure 13. The key issues of laboratory security

Why must security be addressed in the laboratory? There are many reasons, which are best diagrammed, as seen in Figure 14:

Figure 14. The primary reasons why security issues need to be addressed in the lab

All of these reasons have one thing in common: they affect our ability to work and to access the results of that work. This requires a security plan.
In the end, implemented security efforts either preserve those abilities or reduce the value and utility of the work and results, particularly if security isn't implemented well or burdens personnel's ability to work. While addressing these reasons and their corresponding protections, we should keep in mind a number of issues when developing and implementing a security plan within the lab (Figure 15). Issues like remote access have taken on particular significance over the course of the COVID-19 pandemic.

Figure 15. Issues to keep in mind when planning security programs

When the subject of security comes up, people's minds usually go in one of two directions: physical security (i.e., controlled access) and electronic security (i.e., malware, viruses, ransomware, etc.). We're going to come at it from a different angle: how do the people in your lab want to work? Instead of looking at a collection of solutions to security issues, we're going to first consider how lab personnel want to be working and within what constraints, and then we'll see what tools can be used to make that possible. Coming at security from that perspective will impact the tools you use and their selection, including everything from instrument data systems to database products, analytical tools, and cloud computing. The lab bench is where work is executed, and the planning and thinking take place between our ears, something that can happen anywhere. How do we provide people with the freedom to be creative and work effectively (something that may be different for each of us) while maintaining a needed level of physical and intellectual property security? Too often, security procedures seem to be designed to frustrate work, as noted in the previous Figure 15.
The purpose of security procedures is to protect intellectual property, data integrity, resources, our ability to work, and lab personnel, all of which can be impacted by the reasons given in the prior Figure 14. However, planning how to approach these security procedures requires coordination with, and the cooperation of, several stakeholders within and tangentially related to the laboratory. Ensure these and any other necessary stakeholders are involved with the security planning efforts of your laboratory:

Facilities management: These stakeholders manage the physical infrastructure you are working in and have overall responsibility for access control and managing the human security assets in larger companies. In smaller companies and startups, the first line of security may be the receptionist; how well trained are they to deal with the subject?
IT groups: These stakeholders will be responsible for designing and maintaining (along with facilities management) the electronic security systems, which range from passkeys to networks.
Legal: These stakeholders may work with human resources to set personnel standards for security, reviewing licensing agreements and contracts/leases for outside contractors and buildings (more later).
Lab personnel: From the standpoint of this guide, this is all about the people doing the analytical and research work within the laboratory.
Consultants: Security is a complex and rapidly developing subject, and you will likely need outside support to advise you on what is necessary and possible, as well as how to go about making that a reality.
But what else must you and your stakeholders consider during these planning efforts?
Before we can get into the specific technologies and practices that may be implemented within a facility, we need to look at the facility itself.

Examine aspects of the facility itself
Does your company own the building you are working in? Is it leased? Is it shared with other companies in a single industrial complex? If you own the facility, life is simpler since you control everything. Working in a shared space that is leased or rented requires more planning and thought, preferably before you sign an agreement. You're likely to have additional aspects of your facility to seriously consider. Have the locks and door codes been changed since the last tenant left? Is power shared across other businesses in your building? Is the backup generator—if there is one—sufficient to run your systems? What fire protections are in place? How is networking managed in the facility? Are security personnel attuned to the needs of your company? Let's take a look at some of these and other questions that should be addressed.
Is the physical space well-defined, and does building maintenance have open access to your various spaces?
Building codes vary from place to place. Some are very specific and strict, while others are almost live-and-let-live. One thing you want to be able to do is define and control your organization's physical space and set up any necessary additional protective boundaries. Physical firewalls are one way of doing that. A firewall should be a solid structure that acts as a barrier to fire propagation between your space and neighboring spaces, extending from below-ground areas to the roof. If it is a multi-level structure, the levels should be isolated. This may seem obvious, but in some single-level shared buildings (e.g., strip malls) the walls may not go all the way to the roof, making it easier to route utilities like HVAC, power, and fire suppression. This can act as an access point into your space.
Building maintenance is another issue. Do they have access to your space? Does that access come with formal consent, or is that consent assumed as part of the lease or rental agreement? Several problems must be considered. First, know that anyone who has access to your physical space should be considered a weak point in your security. Employees should inherently have a vested interest in protecting your assets, but building maintenance is a different matter. Who vets them? Since these notes are focused on laboratory systems, who trains them about what to touch and what not to? (For example, an experiment could be ruined because maintenance personnel opened a fume hood, disturbing the airflow, despite any signage placed on the hood glass.) Consider more than just office systems in your space analysis, including other equipment that may be running after-hours and doesn't handle tampering, curiosity, or power outages well. Do you have robotics running multiple shifts or other moving equipment that might attract someone's curiosity? Security cameras would be useful, as would "Do Not Enter" signs.
Second, most maintenance staff will notify you (hopefully in writing) about their activities so you can plan accordingly, but what about emergency issues? If they have to fix a leak or a power problem, what are the procedures for safely shutting down systems? Do they have a contact person on your staff in case a problem occurs? Is there hazardous material on-site that requires special handling? Are the maintenance people aware of it and how to handle it?
Answers to these questions should be formalized in policy, disseminated to both maintenance and security management personnel, and made available to new personnel who may not be up to speed.
Is power shared across other businesses in your building?
Shared power is another significant issue in any building environment. Unless someone paid careful attention to a lab's needs during construction, it can affect any facility. A number of issues can arise from misconfigured or unsupported power systems. Real-life examples of issues a computer support specialist friend of mine has encountered in the past include computers that:

were connected to the same circuit box as heavy-duty garage doors. Deliveries would come in early in the morning, and when the doors opened the computers crashed.
were on the same circuit as air conditioners. The computers didn't crash, but the electrical noise and surging power use created havoc with systems operations and disk drives.
were connected to circuits that didn't have proper grounding, or had separate grounding systems in the same room. Some didn't have external grounding at all. We worked on a problem with one computer-instrument system that had each device plugged into a different power outlet. The computer's power supply was grounded, but the instrument's wasn't; once that was fixed, everything worked.
were too close to a radio tower. Every night when the radio system changed its antenna configuration, the computer experienced problems. Today, many devices generate radio signals that might interfere with each other. The fact that they are "digital" systems doesn't matter; they are made of analog components.
Is the power clean, and is the backup generator—if there is one—sufficient to run your systems?
Another problem is power continuity and quality. Laboratories depend on clean, reliable power. What will the impact of power outages—lasting anywhere from seconds to days—be on your ability to function? The longer end of the scale is easy: you stop working or relocate critical operations. Generators are one solution option, and we'll come back to those. The shorter outages, particularly if they are of the power up-down-up variety, are a separate issue. Networkable sensors with alarms and alerts that permit remote monitoring of power, refrigeration, etc. may be required. Considerations for these intermittent outages include:

Do you know when they happened? What was their duration? How can you tell? (Again, consider sensor-based monitoring; a brief monitoring sketch appears after the battery-backup discussion below.)
What effect did intermittent outages have on experiments that were running? Did the systems and instruments reset? Was data lost? Were in-process samples compromised?
What effect did they have on stored samples? If samples had to be maintained under controlled climate conditions, were they compromised?
Did power loss and power cycling cause any problems with instrumentation? How do you check?
Did systems fail into a safe mode?
How real are power problems? As Ula Chrobak notes in an August 2020 Popular Science article, infrastructure failures, storms, climate change, etc. are not out of the realm of possibility; if you were in California during that time, you saw the reality first-hand.[10]
If laboratory operations depend on reliable power, what steps can we take to ensure that reliability?
First, site selection naturally tops the list.
You want to be somewhere that has a reputation for reliable power and rapid repairs if service is lost. A site with buried wiring would be optimal, but that only benefits you a little if the industrial park has buried wiring but is actually fed by overhead wiring. Another consideration is the age of the site: an older, established site may have outdated cables that are more likely to fail. The geography is also important. Nearby rivers, lakes, or an ocean might make the site prone to flooding, causing water intrusion into wiring. Also, don't overlook the potential issues associated with earthquakes or nearby industries with hazardous facilities such as chemical plants or refineries. Areas prone to severe weather conditions are an additional consideration.
Second, address the overall quality of the building and its infrastructure. This affects buildings you own as well as lease; however, the difference is in your ability to make changes. How old is the wiring? Has it been inspected? Are the grounding systems well implemented? Do you have your own electrical meters, and is your power supply isolated from other units if you are leasing? Will your computers and instruments be on circuits isolated from heavy equipment and garage doors? Make an estimate of your power requirements, then at least double it. Given that, is there sufficient amperage coming into the site to manage all your instruments, computers, HVAC systems, and freezers? How long will you be occupying that space, and is there sufficient power capacity to support potential future expansion?
Third, consider how to implement generators and battery backup power. These are obvious solutions to power loss, yet they come with their own considerations:

Who has control over generator implementation? If you own the building, you do. If the building is leased, the owner does, and they may not even provide generator backup power. If not, your best bet—unless you are planning on staying there for a long time—is to go somewhere else; the cost of installing, permitting, and maintaining a generator on a leased site may be prohibitive. A good whole-house system can run up to $10,000, plus the cost of a fueling system.
How much power will you need and for how long, and is sufficient fuel available? Large propane tanks may need to be buried. Diesel is another option, though fire codes may limit fuel choices in multi-use facilities. The expected duration of an outage is also important. Often we think perhaps a few hours, but snow, ice, hurricanes, tornadoes, and earthquakes may push that out to a week or more.
Is the generator's output suitable for the computers and instruments in your facility? A major problem to acknowledge is electrical noise: too much and you'll create more problems than you would have if the equipment had just been shut down.
What is the startup delay of the generator? A generator can take anywhere from a few seconds to several minutes to get up to speed and produce power. Can you afford that delay? Probably not.
The answer to the problems noted in the last two bullets is battery backup power. These can range from individual units that are used one-per-device, like home battery backups for computers and other equipment, to battery walls that are being offered for larger applications. The advantage is that they can come online anywhere from instantly (i.e., always-on, online systems) to a few milliseconds for standby systems.
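Before returning to the battery options, here is the promised note on the outage-logging questions raised earlier: a log of when power was lost and restored doesn't require elaborate tooling. The sketch below is a minimal Python example; the sensor URL and its JSON reply are hypothetical stand-ins for whatever networked power monitor a lab actually installs, so treat this as a shape, not an implementation.

```python
# Minimal power-event logger (sketch). The sensor URL and its JSON
# reply ({"power_ok": true/false}) are hypothetical placeholders;
# substitute the API of whatever networked monitor your lab installs.
import json
import time
import urllib.request
from datetime import datetime, timezone

SENSOR_URL = "http://labmonitor.example/power"  # hypothetical endpoint
LOG_FILE = "power_events.log"
POLL_SECONDS = 10

def power_ok() -> bool:
    with urllib.request.urlopen(SENSOR_URL, timeout=5) as resp:
        return bool(json.load(resp)["power_ok"])

def log_event(message: str) -> None:
    stamp = datetime.now(timezone.utc).isoformat()
    with open(LOG_FILE, "a") as log:
        log.write(f"{stamp}  {message}\n")

last_state = True
outage_began = None
while True:
    try:
        state = power_ok()
    except OSError:
        state = False  # an unreachable sensor is treated as an outage
    if state != last_state:
        if not state:
            outage_began = time.monotonic()
            log_event("power LOST")
        else:
            duration = time.monotonic() - (outage_began or time.monotonic())
            log_event(f"power RESTORED after {duration:.0f} s")
        last_state = state
    time.sleep(POLL_SECONDS)
```

Even a log this crude answers "when did it happen and for how long," which is often enough to decide whether running samples or stored materials need re-examination.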
Returning to battery backup: the always-on, online options contain batteries that are constantly being charged while constantly providing power to whatever they are connected to. More expensive than standby systems, they provide clean power even from a source that might otherwise be problematic. On the other hand, standby systems are constantly charging but pass through power without conditioning: noisy power in, noisy power out, until a power failure occurs.

Security and the working environment
When we look at security as a topic, we have to keep in mind that we are affecting people's ability to work. Some of the laboratory's work is done at the lab bench or on instruments (which, depending on the field you're in, could range from pH meters to telescopes). However, significant work occurs away from the bench, with thinking and planning happening wherever a thought strikes. What kind of flexibility do you want people to have? Security will often be perceived as something that gets in the way of personnel's ability to work, despite the fact that well-implemented security protects their work.
We need to view security as a support structure enabling flexibility in how people work, not simply as a series of barriers that frustrate people. You can begin by defining the work structure as you'd like it to be, while recognizing that there are two sides to lab work: the intellectual (planning, thinking, evaluating, etc.) and the performed (where you have to work with actual samples, equipment, and instruments). One can be done anywhere, while the other is performed in a specific space. The availability of computers and networks can blur the distinction.
Keeping these things in mind, any security planning should consider the following:

How much flexibility do personnel want in their work environment vs. what they can actually have? In some areas of work, there may be a significant need for lockdown with little room for flexibility, while other areas may be pretty loose.
Do people want to work from remote places? Does the nature of the work permit it? This can be motivated by anything from "the babysitter is sick" to "I just need to get away and think."
While working remotely, do people need access to lab computers for data, files (i.e., upload/download), or to interact with experiments? Some of this can be a matter of respecting people's time. If you have an experiment running overnight or during the weekend, it would be nice to check the status remotely instead of driving back to work.
Do people need after-hours access to the lab facilities?
The answers to these planning questions lay the groundwork for hardware, software, and security system requirements. Can you support the needs of personnel, and if so, how is security implemented to make it work? Will you be using gateway systems to the lab network, with additional logins for each system, two-factor authentication, or other mechanisms? The goal is to allow people to be as productive as possible while protecting the organization's resources and meeting regulatory requirements. That said, keep in mind that unless physical and virtual access points are well controlled, others may compromise the integrity of your facility and its holdings.
Employees need to be well educated in security requirements in general and how they are implemented in your facility.
They need to be a willing part of the processes, not grudgingly accepting of them; that lack of willingness to work within the system is a security weak point, producing rules people will try to circumvent. One obvious problem is with username-password combinations for computer access; rather than requiring that information to be typed in, biometric features are faster and less error-prone.
That said, personnel should readily accept that no system should be open to unauthorized access, and that hierarchical levels of control may be needed, depending on the type of system; some people will have access to some capabilities and not others. This type of "role-based" access shouldn't be viewed as a matter of trust, but rather as a matter of protection. Unless the company is tiny, senior management, for example, shouldn't have administrative-level access to database systems or robotics. If management is going to have access at those levels, ensure they know exactly what they are doing. By denying access to areas not needed in a role-based manner, you limit the ability of personnel to improperly interrogate or compromise those systems for nefarious purposes.
What are your security control requirements?
Figure 16 lists some of the key areas of concern for security controls. Some we'll touch on; others we'll leave to those better informed (e.g., see Tulsi 2019[11] and Riley 2020[12]).

Figure 16. Key areas of concern for security controls

What is your policy on system backups?
When it comes to your computer systems, are you backing up their K/I/D? If so, how often? How much K/I/D can you afford to lose? Look at your backups on at least three levels. First, backing up the hard drive of a computer protects against failure of that computer and drive. Second, backing up all of a lab's data systems to an on-site server in a separate building (or virtualized locally) protects against physical damage to the lab (e.g., fire, storms, earthquakes, floods, etc.). Third, backing up all of a lab's data systems to a remote server (or virtualized remotely) provides even more protection against physical damage to the lab facility, particularly if the server is located someplace that won't be affected by the same problems your site may be facing. It should also be somewhere that doesn't compromise legal control over your data; if it is on a third-party server farm in another country, that country's laws apply to access and seizure of your files if legal action is taken against you.
Should your various laboratory spaces and components be internet-connected?
When looking at your lab bench spaces, instruments, database systems, etc., determine whether they should be connected to the internet. This largely depends on what capabilities you expect to gain from internet access. Downloading updates, performing online troubleshooting with vendors, and conducting online database searches (e.g., spectra, images, etc.) are a few useful capabilities, but are they worth the potential risk of intrusion? Does your IT group have sufficient protection in place to allow access and still be protected? Note, however, that any requirement for a cloud-based system would render this a moot point.
Lab systems should be protected against any intrusion, including intrusion by vendors. Vendor-provided files can be downloaded to flash drives, which can then be checked for malware and integrity before being manually transferred to lab systems. Consider what is more important: convenience or data protection?
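On the point of checking vendor-provided files before they touch lab systems: malware scanning is a job for dedicated tools, but integrity is easy to verify when the vendor publishes a checksum alongside the download. A minimal sketch follows; the file name and the published digest are illustrative placeholders.

```python
# Verify a downloaded vendor file against its published SHA-256
# checksum before moving it onto a lab system. The file name and
# digest below are illustrative placeholders.
import hashlib

def sha256_of(path: str, chunk_size: int = 65536) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

published = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
actual = sha256_of("instrument_firmware_update.bin")
if actual != published:
    raise SystemExit("Checksum mismatch: do not install this file.")
print("Checksum verified; file is safe to transfer.")
```

A check like this catches both accidental corruption in transit and crude tampering, at essentially no cost in convenience.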
These considerations may give you more to think about as you weigh your preferred style of working (e.g., remote access). However, having trusted employees access the lab network is different from allowing third parties to do so.

Summary
We've only been able to touch on a few topics; a more thorough review would require a well-maintained document, as things are changing that quickly.[k] In many labs, security is a layered-on activity: as the work of the lab is planned, security issues are considered afterward. We'd be far better off if security planning was instead conducted in concert with lab systems planning; support for security would then become part of the product selection criteria.

Sixth goal: Acquiring and developing "products" that support regulatory requirements
Products should be supportable. That seems pretty simple, but what exactly does that mean? How do we acquire them, and more importantly, how do we develop them? The methods and procedures you develop for lab use are "products"—we'll come back to that.
First, here's an analogy using an automobile. The oil pan on a car may need to be replaced if it is leaking due to damage or a failed gasket; if it isn't repaired, the oil can leak out. Some vehicles are more difficult to work on than others, given their engineering. For example, replacing the oil pan in some cars requires you to lift the engine block out of the car. That same car design could also force you to move the air conditioner compressor to change spark plugs. In the end, some automobile manufacturers have built a reputation for cars that are easy to service and maintain, which translates into lower repair costs and longer service life.
How does that analogy translate to the commercial products you purchase for lab use, as well as the processes and procedures you develop for your lab? The ability to effectively (i.e., with ease, at low cost, etc.) support a product has to be baked into the design from the start. It can't be retrofitted.
Let's begin with the commercial products you purchase for lab use, including instruments, computer systems, and so on. One of the purchase criteria for those items is how well they are supported: mature products should have a better support infrastructure, built up over time and use. However, that doesn't always translate to high-quality support; you may find a new product getting eager support because the vendor is heavily invested in market acceptance, working the bugs out, and using referral sites to support their sales. When it comes to supporting these commercial products, we expect to see:

User guides – These should tell you how the device works, what the components are (including those you shouldn't touch), how to use the control functions, what the expected operating environment is, what you need to provide to make the item usable, and so on. For electronic devices with control signal communications and data communications, the vendor will describe how it works and how they expect it to be used, but not necessarily how to use it with third-party equipment. There are limitations of liability and implied support commitments that they prefer not to get involved with. They provide a level of capability, while it's largely left up to you to make it work in your application.
Training materials – These will take you from opening the box, to setting up whatever you've purchased, to walking through all the features and some examples of their use.
The intent is to get you oriented and familiar with using it, with the finer details located in user guides. Either this document or the user guide should tell you how to ensure that the device is installed and operating properly, and what to do if it isn't. This category can also include in-person short courses as well as online courses (an increasingly popular option, as something you can do at your convenience).
Maintenance and troubleshooting manuals – This material describes what needs to be periodically maintained (e.g., installing software upgrades, cleaning equipment, etc.) and what to do if something isn't working properly.
Support avenues – Be it telephone, e-mail, or online chat, there are typically many different ways of reaching the vendor for help. Online support can also include a "knowledgebase" of articles on related topics, as well as chat functions.
User groups – Whether conducted in-person or online, venues for giving users a chance to solve problems and present material together can also prove valuable.
From the commercial side of laboratory equipment and systems, support is an easy thing to deal with. If you have good products and support, people will buy them. If your support is lacking, they will go somewhere else, or you will have fostered the development of a third-party support business if your product is otherwise desirable.
From the system user's perspective, lab equipment support is a key concern. Users typically don't want to take on a support role in the lab, as that isn't their job. This brings us to an interesting consideration: product life cycles. You buy something, put it to use, and eventually it has to be upgraded (particularly if it involves software) or possibly replaced (as with software, equipment, instruments, etc.). Depending on how that item was integrated into the lab's processes, this can be a painful experience or an easy one. Product life cycles are covered in more detail later in this section, but for now know they are important because they apply, asynchronously, to every software system and device in your lab. Upgrade requirements may not be driven by a change in the functionality that is important to the lab, but rather by a change to an underlying component, e.g., the computer's operating system. The reason this is important in a discussion about support is this: when you evaluate a vendor's support capabilities, you need to cover this facet of the work. How well do they evaluate changes in the operating system (OS) in relation to the functionality of their product? Can they advise you about which upgrades are critical and which can be done at a more convenient time? If a change to the OS or a database product occurs, how quickly do they respond?
Now that we have an idea what support means for commercial products, let's consider what support means for the "products"—i.e., the procedures and methods—developed in your lab.
The end result of a typical laboratory-developed method is a product that incorporates a process (Figure 17). This idea is nothing new in the commercial space. Fluid Management Systems, Inc. offers complex sample preparation processing systems as products[13], as do instrument vendors that combine autosamplers, an instrument, and a data system (e.g., some of Agilent's PAL autosampler systems incorporate sample preparation processing as part of their design[14]).
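One way to make this "method as product" idea concrete is to treat the method definition itself as structured, versioned data rather than loose prose. The sketch below is illustrative only; the fields are assumptions about what a lab might track, not any standard schema.

```python
# Illustrative only: representing a lab method as a versioned
# "product" with explicit steps and attached support documentation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Step:
    description: str
    automation: str  # "manual", "semi-automated", or "automated"

@dataclass
class LabMethod:
    name: str
    version: str
    steps: List[Step] = field(default_factory=list)
    documents: List[str] = field(default_factory=list)  # user guide, IQ/OQ/PQ records, etc.
    validated: bool = False

method = LabMethod(
    name="Residual solvent assay",
    version="2.1",
    steps=[
        Step("Weigh sample and dissolve in diluent", "manual"),
        Step("Load vials onto autosampler", "semi-automated"),
        Step("Run GC sequence and collect data", "automated"),
    ],
    documents=["user_requirements.pdf", "functional_spec.pdf", "oq_report.pdf"],
    validated=True,
)
```

Captured this way, a change to any step forces a visible version bump and a review of the attached documentation, which is exactly the discipline a commercial product would receive.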
Lab methods and procedures like these can range from a few steps to an extensive process, and their implementations can include fully manual execution steps, semi-automated steps (e.g., manual plus instrumentation), and fully automated steps. In the first two cases, execution can occur with either printed or electronic documentation, or it can be managed by a LES. However, all of these implementations are subject to regulatory requirements (commercial products are subject to ISO 9000 requirements).

Figure 17. A laboratory process

The importance of documentation
Regulatory requirements and guidelines (e.g., from the FDA, EPA, ISO, etc.) have been with production and R&D for decades. However, some still occasionally question those regulations' and guidelines' application to research work. Rather than viewing them as hurdles a lab must cross to be deemed qualified, they should be viewed as hallmarks of a well-run lab. With that perspective, they remain applicable for any laboratory.
For purposes of this guide, there is one aspect of regulatory requirements that will be emphasized here: process validation, or more specifically the end result, which is a validated process. Laboratory processes, all of which have to be validated, are essentially products for a limited set of customers; in many cases it's one customer, while in others the same process may be replicated in other labs as-is. The more complex the implementation, and the longer the process is expected to be in use, the more important it is to incorporate some of the tools from commercial developers into lab development (Table 1). However, regardless of development path, complete documentation is of the utmost concern.

Table 1. Process and product development documentation

Laboratory-developed products:
• user requirements document
• functional specification
• system design and testing
• implementation
• installation, operational, and performance qualification (IQ, OQ, PQ) evaluation and acceptance

Commercially developed products:
• product requirements document
• engineering or functional specification
• product testing protocol
• product readiness for market acceptance criteria
PLUS product support elements like:
• user guides
• training materials
• maintenance and troubleshooting guides
• support mechanisms
• user groups

Documentation is valuable because:

It's educational: Quality documentation ensures those carrying out the process or maintaining it are thoroughly educated. Written documentation (with edits and audit trails, as appropriate) acts as a stable reference point for how things should be done. The "follow-me-and-I'll-show-you" approach is flawed. That method depends on someone accurately remembering and explaining the details while having the time to actually do it, all while hoping bad habits don't creep in and become part of "how it's done."
It informs: Quality documentation that is accessible provides a reference for questions and problems as they occur. The depth of that documentation, however, should be based on the nature of the process. Even manual methods that are relatively simple need some basic elements. To be informative, documentation should address numerous questions. Has the instrument calibration been accurately verified?
How do you tell, and how do you correct the problem if the instrument is out of calibration? What information is provided about reagents, including their age, composition, strength, and purity? When is a technician qualified to use a reagent? How are reference materials incorporated as part of the process to ensure that it is being executed properly and consistently?
Note that the support documents listed in Table 1 are not usually part of process validation. The intent of process validation is to show that something works as expected once it is installed.
One aspect that hasn't been mentioned so far is how to address necessary change within processes. Any lab process is going to change over time. There may be a need for increased throughput, lower operating costs, less manual work, the ability to run over multiple shifts, etc. There may also be new technologies that improve lab operations and eventually need to be incorporated into the process. As such, planning and process documentation should describe how processes are reviewed and modified, along with any associated documentation and training. This requires the original project development to be thoroughly documented, from functionality scoping to design and implementation. Including process review and modification as part of a process allows that process to be upgraded without having to rebuild everything from scratch. This level of documentation is rarely included due to the initial cost and impact on the schedule. It will affect both, but it will also show its value once changes have to be made. In the end, by adding a process review and modification mechanism, you ensure a process is supportable.
To be clear, the initial design and planning of processes and methods has to be done well for a supportable product. This means keeping future process review and modification in mind even as the initial process or method is being developed. It's the difference between engineering a functional and supportable system and "just making something that works." Here are three examples:

One instrument system vendor, in a discussion between sessions of a meeting, described how several of his customers successfully connected a chromatography data system (CDS) to a LIMS. It was a successful endeavor until one of the systems had to be upgraded; then everything broke. The programmer had made changes to areas of the packages that they shouldn't have. When the upgrade occurred, those changes were overwritten. The project had to be scrapped and re-developed.
A sample preparation robotics system was similarly implemented by creating communications between devices in ways that were less than ideal. When it came time for an upgrade to one device, the whole system failed.
A consultant was called in to evaluate a project to interface a tensile tester to a computer, as the original developer had left the company. The consultant recommended the project be scrapped and begun anew. The original developer had not left any design documentation, the code wasn't documented, and no one knew if any of it worked or how it was supposed to work. Trying to understand someone else's programming without documentation assistance is really a matter of trying to figure out their thinking process, and that can be very difficult.
There are a number of reasons why problems like this exist.
Examples include a lack of understanding of manual and automated systems design and engineering methodologies, as well as pressure from management (e.g., "how fast can you get it done," "keep costs down," and "we'll fix it in the next version"). Succumbing to these short-term views will inevitably come back to haunt you in the long term. Upgrades, things you didn't think of when the original project was planned, and support problems all tend to highlight work that could have been done better. Another saying that frequently comes up is "there is never time to do it right, but there is always time to do it over," usually at a considerably higher cost.

Additional considerations in creating supportable systems and processes
Single-vendor or multi-vendor?
Let's assume we are starting with a manual method that works and has been fully validated with all appropriate documentation. Then someone wants to change that method in order to meet a company goal such as increasing productivity or lowering operational costs. Achieving goals like those usually means introducing some sort of automation: anything from automated pipettes to instrumentation, depending on the nature of the work. Even more important, if a change in the fundamental science underlying the methodology is proposed, that would also require re-validation of the process.
Just to keep things simple, let's say the manual process has an instrument in it, and you want to add an autosampler to keep the instrument fed with samples to process. This means you also need something to capture and process the output, or any productivity gains on the input side may be lost in handling the output. We'll avoid that discussion because our concern here is supportability. There are a couple of directions you can go in choosing an add-on sampler: buy one from the same vendor as the instrument, or buy it from another vendor because it is less expensive or has some interesting features (though unless those features are critical to improving the method, they should be considered "nice but not necessary").
How difficult is it going to be to make any physical and electronic (control) connections to the autosampler? Granted, particularly for devices like autosamplers, vendors strive for compatibility, but there may be special features that need attention. You need to consider not just the immediate situation but also how things might develop in the future. If you purchase the autosampler from the same vendor as the instrument and control system, they are going to ensure that things continue to work properly if upgrades occur or new generations of equipment are produced (see the product life cycle discussion in the next section). If the two devices are from different vendors, compatibility across upgrades is your issue to resolve. Both vendors will do what they can to make sure their products are operating properly and answer questions about how they function, but making them work together is still your responsibility.
From the standpoint of supportability, the simpler approach is the easiest to support. Single-vendor solutions put the support burden on the vendor. If you use multi-vendor implementations, then all the steps in making the project work have to be thoroughly documented, from the statement of need through to the functional specifications and the finished product.
The documentation may not be very long, but any assistance you can give someone who has to work with the system in the future—including yourself (i.e., "what was I thinking when I did this?")—will be greatly appreciated.
On-device programming or supervisory control?
Another consideration is for semi- or fully automated systems where a component is being added or substituted. When we are looking at programmable devices, one approach is to make connections between devices via on-device programming. For example, say Device A needs to work with Device B, so programming changes are made to both to accomplish the task. While this can be made to work and be fully documented, it isn't a good choice, since changing one of them (via upgrade or swap) will likely mean the programming has to be re-implemented. A better approach is to use a supervisory control system to control them both, as well as any other devices that may be part of the same process (a brief illustrative sketch appears below). It allows for a more robust design, easier adaptations, and smoother implementation. It should also be easier to support, since programming changes will be limited to communications code.
Third-party developers and contractors?
Frequently, third parties are brought in to provide services that aren't available through the lab or onsite IT staff. In those engagements, a functional specification usually describes what you want as the end result and what their product is supposed to do, not how it is supposed to do it. This is left to the developer to figure out. You need to add supportability as a requirement: the end result must not only meet regulatory requirements, but also be designed and documented with sufficient information for someone unfamiliar with the project to understand what would have to be done if a change were made. That, in turn, requires you to think about where changes might be made in the future. This includes considering what components might be swapped for newer technologies, handling software upgrades (and what might break as a result of them), and knowing what to do if components reach their supported end-of-life and have to be replaced.
Consulting firms may respond with "if something needs to be changed or fixed, just call us, we built it," which sounds reasonable. However, suppose the people who built it aren't there anymore, or aren't available because they're working on other projects. The reality is the "company" didn't build the product; people working for them did.

Product life cycles
When discussing product life cycles, whether for digital products or hardware systems, the bottom-line problem is this: what do you do when a critical product needs to be updated or replaced? This can be an easy issue or a very painful one, depending on how much thought went into the design of the original procedure using that device. It's generally easy if you had the forethought of noting "someday this is going to be replaced, so how do we simplify that?" It's more difficult if you, through wiring or software, linked devices and systems together and then can't easily separate them, particularly if no one documented how that might be accomplished. It's all a matter of systems engineering done well.
Note: This material was originally published in Computerized Systems in the Modern Laboratory: A Practical Guide.
An analog pH meter—as long as it has been maintained—will still work today.
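Returning briefly to the supervisory-control pattern promised above: the sketch below shows its shape in Python. The device classes are hypothetical placeholders; the point is that devices never reference each other, so swapping one means rewriting its small driver rather than reprogramming its neighbor.

```python
# Sketch of the supervisory-control pattern: the supervisor owns the
# process logic and talks to each device through a small driver class.
# Devices never communicate directly, so replacing one means writing
# a new driver, not reprogramming the other device. All classes here
# are hypothetical placeholders.

class AutosamplerDriver:
    def load_next_sample(self) -> str:
        # ...vendor-specific communication (serial, USB, Ethernet)...
        return "sample-001"

class InstrumentDriver:
    def run_analysis(self, sample_id: str) -> dict:
        # ...vendor-specific start/stop and data retrieval...
        return {"sample": sample_id, "result": 42.0}

class Supervisor:
    def __init__(self, sampler: AutosamplerDriver, instrument: InstrumentDriver):
        self.sampler = sampler
        self.instrument = instrument

    def process_batch(self, count: int) -> list:
        results = []
        for _ in range(count):
            sample_id = self.sampler.load_next_sample()
            results.append(self.instrument.run_analysis(sample_id))
        return results

batch = Supervisor(AutosamplerDriver(), InstrumentDriver()).process_batch(3)
```

If the autosampler is upgraded, only AutosamplerDriver changes; the process logic in Supervisor, and the validation that covers it, remain largely untouched.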
Like the pH meter, double-pan balances will also still work today, again as long as they have been maintained and people are proficient in their use. Old lab equipment that still functions properly has been replaced with more modern equipment due to better accuracy, ease of use, and other factors. Analog instruments can still be found operating decades after their design end-of-life. It is in the digital realm that we find equipment that should work (and probably does) but can't be used after a few years of service. The rate of technology change is such that tools become obsolete on the order of a half-decade. For example, rotating disks, an evolving computer staple that replaced magnetic tape drives, are now being replaced with solid-state storage.
Digital systems require two components to work: hardware (e.g., the computer, plus interfaces for the hard disk and ports for cable connections) and software (e.g., operating systems plus software drivers to access the hardware). Both hardware packaging and operating systems are changing at an increasing rate. Hardware systems are faster, with more storage, and operating systems are becoming more complex to meet consumer demands, with a trend toward more emphasis on mobile or social computing. Those changes mean that the device interfaces we rely on may not be there in the next computer you have to purchase. The RS-232 serial port, a standard for instrument connections, is being replaced with USB, FireWire, and Thunderbolt connections that support a much wider range of devices and simplify computer design, yielding more usable and less costly systems. It also means that the instrument with the RS-232 interface may not work with a new computer because there are no RS-232 ports, and the operating system may no longer be compatible with the instrumentation.
One aspect of technology planning in laboratory work is change management, specifically the impact of technology changes and product life cycles on your ability to work. The importance of planning around product life cycles has taken on an added dimension in the digital laboratory. Prior to the use of computers, instruments were the tools used to obtain data, which was recorded in paper laboratory notebooks. End result: getting the data, and recording and managing it, were separate steps in lab work. If the tools were updated or replaced, the data recorded wasn't affected. In the digital realm, changes in tools can affect your ability to work with new and old data and information. The digitally enabled laboratory requires planning with a time horizon of decades to meet legal and regulatory requirements. The systems and other tools you use may not last for decades; in fact, they will probably change several times. However, you will have to plan for the transfer of the data and information they contain and address the issue of database access and file formats. The primary situation to avoid is having data in files that you can't read.
While we are going to begin looking at planning strategies for isolated products as a starting point, please keep in mind that in reality products do not exist in isolation. The laboratory's K/I/D is increasingly interconnected, and changes in one part of your overall technology plan can have implications across your lab's working technology landscape. The drive toward integration and paperless laboratories has consequences that we are not fully prepared to address.
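On the file-format point above: one inexpensive hedge against unreadable files is to export important results to an open, self-describing format alongside the vendor's native files. A minimal sketch follows; the field names are illustrative, not a standard.

```python
# Export a result set to plain JSON with explicit schema-version and
# units metadata, as a long-term-readable companion to vendor files.
# Field names are illustrative placeholders, not a standard.
import json
from datetime import date

record = {
    "schema_version": "1.0",
    "exported": date.today().isoformat(),
    "instrument": "GC-2010",
    "units": {"concentration": "mg/L"},
    "results": [
        {"sample": "S-0451", "analyte": "benzene", "concentration": 0.12},
        {"sample": "S-0452", "analyte": "benzene", "concentration": 0.09},
    ],
}

with open("results_export.json", "w") as f:
    json.dump(record, f, indent=2)
```

Plain-text formats like this trade compactness for the near-certainty that something will still be able to read them decades from now.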
We'll start with the simpler cases and build upon that foundation.
Digital products change for a number of reasons:

The underlying software that supports informatics applications could change (e.g., operating systems, database systems), and the layers of software that build on that base have to be updated to work properly.
Products could see improvements due to market research, customer comments, and competitive pressures.
Vendors could get acquired, merge with another company, or split up, resulting in products merging or one product being discarded in favor of another.
Your company could be acquired, merge with another company, or split into two or more organizations.
Products could simply fail.
Your lab could require a change, replacing older technologies with systems that provide more capability.
In each of these cases, you have a decision to make about how K/I/D is going to be managed and integrated with the new system(s).
But how often do digital products change? Unfortunately, there isn't much detailed information published about changes in vendor products. Historically, operating systems were updated with new versions on an annual basis, with updates (e.g., bug fixes, minor changes) occurring more frequently. With a shift toward subscription services, version changes can occur more frequently. The impact of an OS version change will vary depending on the OS. Some vendors take responsibility for and control of both the hardware and software, and as a result, upgrades support both the hardware and OS until the vendor no longer supports new OS versions on older systems. Other computer systems, where the hardware and software components come from different vendors, can lose access to hardware components after an upgrade; the OS upgrade may only support certain hardware features. Support for specific add-on equipment (including components provided by the computer vendor) may require finding and reinstalling drivers from the original component vendor. As for the applications that run on operating systems, they will need to be tested with each OS version change.
Applications tend to be updated on an irregular basis, both for direct installs and for cloud-hosted solutions. Microsoft Office and Adobe's Creative Cloud products may be updated as their vendors see a need. Since both product suites are now accessed via the internet on a subscription basis (software as a service, or SaaS), user action isn't required. Lab-specific applications may be upgraded or updated as the vendor sees a need; SaaS implementations are managed by the vendor according to the vendor's internal planning. Larger, stable vendors may provide upgrades on a regular, annual basis for on-site installations. Small vendors may only update when a significant change is made, which might include new features, or when forced to because of OS changes. If those OS-compatibility changes aren't made, you will find yourself running software that is increasingly out of date. That doesn't necessarily mean it will stop working (for example, Microsoft dropped support for Windows XP in the spring of 2014, and computers running it didn't suddenly stop). What it does mean is that if your computer hardware has to be replaced, you may not be able to re-install a working copy of the software. The working lifetime of an application, particularly a large one, can be on the order of a decade or more. Small applications depend upon market acceptance and the vendor's ability to stay in business.
Your need for access to data may exceed the product's life.
The perception of the typical product life cycle runs like this: a need is perceived; product requirements are drafted; the product is developed, tested, and sold; new product requirements are determined based on market response; and the cycle continues. The reality is a bit more complicated. Figure 18 shows a more realistic view of a product's life cycle. The letters in the circles refer to key points where decisions can have an impact on your lab ("H" = high, "M" = medium, "L" = low).

Figure 18. The product life cycle

The process begins with an initial product concept, followed by the product's development, introduction, and marketing programs, and finally its release to customers. If the product is successful, the vendor gathers customer comments, analyzes competitive technologies and any new technologies that might be relevant, and determines a need for an upgrade.
This brings us to the first decision point: is an upgrade possible with the existing product? If it is, the upgrade requirements are researched and documented, and the process moves to development, generally with a low impact on users. "Generally" because it depends on the nature of the product and what modifications, changes, and customizations have been made by the user. If it is an application that brings a data file in, processes it, and then saves the result in an easily accessible file format, allowing no user modifications to the application itself, "low impact" is a fair assessment. Statistical analysis packages, image processing, and other such applications fall into this set. Problems can arise when user modifications are overwritten by the upgrade and have to be reinstalled (only a minor issue if it is a plug-in with no programming changes) or re-implemented by making programming changes (a major problem, since it requires re-validation). Normally, any customization (e.g., naming database elements) and data held within an application's database should be transferred without any problems, though you do need to make some checks and evaluations to ensure that this is the case. This is the inner loop of Figure 18.
Significant problems can begin if the vendor determines that the current product generation needs to be replaced to meet market demands. If this is a hardware product (e.g., pH meter, balance, instrument, computer, etc.), there shouldn't be any immediate impact (the hardware will continue to work). However, once there is a need for equipment replacement, it becomes a different matter; we'll pick this thread up later when we discuss product retirement.
Software is a different situation. The new generation of software may not be compatible with the hardware you currently have. What you have will still work, but if there are features in the new generation that you'd like to have, you may find yourself having to re-implement any software changes that you've made to the existing system. It will be like starting over again, unless the vendor takes pains to ensure that the upgrade installation is compatible with the existing software system. This includes assurances that all the data, user programming, settings, and all the details you've implemented to make a general product work in your environment can successfully migrate.
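How do you "make some checks and evaluations" after a migration? One minimal spot-check, assuming both the old and new systems can export the same records (the CSV layout and column names below are hypothetical), is to compare record counts and a checksum over key fields:

```python
# Post-migration spot check (sketch): compare record counts and a
# hash of key fields between exports from the old and new systems.
# CSV layout and column names are hypothetical placeholders.
# Note: the checksum comparison assumes both exports are sorted
# identically.
import csv
import hashlib

def fingerprint(csv_path: str, key_columns: list) -> tuple:
    rows = 0
    digest = hashlib.sha256()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            rows += 1
            digest.update("|".join(row[c] for c in key_columns).encode())
    return rows, digest.hexdigest()

keys = ["sample_id", "test", "result"]
old = fingerprint("old_system_export.csv", keys)
new = fingerprint("new_system_export.csv", keys)
print("record counts match:", old[0] == new[0])
print("key-field checksums match:", old[1] == new[1])
```

A matching count with a mismatched checksum is the interesting case: the records came across, but something in their content changed along the way.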
You will also have to address user education and plan for a transition from the old system to the new one.
One problem that often occurs with new generations of software is a change in the underlying data file structures. The vendor may have determined that in order to make the next generation work, and to be able to offer the features they want included, the file structure and storage formats have to change. This will require you to re-map the existing file structure into the new one. You may also find that some features do not work the same as they did before, and your processes have to be modified. In the past, even new versions of Microsoft Office products have had compatibility issues with older versions. In large applications such as informatics or instrument data systems (e.g., a multi-user chromatography data system), changes in formats can be significant, and they can affect the import of instrument data into informatics products. For example, some vendors and users use a formatted PDF file as a means of exchanging instrument data with a LES, SDMS, or ELN. If the new version of an instrument data system changes its report formatting, the PDF parsing routine will have to be updated.
At this point, it's important to note that just because a vendor comes out with a new software or hardware package doesn't mean that you have to upgrade. If what you have is working and the new version or generation doesn't offer anything of significant value (particularly when the cost of upgrading and the impact on lab operations are factored in), then bypass the upgrade. Among the factors that can tug you into an upgrade is the potential loss of support for the products you are concerned with.
What we've been discussing in the last few paragraphs covers the outer loop to the right of Figure 18. The next point we need to note in that figure is the "No" branch from "New Product Generation Justified?" and "Product Fails," both of which lead to product retirement. For both hardware and software, you face the loss of customer support and the eventual need for product replacement. In both cases, there are steps that you can take to manage the potential risks.
To begin with, unless the vendor is going out of business, they are going to want to maintain a good relationship with you. You are a current and potential future customer, and they'd like to avoid bad press and problems with customer relationships. Explain how the product retirement is going to affect you and get them to work with you on managing the issue; you aren't the only one affected by this (see the commentary on user groups later). If you are successful, they will see a potential liability turn into a potential asset: you can now be a referral for the quality of their customer service and support. Realistically, however, your management of product retirement or major product changes has to occur much earlier in the process.
Your involvement begins at the time of purchase. At that point you should be asking what the vendor's update and upgrade policies are, how frequently they occur, what the associated costs are, how much advance notice they give for planning, and what level of support is provided. In addition, determine where the product you are considering lies in its life cycle. Ask questions such as:

Is it new and potentially at-risk for retirement due to a lack of market acceptance?
If it is, and the vendor is looking for reference sites, use that to drive a better purchase agreement. Make sure that the product is worth the risk, and be prepared in case the worst-case scenario occurs.
Is it near the end of its life, with the potential for retirement? Look at the frequency of updates and upgrades. Are they tailing off, or is the product undergoing active development?
What is the firm's financial position? Is it running into declining sales, or are customers actively seeking it out? Is there talk of acquisitions or mergers, either of which can put the product's future into question?
You should also ask for detailed technical documents that describe where programming modifications are permitted and preserved against vendor changes, and how data will be protected, along with any tools for data migration. Once you know what the limitations are for coding changes, device additions, and so on, the consequences of deviating from them are your responsibility; whatever you do should be done deliberately and with full awareness of its impact in the future.
One point that should be clarified during the purchase process is whether you are purchasing a product or a product license. If you are purchasing a product, you own it and can do what you like with it, at least for hardware products. Products that are combinations of hardware and software may be handled differently, since the hardware won't function without the software. Licenses are "rights to use" with benefits and restrictions. Those should be clearly understood, as well as what you can expect in terms of support, upgrades, the ability to transfer products, how products can be used, etc. If there are any questions, the time to get them answered is before you sign purchase agreements. You have the best leverage for gaining information and getting reasonable concessions that are important to you while the vendor is trying to sell you something. If you license a product, the intellectual property within the product belongs to the vendor while you own your K/I/D; if you decide to stop using a product, you should have the ability to extract your K/I/D in a usable form.
Another point: if the product is ever retired, what considerations are provided to you? For a large product, the vendor may not be willing to offer documented copies of the code so that you can provide self-support, but a small company trying to break into the market might. It doesn't hurt to ask, and get any responses in writing; don't trust someone's verbal comments, as they may not be there when upgrades or product retirement occur. Additionally, it's always beneficial to conduct negotiations on purchases and licenses in cooperation with your company's IT and legal groups. IT can advise on industry practices, and the legal department's support will be needed for any agreements.
Another direction you should take is participating in user groups. Most major vendors and products have user groups that may exist as virtual organizations on LinkedIn, Yahoo, or other forums. Additionally, they often have user group meetings at major conferences. Company-sponsored group meetings provide a means for learning about product directions, raising issues, discussing problems, etc. Normally these meetings are divided into private (registered users only) and public sessions, the former being the most interesting since they provide a means of unrestricted comment.
If a new version or upgrade is being considered, it will be announced and discussed at group meetings. These meetings also provide a mechanism for making needs known and, if a product is being retired, lobbying for support. The membership contact list will provide a resource for exchanging support dialogue, particularly if the vendor is reluctant to address points that are important to you.
If a group doesn't exist, start a virtual conference and see where it goes. If participation is active, let the vendor know about it; they may take an interest and participate, or make it a corporate function. It is in a company's best interest to work with its customers rather than antagonize them. Your company's support may be needed for involvement in, or starting, user groups because of the potential for liability, intellectual property protection, or other issues. Activities performed in these types of groups can be wide-ranging, from providing support (e.g., trading advice, code, tutorials, etc.) and sharing information (e.g., where to get parts, information for out-of-warranty products) to identifying and critiquing repair options and meeting informally at conferences.
The key issue is to preserve your ability to carry out your work with as little disruption as possible. That means you have to protect your access to the K/I/D you've collected, along with the ability to work with it. In this regard, software systems have one possible advantage: virtualization.
Virtualization: An alternative to traditional computing models
There are situations in laboratory computing that are similar to the old joke "your teeth are in great shape but the gums have to go." The equivalent situation is running a software package and finding out that the computer hardware is failing and the software isn't compatible with new equipment. That can happen if the new computer uses a different processor than the one you are working with. An answer to the problem is a technology called virtualization. In the context of the joke, it lets you move your teeth to a new set of gums; or to put it another way, it allows you to run older software packages on new hardware and avoid losing access to older data (with some limitations).
Briefly put, virtualization allows you to run software (including the operating system) designed for one computer on an entirely different system. An example: the Windows XP operating system and applications running on a Macintosh computer under Mac OS X via VMware's Fusion product. In addition to rescuing old software, virtualization can:

reduce computing costs by consolidating multiple software packages on servers;
reduce software support issues by preventing operating system upgrades from conflicting with lab software;
provide design options for multiple labs using informatics products without incurring hardware costs and giving up lab space to on-site computers; and
reduce interference between software packages running on the same computer.
Regarding that last benefit, it's worth noting that with virtualization, adding software packages means each gets its own "computer" without additional hardware costs. Product warranties may state that the software warranty is limited to instances where the software is installed on a "clean machine" (just the current operating system and that software package, nothing else). Most people put more than one application on a computer, technically voiding the warranty.
Virtualized containers let you go back to that clean machine concept without buying extra hardware.
In order to understand virtualization, we have to discuss computing, but just the basics. Figure 19 shows an arrangement of the elements. When the computer is first turned on, there are three key items engaged: the central processing unit (CPU), memory, and mass storage. The first thing that happens (after the hardware startup sequence) is that portions of the operating system are placed in memory, where the CPU can read instructions and begin working. The key point is that the operating system, applications, and files are a collection of binary data elements (words) that are passed on to the CPU.

Figure 19. Key elements in a computer system

The behavior of the CPU can be emulated by a software program. We can have a program that acts like an Intel processor, for example, or a processor from another vendor. If we feed that program the instructions from an application, it will execute that application. There are emulators, for example, that will allow your computer to emulate an Atari 2600 game console and run Asteroids. There are also emulators for other game consoles, so your computer can behave like any game console you like, as long as you have an emulator for it. Each emulator has all the programming needed to execute copies of the original game programming. They don't wear out or break. This configuration is shown in Figure 20.

Figure 20. Emulation on a computer

You have a series of game emulators, each with its collection of games. Any game emulator can be loaded into memory and execute a game from its collection; games for other emulators won't work. Each game emulator and game collection is called a container. When you want to play, you access the appropriate container and go. If the mass storage is on a shared server, other people can access the same containers and run the games on their computers without interfering with each other.
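To make the emulation idea concrete, here is a minimal sketch in Python. The three-instruction machine below is entirely hypothetical; a real emulator (a game-console emulator, or the processor emulation inside a virtualization product) implements the complete instruction set of the original hardware, but the principle is the same: a software loop fetches and executes instructions exactly as the original CPU would have.

    # A toy "CPU" implemented in software. The instruction set
    # (LOAD/ADD/PRINT/HALT) is hypothetical, for illustration only.
    def run(program):
        acc = 0      # accumulator register
        pc = 0       # program counter
        while True:
            op, arg = program[pc]    # fetch the next instruction
            if op == "LOAD":         # decode and execute it
                acc = arg
            elif op == "ADD":
                acc += arg
            elif op == "PRINT":
                print(acc)
            elif op == "HALT":
                break
            pc += 1                  # advance to the next instruction

    # The "application" is just data; it runs unchanged on any machine
    # that has this emulator. That is the heart of virtualization.
    run([("LOAD", 2), ("ADD", 3), ("PRINT", None), ("HALT", None)])

An old application's executable is, like the list above, just data; as long as an emulator exists for the CPU it was written for, the original hardware no longer matters.
How does this apply to a computer with failing hardware that is running a data analysis program? Virtualized systems allow you to make a copy of the mass storage on the computer, create a container holding the CPU emulator, OS, applications, and data files, and place it on a server for later access. The hardware no longer matters because it is being replaced with the CPU emulator. Your program's container can be copied, stored, backed up, etc. It will never wear out or grow old. When you want to run it, access the container and run it. A server can support many containers.
There are some restrictions, however. First, most computers that are purchased from stores or online come pre-loaded with an operating system. Those operating systems are OEM (original equipment manufacturer) copies of the OS whose license cost is buried in the purchase price. They can't be copied or transferred legally, and some virtual servers will recognize OEM copies and not transfer them. As a result, in order to make a virtualized container, you need a fully licensed copy of the OS. Your IT group may have corporate licenses for widely used operating systems, so that may not pose a problem. Second, recognize that some applications will require a separate license for use on a virtualized system. As frequently noted, planning ahead is key: explore this option as part of the purchase agreement; you may get a better deal.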
Third, it's important to note that virtualized systems cannot support real-time applications such as direct analog, clock-driven, time-critical data acquisition from an instrument. The virtualized software shares resources with other containers in a time-sharing mode, and as a result the close timing coordination needed for data acquisition will not work. Fortunately, direct data acquisition (as contrasted with computer-to-instrument communications via RS-232, USB, Ethernet, etc.) is occurring less often in favor of buffered data communications with dedicated data acquisition controllers, so this is becoming less of a problem. If you need direct computer-controlled data acquisition and experiment control, this isn't the technology for you. Finally, containerized software running on virtualized systems cannot access hardware that wasn't part of the original configuration. If the computer you are using has a piece of hardware that you'd like to use but that wasn't on the original virtualized computer, the container won't be able to use it, since it doesn't have the software drivers to access it.
If the applications software permits it, applications can have shared access to common database software. A virtualized LIMS may be a good way to implement the application, since it doesn't require hardware in the lab and uses servers that are probably under IT control; as a result, the systems are backed up regularly. The major hang-up in these installations is instrument connections. IT groups tend to get very conservative about that subject. Middleware could help isolate actual instrument connections from the network and could potentially resolve the situation. The issue is part technical, part political. However, virtualized LIMS containers still prove beneficial for educational purposes. A student can work with the contents of a container, experiment as needed, and, when done, dismiss the container without saving it; the results of the experiments are gone, mistakes and all.
There are different types of virtualization. One has containers sharing a common emulator and operating system. As a result, you update or upgrade the emulator software and/or operating system once, and the change is made across all containers. That can cause problems for some applications; however, they can be moved to a second type of virtualization in which each container has its own copy of the operating system and can be excluded from updates.
If you find this technology appealing, check with your vendor to see if the products of interest will function in a virtualized environment (not all will). Ask questions carefully, perhaps asking whether their software will run under VMware's products, Microsoft's Desktop Virtualization products, or even Microsoft's Hyper-V server. Some vendors don't understand the difference between virtualization and client-server computing. Get any responses in writing.
Retirement of hardware
Replacing retired hardware can be a challenge. If it is a stand-alone, isolated product (not connected to anything else), the problem can be resolved by determining the specifications for a replacement, conducting due diligence, etc. It is when data systems, storage, and connections to computers enter the picture that life gets interesting.
For example, replacing an instrument sans data system, such as a chromatograph or spectrometer with analog and digital I/O (sense switches, not data) connections to a computer, is essentially just a hardware replacement.
Hardware interfaced to a computer has issues because of software controls and data exchanges. What appears to be the simplest and most common situation is with serial communications (RS-232, RS-xxx). Complications include:

Wiring: Serial communications products do not always obey conventions for wiring, so wiring changes have to be considered and tested.
Control functions and data exchange: Interacting with serial devices via a computer requires both control functions and data exchange. There are no standards for these, so a new purchase will likely require software changes. That may be avoided if the replacement device (e.g., a balance) is from the same vendor as the one you currently have, and is part of a family of products. The vendor may preserve the older command set and add new commands to access new features. If that is the case, you have a plug-compatible replacement that still needs to be tested and qualified for use.
Interfaces: Moving from an RS-232 or similar RS- device to another interface such as USB will require a new interface (although USB ports are on almost every computer) and software changes.
If you are using a USB device, the wiring problems go away, but the command structure and data transfer issues remain. Potential software problems are best addressed when the software is first planned and designed; good design means planning ahead for change. The primary issue is control of the external device, though data formats may also change. Those points can be addressed by device-independent programming. That means placing all device-dependent commands in one place (a subroutine) and formatting data into a device-independent format. Doing this makes changes, testing, and other aspects easier.
Let's take a single-pan balance that has two functions: tare and get_weight. Each of those has a different command sequence of characters that are sent to the balance, and each returns either a completion code or a numerical value that may be encoded in ASCII, BCD, or binary, depending on the vendor's choice. If the commands to work with the balance are scattered throughout a program, you have a lot of changes to find, make, test, and certify as working. Device-independent programming puts them in two areas: one for the tare command and one for the get_weight command, which returns a floating-point value (e.g., 1.67).
If you have to replace the device with a new one, the command codes are changed in two places, and the returned numeric code is reformatted into a standard floating-point value in one place. The rest of the program works with the value without any concern for its source. That allows for a lot of flexibility in choosing balances in the lab, as different units can be used for different applications with minor software adjustments.
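As a minimal sketch of that device-independent approach in Python: the command strings and response format below are hypothetical placeholders rather than any real vendor's protocol, and the port object is assumed to behave like a pyserial connection. The point is that every device-specific detail lives in one driver class, while the rest of the program only ever sees tare() and a floating-point weight.

    # All vendor-specific command codes and response parsing are
    # confined to this one class; swap the class to change vendors.
    class BalanceDriver:
        TARE_CMD = b"T\r\n"     # hypothetical tare command
        WEIGH_CMD = b"W\r\n"    # hypothetical weigh command

        def __init__(self, port):
            self.port = port    # e.g., a pyserial Serial object

        def tare(self):
            self.port.write(self.TARE_CMD)
            self.port.readline()             # consume the completion code

        def get_weight(self):
            self.port.write(self.WEIGH_CMD)
            raw = self.port.readline()       # e.g., b"+001.6700 g"
            return float(raw.decode().split()[0])   # normalize to a float

    # The rest of the program is written against tare()/get_weight() only:
    def weigh_sample(balance):
        balance.tare()
        return balance.get_weight()          # always a float, e.g., 1.67

Replacing the balance then means editing the two command constants and the response parsing in one class; weigh_sample() and everything built on it remain untouched.

Summary
As noted when we first started talking about this goal, the ability to support a product has to be designed and built in, not added on. The issues can be difficult enough when you are working with one vendor. When a second or third vendor is added to the mix, you have an entirely new level of issues to deal with. This is a matter of engineering, not just science. Supportable systems and methods have to be designed, documented, engineered, and validated to be supportable.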
A system or method isn't supportable simply because its individual components or steps are.

Seventh goal: Addressing systems integration and harmonization
In the 1960s, audio stereo systems came in three forms: a packaged, integrated product that combined an AM/FM radio tuner, turntable, and speakers; all those components, but purchasable individually; and a do-it-yourself, get-out-the-soldering-iron format. The integrated products were attractive because you could just plug them into a power source and they worked. The packaging was attractive, and you didn't have to know much beyond choosing the functions you wanted to use. Component quality was up to the manufacturer; it was rarely top-quality, but the components met the basic needs of the application at a particular price point.
Component systems appealed to a different type of customer. They wanted to pick the best components that met their budgets, trading off the characteristics of one item against another, always with the idea of upgrading elements as needed. Each component manufacturer would guarantee that their product worked, but making the entire system work was your issue. In the 1960s, everything was analog, so it was a matter of connecting wires. Some went so far as to build the components from kits (lower cost), or design something and get it to work just for the fun of it. HDTVs have some of the same characteristics as component systems: they can work out of the box, but if you want better sound or want to add a DVD or streaming box, you have to make sure that the right set of connectors exists on the product and that there are enough of them.
In the product cases noted above, there is a limited set of choices for mixing components, so user customization isn't that much of a problem. Most things work together from a hardware standpoint, but software apps are another matter. The laboratory world isn't quite so neat and tidy.
From a lab standpoint, integrated systems are attractive for a number of reasons:

It suggests that someone has actually thought about how the system should function and what components need to be present, and they put a working package together (which may be more than one component).
It's installed as a package: when the installation is done, all of it works, both hardware and software.
It's been tested as a system, and all the components work together.
You have a single point of contact for support.
If an upgrade occurs, someone (hopefully) has made sure that upgrading some portions of the system doesn't break others; an upgrade embraces the functionality of the entire system.
The documentation addresses the entire system, from training and support to maintenance, etc.
It should be easier to work with because the system's functionality and organization have been thought through.
It looks nice. Someone designed a packaged system that doesn't have a number of separate boxes with wires exposed. In the lab, that may be pushing it.
Achieving that on a component-by-component basis may be a bit of a challenge. Obtaining an integrated system comes down to a few considerations, not the least of which is what you define as a "system." A system is a collection of elements that are used to accomplish a task. Figure 21 shows a view of overall laboratory operations.

Figure 21. Laboratory operations and connections to other groups
Laboratory operations have three levels of systems: the corporate level, the lab administrative level (darker grey, including office work), and the lab bench level. Our concern is going to be primarily with the latter two; however, we have to be aware that the results of lab work (both research and testing) will find their way into the corporate sphere. If the grey box is our system, where does integration fit in, and what strategies are available for meeting that goal? Integration is about connecting devices and systems so that there is a smooth flow of command/control messages, data, and information that does not depend on human intervention. We are not, for example, taking a printout from one instrument system and then manually entering it into another; that transfer should be electronic, and bi-directional where appropriate. Why is this important? Improving the efficiency of lab operations, as well as ROI, has long been on the list of desired outcomes from the use of lab automation and computing. Developing integrated systems with carefully designed mechanisms for the flow, storage, and management of K/I/D is central to achieving those goals.
There are different strategies for building integrated systems like these. One is to create the all-encompassing computer system that does and controls everything. Think HAL in the movie 2001, or popular conceptions of an advanced AI. However, aside from the pitfalls in popular sci-fi, that isn't an advisable strategy. First, it will most likely never be finished. Trying to come up with a set of functional specifications would take years, if they were ever completed. People would be constantly adding features, some conflicting, and that alone (called scope creep) would doom the process, as it has in similar situations. Even if somehow the project were completed, it would reflect the thinking of those involved at the start. In the time that the project was underway, the needs of the lab would change, and the system would be out of date as soon as it was turned on. If you were to develop an adaptable system, you'd still be dealing with scope creep. Other problems would crop up, too. If any component needed maintenance, the entire system could be brought to a halt and nothing would get done. Additionally, staff turnover would be a constant source of delays as new people were brought on board and trained, and, as this system would be unique, you couldn't find people with prior experience. Finally, the budget would be hard to deal with, from the initial estimate to the likely overruns.
Another approach is to redefine the overall system as a cooperative set of smaller systems, each with its own integration strategy, with the entire unit interconnected. Integrated systems in the lab world are difficult to define as a product set, since the full scope of a lab's processes is highly variable, drawing on a wide range of instruments and equipment. We can define a functional instrument system (e.g., titrators, chromatographic equipment, etc.), but sample prep variability frustrates a complete package. One place that has overcome this is the clinical chemistry market.
At this point, we have to pan back a bit and take a few additional aspects of laboratory operations into consideration. Let's look back at Figure 17 again:

Figure 17. A laboratory process
We are reminded that processes are what rule lab work, and the instrumentation and other equipment play an important but subservient role. We need to define the process first and then move on from there; this is the standard validation procedure. Regarding the integration of that instrumentation and equipment, any integration has to support the entire process, not just the individual instrument and equipment packages. This is one of the reasons integration is as difficult as it is. A vendor can create an instrument package, but their ability to put components together is limited by their understanding of how those components will be used.
Unless the vendor is trying to fit into a well-defined process with enough of a market to justify the work, there is a limit to what they can do. This is why pre-validated products are available in the clinical chemistry market: the process is pre-defined and everyone does the same thing. Note that there is nothing to prevent the same thing from happening in other industries; if there were a market for a product that fully implemented a particular ASTM or USP method, vendors might take notice. There is one issue with that, however. When a lab purchases laboratory instrumentation, they often buy general-purpose components. You may purchase an infrared spectrophotometer that covers a wide spectral range and can be used in a variety of applications to justify its cost. Yet you rarely, if ever, purchase one instrument for each lab process that it might be used in, unless there is sufficient demand to justify it. And that's the rub: if a vendor were to create an equipment package for a specific procedure, the measuring instrument would be stripped down and tailored to the application, and it may not be usable in another process. Is there enough demand for testing to warrant the development of a packaged system? If you were doing blood work, yes, because all blood testing is done the same way; it's just a question of whether or not your lab is getting enough samples. If it's ASTM xxx, maybe not.
Additionally, the development of integrated systems needs to take the real world into account. Ask whether or not the same equipment can be used for different processes. If the same equipment setup is being used in different processes with different reagents or instrument setups (e.g., different columns in chromatography), developing an integrated electro-mechanical computer system may be a viable project, since all the control and electro-mechanical systems would be the same. Returning to the cookie analogy, it's the same equipment, different settings, and different dough mixes (e.g., chocolate chip, sugar cookies, etc.). You just have to demonstrate that the proper settings are being used to ensure that you are getting consistent results.
If this sounds a little confusing, it's because "integration" can occur on two levels: the movement of data and information, and the movement of material. On one hand, we may be talking about integration in the context of merging the sources of data and information (where they are generated) into a common area where they can be used, managed, and accessed according to the lab's and the corporation's needs. Along the way, as we've seen in the K/I/D discussions, different types of data and information will be produced, all of which have to be organized and coordinated. This flow is bi-directional: generated data and information in one direction, and work lists in the other.
On the other hand, regarding the movement of materials, we may be talking about automated devices and robotics. Those two hands are joined at the measuring instrument.
We'll begin by discussing the movement of data and information: integration is a matter of bringing laboratory-generated data and information into a structured system where it can be accessed, used, and managed. That "system" may be a single database or a collection of interconnected data structures. The intent in modern lab integration is that connections between the elements of that structure are electronic, not manual, and transfers may be initiated by user commands or automated processes. Note that someone may consider a completely manual implementation to be "integrated," and be willing to accept slower and less efficient data transfers (that's what we had in the last century). However, that methodology doesn't give us the improvements in productivity and ROI that are desired.
The idea that integration is moving all laboratory K/I/D into a structured system is considerably different from the way we viewed things in the past. Previously, the goal was to accumulate lab results into a LIMS or an ELN, with the intermediate K/I/D (e.g., instrument data files, etc.) placed in an SDMS. That was a short-sighted approach, considering only data storage without fully considering data utilization. The end goal of using a LIMS or ELN to accumulate lab results was still valid (particularly for further research, reporting, planning, and administrative work), but it didn't deal effectively with the material those results were based on or the potential need to revisit that work.
In this discussion, we'll consider the functional hub of the lab to be a LIMS and/or ELN, with an optional SDMS; it's your lab, and you get to choose. We're referring to this as a hub for a couple of reasons. First, it is the center of K/I/D management, as well as planning and administrative efforts. Second, these should be the most stable information systems in the lab: durable and slow to change. Instruments and their data systems will change and be replaced as the lab's operational needs progress. As a result, the hub is where planning efforts have to begin, since decisions made here have a major impact on a lab's ability to meet its goals. The choice of a cloud-based system vs. an on-site system is just one factor to consider.
Historically, laboratories have begun by putting the data- and information-generating capability in place first. That's not unusual, since new companies need data and information to drive their business development. However, what they really need is a mechanism for managing that data and information. Today, the development of a laboratory's electronic infrastructure needs to begin with the systems that are used to collect and manage data and information; then we can put the data and information generators in place. Doing otherwise is a bit like starting a production line without considering what you're going to do with all the material you're producing.
The types of data and information generators can vary greatly.
Examples include:

a human-based reading that is recorded manually;
a reading recorded by an instrument with limited storage and communication abilities, e.g., balances and pH meters;
a reading recorded by a limited-functionality device, where data is recorded and stored but must be transmitted out of the machine to be analyzed; and
a reading recorded by a combination instrument-computer, which has the ability to record, store, analyze, and output data in various forms.
The issue we need to deal with for each of those generators is how to plan for where the output should be stored so that it is accessible and useful. That was the problem with earlier thinking. We focused too much on where the K/I/D could be stored and maintained over the long term, but not enough on its ability to be used and managed. Once the analysis was done, we recognized the need to have access to the backup data to support the results, but not the ability to work with it. The next section will look at some of the ramifications of planning for those data types.
Planning the integration of your data generators
There are several ramifications of planning for your data generators that need to be discussed. Before we begin, though, we need to add two additional criteria for the planning stage:

You should avoid duplication of K/I/D unless there is a clear need for it (e.g., a backup).
In the progression from sample preparation to sample processing, to measurement, to analysis, and then to reporting, there should not be any question about either the provenance or the location of the K/I/D generated from that progression.
That said, let's look at each generator in greater detail to better understand how we plan for their integration and the harmonization of their resulting K/I/D.

1. A human-based reading that is recorded manually, or recorded from an instrument with limited storage and communication abilities
Examples: The types of devices we are looking at for these generators are balances, pH meters, volt meters, single-reading spectrophotometers, etc.
Method of connection: Neither the manual nor the digital generators leave much choice: the integration is direct data entry into a hub system, unless they are being used as part of a LES. Manual modes mean typing (with data entry verification), while digital systems provide for an electronic transfer as the method of integration.
Issues: There are problems with these generators being directly tied to a hub component (see "A," Figure 22). Each device or version has its own communications protocol, and the programming is specific to that model. If the item is replaced, the data transfer protocols may differ, and the programming has to change. These devices are often used for single readings or weighing a sample, with the result stored in the hub to be used in later calculations or reporting. However, even though these are single-reading instruments, things can get complicated. Problems may crop up if their measurements are part of a time series, require a weight measurement at specific time intervals, are used for measuring the weights of similar items in medical tablet uniformity testing, or are used for measuring pH during a titration. Those applications wouldn't work well with a direct connection to a hub and would be better served through a separate processor (see "B," Figure 22) that builds a file of measurements that could be processed, with the results sent to the hub.
This creates a need to manage the file and its link to the transmitted results. An SDMS would work well, but the hub system just became more complex. Instead of a device being directly connected to a hub, we have an intermediate system connected to an SDMS and the hub. Integration is still easily feasible, but more planning is required. Should the intermediate system take on the role of an SDMS (all the files are stored in its file structure), you would also have to provide backup and security facilities to ensure that the files aren't tampered with and are secured against loss. The SDMS would be responsible for entering the results of the work into the hub. (Remember that access to the files is needed to support any questions about the results; printed versions would require re-entering the data to show that the calculations were done properly, which is time-consuming and requires verification.)
Diagram:

Figure 22. Representation of the movement of data and information in the lab from a human-based reading that is recorded manually, or from an instrument with limited storage and communication abilities

2. A reading recorded by a limited-functionality device
Examples: Measurements are made on one or more samples, with results stored locally, which have to be transmitted to another computer for processing or viewing. Further use may be inhibited until that is done (e.g., the device may have to transmit one set of measurements before the next set can begin). Such devices include microplate readers and spectrophotometers. Some devices can be operated manually from front panel controls, or via network connections through a higher-level controller.
Method of connection: Some devices may retain the old RS-232/422 scheme for serial transmission of measurements and receiving commands, though most have transitioned to USB, Ethernet, or possibly wireless networking.
Issues: Most of these devices do not produce final calculated results; that work is left to an intermediate process placed between the device and the hub system (Figure 23). As a result, integration depends on those intermediate processes controlling one or more devices, sometimes coordinating with other equipment, calculating final results, and communicating them to the hub system. Measurement files need to be kept in electronic form to make them easier to back up, copy, transmit, and generally work with. If only printed output is available, it should be scanned, and an equivalent machine-readable version created and verified. Each experimental run, which may include one or more samples, should have the associated files bundled together into an archive so that all data is maintained in one place. That may be the intermediate processor's storage or an SDMS with the appropriate organization and indexing capabilities, including links back from the hub. The electronic files may be used to answer questions about how results were produced, re-run an analysis using the original or a new set of algorithms, or have the results analyzed as part of a larger study.
Diagram:

Figure 23. Representation of the movement of data and information in the lab from a limited-functionality device
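The intermediate process described above can be quite small. The following Python sketch assumes a hypothetical device that writes one reading per line to a text file, and a send_to_hub() placeholder standing in for the hub system's actual import mechanism (which varies by vendor). The point is the division of labor: the raw file is archived, and only the calculated result moves to the hub.

    # Sketch of an intermediate process between a limited-functionality
    # device and the hub. File layout and send_to_hub() are placeholders.
    import shutil
    import statistics

    def send_to_hub(sample_id, result):
        # Stand-in for the hub's real import mechanism (e.g., a LIMS API).
        print(f"{sample_id}: {result:.4f} -> hub")

    def process_run(sample_id, raw_file, archive_dir):
        with open(raw_file) as f:
            readings = [float(line) for line in f if line.strip()]
        result = statistics.mean(readings)   # the final calculated result
        shutil.copy(raw_file, archive_dir)   # keep raw data for later questions
        send_to_hub(sample_id, result)       # only the result goes to the hub
        return result

In practice, the archiving step would hand the file to an SDMS (or an equivalently secured store) so that questions about the result can be answered from the original measurements.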
3. A reading recorded by a combination instrument-computer
Examples: These generators include one-to-one instrument-to-computer data systems (IDS), many-to-one instrument-to-computer systems, NMR spectroscopy, chromatography, mass spectrometry, thermal analysis, spectrophotometers, etc.
Method of connection: There are several means of connection: 1) detector to computer (A/D), 2) computer to instrument control and accessory devices such as autosamplers (via, e.g., digital I/O, USB), and 3) computer to centralized hub systems (via, e.g., USB, Ethernet, wireless networks). Integration between the IDS and the hub is accomplished through vendor-supported application programming interfaces (APIs) on both sides of the connection.
Issues: The primary issue with these generators is managing the data structures, which consist of captured detector output files, partially processed data (e.g., descriptors such as peak size, width, area, etc.), computed results, sample information, worklists, and processing algorithms. Some of this material will get transmitted to the hub, but the hub isn't generally designed to incorporate all of it. A portion of it (the content of a printed report, for example) could be sent to an SDMS. However, the bulk of it has to stay with the IDS, since its software is needed to interpret and present the sample data contents; the data files by themselves are useless without software to unpack and make sense of them.
From a planning standpoint, you want to reduce the number of IDSs as much as possible. While chromatography is, at present, the only technique that commonly offers a choice between one-to-one and many-to-one instrument-to-computer configurations, hopefully over time that list will expand to provide better data management. Consider three chromatographs, each with its own IDS. If you are looking for data, you have three systems to check, and hopefully each has its own series of unique sample IDs. Three instruments on one IDS is a lot easier to manage and search. You also have to consider backups, upgrades, general maintenance, and cost.
Moving the instrument data files to an SDMS may not be effective unless the vendor has made provision for it. The problem is data integrity. If you have the ability to move data out of the system and then re-import it, you open up the possibility of importing data that has been edited. Some vendors prohibit this sort of activity.
Diagram:

Figure 24. Representation of the movement of data and information in the lab from a combination instrument-computer

The above looked at generator types in isolation; however, in reality, devices and instruments are used in combinations, each producing results that have to be maintained and organized. We must look at the data sets that are generated in the course of executing an experiment or method.
The procedures associated with an experiment or method can be executed in three ways: manually, using a LES, or using a robotics implementation. The real world isn't so neatly separated; manual and LES implementations may have some steps that use automated tools. The issue we need to address in planning is the creation of an "experiment data set" that brings all the results produced into one package. Should questions arise about an experiment, you have a data set that can be used as a reference. That "package" may be pages in a notebook, a word processor file, or some other log.
It should contain all data recorded during a procedure or, in the case of IDS-captured data, file references or pointers to that instrument data or information. You want to be able to pull up that record and answer any questions that may arise about the work.
All of that may seem pretty obvious, but there is one point that needs to be addressed: the database structures (including hub systems, IDS file structures, and SDMS) all have to be well defined before you accumulate a number of experiment packages. You don't want to find yourself in a situation where you have a working system of data or information storage and then have to make significant changes to it. That could mean that all previous packages have to be updated to reflect the new system or, worse, that you have to deal with an "old" and a "new" system of managing experimental work.
LES systems come in two forms: stand-alone software packages, and script-based systems that are part of a LIMS or ELN. The stand-alone systems should produce the experiment record automatically, with all data and pointers to IDS-captured data or information. For script-based systems, the programming for the LES function has to take that into account. As for laboratory robotics, they can be viewed as an extension of a LES: instead of a person following instructions, a robot or a collection of robotic components follows its programming to carry out a process. Developing an experimental record is part of that process.
The bottom line in all of this is simple: the management architecture for your K/I/D has to be designed deliberately and put in place early in a lab's development. If it is allowed to be created on an as-needed basis, the resulting collection of computers and storage will be difficult to maintain, manage, and expand in an orderly fashion. At some point, someone is going to have to reorganize it, and that will be an expensive and perhaps painful process.
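As a small illustration of what an electronic "experiment data set" might look like, the Python sketch below bundles a run's locally captured files and its pointers to IDS-resident data into a single archive with a manifest. The layout and field names are invented for illustration; they are not a standard.

    # Build one self-contained "experiment data set": an archive holding
    # everything recorded during a procedure, plus pointers to data that
    # must remain inside an IDS. Field names are hypothetical.
    import json
    import time
    import zipfile

    def build_experiment_package(experiment_id, local_files, ids_references):
        manifest = {
            "experiment_id": experiment_id,
            "created": time.strftime("%Y-%m-%d %H:%M:%S"),
            "local_files": local_files,        # data captured outside an IDS
            "ids_references": ids_references,  # pointers into IDS file structures
        }
        archive = f"{experiment_id}.zip"
        with zipfile.ZipFile(archive, "w") as z:
            z.writestr("manifest.json", json.dumps(manifest, indent=2))
            for path in local_files:           # raw readings, notes, worklists
                z.write(path)
        return archive

    # Example: chromatography data files stay in the CDS; the manifest
    # records where they live so the run can be reconstructed later.
    # build_experiment_package("EXP-0042", ["balance_readings.csv"],
    #                          [{"system": "CDS-1", "run_id": "R-981"}])

Whatever form the package takes, the essential property is the one described above: one well-defined place from which every question about the experiment can be answered.

Harmonization
Harmonization is a companion goal to integration. Approached with the right mindset, it can reduce:

installation costs,
support costs,
education and training requirements, and
development effort.
Harmonization efforts, if used inappropriately, can also create strife, increasing inter-departmental friction and conflict. The general idea of harmonization is to use common hardware and software platforms to implement laboratory systems, while ensuring that the move toward commonality doesn't force people to use products that are force-fits: products that don't really meet the lab's needs but serve some other agenda. The purpose of computing systems is to help people get their work done; if they need a specific product to do it, end of story. If it can be provided using common hardware and software platforms, great, but that should not be a limiting factor. If used as a guide in the development of database systems, harmonization can make it easier to access laboratory K/I/D across labs. It may slow down implementations because more people's opinions have to be taken into account, but the end result will be the ability to gain more use out of your K/I/D.
Figure 25 shows a common LIMS server supporting three different labs. Each lab has its own database structure, avoiding conflicts and unnecessary compromises in the conduct of lab work. This arrangement also reduces implementation and support costs. While some vendors support this, others may not; see if they are willing to work a deal, since there are multiple lab systems involved.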
If we couple this with the common structure definitions of K/I/D noted earlier, accessing information across labs will be more productive.

Figure 25. Harmonizing LIMS platforms

An alternative is to force everyone into one data structure, usually to reduce costs. Savings on licensing costs may be offset by development delays as multiple labs resolve conflicts in database organization, security, access control, etc. In short, keep it simple; things will work more smoothly and, in the long run, be less costly from an implementation, maintenance, and support perspective. If there is a need or desire to go through the databases for accounting purposes or other organizational requirements, the necessary material can be exported into another file structure that can be analyzed as needed. This provides a layer of security between the lab and the rest of the organization. It's basically a matter of planning how database contents are managed within the lab and what has to be accessed from other parts of the organization.
Part of the harmonization planning process involves examining how computers are paired with instruments. You may not have multiple instances of higher-priced equipment such as mass spectrometers, NMRs, or other instruments, and having a computer dedicated to each such device makes sense. However, there is one type of instrument that you may have several of: chromatographs. You could purchase a computer for each instrument, but most CDSs can support multiple instruments (Figure 26).

Figure 26. Consolidate instrument-computer connections where feasible

There are advantages to having multiple instruments on one computer:

You have only one system to support, maintain, and back up.
All the K/I/D is in one system.
The qualification or validation process is performed once, rather than being repeated for each system.
Overall cost is reduced.
This "multiple instruments to one computer" configuration is the result of low data collection rates, the modest computing requirements needed to process the instrument data, and user demands on the vendors. Given the developments in computing power and distributed data acquisition and control, this many-to-one configuration should be extended to other instrument techniques, reducing costs and bringing more efficiency to the management of K/I/D.
Regarding computer systems...
Harmonization doesn't mean that everything should run the same OS or the same version of the OS. It means doing so where possible, but not at the expense of doing lab work effectively.
With the wide diversity of products in the laboratory market, you're going to find a mix of large and small vendors. Some may be small, growing companies that are managed by a few people; as a result, keeping up with the latest versions of operating systems and underlying software may not be critical to them if it doesn't affect their product's usability or performance. Their product certification on the latest version of a software platform may lag behind that of larger vendors. That means that requiring all systems to be at the same operating system level isn't realistic.
Upgrading the OS may disable the software that lab personnel depend upon.
Regarding the data...
On November 19-20, 2019, Pharma IQ's Laboratory Informatics Summit held a meeting on "Data Standardization for Lab Informatics." The meeting highlighted the emerging FAIR Guiding Principles, which state that K/I/D should be findable, accessible, interoperable, and reusable (FAIR). The point of mentioning this is to highlight the growing, industry-wide importance of protecting the value of the K/I/D that you are collecting. No matter how much it costs to produce, if you can't find the K/I/D you need, it has no value because it isn't usable. The same holds true if the data supporting a piece of information can't be found.
Utilization is at the core of much of what we've been discussing. Supporting the FAIR Guiding Principles should be part of every discussion about products and what they produce, how the database is designed, and what the interoperability between labs in your organization looks like.
Another aspect of this subject is harmonizing data definitions across your organization. The same set of terms should be used to describe an object or property, and their database representations should be compatible. The point is to make it easier to find something and make use of it.
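A minimal sketch of what harmonized data definitions can mean in practice: each lab keeps its local field names, but records are translated into one shared vocabulary before being pooled or exchanged. All terms and mappings below are invented examples, not a published standard.

    # Harmonizing data definitions: lab-local field names are mapped
    # onto a shared vocabulary before records are pooled. All terms
    # here are hypothetical examples.
    SHARED_FIELDS = {"sample_id", "analyte", "result_value", "result_units"}

    LAB_MAPPINGS = {
        "lab_a": {"SampleNo": "sample_id", "Compound": "analyte",
                  "Reading": "result_value", "Units": "result_units"},
        "lab_b": {"ID": "sample_id", "Analyte": "analyte",
                  "Value": "result_value", "UoM": "result_units"},
    }

    def normalize(record, lab):
        """Translate one lab's record into the shared vocabulary."""
        mapping = LAB_MAPPINGS[lab]
        out = {mapping[k]: v for k, v in record.items() if k in mapping}
        missing = SHARED_FIELDS - out.keys()
        if missing:                  # catch definition gaps early
            raise ValueError(f"{lab} record is missing {missing}")
        return out

    print(normalize({"SampleNo": "S-101", "Compound": "caffeine",
                     "Reading": 1.67, "Units": "mg/L"}, "lab_a"))

The mapping table is the harmonization artifact: agreeing on the shared fields across labs is the organizational work; the code that applies the mapping is trivial.
Putting this all to use
How do you apply all of this to your new lab (an easier task) or existing lab (more challenging)? This is going to be a broad-brush discussion, since every lab has its own way of handling things, from its overall mission to its equipment and procedures; you're going to have to take these points and adjust them to fit your requirements.
To start, assume you have a hub system (either a LIMS or ELN) as the center of gravity for all your K/I/D collection. You build your lab's K/I/D management infrastructure from this center of gravity outward; effectively everything revolves around the hub and radiates out from it.[l]
For each K/I/D generator, ask:

What does it produce, which of the K/I/D generator types noted earlier does it match, and which model is appropriate?
Does it generate a file that has to be processed, or is it the final measurement?
Does the device or the system supporting it have all the information needed to move it on to the next phase of the process? For example, if the device is a pH meter, what is going to key the result to the next step? It will need a sample or experiment reference ID so that it knows where the result should go.
For each device output, ask:

What happens to the generated K/I/D and how is it used? And remember, nothing should ever get deleted.
Is the device output a single measurement or part of a set? Will it be combined with measurements from other devices, sample IDs, and calibration information?
Where is the best place to put it? In an intermediate server, SDMS, or hub?
If it is a final result of that process, should it be in the hub?
If it is an intermediate file or result, then where?
How might it be used in the future, and what is the best way to prepare for that? Files may need to be examined in audits, transferred to another group or organization[m], or recalculated with new algorithms.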
Does your system provide trace-back from final results to source data?
For all devices, ask:

Does every device that has storage and communications capability have backup procedures in place?
Depending on your point of view (science, laboratory operations management, or lab administration), your interest in lab computing may range from "necessary evil" to "makes life easier" to "needed to make the lab function," or some other perspective. You may be of the opinion that all this is interesting but not your responsibility. If not yours, then whose? That topic will be covered in the next write-up.

Laboratory systems engineers
If you believe that the technology planning and management considerations noted so far in this guide are important to your laboratory, it's time to ask whom that responsibility falls upon.
The purpose of this guide has been to highlight that the practice of science has changed, become more complex, and become more dependent on technologies that demand a lot of attention. Those technologies are not only the digital systems we've covered, but also the scientific methodologies and instrumentation whose effective use can take (through increasing specialization and depth of material) an entire career to learn and apply. The user of scientific computing typically views it as a tool for getting work done, not another career. Once upon a time, the scientist knowledgeable in both laboratory work and computing was necessary; if you wanted to use computers, you had to understand how they worked. Today, if you tried to do that, you'd find yourself spread thin across your workload, with developments happening faster in science and computing than a single individual can keep up with.
Let's look at what a laboratory scientist would need to be able to do in order to also support their laboratory's scientific computing needs, in addition to their normal tasks (Figure 27).

Figure 27. Partial task list for supporting laboratory scientific computing

On top of those tasks, the lone scientist would also have to have the following technological knowledge and personal capabilities (Figure 28):

Figure 28. Partial list of technological knowledge and personal capability requirements for supporting laboratory scientific computing

Looking at these two figures, we're realistically considering two levels of expertise: a high, overview level that can look at the broader issues and see how architectures can be constructed and applied, and a specialist level in areas such as robotics. However, the current state of undergraduate (and to a lesser extent graduate) education doesn't typically have room for the depth of course work needed to cover the material noted above. Expanding your knowledge base into something that is synergistic with your current course work is straightforward; doing it with something from a separate discipline creates difficulties. Where do digital systems fit into your life and laboratory career?
Let's look at what the average laboratory scientist does today. Figure 29 shows the tasks that are found in modern laboratory operations in both research and testing facilities.

Figure 29. Typical laboratory activities in any scientific discipline
Now let's compare that set of tasks in Figure 29 with the task emphasis provided in today's undergraduate laboratory science and technology courses (Figures 30 and 31).

Figure 30. Task-level emphasis of laboratory science in higher-education courses

Figure 31. Task-level emphasis of information technology in higher-education courses

In cases where science students have access to instrument-computer systems, the computers are treated as "black boxes" that acquire the data (data capture), process it (data processing), and report it. How those things happen is rarely, if ever, discussed, with no mention of analog-to-digital converters, sampling rates, analysis algorithms, etc. "Stuff happens," yet that "stuff," if not properly run with tested parameters, can turn good bench science into junk data. How would they know? Students may or may not get exposure to LIMS or ELN systems; it would be useful for students to capture and work with their lab results in such systems, but schools may not be willing to invest in them.
IT students will be exposed to data and information management through database courses, but not at the level that LIMS and ELNs require (e.g., instrument communications and control); the rest of the tasks in Figure 31 are practically unknown to them. They'd be happy to work on the computer in Figure 31, but the instrument and the instrument connections (the things that justify the computer's role) aren't something they'd be exposed to.
What we need are people with a foot in both fields, able to understand and be conversant in both the laboratory science and IT worlds, relating them to each other to the benefit of lab operation effectiveness while guiding IT in performing their roles. We need "laboratory systems engineers" (LSEs).
Such specialists were previously referred to as "laboratory automation engineers" (LAEs)[6] and "LAB-IT" specialists, but we now realize both titles fall short of the mark. "Laboratory automation engineer" emphasizes automation too strongly when the work is much broader than that. And "LAB-IT" is a way of nudging IT personnel into lab-related work without really addressing the full scope of systems that exist in labs, including robotics and data acquisition and control.
Laboratory information technology support differs considerably from classical IT work (Figure 32). The differences are primarily twofold. First, the technologies used in lab work, including those in which instruments are attached to computers and robotics, are different from those commonly encountered in classical IT. The computers are the same, but the added interface and communications requirements imposed by instrument-computer connections change the nature of the work. When troubleshooting, it can be difficult to separate computer issues from those resulting from the connection to instruments and digital control systems. Second, the typical IT specialist, maybe straight out of school, doesn't have a frame of reference for understanding what they are dealing with in a laboratory setting. The work is foreign, the discussions involve terminology they may not understand, and there may be no common ground for discussing problems.
In classical IT, the IT personnel may be using the same office software as the people they support, but they can't say the same for the laboratory software used by scientists.

Figure 32. Comparison of corporate IT with laboratory IT (LAB-IT)

Having noted the differences between classic IT and laboratory IT, as well as the growing need for competent LSEs, we need to take a closer look at some of the roles that classic IT and LSE personnel can take. Figure 33 provides a subset of the items from Figure 32 and reflects tasks that IT groups could be comfortable with. Typical IT backgrounds with no lab tech familiarity won't get you beyond the basic level of support. To be effective, IT personnel need to become familiar with the lab environment, the applications and technologies used, and the language of laboratory work. It isn't necessary for IT support personnel to become experts in instrumental techniques, but they should understand the basic "instrument to control system to computer" model as well as the related database applications, to the point where they can provide support, advise people on product selections, etc. We need people who can straddle the IT-laboratory application environment. They could be lab people with an interest in computing or IT people with a strong interest in science.

Figure 33. Potential roles for IT and LSE support in laboratory work

There are ways of bridging that education gap (Figure 34), but today they depend upon individual initiative more than corporate direction to educate people to the level needed. On-the-job training is not an effective substitute for real education; on the surface it is cheaper, but you lose out in the long run because people really don't understand what is going on, which limits their effectiveness and prevents them from being innovative or even catching problems in the early stages, before they become serious. A big issue is this: due to a lack of education, are people developing bad K/I/D without being aware of it? The problem isn't limited to the level of systems we are talking about here. It also extends to techniques such as pipetting.[15]

Figure 34. Bridging the education gap

It is also a matter of getting people to understand the breadth of material they have to be familiar with. In 2018, a webinar series was created (Figure 35) to educate management on the planning requirements for implementing lab systems. The live sessions were well attended. The chart shows the viewing rate for the individual topics through early December 2020. Note that the highest-viewed items were technology-specific; people wanted to know about LIMS, ELN, etc. The details about planning, education, support, etc. haven't received anywhere near the attention they need. People want to know about product classes but aren't willing to learn about what it takes to be successful. Even if you are relying on vendors or consultants, lab management is still accountable for the success of planning, implementation, and effectiveness of lab systems.

Figure 35. Laboratory technology webinar series views after initial release

Prior to the COVID-19 pandemic of 2020, undergraduate education depended on the standard model of in-person instruction.
With the challenges of COVID-19 spreading, online learning took on a new importance and stronger acceptance, building on the ground established by online universities and university programs. This gives us an acceptable model for two types of course development: a fully dedicated LSE program, or a broader program that would expand students' backgrounds in both the laboratory sciences and IT. One issue that would need to be addressed, however, is bridging the gap between presentation material and hands-on experience with lab systems. Videos and evaluation tests will only get you so far; you need the hands-on experience to make it real and provide the confidence that what you've learned can be effectively applied.
There are several steps that can be taken to build an LSE program. The first is to develop a definition of a common set of skills and knowledge that an LSE should have, recognizing that people will come from two different backgrounds (i.e., laboratory science and IT), and those have to be built up to reach a common, balanced knowledge base. Those with a strong laboratory science background need to add information technology experience, while those from IT will need to gain an understanding of how laboratory science is done. Remember, however, that those coming from IT don't need to be fully educated in chemistry, biology, physics, etc. After all, they aren't going to be developing methods; they will be helping to implement them. There are things common to all sciences that they need to understand, such as record keeping, the workflow models of testing and research, data acquisition and processing, instrumentation, and so on. That curriculum should also help people who want to specialize in particular subject areas such as laboratory database systems, robotics, etc.
The second step is to build a curriculum that allows students to meet those requirements. This requires solid forethought in the development and curation of course materials. A lot of material already exists, spread over the internet on university, government, and company websites. A good way to begin would be to collect and organize those references into a single site (the actual courses need not be moved to the site, just their descriptions, access requirements, and links). Presentation and organization of the content are also important. Someone visiting the site will need a guide to what LSE is about, how to find material appropriate for different subject areas, and how to get access to it. Consider your site's audience to be a visitor who knows nothing about the field: where do they start, and how do we facilitate their progress? Providing clear orientation and direction is key. First give them an understanding of what LSE is all about, and then a map to whatever interests them. With the curriculum built, you can then identify areas that need more material and move to further develop the program. Of course, you'll also want to make it possible to take advantage of online demonstration systems and simulators to give people a feel for working with the various laboratory systems. This is a half-step to what is needed: there's no substitute for hands-on work with equipment.
As it stands today, we've progressed from manual methods, to computer-assisted methods, and then to automated systems in the course of developing laboratory technologies over the years, and yet our educational programs are a patchwork of courses largely driven by individual needs.
As it stands today, we\u2019ve seemingly progressed from manual methods, to computer-assisted methods, and then to automated systems in the course of developing laboratory technologies, and yet our educational programs remain a patchwork of courses largely driven by individual needs. We need to take a new look at lab technologies and their use, and how best to prepare people for that work with solid educational opportunities.\n\nClosing \nThis guide has addressed the following:\n\nWhy technology planning and management needs to be addressed: Because integrated systems need attention in their application and management to protect electronic laboratory K\/I\/D, ensure that it can be effectively used, and ensure that the systems and products put in place are both the right ones, and that they fully contribute to improvements in lab operations.\nWhat's changed about that planning and management since the introduction of computers in the lab: As technology in the lab expanded, we lost the basic understanding of what the new computer and instrument systems were and what they did, that they had faults, and that if we didn\u2019t plan for their effective use and counter those faults, we were opening ourselves to unpleasant surprises. The consequences at times were system crashes, lost data, and a lack of a real understanding of how the output of an instrument was transformed into a set of numbers, which meant we couldn\u2019t completely account for the results we were reporting. A more purposeful set of planning and management activities, begun at the earliest point possible, has become increasingly important.\nWhy developing an environment that fosters productivity and innovation is important: Innovation doesn\u2019t happen in a highly structured environment: you need the freedom to question, challenge, etc. You also need the tools to work with. The inspiration that leads to innovation can happen anywhere, anytime; all of a sudden, all the pieces fit. This requires flexibility and trust in people, an important part of corporate culture.\nWhy developing high-quality K\/I\/D is desirable: There are different types of data structures used in lab work, and careful attention is needed to work with and manage them. This includes the effective management of K\/I\/D, putting it in a structure that encourages its use and protects its value. When methods are proven and you have documented evidence that they were executed by properly educated personnel using qualified reagents, instruments, and methods, you should then have high-quality K\/I\/D to support each sample result and any other information gleaned from that data.\nWhy fostering a culture around data integrity is important to lab operations, addressing both technical and personnel issues: Positive outcomes will come from your data integrity efforts: your work will be easier and protected from loss, results will be easier to organize and analyze, and you\u2019ll have a better-functioning lab. You\u2019ll also have fewer unpleasant surprises when technology changes occur and you need to transition from one way of doing things to another.\nHow to address digital, facility, and backup security: Preventing unauthorized electronic and physical intrusion is critical to data integrity and meeting regulatory requirements. It also ensures that access to K\/I\/D is protected against loss from a wide variety of threats to the organization's facilities, all while securing your ability to work. 
This includes addressing power backup, continuity of operations, systems backup, and more.\nHow to acquire and develop \"products\" that support regulatory requirements: Careful engineering and well-planned and -documented internal processes are needed to ensure that the systems and methods being used can remain in use and be supported over the life span of a lab. This means recognizing that the initial design and planning of processes and methods has to be done well for a supportable product, and keeping in mind the potential for future process review and modification even as the initial process or method is being developed. Additionally, the lab must also recognize the complete product life cycle and how that affects the supportability of systems and methods.\nThe importance of system integrations and the harmonization of K\/I\/D: Integrated systems can benefit a lab's operations, but they require planning to work with different types of database systems as the results of lab work become more concentrated in LIMS and ELNs, including making decisions about how K\/I\/D is stored and distributed over multiple databases. At the same time, harmonization efforts using common hardware and software platforms to implement laboratory systems are important, but those efforts must also ensure that the move toward commonality doesn\u2019t force people to use products that are forced fits, ones that don\u2019t really meet the lab's needs but serve some other agenda.\nWhy the development of comprehensive higher-education courses dedicated to the laboratory systems engineer or lab science-IT hybrid is a must with today's modern laboratory technology: In today's world, typical IT backgrounds with no lab tech familiarity, or typical laboratory science backgrounds with no IT familiarity, won\u2019t get you beyond the basic level of support for your laboratory systems. To be effective, IT personnel need to become familiar with the lab environment, the applications and technologies used, and the language of laboratory work, while scientists must become more familiar with the management of K\/I\/D from the technical perspective. This gap must be closed through new and improved higher-education programs.\nAnd thus we return to the start to close this guide. First, there's a definite need for better planning and management of laboratory technologies. Careful attention is required to protect electronic laboratory knowledge, information, and data (K\/I\/D), ensure that it can be effectively used, and ensure that the systems and products put in place are both the right ones, and that they fully contribute to improvements in lab operations. Second, seven clear goals highlight this need for laboratory technology planning and management and point to ways of improving how it's performed. From supporting an environment that fosters productivity and innovation all the way to ensuring proper systems integration and harmonization, planning and management is a multi-step process with many clear benefits. And finally, there's a definitive need for more laboratory systems engineers (LSEs) who have the education and skills needed to accomplish all that planning and management in an effective manner, from the very start. This will require a more concerted effort in academia, and perhaps even among professional organizations catering to laboratories. 
All of this together hopefully means a more thoughtful, modern, and deliberate approach to implementing laboratory technologies in your lab.\n\nAbbreviations, acronyms, and initialisms \nA\/D: Analog-to-digital\nAI: Artificial intelligence\nALCOA: Attributable, legible, contemporaneous, original, and accurate\nAPI: Application programming interface\nCDS: Chromatography data system\nCPU: Central processing unit\nELN: Electronic laboratory notebook\nEPA: Environmental Protection Agency\nFAIR: Findable, accessible, interoperable, and reusable\nFDA: Food and Drug Administration\nFRB: Fast radio bursts\nISO: International Organization for Standardization\nIT: Information technology\nK\/I\/D: Knowledge, information, and data\nLAB-IT: Laboratory information technology support staff\nLAE: Laboratory automation engineering (or engineer)\nLES: Laboratory execution system\nLIMS: Laboratory information management system\nLIS: Laboratory information system\nLOF: Laboratory of the future\nLSE: Laboratory systems engineer\nML: Machine learning\nOS: Operating system\nQA\/QC: Quality assurance\/quality control\nROI: Return on investment\nSDMS: Scientific data management system\nSOP: Standard operating procedure\nTPM: Technology planning and management\n\nFootnotes \n\n\u2191 See Elements of Laboratory Technology Management and the LSE material in this document. \n\u2191 See the \"Scientific Manufacturing\" section of Elements of Laboratory Technology Management. \n\u2191 By \u201cgeneral systems\u201d I\u2019m not referring to simply computer systems, but the models and systems found under \u201cgeneral systems theory\u201d in mathematics. \n\u2191 Regarding LAB-IT and LAEs, my thinking about these titles has changed over time; the last section of this document, \u201cLaboratory systems engineers,\u201d goes into more detail. \n\u2191 \u201cReal-time\u201d has a different meaning inside the laboratory than it does in office applications. Instead of a couple of seconds between an action and a response, lab \u201creal-time\u201d often demands millisecond or better precision; missing a single sample timing out of thousands can invalidate an entire sample analysis. \n\u2191 See Notes on Instrument Data Systems for more on this topic. \n\u2191 For a more detailed description of the K\/I\/D model, please refer to Computerized Systems in the Modern Laboratory: A Practical Guide. \n\u2191 For more detailed discussion on this, see Notes on Instrument Data Systems. \n\u2191 For more information on virtualization, particularly if the subject is new to you, look at Next-Gen Virtualization for Dummies. The for Dummies series is designed to educate people new to a topic, getting away from jargon and presenting material in clear, easy-to-understand language. This book is particularly good at that. \n\u2191 One good reference on this subject is the presentation Building A Data Integrity Strategy To Accompany Your Digital Enablement by Julie Spirk Russom of BioTherapeutics Pharmaceutical Science. \n\u2191 Though it may not see significant updates, consider reading the Comprehensive Guide to Developing and Implementing a Cybersecurity Plan for a much more comprehensive look at security in the lab. \n\u2191 Yes, the scientific work you do is essential to the lab\u2019s purpose, but our focus is on one element of the lab\u2019s operations: what happens after the scientific work is done. 
\n\n\u2191 For example, a product line is sold to another company or transferred to another division, and they then want copies of all relevant information. Meeting regulatory requirements is another example.\n\nAbout the author \nInitially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation\/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.\n\nReferences \n\n\u2191 Bourne, D. (2013). \"My boss the robot\". Scientific American 308 (5): 38\u201341. doi:10.1038\/scientificamerican0513-38. PMID 23627215.\n\u2191 Cook, B. (2020). \"Collaborative Robots: Mobile and Adaptable Labmates\". Lab Manager 15 (11): 10\u201313. https:\/\/www.labmanager.com\/laboratory-technology\/collaborative-robots-mobile-and-adaptable-labmates-24474.\n\u2191 Hsu, J. (24 September 2018). \"Is it aliens? Scientists detect more mysterious radio signals from distant galaxy\". NBC News MACH. https:\/\/www.nbcnews.com\/mach\/science\/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586. Retrieved 04 February 2021.\n\u2191 Timmer, J. (18 July 2018). \"AI plus a chemistry robot finds all the reactions that will work\". Ars Technica. https:\/\/arstechnica.com\/science\/2018\/07\/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work\/5\/. Retrieved 04 February 2021.\n\u2191 \"HelixAI - Voice Powered Digital Laboratory Assistants for Scientific Laboratories\". HelixAI. http:\/\/www.askhelix.io\/. Retrieved 04 February 2021.\n\u2191 Liscouski, J.G. (2006). \"Are You a Laboratory Automation Engineer?\". SLAS Technology 11 (3): 157\u2013162. doi:10.1016\/j.jala.2006.04.002.\n\u2191 Liscouski, J. (2015). \"Which Laboratory Software Is the Right One for Your Lab?\". PDA Letter (November\/December 2015): 38\u201341. https:\/\/www.researchgate.net\/publication\/291971749_Which_Laboratory_Software_is_the_Right_One_for_Your_Lab.\n\u2191 \"Data integrity\". Wikipedia. 3 February 2021. https:\/\/en.wikipedia.org\/wiki\/Data_integrity. Retrieved 07 February 2021.\n\u2191 Harmon, C. (20 November 2020). \"What Is Data Integrity?\". Technology Networks. https:\/\/www.technologynetworks.com\/informatics\/articles\/what-is-data-integrity-343068. Retrieved 07 February 2021.\n\u2191 Chrobak, U. (17 August 2020). \"The US has more power outages than any other developed country. Here\u2019s why\". Popular Science. https:\/\/www.popsci.com\/story\/environment\/why-us-lose-power-storms\/. Retrieved 08 February 2021.\n\u2191 Tulsi, B.B. (4 September 2019). \"Greater Awareness and Vigilance in Laboratory Data Security\". Lab Manager. 
https:\/\/www.labmanager.com\/business-management\/greater-awareness-and-vigilance-in-laboratory-data-security-776. Retrieved 09 February 2021.\n\u2191 Riley, D. (21 May 2020). \"GitLab runs phishing test against employees \u2013 and 20% handed over credentials\". Silicon Angle. https:\/\/siliconangle.com\/2020\/05\/21\/gitlab-runs-phishing-test-employees-20-handing-credentials\/. Retrieved 09 February 2021.\n\u2191 \"Automated Sample Preparation\". Fluid Management Systems, Inc. https:\/\/www.fms-inc.com\/sample-prep\/. Retrieved 09 February 2021.\n\u2191 \"PAL Auto Sampler Systems\". Agilent Technologies, Inc. https:\/\/www.agilent.com\/en\/product\/gas-chromatography\/gc-sample-preparation-introduction\/pal-auto-sampler-systems. Retrieved 09 February 2021.\n\u2191 Bradshaw, J.T. (30 May 2012). \"The Importance of Liquid Handling Details and Their Impact on Your Assays\" (PDF). European Lab Automation Conference 2012. Artel, Inc. https:\/\/d1wfu1xu79s6d2.cloudfront.net\/wp-content\/uploads\/2013\/10\/The-Importance-of-Liquid-Handling-Details-and-Their-Impact-on-Your-Assays.pdf. Retrieved 11 February 2021.\n\nSource: <a rel=\"external_link\" class=\"external\" href=\"https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering\">https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering<\/a>\n\n","655f7d48a642e9b45533745af73f0d59_html":"<body class=\"mediawiki ltr sitedir-ltr mw-hide-empty-elt ns-202 ns-subject page-LII_Laboratory_Technology_Planning_and_Management_The_Practice_of_Laboratory_Systems_Engineering rootpage-LII_Laboratory_Technology_Planning_and_Management_The_Practice_of_Laboratory_Systems_Engineering skin-monobook action-view skin--responsive\"><div id=\"rdp-ebb-globalWrapper\"><div id=\"rdp-ebb-column-content\"><div id=\"rdp-ebb-content\" class=\"mw-body\" role=\"main\"><a id=\"rdp-ebb-top\"><\/a>\n<h1 id=\"rdp-ebb-firstHeading\" class=\"firstHeading\" lang=\"en\">LII:Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering<\/h1><div id=\"rdp-ebb-bodyContent\" class=\"mw-body-content\"><!-- start content --><div id=\"rdp-ebb-mw-content-text\" lang=\"en\" dir=\"ltr\" class=\"mw-content-ltr\"><div class=\"mw-parser-output\"><p><b>Title<\/b>: <i>Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering<\/i>\n<\/p><p><b>Author for citation<\/b>: Joe Liscouski, with editorial modifications by Shawn Douglas\n<\/p><p><b>License for content<\/b>: 
<a rel=\"external_link\" class=\"external text\" href=\"https:\/\/creativecommons.org\/licenses\/by\/4.0\/\" target=\"_blank\">Creative Commons Attribution 4.0 International<\/a>\n<\/p><p><b>Publication date<\/b>: December 2020\n<\/p>\n\n\n<h2><span class=\"mw-headline\" id=\"Introduction\">Introduction<\/span><\/h2>\n<p>What separates successful advanced <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory\" title=\"Laboratory\" class=\"wiki-link\" data-key=\"c57fc5aac9e4abf31dccae81df664c33\">laboratories<\/a> from all the others? It's largely their ability to meet their goals, with the effective use of resources: people, time, money, equipment, data, and <a href=\"https:\/\/www.limswiki.org\/index.php\/Information\" title=\"Information\" class=\"wiki-link\" data-key=\"6300a14d9c2776dcca0999b5ed940e7d\">information<\/a>. The fundamental goals of laboratory work haven\u2019t changed, but they are under increased pressure to do more and do it faster, with a better return on investment (ROI). Laboratory managers have turned to electronic technologies (e.g., computers, networks, robotics, microprocessors, database systems, etc.) to meet those demands. However, without effective planning, technology management, and education, those technologies will only get labs part of the way to meeting their needs. We need to learn how to close the gap between getting part-way there and getting where we need to be. The practice of science has changed; we need to meet that change to be successful.\n<\/p><p>This document was written to get people thinking more seriously about the technologies used in laboratory work and how those technologies contribute to meeting the challenges labs are facing. There are three primary concerns:\n<\/p>\n<ol><li>The need for planning and management: When digital components began to be added to lab systems, it was a slow incremental process: integrators and microprocessors grew in capability as the marketplace accepted them. That development gave us the equipment we have now, equipment that can be used in isolation or in a networked, integrated system. In either case, they need attention in their application and management to protect electronic laboratory data, ensure that it can be effectively used, and ensure that the systems and products put in place are both the right ones, and that they fully contribute to improvements in lab operations.<\/li>\n<li>The need for more laboratory systems engineers (LSEs): There is increasing demand for people who have the education and skills needed to accomplish the points above and provide research and testing groups with the support they need.<sup id=\"rdp-ebb-cite_ref-1\" class=\"reference\"><a href=\"#cite_note-1\">[a]<\/a><\/sup><\/li>\n<li>The need to collaborate with vendors: In order to develop the best products needed for laboratory work, vendors should be provided more user input. Too often vendors have an idea for a product or modifications to existing products, yet they lack a fully qualified audience to bounce ideas off of. With the planning in the first concern in place, we should be able to approach vendors and say, with confidence, \"this is what is needed\" and explain why.<\/li><\/ol>\n<p>If the audience for this work were product manufacturing or production facilities, everything that was being said would have been history. The efficiency and productivity of production operations directly impacts profitability and customer satisfaction; the effort to optimize operations would have been an essential goal. 
When it comes to laboratory operations, that same level of attention must be in place to accelerate laboratory research and testing operations, reducing cost and improving productivity. Aside from a few lab installations in large organizations, that attention isn\u2019t given, because people aren\u2019t educated as to its importance. The purpose of this work is to present ideas about which laboratory technology challenges can be addressed through planning activities, using a series of goals.\n<\/p><p>This material is an expansion upon two presentations:\n<\/p>\n<ul><li>\"<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.13140\/RG.2.2.24722.40645\" target=\"_blank\">Laboratory Technology Management & Planning<\/a>,\" 2nd Annual Lab Asset & Facility Management in Pharma 2019, San Diego, CA, October 22, 2019<\/li>\n<li>\"<a rel=\"external_link\" class=\"external text\" href=\"https:\/\/go.labmanager.com\/webinar-2020-digital-technologies\" target=\"_blank\">How Digital Technologies are Changing the Landscape of Lab Operations<\/a>,\" <i>Lab Manager<\/i> webinar, April 2020<\/li><\/ul>\n<h3><span class=\"mw-headline\" id=\"Directions_in_lab_operations\">Directions in lab operations<\/span><\/h3>\n<h4><span class=\"mw-headline\" id=\"The_lab_of_the_future\">The lab of the future<\/span><\/h4>\n<p>People often ask what the lab of the future (LOF) is going to look like, as if there were a design or model that we should be aspiring toward. There isn\u2019t. Your lab's future is in your hands to mold, a blank sheet of paper upon which you define that future by setting objectives, developing a functional physical and digital architecture, planning processes and implementations, and managing technology that supports both scientific and laboratory <a href=\"https:\/\/www.limswiki.org\/index.php\/Information_management\" title=\"Information management\" class=\"wiki-link\" data-key=\"f8672d270c0750a858ed940158ca0a73\">information management<\/a>. If that sounds scary, it\u2019s understandable. But you must take the time to educate yourself and bring in people (e.g., LSEs, consultants, etc.) who can assist you.\n<\/p><p>Too often, if vendors and consultants are asked what the LOF is going to look like, the response lines up with their corporate interests. No one knows what the LOF is because there isn\u2019t a singular future, but rather different futures for different types of labs. (Just think of all the different scientific disciplines that exist; one future doesn\u2019t fit all.) Your lab's future is in your hands. What do you want it to be?\n<\/p><p>The material in this document isn\u2019t intended to define your LOF, but to help you realize it once the framework has been created, and you are in the best position to create it. As you create that framework, you'll be asking:\n<\/p>\n<ol><li>Are you satisfied with your lab's operations? What works and what doesn\u2019t? What needs fixing and how shall it be prioritized?<\/li>\n<li>Has management raised any concerns?<\/li>\n<li>What do those working in the lab have to say?<\/li>\n<li>How is your lab going to change in the next one to five years?<\/li>\n<li>Does your industry have a working group for lab operations, computing, and automation?<\/li><\/ol>\n<p>Adding to question five, many companies tend to keep the competition at arm's length, minimizing contact for fear of divulging confidential information. 
However, if practically everyone is using the same set of test procedures from a trusted neutral source (e.g., <a href=\"https:\/\/www.limswiki.org\/index.php\/ASTM_International\" title=\"ASTM International\" class=\"wiki-link\" data-key=\"dfeafbac63fa786e77b472c3f86d07ed\">ASTM International<\/a>, United States Pharmacopeia, etc.), there\u2019s nothing confidential there. Instead of developing automated versions of the same procedure independently, companies can join forces, spread the cost, and perhaps come up with a better solution. With that effort as a given, you collectively have something to approach the vendor community with and say \u201cwe need this modification or new product.\u201d This is particularly beneficial to the vendor when they receive a vetted product requirements document to work from.\n<\/p><p>Again, you don\u2019t wait for the lab of the future to happen, you create it. If you want to see the direction lab operations in the future can take, look to the manufacturing industry: it has everything from flexible manufacturing to cooperative robotics<sup id=\"rdp-ebb-cite_ref-BourneMyBoss13_2-0\" class=\"reference\"><a href=\"#cite_note-BourneMyBoss13-2\">[1]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-CookCollab20_3-0\" class=\"reference\"><a href=\"#cite_note-CookCollab20-3\">[2]<\/a><\/sup>, and more.<sup id=\"rdp-ebb-cite_ref-4\" class=\"reference\"><a href=\"#cite_note-4\">[b]<\/a><\/sup> This is appropriate in both basic and applied research, as well as <a href=\"https:\/\/www.limswiki.org\/index.php\/Quality_control\" title=\"Quality control\" class=\"wiki-link\" data-key=\"1e0e0c2eb3e45aff02f5d61799821f0f\">quality control<\/a>.\n<\/p><p>Both manufacturing and lab work are process-driven with a common goal: a high-quality product whose quality can be defended through appeal to process and <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_integrity\" title=\"Data integrity\" class=\"wiki-link\" data-key=\"382a9bb77ee3e36bb3b37c79ed813167\">data integrity<\/a>.\n<\/p><p>Lab work can be broadly divided into two activities, with parallels to manufacturing: experimental procedure development (akin to manufacturing process development) and procedure execution (product production). (Note: Administrative work is part of lab operations but not an immediate concern here.) As such, we have to address the fact that lab work is part original science and part production work based on that science, e.g., as seen with quality control, clinical chemistry, and high-throughput screening labs. The routine production work of these and other labs can benefit most from <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_automation\" title=\"Laboratory automation\" class=\"wiki-link\" data-key=\"0061880849aeaca05f8aa27ae171f331\">automation<\/a> efforts. We need to think more broadly about the use of automation technologies\u2014driving their development\u2014instead of waiting to see what vendors develop. 
\n<\/p><p>Where manufacturing and lab work differ is in the scale of the work environment, the nature of the workstation equipment, the skills needed to carry out the work, and the adaptability of those doing the work to unexpected situations.\n<\/p><p>My hope is that this guide will get laboratory managers and other stakeholders to begin thinking more about planning and technology management, as well as the need for more education in that work.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Trends_in_science_applications\">Trends in science applications<\/span><\/h4>\n<p>If new science isn\u2019t being developed, vendors will add digital hardware and software technology to existing equipment to improve capabilities and ease-of-use, separating themselves from the competition. However, there is still an obvious need for an independent organization to evaluate that technology (i.e., the lab version of <i>Consumer Reports<\/i>); as is, that evaluation process, done properly, would be time-consuming for individual labs and would require a consistent methodology. With the increased use of automation, we need to do this better, such that the results can be used more widely (rather than every lab doing their own thing) and with more flexibility, using specialized equipment designed for automation applications.\n<\/p><p><a href=\"https:\/\/www.limswiki.org\/index.php\/Artificial_intelligence\" title=\"Artificial intelligence\" class=\"wiki-link\" data-key=\"0c45a597361ca47e1cd8112af676276e\">Artificial intelligence<\/a> (AI) and <a href=\"https:\/\/www.limswiki.org\/index.php\/Machine_learning\" title=\"Machine learning\" class=\"wiki-link\" data-key=\"79aab39cfa124c958cd1dbcab3dde122\">machine learning<\/a> (ML) are two other trending topics, but they are not quite ready for widespread real-world applications. However, modern examples still exist:\n<\/p>\n<ul><li>Having a system that can bring up all relevant information on a research question\u2014a sort of super Google\u2014or a variation of IBM\u2019s Watson could have significant benefits.<\/li>\n<li>Analyzing complex data or large volumes of data could be beneficial, e.g., the analysis of radio astronomy data to find fast radio bursts (FRB).<sup id=\"rdp-ebb-cite_ref-HsuIsIt18_5-0\" class=\"reference\"><a href=\"#cite_note-HsuIsIt18-5\">[3]<\/a><\/sup><\/li>\n<li>\"[A] team at Glasgow University has paired a machine-learning system with a robot that can run and analyze its own chemical reaction. The result is a system that can figure out every reaction that's possible from a given set of starting materials.\"<sup id=\"rdp-ebb-cite_ref-TimmerAIPlus18_6-0\" class=\"reference\"><a href=\"#cite_note-TimmerAIPlus18-6\">[4]<\/a><\/sup><\/li>\n<li>HelixAI is using Amazon's Alexa as a digital assistant for laboratory work.<sup id=\"rdp-ebb-cite_ref-HelixAIHome_7-0\" class=\"reference\"><a href=\"#cite_note-HelixAIHome-7\">[5]<\/a><\/sup><\/li><\/ul>\n<p>However, there are problems with using these technologies. ML systems have been shown to be susceptible to biases in their output, depending on the nature and quality of the training materials. As for AI, at least in the public domain, we really don\u2019t know what it is, and what we think it is keeps changing as purported examples emerge. One large problem for lab use is whether or not you can trust an AI's output. 
We are used to the idea that lab systems and methods have to be validated before they are trusted, so how do you validate a system based on ML or AI?\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Education\">Education<\/span><\/h3>\n<p>The major issue in all of this is having people educated to the point where they can successfully handle the planning and management of laboratory technology. One key point: most lab management programs focus on personnel issues, but managers also have to understand the capabilities and limitations of information technology and automation systems.\n<\/p><p>One result of the <a href=\"https:\/\/www.limswiki.org\/index.php\/COVID-19\" class=\"mw-redirect wiki-link\" title=\"COVID-19\" data-key=\"da9bd20c492b2a17074ad66c2fe25652\">COVID-19<\/a> pandemic is that we are seeing the limitations of the four-year undergraduate degree program in science and engineering, as well as the state of remote learning. With the addition of information technologies, general systems thinking and modeling<sup id=\"rdp-ebb-cite_ref-8\" class=\"reference\"><a href=\"#cite_note-8\">[c]<\/a><\/sup>, statistical experimental design, and statistical process control have become multidisciplinary fields. We need options for continuing education throughout people\u2019s careers so they can maintain their competence and learn new material as needed.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Making_laboratory_informatics_and_automation_work\">Making laboratory informatics and automation work<\/span><\/h2>\n<p>Making <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_informatics\" title=\"Laboratory informatics\" class=\"wiki-link\" data-key=\"00edfa43edcde538a695f6d429280301\">laboratory informatics<\/a> and automation work? \"Isn\u2019t that a job for IT or lab personnel?\" someone might ask. One of the problems in modern science is the development of specialists in disciplines. The laboratory and IT fields have many specialties, and specialists can be very good within those areas while at the same time not having an appreciation of wider operational issues. Topics like lab operations, technology management, and planning aren\u2019t covered in formal education courses, and they're often not well-covered in short courses or online programs.\n<\/p><p>\u201cMaking it work\u201d depends on planning performed at a high enough level in the organization to encompass all affected facilities and departments, including information technology (IT) and facilities management. This wider perspective gives us the potential for synergistic operations across labs, consistent policies for facilities management and IT, and more effective use of outside resources (e.g., lab information technology support staff [LAB-IT], laboratory automation engineers [LAEs]<sup id=\"rdp-ebb-cite_ref-9\" class=\"reference\"><a href=\"#cite_note-9\">[d]<\/a><\/sup>, equipment vendors, etc.). \n<\/p><p>We need to apply the same diligence to planning lab operations as we do any other critical corporate resource. Planning provides a structure for enabling effective and successful lab operations.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Introduction_to_this_section\">Introduction to this section<\/span><\/h3>\n<p>The common view of science laboratories is that of rooms filled with glassware, lab benches, and instruments being used by scientists to carry out experiments. 
While this is a reasonable perspective, what isn\u2019t as visually obvious is the end result of that work: the development of knowledge, information, and data.\n<\/p><p>The progress of laboratory work\u2014as well as the planning, documentation, and analytical results related to that work\u2014has been recorded in paper-based <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_notebook\" title=\"Laboratory notebook\" class=\"wiki-link\" data-key=\"be60c7be96aba8e9a84537fd8835fa54\">laboratory notebooks<\/a> for generations, and people are still using them today. However, these aren't the only paper records that have existed and are still in use; scientists also depend on charts, log books, administrative records, reports, indexes, and reference material. The latter half of the twentieth century introduced electronics into the lab, and with it electronic recording in the form of computers and data storage systems. Early adopters of these technologies had to extend their expertise into the information technology realm because there were few people who understood both these new devices and their application to lab work\u2014you had to be an expert in both laboratory science and computer science.\n<\/p><p>In the 1980s and 90s, computers became commonplace; where once you had to understand hardware, software, operating systems, programming, and application packages, you simply had to know how to turn them on. No more impressive arrays of blinking lights, just a blinking cursor waiting for you to do something.\n<\/p><p>As systems gained ease-of-use, however, we lost the basic understanding of what these systems were and what they did, that they had faults, and that if we didn\u2019t plan for their effective use and counter those faults, we were opening ourselves to unpleasant surprises. The consequences at times were system crashes, lost data, and a lack of a real understanding of how the output of an instrument was transformed into a set of numbers, which meant we couldn\u2019t completely account for the results we were reporting. \n<\/p><p>We need to step back, take control, and institute effective technology planning and management, with appropriate corresponding education, so that the various data we are putting into laboratory informatics technologies produce the desired outcomes. We need to ensure that these technologies are providing a foundation for improving laboratory operations efficiency and a solid return on investment (ROI), while substantively advancing your business's ability to work and be productive. That's the purpose of the work we'll be discussing.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"The_point_of_planning\">The point of planning<\/span><\/h4>\n<p>The point of planning and technology management is pretty simple: to ensure ...\n<\/p>\n<ul><li>that the right technologies are in people's hands when they need them, and<\/li>\n<li>that those technologies complement each other as much as possible.<\/li><\/ul>\n<p>These are straightforward statements with a lot packed into them.\n<\/p><p>Regarding the first point, the key words are \u201cthe right technologies.\u201d In order to define what that means, lab personnel have to understand the technologies in question and how they apply to their work. If those personnel have used or were taught about the technologies under consideration, it should be easy enough to do. However, laboratory informatics doesn\u2019t fall into that basket of things. The level of understanding has to be more than superficial. 
While personnel don\u2019t have to be software developers, they do have to understand what is happening within informatics systems, and how data processing handles their data and produces results. Determining the \u201cright technologies\u201d depends on the quality and depth of education possessed by lab personnel, and eventually by lab information technology support staff (LAB-IT) as they become involved in the selection process.\n<\/p><p>The second point also has a lot buried inside it. Lab managers and personnel are used to specifying and purchasing items (e.g., instruments) as discrete tools. When it comes to laboratory informatics, we\u2019re working with things that connect to each other, in addition to performing a task. When we explore those connections, we need to assess how they are made, what we expect to gain, what compatibility issues exist, how to support them, how to upgrade them, what their life cycle is, etc. Most of the interconnected devices people encounter in their daily lives were designed to be connected using a limited set of choices; the vendors know what those choices are and make connecting easy, or otherwise their products won\u2019t sell. The laboratory technology market, on the other hand, is too open-ended. The options for physical connections might be there, but are they the right ones, and will they work? Do you have a good relationship with your IT people, and are they able to help (not a given)? Again, education is a major factor.\n<\/p>\n<h4><span id=\"rdp-ebb-Who_is_responsible_for_laboratory_technology_planning_and_management_(TPM)?\"><\/span><span class=\"mw-headline\" id=\"Who_is_responsible_for_laboratory_technology_planning_and_management_.28TPM.29.3F\">Who is responsible for laboratory technology planning and management (TPM)?<\/span><\/h4>\n<p>When asking who is responsible for TPM, the question really is \"who are the TPM stakeholders,\" or \"who has a vested interest in seeing TPM prove successful?\"\n<\/p>\n<ul><li>Corporate or organizational management: These stakeholders set priorities and authorize funding, while also rationalizing and coordinating goals between groups. Unless the organization has a strong scientific base, they may not appreciate the options and benefits of TPM in lab work, or the possibilities of connecting the lab into the rest of the corporate data structure.<\/li>\n<li>Laboratory management: These stakeholders are responsible for developing and implementing plans, as well as translating corporate goals into lab priorities.<\/li>\n<li>Laboratory personnel: These stakeholders are the ones who actually do the work. They are also in the best position to understand where technologies can be applied. 
They would also be relied on to provide user requirements documents for new projects and meet both internal and external (e.g., <a href=\"https:\/\/www.limswiki.org\/index.php\/Food_and_Drug_Administration\" title=\"Food and Drug Administration\" class=\"wiki-link\" data-key=\"e2be8927071ac419c0929f7aa1ede7fe\">Food and Drug Administration<\/a> [FDA], <a href=\"https:\/\/www.limswiki.org\/index.php\/United_States_Environmental_Protection_Agency\" title=\"United States Environmental Protection Agency\" class=\"wiki-link\" data-key=\"877b052e12328aa52f6f7c3f2d56f99a\">Environmental Protection Agency<\/a> [EPA], <a href=\"https:\/\/www.limswiki.org\/index.php\/International_Organization_for_Standardization\" title=\"International Organization for Standardization\" class=\"wiki-link\" data-key=\"116defc5d89c8a55f5b7c1be0790b442\">International Organization for Standardization<\/a> [ISO], etc.) performance guidelines.<\/li>\n<li>IT management and their support staff: While these stakeholders' traditional role is the support of computers, connected devices (e.g., printers, etc.) and network infrastructure, they may also be the first line of support for computers connected to lab equipment. IT staff either need to be educated to meet that need and support lab personnel, or have additional resources available to them. They may also be asked to participate in planning activities as subject matter experts on computing hardware and software.<\/li>\n<li>LAB-IT specialists: These stakeholders act as the \"additional resources\" alluded to in the previous point. These are crossover specialists that span the lab and IT spaces and can provide informed support to both. In most organizations, aside from large science-based companies, this isn\u2019t a real \"position,\" although once stated, its role is immediately recognized. In the past, I\u2019ve also referenced these stakeholders as being \u201claboratory automation engineers.\u201d<sup id=\"rdp-ebb-cite_ref-LiscouskiAreYou06_10-0\" class=\"reference\"><a href=\"#cite_note-LiscouskiAreYou06-10\">[6]<\/a><\/sup><\/li>\n<li>Facility management: These stakeholders need to ensure that the facilities support the evolving state of laboratory workspace requirements as traditional formats change to support robotics, instrumentation, computers, material flow, power, and HVAC requirements.<\/li><\/ul>\n<p>Carrying out this work is going to rely heavily on expanding the education of those participating in the planning work; the subject matter goes well beyond material covered in degree programs.\n<\/p>\n<h4><span id=\"rdp-ebb-Why_put_so_much_effort_into_planning_and_technology_management?\"><\/span><span class=\"mw-headline\" id=\"Why_put_so_much_effort_into_planning_and_technology_management.3F\">Why put so much effort into planning and technology management?<\/span><\/h4>\n<p>Earlier we mentioned paper laboratory notebooks, the most common recording device since scientific research began (although for sheer volume, it may have been eclipsed by computer hard drives). Have you ever wondered about the economics of laboratory notebooks? Cost is easy to understand, but the value of the data and information that is recorded there requires further explanation.\n<\/p><p>The value of the material recorded in a notebook depends on two key factors: the quality of the work and an inherent ability to put that documented work to use. The quality of the work is a function of those doing the work, how diligent they are, and the veracity of what has been written down. 
The inherent ability to use it depends upon the clarity of the writing, people\u2019s ability to understand it without recourse to the author, and access to the material. That last point is extremely important. Just by glancing at Figure 1, you can figure out where this is going.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig1_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"18969a8a26ae94a6de9754c2cd6aa45e\"><img alt=\"Fig1 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/f\/f1\/Fig1_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 1.<\/b> Paper notebooks' cost vs. value, as a function of usage<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>As a scientist\u2019s notebook fills with entries, it gains value because of the content. Once filled, it reaches an upper limit and is placed in a library. There it takes a slight drop in value because its ease-of-access has changed; it isn\u2019t readily at hand. As library space fills, the notebooks are moved to secondary storage (in one company I worked at, secondary storage consisted of trailers in a parking lot). Costs go up due to the expense of owning or renting the secondary storage and the space the notebooks take. The object's value drops, not because of the content but due to the difficulty in retrieving that content (e.g., which trailer? which box?). Unless the project is still active, the normal turnover of personnel (e.g., via promotions, movement around the company, leaving the company) means that institutional memory diminishes and people begin to forget the work exists. If few researchers can remember it, find it, and access it, the value drops regardless of the resources that went into the work. That is compounded by the potential for physical deterioration of the object (e.g., water damage, mice, etc.).\n<\/p><p>Preventing the loss of access to the results of your investment in R&D projects will rely on information technology. That reliance will be built upon planning an effective informatics environment, which is precisely where this discussion is going. How is putting your lab results into a computer system any different from using a paper-based laboratory notebook? There are obvious things like faster searching and so on, but from our previous discussion, not much is different; you still have essentially a single point of failure, unless you plan for that eventuality. That is the fundamental difference and what will drive the rest of this writing: \n<\/p>\n<dl><dd>Planning builds in reliability, security, and protection against loss. (Oh, and it allows us to work better, too!)<\/dd><\/dl>\n<p>You could plan for failure in a paper-based system by making copies, but those copies still represent paper that has to be physically managed. With electronic systems, we can plan for failure by using automated <a href=\"https:\/\/www.limswiki.org\/index.php\/Backup\" title=\"Backup\" class=\"wiki-link\" data-key=\"e12548e6bf5f28bfee99099fe8662dde\">backup<\/a> procedures that make faithful copies, as many as we\u2019d like, at low cost.\n<\/p>
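<p>As a minimal illustration of that point, the following Python sketch copies a file to a backup location and then verifies the copy against the original with a checksum. It is a sketch under stated assumptions: the paths are made up, and SHA-256 plus a plain file copy stand in for whatever backup tooling a lab actually adopts.\n<\/p>\n<pre>
import hashlib
import shutil
from pathlib import Path

def sha256_of(path):
    # Hash the file in chunks so large data files do not exhaust memory.
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(65536), b''):
            digest.update(chunk)
    return digest.hexdigest()

def backup_and_verify(source, backup_dir):
    # Copy 'source' into 'backup_dir', then confirm the copy is faithful
    # by comparing the checksums of the original and the duplicate.
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    target = backup_dir / Path(source).name
    shutil.copy2(source, target)  # copy2 also preserves file timestamps
    if sha256_of(source) != sha256_of(target):
        raise IOError('backup verification failed: ' + str(source))
    return target

# Hypothetical use; the paths are placeholders, not a recommendation:
# backup_and_verify('notebooks/project_a_2020.pdf', '/mnt/lab_backup')
<\/pre>\n<p>The verification step matters as much as the copy itself; a backup that hasn't been checked is a hope, not a plan.\n<\/p>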
<p>This issue isn\u2019t unique to laboratory notebooks, but it is a problem for any organization that depends on paper records.\n<\/p><p>The difference between writing on paper and using electronic systems isn\u2019t limited to how the document is realized. If you were to use a typewriter, the characters would show up on the paper and you'd be able to read them; all you needed was the ability to read (which could include braille formats) and understand what was written. However, if you were using a word processor, the keystrokes would be captured by software, displayed on the screen, placed in the computer\u2019s memory, and then written to storage. If you want to read the file, you need something\u2014software\u2014to retrieve it from storage, interpret the file contents, determine how to display it, and then display it. Without that software the file is useless. A complete backup process has to include the software needed to read the file, plus all the underlying components that it depends upon. You could correctly argue that the hardware is required as well, but there are economic tradeoffs as well as practical ones; you could transfer the file to other hardware and read it there, for example. \n<\/p><p>That point brings us to the second subject of this writing: technology management. What do I have to do to make sure that I have the right tools to enable me to work? The problem is simple enough when all you're concerned with is writing and preparing accompanying graphics. Upon shifting the conversation to laboratory computing, it gets more complicated. Rather than being concerned with one computer and a few software packages, you have computers that acquire and process data in real-time<sup id=\"rdp-ebb-cite_ref-11\" class=\"reference\"><a href=\"#cite_note-11\">[e]<\/a><\/sup>, transmit it to other computers for storage in databases, and systems that control <a href=\"https:\/\/www.limswiki.org\/index.php\/Sample_(material)\" title=\"Sample (material)\" class=\"wiki-link\" data-key=\"7f8cd41a077a88d02370c02a3ba3d9d6\">sample<\/a> processing and administrative work. Not only do the individual computer systems and the equipment and people they support have to work well, but they also have to work cooperatively, and that is why we have to address planning and technology management in laboratory work.\n<\/p><p>That brings us to a consideration of what lab work is all about.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Different_ways_of_looking_at_laboratories\">Different ways of looking at laboratories<\/span><\/h2>\n<p>When you think about a \u201claboratory,\u201d a lot depends on your perspective: are you on the outside looking in, do you work in a lab, or are you taking that high school chemistry class? When someone walks into a science laboratory, the initial impression is that of a confusing collection of stuff, unless they're familiar with the setting. \u201cStuff\u201d can consist of instruments, glassware, tubing, robots, incubators, refrigerators and freezers, and even petri dishes, cages, fish tanks, and more depending on the kind of work that is being pursued.\n<\/p><p>From a corporate point of view, a \"laboratory\" can appear differently and have different functions. Possible corporate views of the laboratory include:\n<\/p>\n<ol><li>A laboratory is where questions are studied, which may support other projects or provide a source of new products, acting as basic and applied R&D. 
What is expected out of these labs is the development of new knowledge, usually in the form of reports or other documentation that can move a project forward.<\/li>\n<li>A laboratory acts as a research testing facility (e.g., analytical, physical properties, mechanical, electronics, etc.) that supports research and manufacturing through the development of new test methods, special analysis projects, troubleshooting techniques, and both routine and non-routine testing. The laboratory's results come in the form of reports, test procedures, and other types of documented information.<\/li>\n<li>A laboratory acts as a <a href=\"https:\/\/www.limswiki.org\/index.php\/Quality_assurance\" title=\"Quality assurance\" class=\"wiki-link\" data-key=\"2ede4490f0ea707b14456f44439c0984\">quality assurance<\/a>\/<a href=\"https:\/\/www.limswiki.org\/index.php\/Quality_control\" title=\"Quality control\" class=\"wiki-link\" data-key=\"1e0e0c2eb3e45aff02f5d61799821f0f\">quality control<\/a> (QA\/QC) facility that provides routine testing, producing information in support of production facilities. This can include incoming materials testing, product testing, and product certification.<\/li><\/ol>\n<p>Typically, stakeholders outside the lab are looking for some form of result that can be used to move projects and other work forward. They want it done quickly and at low cost, but also want the work to be of high quality and reliability. Those considerations help set the goals for lab operations.\n<\/p><p>Within the laboratory there are two basic operating modes or <a href=\"https:\/\/www.limswiki.org\/index.php\/Workflow\" title=\"Workflow\" class=\"wiki-link\" data-key=\"92bd8748272e20d891008dcb8243e8a8\">workflows<\/a>: project-driven or task-driven work. With project-driven workflows, a project goal is set, experiments are planned and carried out, the results are evaluated, and a follow-up course of action is determined. This all requires careful documentation for the planning and execution of lab work. This can also include developing and revising standard operating procedures (SOPs). Task-driven workflows, on the other hand, essentially depend on the specific steps of a process. A collection of samples needs to be processed according to an SOP, and the results recorded. Depending upon the nature of the SOP and the number of samples that have to be processed, the work can be done manually, using instruments, or with partial or full automation, including robotics. With the exception of QA\/QC labs, a given laboratory can use a combination of these modes or workflows over time as work progresses and the internal\/external resources become available. QA\/QC labs are almost exclusively task-driven; contract testing labs are as well, although they may take on project-driven work.\n<\/p><p>Within the realm of laboratory informatics, project-focused work centers on the <a href=\"https:\/\/www.limswiki.org\/index.php\/Electronic_laboratory_notebook\" title=\"Electronic laboratory notebook\" class=\"wiki-link\" data-key=\"a9fbbd5e0807980106763fab31f1e72f\">electronic laboratory notebook<\/a> (ELN), which can be described as a lab-wide diary of work and results. 
Task-driven work is organized around the <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_information_management_system\" title=\"Laboratory information management system\" class=\"wiki-link\" data-key=\"8ff56a51d34c9b1806fcebdcde634d00\">laboratory information management system<\/a> (LIMS)\u2014or <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_information_system\" title=\"Laboratory information system\" class=\"wiki-link\" data-key=\"37add65b4d1c678b382a7d4817a9cf64\">laboratory information system<\/a> (LIS) in clinical lab settings\u2014which can be viewed as a workflow manager of tests to be done, results to be recorded, and analyses to be finalized. Both of these technologies replaced the paper-based laboratory notebook discussed earlier, coming with considerable improvements in productivity. And although ELNs are considerably more expensive than paper systems, the short- and long-term benefits of an ELN overshadow that cost issue.\n<\/p>\n<h2><span id=\"rdp-ebb-Labs_in_transition,_from_manual_operation_to_modern_facilities\"><\/span><span class=\"mw-headline\" id=\"Labs_in_transition.2C_from_manual_operation_to_modern_facilities\">Labs in transition, from manual operation to modern facilities<\/span><\/h2>\n<div class=\"thumb tright\"><div class=\"thumbinner\" style=\"width:502px;\"><a href=\"https:\/\/www.limswiki.org\/index.php\/File:PSM_V43_D075_Chemical_laboratory.jpg\" class=\"image wiki-link\" data-key=\"f30aee78def64774a6987a368702b555\"><img alt=\"\" src=\"https:\/\/upload.wikimedia.org\/wikipedia\/commons\/b\/ba\/PSM_V43_D075_Chemical_laboratory.jpg\" decoding=\"async\" class=\"thumbimage\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a> <div class=\"thumbcaption\"><div class=\"magnify\"><a href=\"https:\/\/www.limswiki.org\/index.php\/File:PSM_V43_D075_Chemical_laboratory.jpg\" class=\"internal wiki-link\" title=\"Enlarge\" data-key=\"f30aee78def64774a6987a368702b555\"><\/a><\/div><b>Figure 2.<\/b> Chemical laboratory at Oswego SUNY, 1893<\/div><\/div><\/div><p>Laboratories didn\u2019t start with lots of electronic components; they began with people, lab benches, glassware, Bunsen burners, and other equipment. Lab operations were primarily concerned with people's ability to work. The technology was fairly simple by today\u2019s standards (Figure 2), and an individual\u2019s skills were the driving factor in producing quality results.\n<\/p><p>For the most part, the skills you learned in school were the skills you needed to be successful here as far as technical matters went; management education was another issue. That changed when electronic instrumentation became available. 
Analog instruments such as scanning <a href=\"https:\/\/www.limswiki.org\/index.php\/Spectrophotometer\" title=\"Spectrophotometer\" class=\"wiki-link\" data-key=\"6382bb48c914f3c490400c13f9eb16e6\">spectrophotometers<\/a>, <a href=\"https:\/\/www.limswiki.org\/index.php\/Chromatography\" title=\"Chromatography\" class=\"wiki-link\" data-key=\"2615535d1f14c6cffdfad7285999ad9d\">chromatographs<\/a>, <a href=\"https:\/\/www.limswiki.org\/index.php\/Mass_spectrometry\" title=\"Mass spectrometry\" class=\"wiki-link\" data-key=\"fb548eafe2596c35d7ea741849aa83d4\">mass spectrometers<\/a>, differential scanning <a href=\"https:\/\/www.limswiki.org\/index.php\/Calorimeter\" title=\"Calorimeter\" class=\"wiki-link\" data-key=\"af2c641f435c259a996f13fc0c612eed\">calorimeters<\/a>, tensile testers, and so on introduced a new career path to laboratory work: the instrument specialist, who combined an understanding of the basic science with an understanding of the instrument\u2019s design, as well as how to use it (and modify it where needed), maintain it, troubleshoot issues, and analyze the results. Specialization created a problem for schools: they couldn\u2019t afford all the equipment, find knowledgeable instructors, or make room in the curriculum for the expanding subject matter. Schools were no longer able to educate people to meet the requirements of industry and graduate-level academia. And then digital electronics happened. Computers first became attached to instruments, and then incorporated into the instrumentation.<sup id=\"rdp-ebb-cite_ref-12\" class=\"reference\"><a href=\"#cite_note-12\">[f]<\/a><\/sup>\n<\/p><p>The addition of computer hardware and software to an instrument increased the depth of specialization in those techniques. Not only did you have to understand the science noted above, but also the use of computer programs to work with the instrument, how to collect the data, and how to perform the analysis. An entirely new layer of skills was added to an already complex subject.\n<\/p><p>The latest level of complexity added to laboratory operations has been the incorporation of LIMS, ELNs, <a href=\"https:\/\/www.limswiki.org\/index.php\/Scientific_data_management_system\" title=\"Scientific data management system\" class=\"wiki-link\" data-key=\"9f38d322b743f578fef487b6f3d7c253\">scientific data management systems<\/a> (SDMS), and <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_execution_system\" title=\"Laboratory execution system\" class=\"wiki-link\" data-key=\"774bdcab852f4d09565f0486bfafc26a\">laboratory execution systems<\/a> (LES) either as stand-alone modules or combined into more integrated packages or \"platforms.\"\n<\/p>\n<h3><span id=\"rdp-ebb-There's_a_plan_for_that?\"><\/span><span class=\"mw-headline\" id=\"There.27s_a_plan_for_that.3F\">There's a plan for that?<\/span><\/h3>\n<p>It is rare to find a lab that has an informatics plan or strategy in place before the first computer comes through the door; those machines enter as part of an instrument-computer control system. Several computers may use that route to become part of the lab's technology base before people realize that they need to start taking lab computing seriously, including how to handle backups, maintenance, support, etc.\n<\/p><p>First computers come into the lab, and then the planning begins, often months later, as an incremental planning effort, which is the complete reverse of how things need to be developed. 
Planning is essential as soon as you decide that a lab space will be created. That almost never happens, in part because no one has told you that it is required, let alone why or how to go about it.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Thinking_about_a_model_for_lab_operations\">Thinking about a model for lab operations<\/span><\/h3>\n<p>The basic purpose of laboratory work is to answer questions. \u201cHow do we make this work?\u201d \u201cWhat is it?\u201d \u201cWhat\u2019s the purity of this material?\u201d These questions and others like them occur in chemistry, physics, and the biological sciences. Answering those questions is a matter of gathering data and information through observation and experimental work, organizing it, analyzing it, and determining the next steps needed as the work moves forward (Figure 3). Effective organization is essential, as lab personnel will need to search data and information, extract it, move it from one data system to another for analysis, make decisions, update planning, and produce interim and ultimately final reports.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig3_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"45986849e53d67a4c64847724f493759\"><img alt=\"Fig3 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/56\/Fig3_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 3.<\/b> Simplified flow of data\/information from sources to collection system<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Once the planning is done, scientific work generally begins with collecting observations and measurements (Data\/Information Sources 1\u20134, Figure 3) from a variety of sources. Lab bench work usually involves instrumentation, and many instruments have computer controls and data systems as part of them. This is the more visible part of lab work and the one that matches people\u2019s expectations for a \u201cscientific lab.\u201d This is where most of the money is spent on equipment, materials, and people\u2019s expertise and time. All that expenditure of resources results in \u201cthe pH of the glowing liquid is 6.5,\u201d \u201cthe concentration of iron in the icky stuff is 1500 ppm,\u201d and so on. That\u2019s the end result of all those resources, time, and effort put into the scientific workflow. That\u2019s why you built a million-dollar facility (in some spheres of science such as astronomy, high energy physics, and the space sciences, the cost of collection is significantly higher). So what do you do with those results? Prior to the 1970s, the collection points were paper: forms, notebooks, and other documents, all with the issues discussed earlier.\n<\/p><p>The material on those instrument data systems needs to be moved to an intermediate system for long-term storage and reference (the second step of Figure 3). This is needed because those initial data systems may fail, be replaced, or added to as the work continues. 
After all, the data and information they\u2019ve collected needs to be preserved, organized, and managed to support continued lab work.\n<\/p><p>The analyzed results need to be collected into a reference system that is the basis of long-term analysis, management\/administration work, and reporting. This last system in the flow is the central hub of lab activities; it is also the distribution point for material sent to other parts of the organization (the third and fourth stages of Figure 3). While it is natural for scientists to focus on the production of data and information, the organization and centralized management of the results of laboratory work needs to be a primary consideration. That organization will be focused on short- and long-term <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_analysis\" title=\"Data analysis\" class=\"wiki-link\" data-key=\"545c95e40ca67c9e63cd0a16042a5bd1\">data analysis<\/a> and evaluation. The results of this analysis are used to demonstrate the lab's performance towards meeting its goals, and they will show those investing in your work that you\u2019ve got your management act together, which is useful when looking for continued support.\n<\/p><p>Today, those systems come in two basic forms: LIMS and ELN. The details of those systems are the subject of a number of articles and books.<sup id=\"rdp-ebb-cite_ref-LiscouskiWhich15_13-0\" class=\"reference\"><a href=\"#cite_note-LiscouskiWhich15-13\">[7]<\/a><\/sup> Without getting into too much detail:\n<\/p>\n<ul><li>LIMS are used to support testing labs managing sample workflows and planning, as well as cataloging results (e.g., short text and numerical information).<\/li><\/ul>\n<ul><li>ELNs are usually found in research, functioning as an electronic diary of lab work for one or more scientists and technicians. The entries may contain extensive textual material, numerical entries, charts, graphics, etc. The ELN is generally more flexible than a LIMS.<\/li><\/ul>\n<p>That distinction is simplistic; some labs support both activities and need both types of systems, or even a hybrid package. However, the description is sufficient to get us to the next point: the lifespan of systems varies, depending on where you are looking in Figure 3's model. Figure 4 gives a comparison.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig4_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"71a82219dd4e0aba35959d868be597e3\"><img alt=\"Fig4 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/ad\/Fig4_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 4.<\/b> The relative lifespan of laboratory systems<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The experimental methods\/procedures used in lab work will change over time as the needs of the lab change. Older instruments may be updated and new ones introduced. Retirement is a problem, particularly if data systems are part of the equipment. You have to have access to the data. That need will live on long past the equipment's life. 
That is one reason that moving data and information to an intermediate system like an SDMS is important. However, in some circumstances, even that isn\u2019t going to be sufficient (e.g., regulated industries where the original data structures and software that generated them need to be preserved as an operating entity). In those cases, you may have old computers stacked up just in case you need access to their contents. A better way is to virtualize the systems as containers on servers that support a virtualized environment.\n<\/p><p>Virtualization\u2014making an electronic copy of a computer system and running it on a server\u2014is potentially a useful technology in lab work; while it won\u2019t participate in day-to-day activities, it does have a role. Suppose you have an instrument-data system that is being replaced or retired. Maybe the computer is showing signs of aging or failing. What do you do with the files and software that are on the computer portion of the combination? You can\u2019t dispose of them because you may need access to those data files and software later. On the other hand, do you really want to collect computer systems that have to be maintained just to have access to the data if and when you need it? Instead, virtualization is a software\/hardware technology that allows you to make a complete copy of everything that is on that computer\u2014including operating system files, applications, and data files\u2014and store it in one big file referred to as a \u201ccontainer.\u201d That container can be moved to a computer that is a virtual server and has software that emulates various operating environments, allowing the software in the container to run as if it were on its own computer hardware. A virtual server can support a lot of containers, and the operating systems in those containers can be updated as needed. The basic idea is that you don\u2019t need access to a separate physical computer; you just need the ability to run the software that was on it. If your reaction to that is one of dismay and confusion, it\u2019s time to buy your favorite IT person a cup of coffee and have a long talk. We\u2019ll get into more details when we cover data backup issues.\n<\/p><p><b>Why is this important to you?<\/b>\n<\/p><p>While the science behind producing results is the primary reason your lab exists, gaining the most value from the results is essential to the organization overall. That value is going to be governed by the quality of the results, ease of access, the ability to find and extract needed information, and a well-managed K\/I\/D architecture. All of that addresses a key point from management\u2019s perspective: return on investment, or ROI. If you can demonstrate that your data systems are well organized and maintained, and that you can easily find and use the results from experimental work to contribute to advancing the organization\u2019s goals, you\u2019ll make it easier to demonstrate solid ROI and gain funding for projects, equipment, and people needed to meet your lab's goals.\n<\/p><p><br \/>\n<\/p>\n<h2><span class=\"mw-headline\" id=\"The_seven_goals_of_planning_and_managing_lab_technologies\">The seven goals of planning and managing lab technologies<\/span><\/h2>\n<p>The preceding material described the need for planning and managing lab technologies, and making sure lab personnel are qualified and educated to participate in that work. The next step is the actual planning. 
There are at least two key aspects to that work: planning activities that are specific and unique to your lab(s), and broader-scope issues that are common to all labs. The discussion found in the rest of this guide is going to focus on the latter.\n<\/p><p>Effective planning is accomplished by setting goals and determining how you are going to achieve them. The following sections of this guide look at those goals, specifically:\n<\/p>\n<ol><li>Supporting an environment that fosters productivity and innovation<\/li>\n<li>Developing high-quality data and information<\/li>\n<li>Managing knowledge, information, and data effectively, putting them in a structure that encourages use and protects value<\/li>\n<li>Ensuring a high level of data integrity at every step<\/li>\n<li>Addressing security throughout the lab<\/li>\n<li>Acquiring and developing \"products\" that support regulatory requirements<\/li>\n<li>Addressing systems integration and harmonization<\/li><\/ol>\n<p>The material below begins the sections on goal setting. Some of these goals are obvious and understandable; others, like \u201charmonization,\u201d are less so. The goals are provided as an introduction rather than an in-depth discussion. The intent is to offer something suitable for the purpose of this material and a basis for a more detailed exploration at a later point. The intent of these goals is not to tell you how to do things, but rather what things need to be addressed. The content is provided as a set of questions that you need to think about. The answers aren't mine to give, but rather yours to develop and implement; it's your lab. In many cases, developing and implementing those answers will be a joint effort by all stakeholders.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"First_goal:_Support_an_environment_that_fosters_productivity_and_innovation\">First goal: Support an environment that fosters productivity and innovation<\/span><\/h3>\n<p>In order to successfully plan for and manage lab technologies, the business environment should ideally be committed to fostering a work environment that encourages productivity and innovation. This requires:\n<\/p>\n<ul><li>proven, supportable workflow methodologies;<\/li>\n<li>educated personnel;<\/li>\n<li>fully functional, inter-departmental cooperation;<\/li>\n<li>management buy-in; and<\/li>\n<li>systems that meet users' needs.<\/li><\/ul>\n<p>This is one of those statements that people tend to read, say \u201csure,\u201d and move on. But before you do that, let\u2019s take a look at a few points. Innovation may be uniquely human (we're not even going to consider AI here), and the ability to be \u201cinnovative\u201d may not be universal.\n<\/p><p>People need to be educated, be able to separate facts from \u201cbeliefs,\u201d and question everything (which may require management support). Innovation doesn\u2019t happen in a highly structured environment; you need the freedom to question, challenge, etc. You also need the tools to work with. The inspiration that leads to innovation can happen anywhere, anytime. All of a sudden, all the pieces fit. And then what? That is where a discussion of tools and this work come together.\n<\/p><p>If a sudden burst of inspiration hits, you want to act on it now, not after traveling to an office, particularly if it is a weekend or vacation. You need access to knowledge (e.g., documents, reports), information, and data (K\/I\/D). 
In order to do that, a few things have to be in place:\n<\/p>\n<ul><li>Business and operations K\/I\/D must be accessible.<\/li>\n<li>Systems security has to be such that a qualified user can gain access to K\/I\/D remotely, while preventing its unauthorized use.<\/li>\n<li>Qualified users must have the hardware and software tools required to access the K\/I\/D, work with it, and transmit the results of that work to whoever needs to see it.<\/li>\n<li>Qualified users must also be able to remotely initiate actions such as testing.<\/li><\/ul>\n<p>Those elements depend on a well-designed laboratory and corporate informatics infrastructure. Laboratory infrastructure is important because that is where the systems are that people need access to, and corporate infrastructure is important since corporate facilities have to provide access, controls, and security. Implementation of those corporate components has to be carefully thought through; they must be strong enough to frustrate unwarranted access (e.g., multi-factor logins) while allowing people to get real work done.\n<\/p><p>All of this requires flexibility and trust in people, an important part of corporate culture. This will become more important as society adjusts to new modes of working (e.g., working online due to a pandemic) and the realization that the fixed-format work week isn\u2019t the only way people can be productive. For example, working from home or off-site is increasingly commonplace. Laboratory professionals work in two modes: intellectual work, which can be done anywhere, and lab bench work, where physical research tasks are performed. We need to strike a balance between those modes and the need for in-person vs. virtual contact.\n<\/p><p>Let's take another look at the previous Figure 3, which offered one possible structure for organizing lab systems:\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig3_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"45986849e53d67a4c64847724f493759\"><img alt=\"Fig3 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/56\/Fig3_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 3.<\/b> Simplified flow of data\/information from sources to collection system<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>This use of an intermediate file storage system like an SDMS, and the aggregation of some instruments to a common computer (e.g., one chromatographic data system for all chromatographs vs. one per instrument), become more important for two reasons: (1) they limit the number of systems that have to be accessed to search, organize, extract, and work with K\/I\/D, and (2) they make it easier to address security concerns. There are additional reasons why this organization of lab systems is advantageous, but we\u2019ll cover those in later installments. The critical point here is that a sound informatics architecture is key to supporting innovation. People need access to tools and K\/I\/D when they are working, regardless of where they are working from. 
As such, those same people need to be well-versed in the capabilities of the systems available to them, in how to access and use them, and in how to recognize \u201cmissing technologies\u201d: capabilities they need but don\u2019t have access to, or that simply don't exist.\n<\/p><p>Imagine this. A technology expert consults for two large organizations, one tightly controlled (Company A), the other with a liberal view of trusting people to do good work (Company B). In the first case, getting work done can be difficult, with the expert fighting through numerous reviews, sign-offs, and politics. Company A has a stated philosophy that they don\u2019t want to be the first in the market with a new product, but would rather be a strong number two. They justify their position through the cost of developing markets for new products: let someone else do the heavy lifting and follow behind them. This is not a culture that spawns innovation. Company B, however, thrives on innovation. While processes and procedures are certainly in place, the company has a more relaxed philosophy about work assignments. If the expert has a realizable idea, Company B lets them run with it, as long as they complete their assigned workload in a timely fashion. This is what spurs the human side of innovation.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Second_goal:_Develop_high-quality_data_and_information\">Second goal: Develop high-quality data and information<\/span><\/h3>\n<p>Asking staff to \"develop high-quality data and information\" seems like a pretty obvious point, but this is where professional experience and the rest of the world part company. Most of the world treats \u201cdata\u201d and \u201cinformation\u201d as interchangeable words. Not here.\n<\/p><p>There are three key words that are going to be important in this discussion of goals: knowledge, information, and data (K\/I\/D). We\u2019ll start with \u201cknowledge.\u201d The type of knowledge we will be looking at is at the laboratory\/corporate level: the stuff that governs how a laboratory operates, including reports, administrative material, and, most importantly, standard operating procedures (SOPs). SOPs tell us how lab work is carried out via methods, procedures, etc. (This subject parallels the topic of \u201cdata integrity,\u201d which will be covered later.) Figure 5 positions K\/I\/D with respect to each other within laboratory processes.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig5_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"009d875ad276a207e3b7a3932e0d3bf1\"><img alt=\"Fig5 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/2\/27\/Fig5_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 5.<\/b> Positioning of K\/I\/D elements with respect to each other within laboratory processes<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The diagram in Figure 5 is a little complicated, and we\u2019ll get into the details as the material develops. For the moment, we\u2019ll concentrate on the elements in black.\n<\/p><p>As noted above, SOPs guide activities within the lab. 
As work is defined\u2014both research and testing\u2014SOPs have to be developed so that people know how to carry out their tasks consistently. Our first concern, then, is proper management of SOPs. Sounds simple, but in practice it isn\u2019t. It\u2019s a matter of first developing and updating the procedures, documenting them, and then managing both the documents and the lab personnel using them.\n<\/p><p>When developing, updating, and documenting procedures, a lab will primarily be looking at the science it\u2019s working with and how regulatory requirements affect it, particularly in research environments. Once developed, those procedures will eventually need to be updated. But why is an update to a procedure needed? What will the effects of the update be, based on the changes that were made, and how do the results of the new version compare to those of the previous version? That last point is important, and to answer it you need a reference sample that has been run repeatedly under the older version so that you have a solid history of the results (i.e., a control chart) over time. You also need the ability to run that same reference sample under the new procedure to show that there are no differences, or that differences can be accounted for. If differences persist, what do you do about the previous test results under the old procedure?\n<\/p><p>The idea of running one or more stable reference samples periodically is a matter of instituting statistical process control over the analysis process. It can show that a process is under control, detect drift in results, and demonstrate that the lab is doing its job properly. If multiple analysts are doing the same work, it can also reveal how their work compares and if there are any problems. It is in effect looking over their shoulders, but that just comes with the job. If you find that the amount of reference material is running low, then phase in a replacement, running both samples in parallel to get a documented comparison with a clean transition from one reference sample to another. It\u2019s a lot of work and it\u2019s annoying, but you\u2019ll have a solid response when asked \u201care you confident in these results?\u201d You can then say, \u201cYes, and here is the evidence to back it up.\u201d\n<\/p><p>After the SOPs have been documented, they must then be effectively managed and implemented. First, take note of the education and experience required for lab personnel to properly implement any SOP. Periodic evaluation (or even certification) would be useful to ensure things are working as they should. This is particularly true of procedures that aren\u2019t run often, as people may forget things. \n<\/p><p>Another issue of concern with managing SOPs is versioning. Consider two labs. Lab 1 is a well-run lab. When a new procedure is issued, the lab secretary visits each analyst, takes their copy of the old method, destroys it, provides a copy of the new one, requires the analyst to sign for receipt, and later requires a second signature after the method has been reviewed and understood. Additional education is also provided on an as-needed basis. Lab 2 has good intentions, but it's not as proactive as Lab 1. Lab 2 retains all documents on a central server. Analysts are able to copy a method to their machines and use it. However, there is no formalized method of letting people know when a new method is released. At any given time there may be several analysts running the same method using different versions of the related SOP. 
The end result is having a mix of samples run by different people according to different SOPs. \n<\/p><p>This comparison of two labs isn\u2019t electronic versions vs. paper, but rather a formal management structure vs. a loose one. There\u2019s no problem maintaining SOPs in an electronic format, as there are many benefits, but there shouldn\u2019t be any question about the current version, and there should be a clear process for notifying people about updates while also ensuring that analysts are currently educated in the new method's use.\n<\/p><p>Managing this set of problems\u2014analyst education, versions of SOPs, qualification of equipment, current reagents, etc.\u2014was the foundation for one of the earliest ELNs, SmartLab by Velquest, now developed as an LES by <a href=\"https:\/\/www.limswiki.org\/index.php\/Dassault_Syst%C3%A8mes_SA\" title=\"Dassault Syst\u00e8mes SA\" class=\"wiki-link\" data-key=\"1be69bd73e35bc3db0c3229284bf9416\">Dassault Syst\u00e8mes<\/a> as part of the BIOVIA product line. And while Dassault's LES, and much of the BIOVIA product line, narrowly focuses on its intended market, the product remains suitable for any lab where careful control over procedure execution is warranted. This is important to note, as an LES is designed to guide a person through a procedure from start to finish, making it one step away from engaging in a full robotics system (robotics may play a role in stages of the process). The use of an LES doesn\u2019t mean that personnel aren\u2019t trusted or are deemed incompetent; rather, it is a mechanism for developing documented evidence that methods have been executed correctly. That evidence builds confidence in results.\n<\/p><p>LESs are available from several vendors, often as part of their LIMS or ELN offerings. Using any of these systems requires planning and scripting (a gentler way of saying \u201cprogramming\u201d), and the cost of implementation has to be balanced against the need (does the execution of a method require that level of sophistication?) and ROI.\n<\/p><p>Up to this point, we\u2019ve looked at developing and managing SOPs, as well as at least one means of controlling experiment\/procedure execution. However, there are other ways of going about this, including manual and full robotics systems. 
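\n<\/p><p>Returning to the reference-sample program described above: the statistics involved are modest, and a minimal sketch of such a check, whatever system ends up hosting it, might look like the following. This is illustrative only; the historical values, the three-sigma limits, and the check function are assumptions, not a prescription.\n<\/p>\n<pre>\n# A minimal sketch of a reference-sample (control chart) check.\n# All numbers are hypothetical; results are assumed to be single numeric values.\nfrom statistics import mean, stdev\n\nhistory = [99.8, 100.2, 100.1, 99.7, 100.0, 99.9, 100.3, 100.0]  # past reference-sample results\ncenter = mean(history)\nsigma = stdev(history)\nucl = center + 3 * sigma  # upper control limit (Shewhart-style individuals chart)\nlcl = center - 3 * sigma  # lower control limit\n\ndef check(result):\n    # Flag a new reference-sample result that falls outside the control limits.\n    if result &gt; ucl or result &lt; lcl:\n        return 'out of control: investigate before releasing results'\n    return 'in control'\n\nprint(check(100.1))  # in control\nprint(check(101.5))  # out of control: investigate before releasing results\n<\/pre>\n<p>A real program would also watch for drift (runs of points trending toward a limit), but even this much turns \u201care you confident in these results?\u201d into a question with a documented answer.\n<\/p><p>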
Figure 6 takes us farther down the K\/I\/D model to elaborate further on experiment\/procedure execution.<sup id=\"rdp-ebb-cite_ref-14\" class=\"reference\"><a href=\"#cite_note-14\">[g]<\/a><\/sup>\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig6_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"085c4095953e3d63bb21a90911219f3c\"><img alt=\"Fig6 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/13\/Fig6_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 6.<\/b> K\/I\/D flow diagram focusing on experiment\/procedure execution<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>As we move from knowledge development and management (i.e., SOPs), and then on to sample preparation (i.e., pre-experiment), the next step is usually some sort of measurement by an instrument, whether it is a pH meter or a spectrometer, yielding your result. That brings us to two words we noted earlier: \"data\" and \"information.\" We'll note the differences between the two using a <a href=\"https:\/\/www.limswiki.org\/index.php\/Gas_chromatography\" title=\"Gas chromatography\" class=\"wiki-link\" data-key=\"e621fc6f90266fbc8db27d516e9cbb94\">gas chromatography<\/a> system as an example (Figure 7), as it and other base chromatography systems are among the most widely used upper-tier instruments and are commonly found in labs where chemical analysis is performed.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig7_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"1bf85a473790960af479c0df53f9c4b6\"><img alt=\"Fig7 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/6\/67\/Fig7_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 7.<\/b> Gas chromatography \"data\" and \"information\"<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>As we look at Figure 7, we notice that to the right of the vertical blue line is an output signal from a gas chromatograph. This is what chromatographers analyzed and measured when they carried out their work. The addition of a computer made life easier by removing the burden of calculations, but it also added complexity to the work in the form of having to manage the captured electronic data and information. An analog-to-digital (A\/D) converter transforms those smooth curves into a sequence of numbers that is processed to yield parameters describing the peaks, which in turn are used to calculate the amount of substance in the sample. Everything up to that last calculation\u2014left of the vertical blue line\u2014is \u201cdata,\u201d a set of numerical values that, taken individually, have no meaning by themselves. 
It is only when we combine it with other data sets that we can calculate a meaningful result, which gives us \u201cinformation.\u201d\n<\/p><p>The paragraph above describes two different types of data:\n<\/p><p>1. the digitized detector output or \"raw data,\" constituting a series of readings that could be plotted to show the instrument output; and\n<\/p><p>2. the processed digitized data that provides descriptors about the output, with those descriptors depending largely upon the nature of the instrument (in the case of chromatography, the descriptors would be peak height, retention time, uncorrected peak area, peak widths, etc.).\n<\/p><p>Both are useful, and neither of them should be discarded; the fact that you have the descriptors doesn\u2019t mean you don\u2019t need the raw data. The descriptors are processed data that depend on user-provided parameters. Changing a parameter can change the processing and the values assigned to those descriptors. If there are accuracy concerns, you need the raw data as a backup. Since storage is cheap, there really isn\u2019t any reason to discard anything, ever. (And in some regulatory environments, keeping raw data is mandated for a period of time.)\n<\/p><p>If you want to study the data and how it was processed to yield a result, you need more data, specifically the reference samples (standards) used to evaluate each sample. An instrument file by itself is almost useless without the reference material run with that sample. Ideally, you\u2019d want a file that contains all the sample and reference data that was analyzed in one session. That might be a series of manual samples analyzed or an entire auto-sampler tray.\n<\/p><p>Everything we've discussed here positively contributes to developing high-quality data and information. When methods are proven and you have documented evidence that they were executed by properly educated personnel using qualified reagents and instruments, you then have the instrument data to support each sample result and any other information gleaned from that data.\n<\/p><p>You might wonder what laboratorians did before computers. They dealt with stacks of spectra, rolls of chromatograms, and laboratory notebooks, all on paper. If they wanted to find the data (e.g., a pen trace on paper) for a sample, they turned to the lab's physical filing system to locate it.<sup id=\"rdp-ebb-cite_ref-15\" class=\"reference\"><a href=\"#cite_note-15\">[h]<\/a><\/sup> Why does this matter? That has to do with our third goal.\n<\/p>\n<h3><span id=\"rdp-ebb-Third_goal:_Manage_K\/I\/D_effectively,_putting_them_in_a_structure_that_encourages_use_and_protects_value\"><\/span><span class=\"mw-headline\" id=\"Third_goal:_Manage_K.2FI.2FD_effectively.2C_putting_them_in_a_structure_that_encourages_use_and_protects_value\">Third goal: Manage K\/I\/D effectively, putting them in a structure that encourages use and protects value<\/span><\/h3>\n<p>In the previous section we introduced three key elements of laboratory work: knowledge, information, and data (K\/I\/D). Each of these is a \u201cdatabase\u201d structure (\u201cdata\u201d in the general sense). We also looked at SOP management as an example of knowledge management, and distinguished \u201cdata\u201d and \u201cinformation\u201d management as separate but related concerns. 
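\n<\/p><p>To make that data\/information distinction concrete, the sketch below follows a single chromatographic peak from digitized detector output (\u201craw data\u201d) to descriptors (\u201cprocessed data\u201d) and, only with a reference standard in hand, to a concentration (\u201cinformation\u201d). Every number in it is hypothetical, and the simple trapezoidal integration and one-point calibration are assumptions made for illustration.\n<\/p>\n<pre>\n# From raw data to information for one chromatographic peak (illustrative only).\nraw = [0.0, 0.1, 0.5, 2.0, 4.8, 6.0, 4.9, 2.1, 0.6, 0.1, 0.0]  # digitized detector output\ndt = 0.5  # seconds between A\/D readings\n\n# Descriptors (processed data): peak height, retention time, uncorrected area\nheight = max(raw)\nretention_time = raw.index(height) * dt\narea = sum((raw[i] + raw[i + 1]) * dt \/ 2 for i in range(len(raw) - 1))  # trapezoidal integration\n\n# Only by combining the area with a standard run in the same session\n# does the data become information: a concentration.\nstd_area, std_conc = 10.5, 50.0  # hypothetical standard: its area and known concentration (ppm)\nconcentration = area \/ std_area * std_conc\n\nprint(f'height={height}, rt={retention_time}s, area={area:.2f}, conc={concentration:.1f} ppm')\n<\/pre>\n<p>Discard the raw list and the descriptors can never be recomputed under different parameters; discard the standard and the area never becomes a concentration. That is the practical force of keeping both.\n<\/p><p>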
We also introduced flow diagrams (Figures 5 and 6) that show the relationship and development of each of those elements.\n<\/p><p>In order for those elements to justify the cost of their development, they have to be placed in systems that encourage utilization and thus retain their value. Modern informatics tools assist in many ways:\n<\/p>\n<ul><li><a href=\"https:\/\/www.limswiki.org\/index.php\/Document_management_system\" title=\"Document management system\" class=\"wiki-link\" data-key=\"b9d429b16b89b35fe74097a3e0c2d1ee\">Document management systems<\/a> support knowledge databases (and some LIMS and ELNs inherently support document management).<\/li>\n<li>LIMS and ELNs provide a solid base for laboratory information, and they may also support other administrative and operational functions.<\/li>\n<li>Instrument data systems and SDMS collect instrument output in the form of reports, data, and information.<\/li><\/ul>\n<p>You may notice there is significant functional redundancy as vendors try to create the \u201cultimate laboratory system.\u201d Part of lab management\u2019s responsibility is to define what the functional architecture should look like based on their current and perceived needs, rather than having it defined for them. It\u2019s a matter of knowing what is required and seeing what fits rather than fitting requirements into someone else\u2019s idea of what's needed.\n<\/p><p>Managing large database systems is only one aspect of handling K\/I\/D. Another aspect involves the consideration of <a href=\"https:\/\/www.limswiki.org\/index.php\/Cloud_computing\" title=\"Cloud computing\" class=\"wiki-link\" data-key=\"fcfe5882eaa018d920cedb88398b604f\">cloud<\/a> vs. local storage systems. What option works best for your situation, is the easiest to manage, and is supported by IT? We also have to address the data held in various desktop and mobile computing devices, as well as benchtop systems like instrument data systems. There are a number of considerations here, not the least of which is product turnover (e.g., new systems, retired systems, upgrades\/updates, etc.). (Some of these points will be covered later on in other sections.)\n<\/p><p>What you should think about now is the number of computer systems and software packages that you use on a daily basis, some of which are connected to instruments. How many different vendors are involved? How big are the vendors (e.g., small companies with limited staff vs. large organizations)? How often do they upgrade their systems? What\u2019s the likelihood they\u2019ll be around in two or five years?\n<\/p><p>Also ask what data file formats the vendor uses; these formats vary widely among vendors. Some put everything in CSV files, others in proprietary formats. In the latter case, you may not be able to use the data files without the vendor's software. In order to maintain the ability to work with instrument data, you will have to manage the software needed to open the files and work with them, in addition to just making sure you have copies of the data files. In short, if you have an instrument-computer combination that does some really nice stuff and you want to preserve the ability to gain value from that instrument's data files, you have to make a backup copy of the software environment and the data files. This is particularly important if you're considering retiring a system that you'll still want to access data from, plus you may have to maintain any underlying software license. 
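\n<\/p><p>The data file half of that backup problem is the easier one, and a small sketch can show its shape. The script below walks a retiring instrument computer's data directory and records a SHA-256 checksum for every file, so that years later you can verify the archived copies haven't been altered or corrupted. The directory path and file names are hypothetical; adapt them to the real system.\n<\/p>\n<pre>\n# Snapshot a retiring instrument PC's data directory with SHA-256 checksums\n# (illustrative sketch; 'C:\/GCData' is a hypothetical path).\nimport hashlib\nfrom pathlib import Path\n\nsource = Path('C:\/GCData')\nwith open('manifest.txt', 'w') as out:\n    for f in sorted(source.rglob('*')):\n        if f.is_file():\n            digest = hashlib.sha256(f.read_bytes()).hexdigest()\n            print(digest, f.relative_to(source), file=out)  # one line per archived file\n<\/pre>\n<p>Fingerprinting the files, however, covers only half the need; preserving the software environment that can open them is the harder half. 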
This is where the previous conversation about virtualization and containers comes in.\n<\/p><p>If you think about a computer system, it has two parts: hardware (e.g., circuit boards, hard drive, memory, etc.) and software (e.g., the OS, applications, data files, etc.). From the standpoint of the computer\u2019s processor, everything is either data or instructions read from one big file on the hard drive, which the operating system has segmented for housing different types of files (that segmentation is done for your convenience; the processor just sees it all as a source of instructions and data). Virtualization takes everything on the hard drive, turns it into a single complete file, and places that file onto a virtualization server, where it is stored as a file called a \u201ccontainer.\u201d That server allows you to log in, open a container, and run it as though it were still on the original computer. You may not be able to connect the original instruments to the containerized environment, but all the data processing functions will still be there. As such, a collection of physical computers can become a collection of containers. An added benefit of virtualization applies when you're worried about an upgrade creating havoc with your application; in that case, make a container as a backup first.<sup id=\"rdp-ebb-cite_ref-16\" class=\"reference\"><a href=\"#cite_note-16\">[i]<\/a><\/sup>\n<\/p><p>The advantage of all this is that you continue to have the ability to gain value and access to all of your data and information even if the original computer has gone to the recycle bin. This of course assumes your IT group supports virtualization servers, which provide an advantage in that they are easier to maintain and don\u2019t take up much space. In larger organizations this may already be happening, and in smaller organizations a conversation may be had to determine IT's stance. The potential snag in all this is whether or not the software application's vendor license will cover the operation of their software on a virtual server. That is something you may want to negotiate as part of the purchase agreement when you buy the system.\n<\/p><p>This section has shown that effective management of K\/I\/D is more than just the typical consideration of database issues, system upgrades, and backups. You also have to maintain and support the entire operating system, the application, and the data file ecosystem so that you have both the files needed and the ability to work with them.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Fourth_goal:_Ensure_a_high_level_of_data_integrity_at_every_step\">Fourth goal: Ensure a high level of data integrity at every step<\/span><\/h3>\n<p>\u201cData integrity\u201d is an interesting couple of words. It shows up in marketing literature to get your attention, often because it's a significant regulatory concern. There are different aspects to the topic, and the attention given often depends on a vendor's product or the perspective of a particular author. In reality, it touches on all areas of laboratory work. 
The following is an introduction to the goal, with more detail given in later sections.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Definitions_of_data_integrity\">Definitions of data integrity<\/span><\/h4>\n<p>There are multiple definitions of \"data integrity.\" A broad encyclopedic definition can be found at Wikipedia, described as \"the maintenance of, and the assurance of, data accuracy and consistency over its entire life-cycle\" and \"a critical aspect to the design, implementation, and usage of any system that stores, processes, or retrieves data.\"<sup id=\"rdp-ebb-cite_ref-WPDataInt_17-0\" class=\"reference\"><a href=\"#cite_note-WPDataInt-17\">[8]<\/a><\/sup>\n<\/p><p>Another definition to consider is from a more regulatory perspective, that of the FDA. In their view, data integrity focuses on the completeness, consistency, accuracy, and validity of data, particularly through a mechanism called the ALCOA+ principles. This means the data should be<sup id=\"rdp-ebb-cite_ref-HarmonWhatIs20_18-0\" class=\"reference\"><a href=\"#cite_note-HarmonWhatIs20-18\">[9]<\/a><\/sup>:\n<\/p>\n<ul><li><b>Attributable<\/b>: You can link the creation or alteration of data to the person responsible.<\/li>\n<li><b>Legible<\/b>: The data can be read both visually and electronically.<\/li>\n<li><b>Contemporaneous<\/b>: The data was created at the same time that the activity it relates to was conducted.<\/li>\n<li><b>Original<\/b>: The source or primary documents relating to the activity the data records, or certified versions of those documents, are available (e.g., a notebook or raw database). (This is one reason why you should collect and maintain as much data and information from an instrument as possible for each sample.)<\/li>\n<li><b>Accurate<\/b>: The data is free of errors, and any amendments or edits are documented.<\/li><\/ul>\n<p>Plus, the data should be:\n<\/p>\n<ul><li><b>Complete<\/b>: The data must include all related analyses, repeated results, and associated metadata.<\/li>\n<li><b>Consistent<\/b>: The complete data record should maintain the full sequence of events, with date and time stamps, such that the steps can be repeated.<\/li>\n<li><b>Enduring<\/b>: The data should be able to be retrieved throughout its intended or mandated lifetime.<\/li>\n<li><b>Available<\/b>: The data is able to be accessed readily by authorized individuals when and where they need it.<\/li><\/ul>\n<p>Both definitions revolve around the same point: the data a lab produces has to be reliable. Still, the term \"data integrity\" and its associated definitions are a bit misleading. If you read the paragraphs above, you get the impression that the focus is on the results of laboratory work, when in fact it is about every aspect of laboratory work, including the methods used and those who conduct those methods.\n<\/p><p>In order to gain meaningful value from laboratory K\/I\/D, you have to be assured of its integrity; \u201cthe only thing worse than no data is data you can\u2019t trust.\u201d That is the crux of the matter. How do you build that trust? Building a sense of confidence in a lab's data integrity efforts requires addressing three areas of concern and their paired intersections: science, people, and informatics technology. Once we have successfully managed those areas and intersection points, we are left with the intersection common to all of them: constructed confidence in a laboratory's data integrity efforts (Figure 8). 
\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig8_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"496854229785eef610b2b5f0620f7df4\"><img alt=\"Fig8 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/2\/20\/Fig8_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 8.<\/b> The three areas contributing to data integrity and how they intersect<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<div class=\"thumb tright\"><div class=\"thumbinner\" style=\"width:152px;\"><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig9_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"fc1aa2b241f71809573f67794ab6e19f\"><img alt=\"\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/3a\/Fig9_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" width=\"150\" height=\"146\" class=\"thumbimage\" \/><\/a> <div class=\"thumbcaption\"><div class=\"magnify\"><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig9_Liscouski_LabTechPlanMan20.png\" class=\"internal wiki-link\" title=\"Enlarge\" data-key=\"fc1aa2b241f71809573f67794ab6e19f\"><\/a><\/div><b>Figure 9.<\/b> The \"Science\" component<\/div><\/div><\/div>\n<p><b>The science<\/b>\n<\/p><p>We\u2019ll begin with a look at the scientific component of the conversation (Figure 9). Regardless of the kinds of questions being addressed, the process of answering them is rooted in methods and procedures. Within the context of this guide, those methods have to be validated or else your first step in building confidence has failed. If those methods end with electronic measurements, then that equipment (including settings, algorithms, analysis, and reporting) has to be fully understood and qualified for use in the validated process. The manufacturer's default settings should either be demonstrated as suitable or avoided.\n<\/p><p><br \/>\n<\/p><p><b>The people<\/b>\n<\/p>\n<div class=\"thumb tleft\"><div class=\"thumbinner\" style=\"width:152px;\"><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig10_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"71773f0cca9237f7694aadea8d9fe421\"><img alt=\"\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/6\/62\/Fig10_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" width=\"150\" height=\"142\" class=\"thumbimage\" \/><\/a> <div class=\"thumbcaption\"><div class=\"magnify\"><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig10_Liscouski_LabTechPlanMan20.png\" class=\"internal wiki-link\" title=\"Enlarge\" data-key=\"71773f0cca9237f7694aadea8d9fe421\"><\/a><\/div><b>Figure 10.<\/b> The \"People\" component<\/div><\/div><\/div><p>People (Figure 10) need to be thoroughly educated and competent to meet the needs of the laboratory's operational procedures and scientific work. That education needs to extend beyond the traditional undergraduate program and include the specifics of the instrumental techniques used. 
A typical four-year program doesn\u2019t have the time to cover the basic science and the practical aspects of how science is conducted in modern labs, and few schools can afford the equipment needed to meet that challenge. This broader educational emphasis is part of the intersection of people and science.\n<\/p><p>Another aspect of \u201cpeople\u201d is the development of a culture that contributes to data integrity. Lab personnel need to be educated on the organization\u2019s expectations of how lab work needs to be managed and maintained. This includes items such as <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_retention\" title=\"Data retention\" class=\"wiki-link\" data-key=\"d77533b92d003d39cee958a82b62391a\">records retention<\/a>, dealing with erroneous results, and what constitutes original data. They should also be fully aware of corporate and regulatory guidelines and the effort needed to enforce them.<sup id=\"rdp-ebb-cite_ref-19\" class=\"reference\"><a href=\"#cite_note-19\">[j]<\/a><\/sup> This is another instance where education beyond that provided in the undergraduate curriculum is needed.\n<\/p><p><br \/>\n<\/p><p><b>Informatics technology<\/b>\n<\/p>\n<div class=\"thumb tright\"><div class=\"thumbinner\" style=\"width:152px;\"><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig11_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"7a917f9531b90ef5bb4960b6b358b481\"><img alt=\"\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/7\/7a\/Fig11_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" width=\"150\" height=\"146\" class=\"thumbimage\" \/><\/a> <div class=\"thumbcaption\"><div class=\"magnify\"><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig11_Liscouski_LabTechPlanMan20.png\" class=\"internal wiki-link\" title=\"Enlarge\" data-key=\"7a917f9531b90ef5bb4960b6b358b481\"><\/a><\/div><b>Figure 11.<\/b> The \"Informatics Technology\" component<\/div><\/div><\/div><p>Laboratory informatics technology (Figure 11) is another area where data integrity can either be enhanced or lost. The lab's digital architecture needs to be designed to support relatively easy access (within the scope of necessary security considerations) to the lab's data, from the raw digitized detector output, through intermediate processed stages, and on to the final processed information. Unnecessary duplication of K\/I\/D must be avoided. You also need to ensure that the products chosen for lab work are suitable for the work and have the ability to be integrated electronically. After all, the goal is to avoid situations where the output of one system is printed and then manually entered into another.\n<\/p><p>The implementation and use of informatics technology should be the result of careful product selection and intentional design\u2014from the lab bench to central database systems such as LIMS, ELN, SDMS, etc.\u2014rather than the haphazard approach of an aggregate of lab computers.\n<\/p><p>Other areas of concern with informatics technology include backups, security, and product life cycles, which will be addressed in later sections. If, as we continue through these goals, it appears that everything touches on data integrity, it's because it does. Data integrity can be considered the end result of the sum of well-executed laboratory operations.\n<\/p><p><br \/>\n<\/p><p><b>The intersection points<\/b>\n<\/p><p>Two of the three intersection points deserve minor elaboration (Figure 12). 
First, the intersection of people and informatics technologies has several aspects to address. The first is laboratory personnel\u2019s responsibility\u2014which may be shared with corporate IT or lab IT\u2014for the selection and management of informatics products. The second is that this requires those personnel to be knowledgeable concerning the application of informatics technologies in laboratory environments. Ensure the selected personnel have the appropriate backgrounds and knowledge to consider, select, and effectively use those products and technologies.\n<\/p><p>The other intersection point to be addressed is that of science with informatics technology. Here, stakeholders are concerned with product selection, system design (for automated processes), and system integration and communication with other systems and instruments. Again, as noted above, we go into more detail in later sections. The primary point here, however, can be summed up as determining whether or not the products selected for your scientific endeavors are compatible with your data integrity goals.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig12_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"9f06bfd4131804e8eb9cec219f8b813c\"><img alt=\"Fig12 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/8\/80\/Fig12_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 12.<\/b> The intersection of People and Informatics Technology (left) and Science and Informatics Technology (right)<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Addressing the needs of these two intersection points requires deliberate effort and many planning questions regarding vendor support, quality of design, system interoperability, result output, and scientific support mechanisms. Questions to ask include:\n<\/p>\n<ul><li><b>Vendor support<\/b>: How responsive are vendors to product issues? Do you get a fast, usable response, or are you left hanging? A product that is having problems can affect data quality and reliability.<\/li>\n<li><b>Quality of design<\/b>: How easy is the system to use? Are controls, settings, and working parameters clearly defined and easily understood? Do you know what effect changes in those points will have on your results? Has the system been tuned to your needs (not adjusted to give you the answers you want, but set to give results that truly represent the analysis)? Problems with adjusting settings properly can distort results. (This is one area where data integrity may be maintained throughout a process, and then lost because of improper or untested controls on an instrument's operation.)<\/li>\n<li><b>System interoperability<\/b>: Will there be any difficulty in integrating a software product or instrument into a workflow? Problems with sample container compatibility, operation, control software, etc. can cause errors to develop in the execution of a process flow. 
For example, problems with pipette tips can cause errors in fluid delivery.<\/li>\n<li><b>Result output<\/b>: Is an electronic transfer of data possible, or does the system produce printed output (which means someone typing results into another system)? How effective is the communications protocol? Is it based on a standard, or does it require custom coding, which could be error-prone or subject to interference? Is the format of the data file one that prevents changes to the original data? For example, CSV files allow easy editing and have the potential for corruption, nullifying data integrity efforts.<\/li>\n<li><b>Scientific support mechanisms<\/b>: Does the product fully meet the intended need for functionality, reliability, and accuracy?<\/li><\/ul>\n<p>The underlying goal in this section goes well beyond the material that is covered in schools. Technology development in instrumentation and the application of computing and informatics are progressing rapidly, and you can\u2019t assume that everything is working as advertised, particularly for your application. Software has bugs and hardware has limitations. Applying healthy skepticism towards products and requiring proof that things work as needed protect the quality of your work. \n<\/p><p>If you\u2019re a scientist reading this material, you might wonder why you should care. The answer is simply this: it is the modern evolution of how laboratory work gets done and how results are put to use. If you don\u2019t pay attention to the points noted, data integrity may be compromised. You may also find yourself the unhappy recipient of a regulatory warning letter.\n<\/p><p>While some outcomes may occur that you\u2019d prefer didn\u2019t, there are also positive outcomes to come from your data integrity efforts: your work will be easier and protected from loss, results will be easier to organize and analyze, and you\u2019ll have a better functioning lab. You\u2019ll also have fewer unpleasant surprises when technology changes occur and you need to transition from one way of doing things to another. Yet there's more to protecting the integrity of your K\/I\/D than addressing the science, people, and information technology of your lab. The security of your lab and its information systems must also be addressed.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Fifth_goal:_Addressing_security_throughout_the_lab\">Fifth goal: Addressing security throughout the lab<\/span><\/h3>\n<p>Security is about protection, and there are two considerations in this matter: what are we protecting, and how do we enact that protection? The first is easily answered by stating that we're protecting our ability to effectively work, as well as the results of that work. This is largely tied to the laboratory's data integrity efforts. The second consideration, however, requires a few more words.\n<\/p><p>Broadly speaking, security is not a popular subject in science, as it is viewed as not advancing scientific work or the development of K\/I\/D. Security is often viewed as inhibiting work by imposing a behavioral structure on people's freedom to do their work as they wish. Given these perceptions, it should be a lab's goal to create a functional security system that provides the protection needed while at the same time minimizing the intrusion into people\u2019s ability to work.\n<\/p><p>This section will look at a series of topics that address the physical and electronic security of laboratory work. Those major topics are shown in Figure 13 below. 
The depth of the commentary will vary: some topics are discussed at length, while others are addressed by brief reference to others' work.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig13_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"02ea08d21ba9577fdbb05acb12e0a76e\"><img alt=\"Fig13 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/4\/42\/Fig13_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 13.<\/b> The key issues of laboratory security<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Why must security be addressed in the laboratory? There are many reasons, best shown in diagram form, as seen in Figure 14:\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig14_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"3e83b80cf5ca03e07b0ba68cacdd1e84\"><img alt=\"Fig14 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/4\/41\/Fig14_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 14.<\/b> The primary reasons why security issues need to be addressed in the lab<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>All of these reasons have one thing in common: they affect our ability to work and access the results of that work. This requires a security plan. In the end, security efforts either preserve those abilities or reduce the value and utility of the work and results, particularly if security is implemented poorly or burdens personnel's ability to work. While addressing these reasons and their corresponding protections, we should keep in mind a number of issues when developing and implementing a security plan within the lab (Figure 15).
Issues like remote access have taken on particular significance over the course of the COVID-19 pandemic.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig15_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"3aba7203ca3ad469373d042c947630a7\"><img alt=\"Fig15 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/12\/Fig15_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 15.<\/b> Issues to keep in mind when planning security programs<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>When the subject of security comes up, people's minds usually go in one of two directions: physical security (i.e., controlled access) and electronic security (i.e., malware, viruses, ransomware, etc.). We\u2019re going to come at it from a different angle: how do the people in your lab want to work? Instead of looking at a collection of solutions to security issues, we\u2019re going to first consider how lab personnel want to be working and within what constraints, and then we'll see what tools can be used to make that possible. Coming at security from that perspective will impact the tools you use and their selection, including everything from instrument data systems to database products, analytical tools, and cloud computing. The lab bench is where work is executed, while the planning and thinking take place between our ears, something that can happen anywhere. How do we provide people with the freedom to be creative and work effectively (something that may be different for each of us) while maintaining a needed level of physical and intellectual property security? Too often security procedures seem to be designed to frustrate work, as noted in the previous Figure 15.\n<\/p><p>The purpose of security procedures is to protect intellectual property, data integrity, resources, our ability to work, and lab personnel, all of which can be impacted by the reasons given in the prior Figure 14. However, planning these security procedures requires coordination with, and cooperation from, several stakeholders within and tangentially related to the laboratory. Ensure these and any other necessary stakeholders are involved with the security planning efforts of your laboratory:\n<\/p>\n<ul><li>Facilities management: These stakeholders manage the physical infrastructure you are working in and have overall responsibility for access control and managing the human security assets in larger companies.
In smaller companies and startups, the first line of security may be the receptionist; how well trained are they to deal with the subject?<\/li>\n<li>IT groups: These stakeholders will be responsible for designing and maintaining (along with facilities management) the electronic security systems, which range from passkeys to networks.<\/li>\n<li>Legal: These stakeholders may work with human resources to set personnel standards for security, reviewing licensing agreements and contracts\/leases for outside contractors and buildings (more on this later).<\/li>\n<li>Lab personnel: From the standpoint of this guide, this is all about the people doing the analytical and research work within the laboratory.<\/li>\n<li>Consultants: Security is a complex and rapidly developing subject, and you will likely need outside support to advise you on what is necessary and possible, as well as how to go about making that a reality.<\/li><\/ul>\n<p>But what else must you and your stakeholders consider during these planning efforts? Before we can get into the specific technologies and practices that may be implemented within a facility, we need to look at the facility itself.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Examine_aspects_of_the_facility_itself\">Examine aspects of the facility itself<\/span><\/h4>\n<p>Does your company own the building you are working in? Is it leased? Is it shared with other companies in a single industrial complex? If you own the facility, life is simpler since you control everything. Working in a shared space that is leased or rented requires more planning and thought, preferably before you sign an agreement. You're likely to have additional aspects of your facility to seriously consider. Have the locks and door codes been changed since the last tenant left? Is power shared across other businesses in your building? Is the backup generator\u2014if there is one\u2014sufficient to run your systems? What fire protections are in place? How is networking managed in the facility? Are security personnel attuned to the needs of your company? Let's take a look at some of these and other questions that should be addressed.\n<\/p><p><b>Is the physical space well-defined, and does building maintenance have open access to your various spaces? <\/b>\n<\/p><p>Building codes vary from place to place. Some are very specific and strict, while others are almost live-and-let-live. One thing you want to be able to do is to define and control your organization's physical space and set up any necessary and additional protective boundaries. Physical firewalls are one way of doing that. A firewall should be a solid structure that acts as a barrier to fire propagation between your space and neighboring spaces, extending from below-ground areas to the roof. If it is a multi-level structure, the levels should be isolated. This may seem obvious, but in some single-level shared buildings (e.g., strip malls) the walls may not go to the roof, to make it easier to route utilities like HVAC, power, and fire suppression. This can act as an access point to your space.\n<\/p><p>Building maintenance is another issue. Do they have access to your space? Does that access come with formal consent or is that consent assumed as part of the lease or rental agreement? Several problems must be considered. First, know that anyone who has access to your physical space should be considered a weak point in your security.
Employees should inherently have a vested interest in protecting your assets, but building maintenance is a different matter. Who vets them? Since these notes are focused on laboratory systems, who trains them about what to touch and what not to? (For example, an experiment could be ruined because maintenance personnel opened a fume hood, disturbing the airflow, despite any signage placed on the hood glass.) Your space analysis should consider more than just office systems; include other equipment that may be running after-hours and doesn\u2019t handle tampering, curiosity, or power outages well. Do you have robotics running multiple shifts or other moving equipment that might attract someone\u2019s curiosity? Security cameras would be useful, as would \u201cDo Not Enter\u201d signs.\n<\/p><p>Second, most maintenance staff will notify you (hopefully in writing) about their activities so you can plan accordingly, but what about emergency issues? If they have to fix a leak or a power problem, what are the procedures for safely shutting down systems? Do they have a contact person on your staff in case a problem occurs? Is there hazardous material on-site that requires special handling? Are the maintenance people aware of it and how to handle it? Answers to these questions should be formalized in policy and disseminated to both maintenance and security management personnel, and be made available to new personnel who may not be up to speed.\n<\/p><p><b>Is power shared across other businesses in your building?<\/b>\n<\/p><p>Shared power is another significant issue in any building environment. Unless someone paid careful attention to a lab's needs during construction, shared power can affect any facility. A number of issues can arise from misconfigured or unsupported power systems. Real-life examples of issues a computer support specialist friend of mine has encountered in the past include computers that:\n<\/p>\n<ul><li>were connected to the same circuit box as heavy duty garage doors. Deliveries would come in early in the morning, and when the doors opened the computers crashed.<\/li>\n<li>were on the same circuit as air conditioners. The computers didn\u2019t crash, but the electrical noise and surging power use created havoc with systems operations and disk drives.<\/li>\n<li>were connected to circuits that didn\u2019t have proper grounding, or that had separate grounding systems in the same room. Some didn\u2019t have external grounding at all. We worked on a problem with one computer-instrument system that had each device plugged into different power outlets. The computer\u2019s outlet was grounded, but the instrument's power supply wasn\u2019t; once that was fixed, everything worked.<\/li>\n<li>were too close to a radio tower. Every night when the radio system changed its antenna configuration, the computer experienced problems. Today, many devices generate radio signals that might interfere with each other. The fact that they are \u201cdigital\u201d systems doesn\u2019t matter; they are made of analog components.<\/li><\/ul>\n<p><b>Is the power clean, and is the backup generator\u2014if there is one\u2014sufficient to run your systems?<\/b>\n<\/p><p>Another problem is power continuity and quality. Laboratories depend on clean, reliable power. What will the impact of power outages\u2014lasting anywhere from seconds to days\u2014be on your ability to function? The longer end of the scale is easy; you stop working or relocate critical operations. Generators are one solution option, and we\u2019ll come back to those.
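<\/p><p>Before getting to generators, though, consider a more basic question: how would you even know that a two-second outage occurred at 3:00 a.m.? Something has to be watching. Below is a minimal sketch of sensor-based power logging, assuming a networked sensor that reports its status over a simple HTTP endpoint; the URL, the \u201cmains_ok\u201d field, and the log path are hypothetical placeholders for whatever your hardware actually exposes.\n<\/p>\n<pre>\n# Minimal power-event logger: poll a networked power sensor and record\n# outages with timestamps. The endpoint URL and the 'mains_ok' field are\n# hypothetical; substitute what your sensor or UPS actually provides.\nimport json\nimport time\nimport urllib.request\n\nSENSOR_URL = 'http:\/\/192.168.1.50\/status'   # hypothetical sensor endpoint\nPOLL_SECONDS = 5\nLOG_FILE = 'power_events.log'\n\ndef mains_ok():\n    # True if the sensor reports mains power present\n    with urllib.request.urlopen(SENSOR_URL, timeout=2) as resp:\n        return bool(json.load(resp).get('mains_ok'))\n\nlast_state = True\nwhile True:\n    try:\n        state = mains_ok()\n    except OSError:\n        state = False   # an unreachable sensor is itself worth recording\n    if state != last_state:\n        stamp = time.strftime('%Y-%m-%d %H:%M:%S')\n        event = 'power restored' if state else 'power lost'\n        with open(LOG_FILE, 'a') as log:\n            print(stamp, event, file=log)   # timestamped transition record\n        last_state = state\n    time.sleep(POLL_SECONDS)\n<\/pre>\n<p>A logger like this only records transitions; pairing it with e-mail or text alerting turns it into the kind of alarm-equipped monitoring described next.\n<\/p><p>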
The shorter outages, particularly those of the power up-down-up variety, are a separate issue. Networkable sensors with alarms and alerts, like the monitor sketched above, may be required to track power, refrigeration, and similar services remotely. Considerations for these intermittent outages include:\n<\/p>\n<ul><li>Do you know when they happened? What was their duration? How can you tell? (Again, consider sensor-based monitoring.)<\/li>\n<li>What effect did intermittent outages have on experiments that were running? Did the systems and instruments reset? Was data lost? Were in-process samples compromised?<\/li>\n<li>What effect did they have on stored samples? If samples had to be maintained under controlled climate conditions, were they compromised?<\/li>\n<li>Did power loss and power cycling cause any problems with instrumentation? How do you check?<\/li>\n<li>Did systems fail into a safe mode?<\/li><\/ul>\n<p>How real are power problems? As Ula Chrobak notes in an August 2020 Popular Science article, infrastructure failures, storms, climate change, etc. are not out of the realm of possibility; if you were in California during that time, you saw the reality first-hand.<sup id=\"rdp-ebb-cite_ref-ChrobakTheUS20_20-0\" class=\"reference\"><a href=\"#cite_note-ChrobakTheUS20-20\">[10]<\/a><\/sup>\n<\/p><p><b>If laboratory operations depend on reliable power, what steps can we take to ensure that reliability?<\/b>\n<\/p><p>First, site selection naturally tops the list. You want to be somewhere that has a reputation for reliable power and rapid repairs if service is lost. A site with buried wiring would be optimal, but buried wiring within the industrial park is of little benefit if the park itself is fed by overhead lines. Another consideration is the age of the site: an older established site may have outdated cables that are more likely to fail. The geography is also important. Nearby rivers, lakes, or an ocean can produce floods, causing water intrusion into wiring. Also, don\u2019t overlook the potential issues associated with earthquakes or nearby industries with hazardous facilities such as chemical plants or refineries. Areas prone to severe weather conditions are an additional consideration.\n<\/p><p>Second, address the overall quality of the building and its infrastructure. This affects buildings you own as well as lease; however, the difference is in your ability to make changes. How old is the wiring? Has it been inspected? Are the grounding systems well implemented? Do you have your own electrical meters, and is your power supply isolated from other units if you are leasing? Will your computers and instruments be on circuits isolated from heavy equipment and garage doors? Make an estimate of your power requirements, then at least double it. Given that, is there sufficient amperage coming into the site to manage all your instruments, computers, HVAC systems, and freezers? How long will you be occupying that space, and is there sufficient power capacity to support potential future expansion?\n<\/p><p>Third, consider how to implement generators and battery backup power. These are obvious solutions to power loss, yet they come with their own considerations:\n<\/p>\n<ul><li>Who has control over generator implementation? If you own the building, you do. If the building is leased, the owner does, and they may not even provide generator back-up power.
If not, your best bet\u2014unless you are planning on staying there for a long time\u2014is to go somewhere else; the cost of installing, permitting, and maintaining a generator on a leased site may be prohibitive. A good whole-house system can run up to $10,000, plus the cost of a fueling system.<\/li>\n<li>How much power will you need and for how long, and is sufficient fuel available? Large propane tanks may need to be buried. Diesel is another option, though fire codes may limit fuel choices in multi-use facilities. The expected duration of an outage is also important. We often assume a few hours, but snow, ice, hurricanes, tornadoes, and earthquakes may push that out to a week or more.<\/li>\n<li>Is the generator\u2019s output suitable for the computers and instruments in your facility? A major problem to acknowledge is electrical noise: too much and you\u2019ll create more problems than you would have if the equipment had just been shut down.<\/li>\n<li>What is the startup delay of the generator? A generator can take anywhere from a few seconds to several minutes to get up to speed and produce power. Can you afford that delay? Probably not.<\/li><\/ul>\n<p>The answer to the problems noted in the last two bullets is battery backup power. These can range from individual units that are used one-per-device, like home battery backups for computers and other equipment, to battery-walls that are being offered for larger applications. The advantage is that they can come online anywhere from instantly (i.e., always-on, online systems) to a few milliseconds for standby systems. The always-on, online options contain batteries that are constantly being charged and at the same time constantly providing power to whatever they are connected to. More expensive than standby systems, they provide clean power even from a source that might otherwise be problematic. On the other hand, standby systems are constantly charging but pass through power without conditioning; noisy power in, noisy power out until a power failure occurs.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Security_and_the_working_environment\">Security and the working environment<\/span><\/h4>\n<p>When we are looking at security as a topic, we have to keep in mind that we are affecting people's ability to work. Some of the laboratory's work is done at the lab bench or on instruments (which, depending on the field you\u2019re in, could range from pH meters to telescopes). However, significant work occurs away from the bench; thinking and planning happen wherever a thought strikes. What kind of flexibility do you want people to have? Security will often be perceived as stuff that gets in the way of personnel's ability to work, despite the fact that well-implemented security protects their work.\n<\/p><p>We need to view security as a support structure enabling flexibility in how people work, not simply as a series of barriers that frustrate people. You can begin by defining the work structure as you\u2019d like it to be, at the same time recognizing that there are two sides to lab work: the intellectual (planning, thinking, evaluating, etc.) and the performed (where you have to work with actual samples, equipment, and instruments). One can be done anywhere, while the other is performed in a specific space. The availability of computers and networks can blur the distinction.
\n<\/p><p>Keeping these things in mind, any security planning should consider the following:\n<\/p>\n<ul><li>How much flexibility do personnel want in their work environment vs. what they can actually have? In some areas of work, there may be significant need for lockdown with little room for flexibility, while other areas may be pretty loose.<\/li>\n<li>Do people want to work from remote places? Does the nature of the work permit it? This can be motivated by anything from \u201cthe babysitter is sick\u201d to \u201cI just need to get away and think.\u201d<\/li>\n<li>While working remotely, do people need access to lab computers for data, files (i.e., upload\/download), or to interact with experiments? Some of this can be a matter of respecting people\u2019s time. If you have an experiment running overnight or during the weekend, it would be nice to check the status remotely instead of driving back to work.<\/li>\n<li>Do people need after-hours access to the lab facilities?<\/li><\/ul>\n<p>The answers to these planning questions lay the groundwork for hardware, software, and security system requirements. Can you support the needs of personnel, and if so, how is security implemented to make it work? Will you be using gateway systems to the lab network, with additional logins for each system, two-factor authentication, or other mechanisms? The goal is to allow people to be as productive as possible while protecting the organization's resources and meeting regulatory requirements. That said, keep in mind that unless physical and virtual access points are well controlled, others may compromise the integrity of your facility and its holdings.\n<\/p><p>Employees need to be well educated in security requirements in general and in how they are implemented in your facility. They need to be willing participants in the processes, not grudging ones; unwillingness to work within the system is a security weak point, as reluctant people will try to circumvent controls. One obvious problem is with username-password combinations for computer access; rather than typing that information in, biometric features are faster and less error prone.\n<\/p><p>That said, personnel should readily accept that no system should be open to unauthorized access, and that hierarchical levels of control may be needed, depending on the type of system; some people will have access to some capabilities and not others. This type of \"role-based\" access shouldn\u2019t be viewed as a matter of trust, but rather as a matter of protection. Unless the company is tiny, senior management, for example, shouldn\u2019t have administrative-level access to database systems or robotics. If management is going to have access to those levels, ensure they know exactly what they are doing. By denying role-based access to areas where it isn't needed, you limit the ability of personnel to improperly interrogate or compromise those systems.\n<\/p><p><b>What are your security control requirements?<\/b>\n<\/p><p>Figure 16 lists some of the key areas of concern for security controls.
Some we\u2019ll touch on, others we'll leave to those better informed (e.g., see Tulsi 2019<sup id=\"rdp-ebb-cite_ref-TulsiGreater19_21-0\" class=\"reference\"><a href=\"#cite_note-TulsiGreater19-21\">[11]<\/a><\/sup> and Riley 2020<sup id=\"rdp-ebb-cite_ref-RileyGitLab20_22-0\" class=\"reference\"><a href=\"#cite_note-RileyGitLab20-22\">[12]<\/a><\/sup>).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig16_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"62197432df9a7414f81fa4d22fbd9902\"><img alt=\"Fig16 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/4\/44\/Fig16_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 16.<\/b> Key areas of concern for security controls<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p><b>What is your policy on system backups?<\/b>\n<\/p><p>When it comes to your computer systems, are you backing up their K\/I\/D? If so, how often? How much K\/I\/D can you afford to lose? Look at your backups on at least three levels. First, backing up the hard drive of a computer protects against failure of that computer and drive. Backing up all of a lab\u2019s data systems to an on-site server in a separate building (or virtualized locally) protects against physical damage to the lab (e.g., fire, storms, earthquake, floods, etc.). Backing up all of a lab\u2019s data systems to a remote server (or virtualized remotely) provides even more protection against physical damage to the lab facility, particularly if the server is located someplace that won\u2019t be affected by the same problems your site may be facing. It should also be somewhere that doesn\u2019t compromise legal control over your holdings; if it is on a third-party server farm in another country, that country\u2019s laws apply to access and seizure of your files if legal action is taken against you.\n<\/p><p><b>Should your various laboratory spaces and components be internet-connected?<\/b>\n<\/p><p>When looking at your lab bench spaces, instruments, database systems, etc., determine whether they should be connected to the internet. This largely depends on what capabilities you expect to gain from internet access. Downloading updates, performing online troubleshooting with vendors, and conducting online database searches (e.g., spectra, images, etc.) are a few useful capabilities, but are they worth the potential risk of intrusion? Does your IT group have sufficient protections in place to allow that access safely? Note, however, that any requirement for a cloud-based system would render this a moot point.\n<\/p><p>Lab systems should be protected against any intrusion, including by vendors. Vendor-provided files can be downloaded to flash drives, which can then be checked for malware and integrity before being manually transferred to lab systems. Consider what is more important: convenience or data protection? This may give you more to think about when you consider your style of working (e.g., remote access).
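<\/p><p>As a sketch of the kind of integrity check meant here, assume the vendor publishes a SHA-256 checksum alongside each download; the file name and digest below are hypothetical placeholders. Comparing the computed digest against the published one before anything touches a lab system catches both corruption and tampering.\n<\/p>\n<pre>\n# Verify a vendor-supplied file against its published SHA-256 checksum\n# before moving it onto a lab system. The file name and expected digest\n# are hypothetical placeholders.\nimport hashlib\n\ndef sha256_of(path, chunk_size=65536):\n    # compute the file's SHA-256 digest, reading in chunks\n    digest = hashlib.sha256()\n    with open(path, 'rb') as f:\n        while chunk := f.read(chunk_size):\n            digest.update(chunk)\n    return digest.hexdigest()\n\npublished = 'paste-the-vendor-published-digest-here'.lower()\nactual = sha256_of('instrument_update_v2.4.bin')\nif actual == published:\n    print('Checksum matches; file is intact.')\nelse:\n    raise SystemExit('Checksum mismatch; do NOT install this file.')\n<\/pre>\n<p>The same few lines serve equally well for spot-checking that a backup copy still matches its original.\n<\/p><p>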
Remember, too, that having trusted employees access the lab network is different from opening it to third parties.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Summary\">Summary<\/span><\/h4>\n<p>We\u2019ve only been able to touch on a few topics; a more thorough review would require a well-maintained document, as things are changing that quickly.<sup id=\"rdp-ebb-cite_ref-23\" class=\"reference\"><a href=\"#cite_note-23\">[k]<\/a><\/sup> In many labs, security is a layered activity: as the work of the lab is planned, security issues are then considered. We\u2019d be far better off if security planning were instead conducted in concert with lab systems planning; support for security would then become part of the product selection criteria.\n<\/p>\n<h3><span id=\"rdp-ebb-Sixth_goal:_Acquiring_and_developing_"products"_that_support_regulatory_requirements\"><\/span><span class=\"mw-headline\" id=\"Sixth_goal:_Acquiring_and_developing_.22products.22_that_support_regulatory_requirements\">Sixth goal: Acquiring and developing \"products\" that support regulatory requirements<\/span><\/h3>\n<p>Products should be supportable. That seems pretty simple, but what exactly does that mean? How do we acquire them, and more importantly, how do we develop them? The methods and procedures you develop for lab use are \u201cproducts\u201d\u2014we\u2019ll come back to that.\n<\/p><p>First, here\u2019s an analogy using an automobile. The oil pan on a car may need to be replaced if it is damaged or has a failed gasket; left unrepaired, the engine will eventually lose its oil. Some vehicles are more difficult to work on than others given their engineering. For example, replacing the oil pan in some cars requires you to lift the engine block out of the car. That same car design could also force you to move the air conditioner compressor to change spark plugs. In the end, some automobile manufacturers have built a reputation for cars that are easy to service and maintain, which translates into lower repair costs and longer service life.\n<\/p><p>How does that analogy translate to the commercial products you purchase for lab use, as well as the processes and procedures you develop for your lab? The ability to effectively (i.e., with ease, at low cost, etc.) support a product has to be baked into the design from the start. It can\u2019t be retrofitted.\n<\/p><p>Let\u2019s begin with the commercial products you purchase for lab use, including instruments, computer systems, and so on. One of the purchase criteria for those items is how well they are supported: mature products should have a better support infrastructure, built up over time and use. However, that doesn\u2019t always translate to high-quality support; you may find a new product getting eager support because the vendor is heavily invested in market acceptance, working the bugs out, and using referral sites to support their sales. When it comes to supporting these commercial products, we expect to see:\n<\/p>\n<ul><li>User guides \u2013 These should tell you how the device works, what the components are (including those you shouldn\u2019t touch), how to use the control functions, what the expected operating environment is, what you need to provide to make the item usable, and so on. For electronic devices with control signal communications and data communications, the vendor will describe how it works and how they expect it to be used, but not necessarily how to use it with third-party equipment.
There are limitations of liability and implied support commitments that vendors prefer to avoid. They provide a level of capability, and it\u2019s largely left up to you to make it work in your application.<\/li>\n<li>Training materials \u2013 These will take you through opening the box, setting up whatever you\u2019ve purchased, and walking through all the features, with some examples of their use. The intent is to get you oriented and familiar with using it, with the finer details located in user guides. Either this document or the user guide should tell you how to ensure that the device is installed and operating properly, and what to do if it isn\u2019t. This category can also include in-person short courses as well as online courses (an increasingly popular option as something you can do at your convenience).<\/li>\n<li>Maintenance and troubleshooting manuals \u2013 This material describes what needs to be periodically maintained (e.g., installing software upgrades, cleaning equipment, etc.) and what to do if something isn\u2019t working properly.<\/li>\n<li>Support avenues \u2013 Be it telephone, e-mail, or online chat, there are typically many different ways of reaching the vendor for help. Online support can also include a \u201cknowledgebase\u201d of articles on related topics, as well as chat functions.<\/li>\n<li>User groups \u2013 Whether conducted in-person or online, venues for giving users a chance to solve problems and present material together can also prove valuable.<\/li><\/ul>\n<p>From the commercial side of laboratory equipment and systems, support is an easy thing to deal with. If you have good products and support, people will buy them. If your support is lacking, they will go somewhere else, or you will have fostered the development of a third-party support business if your product is otherwise desirable.\n<\/p><p>From the system user\u2019s perspective, lab equipment support is a key concern. Users typically don\u2019t want to take on a support role in the lab as that isn\u2019t their job. This brings us to an interesting consideration: product life cycles. You buy something, put it to use, and eventually it has to be upgraded (particularly if it involves software) or possibly replaced (as with software, equipment, instruments, etc.). Depending on how that item was integrated into the lab\u2019s processes, this can be a painful experience or an easy one. Product life cycles are covered in more detail later in this section, but for now know they are important because they apply, asynchronously, to every software system and device in your lab. Upgrade requirements may not be driven by a change in the functionality that is important to the lab, but rather by a change to an underlying component, e.g., the computer's operating system. This matters in a discussion about support because, when you evaluate a vendor's support capabilities, you need to cover this facet of the work. How well do they evaluate changes in the operating system (OS) in relation to the functionality of their product? Can they advise you about which upgrades are critical and which can be done at a more convenient time?
If a change to the OS or a database product occurs, how quickly do they respond?\n<\/p><p>Now that we have an idea what support means for commercial products, let\u2019s consider what support means for the \"products\"\u2014i.e., the procedures and methods\u2014developed in your lab.\n<\/p><p>The end result of a typical laboratory-developed method is a product that incorporates a process (Figure 17). This idea is nothing new in the commercial space. Fluid Management Systems, Inc. offers complex sample preparation processing systems as products<sup id=\"rdp-ebb-cite_ref-FMSAutomated_24-0\" class=\"reference\"><a href=\"#cite_note-FMSAutomated-24\">[13]<\/a><\/sup>, as do instrument vendors that combine autosamplers, an instrument, and a data system (e.g., some of Agilent\u2019s PAL autosampler systems incorporate sample preparation processing as part of their design<sup id=\"rdp-ebb-cite_ref-AgilentPAL_25-0\" class=\"reference\"><a href=\"#cite_note-AgilentPAL-25\">[14]<\/a><\/sup>). Those lab methods and procedures can range from a few steps to an extensive process whose implementations can include fully manual execution steps, semi-automated steps (e.g., manual plus instrumentation), and fully automated steps. In the first two cases, execution can occur with either printed or electronic documentation, or it can be managed by a LES. However, all of these implementations are subject to regulatory requirements (commercial products are subject to <a href=\"https:\/\/www.limswiki.org\/index.php\/ISO_9000\" title=\"ISO 9000\" class=\"wiki-link\" data-key=\"53ace2d12e80a7d890ce881bc6fe244a\">ISO 9000<\/a> requirements).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig17_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"2d332579e52fb24a7257395161cd95c0\"><img alt=\"Fig17 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/59\/Fig17_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 17.<\/b> A laboratory process<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<h4><span class=\"mw-headline\" id=\"The_importance_of_documentation\">The importance of documentation<\/span><\/h4>\n<p>Regulatory requirements and guidelines (e.g., from the FDA, EPA, ISO, etc.) have been with production and R&D for decades. However, some still occasionally question those regulations' and guidelines' application to research work. Rather than viewing them as hurdles a lab must clear to be deemed qualified, they should be viewed as hallmarks of a well-run lab. With that perspective, they remain applicable for any laboratory.\n<\/p><p>For purposes of this guide, there is one aspect of regulatory requirements that will be emphasized here: process validation, or more specifically the end result, which is a validated process. Laboratory processes, all of which have to be validated, are essentially products for a limited set of customers; in many cases it's one customer, while in others the same process may be replicated in other labs as-is.
The more complex the implementation, and the longer the process is expected to be in use, the more important it is to incorporate some of the tools from commercial developers into lab development (Table 1). However, regardless of development path, complete documentation is of the utmost concern.\n<\/p>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table class=\"wikitable\" border=\"1\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\" colspan=\"2\"><b>Table 1.<\/b> Process and product development documentation\n<\/td><\/tr>\n<tr>\n<th style=\"background-color:#dddddd; padding-left:10px; padding-right:10px;\">Laboratory-developed products\n<\/th>\n<th style=\"background-color:#dddddd; padding-left:10px; padding-right:10px;\">Commercially developed products\n<\/th><\/tr>\n<tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">\u2022 user requirements document<br \/>\u2022 functional specification<br \/>\u2022 system design and testing<br \/>\u2022 implementation<br \/>\u2022 installation, operational, and performance qualification<br \/>(IQ, OQ, PQ) evaluation and acceptance\n<\/td>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\">\u2022 product requirements document<br \/>\u2022 engineering or functional specification<br \/>\u2022 product testing protocol<br \/>\u2022 product readiness for market acceptance criteria<br \/> <br \/>PLUS product support elements like:<br \/>\u2022 user guides<br \/>\u2022 training materials<br \/>\u2022 maintenance and troubleshooting guides<br \/>\u2022 support mechanisms<br \/>\u2022 user groups\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Documentation is valuable because:\n<\/p>\n<ul><li>It's educational: Quality documentation ensures those carrying out the process or maintaining it are thoroughly educated. Written documentation (with edits and <a href=\"https:\/\/www.limswiki.org\/index.php\/Audit_trail\" title=\"Audit trail\" class=\"wiki-link\" data-key=\"96a617b543c5b2f26617288ba923c0f0\">audit trails<\/a>, as appropriate) acts as a stable reference point for how things should be done. The \u201cfollow-me-and-I\u2019ll-show-you\u201d approach is flawed. That method depends on someone accurately remembering and explaining the details while having the time to actually do it, all while hoping bad habits don't creep in and become part of \u201chow it\u2019s done.\u201d<\/li>\n<li>It informs: Quality documentation that is accessible provides a reference for questions and problems as they occur. The depth of that documentation, however, should be based on the nature of the process. Even manual methods that are relatively simple need some basic elements. To be informative, it should address numerous questions. Has the instrument calibration been accurately verified? How do you tell, and how do you correct the problem if the instrument is out of calibration? What information is provided about reagents, including their age, composition, strength, and purity? When is a technician qualified to use a reagent? How are reference materials incorporated as part of the process to ensure that it is being executed properly and consistently?<\/li><\/ul>\n<p>Note that the support documents noted in Table 1 are not usually part of process validation.
The intent of process validation is to show that something works as expected once it is installed.\n<\/p><p>One aspect that hasn\u2019t been mentioned so far is how to address necessary change within processes. Any lab process is going to change over time. There may be a need for increased throughput, lower operating costs, less manual work, the ability to run over multiple shifts, etc. There may also be new technologies that improve lab operations that eventually need to get incorporated into the process. As such, planning and process documentation should describe how processes are reviewed and modified, along with any associated documentation and training. This requires the original project development to be thoroughly documented, from functionality scoping to design and implementation. Including process review and modification as part of a process allows that process to be upgraded without having to rebuild everything from scratch. This level of documentation is rarely included due to the initial cost and impact on the schedule. It will affect both, but it will also show its value once changes have to be made. In the end, by adding process review and modification mechanisms, you ensure a process is supportable.\n<\/p><p>To be clear, the initial design and planning of processes and methods has to be done well for a supportable product. This means keeping in mind future process review and modification even as the initial process or method is being developed. It\u2019s the difference between engineering a functional and supportable system and \u201cjust making something that works.\u201d Here are three examples:\n<\/p>\n<ul><li>One instrument system vendor, in a discussion between sessions of a meeting, described how several of his customers successfully connected a <a href=\"https:\/\/www.limswiki.org\/index.php\/Chromatography_data_system\" title=\"Chromatography data system\" class=\"wiki-link\" data-key=\"a424bb889d8507b7e8912f2faf2570c6\">chromatography data system<\/a> (CDS) to a LIMS. It was a successful endeavor until one of the systems had to be upgraded; then everything broke. The programmer had made programming changes to areas of the packages that they shouldn\u2019t have. When the upgrade occurred, those changes were overwritten. The project had to be scrapped and re-developed.<\/li>\n<li>A sample preparation robotics system was similarly implemented by creating communications between devices in ways that were less than ideal. When it came time for an upgrade to one device, the whole system failed.<\/li>\n<li>A consultant was called in to evaluate a project to interface a tensile tester to a computer, as the original developer had left the company. The consultant recommended the project be scrapped and begun anew. The original developer had not left any design documentation, the code wasn\u2019t documented, and no one knew if any of it worked or how it was supposed to work. Trying to understand someone else\u2019s programming without documentation assistance is really a matter of trying to figure out their thinking process, and that can be very difficult.<\/li><\/ul>\n<p>There are a number of reasons why problems like this exist. Examples include a lack of understanding of manual and automated systems design and engineering methodologies, as well as pressure from management (e.g., \u201chow fast can you get it done,\u201d \u201ckeep costs down,\u201d and \u201cwe\u2019ll fix it in the next version\u201d).
Succumbing to these short-term views will inevitably come back to haunt you in the long term. Upgrades, things you didn\u2019t think of when the original project was planned, and support problems all tend to highlight work that could have been done better. Another saying that frequently comes up is \u201cthere is never time to do it right, but there is always time to do it over,\u201d usually at a considerably higher cost.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Additional_considerations_in_creating_supportable_systems_and_processes\">Additional considerations in creating supportable systems and processes<\/span><\/h4>\n<p><b>Single-vendor or multi-vendor?<\/b>\n<\/p><p>Let\u2019s assume we are starting with a manual method that works and has been fully validated with all appropriate documentation. Then someone wants to change that method in order to meet a company goal such as increasing productivity or lowering operational costs. Achieving goals like those usually means introducing some sort of automation, anything from automated pipettes to instrumentation depending on the nature of the work. Even more important, if a change in the fundamental science underlying the methodology is proposed, that would also require re-validation of the process.\n<\/p><p>Just to keep things simple, let\u2019s say the manual process has an instrument in it, and you want to add an autosampler to keep the instrument fed with samples to process. This means you also need something to capture and process the output, or any productivity gains on the input side may be lost in handling the output. We\u2019ll avoid that discussion because our concern here is supportability. There are a couple of directions you can go in choosing an add-on sampler: buy one from the same vendor as the instrument, or buy it from another vendor because it is less expensive or has some interesting features (though unless those features are critical to improving the method, they should be considered \u201cnice but not necessary\u201d).\n<\/p><p>How difficult is it going to be to make any physical and electronic (control) connections to the autosampler? Granted, particularly for devices like autosamplers, vendors strive for compatibility, but there may be special features that need attention. You need to consider not just the immediate situation but also how things might develop in the future. If you purchase the autosampler from the same vendor as the instrument and control system, they are going to ensure that things continue to work properly if upgrades occur or new generations of equipment are produced (see the product life cycle discussion in the next section). If the two devices are from different vendors, compatibility across upgrades is your issue to resolve. Both vendors will do what they can to make sure their products are operating properly and answer questions about how they function, but making them work together is still your responsibility.\n<\/p><p>From the standpoint of supportability, the simpler approach is the easiest to support. Single-vendor solutions put the support burden on the vendor. If you use multi-vendor implementations, then all the steps in making the project work have to be thoroughly documented, from the statement of need through to the functional specifications and the finished product.
The documentation may not be very long, but any assistance you can give someone who has to work with the system in the future\u2014including yourself (i.e., \u201cwhat was I thinking when I did this?\u201d)\u2014will be greatly appreciated.\n<\/p><p><b>On-device programming or supervisory control?<\/b>\n<\/p><p>Another consideration is for semi- or fully automated systems where a component is being added or substituted. When we are looking at programmable devices, one approach is to make connections between devices via on-device programming. For example, say Device A needs to work with Device B, so programming changes are made to both to accomplish the task. While this can be made to work and be fully documented, it isn\u2019t a good choice, since changing one of them (via upgrade or swap) will likely mean the programming has to be re-implemented. A better approach is to use a supervisory control system to control them both, along with any other devices that may be part of the same process. It allows for a more robust design, easier adaptation, and smoother implementation. It should also be easier to support, since programming changes will be limited to communications code.\n<\/p><p><b>Third-party developers and contractors?<\/b>\n<\/p><p>Frequently, third parties are brought in to provide services that aren\u2019t available through the lab or onsite IT staff. In those arrangements, a functional specification usually describes what you want as the end result and what the deliverable is supposed to do, not how it is supposed to do it; the \u201chow\u201d is left to the developer to figure out. You need to add supportability as a requirement: the end result must not only meet regulatory requirements but also be designed and documented with sufficient information for someone unfamiliar with the project to understand what would have to be done if a change were made. That, in turn, requires you to think about where changes might be made in the future, including what components might be swapped for newer technologies, how software upgrades will be handled (and what might break as a result of them), and what to do if components reach their supported end-of-life and have to be replaced.\n<\/p><p>Consulting firms may respond with \u201cif something needs to be changed or fixed, just call us, we built it,\u201d which sounds reasonable. However, suppose the people who built it aren\u2019t there anymore or aren't available because they're working on other projects. The reality is the \u201ccompany\u201d didn\u2019t build the product; people working for them did.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Product_life_cycles\">Product life cycles<\/span><\/h4>\n<p>When discussing product life cycles, whether for digital products or hardware systems, the bottom-line problem is this: what do you do when a critical product needs to be updated or replaced? This can be an easy issue or a very painful one, depending on how much thought went into the design of the original procedure using that device. It's generally easy if you had the forethought of noting \u201csomeday this is going to be replaced, so how do we simplify that?\" It's more difficult if you, through wiring or software, linked devices and systems together and then can\u2019t easily separate them, particularly if no one documented how that might be accomplished.
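<\/p><p>One way to preserve that separability is the supervisory-control approach described earlier: put a thin abstraction between the process logic and each device, so the supervisor knows only an interface and swapping hardware means writing one new adapter. The sketch below is illustrative only; all class and method names are hypothetical, and a real implementation would drive each device through its vendor's documented command set.\n<\/p>\n<pre>\n# Sketch of supervisory control through a thin device abstraction.\n# All names are hypothetical.\nfrom abc import ABC, abstractmethod\n\nclass Autosampler(ABC):\n    # what the supervisor needs from any autosampler, regardless of vendor\n    @abstractmethod\n    def inject(self, position):\n        ...\n\nclass VendorAAutosampler(Autosampler):\n    def inject(self, position):\n        # vendor-specific command goes here (serial, USB, API call, etc.)\n        print(f'Vendor A: injecting from tray position {position}')\n\nclass VendorBAutosampler(Autosampler):\n    def inject(self, position):\n        print(f'Vendor B: injecting from rack slot {position}')\n\nclass Supervisor:\n    # runs the process; depends only on the Autosampler interface\n    def __init__(self, sampler):\n        self.sampler = sampler\n\n    def run_batch(self, positions):\n        for pos in positions:\n            self.sampler.inject(pos)   # same call whichever vendor is attached\n\n# Swapping hardware means changing one line, not the process logic:\nSupervisor(VendorAAutosampler()).run_batch([1, 2, 3])\n<\/pre>\n<p>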
It\u2019s all a matter of systems engineering done well.\n<\/p><p><b>Note<\/b>: This material was originally published in <i><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.researchgate.net\/publication\/275351757_Computerized_Systems_in_the_Modern_Laboratory_A_Practical_Guide\" target=\"_blank\">Computerized Systems in the Modern Laboratory: A Practical Guide<\/a><\/i>.\n<\/p><p>An analog pH meter\u2014as long as it has been maintained\u2014will still work today. So will double-pan balances, again as long as they have been maintained and people are proficient in their use. Old lab equipment that will still function properly has been replaced with more modern equipment due to better accuracy, ease of use, and other factors. Analog instruments can still be found operating decades after their design end-of-life. It is in the digital realm that we find equipment that should work (and probably does) yet can\u2019t be used after a few years of service. The rate of technology change is such that tools become obsolete on the order of a half-decade. For example, rotating disks, an evolving computer staple that replaced magnetic tape drives, are now being replaced with solid-state storage.\n<\/p><p>Digital systems require two components to work: hardware (e.g., the computer, plus interfaces for the hard disk, and ports for cable connections) and software (e.g., operating systems plus software drivers to access the hardware). Both hardware packaging and operating systems are changing at an increasing rate. Hardware systems are faster, with more storage, and operating systems are becoming more complex to meet consumer demands, with a trend toward more emphasis on mobile or social computing. Those changes mean that device interfaces we rely on may not be there in the next computer you have to purchase. The RS-232 serial port, a standard for instrument connections, is being replaced with USB, Firewire, and Thunderbolt connections that support a much wider range of devices and simplify computer design, yielding more usable and less costly machines. It also means that an instrument with an RS-232 interface may not work with a new computer because the computer has no RS-232 ports, and the operating system may no longer be compatible with the instrumentation.\n<\/p><p>One aspect of technology planning in laboratory work is change management, specifically the impact of technology changes and product life cycles on your ability to work. The importance of planning around product life cycles has taken on an added dimension in the digital laboratory. Prior to the use of computers, instruments were the tools used to obtain data, which was recorded in paper laboratory notebooks. End result: getting the data and recording and managing it were separate steps in lab work. If the tools were updated or replaced, the data recorded wasn\u2019t affected. In the digital realm, changes in tools can affect your ability to work with new and old data and information. The digital-enabled laboratory requires planning, with a time horizon of decades to meet legal and regulatory requirements. The systems and other tools you use may not last for decades; in fact, they will probably change several times. However, you will have to plan for the transfer of the data and information they contain and address the issue of database access and file formats.
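<\/p><p>One defensive habit is exporting each result to open, self-describing formats at the time it is produced, so the K\/I\/D outlives the application that created it. A minimal sketch follows; the record fields are hypothetical. Note the earlier caveat, though: open formats aid readability, not tamper-resistance, so pair them with checksums or write-once storage where data integrity matters.\n<\/p>\n<pre>\n# Sketch: archive a result in open formats (CSV for the values, JSON for\n# context) so it stays readable after the producing system is gone.\n# The record fields are hypothetical.\nimport csv\nimport json\n\nresult = {\n    'sample_id': 'S-2021-0042',\n    'analyte': 'lead',\n    'value': 0.012,\n    'units': 'mg\/L',\n    'instrument': 'ICP-MS #2',\n    'analyst': 'jdoe',\n    'timestamp': '2021-11-04T14:22:00',\n}\n\n# plain-text CSV: readable by essentially anything, now or decades from now\nwith open('S-2021-0042.csv', 'w', newline='') as f:\n    writer = csv.DictWriter(f, fieldnames=result.keys())\n    writer.writeheader()\n    writer.writerow(result)\n\n# JSON sidecar carries the same record plus room for method\/context metadata\nwith open('S-2021-0042.json', 'w') as f:\n    json.dump({'schema': 'lab-result-v1', 'record': result}, f, indent=2)\n<\/pre>\n<p>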
The primary situation to avoid is having data in files that you can\u2019t read.\n<\/p><p>While we are going to begin looking at planning strategies for isolated products as a starting point, please keep in mind that in reality products do not exist in isolation. The laboratory\u2019s K\/I\/D is increasingly interconnected, and changes in one part of your overall technology plan can have implications across your lab's working technology landscape. The drive toward integration and paperless laboratories has consequences that we are not fully prepared to address. We\u2019ll start with the simpler cases and build upon that foundation.\n<\/p><p>Digital products change for a number of reasons:\n<\/p>\n<ul><li>The underlying software that supports informatics applications could change (e.g., operating systems, database systems), and the layers of software that build on that base have to be updated to work properly.<\/li>\n<li>Products could see improvements due to market research, customer comments, and competitive pressures.<\/li>\n<li>Vendors could get acquired, merge with another company, or split up, resulting in products merging or one product being discarded in favor of another.<\/li>\n<li>Your company could be acquired, merge with another company, or split into two or more organizations.<\/li>\n<li>Products could simply fail.<\/li>\n<li>Your lab could require a change, replacing older technologies with systems that provide more capability.<\/li><\/ul>\n<p>In each of these cases, you have a decision to make about how K\/I\/D is going to be managed and integrated with the new system(s).\n<\/p><p>But how often do digital products change? Unfortunately, there isn't much detailed information published about changes in vendor products. Historically, operating systems were updated with new versions on an annual basis, with updates (e.g., bug fixes, minor changes) occurring more frequently. With a shift toward subscription services, version changes can occur more frequently. The impact of an OS version change will vary depending on the OS. Some vendors take responsibility for and control of both the hardware and software, and as a result, upgrades support both the hardware and OS until the vendor no longer supports new OS versions on older systems. Other computer systems, where the hardware and software components come from different vendors, can see hardware components become inaccessible after an upgrade, as the OS upgrade may only support certain hardware features. Support for specific add-on equipment (including components provided by the computer vendor) may require finding and reinstalling drivers from the original component vendor. As for the applications that run on operating systems, they will need to be tested with each OS version change.\n<\/p><p>Applications tend to be updated on an irregular basis, for both direct installs and cloud-hosted solutions. Microsoft Office and Adobe\u2019s Creative Cloud products may be updated as their vendors see a need. Since both product suites are now accessed via the internet on a subscription basis (<a href=\"https:\/\/www.limswiki.org\/index.php\/Software_as_a_service\" title=\"Software as a service\" class=\"wiki-link\" data-key=\"ae8c8a7cd5ee1a264f4f0bbd4a4caedd\">software as a service<\/a> or SaaS), user action isn\u2019t required. Lab-specific applications may be upgraded or updated as the vendor sees a need; SaaS implementations are managed by the vendor according to the vendor's internal planning.
Larger, stable vendors may provide upgrades on a regular, annual basis for on-site installations. Small vendors may only update when a significant change is made, which might include new features, or when forced to because of OS changes. If those OS-compatibility changes aren\u2019t made, you will find yourself running software that is increasingly out of date. That doesn\u2019t necessarily mean it will stop working (for example, Microsoft dropped support for Windows XP in the spring of 2014, and computers running it didn\u2019t suddenly stop). What it does mean is that if your computer hardware has to be replaced, you may not be able to re-install a working copy of the software. The working lifetime of an application, particularly a large one, can be on the order of a decade or more. Small applications depend upon market acceptance and the vendor\u2019s ability to stay in business. Your need for access to data may exceed the product's life.\n<\/p><p>The perception of the typical product life cycle runs like this: a need is perceived; product requirements are drafted; the product is developed, tested, and sold; based on market response to the product, new product requirements are determined; and the cycle continues. The reality is a bit more complicated. Figure 18 shows a more realistic view of a product\u2019s life cycle. The letters in the circles refer to key points where decisions can have an impact on your lab (\"H\" = high, \"M\" = medium, \"L\" = low).\n<\/p><p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig18_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"2d8772c77fd3c2956352cf924ce549f3\"><img alt=\"Fig18 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/0\/02\/Fig18_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 18.<\/b> The product life cycle<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The process begins with an initial product concept, followed by the product's development, introduction, and marketing programs, and finally its release to customers. If the product is successful, the vendor gathers customer comments, analyzes competitive technologies and any new technologies that might be relevant, and determines a need for an upgrade.\n<\/p><p>This brings us to the first decision point: is an upgrade possible with the existing product? If it is, the upgrade requirements are researched and documented and the process moves to development, with generally a low impact on users. \u201cGenerally\u201d because it depends on the nature of the product and what modifications, changes, and customizations have been made by the user. If it is an application that brings a data file in, processes it, and then saves the result in an easily accessible file format, allowing no user modifications to the application itself, \u201clow impact\u201d is a fair assessment. Statistical analysis packages, image processing applications, and other such software fall into this set.
Problems can arise when user modifications are overwritten by the upgrade and have to be reinstalled (only a minor issue if it is a plug-in with no programming changes) or re-implemented by making programming changes (a major problem since it requires re-validation). Normally any customization (e.g., naming database elements) and data held within an application's database should be transferred without any problems, though you do need to make some checks and evaluations to ensure that this is the case. This is the inner loop of Figure 18.\n<\/p><p>Significant problems can begin if the vendor determines that the current product generation needs to be replaced to meet market demands. If this is a hardware product (e.g., pH meter, balance, instrument, computer, etc.), there shouldn\u2019t be any immediate impact (the hardware will continue to work). However, once there is a need for equipment replacement, it becomes a different matter; we\u2019ll pick this thread up later when we discuss product retirement.\n<\/p><p>Software is a different situation. The new generation of software may not be compatible with the hardware you currently have. What you have will still work, but if there are features in the new generation that you\u2019d like to have, you may find yourself having to re-implement any software changes that you\u2019ve made to the existing system. It will be like starting over again, unless the vendor takes pains to ensure that the upgrade installation is compatible with the existing software system. This includes assurances that all the data, user programming, settings, and all the details you\u2019ve implemented to make a general product work in your environment can successfully migrate. You will also have to address user education, and plan for a transition from the old system to the new one.\n<\/p><p>One problem that often occurs with new generations of software is a change in the underlying data file structures. The vendor may have determined that in order to make the next generation work and to be able to offer the features they want included, the file structure and storage formats will change. This will require you to re-map the existing file structure into the new one. You may also find that some features do not work the same as they did before and your processes have to be modified. In the past, even new versions of Microsoft Office products have had compatibility issues with older versions. In large applications such as informatics or instrument data systems (e.g., a multi-user chromatography data system), changes in formats can be significant, and they can affect the importing of instrument data into informatics products. For example, some vendors and users use a formatted PDF file as a means of exchanging instrument data with a LES, SDMS, or ELN. If the new version of an instrument data system changes its report formatting, the PDF parsing routine will have to be updated.\n<\/p><p>At this point, it's important to note that just because a vendor comes out with a new software or hardware package doesn\u2019t mean that you have to upgrade. If what you have is working and the new version or generation doesn\u2019t offer anything of significant value (particularly when the cost of upgrading and the impact on lab operations is factored in), then bypass the upgrade.
Among the factors that can tug you into an upgrade is the potential loss of support for the products you are concerned with.\n<\/p><p>What we\u2019ve been discussing in the last few paragraphs covers the outer loop to the right of Figure 18. The next point we need to note in that figure is the \u201cNo\u201d branch from \u201cNew Product Generation Justified?\u201d and \u201cProduct Fails,\u201d both of which lead to product retirement. For both hardware and software, you face the loss of customer support and the eventual need for product replacement. In both cases, there are steps that you can take to manage the potential risks.\n<\/p><p>To begin with, unless the vendor is going out of business, they are going to want to maintain a good relationship with you. You are a current and potential future customer, and they\u2019d like to avoid bad press and problems with customer relationships. Explain how the product retirement is going to affect you and get them to work with you on managing the issue; you aren\u2019t the only one affected by this (see the commentary on user groups later). If you are successful, they will see a potential liability turn into a potential asset: you can now be a referral for the quality of their customer service and support. Realistically, however, your management of product retirement or major product changes has to occur much earlier in the process.\n<\/p><p>Your involvement begins at the time of purchase. At that point you should be asking what the vendor\u2019s update and upgrade policies are, how frequently they occur, what the associated costs are, how much advance notice they give for planning, and what level of support is provided. In addition, determine where the product you are considering lies in the product\u2019s life cycle. Ask questions such as:\n<\/p>\n<ul><li>Is it new and potentially at-risk for retirement due to a lack of market acceptance? If it is and the vendor is looking for reference sites, use that to drive a better purchase agreement. Make sure that the product is worth the risk, and be prepared in case the worst-case scenario occurs.<\/li>\n<li>Is it near its end-of-life with the potential for retirement? Look at the frequency of updates and upgrades. Are they tailing off or is the product undergoing active development?<\/li>\n<li>What is the firm\u2019s financial position? Is it running into declining sales or are customers actively seeking it? Is there talk of acquisitions or mergers, either of which can put the product's future into question?<\/li><\/ul>\n<p>You should also ask for detailed technical documents that describe where programming modifications are permitted and preserved against vendor changes, and how data will be protected, along with any tools for data migration. Once you know what the limitations are for coding changes, device additions, and so on, the consequences of deviating from them are your responsibility; whatever you do should be done deliberately and with full awareness of its future impact.\n<\/p><p>One point that should be clarified during the purchase process is whether you are purchasing a product or a product license. If you are purchasing a product, you own it and can do what you like with it, at least for hardware products. Products that are combinations of hardware and software may be handled differently since the hardware won\u2019t function without the software. Licenses are \u201crights to use\u201d with benefits and restrictions.
Those should be clearly understood, as well as what you can expect in terms of support, upgrades, the ability to transfer products, how products can be used, etc. If there are any questions, the time to get them answered is before you sign purchase agreements. You have the best leverage for gaining information and getting reasonable concessions that are important to you while the vendor is trying to sell you something. If you license a product, the intellectual property within the product belongs to the vendor while you own your K\/I\/D; if you decide to stop using a product, you should have the ability to extract your K\/I\/D in a usable form.\n<\/p><p>Another point: if the product is ever retired, what considerations are provided to you? For a large product, they may not be willing to offer documented copies of the code so that you can provide self-support, but a small company trying to break into the market might. It doesn\u2019t hurt to ask, and get any responses in writing; don\u2019t trust someone\u2019s verbal comments, as that person may not be there when upgrades or product retirement occur. Additionally, it's always beneficial to conduct negotiations on purchases and licenses in cooperation with your company's IT and legal groups. IT can advise on industry practices, and the legal department\u2019s support will be needed for any agreements.\n<\/p><p>Another direction you should take is participating in user groups. Most major vendors and products have user groups that may exist as virtual organizations on LinkedIn, Yahoo, or other forums. Additionally, they often have user group meetings at major conferences. Company-sponsored group meetings provide a means for learning about product directions, raising issues, discussing problems, etc. Normally these meetings are divided into private (registered users only) and public sessions, the former being the most interesting since they provide a means for unrestricted comments. If a new version or upgrade is being considered, it will be announced and discussed at group meetings. These will also provide a mechanism for making needs known and, if a product is being retired, lobbying for support. The membership contact list will provide a resource for exchanging support dialogue, particularly if the vendor is reluctant to address points that are important to you.\n<\/p><p>If a group doesn\u2019t exist, start a virtual conference and see where it goes. If participation is active, let the vendor know about it; they may take an interest and participate or make it a corporate function. It is in a company's best interest to work with its customers rather than antagonize them. Your company\u2019s support may be needed for involvement in, or starting, user groups because of the potential for liability, intellectual property protection, or other issues. Activities performed in these types of groups can be wide-ranging, from providing support (e.g., trading advice, code, tutorials, etc.) and sharing information (e.g., where to get parts, information for out-of-warranty products) to identifying and critiquing repair options and meeting informally at conferences.\n<\/p><p>The key issue is to preserve your ability to carry out your work with as little disruption as possible. That means you have to protect your access to the K\/I\/D you\u2019ve collected, along with the ability to work with it.
In this regard, software systems have one possible advantage: virtualization.\n<\/p><p><b>Virtualization: An alternative to traditional computing models<\/b>\n<\/p><p>There are situations in laboratory computing that are similar to the old joke \u201cyour teeth are in great shape but the gums have to go.\u201d The equivalent situation is running a software package and finding out that the computer hardware is failing and the software isn\u2019t compatible with new equipment. That can happen if the new computer uses a different processor than the one you are working with. An answer to the problem is a technology called virtualization. In the context of the joke, it lets you move your teeth to a new set of gums; or to put it another way, it allows you to run older software packages on new hardware and avoid losing access to older data (there are some limitations).\n<\/p><p>Briefly put, virtualization allows you to run software (including the operating system) designed for one computer on an entirely different system. An example: the Windows XP operating system and applications running on a Macintosh computer using the Mac OS X operating system via VMware\u2019s Fusion product. In addition to rescuing old software, virtualization can:\n<\/p>\n<ul><li>reduce computing costs by consolidating multiple software packages on servers;<\/li>\n<li>reduce software support issues by preventing operating system upgrades from conflicting with lab software;<\/li>\n<li>provide design options for multiple labs using informatics products without incurring hardware costs and giving up lab space to on-site computers; and<\/li>\n<li>reduce interference between software packages running on the same computer.<\/li><\/ul>\n<p>Regarding that last benefit, it's worth noting that with virtualization, adding software packages means each gets its own \u201ccomputer\u201d without additional hardware costs. Product warranties may state the software warranty is limited to instances where the software is installed on a \u201cclean machine\u201d (just the current operating system and that software package, nothing else). Most people put more than one application on a computer, technically voiding the warranty. Virtualized containers let you go back to that clean machine concept without buying extra hardware.\n<\/p><p>In order to understand virtualization, we have to discuss computing, but just the basics. Figure 19 shows an arrangement of the elements. When the computer is first turned on, there are three key items engaged: the central processing unit (CPU), memory, and mass storage. The first thing that happens (after the hardware startup sequence) is that portions of the operating system are placed in the memory where the CPU can read instructions and begin working.
The key point is that the operating system, applications, and files are a collection of binary data elements (words) that are passed on to the CPU.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig19_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"581b842fdbf22190f7395fe6b76cb393\"><img alt=\"Fig19 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/5f\/Fig19_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" width=\"362\" height=\"318\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 19.<\/b> Key elements in a computer system<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The behavior of the CPU can be emulated by a software program. We can have a program that acts like an Intel processor, for example, or a processor from another vendor. If we feed that program the instructions from an application, it will execute that application. There are emulators, for example, that will allow your computer to emulate an Atari 2600 game console and run Asteroids. There are also emulators for other game consoles, so your computer can behave like any game console you like, as long as you have an emulator for it. Each emulator has all the programming needed to execute copies of the original game programming. They don\u2019t wear out or break. This configuration is shown in Figure 20.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig20_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"d8a4ca1c3f8dd09c02b68769693e2ce8\"><img alt=\"Fig20 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/8\/8d\/Fig20_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 20.<\/b> Emulation on a computer<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>You have a series of game emulators, each with its collection of games. Any game emulator can be loaded into memory and execute a game from its collection; games for other emulators won\u2019t work. Each game emulator and game collection is called a container. When you want to play, you access the appropriate container and go. If the mass storage is on a shared server, other people can access the same containers and run the games on their computers without interfering with each other.\n<\/p><p>How does this apply to a computer with failing hardware that is running a data analysis program? Virtualized systems allow you to make a copy of mass storage on the computer, create a container containing the CPU emulator, OS, applications, and data files, and place it on a server for later access. The hardware no longer matters because it is being replaced with the CPU emulator. Your program\u2019s container can be copied, stored, backed up, etc. It will never wear out or grow old.
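<\/p><p>To make the emulation idea above concrete, here is a toy sketch, in Python, of the fetch-decode-execute loop an emulator performs. The four-instruction machine is invented for illustration and is not any real processor's instruction set; real emulators do the same thing for the complete instruction set of an actual CPU.
<\/p>
<pre>
# A toy "CPU" implemented in software. Each instruction is fetched,
# decoded, and executed, just as a hardware processor would do.

PROGRAM = [
    ('LOAD', 2),      # put the constant 2 in the accumulator
    ('ADD', 40),      # add the constant 40 to the accumulator
    ('PRINT', None),  # write the accumulator to output
    ('HALT', None),   # stop the machine
]

def run(program):
    acc = 0                    # accumulator register
    pc = 0                     # program counter
    while True:
        op, arg = program[pc]  # fetch and decode
        pc += 1
        if op == 'LOAD':       # execute
            acc = arg
        elif op == 'ADD':
            acc += arg
        elif op == 'PRINT':
            print(acc)
        elif op == 'HALT':
            break

run(PROGRAM)  # prints 42
<\/pre>
<p>Commercial emulators and hypervisors are vastly more sophisticated, but the principle is the same: software stands in for hardware, executing the same stored instructions the original machine would have.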
When you want to run the preserved program, you access its container and run it. A server can support many containers.\n<\/p><p>There are some restrictions, however. First, most computers that are purchased from stores or online come pre-loaded with an operating system. Those operating systems are OEM (original equipment manufacturer) copies of the OS whose license cost is buried in the purchase price. They can\u2019t be copied or transferred legally, and some virtual servers will recognize OEM copies and not transfer them. As a result, in order to make a virtualized container, you need a fully licensed copy of the OS. Your IT group may have corporate licenses for widely used operating systems, so that may not pose a problem. Next, recognize that some applications will require a separate license for use on a virtualized system. As frequently noted, planning ahead is key: explore this option as part of the purchase agreement, as you may get a better deal. Third, it's important to note that virtualized systems cannot support real-time applications such as direct analog, clock-driven, time-critical data acquisition from an instrument. The virtualized software shares resources with other containers in a time-sharing mode, and as a result the close timing coordination needed for data acquisition will not work. Fortunately, direct data acquisition (as contrasted with computer-to-instrument communications via RS-232, USB, Ethernet, etc.) is occurring less often in favor of buffered data communications with dedicated data acquisition controllers, so this is becoming less of a problem. If you need direct computer-controlled data acquisition and experiment control, this isn\u2019t the technology for you. Finally, containerized software running on virtualized systems cannot access hardware that wasn\u2019t part of the original configuration. If the host computer has a piece of hardware that you\u2019d like to use but that wasn\u2019t on the original virtualized computer, the container won\u2019t be able to use it since it doesn\u2019t have the software drivers to access it.\n<\/p><p>If the application software permits it, applications can have shared access to common database software. A virtualized LIMS may be a good way to implement the application since it doesn\u2019t require hardware in the lab and uses servers that are probably under IT control, and as a result the systems are backed up regularly. The major hang-up on these installations is instrument connections. IT groups tend to get very conservative about that subject. Middleware could help isolate actual instrument connections from the network and could potentially resolve the situation. The issue is part technical, part political. However, virtualized LIMS containers still prove beneficial for educational purposes. A student can work with the contents of a container, experiment as needed, and when done dismiss the container without saving it; the results of the experiments are gone, mistakes and all.\n<\/p><p>There are different types of virtualization. One has containers sharing a common emulator and operating system. As a result, you update or upgrade the emulator software and\/or operating system once and the change is made across all containers.
That can cause problems for some applications; however, they can be moved to a second type of virtualization in which each container has its own copy of the operating system and can be excluded from updates.\n<\/p><p>If you find this technology appealing, check with your vendor to see if the products of interest will function in a virtualized environment (not all will). Ask careful questions, such as whether their software will run under VMware\u2019s products, Microsoft\u2019s Desktop Virtualization products, or Microsoft\u2019s Hyper-V server. Some vendors don\u2019t understand the difference between virtualization and client-server computing. Get any responses in writing.\n<\/p><p><b>Retirement of hardware<\/b>\n<\/p><p>Replacing retired hardware can be a challenge. If it is a stand-alone, isolated product (not connected to anything else), the problem can be resolved by determining the specifications for a replacement, conducting due diligence, etc. It is when data systems, storage, and connections to computers enter the picture that life gets interesting. For example, replacing an instrument sans data system, such as a chromatograph or spectrometer with analog and digital I\/O (sense switches, not data) connections to a computer, is essentially just a hardware replacement.\n<\/p><p>Hardware interfaced to a computer has issues because of software controls and data exchanges. What appears to be the simplest and most common situation is with serial communications (RS-232, RS-xxx). Complications include:\n<\/p>\n<ul><li>Wiring: Serial communications products do not always obey conventions for wiring, so wiring changes have to be considered and tested.<\/li>\n<li>Control functions and data exchange: Interacting with serial devices via a computer requires both control functions and data exchange. There are no standards for these, so a new purchase will likely require software changes. That may be avoided if the replacement device (e.g., a balance) is from the same vendor as the one you currently have, and is part of a family of products. The vendor may preserve the older command set and add new commands to access new features. If that is the case, you still have a plug-compatible replacement that needs to be tested and qualified for use.<\/li>\n<li>Interfaces: Moving from an RS-232 or similar RS- device to another interface such as USB will require a new interface (although USB ports are on almost every computer) and software changes.<\/li><\/ul>\n<p>If you are using a USB device, the wiring problems go away but the command structure and data transfer issues remain. Potential software problems are best addressed when the software is first planned and designed; good design means planning ahead for change. The primary issue is control of the external device, though data formats may also change. Both points can be addressed by device-independent programming. That means placing all device-dependent commands in one place\u2014a subroutine\u2014and formatting data into a device-independent format. Doing this makes changes, testing, and other aspects easier.\n<\/p><p>Let\u2019s take a single-pan balance that has two functions: <tt>tare<\/tt> and <tt>get_weight<\/tt>. Each of those has a different command sequence of characters that are sent to the balance, and each returns either a completion code or a numerical value that may be encoded in ASCII, BCD, or binary, depending on the vendor\u2019s choice.
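<\/p><p>As a minimal sketch of what that looks like in practice, consider the following Python fragment, which assumes a serial-attached balance and the pyserial library. The command strings, port name, and reply format are invented for illustration, since every vendor defines its own; the point is that all of the device-dependent details live in exactly one place.
<\/p>
<pre>
import serial  # pyserial; assumed to be available

# End-of-line terminator used by our hypothetical balance (CR + LF).
EOL = bytes([13, 10])

class Balance:
    # Swapping in a different balance means editing only these command
    # strings and the reply parsing, not the rest of the application.
    TARE_CMD = b'T' + EOL    # hypothetical vendor command strings
    WEIGH_CMD = b'W' + EOL

    def __init__(self, port='\/dev\/ttyUSB0', baud=9600):
        self.conn = serial.Serial(port, baudrate=baud, timeout=2)

    def tare(self):
        # Zero the balance; returns once the device acknowledges.
        self.conn.write(self.TARE_CMD)
        self.conn.readline()  # discard the acknowledgment

    def get_weight(self):
        # Return the current weight as a float, in grams.
        self.conn.write(self.WEIGH_CMD)
        reply = self.conn.readline()  # e.g., b'  1.6700 g' plus EOL
        return float(reply.decode().split()[0])  # device-independent value

# The rest of the program sees only standard floating-point values:
# bal = Balance()
# bal.tare()
# net_weight = bal.get_weight()
<\/pre>
<p>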
If the commands to work with the balance are scattered throughout a program, you have a lot of changes to find, make, test, and certify as working. Device-independent programming puts them in two areas: one for the <tt>tare<\/tt> command and one for the <tt>get_weight<\/tt> command, which returns a floating-point value (e.g., 1.67).\n<\/p><p>If you have to replace the device with a new one, the command codes are changed in two places, and the returned numeric code is reformatted into a standard floating-point value in one place. The rest of the program works with the value without any concern for its source. That allows for a lot of flexibility in choosing balances in the lab, as different units can be used for different applications with minor software adjustments.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Summary_2\">Summary<\/span><\/h4>\n<p>As noted when we first started talking about this goal, the ability to support a product has to be designed and built in, not added on. The issues can be difficult enough when you are working with one vendor. When a second or third vendor is added to the mix, you have an entirely new level of issues to deal with. This is a matter of engineering, not just science. Supportable systems and methods have to be designed, documented, engineered, and validated to be supportable. A system or method isn\u2019t supportable simply because its individual components or steps are.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Seventh_goal:_Addressing_systems_integration_and_harmonization\">Seventh goal: Addressing systems integration and harmonization<\/span><\/h3>\n<p>In the 1960s, audio stereo systems came in three forms: a packaged, integrated product that combined an AM\/FM radio tuner, turntable, and speakers; all of those components purchasable individually; and a do-it-yourself, get-out-the-soldering-iron format. The integrated products were attractive because you could just plug them into a power source and they worked. The packaging was attractive, and you didn\u2019t have to know much beyond choosing the functions you wanted to use. In terms of component quality, that was up to the manufacturer; it was rarely top-quality, but the components still met the basic needs of the application at a particular price point. \n<\/p><p>Component systems appealed to a different type of customer. They wanted to pick the best components that met their budgets, trading off the characteristics of one item against another, always with the idea of upgrading elements as needed. Each component's manufacturer would guarantee that their product worked, but making the entire system work was your issue. In the 1960s, everything was analog so it was a matter of connecting wires. Some went so far as to build the components from kits (lower cost), or design something and get it to work just for the fun of it. HDTVs have some of the same characteristics as component systems, as they can work out of the box, but if you want better sound or want to add a DVD or streaming box, you have to make sure that the right set of connectors exists on the product and that there are enough of them.\n<\/p><p>In the product cases noted above, there is a limited set of choices for mixing components, so user customization isn't that much of a problem. Most things work together from a hardware standpoint, but software apps are another matter.
The laboratory world isn\u2019t quite so neat and tidy.\n<\/p><p>From a lab standpoint, integrated systems are attractive for a number of reasons:\n<\/p>\n<ul><li>It suggests that someone has actually thought about how the system should function and what components need to be present, and they put a working package together (which may include more than one component).<\/li>\n<li>It\u2019s installed as a package: when the installation is done, all of it works, both hardware and software.<\/li>\n<li>It\u2019s been tested as a system and all the components work together.<\/li>\n<li>You have a single point of contact for support.<\/li>\n<li>If an upgrade occurs, someone (hopefully) has made sure that upgrading some portions of the system doesn\u2019t leave others broken; an upgrade embraces the functionality of the entire system.<\/li>\n<li>The documentation addresses the entire system, from training and support to maintenance, etc.<\/li>\n<li>It should be easier to work with because the system\u2019s functionality and organization have been thought through.<\/li>\n<li>It looks nice. Someone designed a packaged system that doesn\u2019t have a number of separate boxes with wires exposed. In the lab, that may be pushing it.<\/li><\/ul>\n<p>Achieving that on a component-by-component basis may be a bit of a challenge. Obtaining an integrated system comes down to a few considerations, not the least of which is what you define as a \u201csystem.\u201d A system is a collection of elements that are used to accomplish a task. Figure 21 shows a view of overall laboratory operations.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig21_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"f033bca2810b1e921e33f13abdccb4f0\"><img alt=\"Fig21 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/a2\/Fig21_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 21.<\/b> Laboratory operations and connections to other groups<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Laboratory operations have three levels of systems: the corporate level, the lab administrative level (darker grey, including office work), and the lab bench level. Our concern is going to be primarily with the latter two; however, we have to be aware that the results of lab work (both research and testing) will find their way into the corporate sphere. If the grey box is our system, where does integration fit in, and what strategies are available for meeting that goal? Integration is about connecting devices and systems so that there is a smooth flow of command\/control messages, data, and information that does not depend on human intervention. We are not, for example, taking a printout from one instrument system and then manually entering it into another; that transfer should be electronic and bi-directional where appropriate. Why is this important? Improving the efficiency of lab operations, as well as ROI, has long been on the list of desired outcomes from the use of lab automation and computing.
Developing integrated systems with carefully designed mechanisms for the flow, storage, and management of K\/I\/D is central to achieving those goals.\n<\/p><p>There are different strategies for building integrated systems like these. One is to create the all-encompassing computer system that does and controls everything. Think HAL in the movie 2001, or popular conceptions of an advanced AI. However, aside from the pitfalls in popular sci-fi, that isn\u2019t an advisable strategy. First, it will most likely never be finished. Trying to come up with a set of functional specifications would take years, if they were ever completed. People would be constantly adding features, some conflicting, and that alone (called scope creep) would doom the process, as it has in similar situations. Even if somehow the project were completed, it would reflect the thinking of those involved at the start. In the time that the project was underway, the needs of the lab would change, and the system would be out-of-date as soon as it was turned on. If you were to develop an adaptable system, you'd still be dealing with scope creep. Other problems would crop up too. If any component needed maintenance, the entire system could be brought to a halt and nothing would get done. Additionally, staff turnover would be a constant source of delays as new people were brought on board and trained, and as this system would be unique, you couldn't find people with prior experience. Finally, the budget would be hard to deal with, from the initial estimate to the likely overruns.\n<\/p><p>Another approach is to redefine the overall system as a cooperative set of smaller systems, each with its own integration strategy, with the entire unit interconnected. Integrated systems in the lab world are difficult to define as a product set, since the full scope of a lab's processes is highly variable, drawing on a wide range of instruments and equipment. We can define a functional instrument system (e.g., titrators, chromatographic equipment, etc.), but sample prep variability frustrates a complete package. One place that has overcome this is the clinical chemistry market.\n<\/p><p>At this point, we have to pan back a bit and take a few additional aspects of laboratory operations into consideration. Let's look back at Figure 17 again:\n<\/p><p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig17_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"2d332579e52fb24a7257395161cd95c0\"><img alt=\"Fig17 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/59\/Fig17_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 17.<\/b> A laboratory process<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>We are reminded that processes are what rule lab work, and the instrumentation and other equipment play an important but subservient role. We need to define the process first and then move on from there; this is the standard validation procedure.
In regard to the integration of that instrumentation and equipment, any integration has to support the entire process, not just the individual instrument and equipment packages. This is one of the reasons integration is as difficult as it is. A vendor can create an instrument package, but their ability to put components together is limited by their understanding of how those components will be used.\n<\/p><p>Unless the vendor is trying to fit in a well-defined process with enough of a market to justify the work, there is a limit to what they can do. This is why pre-validated products are available in the clinical chemistry market: the process is pre-defined and everyone does the same thing. Note that there is nothing to prevent the same thing happening in other industries; if there were a market for a product that fully implemented a particular ASTM or USP method, vendors might take notice. There is one issue with that, however. When a lab purchases laboratory instrumentation, they often buy general-purpose components. You may purchase an infrared spectrophotometer that covers a wide spectral range and can be used in a variety of applications to justify its cost. Yet you rarely, if ever, purchase one instrument for each lab process that it might be used in unless there is sufficient demand to justify it. And that's the rub: if a vendor were to create an equipment package for a specific procedure, the measuring instrument would be stripped down and tailored to the application, and it may not be usable in another process. Is there enough demand for testing to warrant the development of a packaged system? If you were doing blood work, yes, because all blood testing is done the same way; it\u2019s just a question of whether or not your lab is getting enough samples. If it\u2019s ASTM xxx, maybe not.\n<\/p><p>Additionally, the development of integrated systems needs to take the real world into account. Ask whether or not the same equipment can be used for different processes. If the same equipment setup is being used in different processes with different reagents or instrument setups (e.g., different columns in chromatography), developing an integrated electro-mechanical computer system may be a viable project since all the control and electro-mechanical systems would be the same. Returning to the cookie analogy, it's the same equipment, different settings, and different dough mixes (e.g., chocolate chip, sugar cookies, etc.). You just have to demonstrate that the proper settings are being used to ensure that you are getting consistent results.\n<\/p><p>If this sounds a little confusing, it\u2019s because \u201cintegration\u201d can occur on two levels: the movement of data and information and the movement of material. On one hand, we may be talking about integration in the context of merging the sources of data and information (where they are generated) into a common area where they can be used, managed, and accessed according to the lab's and the corporation's needs. Along the way, as we\u2019ve seen in the K\/I\/D discussions, different types of data and information will be produced, all of which have to be organized and coordinated. This flow is bi-directional: generated data and information in one direction, and work lists in the other. On the other hand\u2014regarding the movement of materials\u2014we may be talking about automated devices and robotics.
Those two hands are joined at the measuring instrument.\n<\/p><p>We\u2019ll begin by discussing the movement of data and information: integration is a matter of bringing laboratory-generated data and information into a structured system where it can be accessed, used, and managed. That \u201csystem\u201d may be a single database or a collection of interconnected data structures. The intent in modern lab integration is that connections between the elements of that structure are electronic, not manual, and transfers may be initiated by user commands or automated processes. Yet note that someone may consider a completely manual implementation process to be \u201cintegrated,\u201d and be willing to accept slower and less efficient data transfers (that\u2019s what we had in the last century). However, that methodology doesn\u2019t give us the improvements in productivity and ROI that are desired.\n<\/p><p>The idea that integration is moving all laboratory K\/I\/D into a structured system is considerably different than the way we viewed things in the past. Previously, the goal was to accumulate lab results into a LIMS or an ELN, with the intermediate K\/I\/D (e.g., instrument data files, etc.) placed in an SDMS. That was a short-sighted approach, considering only data storage without fully addressing data utilization. The end goal of using a LIMS or ELN to accumulate lab results was still valid\u2014particularly for further research, reporting, planning, and administrative work\u2014but it didn\u2019t deal effectively with the material those results were based on or the potential need to revisit that work.\n<\/p><p>In this discussion we\u2019ll consider the functional hub of the lab to be a LIMS and\/or ELN, with an optional SDMS; it\u2019s your lab, and you get to choose. We\u2019re referring to this as a hub for a couple of reasons. First, it is the center of K\/I\/D management, as well as planning and administrative efforts. Second, these should be the most stable information systems in the lab: durable and slow to change. Instruments and their data systems will change and be replaced as the lab\u2019s operational needs progress. As a result, the hub is where planning efforts have to begin since decisions made here have a major impact on a lab's ability to meet its goals. The choice of a cloud-based system vs. an on-site system is just one factor to consider.\n<\/p><p>Historically, laboratories have begun by putting the data- and information-generating capability in place first. That\u2019s not unusual, since new companies need data and information to drive their business development. However, what they really need is a mechanism for managing that data and information. Today, the development of a laboratory's electronic infrastructure needs to begin with the systems that are used to collect and manage data and information; then we can put the data and information generators in place. Starting with the generators is a bit like starting a production line without considering what you\u2019re going to do with all the material you\u2019re producing.\n<\/p><p>The types of data and information generators can vary greatly. 
Examples include:\n<\/p>\n<ul><li>a human-based reading that is recorded manually;<\/li>\n<li>a reading recorded by an instrument with limited storage and communication abilities, e.g., balances and pH meters;<\/li>\n<li>a reading recorded by a limited-functionality device, where data is recorded and stored but must be transmitted out of the machine to be analyzed; and<\/li>\n<li>a reading recorded by a combination instrument-computer, which has the ability to record, store, analyze, and output data in various forms.<\/li><\/ul>\n<p>The issue we need to deal with for each of those generators is how to plan for where the output should be stored so that it is accessible and useful. That was the problem with earlier thinking. We focused too much on where the K\/I\/D could be stored and maintained over the long term, but not enough on its ability to be used and managed. Once the analysis was done, we recognized the need to have access to the backup data to support the results, but not the ability to work with it. The next section will look at some of the ramifications of planning for those data types.\n<\/p><p><b>Planning the integration of your data generators<\/b>\n<\/p><p>There are several ramifications of planning for your data generators that need to be discussed. Before we begin, though, we need to add two additional criteria for the planning stage:\n<\/p>\n<ul><li>You should avoid duplication of K\/I\/D unless there is a clear need for it (e.g., a backup).<\/li>\n<li>In the progression from sample preparation to sample processing, to measurement, to analysis, and then to reporting, there should not be any question about either the provenance or the location of the K\/I\/D generated from that progression.<\/li><\/ul>\n<p>That said, let's look at each generator in greater detail to better understand how we plan for their integration and the harmonization of their resulting K\/I\/D.\n<\/p><p><br \/>\n1. <i>A human-based reading that is recorded manually, or recorded from an instrument with limited storage and communication abilities<\/i>\n<\/p><p><i>Examples<\/i>: The types of devices we are looking at for these generators are balances, pH meters, volt meters, single-reading <a href=\"https:\/\/www.limswiki.org\/index.php\/Spectrophotometer\" title=\"Spectrophotometer\" class=\"wiki-link\" data-key=\"6382bb48c914f3c490400c13f9eb16e6\">spectrophotometers<\/a>, etc.\n<\/p><p><i>Method of connection<\/i>: Neither the manual nor the digital generators leave much choice: the integration is direct data entry into a hub system unless they are being used as part of a LES. Manual modes mean typing (with data entry verification), while digital systems provide for an electronic transfer as the method of integration.\n<\/p><p><i>Issues<\/i>: There are problems with these generators being directly tied to a hub component (see \u201cA,\u201d Figure 22). Each device or version has its own communications protocol, and the programming is specific to that model. If the item is replaced, the data transfer protocols may differ and the programming has to change. These devices are often used for single readings or weighing a sample, with the result stored in the hub to be used in later calculations or reporting. However, even though these are single-reading instruments, things can get complicated. 
Problems may crop up if their measurements are part of a time series, require a weight measurement at specific time intervals, are used for measuring the weights of similar items in medical tablet uniformity testing, or are used for measuring pH during a titration. Those applications wouldn\u2019t work well with a direct connection to a hub and would be better served through a separate processor (see \u201cB,\u201d Figure 22) that builds a file of measurements that can be processed, with the results sent to the hub. This creates a need to manage the file and its link to the transmitted results. An SDMS would work well, but the hub system just became more complex. Instead of a device being directly connected to a hub, we have an intermediate system connected to an SDMS and the hub. Integration is still easily feasible, but more planning is required. Should the intermediate system take on the role of an SDMS (all the files are stored in its file structure), you would also have to provide backup and security facilities to ensure that the files aren\u2019t tampered with and are secured against loss. The SDMS would be responsible for entering the results of the work into the hub. (Remember that access to the files is needed to support any questions about the results; printed versions would require re-entering the data to show that the calculations were done properly, which is time-consuming and requires verification.)\n<\/p><p><i>Diagram<\/i>:\n<\/p><p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig22_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"b9c60537496066c9b20c0fd2b7499bb0\"><img alt=\"Fig22 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/8\/8d\/Fig22_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 22.<\/b> Representation of the movement of data and information in the lab from a human-based reading that is recorded manually, or from an instrument with limited storage and communication abilities<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p><br \/>\n2. <i>A reading recorded by a limited-functionality device<\/i>\n<\/p><p><i>Examples<\/i>: Measurements are made on one or more samples, with results stored locally, which have to be transmitted to another computer for processing or viewing. Further use may be inhibited until that is done (e.g., the device may have to transmit one set of measurements before the next set can begin). Such devices include microplate readers and spectrophotometers. Some devices can be operated manually from front panel controls, or via network connections through a higher-level controller.\n<\/p><p><i>Method of connection<\/i>: Some devices may retain the old RS-232\/422 scheme for serial transmission of measurements and receiving commands, though most have transitioned to USB, Ethernet, or possibly wireless networking.\n<\/p><p><i>Issues<\/i>: Most of these devices do not produce final calculated results; that work is left to an intermediate process placed between the device and the hub system (Figure 23). 
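A rough sketch of what such an intermediate process might do is shown below, in Python. The device read is simulated, and the file name, sample ID, and hub endpoint are invented for illustration; a real implementation would use the device's actual protocol and whatever interface the hub vendor supports.
<\/p>
<pre>
import csv
import random
import statistics
import time

import requests  # third-party HTTP library, assumed to be available

def read_weight():
    # Stand-in for a real device read over a serial or USB connection.
    return 1.67 + random.gauss(0, 0.001)  # simulated reading, in grams

def timed_series(n_readings=10, interval_s=30.0):
    # 1. Collect the raw time series from the device.
    series = []
    for _ in range(n_readings):
        series.append((time.time(), read_weight()))
        time.sleep(interval_s)

    # 2. Preserve the raw measurements as a file an SDMS can manage.
    with open('run_0001_raw.csv', 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(('timestamp', 'weight_g'))
        writer.writerows(series)

    # 3. Send only the calculated result to the hub, with a pointer
    #    back to the raw file that supports it.
    result = {
        'sample_id': 'S-0001',
        'mean_weight_g': statistics.mean(w for _, w in series),
        'raw_data_file': 'run_0001_raw.csv',
    }
    requests.post('https:\/\/lims.example.org\/api\/results', json=result)
<\/pre>
<p>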
As a result, integration depends on those intermediate processes controlling one or more devices, sometimes coordinating with other equipment, calculating final results, and communicating them to the hub system. Measurement files need to be kept in electronic form to make them easier to back up, copy, transmit, and generally work with. If only printed output is available, it should be scanned and an equivalent machine-readable version created and verified. Each experimental run, which may include one or more samples, should have the associated files bundled together into an archive so that all data is maintained in one place. That may be the intermediate processor\u2019s storage or an SDMS with the appropriate organization and indexing capabilities, including links back from the hub. The electronic files may be used to answer questions about how results were produced, re-run an analysis using the original or a new set of algorithms, or have the results analyzed as part of a larger study.\n<\/p><p><i>Diagram<\/i>:\n<\/p><p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig23_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"de0127f63c2520c8ed3989ee6f4b4993\"><img alt=\"Fig23 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/c\/cd\/Fig23_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 23.<\/b> Representation of the movement of data and information in the lab from a limited-functionality device<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>3. <i>A reading recorded by a combination instrument-computer<\/i>\n<\/p><p><i>Examples<\/i>: These generators include one-to-one instrument-to-data system (IDS) pairings, many-to-one instrument-to-computer systems, <a href=\"https:\/\/www.limswiki.org\/index.php\/Nuclear_magnetic_resonance_spectroscopy\" title=\"Nuclear magnetic resonance spectroscopy\" class=\"wiki-link\" data-key=\"a05c6a4eb8775761248c099371cdb82f\">NMR spectroscopy<\/a>, chromatography, <a href=\"https:\/\/www.limswiki.org\/index.php\/Mass_spectrometry\" title=\"Mass spectrometry\" class=\"wiki-link\" data-key=\"fb548eafe2596c35d7ea741849aa83d4\">mass spectrometry<\/a>, thermal analysis, spectrophotometers, etc.\n<\/p><p><i>Method of connection<\/i>: There are several means of connection: 1) detector to computer (A\/D), 2) computer to instrument control and accessory devices such as autosamplers (via, e.g., digital I\/O, USB), and 3) computer to centralized hub systems (via, e.g., USB, Ethernet, wireless networks). Integration between the IDS and hub systems is accomplished through vendor-supported <a href=\"https:\/\/www.limswiki.org\/index.php\/Application_programming_interface\" title=\"Application programming interface\" class=\"wiki-link\" data-key=\"36fc319869eba4613cb0854b421b0934\">application programming interfaces<\/a> (APIs) on both sides of the connection.\n<\/p><p><i>Issues<\/i>: The primary issue with these generators is managing the data structures that consist of captured detector output files, partially processed data (e.g., descriptors such as peak size, width, area, etc.), 
computed results, sample information, worklists, and processing algorithms. Some of this material will get transmitted to the hub, but the hub isn't generally designed to incorporate all of it. A portion of it\u2014the content of a printed report, for example\u2014could be sent to an SDMS. However, the bulk of it has to stay with the IDS since the software needs to interpret and present the sample data contents; the data files by themselves are useless without software to unpack and make sense of them.\n<\/p><p>From a planning standpoint, you want to reduce the number of IDSs as much as possible. While chromatography is presumably the only technique that currently offers a choice between one-to-one and many-to-one instrument-to-computer configurations, hopefully over time that list will expand to provide better data management. Consider three chromatographs, each with its own IDS. If you are looking for data, you have three systems to check, and you have to hope each has its own series of unique sample IDs. Three instruments on one IDS is a lot easier to manage and search. You also have to consider backups, upgrades, general maintenance, and cost.\n<\/p><p>Moving the instrument data files to an SDMS may not be effective unless the vendor has made provision for it. The problem is data integrity. If you have the ability to move data out of the system and then re-import it, you open up the possibility of importing data that has been edited. Some vendors prohibit this sort of activity.\n<\/p><p><i>Diagram<\/i>:\n<\/p><p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig24_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"dc19aef89e726a9be01eb1409e4d1205\"><img alt=\"Fig24 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/f\/f0\/Fig24_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 24.<\/b> Representation of the movement of data and information in the lab from a combination instrument-computer<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The above looked at generator types in isolation; however, in reality, devices and instruments are used in combinations, each producing results that have to be maintained and organized. We must look at the data sets that are generated in the course of executing an experiment or method.\n<\/p><p>The procedures associated with an experiment or method can be executed in three ways: manually, using a LES, or using a robotics implementation. The real world isn\u2019t so neatly separated; manual and LES implementations may have some steps that use automated tools. The issue we need to address in planning is the creation of an \u201cexperiment data set\u201d that brings all the results produced into one package. Should questions arise about an experiment, you have a data set that can be used as a reference. That \u201cpackage\u201d may be pages in a notebook, a word processor file, or some other log. It should contain all data recorded during a procedure, or, in the case of IDS-captured data, file references or pointers to that instrument data or information. 
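<\/p><p>To make the idea of an experiment data set more concrete, here is a minimal sketch, in Python, of bundling a run's files together with a manifest. The file names and manifest fields are invented for illustration; a SHA-256 checksum is recorded for each file so that any later editing of the contents can be detected, which also speaks to the data integrity concern raised above.
<\/p>
<pre>
import hashlib
import json
import zipfile
from datetime import datetime, timezone

def sha256(path):
    # Checksum a file so any later tampering is detectable.
    with open(path, 'rb') as f:
        return hashlib.sha256(f.read()).hexdigest()

def bundle_experiment(run_id, data_files, ids_references):
    # The manifest ties together everything produced during the
    # procedure: local files plus pointers to data still held in an IDS.
    manifest = {
        'run_id': run_id,
        'created_utc': datetime.now(timezone.utc).isoformat(),
        'files': {p: sha256(p) for p in data_files},
        'ids_references': ids_references,  # e.g., IDS result set IDs
    }
    archive_name = run_id + '_dataset.zip'
    with zipfile.ZipFile(archive_name, 'w') as z:
        for p in data_files:
            z.write(p)
        z.writestr('manifest.json', json.dumps(manifest, indent=2))
    return archive_name

# Hypothetical usage:
# bundle_experiment('EXP-042', ['run_0001_raw.csv', 'notes.txt'],
#                   {'cds': 'CDS-RUN-9931'})
<\/pre>
<p>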
You want to be able to pull up that record and answer any questions that may arise about the work.\n<\/p><p>All of that may seem pretty obvious, but there is one point that needs to be addressed: the database structures, including hub systems, IDS file structures, and the SDMS, all have to be well defined before you accumulate a number of experiment packages. You don\u2019t want to find yourself in a situation where you have a working system of data or information storage and then have to make significant changes to it. That could mean that all previous packages have to be updated to reflect the new system, or, worse, that you have to deal with an \u201cold\u201d and a \u201cnew\u201d system of managing experimental work.\n<\/p><p>LES systems come in two forms: stand-alone software packages, and script-based systems that are part of a LIMS or ELN. The stand-alone systems should produce the experiment record automatically, with all data and pointers to IDS-captured data or information. For script-based systems, the programming for the LES function has to take that into account. As for laboratory robotics, they can be viewed as an extension of a LES: instead of a person following instructions, a robot or a collection of robotic components follows its programming to carry out a process. Developing an experimental record is part of that process.\n<\/p><p>The bottom line in all of this is simple: the management architecture for your K\/I\/D has to be designed deliberately and put in place early in a lab's development. If it is allowed to be created on an as-needed basis, the resulting collection of computers and storage will be difficult to maintain, manage, and expand in an orderly fashion. At some point, someone is going to have to reorganize it, and that will be an expensive and perhaps painful process.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Harmonization\">Harmonization<\/span><\/h4>\n<p>Harmonization is a companion goal to integration. Approached with the right mindset, it can reduce:\n<\/p>\n<ul><li>installation costs,<\/li>\n<li>support costs,<\/li>\n<li>education and training requirements, and<\/li>\n<li>development effort.<\/li><\/ul>\n<p>Harmonization efforts, if used inappropriately, can create strife, increasing inter-departmental friction and conflict. The general idea of harmonization is to use common hardware and software platforms to implement laboratory systems, while ensuring that the move toward commonality doesn\u2019t force people to use products that are force-fits: products that don\u2019t really meet the lab's needs but serve some other agenda. The purpose of computing systems is to help people get their work done; if they need a specific product to do it, end of story. If it can be provided using common hardware and software platforms, great, but that should not be a limiting factor. If used as a guide in the development of database systems, harmonization can make it easier to access laboratory K\/I\/D across labs. It may slow down implementations because more people\u2019s opinions have to be taken into account, but the end result will be the ability to gain more use out of your K\/I\/D.\n<\/p><p>Figure 25 shows a common LIMS server supporting three different labs. Each lab has its own database structure, avoiding conflicts and unnecessary compromises in the conduct of lab work, while still benefiting from reduced implementation and support costs. While some vendors support this arrangement, others may not; see if they are willing to work out a deal since there are multiple lab systems involved. 
If we couple this with the common structure definitions of K\/I\/D noted earlier, accessing information across labs will be more productive.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig25_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"6e7098bd863d226162005373e457ba4f\"><img alt=\"Fig25 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/2\/23\/Fig25_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 25.<\/b> Harmonizing LIMS platforms<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>An alternative is to force everyone into one data structure, usually to reduce costs. Savings on licensing costs may be offset by development delays as multiple labs resolve conflicts in database organization, security, access control, etc. In short, keep it simple; things will work more smoothly and, in the long run, be less costly from an implementation, maintenance, and support perspective. If there is a need or desire to go through the databases for accounting purposes or other organizational requirements, the necessary material can be exported into another file structure that can be analyzed as needed. This provides a layer of security between the lab and the rest of the organization. It\u2019s basically a matter of planning how database contents are managed within the lab and what has to be accessed from other parts of the organization.\n<\/p><p>Part of the harmonization planning process involves examining how computers are paired with instruments. You may not have multiple instances of higher-priced equipment such as mass spectrometers or NMRs, so having a computer dedicated to each such device makes sense. However, there is one type of instrument that you may have several of: chromatographs. 
You can purchase a computer for each instrument, but in this case there is an alternative: most CDSs can support multiple instruments (Figure 26).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig26_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"ae714afe1580a09ad1050e433d31f8ed\"><img alt=\"Fig26 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/2\/2b\/Fig26_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 26.<\/b> Consolidate instrument-computer connections where feasible<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>There are advantages to having multiple instruments on one computer:\n<\/p>\n<ul><li>You have only one system to support, maintain, and back up.<\/li>\n<li>All the K\/I\/D is in one system.<\/li>\n<li>The qualification or validation process is performed once, rather than having it repeated for each system.<\/li>\n<li>Overall cost is reduced.<\/li><\/ul>\n<p>This \"multiple instruments to one computer\" configuration is the result of a low data collection rate, the modest computing requirements needed to process the instrument data, and user demands on the vendors. Given the developments in computing power and distributed data acquisition and control, this many-to-one configuration should be extended to other instrument techniques, reducing costs and bringing more efficiency to the management of K\/I\/D.\n<\/p><p><b>Regarding computer systems...<\/b>\n<\/p><p>Harmonization doesn't mean that everything should run the same OS or the same version of the OS. It means doing it where possible, but not at the expense of doing lab work effectively.\n<\/p><p>With the wide diversity of products in the laboratory market, you\u2019re going to find a mix of large and small vendors. Some may be small, growing companies that are managed by a few people, and as a result, keeping up with the latest versions of operating systems and underlying software may not be critical if it doesn\u2019t affect their product's usability or performance. Their product certification on the latest version of a software platform may lag that of larger vendors. That means that requiring all systems to be at the same operating system level isn\u2019t realistic. Upgrading the OS may disable the software that lab personnel depend upon.\n<\/p><p><b>Regarding the data...<\/b>\n<\/p><p>On November 19-20, 2019, Pharma IQ\u2019s Laboratory Informatics Summit held a meeting on \"Data Standardization for Lab Informatics.\" The meeting highlighted the emerging <a href=\"https:\/\/www.limswiki.org\/index.php\/Journal:The_FAIR_Guiding_Principles_for_scientific_data_management_and_stewardship\" title=\"Journal:The FAIR Guiding Principles for scientific data management and stewardship\" class=\"wiki-link\" data-key=\"e5903ddcc7734415af1d91fcd258da90\">FAIR Guiding Principles<\/a>, which state that K\/I\/D should be findable, accessible, interoperable, and reusable (FAIR). The point of mentioning this is to highlight the growing, industry-wide importance of protecting the value of the K\/I\/D that you are collecting. 
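<\/p><p>As a minimal sketch of what those four properties can mean at the level of a single result file (the field names and values here are hypothetical, not a formal standard):
<\/p>
<pre>
# Illustrative sketch: minimal FAIR-style metadata for one result file.
metadata = {
    'identifier': 'EXP-2021-0042-HPLC-01',  # findable: unique, searchable ID
    'location': 'sdms.example.org',         # accessible: a known retrieval point
    'format': 'AnIML',                      # interoperable: open, documented format
    'provenance': {                         # reusable: context needed to trust it
        'method': 'SOP-114 rev 3',
        'instrument': 'HPLC-01',
        'acquired': '2021-03-02T14:05:00Z',
    },
}
<\/pre>
<p>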
No matter how much it costs to produce, if you can\u2019t find the K\/I\/D you need, it has no value because it isn\u2019t usable. The same holds true if the data supporting that information can\u2019t be found.\n<\/p><p>Utilization is at the core of much of what we\u2019ve been discussing. Supporting the FAIR Guiding Principles should be part of every discussion about products and what they produce, how the database is designed, and what the interoperability between labs in your organization looks like.\n<\/p><p>Another aspect of this subject is harmonizing data definitions across your organization. The same set of terms should be used to describe a given object or aspect, their database representations should be compatible, and so on. The point is to make it easier to find something and make use of it.\n<\/p><p><b>Putting this all to use<\/b>\n<\/p><p>How do you apply all of this to your new lab (an easier task) or existing lab (more challenging)? This is going to be a broad-brush discussion since every lab has its own way of handling things, from its overall mission to its equipment and procedures, so you\u2019re going to have to take these points and adjust them to fit your requirements.\n<\/p><p>To start, assume you have a hub system (either a LIMS or ELN) as the center of gravity for all your K\/I\/D collection. You build your lab's K\/I\/D management infrastructure from this center of gravity outward; effectively everything revolves around the hub and radiates out from it.<sup id=\"rdp-ebb-cite_ref-26\" class=\"reference\"><a href=\"#cite_note-26\">[l]<\/a><\/sup> \n<\/p><p>For each K\/I\/D generator, ask:\n<\/p>\n<ul><li>What does it produce, which of the K\/I\/D generator types noted earlier matches, and which model is appropriate?<\/li>\n<li>Does it generate a file that has to be processed, or is it the final measurement?<\/li>\n<li>Does the device or the system supporting it have all the information needed to move it on to the next phase of the process? For example, if the device is a pH meter, what is going to key the result into the next step? It will need a sample or experiment reference ID so that it knows where the result should go.<\/li><\/ul>\n<p>For each device output, ask:\n<\/p>\n<ul><li>What happens to the generated K\/I\/D and how is it used? And remember, nothing should ever get deleted.<\/li>\n<li>Is the device output a single measurement or part of a set? Will it be combined with measurements from other devices, sample IDs, and calibration information?<\/li>\n<li>Where is the best place to put it? An intermediate server, an SDMS, or the hub?<\/li>\n<li>If it is a final result of that process, should it be in the hub?<\/li>\n<li>If it is an intermediate file or result, then where?<\/li>\n<li>How might it be used in the future and what is the best way to prepare for that? Files may need to be examined in audits, transferred to another group or organization<sup id=\"rdp-ebb-cite_ref-27\" class=\"reference\"><a href=\"#cite_note-27\">[m]<\/a><\/sup>, or recalculated with new algorithms. 
Does your system provide trace-back from final results to source data?<\/li><\/ul>\n<p>For all devices, ask: \n<\/p>\n<ul><li>Does every device that has storage and communications capability have backup procedures in place?<\/li><\/ul>\n<p>Depending on your point of view, whether it centers on the science, laboratory operations management, or lab administration, your interest in lab computing may range from \u201cnecessary evil\u201d to \u201cmakes life easier\u201d to \u201cneeded to make the lab function,\u201d or some other perspective. You may be of the opinion that all this is interesting but not your responsibility. If not yours, then whose? That topic is covered in the next section.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Laboratory_systems_engineers\">Laboratory systems engineers<\/span><\/h2>\n<p>If you believe that the technology planning and management considerations noted so far in this guide are important to your laboratory, it's time to ask who that responsibility falls upon.\n<\/p><p>The purpose of this guide has been to highlight that the practice of science has changed, become more complex, and become more dependent on technologies that demand a lot of attention. Those technologies are not only the digital systems we\u2019ve covered, but also the scientific methodologies and instrumentation whose effective use can take\u2014through increasing specialization and depth of material\u2014an entire career to learn and apply. The user of scientific computing typically views it as a tool for getting work done and not another career. Once upon a time, the scientist knowledgeable in both laboratory work and computing was necessary; if you wanted to use computers you had to understand how they worked. Today, if you tried to do that, you\u2019d find yourself spread thin across your workload, with developments happening faster in science and computers than any single individual can keep up with.\n<\/p><p>Let's look at what a laboratory scientist would need to be able to do in order to also support their laboratory's scientific computing needs, in addition to their normal tasks (Figure 27).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig27_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"5bc3e8b033bd91aed2e64ada5b2b21a7\"><img alt=\"Fig27 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/56\/Fig27_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 27.<\/b> Partial task list for supporting laboratory scientific computing<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>On top of those tasks, the lone scientist would also have to have the following technological knowledge and personal capabilities (Figure 28):\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig28_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"67bd475f8f92193277d0127c372ce997\"><img alt=\"Fig28 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/d\/db\/Fig28_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 
100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 28.<\/b> Partial list of technological knowledge and personal capability requirements for supporting laboratory scientific computing<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Looking at these two figures, we're realistically considering two levels of expertise: a high, overview level that can look at the broader issues and see how architectures can be constructed and applied, and a specialist level in areas such as robotics. However, the current state of undergraduate\u2014and to a lesser extent graduate\u2014education doesn\u2019t typically have room for the depth of course work needed to cover the material noted above. Expanding your knowledge base into something that is synergistic with your current course work is straightforward; doing it with something that is from a separate discipline creates difficulties. Where do digital systems fit into your life and laboratory career?\n<\/p><p>Let's look at what the average laboratory scientist does today. Figure 29 shows the tasks that are found in modern laboratory operations in both research and testing facilities. \n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig29_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"810e8a66c3b1c1cf6a003f6dad9e1b4e\"><img alt=\"Fig29 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/9\/91\/Fig29_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 29.<\/b> Typical laboratory activities in any scientific discipline<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Now let's compare that set of tasks in Figure 29 with the task emphasis provided in today's undergraduate laboratory science courses and technology courses (Figures 30 and 31).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig30_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"048fcaf47ab7284a45fca95c4a46ae44\"><img alt=\"Fig30 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/0\/00\/Fig30_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 30.<\/b> Task-level emphasis of laboratory science in higher-education courses<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig31_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" 
data-key=\"2988e1c6bc1347513af08a16c6337ea0\"><img alt=\"Fig31 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/2\/25\/Fig31_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 31.<\/b> Task-level emphasis of information technology in higher-education courses<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>In cases where science students have access to instrumentation-computer systems, the computers are treated as \u201cblack boxes\u201d that acquire the data (data capture), process it (data processing) and report it. How those things happen is rarely if ever discussed, with no mention of analog-digital converters, sampling rates, analysis algorithms, etc. \u201cStuff happens,\u201d yet that \u201cstuff,\u201d if not properly ran with tested parameters, can turn good bench science into junk data. How would they know? Students may or may not get exposure to LIMS or ELN systems even though it would be useful for students to capture and work with their lab results, but schools may not be willing to invest in them.\n<\/p><p>IT students will be exposed to data and information management through database courses, but not at the level that LIMS and ELNs require (e.g., instrument communications and control); the rest of the tasks in Figure 31 is practically unknown to them. They\u2019d be happy to work on the computer in Figure 31, but the instrument and the instrument connections\u2014the things that justify the computer's role\u2014aren\u2019t something they\u2019d be exposed to.\n<\/p><p>What we need are people with a foot in both fields, able to understand and be conversant in both the laboratory science and IT worlds, relating them to each other to the benefit of lab operation effectiveness while guiding IT in performing their roles. We need \u201claboratory systems engineers\u201d (LSEs).\n<\/p><p>Previously referred to as \"laboratory automation engineers\" (LAEs)<sup id=\"rdp-ebb-cite_ref-LiscouskiAreYou06_10-1\" class=\"reference\"><a href=\"#cite_note-LiscouskiAreYou06-10\">[6]<\/a><\/sup> and \"LAB-IT\" specialists, we now realize both titles fall short of the mark. \"Laboratory automation engineer\" emphasizes automation too strongly when the work is much broader than that. And \"LAB-IT\" is a way of nudging IT personnel into lab-related work without really addressing the full scope of systems that exist in labs, including robotics and data acquisition and control.\n<\/p><p>Laboratory information technology support differs considerably from classical IT work (Figure 32). The differences are primarily two-fold. First, the technologies used in lab work, including those in which instruments are attached to computers and robotics, are different than those commonly encountered in classical IT. The computers are the same, but the added interface and communications requirements imposed by instrument-computer connections change the nature of the work. When troubleshooting, it can be difficult to separate computer issues from those resulting from the connection to instruments and digital control systems. 
Second, the typical IT specialist, maybe straight out of school, doesn\u2019t have a frame of reference for understanding what they are dealing with in a laboratory setting. The work is foreign, the discussions involve terminology they may not understand, and there may be no common ground for discussing problems. In classical IT, the IT personnel may be using the same office software as the people they support, but they can't say the same for the laboratory software used by scientists.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig32_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"9008319c4b611d20d323042107195233\"><img alt=\"Fig32 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/2\/25\/Fig32_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 32.<\/b> Comparison of corporate IT with laboratory IT (LAB-IT)<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Having noted the differences between classic IT and laboratory IT, as well as the growing need for competent LSEs, we need to take a closer look at some of the roles that classic IT and LSE personnel can take. Figure 33 provides a subset of the items from Figure 32 and reflects tasks that IT groups could be comfortable with. Typical IT backgrounds with no lab tech familiarity won\u2019t get you beyond the basic level of support. To be effective, IT personnel need to become familiar with the lab environment, the applications and technologies used, and the language of laboratory work. It isn\u2019t necessary for IT support personnel to become experts in instrumental techniques, but they should understand the basic \"instrument to control system to computer\" model as well as the related database applications, to the point where they can provide support, advise people on product selections, etc. We need people who can straddle the IT-laboratory application environment. They could be lab people with an interest in computing or IT people with a strong interest in science.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig33_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"689c853ced4614a2f9944f38b1349b14\"><img alt=\"Fig33 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/f\/fa\/Fig33_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 33.<\/b> Potential roles for IT and LSE support in laboratory work<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>There are ways of bridging that education gap (Figure 34), but today they depend upon individual initiative more than corporate direction to educate people to the level needed. 
On-the-job training is not an effective substitute for real education; on the surface it is cheaper, but you lose out in the long run because people really don\u2019t understand what is going on, which limits their effectiveness and prevents them from being innovative or even catching problems in the early stages before they become serious. A big issue is this: due to a lack of education, are people developing bad K\/I\/D without being aware of it? The problem isn\u2019t limited to the level of systems we are talking about here. It also extends to techniques such as pipetting.<sup id=\"rdp-ebb-cite_ref-BradshawTheImpo12_28-0\" class=\"reference\"><a href=\"#cite_note-BradshawTheImpo12-28\">[15]<\/a><\/sup>\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig34_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"e1bd516cc4e1a716178147a5712de6bf\"><img alt=\"Fig34 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/2\/2e\/Fig34_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 34.<\/b> Bridging the education gap<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>It is also a matter of getting people to understand the breadth of material they have to be familiar with. In 2018, a webinar series was created (Figure 35) to educate management on the planning requirements for implementing lab systems. The live sessions were well attended. The chart shows the viewing rate for the individual topics through early December 2020. Note that the highest-viewed items were technology-specific; people wanted to know about LIMS, ELN, etc. The details about planning, education, support, etc. haven\u2019t received nearly the amount of attention they need. People want to know about product classes but aren\u2019t willing to learn about what it takes to be successful. Even if you are relying on vendors or consultants, lab management is still accountable for the planning, implementation, and effectiveness of lab systems.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig35_Liscouski_LabTechPlanMan20.png\" class=\"image wiki-link\" data-key=\"1d5fe54d86f59d9502cbbfd01a557035\"><img alt=\"Fig35 Liscouski LabTechPlanMan20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/f\/f5\/Fig35_Liscouski_LabTechPlanMan20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 35.<\/b> Laboratory technology webinar series views after initial release<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Prior to the COVID-19 pandemic of 2020, undergraduate education depended on the standard model of in-person instruction. 
With the challenges of COVID-19 spreading, online learning took on a new importance and stronger acceptance, building on the ground established by online universities and university programs. This gives us an acceptable model for two types of course development: a fully dedicated LSE program, or an expanded program that would broaden students\u2019 backgrounds in both the laboratory sciences and IT. One issue that would need to be addressed, however, is bridging the gap between presentation material and hands-on experience with lab systems. Videos and evaluation tests will only get you so far; you need the hands-on experience to make it real and provide the confidence that what you've learned can be effectively applied.\n<\/p><p>There are several steps that can be taken to build an LSE program. The first is to develop a definition of a common set of skills and knowledge that an LSE should have, recognizing that people will come from two different backgrounds (i.e., laboratory science and IT), and that both have to be built up to reach a common, balanced knowledge base. Those with a strong laboratory science background need to add information technology experience, while those from IT will need to gain an understanding of how laboratory science is done. Remember, however, that those from IT backgrounds don\u2019t need to be fully educated in chemistry, biology, physics, etc. After all, they aren\u2019t going to be developing methods; they will be helping to implement them. There are things common to all sciences that they need to understand, such as record keeping, the workflow models of testing and research, data acquisition and processing, instrumentation, and so on. That curriculum should also help people who want to specialize in particular subject areas such as laboratory database systems, robotics, etc.\n<\/p><p>The second step is to build a curriculum that allows students to meet those requirements. This requires solid forethought in the development and curation of course materials. A lot of material already exists and is spread over the internet on university, government, and company web sites. A good first step would be to collect and organize those references into a single site (the actual courses need not be moved to the site, just their descriptions, access requirements, and links). Presentation and organization of the content is also important. Someone visiting the site will need a guide to what LSE is about, how to find material appropriate for different subject areas, and how to get access to it. Consider your site's audience to be a visitor who knows nothing about the field: where do they start, and how do you facilitate their progress? Providing clear orientation and direction is key. First, give them an understanding of what LSE is all about, and then a map to whatever interests them. With the curriculum built, you can then identify areas that need more material and move to further develop the program. Of course, you'll also want to make it possible to take advantage of online demonstration systems and simulators to give people a feel for working with the various laboratory systems. This is a half-step to what is needed: there\u2019s no substitute for hands-on work with equipment.\n<\/p><p>As it stands today, we\u2019ve seemingly progressed from manual methods to computer-assisted methods and then to automated systems in the course of developing laboratory technologies over the years, and yet our educational programs are a patchwork of courses largely driven by individual needs. 
We need to take a new look at lab technologies and their use, and at how best to prepare people for that work through solid educational opportunities.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Closing\">Closing<\/span><\/h2>\n<p>This guide has addressed the following:\n<\/p>\n<ul><li><i>Why technology planning and management needs to be addressed<\/i>: Because integrated systems need attention in their application and management to protect electronic laboratory K\/I\/D, ensure that it can be effectively used, and ensure that the systems and products put in place are the right ones and fully contribute to improvements in lab operations.<\/li><\/ul>\n<ul><li><i>What's changed about that planning and management since the introduction of computers in the lab<\/i>: As technology in the lab expanded, we lost the basic understanding of what the new computer and instrument systems were and what they did, that they had faults, and that if we didn\u2019t plan for their effective use and counter those faults, we were opening ourselves to unpleasant surprises. The consequences at times were system crashes, lost data, and a lack of a real understanding of how the output of an instrument was transformed into a set of numbers, which meant we couldn\u2019t completely account for the results we were reporting. A more purposeful set of planning and management activities, begun at the earliest point possible, has become increasingly important.<\/li><\/ul>\n<ul><li><i>Why developing an environment that fosters productivity and innovation is important<\/i>: Innovation doesn\u2019t happen in a highly structured environment: you need the freedom to question, challenge, etc. You also need the tools to work with. The inspiration that leads to innovation can happen anywhere, anytime. All of a sudden, all the pieces fit. This requires flexibility and trust in people, an important part of corporate culture.<\/li><\/ul>\n<ul><li><i>Why developing high-quality K\/I\/D is desirable<\/i>: There are different types of data structures that are used in lab work, and careful attention is needed to work with and manage them. This includes the effective management of K\/I\/D, putting it in a structure that encourages its use and protects its value. When methods are proven and you have documented evidence that they were executed by properly educated personnel using qualified reagents, instruments, and methods, you should then have high-quality K\/I\/D to support each sample result and any other information gleaned from that data.<\/li><\/ul>\n<ul><li><i>Why fostering a culture around data integrity is important to lab operations, addressing both technical and personnel issues<\/i>: Positive outcomes will come from your data integrity efforts: your work will be easier and protected from loss, results will be easier to organize and analyze, and you\u2019ll have a better functioning lab. You\u2019ll also have fewer unpleasant surprises when technology changes occur and you need to transition from one way of doing things to another.<\/li><\/ul>\n<ul><li><i>How to address digital, facility, and backup security<\/i>: Preventing unauthorized electronic and physical intrusion is critical to data integrity and meeting regulatory requirements. It also ensures that access to K\/I\/D is protected against loss from a wide variety of threats to the organization's facilities, all while securing your ability to work. 
This includes addressing power backup, continuity of operations, systems backup, and more.<\/li><\/ul>\n<ul><li><i>How to acquire and develop \"products\" that support regulatory requirements<\/i>: Careful engineering and well-planned and -documented internal processes are needed to ensure that systems and methods that are being used can remain in use and be supported over the life span of a lab. This means recognizing that the initial design and planning of processes and methods has to be done well to yield a supportable product, and keeping in mind the potential for future process review and modification even as the initial process or method is being developed. Additionally, the lab must also recognize the complete product life cycle and how that affects the supportability of systems and methods.<\/li><\/ul>\n<ul><li><i>The importance of system integrations and the harmonization of K\/I\/D<\/i>: Integrated systems can benefit a lab's operations, as can the planning needed to work with different types of database systems as the results of lab work become more concentrated in LIMS and ELNs, including making decisions about how K\/I\/D is stored and distributed over multiple databases. At the same time, harmonization efforts using common hardware and software platforms to implement laboratory systems are important, but those efforts must also ensure that the move toward commonality doesn\u2019t force people to use products that are force-fits: products that don\u2019t really meet the lab's needs but serve some other agenda.<\/li><\/ul>\n<ul><li><i>Why the development of comprehensive higher-education courses dedicated to the laboratory systems engineer or lab science-IT hybrid is a must with today's modern laboratory technology<\/i>: In today's world, typical IT backgrounds with no lab tech familiarity, or typical laboratory science backgrounds with no IT familiarity, won\u2019t get you beyond the basic level of support for your laboratory systems. To be effective, IT personnel need to become familiar with the lab environment, the applications and technologies used, and the language of laboratory work, while scientists must become more familiar with the management of K\/I\/D from the technical perspective. This gap must be closed through new and improved higher-education programs.<\/li><\/ul>\n<p>And thus we return to the start to close this guide. First, there's a definite need for better planning and management of laboratory technologies. Careful attention is required to protect electronic laboratory knowledge, information, and data (K\/I\/D), ensure that it can be effectively used, and ensure that the systems and products put in place are the right ones and that they fully contribute to improvements in lab operations. Second, seven clear goals highlight this need for laboratory technology planning and management and point the way to improving how it's performed. From supporting an environment that fosters productivity and innovation all the way to ensuring proper systems integration and harmonization, planning and management is a multi-step process with many clear benefits. And finally, there's a definitive need for more laboratory systems engineers (LSEs) who have the education and skills needed to accomplish all that planning and management in an effective manner, from the very start. This will require a more concerted effort in academia, and perhaps even among professional organizations catering to laboratories. 
All of this together hopefully means a more thoughtful, modern, and deliberate approach to implementing laboratory technologies in your lab.\n<\/p>\n<h2><span id=\"rdp-ebb-Abbreviations,_acronyms,_and_initialisms\"><\/span><span class=\"mw-headline\" id=\"Abbreviations.2C_acronyms.2C_and_initialisms\">Abbreviations, acronyms, and initialisms<\/span><\/h2>\n<p><b>A\/D<\/b>: Analog-to-digital\n<\/p><p><b>AI<\/b>: Artificial intelligence\n<\/p><p><b>ALCOA<\/b>: Attributable, legible, contemporaneous, original, and accurate\n<\/p><p><b>API<\/b>: Application programming interface\n<\/p><p><b>CDS<\/b>: Chromatography data system\n<\/p><p><b>CPU<\/b>: Central processing unit\n<\/p><p><b>ELN<\/b>: Electronic laboratory notebook\n<\/p><p><b>EPA<\/b>: Environmental Protection Agency\n<\/p><p><b>FAIR<\/b>: Findable, accessible, interoperable, and reusable\n<\/p><p><b>FDA<\/b>: Food and Drug Administration\n<\/p><p><b>FRB<\/b>: Fast radio bursts\n<\/p><p><b>IT<\/b>: Information technology\n<\/p><p><b>ISO<\/b>: International Organization for Standardization\n<\/p><p><b>K\/I\/D<\/b>: Knowledge, information, and data\n<\/p><p><b>LAB-IT<\/b>: Laboratory information technology support staff\n<\/p><p><b>LAE<\/b>: Laboratory automation engineering (or engineer)\n<\/p><p><b>LES<\/b>: Laboratory execution system\n<\/p><p><b>LIMS<\/b>: Laboratory information management system\n<\/p><p><b>LIS<\/b>: Laboratory information system\n<\/p><p><b>LOF<\/b>: Laboratory of the future\n<\/p><p><b>LSE<\/b>: Laboratory systems engineer\n<\/p><p><b>ML<\/b>: Machine learning\n<\/p><p><b>OS<\/b>: Operating system\n<\/p><p><b>QA\/QC<\/b>: Quality assurance\/quality control\n<\/p><p><b>ROI<\/b>: Return on investment\n<\/p><p><b>SDMS<\/b>: Scientific data management system\n<\/p><p><b>SOP<\/b>: Standard operating procedure\n<\/p><p><b>TPM<\/b>: Technology planning and management\n<\/p><p><br \/>\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Footnotes\">Footnotes<\/span><\/h2>\n<div class=\"reflist\" style=\"list-style-type: lower-alpha;\">\n<div class=\"mw-references-wrap mw-references-columns\"><ol class=\"references\">\n<li id=\"cite_note-1\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-1\">\u2191<\/a><\/span> <span class=\"reference-text\">See <i><a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.13140\/RG.2.2.15605.42724\" target=\"_blank\">Elements of Laboratory Technology Management<\/a><\/i> and the LSE material in this document.<\/span>\n<\/li>\n<li id=\"cite_note-4\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-4\">\u2191<\/a><\/span> <span class=\"reference-text\">See the \"Scientific Manufacturing\" section of <i><a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.13140\/RG.2.2.15605.42724\" target=\"_blank\">Elements of Laboratory Technology Management<\/a><\/i>.<\/span>\n<\/li>\n<li id=\"cite_note-8\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-8\">\u2191<\/a><\/span> <span class=\"reference-text\">By \u201cgeneral systems\u201d I\u2019m not referring simply to computer systems, but to the models and systems found under \u201cgeneral systems theory\u201d in mathematics.<\/span>\n<\/li>\n<li id=\"cite_note-9\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-9\">\u2191<\/a><\/span> <span class=\"reference-text\">Regarding LAB-IT and LAEs, my thinking about these titles has changed over time; the last section of this document, \u201cLaboratory systems engineers,\u201d goes into more detail.<\/span>\n<\/li>\n<li id=\"cite_note-11\"><span 
class=\"mw-cite-backlink\"><a href=\"#cite_ref-11\">\u2191<\/a><\/span> <span class=\"reference-text\">\u201cReal-time\u201d has a different meaning inside the laboratory than it does in office applications. Instead of a response of a couple seconds between an action and response, lab \u201creal-time\u201d is often a millisecond or faster precision; missing a single sampling timing out of thousands can invalidate an entire sample analysis.<\/span>\n<\/li>\n<li id=\"cite_note-12\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-12\">\u2191<\/a><\/span> <span class=\"reference-text\">See <i><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Notes_on_Instrument_Data_Systems\" title=\"LII:Notes on Instrument Data Systems\" class=\"wiki-link\" data-key=\"1b7330228fd59158aab6fab82ad0e7cc\">Notes on Instrument Data Systems<\/a><\/i> for more on this topic.<\/span>\n<\/li>\n<li id=\"cite_note-14\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-14\">\u2191<\/a><\/span> <span class=\"reference-text\">For a more detailed description of the K\/I\/D model, please refer to <i><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.researchgate.net\/publication\/275351757_Computerized_Systems_in_the_Modern_Laboratory_A_Practical_Guide\" target=\"_blank\">Computerized Systems in the Modern Laboratory: A Practical Guide<\/a><\/i>.<\/span>\n<\/li>\n<li id=\"cite_note-15\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-15\">\u2191<\/a><\/span> <span class=\"reference-text\">For more detailed discussion on this, see <i><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Notes_on_Instrument_Data_Systems\" title=\"LII:Notes on Instrument Data Systems\" class=\"wiki-link\" data-key=\"1b7330228fd59158aab6fab82ad0e7cc\">Notes on Instrument Data Systems<\/a><\/i>.<\/span>\n<\/li>\n<li id=\"cite_note-16\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-16\">\u2191<\/a><\/span> <span class=\"reference-text\">For more information on virtualization, particularly if the subject is new to you, look at <i><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.vmware.com\/content\/dam\/learn\/en\/amer\/fy20\/pdf\/50872_20Q1_Next-Gen_Virtualization_FD_VMware_Special_Edition.pdf\" target=\"_blank\">Next-Gen Virtualization for Dummies<\/a><\/i>. The <i>for Dummies<\/i> series is designed to educate people new to a topic, getting away from jargon and presenting material in clear, easy-to-understand language. 
This book is particularly good at that.<\/span>\n<\/li>\n<li id=\"cite_note-19\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-19\">\u2191<\/a><\/span> <span class=\"reference-text\">One good reference on this subject is the presentation <i><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.pharma-iq.com\/events-laboratory-informatics-online\/downloads\/building-a-data-integrity-strategy-to-accompany-your-digital-enablement\" target=\"_blank\">Building A Data Integrity Strategy To Accompany Your Digital Enablement<\/a><\/i> by Julie Spirk Russom of BioTherapeutics Pharmaceutical Science.<\/span>\n<\/li>\n<li id=\"cite_note-23\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-23\">\u2191<\/a><\/span> <span class=\"reference-text\">Though it may not see significant updates, consider reading the <i><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Comprehensive_Guide_to_Developing_and_Implementing_a_Cybersecurity_Plan\" title=\"LII:Comprehensive Guide to Developing and Implementing a Cybersecurity Plan\" class=\"wiki-link\" data-key=\"fce1737a2e9697fc03e956327817f8ea\">Comprehensive Guide to Developing and Implementing a Cybersecurity Plan<\/a><\/i> for a much more comprehensive look at security in the lab.<\/span>\n<\/li>\n<li id=\"cite_note-26\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-26\">\u2191<\/a><\/span> <span class=\"reference-text\">Yes, the scientific work you do is essential to the lab\u2019s purpose, but our focus is on one element of the lab\u2019s operations: what happens after the scientific work is done.<\/span>\n<\/li>\n<li id=\"cite_note-27\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-27\">\u2191<\/a><\/span> <span class=\"reference-text\">For example, a product line is sold to another company or transferred to another division, and they then want copies of all relevant information. Meeting regulatory requirements is another example.<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<h2><span class=\"mw-headline\" id=\"About_the_author\">About the author<\/span><\/h2>\n<p>Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation\/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, develop effective support groups, and apply automation and information technologies in research and quality control environments.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"References\">References<\/span><\/h2>\n<div class=\"reflist references-column-width\" style=\"-moz-column-width: 30em; -webkit-column-width: 30em; column-width: 30em; list-style-type: decimal;\">\n<div class=\"mw-references-wrap mw-references-columns\"><ol class=\"references\">\n<li id=\"cite_note-BourneMyBoss13-2\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BourneMyBoss13_2-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Bourne, D. (2013). 
\"My boss the robot\". <i>Scientific American<\/i> <b>308<\/b> (5): 38\u201341. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1038%2Fscientificamerican0513-38\" target=\"_blank\">10.1038\/scientificamerican0513-38<\/a>. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/PubMed_Identifier\" data-key=\"1d34e999f13d8801964a6b3e9d7b4e30\">PMID<\/a> <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.ncbi.nlm.nih.gov\/pubmed\/23627215\" target=\"_blank\">23627215<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=My+boss+the+robot&rft.jtitle=Scientific+American&rft.aulast=Bourne%2C+D.&rft.au=Bourne%2C+D.&rft.date=2013&rft.volume=308&rft.issue=5&rft.pages=38%E2%80%9341&rft_id=info:doi\/10.1038%2Fscientificamerican0513-38&rft_id=info:pmid\/23627215&rfr_id=info:sid\/en.wikipedia.org:LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-CookCollab20-3\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-CookCollab20_3-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Cook, B. (2020). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.labmanager.com\/laboratory-technology\/collaborative-robots-mobile-and-adaptable-labmates-24474\" target=\"_blank\">\"Collaborative Robots: Mobile and Adaptable Labmates\"<\/a>. <i>Lab Manager<\/i> <b>15<\/b> (11): 10\u201313<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.labmanager.com\/laboratory-technology\/collaborative-robots-mobile-and-adaptable-labmates-24474\" target=\"_blank\">https:\/\/www.labmanager.com\/laboratory-technology\/collaborative-robots-mobile-and-adaptable-labmates-24474<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Collaborative+Robots%3A+Mobile+and+Adaptable+Labmates&rft.jtitle=Lab+Manager&rft.aulast=Cook%2C+B.&rft.au=Cook%2C+B.&rft.date=2020&rft.volume=15&rft.issue=11&rft.pages=10%E2%80%9313&rft_id=https%3A%2F%2Fwww.labmanager.com%2Flaboratory-technology%2Fcollaborative-robots-mobile-and-adaptable-labmates-24474&rfr_id=info:sid\/en.wikipedia.org:LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-HsuIsIt18-5\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-HsuIsIt18_5-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Hsu, J. (24 September 2018). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.nbcnews.com\/mach\/science\/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586\" target=\"_blank\">\"Is it aliens? Scientists detect more mysterious radio signals from distant galaxy\"<\/a>. <i>NBC News MACH<\/i><span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.nbcnews.com\/mach\/science\/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586\" target=\"_blank\">https:\/\/www.nbcnews.com\/mach\/science\/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Is+it+aliens%3F+Scientists+detect+more+mysterious+radio+signals+from+distant+galaxy&rft.atitle=NBC+News+MACH&rft.aulast=Hsu%2C+J.&rft.au=Hsu%2C+J.&rft.date=24+September+2018&rft_id=https%3A%2F%2Fwww.nbcnews.com%2Fmach%2Fscience%2Fit-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586&rfr_id=info:sid\/en.wikipedia.org:LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-TimmerAIPlus18-6\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-TimmerAIPlus18_6-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Timmer, J. (18 July 2018). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/arstechnica.com\/science\/2018\/07\/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work\/5\/\" target=\"_blank\">\"AI plus a chemistry robot finds all the reactions that will work\"<\/a>. <i>Ars Technica<\/i><span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/arstechnica.com\/science\/2018\/07\/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work\/5\/\" target=\"_blank\">https:\/\/arstechnica.com\/science\/2018\/07\/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work\/5\/<\/a><\/span><span class=\"reference-accessdate\">. Retrieved 04 February 2021<\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=AI+plus+a+chemistry+robot+finds+all+the+reactions+that+will+work&rft.atitle=Ars+Technica&rft.aulast=Timmer%2C+J.&rft.au=Timmer%2C+J.&rft.date=18+July+2018&rft_id=https%3A%2F%2Farstechnica.com%2Fscience%2F2018%2F07%2Fai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work%2F5%2F&rfr_id=info:sid\/en.wikipedia.org:LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-HelixAIHome-7\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-HelixAIHome_7-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.askhelix.io\/\" target=\"_blank\">\"HelixAI - Voice Powered Digital Laboratory Assistants for Scientific Laboratories\"<\/a>. HelixAI<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"http:\/\/www.askhelix.io\/\" target=\"_blank\">http:\/\/www.askhelix.io\/<\/a><\/span><span class=\"reference-accessdate\">. 
Source: https://www.limswiki.org/index.php/LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering
LII:Notes on Instrument Data Systems

Title: Notes on Instrument Data Systems
Author for citation: Joe Liscouski, with editorial modifications by Shawn Douglas
License for content: Creative Commons Attribution 4.0 International
Publication date: May 27, 2020

Contents

1 Introduction
2 An overall model for laboratory processes and instrumentation
2.1 Instruments vs. instrument packages
3 Working with more sophisticated instruments
4 Moving to fully automated, interconnected systems
5 Setting a direction for laboratory automation development
6 The bottom line
7 Abbreviations, acronyms, and initialisms
8 Footnotes
9 About the author
10 References

Introduction

The goal of this brief paper is to examine what it will take to advance laboratory operations in terms of technical content, data quality, and productivity. Advancements in the past have been incremental and isolated, the result of an individual's or group's work and not part of a broad industry plan. Disjointed, uncoordinated, incremental improvements have to give way to planned, directed methods, such that appropriate standards and products can be developed and mutually beneficial R&D programs instituted. We've long since entered a phase where the cost of technology development and implementation is too high to rely on a "let's try this" approach as the dominant methodology. Making progress in lab technologies is too important to be done without some direction (i.e., deliberate planning). Individual insights, inspiration, and "out of the box" thinking are always valuable; they can inspire a change in direction. But building to a purpose is equally important. This paper revisits past developments in instrument data systems (IDS), looks at issues that need attention as we further venture into the use of integrated informatics systems, and suggests some directions further development can take.
There is a second aspect beyond planning that also deserves attention: education. Yes, there are people who really know what they are doing with instrumental systems and data handling. However, that knowledge base isn't universal across labs. Many industrial labs and schools have people using instrument data systems with no understanding of what is happening to their data.
Others such as Hinshaw and Stevenson et al. have commented on this phenomenon in the past:

"Chromatographers go to great lengths to prepare, inject, and separate their samples, but they sometimes do not pay as much attention to the next step: peak detection and measurement ... Despite a lot of exposure to computerized data handling, however, many practicing chromatographers do not have a good idea of how a stored chromatogram file—a set of data points arrayed in time—gets translated into a set of peaks with quantitative attributes such as area, height, and amount."[1]

"At this point, I noticed that the discussion tipped from an academic recitation of technical needs and possible solutions to a session driven primarily by frustrations. Even today, the instruments are often more sophisticated than the average user, whether he/she is a technician, graduate student, scientist, or principal investigator using chromatography as part of the project. Who is responsible for generating good data? Can the designs be improved to increase data integrity?"[2]

We can expect that the same issue holds true for even more demanding individual or combined techniques. Unless lab personnel are well-educated in both the theory and the practice of their work, no amount of automation—including any IDS components—is going to matter in the development of usable data and information.
The IDS entered the laboratory initially as an aid to analysts doing their work. Its primary role was to off-load tedious measurements and calculations, giving analysts more time to inspect and evaluate lab results. The IDS has since morphed from a convenience to a necessity, and then to being a presumed part of an instrument system. That raises two sets of issues that we'll address here regarding people, technologies, and their intersections:
1. People: Do the users of an IDS understand what is happening to their data once it leaves the instrument and enters the computer? Do they understand the settings that are available and the effect they have on data processing, as well as the potential for compromising the results of the analytical bench work? Are lab personnel educated so that they are effective and competent users of all the technologies used in the course of their work?
2. Technologies: Are the systems we are using up to the task that has been assigned to them as we automate laboratory functions?
We'll begin with some basic material and then develop the argument from there.

An overall model for laboratory processes and instrumentation

Laboratory work is dependent upon instruments, and the push for higher productivity is driving us toward automation. What may be lost in all this, particularly to those new to lab procedures, is an understanding of how things work, and how computer systems control the final steps in a process. Without that understanding, good bench work can be reduced to bad and misleading results. If you want to guard against that prospect, you have to understand how instrument data systems work, as well as your role in ensuring accurate results.
Let's begin by looking at a model for lab processes (Figure 1) and define the important elements.

Figure 1. A basic laboratory process model
Most of the elements of Figure 1 are easy to understand. "Instrument Input" could be an injection port, an autosampler, or the pan on a balance. "Control Systems & Communications," normally part of the same electronics package as "Data Acquisition & Processing," is separated out to provide for hierarchical control configurations that might be found in, e.g., robotics systems. The model is easily expanded to describe hyphenated systems such as gas chromatography–mass spectrometry (GC-MS) applications, as shown in Figure 2. Mass spectrometers are best used when you have a clean sample (no contaminants), while chromatographs are great at separating mixtures and isolating components in the effluent. The combination of instruments makes an effective tool for separating and identifying the components in mixtures.

Figure 2. A basic laboratory process model applied to a GC-MS combination (the lower model illustration is flipped vertically to simplify the diagram)

Note that the chromatographic system and its associated components become the "Sample Preparation" element from Figure 1 when viewed from the perspective of the mass spectrometer (MS). The flow out of the chromatographic column is split (if needed, depending on the detector used, while controlling the input to the MS) so that the eluted material is directed to the input of the MS. Each eluted component from the chromatograph becomes a sample to the MS.

Instruments vs. instrument packages

One important feature of the model shown in Figure 1 is the arrow labeled "Analog Output"; this emphasizes that the standard output of instruments is an analog "response" or voltage. The response can take the form of a needle movement across a scale, a deflection, or some other indication (e.g., a level in a mercury or alcohol thermometer). Voltage output can be read with a meter or strip chart recorder. There are no instruments with digital outputs; they are all analog output devices.
On the other hand, instruments that appear to be digital devices are the result of packaging by the manufacturer. That packaging contains the analog signal-generating instrument, with a voltage output that is connected to an analog-to-digital converter.
Take for example analog pH meters, which are in reality millivolt meters with a pH scale added to the meter face. If you look at Figure 3, the lower switch position has a setting for both MV (millivolt) and pH. When you read the pH on the meter, you are doing an analog-to-digital conversion; the needle's movement is the result of the incoming voltage from the pH electrode. The "pH reading" is performed by you as you project the needle's position onto the scale and note the corresponding numeric value.

Figure 3. The front face of an Orion Research IONALYZER

A digital pH meter has the following components from our model (Figure 1) incorporated into the instrument package (the light green elements in the model):

The instrument – One or more electrodes provide the analog signal, which depends on the pH of the solution.
Data acquisition and processing – An external connection connects the electrodes to an analog-to-digital converter that converts the incoming voltage to a digital value that is in turn used to calculate the pH (a conversion sketched just after this list).
Control systems and communication – This manages the digital display, front panel control functions, and, if present, the RS-232/422/485 or network communications. If the communications feature is there, this module also interprets and responds to commands from an external computer.
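To make the data acquisition and processing step concrete, here is a minimal sketch, in Python, of what a digital meter does conceptually: convert raw converter counts to a voltage, then to pH via a two-point buffer calibration. The converter resolution, reference voltage, and buffer readings are hypothetical values chosen for illustration, not the specifications of any particular meter.

# Hypothetical ADC parameters, for illustration only.
ADC_BITS = 12           # 12-bit converter: counts range 0..4095
V_REF = 2.048           # full-scale reference voltage (volts)

def counts_to_millivolts(counts):
    """Convert raw ADC counts to the electrode voltage in millivolts."""
    return (counts / (2 ** ADC_BITS - 1)) * V_REF * 1000.0

def calibrate(mv_at_ph7, mv_at_ph4):
    """Two-point calibration: return (slope, offset) so that
    pH = (mv - offset) / slope. An ideal electrode gives roughly
    -59 mV per pH unit at 25 degrees C; real slopes drift."""
    slope = (mv_at_ph4 - mv_at_ph7) / (4.0 - 7.0)
    offset = mv_at_ph7 - slope * 7.0
    return slope, offset

def read_ph(counts, slope, offset):
    """Digitize a reading and convert it to pH."""
    mv = counts_to_millivolts(counts)
    return (mv - offset) / slope

# Example: calibrate with two buffer readings, then read a sample.
slope, offset = calibrate(mv_at_ph7=1024.0, mv_at_ph4=1201.5)
print(round(read_ph(2150, slope, offset), 2))

The point of the sketch is that the "digital" meter is still an analog instrument; everything after the electrode voltage is arithmetic performed in firmware, governed by calibration settings the user controls.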
Working with more sophisticated instruments

Prior to the introduction of computers to instruments, the common method of viewing an instrument's output was through meters and strip-chart recorders. That meant two things: all the measurements and calculations were done by hand, and you were intimately familiar with your data. Both those points are significant and have ramifications that warrant serious consideration in today's labs.
In order to keep things simple, let's focus on chromatography, although the points raised will have application to spectroscopy, thermal analysis, and other instrumental techniques in chemistry, material science, physics, and other domains. Chromatography has the richest history of automation in laboratory work, it is widely used, and it gives us a model for thinking about other techniques. You will not have to be an expert in chromatography to understand what follows.[a] To put it simply, chromatography is a technique for taking complex mixtures and allowing the analyst (with a bit of work) to separate the components of the mixture, determine what those components might be, and quantify them.
Older chromatographs looked like the instrument in front of the technician in Figure 4:

Figure 4. USGS, UMESC scientist Joseph Hunn at gas chromatograph, 1965

The instrument's output was an analog signal, a voltage, that was recorded on a strip-chart recorder using a pen trace on moving chart paper, as shown in Figure 5 (the red line is the recorded signal; the baseline would be on the right).

Figure 5. Strip chart recorder

The output of the instrument is a voltage (the "Analog Output" element in the model) that is connected to the recorder. The "Data Acquisition & Processing" and "Control Systems & Communication" elements are, to put it simply, you and your calculator.
Now let's look at a typical chromatogram from a modern instrument (Figure 6).

Figure 6. This chromatogram shows the separation of sugars in a mixture.[3] In an analog chromatogram, the line would be smooth since it would come from a continuous signal that was being recorded in real time. Images like this will often show stair-stepping on the peaks, a result of a computer sampling the signal periodically (possibly 20 to 50 times/sec depending on settings); the resulting image is a connect-the-dots display of the sampled readings. This is a characteristic difference between real-time strip-chart analog recording and digital systems (though not a criticism of this particular chromatogram).

As Figure 6 demonstrates, chromatograms are a series of peaks. Those peaks represent the retention time (how long it takes a molecule to elute after injection) and can be used to help determine a molecule's identity (other techniques such as MS may be used to confirm it). The size of the peak is used in quantitative analysis.
When doing quantitative analysis, part of the analyst's work is to determine the size of the peak and to ensure that there are no contaminants that would compromise the sample. This is where life gets interesting. Quantitative analysis requires the analysis of a series of standards to construct a calibration curve (peak size vs. concentration), followed by analysis of the samples. Their peak sizes are then placed on the calibration curve and their concentrations read (Figure 7).

Figure 7. Peak area measurement and calibration curve
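As an illustration of that workflow, the sketch below builds a linear calibration curve from a series of standards by least squares and then reads a sample's concentration from its peak area. The standard concentrations and areas are invented for illustration; a real method would also verify linearity and confirm that the sample falls within the calibrated range.

# Least-squares fit of peak area vs. concentration, then inverse
# prediction of a sample's concentration. All numbers are invented.

def fit_line(x, y):
    """Return (slope, intercept) of the least-squares line y = m*x + b."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    m = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
    b = mean_y - m * mean_x
    return m, b

# Standards: concentration (mg/L) and measured peak area (arbitrary units)
conc = [1.0, 2.0, 5.0, 10.0, 20.0]
area = [10.2, 19.8, 50.5, 99.1, 201.0]
m, b = fit_line(conc, area)

def concentration_from_area(a):
    """Invert the calibration curve: concentration = (area - b) / m."""
    return (a - b) / m

print(round(concentration_from_area(75.0), 2))  # sample peak area -> mg/L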
Peak size can be measured two ways: by the height of the peak above a baseline, and by measuring the area under the peak above that baseline. The latter is more accurate, but in the days before computers it was difficult to determine because peaks have irregular shapes, often with sloping baselines. As a result, unless there was a concern, peak height was often used, as it was faster and easier.
When done manually from strip chart recordings, there were two commonly used methods of measuring peak areas[4]:

Planimeters – a mechanical device used to trace the peak outline, which was tedious, exacting, and time-consuming
Cut-and-weigh – photocopies (usually) were made of the individual chromatograms; the peaks were cut out from the paper, put in a desiccator to normalize moisture content, and weighed. The weights, along with the surface area of the paper, were used to determine peak area. This was also slow, exacting, and time-consuming.

While peak height measurements were faster, they were still labor-intensive. There was one benefit to these methodologies: you were intimately familiar with the data. If anything was even a little off—including unexpected peaks, distortion of peak shapes, etc.—it was immediately noticeable and important, because it meant that there was something unexpected in the sample, which could mean serious contamination. Part of the analyst's role was to identify those contaminants, which usually meant multiple injections and cold trapping of the peak components as they left the column. In the past, infrared spectrophotometry was used to scan the trapped material. Today, people use mass spectrometry methods, e.g., liquid chromatography–mass spectrometry (LC-MS).
Anything that could make those measurements easier and faster was welcomed into the lab (depending on your budget). "Anything" began with electronic integrators built around the Intel 4004 and 8008 chips.[5][6] The integrators, connected to the analog output of the chromatograph, performed the data acquisition and processing to yield a paper strip with the peak number, retention time, peak height, peak area, and an indication of whether the peak ended on the baseline or formed a valley with a following peak, for all the peaks resulting from an injection. Most integrators had both digital (connecting to a digital I/O board) and serial (RS-232) input-output (I/O) ports for communication to a computer.
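The core of what those integrators computed can be approximated in a few lines. The sketch below is a simplified illustration rather than any vendor's algorithm: it takes a digitized signal (the sampled readings described in the Figure 6 caption), draws a straight baseline under a peak between its start and end points, and accumulates the area above that baseline by the trapezoidal rule.

# Trapezoidal peak area above a linear baseline, for a signal sampled
# at a fixed rate. Simplified for illustration; real integrators also
# handled valleys, shoulders, and baseline drift.

def peak_area(signal, start, end, sample_interval_s):
    """Area between the signal and a straight baseline drawn from
    signal[start] to signal[end], via the trapezoidal rule."""
    n = end - start
    base_step = (signal[end] - signal[start]) / n
    area = 0.0
    for i in range(start, end):
        baseline_i = signal[start] + base_step * (i - start)
        baseline_j = signal[start] + base_step * (i + 1 - start)
        h1 = signal[i] - baseline_i          # height above baseline
        h2 = signal[i + 1] - baseline_j
        area += 0.5 * (h1 + h2) * sample_interval_s
    return area

# A small synthetic peak sampled at 20 points/sec (invented numbers).
sig = [0.0, 0.1, 0.5, 2.0, 5.0, 8.0, 9.5, 8.0, 5.0, 2.0, 0.5, 0.1, 0.0]
print(round(peak_area(sig, 0, len(sig) - 1, 0.05), 3))  # area in mV*s

Notice how much judgment is hidden in the choice of start, end, and baseline; those are exactly the settings the analyst surrendered to default parameters when integrators arrived.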
With the change in technologies and methods, some things were gained, others lost. There was certainly an increased speed of measuring peak characteristics, with the final results calculations being left to the analyst. However, any intimacy with the data was largely lost, as was some control over how baselines were measured and how peak areas for overlapping peaks were partitioned. Additionally, unless a strip chart recorder is connected to the same output as the integrator, you don't see the chromatograms, making it easy to miss possible contaminants that would otherwise be evident as extra peaks or distorted peaks.
As instrumentation has developed, computers and software have become more accepted as part of lab operations, and the distinction between the instrument and the computational system has almost been lost. For example, strip-chart recorders, once standard, have become an optional accessory. Today, the computer's report has become an accepted summary of the analysis, and a screen-sized graphic, if captured or referenced at all, has become an acceptable representation of the chromatogram. If the analyst chooses to do so, the digital chromatogram can be enlarged to see more detail. Additionally, considerable computing resources are available today to analyze peaks and what they might represent through hyphenated techniques. However, those elements are only useful if someone inspects the chromatogram and looks for anomalies that could take the form of unexpected peaks or distorted peak shapes.
Issues with potential contaminants can be planned for by developing separation methods that properly separate all expected sample components and possible contaminants. The standard printed report will list everything found, including both expected molecules and those that could be there but shouldn't. There are also the unexpected things that show up either as well-defined peaks or as shoulders on other eluted components.
Figure 8 shows a pair of synthesized peaks (created in Excel) based on a normal distribution. The blue and red lines are the individual peaks, and the green is their combination. While the contribution of the blue peak is significant and distorts the overall peak envelope, it isn't sufficient to disrupt the continually ascending slope on the leading edge. The end result is that peak detection algorithms would likely lump the two areas together and report them as a single component; the retention time of the second peak was not affected. It may be possible to pick a set of peak detection parameters that would detect this, or to build an analytical report that would flag the unusual peak width (proportional to a standard's peak height). This would be something the analyst would have to build in rather than relying on default settings and reports.

Figure 8. Excel-generated representation of chromatographic peaks
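The Figure 8 exercise is easy to reproduce numerically. In the sketch below, with invented peak positions, heights, and widths, two Gaussian peaks are summed and the combined trace is scanned for local maxima; with this degree of overlap the sum rises to a single maximum, which is why a simple peak detector would report one component.

import math

def gaussian(t, height, center, sigma):
    """A chromatographic peak idealized as a Gaussian curve."""
    return height * math.exp(-((t - center) ** 2) / (2 * sigma ** 2))

# A small early peak riding on the leading edge of a larger one.
# All parameters are invented for illustration.
times = [i * 0.01 for i in range(0, 301)]            # 0.00 to 3.00 min
combined = [gaussian(t, 2.0, 1.30, 0.12) + gaussian(t, 10.0, 1.55, 0.12)
            for t in times]

# Count local maxima in the combined trace.
maxima = [times[i] for i in range(1, len(combined) - 1)
          if combined[i - 1] < combined[i] >= combined[i + 1]]
print(len(maxima), [round(t, 2) for t in maxima])    # one apparent peak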
The same issues exist in other techniques such as spectroscopy and thermal analysis. In those techniques, there may not be much that can be done in the analytical phase[b] to deal with the problem, and you may have to use software systems to handle it. Peak deconvolution represents one possible method.
There are examples of performing peak deconvolution with software packages, including from OriginLab[7] and Wolfram Research.[8] One potential problem with deconvolution, however, is that too many peaks may be created in order to make the curve fit match.
In the end, the success of all these methods depends upon well-educated analysts who are thoroughly familiar with the techniques they are using and the capabilities of the data handling systems used to collect and analyze instrument output.

Moving to fully automated, interconnected systems

One of the benefits of a laboratory information management system (LIMS) and electronic laboratory notebook (ELN) is their ability to receive communications from an IDS and automatically parse reports and enter results into their databases. Given the previous commentary, how do we ensure that the results being entered are correct? The IDS has moved from being an aid to the analyst to a replacement, losing the analyst's observation and judgment facilities. That can compromise data quality and integrity, permitting questionable data and information into the system. We need to find an effective substitute for the analyst as we move toward fully automated lab environments.
What needs to be developed on an industry-wide basis (it may exist in individual products or labs) is a system that verifies, certifies, and notifies of analytical results while also addressing the concerns raised above. Given the expanding interest in artificial intelligence (AI) systems, that would seem to be one path, but there are problems there. Most of the AI systems reported in the press are devised around machine learning processes based on the system's examination of test data sets, frequently in the form of images that can be evaluated. As of this writing, there are concerns being raised about the quality of the data sets and the reliability of the resulting systems. Another problem is that the decision-making process of machine learning-based systems is a black box, which wouldn't work in a regulated environment where validation is, and should be, a major concern. In short, if you don't know how it works, how do you develop trust in the results? After all, trust in the data is critical.
What we will need is a deliberately designed system to detect anomalies in analytical and experimental data that allows users to take appropriate follow-up actions. When applied to an IDS, this functionality would act as a "gatekeeper" and prevent questionable information from entering laboratory informatics systems (Figure 9); a rule-based sketch of the idea appears at the end of this section.

Figure 9. Example of "gatekeeper" evaluation criteria applied to the laboratory process model

The concept of a "data and information gatekeeper" is an interesting point as we move forward into automation and the integration of laboratory informatics systems. We have the ability to connect systems together and transfer data and information from an IDS to a scientific data management system (SDMS), and then to a LIMS or ELN. Once that data and information enters this upper tier of informatics, it is generally considered to be trustworthy and able to be used on a reliable basis for decision making. That trustworthiness needs to be based on more than the ability to put stuff into a database. We need to ensure that the process of developing data and information is reliable, tested, and validated. This is part of the attraction of a laboratory execution system (LES), which has built-in checks to ensure a process's steps are performed properly. While a LES is primarily built to support human activities, we need a similar mechanism for checks on automated processes—and in particular the final stages of data-to-information processing—in IDSs when throughput is so high that we can't rely on human inspectors. As we move into increasing productivity through automation (essentially becoming "scientific manufacturing or production" work), we need to adapt production-line inspection concepts to the processes of laboratory work.
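As a thought experiment, a first-generation gatekeeper could be nothing more than a set of deterministic rules applied to each result before it is passed upstream; unlike a black-box model, rules like these are transparent and therefore validatable. The check names and thresholds below are hypothetical examples, not a proposed standard.

# A rule-based "gatekeeper" that screens an analytical result before it
# is forwarded to a LIMS/ELN. All rule names and limits are hypothetical.

def gatekeeper(result):
    """Return (ok, issues) for a result dict carrying peak statistics."""
    issues = []
    if not result["calibration_current"]:
        issues.append("calibration expired")
    if result["unexpected_peaks"] > 0:
        issues.append(f"{result['unexpected_peaks']} unexpected peak(s)")
    # Flag peak widths well outside those of the standards; a shoulder
    # or co-eluting contaminant broadens the peak (see Figure 8).
    if result["peak_width_s"] > 1.5 * result["standard_width_s"]:
        issues.append("peak width out of range vs. standard")
    if not (result["cal_low"] <= result["value"] <= result["cal_high"]):
        issues.append("value outside calibration range")
    return (len(issues) == 0, issues)

ok, issues = gatekeeper({
    "calibration_current": True, "unexpected_peaks": 0,
    "peak_width_s": 9.8, "standard_width_s": 6.0,
    "value": 7.4, "cal_low": 1.0, "cal_high": 20.0,
})
print(ok, issues)   # -> False ['peak width out of range vs. standard']

Results that fail a check would be held for human review rather than written to the upstream database, which is the production-line inspection idea applied to data.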
Setting a direction for laboratory automation development

This paper began with a statement that an industry-wide direction is needed to facilitate the use of automation technologies in lab work. You might first ask "which industry?" The answer is "all of them." There has been a tendency over the last several years to focus on the life science and healthcare industries, which is understandable as that's largely where the money is. More recently, there has been a similar push toward the cannabis industry for the same reason: it makes for a wise investment. But labeling technologies as suitable for a particular industry misses the key point that laboratory automation and its supporting technology needs are basically the same across all industries. The samples and the specifics of the processes differ, but the underlying technologies are the same. Tools like microplates and supporting technologies are concentrated in the life sciences, but to a large extent that's because they haven't been tried elsewhere. The same holds true for software systems. Developers may have a more difficult time justifying projects if they are only targeted at one industry, but they may gain broader support if they take a wider view of the marketplace. If methods and standards are viewed as being marketplace-specific, they may be ignored by other potential users. As a result, developers will miss important input and opportunities that could make them more successful. In short, we need to be more inclusive in technology development, looking for broad applicability rather than a narrow focus.
When we construct a framework for developments in laboratory automation and computing, we should address the laboratory marketplace as a whole and then let individual markets adapt and refine that framework as required for their needs. For example, inter-system communication (e.g., bi-directional links between an IDS and a LIMS or ELN) has been a need for decades, one that the clinical chemistry industry has successfully addressed. So why not adapt their approach? (A sketch of what such a structured result handoff might look like follows.)
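To make the communications point concrete, the sketch below shows the kind of structured, self-describing result message an IDS might hand to a LIMS or ELN. The field names here are invented for illustration and are not drawn from any existing standard; the clinical world's established analyzer-interface message formats are far more complete.

import json
from datetime import datetime, timezone

def build_result_message(sample_id, analyte, value, units, instrument_id):
    """Package one analytical result as a self-describing message.
    Field names are illustrative, not those of any real standard."""
    return json.dumps({
        "message_type": "analytical_result",
        "sample_id": sample_id,
        "analyte": analyte,
        "value": value,
        "units": units,
        "instrument_id": instrument_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "gatekeeper_status": "passed",   # see the gatekeeper sketch above
    })

print(build_result_message("S-2021-0042", "glucose", 5.3, "mmol/L", "GC-07"))

The value of agreeing on such a format industry-wide is that every receiving system can parse it without vendor-specific adapters, which is precisely what bi-directional IDS-to-LIMS communication has lacked.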
We could develop a set of models for various lab operations. The one described earlier is one possibility, but it would need to be fleshed out to include administrative and other types of functions. Each element would have different sub-models to account for implementation methods (e.g., manual, semi-automated, fully robotic) with connection and control mechanisms (hierarchical for robotics, an LES for manual work). These models would show us where developments are needed and how they are interconnected, while also offering a guide for people planning lab operations. As such, this would become a more structured, engineered approach to the identification and development of integrated technologies.
But there's still another need, and that is proper education. This could be facilitated by the models just described, helping people understand what they need to learn to be effective lab managers, technology managers, lab personnel, and laboratory IT (LAB-IT) support personnel.
Together, education and technology development offer a natural feedback system. The more you learn, the better your ability to define and utilize laboratory technology, from generation to generation. That interaction leads to more effective education.

The bottom line

The bottom line to all this is pretty simple. Instrument vendors have produced a wide range of instrument systems, supported by instrument data systems, to assist lab personnel in their work. Their role is to develop systems that can be adapted to support instruments, with the user completing the process of analysis and reporting. General-purpose reporting can be reconfigured to meet specialized needs; after all, those vendor-supplied reports aren't the end of the process, just a guide to getting there.
Regardless of how an instrument is packaged (e.g., as an integrated package or a separate instrument-computer configuration), the end user has to be well-educated in both sets of components. They have to understand:

what the instrument is doing,
how it functions, and
the characteristics of the analog data it produces.

They also have to be familiar with:

what digital systems do with that data,
how they do it,
the limitations of those systems,
the interaction between the instrument and computer, and
the responsibilities of the analyst.

Digital systems are not a replacement for the analyst; rather, they are an aid in streamlining workflows. Whatever data and information is produced, you are signing your name to it and are responsible for those results.

Abbreviations, acronyms, and initialisms

AI: Artificial intelligence
ELN: Electronic laboratory notebook
GC-MS: Gas chromatography–mass spectrometry
IDS: Instrument data system
LC-MS: Liquid chromatography–mass spectrometry
LES: Laboratory execution system
LIMS: Laboratory information management system
MS: Mass spectrometer
SDMS: Scientific data management system

Footnotes

a. If you'd like to familiarize yourself with the technique, there are many sources online, e.g., Khan Academy.
b. In some cases, such as the infrared spectroscopic analysis of LDPE side-chain branching, interfering vinylidene peaks can be removed by bromination; see ASTM D3123 - 09(2017). Other techniques may have mechanisms for resolving the overlapping peak issue.

About the author

Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked and consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.

References

1. Hinshaw, J.V. (2014). "Finding a Needle in a Haystack". LCGC Europe 27 (11): 584–89. https://www.chromatographyonline.com/view/finding-needle-haystack-0
2. Stevenson, R.L.; Lee, M.; Gras, R. (1 September 2011). "The Future of GC Instrumentation From the 35th International Symposium on Capillary Chromatography (ISCC)". American Laboratory. https://americanlaboratory.com/913-Technical-Articles/34439-The-Future-of-GC-Instrumentation-From-the-35th-International-Symposium-on-Capillary-Chromatography-ISCC/
3. Valliyodan, B.; Shi, H.; Nguyen, H.T. (2015). "A Simple Analytical Method for High-Throughput Screening of Major Sugars from Soybean by Normal-Phase HPLC with Evaporative Light Scattering Detection".
Chromatography Research International 2015: 757649. doi:10.1155/2015/757649.
4. Ball, D.L.; Harris, W.E.; Habgood, H.W. (1967). "Errors in Manual Integration Techniques for Chromatographic Peaks". Journal of Chromatographic Science 5 (12): 613–20. doi:10.1093/chromsci/5.12.613.
5. Goedert, M.; Wise, S.A.; Juvet Jr., R.S. (1974). "Application of an inexpensive, general-purpose microcomputer in analytical chemistry". Chromatographia 7 (9): 539–46. doi:10.1007/BF02268338.
6. Betteridge, D.; Goad, T.B. (1981). "The impact of microprocessors on analytical instrumentation. A review". Analyst 106 (1260): 257–82. doi:10.1039/AN9810600257.
7. Mao, S. (9 March 2015). "How do I Perform Peak 'Deconvolution'?". Origin Blog. OriginLab. http://blog.originlab.com/how-do-i-perform-peak-deconvolution
8. Binous, H.; Bellagi, A. (March 2010). "Deconvolution of a Chromatogram". Wolfram Demonstrations Project. https://demonstrations.wolfram.com/DeconvolutionOfAChromatogram/

Source: https://www.limswiki.org/index.php/LII:Notes_on_Instrument_Data_Systems
href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory\" title=\"Laboratory\" class=\"wiki-link\" data-key=\"c57fc5aac9e4abf31dccae81df664c33\">laboratory<\/a> operations in terms of technical content, <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_quality\" title=\"Data quality\" class=\"wiki-link\" data-key=\"7fe43b05eae4dfa9b5c0547cc8cfcceb\">data quality<\/a>, and productivity. Advancements in the past have been incremental, and isolated, the result of an individual's or group's work and not part of a broad industry plan. Disjointed, uncoordinated, incremental improvements have to give way to planned, directed methods, such that appropriate standards and products can be developed and mutually beneficial R&D programs instituted. We\u2019ve long since entered a phase where the cost of technology development and implementation is too high to rely on a \u201clet\u2019s try this\u201d approach as the dominant methodology. Making progress in lab technologies is too important to be done without some direction (i.e., deliberate planning). Individual insights, inspiration, and \u201cout of the box\u201d thinking is always valuable; it can inspire a change in direction. But building to a purpose is equally important. This paper revisits past developments in instrument data systems (IDS), looks at issues that need attention as we further venture into the use of integrated systems, and suggests some directions further development can take.\n<\/p><p>There is a second aspect beyond planning that also deserves attention: education. Yes, there are people who really know what they are doing with instrumental systems and data handling. However, that knowledge base isn\u2019t universal across labs. Many industrial labs and schools have people using instrument data systems with no understanding of what is happening to their data. Others such as Hinshaw and Stevenson <i>et al.<\/i> have commented on this phenomenon in the past:\n<\/p>\n<blockquote><p>Chromatographers go to great lengths to prepare, inject, and separate their samples, but they sometimes do not pay as much attention to the next step: peak detection and measurement ... Despite a lot of exposure to computerized data handling, however, many practicing chromatographers do not have a good idea of how a stored chromatogram file\u2014a set of data points arrayed in time\u2014gets translated into a set of peaks with quantitative attributes such as area, height, and amount.<sup id=\"rdp-ebb-cite_ref-HinshawFinding14_1-0\" class=\"reference\"><a href=\"#cite_note-HinshawFinding14-1\">[1]<\/a><\/sup><\/p><\/blockquote>\n<blockquote><p>At this point, I noticed that the discussion tipped from an academic recitation of technical needs and possible solutions to a session driven primarily by frustrations. Even today, the instruments are often more sophisticated than the average user, whether he\/she is a technician, graduate student, scientist, or principal investigator using <a href=\"https:\/\/www.limswiki.org\/index.php\/Chromatography\" title=\"Chromatography\" class=\"wiki-link\" data-key=\"2615535d1f14c6cffdfad7285999ad9d\">chromatography<\/a> as part of the project. Who is responsible for generating good data? Can the designs be improved to increase data integrity?<sup id=\"rdp-ebb-cite_ref-StevensonTheFuture11_2-0\" class=\"reference\"><a href=\"#cite_note-StevensonTheFuture11-2\">[2]<\/a><\/sup><\/p><\/blockquote>\n<p>We can expect that the same issue holds true for even more demanding individual or combined techniques. 
Unless lab personnel are well-educated in both the theory and the practice of their work, no amount of automation\u2014including any IDS components\u2014is going to matter in the development of usable data and <a href=\"https:\/\/www.limswiki.org\/index.php\/Information\" title=\"Information\" class=\"wiki-link\" data-key=\"6300a14d9c2776dcca0999b5ed940e7d\">information<\/a>.\n<\/p><p>The IDS entered the laboratory initially as an aid to analysts doing their work. Its primary role was to off-load tedious measurements and calculations, giving analysts more time to inspect and evaluate lab results. The IDS has since morphed from a convenience to a necessity, and then to being a presumed part of an instrument system. That raises two sets of issues that we\u2019ll address here regarding people, technologies, and their intersections:\n<\/p><p>1. People: Do the users of an IDS understand what is happening to their data once it leaves the instrument and enters the computer? Do they understand the settings that are available and the effect they have on data processing, as well as the potential for compromising the results of the analytical bench work? Are lab personnel educated so that they are effective and competent users of all the technologies used in the course of their work?\n<\/p><p>2. Technologies: Are the systems we are using up to the task that has been assigned to them as we automate laboratory functions?\n<\/p><p>We\u2019ll begin with some basic material and then develop the argument from there.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"An_overall_model_for_laboratory_processes_and_instrumentation\">An overall model for laboratory processes and instrumentation<\/span><\/h2>\n<p>Laboratory work is dependent upon instruments, and the push for higher productivity is driving us toward automation. What may be lost in all this, particularly to those new to lab procedures, is an understanding of how things work, and how computer systems control the final steps in a process. Without that understanding, good bench work can be reduced to bad and misleading results. If you want to guard against that prospect, you have to understand how instrument data systems work, as well as your role in ensuring accurate results.\n<\/p><p>Let's begin by looking at a model for lab processes (Figure 1) and define the important elements.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig1_Liscouski_NotesOnInstDataSys20.png\" class=\"image wiki-link\" data-key=\"73166cc4ec253bb98abcfe11a1241589\"><img alt=\"Fig1 Liscouski NotesOnInstDataSys20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/c\/cf\/Fig1_Liscouski_NotesOnInstDataSys20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 1.<\/b> A basic laboratory process model<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Most of the elements of Figure 1 are easy to understand. \"Instrument Input\" could be an injection port, an autosampler, or the pan on a balance. 
\u201cControl Systems & Communications,\u201d normally part of the same electronics package as \u201cData Acquisition & Processing,\u201d is separated out to provide for hierarchical control configurations that might be found in, e.g., robotics systems. The model is easily expanded to describe hyphenated systems such as <a href=\"https:\/\/www.limswiki.org\/index.php\/Gas_chromatography%E2%80%93mass_spectrometry\" title=\"Gas chromatography\u2013mass spectrometry\" class=\"wiki-link\" data-key=\"d7fe02050f81fca3ad7a5845b1879ae2\">gas chromatography\u2013mass spectrometry<\/a> (GC-MS) applications, as shown in Figure 2. Mass spectrometers are best used when you have a clean sample (no contaminants), while chromatographs are great at separating mixtures and isolating components in the effluent. The combination of instruments makes an effective tool for separating and identifying the components in mixtures.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig2_Liscouski_NotesOnInstDataSys20.png\" class=\"image wiki-link\" data-key=\"c147d7f373f56b442793727e799cc005\"><img alt=\"Fig2 Liscouski NotesOnInstDataSys20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/12\/Fig2_Liscouski_NotesOnInstDataSys20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 2.<\/b> A basic laboratory process model applied to a GC-MS combination (the lower model illustration is flipped vertically to simplify the diagram)<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Note that the chromatographic system and its associated components become the \"Sample Preparation\" element from Figure 1 when viewed from the perspective of the mass spectrometer (MS). The flow out of the chromatographic column is split (if needed, depending on the detector used, while controlling the input to the MS) so that the eluted material is directed to the input of the MS. Each eluted component from the chromatograph becomes a sample to the MS.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Instruments_vs._instrument_packages\">Instruments vs. instrument packages<\/span><\/h3>\n<p>One important feature of the model shown in Figure 1 is that arrow labeled \u201cAnalog Output\u201d; this emphasizes that the standard output of instruments is an analog \"response\" or voltage. The response can take the form of a needle movement across a scale, a deflection, or some other indication (e.g., a level in a mercury or alcohol thermometer). Voltage output can be read with a meter or strip chart recorder. There are no instruments with digital outputs; they are all analog output devices. \n<\/p><p>On the other hand, instruments that appear to be digital devices are the result of packaging by the manufacturer. That packaging contains the analog signal-generating instrument, with a voltage output that is connected to an analog-to-digital converter. \n<\/p><p>Take for example analog pH meters, which are in reality millivolt meters with a pH scale added to the meter face. If you look at Figure 3, the lower switch position has a setting for both MV (millivolt) and pH. 
When you read the pH on the meter, you are doing an analog-to-digital conversion; the needle's movement is the result of the incoming voltage from the pH electrode. The \u201cpH reading\u201d is performed by you as you project the needle's position onto the scale and note the corresponding numeric value.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig3_Liscouski_NotesOnInstDataSys20.png\" class=\"image wiki-link\" data-key=\"ddbaa9c410be4bba7a3bf71011e1b104\"><img alt=\"Fig3 Liscouski NotesOnInstDataSys20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/5a\/Fig3_Liscouski_NotesOnInstDataSys20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 3.<\/b> The front face of an Orion Research IONALYZER<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>A digital pH meter has the following components from our model (Figure 1) incorporated into the instrument package (the light green elements in the model):\n<\/p>\n<ul><li>The instrument \u2013 One or more electrodes provide the analog signal, which depends on the pH of the solution.<\/li>\n<li>Data acquisition and processing \u2013 An external connection links the electrodes to an analog-to-digital converter, which converts the incoming voltage to a digital value that is in turn used to calculate the pH.<\/li>\n<li>Control systems and communication \u2013 This manages the digital display, front panel control functions, and, if present, the RS-232\/422\/485 or network communications. If the communications feature is there, this module also interprets and responds to commands from an external computer.<\/li><\/ul>\n<h2><span class=\"mw-headline\" id=\"Working_with_more_sophisticated_instruments\">Working with more sophisticated instruments<\/span><\/h2>\n<p>Prior to the introduction of computers to instruments, the common method of viewing an instrument's output was through meters and strip-chart recorders. That meant two things: all the measurements and calculations were done by hand, and you were intimately familiar with your data. Both those points are significant and have ramifications that warrant serious consideration in today\u2019s labs.\n<\/p><p>In order to keep things simple, let's focus on chromatography, although the points raised will have application to spectroscopy, thermal analysis, and other instrumental techniques in chemistry, materials science, physics, and other domains. Chromatography has the richest history of automation in laboratory work; it is widely used, and it gives us a model for thinking about other techniques. You will not have to be an expert in chromatography to understand what follows.<sup id=\"rdp-ebb-cite_ref-3\" class=\"reference\"><a href=\"#cite_note-3\">[a]<\/a><\/sup> To put it simply, chromatography is a technique for taking complex mixtures and allowing the analyst (with a bit of work) to separate the components of the mixture, determine what those components might be, and quantify them. 
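\n<\/p><p>Before moving on, it is worth pinning down in a few lines of code what the instrument-package elements described above actually do. What follows is a minimal sketch, in Python, of a digital pH meter's signal chain from raw ADC counts to a pH value; the bit depth, voltage span, and calibration handling are illustrative assumptions, not any vendor's firmware.\n<\/p>\n<pre>\n# Minimal sketch of the \"Data Acquisition & Processing\" element in a\n# digital pH meter. All constants here are illustrative assumptions.\n\nADC_BITS = 12             # assumed 12-bit converter\nADC_SPAN_MV = 2000.0      # assumed input span: -1000 mV to +1000 mV\nNERNST_SLOPE_MV = -59.16  # ideal electrode response at 25 C, mV per pH unit\n\ndef counts_to_millivolts(counts):\n    \"\"\"Map raw ADC counts (0..4095) onto the assumed voltage span.\"\"\"\n    return (counts \/ (2 ** ADC_BITS - 1)) * ADC_SPAN_MV - ADC_SPAN_MV \/ 2\n\ndef millivolts_to_ph(mv, offset_mv=0.0):\n    \"\"\"Ideal Nernst relationship; a real meter derives slope and offset\n    from a two- or three-point buffer calibration.\"\"\"\n    return 7.0 + (mv - offset_mv) \/ NERNST_SLOPE_MV\n\nmv = counts_to_millivolts(2048)  # a hypothetical raw reading near mid-scale\nprint(round(mv, 1), round(millivolts_to_ph(mv), 2))  # ~0.2 mV, pH ~7.0\n<\/pre>\n<p>Every \u201cdigital instrument\u201d that follows, including the chromatographic data systems discussed below, performs a more elaborate version of this same conversion chain. 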
\n<\/p><p>Older chromatographs looked like the instrument in front of the technician in Figure 4:\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig4_Liscouski_NotesOnInstDataSys20.jpg\" class=\"image wiki-link\" data-key=\"d2e187dc5fded5ec913c00993300fc89\"><img alt=\"Fig4 Liscouski NotesOnInstDataSys20.jpg\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/5a\/Fig4_Liscouski_NotesOnInstDataSys20.jpg\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 4.<\/b> USGS, UMESC scientist Joseph Hunn at gas chromatograph, 1965<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The instrument's output was an analog signal, a voltage, that was recorded on a strip-chart recorder using a pen trace on moving chart paper, as shown in Figure 5 (the red line is the recorded signal; the baseline would be on the right).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig5_Liscouski_NotesOnInstDataSys20.jpg\" class=\"image wiki-link\" data-key=\"dfed7fc8f3fb5f5a05a772f1a2cfd544\"><img alt=\"Fig5 Liscouski NotesOnInstDataSys20.jpg\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/b\/b7\/Fig5_Liscouski_NotesOnInstDataSys20.jpg\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 5.<\/b> Strip chart recorder<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The output of the instrument is a voltage (the \u201cAnalog Output\u201d element in the model) that is connected to the recorder. The \u201cData Acquisition & Processing\u201d and \u201cControl Systems & Communication\u201d elements are, to put it simply, you and your calculator.\n<\/p><p>Now let's look at a typical chromatogram from a modern instrument (Figure 6).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig6_Liscouski_NotesOnInstDataSys20.png\" class=\"image wiki-link\" data-key=\"4f85a987e762c43fbade699ba2cf6204\"><img alt=\"Fig6 Liscouski NotesOnInstDataSys20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/2\/21\/Fig6_Liscouski_NotesOnInstDataSys20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 6.<\/b> This chromatogram shows the separation of sugars in a mixture.<sup id=\"rdp-ebb-cite_ref-ValliyodanASimp15_4-0\" class=\"reference\"><a href=\"#cite_note-ValliyodanASimp15-4\">[3]<\/a><\/sup> In an analog chromatogram, the line would be smooth since it would come from a continuous signal that was being recorded in real time. 
Images like this will often show stair-stepping on the peaks, a result of a computer sampling the signal periodically (possibly 20 to 50 times\/sec depending on settings), and the resulting image is a connect-the-dots display of the sampled readings. This is a characteristic difference between real-time strip-chart analog recording and digital systems (though not a criticism of this particular chromatogram).<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>As Figure 6 demonstrates, chromatograms are a series of peaks. Those peaks represent the retention time (how long it takes a molecule to elute after injection) and can be used to help determine a molecule's identity (other techniques such as MS may be used to confirm it). The size of the peak is used in quantitative analysis.\n<\/p><p>When doing quantitative analysis, part of the analyst's work is to determine the size of the peak and to ensure that there are no contaminants that would compromise the sample. This is where life gets interesting. Quantitative analysis requires the analysis of a series of standards used to construct a calibration curve (peak size vs. concentration), followed by the analysis of the samples themselves. The samples' peak sizes are then placed on the calibration curve and their concentrations read from it (Figure 7).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig7_Liscouski_NotesOnInstDataSys20.png\" class=\"image wiki-link\" data-key=\"83c122430dcdeb1d11577b56b7d4a9a2\"><img alt=\"Fig7 Liscouski NotesOnInstDataSys20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/b\/b2\/Fig7_Liscouski_NotesOnInstDataSys20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 7.<\/b> Peak area measurement and calibration curve<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Peak size can be measured in two ways: by the height of the peak above a baseline, and by measuring the area under the peak above that baseline. The latter is more accurate, but in the days before computers, it was difficult to determine because peaks have irregular shapes, often with sloping baselines. As a result, unless there was a concern, peak height was often used as it was faster and easier.\n<\/p><p>When done manually from strip chart recordings, there were two commonly used methods of measuring peak areas<sup id=\"rdp-ebb-cite_ref-BallErrors67_5-0\" class=\"reference\"><a href=\"#cite_note-BallErrors67-5\">[4]<\/a><\/sup>:\n<\/p>\n<ul><li>Planimeters \u2013 mechanical devices used to trace the peak outline; the tracing was tedious, exacting, and time-consuming<\/li>\n<li>Cut-and-weigh \u2013 photocopies (usually) were made of the individual chromatograms; the peaks were cut out of the paper, placed in a desiccator to normalize moisture content, and weighed. The weights, along with the paper's weight per unit area, were used to determine peak area. This too was slow, exacting, and time-consuming<\/li><\/ul>\n<p>While peak height measurements were faster, they were still labor-intensive. There was one benefit to these methodologies: you were intimately familiar with the data. 
If anything was even a little off\u2014including unexpected peaks, distortion of peak shapes, etc.\u2014it was immediately noticeable and important because that meant that there was something unexpected in the sample, which could mean serious contamination. Part of the analyst\u2019s role was to identify those contaminants, which usually meant multiple injections and cold trapping of the peak components as they left the column. In the past, <a href=\"https:\/\/www.limswiki.org\/index.php\/Infrared_spectroscopy\" title=\"Infrared spectroscopy\" class=\"wiki-link\" data-key=\"3f771154f4545ab501387433b2d1895d\">infrared spectrophotometry<\/a> was used to scan the trapped material. Today, people use mass spectrometry methods, e.g., <a href=\"https:\/\/www.limswiki.org\/index.php\/Liquid_chromatography%E2%80%93mass_spectrometry\" title=\"Liquid chromatography\u2013mass spectrometry\" class=\"wiki-link\" data-key=\"d171745b38c8d2ed7d274d2cc13fa1f3\">liquid chromatography\u2013mass spectrometry<\/a> (LC-MS).\n<\/p><p>Anything that could make those measurements easier and faster was welcomed into the lab (depending on your budget). \u201cAnything\u201d began with electronic integrators built around the Intel 4004 and 8008 chips.<sup id=\"rdp-ebb-cite_ref-GoedertApp74_6-0\" class=\"reference\"><a href=\"#cite_note-GoedertApp74-6\">[5]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-BetteridgeTheImpact81_7-0\" class=\"reference\"><a href=\"#cite_note-BetteridgeTheImpact81-7\">[6]<\/a><\/sup> The integrators, connected to the analog output of the chromatograph, performed the data acquisition and processing needed to yield, for all the peaks resulting from an injection, a paper strip with the peak number, retention time, peak height, peak area, and an indication of whether the peak ended on the baseline or formed a valley with a following peak. Most integrators had both digital (connecting to a digital I\/O board) and serial (RS-232) input-output (I\/O) ports for communication to a computer.\n<\/p><p>With the change in technologies and methods, some things were gained, others lost. There was certainly an increased speed of measuring peak characteristics, with the final results calculations being left to the analyst. However, any intimacy with the data was largely lost, as was some control over how baselines were measured and how peak areas for overlapping peaks were partitioned. Additionally, unless a strip-chart recorder is connected to the same output as the integrator, you don't see the chromatograms, making it easy to miss possible contaminants that would otherwise be evident as extra or distorted peaks.\n<\/p><p>As instrumentation has developed, computers and software have become more accepted as part of lab operations, and the distinction between the instrument and the computational system has almost become lost. For example, strip-chart recorders, once standard, have become optional accessories. Today, the computer's report has become an accepted summary of an analysis, and a screen-sized graphic, if captured and referenced, has become an acceptable representation of the chromatogram. If the analyst chooses to do so, the digital chromatogram can be enlarged to see more detail. Additionally, considerable computing resources are available today to analyze peaks and what they might represent through hyphenated techniques. 
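\n<\/p><p>To make the kind of processing just described concrete, here is a minimal sketch, in Python, of the core measurements an integrator or IDS reports (peak height, and peak area by trapezoidal summation), plus the linear calibration step described earlier (peak size vs. concentration). The sampling interval, the flat-baseline assumption, and all data values are illustrative, not any vendor's algorithm.\n<\/p>\n<pre>\n# Minimal sketch of integrator-style measurements on a digitized peak.\n# Assumes a flat baseline and a single, already-located peak; real IDS\n# software must also find peak start\/end, correct sloping baselines,\n# and partition overlapping peaks.\n\ndef peak_height(signal, baseline=0.0):\n    \"\"\"Height of the tallest sample above the baseline.\"\"\"\n    return max(signal) - baseline\n\ndef peak_area(signal, dt, baseline=0.0):\n    \"\"\"Trapezoidal area above the baseline; dt is seconds per sample.\"\"\"\n    area = 0.0\n    for a, b in zip(signal, signal[1:]):\n        area += ((a - baseline) + (b - baseline)) * 0.5 * dt\n    return area\n\ndef fit_calibration(areas, concentrations):\n    \"\"\"Least-squares line (slope, intercept) for concentration vs. area.\"\"\"\n    n = len(areas)\n    mx = sum(areas) \/ n\n    my = sum(concentrations) \/ n\n    num = sum((x - mx) * (y - my) for x, y in zip(areas, concentrations))\n    den = sum((x - mx) ** 2 for x in areas)\n    slope = num \/ den\n    return slope, my - slope * mx\n\n# Hypothetical standards: measured peak areas for known concentrations.\nslope, intercept = fit_calibration([10.1, 19.8, 40.5], [1.0, 2.0, 4.0])\nsample_area = peak_area([0, 1, 4, 9, 4, 1, 0], dt=0.05)  # 20 samples\/sec\nprint(round(slope * sample_area + intercept, 3))  # estimated concentration\n<\/pre>\n<p>Everything in that sketch hangs on parameters the analyst never had to think about with a strip chart: the sampling interval, the baseline estimate, and where the peak is judged to start and end. 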
However, all of this processing is only useful if someone inspects the chromatogram and looks for anomalies that could take the form of unexpected peaks or distorted peak shapes.\n<\/p><p>Issues with potential contaminants can be planned for by developing separation methods that properly separate all expected sample components and possible contaminants. The standard printed report will list everything found, including both expected molecules and those that could be there but shouldn\u2019t. There are also the unexpected things that show up either as well-defined peaks or as shoulders on other eluted components.\n<\/p><p>Figure 8 shows a pair of synthesized peaks (created in Excel) based on a normal distribution. The blue and red lines are the individual peaks, and the green is their combination. While the contribution of the blue peak is significant and distorts the overall peak envelope, it isn\u2019t sufficient to disrupt the continually ascending slope on the leading edge. The end result is that peak detection algorithms would likely lump the two areas together and report them as a single component; the retention time of the second peak was not affected. It may be possible to pick a set of peak detection parameters that would catch this, or to design an analytical report that flags an unusual peak width relative to a standard's. Either would be something the analyst has to build in rather than relying on default settings and reports.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig8_Liscouski_NotesOnInstDataSys20.png\" class=\"image wiki-link\" data-key=\"96c61130b31aff4fbf33ffae41166694\"><img alt=\"Fig8 Liscouski NotesOnInstDataSys20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/9\/96\/Fig8_Liscouski_NotesOnInstDataSys20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 8.<\/b> Excel-generated representation of chromatographic peaks<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The same issues exist in other techniques such as spectroscopy and thermal analysis. In those techniques, there may not be much that can be done in the analytical phase<sup id=\"rdp-ebb-cite_ref-8\" class=\"reference\"><a href=\"#cite_note-8\">[b]<\/a><\/sup> to deal with the problem, and you may have to rely on software to handle it. 
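\n<\/p><p>The Figure 8 thought experiment is easy to reproduce in code. The sketch below builds two Gaussian peaks (positions, widths, and heights are assumed for illustration), sums them, and applies a naive local-maximum count of the kind a default peak-detection setting amounts to; the contaminant is absorbed into a single reported peak, while a simple width check against a clean standard still flags it.\n<\/p>\n<pre>\n# Minimal sketch reproducing the Figure 8 experiment: two Gaussian peaks\n# close enough that their sum shows only one maximum. All values are\n# assumed for illustration.\nimport math\n\ndef gaussian(t, center, sigma, height):\n    return height * math.exp(-((t - center) ** 2) \/ (2 * sigma ** 2))\n\nts = [i * 0.01 for i in range(1000)]  # 10 s of signal at 100 samples\/s\nmain = [gaussian(t, 5.0, 0.30, 1.00) for t in ts]\ncontaminant = [gaussian(t, 4.6, 0.25, 0.35) for t in ts]\ncombined = [a + b for a, b in zip(main, contaminant)]\n\ndef count_maxima(y, threshold=0.05):\n    \"\"\"Naive peak count: local maxima above a threshold.\"\"\"\n    n = 0\n    for i in range(1, len(y) - 1):\n        if y[i] > threshold and y[i] > y[i - 1] and y[i] >= y[i + 1]:\n            n += 1\n    return n\n\ndef width_at_half_height(y, dt=0.01):\n    \"\"\"Crude width: time spent at or above half of the maximum.\"\"\"\n    half = max(y) * 0.5\n    return sum(1 for v in y if v >= half) * dt\n\nprint(count_maxima(combined))  # 1: the shoulder is not resolved\nprint(width_at_half_height(main), width_at_half_height(combined))\n# The combined peak is noticeably wider than the clean standard,\n# which a report rule could flag for review.\n<\/pre>\n<p>Once an envelope like this is flagged, it still has to be untangled. 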
Peak deconvolution represents one possible method.\n<\/p><p>There are examples of performing peak deconvolution with software packages, including those from OriginLab<sup id=\"rdp-ebb-cite_ref-MaoHowDoI15_9-0\" class=\"reference\"><a href=\"#cite_note-MaoHowDoI15-9\">[7]<\/a><\/sup> and Wolfram Research.<sup id=\"rdp-ebb-cite_ref-BinousDeconv10_10-0\" class=\"reference\"><a href=\"#cite_note-BinousDeconv10-10\">[8]<\/a><\/sup> One potential problem with deconvolution, however, is that the fitting routine may introduce more peaks than are physically present simply to make the curve fit.\n<\/p><p>In the end, the success of all these methods depends upon well-educated analysts who are thoroughly familiar with the techniques they are using and the capabilities of the data handling systems used to collect and analyze instrument output.\n<\/p>\n<h2><span id=\"rdp-ebb-Moving_to_fully_automated,_interconnected_systems\"><\/span><span class=\"mw-headline\" id=\"Moving_to_fully_automated.2C_interconnected_systems\">Moving to fully automated, interconnected systems<\/span><\/h2>\n<p>One of the benefits of a <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_information_management_system\" title=\"Laboratory information management system\" class=\"wiki-link\" data-key=\"8ff56a51d34c9b1806fcebdcde634d00\">laboratory information management system<\/a> (LIMS) and <a href=\"https:\/\/www.limswiki.org\/index.php\/Electronic_laboratory_notebook\" title=\"Electronic laboratory notebook\" class=\"wiki-link\" data-key=\"a9fbbd5e0807980106763fab31f1e72f\">electronic laboratory notebook<\/a> (ELN) is their ability to receive communications from an IDS and automatically parse reports and enter results into their databases. Given the previous commentary, how do we ensure that the results being entered are correct? The IDS has moved from being an aid to the analyst to a replacement, losing the analyst's observation and judgment. That can compromise data quality and <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_integrity\" title=\"Data integrity\" class=\"wiki-link\" data-key=\"382a9bb77ee3e36bb3b37c79ed813167\">integrity<\/a>, permitting questionable data and information into the system. We need to find an effective substitute for the analyst as we move toward fully automated lab environments.\n<\/p><p>What needs to be developed on an industry-wide basis (it may exist in individual products or labs) is a system that verifies and certifies analytical results, and issues notifications about them, while also addressing the concerns raised above. Given the expanding interest in <a href=\"https:\/\/www.limswiki.org\/index.php\/Artificial_intelligence\" title=\"Artificial intelligence\" class=\"wiki-link\" data-key=\"0c45a597361ca47e1cd8112af676276e\">artificial intelligence<\/a> (AI) systems, that would seem to be one path, but there are problems there. Most of the AI systems reported in the press are devised around <a href=\"https:\/\/www.limswiki.org\/index.php\/Machine_learning\" title=\"Machine learning\" class=\"wiki-link\" data-key=\"79aab39cfa124c958cd1dbcab3dde122\">machine learning<\/a> processes based on the system's examination of test data sets, frequently in the form of images that can be evaluated. As of this writing, concerns are being raised about the quality of those data sets and the reliability of the resulting systems. Another problem is that the decision-making process of machine learning-based systems is a black box, which wouldn\u2019t work in a regulated environment where validation is, and should be, a major concern. 
In short, if you don\u2019t know how it works, how do you develop trust in the results? After all, trust in the data is critical.\n<\/p><p>What we will need is a deliberately designed system to detect anomalies in analytical and experimental data that allows users to take appropriate follow-up actions. When applied to an IDS, this functionality would act as a \u201cgatekeeper\u201d and prevent questionable information from entering <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_informatics\" title=\"Laboratory informatics\" class=\"wiki-link\" data-key=\"00edfa43edcde538a695f6d429280301\">laboratory informatics<\/a> systems (Figure 9).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig9_Liscouski_NotesOnInstDataSys20.png\" class=\"image wiki-link\" data-key=\"3dfb1e43f4f025b805c39355ba8246ff\"><img alt=\"Fig9 Liscouski NotesOnInstDataSys20.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/9\/96\/Fig9_Liscouski_NotesOnInstDataSys20.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 9.<\/b> Example of \u201cgatekeeper\u201d evaluation criteria applied to the laboratory process model<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The concept of a \u201cdata and information gatekeeper\u201d is an interesting point as we move forward into automation and the integration of laboratory informatics systems. We have the ability to connect systems together and transfer data and information from an IDS to a <a href=\"https:\/\/www.limswiki.org\/index.php\/Scientific_data_management_system\" title=\"Scientific data management system\" class=\"wiki-link\" data-key=\"9f38d322b743f578fef487b6f3d7c253\">scientific data management system<\/a> (SDMS), and then to a LIMS or ELN. Once that data and information enters this upper tier of informatics, it is generally considered to be trustworthy and able to be used on a reliable basis for decision making. That trustworthiness needs to be based on more than the ability to put stuff into a database. We need to ensure that the process of developing data and information is reliable, tested, and validated. This is part of the attraction of a <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_execution_system\" title=\"Laboratory execution system\" class=\"wiki-link\" data-key=\"774bdcab852f4d09565f0486bfafc26a\">laboratory execution system<\/a> (LES), which has built-in checks to ensure a process\u2019 steps are performed properly. While a LES is primarily built to support human activities, we need a similar mechanism for checks on automated processes\u2014and in particular the final stages of data-to-information processing\u2014in IDSs when throughput is so high that we can\u2019t rely on human inspectors. 
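\n<\/p><p>What might such a gatekeeper look like in practice? The following is a minimal sketch of the rule-based checking Figure 9 implies, applied to a parsed IDS result before it is allowed into a LIMS or ELN. Every field name, rule, and limit here is a hypothetical illustration; a production gatekeeper would be method-specific, configurable, and validated.\n<\/p>\n<pre>\n# Minimal sketch of a rule-based \"gatekeeper\" between an IDS and a\n# LIMS\/ELN. All rules, field names, and limits are hypothetical.\n\ndef within_calibration(result):\n    \"\"\"Results read off the ends of the calibration curve are suspect.\"\"\"\n    lo, hi = result[\"calibration_range\"]\n    x = result[\"concentration\"]\n    return not (lo > x or x > hi)\n\ndef expected_peak_count(result):\n    \"\"\"Extra peaks suggest contamination; missing peaks, a bad run.\"\"\"\n    return result[\"peaks_found\"] == result[\"peaks_expected\"]\n\ndef peak_width_ok(result, tolerance=0.15):\n    \"\"\"Peaks much wider than the standard's hint at co-elution.\"\"\"\n    ratio = result[\"peak_width\"] \/ result[\"standard_peak_width\"]\n    return tolerance >= abs(ratio - 1.0)\n\nRULES = [within_calibration, expected_peak_count, peak_width_ok]\n\ndef gatekeeper(result):\n    \"\"\"Return (accepted, failed rule names). Rejected results are routed\n    to a person for review instead of silently entering the database.\"\"\"\n    failures = [rule.__name__ for rule in RULES if not rule(result)]\n    return (len(failures) == 0, failures)\n\nreport = {  # a hypothetical parsed IDS report\n    \"concentration\": 2.31, \"calibration_range\": (0.5, 5.0),\n    \"peaks_found\": 3, \"peaks_expected\": 2,\n    \"peak_width\": 0.86, \"standard_peak_width\": 0.71,\n}\nprint(gatekeeper(report))  # (False, ['expected_peak_count', 'peak_width_ok'])\n<\/pre>\n<p>The value is not in any one rule but in the principle: nothing moves from the IDS into the upper informatics tier without passing explicit, documented checks, and anything that fails is held for human review. 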
As we move into increasing productivity through automation (essentially becoming \u201cscientific manufacturing or production\u201d work), we need to adapt production-line inspection concepts to the processes of laboratory work.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Setting_a_direction_for_laboratory_automation_development\">Setting a direction for laboratory automation development<\/span><\/h2>\n<p>This paper began with a statement that an industry-wide direction is needed to facilitate the use of automation technologies in lab work. You might first ask \u201cwhich industry?\u201d The answer is \"all of them.\" There has been a tendency over the last several years to focus on the life science and healthcare industries, which is understandable as that\u2019s largely where the money is. More recently, however, there has been a similar push toward the cannabis industry, for the same reason: it makes for a wise investment. Yet labeling technologies as suitable for a particular industry misses the key point that laboratory automation and its supporting technology needs are basically the same across all industries. The samples and the specifics of the processes differ, but the underlying technologies are the same. Tools like microplates and supporting technologies are concentrated in the life sciences, but to a large extent that's because they haven\u2019t been tried elsewhere. The same holds true for software systems. Developers may have a more difficult time justifying projects if they are only targeted at one industry, but they may gain broader support if they take a wider view of the marketplace. If methods and standards are viewed as being marketplace-specific, they may be ignored by other potential users. As a result, developers will miss important input and opportunities that could make them more successful. In short, we need to be more inclusive in technology development, looking for broad applicability rather than a narrow focus.\n<\/p><p>When we construct a framework for developments in laboratory automation and computing, we should address the laboratory marketplace as a whole and then let individual markets adapt and refine that framework as required for their needs. For example, inter-system communication (e.g., bi-directional IDS to LIMS or ELN) has been a need for decades, one that the clinical chemistry industry has successfully addressed. So why not adapt its approach?\n<\/p><p>We could develop a set of models for various lab operations. The one described earlier is one possibility, but it would need to be fleshed out to include administrative and other types of functions. Each element would have different sub-models to account for implementation methods (e.g., manual, semi-automated, fully robotic, etc.) with connection and control mechanisms (hierarchical for robotics, and LES for manual). These models would show us where developments are needed and how they are interconnected, while also offering a guide for people planning lab operations. As such, they would provide a more structured, engineered approach to the identification and development of integrated technologies.\n<\/p><p>But there's still another need, and that is proper education. 
The models described above could facilitate that education, helping people understand what they need to learn to be effective lab managers, technology managers, lab personnel, and laboratory IT (LAB-IT) support personnel.\n<\/p><p>Together, education and technology development offer a natural feedback system. The more you learn, the better your ability to define and utilize laboratory technology, from generation to generation. That interaction, in turn, leads to more effective education.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"The_bottom_line\">The bottom line<\/span><\/h2>\n<p>The bottom line to all this is pretty simple. Instrument vendors have produced a wide range of instrument systems, supported by instrument data systems, to assist lab personnel in their work. Their role is to develop systems that can be adapted to support instruments, with the user completing the process of analysis and reporting. General-purpose reporting can be reconfigured to meet specialized needs; after all, those vendor-supplied reports aren\u2019t the end of the process, just a guide to getting there.\n<\/p><p>Regardless of how an instrument is packaged (e.g., as an integrated package or as a separate instrument-computer configuration), the end user has to be well-educated in both sets of components. They have to understand:\n<\/p>\n<ul><li>what the instrument is doing,<\/li>\n<li>how it functions, and<\/li>\n<li>the characteristics of the analog data it produces.<\/li><\/ul>\n<p>They also have to be familiar with:\n<\/p>\n<ul><li>what digital systems do with that data,<\/li>\n<li>how they do it,<\/li>\n<li>the limitations of those systems,<\/li>\n<li>the interaction between the instrument and computer, and<\/li>\n<li>the responsibilities of the analyst.<\/li><\/ul>\n<p>Digital systems are not a replacement for the analyst; rather, they are an aid in streamlining workflows. 
Whatever data and information is produced, you are signing your name to it and are responsible for those results.\n<\/p>\n<h2><span id=\"rdp-ebb-Abbreviations,_acronyms,_and_initialisms\"><\/span><span class=\"mw-headline\" id=\"Abbreviations.2C_acronyms.2C_and_initialisms\">Abbreviations, acronyms, and initialisms<\/span><\/h2>\n<p><b>AI<\/b>: Artificial intelligence\n<\/p><p><b>ELN<\/b>: Electronic laboratory notebook\n<\/p><p><b>GC-MS<\/b>: Gas chromatography\u2013mass spectrometry\n<\/p><p><b>IDS<\/b>: Instrument data system\n<\/p><p><b>LC-MS<\/b>: Liquid chromatography\u2013mass spectrometry\n<\/p><p><b>LES<\/b>: Laboratory execution system\n<\/p><p><b>LIMS<\/b>: Laboratory information management system\n<\/p><p><b>MS<\/b>: Mass spectrometer\n<\/p><p><b>SDMS<\/b>: Scientific data management system\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Footnotes\">Footnotes<\/span><\/h2>\n<div class=\"reflist\" style=\"list-style-type: lower-alpha;\">\n<div class=\"mw-references-wrap\"><ol class=\"references\">\n<li id=\"cite_note-3\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-3\">\u2191<\/a><\/span> <span class=\"reference-text\">If you\u2019d like to familiarize yourself with the technique, however, there are many sources online, e.g., <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.khanacademy.org\/science\/class-11-chemistry-india\/xfbb6cb8fc2bd00c8:in-in-organic-chemistry-some-basic-principles-and-techniques\/xfbb6cb8fc2bd00c8:in-in-methods-of-purification-of-organic-compounds\/a\/principles-of-chromatography\" target=\"_blank\">Khan Academy<\/a>.<\/span>\n<\/li>\n<li id=\"cite_note-8\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-8\">\u2191<\/a><\/span> <span class=\"reference-text\">In some cases, such as the infrared spectroscopic analysis of LDPE side-chain branching, interfering vinylidene peaks can be removed by bromination; see <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.astm.org\/Standards\/D3123.htm\" target=\"_blank\">ASTM D3123 - 09(2017)<\/a>. Other techniques may have mechanisms for resolving the overlapping peak issue.<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<h2><span class=\"mw-headline\" id=\"About_the_author\">About the author<\/span><\/h2>\n<p>Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation\/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. 
His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"References\">References<\/span><\/h2>\n<div class=\"reflist references-column-width\" style=\"-moz-column-width: 30em; -webkit-column-width: 30em; column-width: 30em; list-style-type: decimal;\">\n<div class=\"mw-references-wrap\"><ol class=\"references\">\n<li id=\"cite_note-HinshawFinding14-1\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-HinshawFinding14_1-0\">\u2191<\/a><\/span> <span class=\"reference-text\">Hinshaw, J.V. (2014). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.chromatographyonline.com\/view\/finding-needle-haystack-0\" target=\"_blank\">\"Finding a Needle in a Haystack\"<\/a>. <i>LCGC Europe<\/i> <b>27<\/b> (11): 584\u201389.<\/span>\n<\/li>\n<li id=\"cite_note-StevensonTheFuture11-2\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-StevensonTheFuture11_2-0\">\u2191<\/a><\/span> <span class=\"reference-text\">Stevenson, R.L.; Lee, M.; Gras, R. (1 September 2011). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/americanlaboratory.com\/913-Technical-Articles\/34439-The-Future-of-GC-Instrumentation-From-the-35th-International-Symposium-on-Capillary-Chromatography-ISCC\/\" target=\"_blank\">\"The Future of GC Instrumentation From the 35th International Symposium on Capillary Chromatography (ISCC)\"<\/a>. <i>American Laboratory<\/i>.<\/span>\n<\/li>\n<li id=\"cite_note-ValliyodanASimp15-4\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-ValliyodanASimp15_4-0\">\u2191<\/a><\/span> <span class=\"reference-text\">Valliyodan, B.; Shi, H.; Nguyen, H.T. (2015). \"A Simple Analytical Method for High-Throughput Screening of Major Sugars from Soybean by Normal-Phase HPLC with Evaporative Light Scattering Detection\". <i>Chromatography Research International<\/i> <b>2015<\/b>: 757649. doi:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1155%2F2015%2F757649\" target=\"_blank\">10.1155\/2015\/757649<\/a>.<\/span>\n<\/li>\n<li id=\"cite_note-BallErrors67-5\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BallErrors67_5-0\">\u2191<\/a><\/span> <span class=\"reference-text\">Ball, D.L.; Harris, W.E.; Habgood, H.W. (1967). \"Errors in Manual Integration Techniques for Chromatographic Peaks\". <i>Journal of Chromatographic Science<\/i> <b>5<\/b> (12): 613\u201320. doi:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1093%2Fchromsci%2F5.12.613\" target=\"_blank\">10.1093\/chromsci\/5.12.613<\/a>.<\/span>\n<\/li>\n<li id=\"cite_note-GoedertApp74-6\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-GoedertApp74_6-0\">\u2191<\/a><\/span> <span class=\"reference-text\">Goedert, M.; Wise, S.A.; Juvet Jr., R.S. (1974). \"Application of an inexpensive, general-purpose microcomputer in analytical chemistry\". <i>Chromatographia<\/i> <b>7<\/b> (9): 539\u201346. doi:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1007%2FBF02268338\" target=\"_blank\">10.1007\/BF02268338<\/a>.<\/span>\n<\/li>\n<li id=\"cite_note-BetteridgeTheImpact81-7\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BetteridgeTheImpact81_7-0\">\u2191<\/a><\/span> <span class=\"reference-text\">Betteridge, D.; Goad, T.B. (1981). \"The impact of microprocessors on analytical instrumentation. A review\". <i>Analyst<\/i> <b>106<\/b> (1260): 257\u201382. doi:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1039%2FAN9810600257\" target=\"_blank\">10.1039\/AN9810600257<\/a>.<\/span>\n<\/li>\n<li id=\"cite_note-MaoHowDoI15-9\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-MaoHowDoI15_9-0\">\u2191<\/a><\/span> <span class=\"reference-text\">Mao, S. (9 March 2015). <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/blog.originlab.com\/how-do-i-perform-peak-deconvolution\" target=\"_blank\">\"How do I Perform Peak \u201cDeconvolution\u201d?\"<\/a>. <i>Origin Blog<\/i>. OriginLab.<\/span>\n<\/li>\n<li id=\"cite_note-BinousDeconv10-10\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-BinousDeconv10_10-0\">\u2191<\/a><\/span> <span class=\"reference-text\">Binous, H.; Bellagi, A. (March 2010). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/demonstrations.wolfram.com\/DeconvolutionOfAChromatogram\/\" target=\"_blank\">\"Deconvolution of a Chromatogram\"<\/a>. <i>Wolfram Demonstrations Project<\/i>.<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n
<\/div><\/div><div class=\"printfooter\">Source: <a rel=\"external_link\" class=\"external\" href=\"https:\/\/www.limswiki.org\/index.php\/LII:Notes_on_Instrument_Data_Systems\">https:\/\/www.limswiki.org\/index.php\/LII:Notes_on_Instrument_Data_Systems<\/a><\/div>\n<\/body>","1b7330228fd59158aab6fab82ad0e7cc_images":["https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/c\/cf\/Fig1_Liscouski_NotesOnInstDataSys20.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/12\/Fig2_Liscouski_NotesOnInstDataSys20.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/5a\/Fig3_Liscouski_NotesOnInstDataSys20.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/5a\/Fig4_Liscouski_NotesOnInstDataSys20.jpg","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/b\/b7\/Fig5_Liscouski_NotesOnInstDataSys20.jpg","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/2\/21\/Fig6_Liscouski_NotesOnInstDataSys20.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/b\/b2\/Fig7_Liscouski_NotesOnInstDataSys20.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/9\/96\/Fig8_Liscouski_NotesOnInstDataSys20.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/9\/96\/Fig9_Liscouski_NotesOnInstDataSys20.png"],"1b7330228fd59158aab6fab82ad0e7cc_timestamp":1637252747,"2016524e3c4b551c982fcfc23e33220d_type":"article","2016524e3c4b551c982fcfc23e33220d_title":"Laboratory Technology Management & Planning (Liscouski 2019)","2016524e3c4b551c982fcfc23e33220d_url":"https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Management_%26_Planning","2016524e3c4b551c982fcfc23e33220d_plaintext":"\n\nLII:Laboratory Technology Management & Planning\nTitle: Laboratory Technology Management & Planning\nAuthor for citation: Joe Liscouski\nLicense for content: Creative Commons Attribution 4.0 International\nPublication date: October 2019\n\r\n\nThis document is based on a presentation delivered at the 2nd Annual Lab Asset & Facility Management in Pharma 2019 conference held in San Diego, CA, on October 22nd, 2019. It is not a verbatim transcript, but an expansion of the material presented. The presentation ran two hours, and even with that there was a limit to the depth I could go into; even with the added material, we are only touching lightly on these topics. Plus, there is the ever-present \u201cI should have added\u2026\u201d as you move through the material, so that material has been included as well.\nAmong the complaints about scientific work\u2014in any discipline\u2014is that it is expensive, inefficient, sometimes difficult to reproduce, and slow to execute. Part of that is due to the nature of research; you are moving into new territory and there is no map. Another aspect is that we have a lot of technology to work with, but it isn\u2019t used effectively.\nAmong the reasons we bring advanced technologies into scientific work: they enable us to do things we otherwise couldn\u2019t, improve operational efficiency, and improve the return on corporate investments in scientific projects.\nInstrumentation, computers, software, and networks, from a variety of vendors, are designed to do specific jobs but often do not work well together. 
And then there are the results of scientific work: knowledge, information, and data, which are often not well managed, in incompatible databases, files, and spreadsheets. That doesn\u2019t include upgrades to systems and support. An answer to those issues is effective technology management and planning. That work should yield better organized systems, reduced costs, better workflows, and improved ROI. How do you go about it? That is what we\u2019ll start to address in this material.\n\r\n\nAbout the author\nInitially educated as a chemist, author Joe Liscouski is an experienced laboratory automation\/computing professional with over forty years experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.\n\n Slides\/document \n The document, which includes the slides for the original presentation, can be found on Google Drive: https:\/\/drive.google.com\/\n\nSource: <a rel=\"external_link\" class=\"external\" href=\"https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Management_%26_Planning\">https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Management_%26_Planning<\/a>\n","2016524e3c4b551c982fcfc23e33220d_images":["https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/b\/b4\/LabCourses_Transparent.png"],"2016524e3c4b551c982fcfc23e33220d_timestamp":1637252747,"00b300565027cb0518bcb0410d6df360_type":"article","00b300565027cb0518bcb0410d6df360_title":"A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work (Liscouski 2018)","00b300565027cb0518bcb0410d6df360_url":"https:\/\/www.limswiki.org\/index.php\/LII:A_Guide_for_Management:_Successfully_Applying_Laboratory_Systems_to_Your_Organization%27s_Work","00b300565027cb0518bcb0410d6df360_plaintext":"\n\nLII:A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work\nTitle: A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work\nAuthor for citation: Joe Liscouski\nLicense for content: Creative Commons Attribution-ShareAlike 4.0 International\nPublication date: January 2018\n\r\n\nLaboratory informatics involves a collection of technologies that range from sample storage management and robotics to database\/workflow management systems such as laboratory information management systems (LIMS) and electronic laboratory notebooks (ELN), with a lot of task-specific tools in
These components were designed by a number of vendors who saw specific needs and developed products to address them. Those products in turn were presented to laboratories as a means of solving their instrumental data collection and analysis, sample preparation, data management, and document management issues. With many needs and so many ways to address them, how do you go about choosing a set of products that will work for you?

That is what this set of webinars is all about. We introduce the technologies and position them for you so that you can see how they may or may not apply to your work. Then we address the very real-world topic of justifying the investment needed to put those tools to use in your laboratories.

Once that foundation has been put in place, we cover:

Technology planning and education: Planning is essential for success in this work. We look at how to go about it, who to involve, and methodologies for carrying out the work. We also look at the associated knowledge necessary to be effective.
Implementation: Informatics systems can be a challenge to implement. We look at what is needed to minimize risks and make the implementation easier, as well as the support requirements needed to manage their use in your laboratory environment.
Regulatory guidelines and compliance: We address regulatory guidelines and compliance and how they can affect every laboratory application.
The future: What developments will arise and be needed in the future? We wrap up the series with those details.

About the author

Initially educated as a chemist, author Joe Liscouski is an experienced laboratory automation/computing professional with over forty years' experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories.
His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.

Contents

1 Introduction
1.1 Introductory video
1.2 Slides and transcripts
2 Part 1: Laboratory Informatics Technologies
2.1 Webinar
2.2 Slides and transcripts
3 Part 2: Laboratory Informatics and Return on Investment
3.1 Webinar
3.2 Slides and transcripts
4 Part 3: Technology, Planning, & Education
4.1 Webinar
4.2 Slides and transcripts
5 Part 4: LIMS/LIS, ELN, SDMS, IT & Education
5.1 Webinar
5.2 Slides and transcripts
6 Part 5: Supporting Laboratory Systems
6.1 Webinar
6.2 Slides and transcripts
7 Part 6: Instrument Data Systems
7.1 Webinar
7.2 Slides and transcripts
8 Part 7: Laboratory Processes
8.1 Webinar
8.2 Slides and transcripts

Introduction

Once the scientific mission of a lab has been established, lab work is divided into three areas:

Administrative work, including report preparation
Planning experimental work, including method development and attendant documentation/validation
Experiment/procedure/task execution

Laboratory computing and automation can play a useful role in each of these areas. The effectiveness of these tools depends on preparation, planning, and the capabilities of those carrying out lab work. It is those points that need attention if we are going to transition from a collection of technologies to integrated functional systems. Similar transitions took other production environments from a succession of individual steps to highly efficient, cost-effective, and productive operations.

The purpose of these webinars, and the resources they carry with them, is to help people understand how to make that transition happen and how those tools can improve laboratory operations.

We, as a community, need to move from a series of incremental improvements to well-designed, engineered solutions to laboratory needs. That will require a change in mindset. That mindset change will lead to better science, better results, improved productivity, and more effective laboratory operations and technology management.

Introductory video

The introductory video for this series of webinars (12:30) can be found at YouTube: https://www.youtube.com/watch?v=6p_aRNTiBQM

Slides and transcripts

The slides and transcripts for this introductory video can be found on Google Drive: https://drive.google.com/open?id=15jiWwQ0PBflzjP7fvG6S48ppvJVVvmcn

Part 1: Laboratory Informatics Technologies

LIMS, ELN, scientific data management systems (SDMS), laboratory information systems (LIS), laboratory execution systems (LES), and instrument data systems are terms that have been part of laboratory discussions for years, but unless you've been an active part of those conversations, they sound like techno-babble. The purpose of this webinar is two-fold:

first, to introduce the webinar series and its objectives, and
second, to convert that babble into meaningful subjects and see how they apply (or not) to your laboratory.

Informatics is part of laboratory life: unless it is a piece of glassware, almost everything in the lab has a chip that is supposed to improve its usefulness. At the end of this webinar you will have an understanding of what the key technologies are, how they relate to each other, and how they might improve your lab's operations.
The remainder of the series will build on this background, looking at return on investment (ROI) considerations, planning, education, and IT support requirements. You'll love it.

Webinar

The webinar for this session (50:17) can be found at YouTube: https://www.youtube.com/watch?v=p6FtygPEZS4

Slides and transcripts

The slides and transcripts for this webinar can be found on Google Drive: https://drive.google.com/file/d/1iykZUk9HcNrpFrqg5ZpuHmoAQdCn-yf_/view

Part 2: Laboratory Informatics and Return on Investment

When you are considering lab informatics and automation projects, someone is going to ask "what is the return on the investment you're asking for?" How do you answer them? This second webinar in the series provides practical guidance.

The introduction of informatics/automation technologies into laboratory work requires larger investments than typical lab bench spending and involves people from outside support groups. It also brings up issues of knowledge and intellectual property management, an increasingly important corporate topic. Your ability to address these points within the ROI conversation will have a direct impact on the approval of your projects. Join us as we begin to look into these considerations:

How do you justify the expense of laboratory systems?
What results can be expected from an investment in laboratory technologies?
How do you go about setting goals that are easily understood and play into corporate concerns?
How will these investments affect other groups?

As laboratory work and the investments in it become more visible to the corporate organization, more effort will have to be put into the justification and evaluation of these expenditures. This webinar will help you form a basis for that work.

Webinar

The webinar for this session (38:31) can be found at YouTube: https://www.youtube.com/watch?v=e6SfvFEXwSk

Slides and transcripts

The slides and transcripts for this webinar can be found on Google Drive: https://drive.google.com/file/d/1az0Wq-SX1f-m4_-wVn838kgX8I6LZ3me/view
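To make the justification question concrete, a simple payback calculation is often the starting point. The sketch below uses purely hypothetical numbers (they are not from the webinar) to show the shape of the argument:

```python
# A back-of-the-envelope ROI sketch with purely hypothetical numbers;
# the webinar develops the costs and benefits in much more detail.

license_and_hardware = 120_000   # one-time purchase cost ($), assumed
implementation = 60_000          # configuration, validation, training, assumed
annual_support = 25_000          # vendor support plus internal IT, assumed

hours_saved_per_week = 40        # manual transcription/reporting eliminated
loaded_rate = 55                 # $ per technician-hour, fully loaded
annual_savings = hours_saved_per_week * loaded_rate * 50   # ~50 work weeks

initial_cost = license_and_hardware + implementation
payback_years = initial_cost / (annual_savings - annual_support)

print(f"annual labor savings: ${annual_savings:,}")   # $110,000
print(f"payback period: {payback_years:.1f} years")   # about 2.1 years
```

Even a rough model like this forces the questions the webinar raises: which savings are real, which costs recur, and how other groups are affected.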
Part 3: Technology, Planning, & Education

The first webinar introduced the subject of laboratory informatics, comparing the use of LIMS, ELN, SDMS, LIS, LES, and instrument data systems in different laboratory settings. The second looked at how you can evaluate the ROI in those technologies.

This third webinar looks at technology planning and education, and more specifically, how you can gain the benefits expected from your investment in informatics technologies. Effective planning, by people with a good understanding of laboratory technologies, is one of the key points in successfully applying informatics technologies to laboratory work. Most failures can be traced back to insufficient planning efforts.

This subject may span more than one webinar session and will look at the following:

methodologies for laboratory systems planning, including where and when to start and who should be involved;
roles of regulatory guidelines in different laboratories;
roles that laboratory management, personnel, and IT support groups play;
changes taking place in laboratory work and the backgrounds needed to be successful; and
efforts needed to support the technical work and meet ROI and performance goals.

Webinar

The webinar for this session (42:18) can be found at YouTube: https://www.youtube.com/watch?v=nyIt6j_6qzQ

Slides and transcripts

The slides and transcripts for this webinar can be found on Google Drive: https://drive.google.com/file/d/1sGeOlNLfzoJfcRXBYQBaT24GloSJ2zyB/view

Part 4: LIMS/LIS, ELN, SDMS, IT & Education

This session builds on the material covered in the previous webinars. We extend the discussion of centralized laboratory database systems (LIMS/LIS, ELN, SDMS) to look at multi-laboratory applications, as well as the use of virtual systems, local/private cloud, remote cloud, and vendor-supported remote database applications. This session also begins looking at the roles that information technology support groups can play in this work. Points covered include:

planning for multiple-laboratory use of database systems;
using local and cloud-based computing models; and
integrating IT teams so they can play a better role in the work.

Webinar

The webinar for this session (43:59) can be found at YouTube: https://www.youtube.com/watch?v=-8dkKsuEwdU

Slides and transcripts

The slides and transcripts for this webinar can be found on Google Drive: https://drive.google.com/file/d/1RFlWkKHZAi-MPqV1XjDm5vYexyO7tGld/view

Part 5: Supporting Laboratory Systems

The previous webinars have progressed from descriptions of laboratory technologies to choosing components, the financial justification for their use, and the beginnings of describing the topology and distribution of lab informatics. In the course of those sessions, we began looking at the need for outside support for the systems under consideration: what help do you need when carrying out this work, and where will it come from?

This webinar continues the development of that topic. What roles do lab personnel and corporate IT support play in identifying technologies, selecting products, implementing them, and providing for long-term support? Is there a need for additional players in this work, and what are their roles?

Why is this important to you? We're past the point where traditional IT backgrounds are sufficient to support laboratory work. The ability to effectively apply the available technologies, and to identify missing components, is outside the experience of most lab personnel and IT support, yet it is essential if we are going to meet the growing productivity and return on investment (ROI) demands of laboratory work. This presentation looks at how we develop people to fit that need.

Webinar

The webinar for this session (43:31) can be found on YouTube: https://youtu.be/1Oi6Flczt8g

Slides and transcripts

The slides and transcripts for this webinar can be found on Google Drive: https://drive.google.com/open?id=1tfNnWYgVb49FyKU5S_saPi1TZ6n9_L4Q

Part 6: Instrument Data Systems

Instrument data systems represent a critical transition in lab work: they are the place where we stop working with materials and begin working with numerical representations of those materials, where lab bench science ends and informatics begins. Problems in that transition can negate all the work leading up to it, producing results that are open to challenge.

This webinar takes a look at:

those instrument data systems;
the processes that affect your data;
the choices you face in designing an informatics architecture for your lab; and
the silo effect and the roles that lab managers, personnel, and support have in selecting and managing those technologies.

Why does this matter? We are moving toward automated science as a means of gaining higher productivity and reducing costs. If the systems we are putting in place are not well understood and planned, that automation can produce questionable results and missed opportunities.
Laboratory informatics architectures have become a major factor in the success of lab operations, and this webinar represents an important step in building that success.

Webinar

The webinar for this session (43:09) can be found on YouTube: https://www.youtube.com/watch?v=gvkTjx8veBI

Slides and transcripts

The slides and transcripts for this webinar can be found on Google Drive: https://drive.google.com/open?id=1_vMiovIWRfUrA4Wf8kTP4YQplfPWH0KH

Part 7: Laboratory Processes

In this seventh and final episode of the webinar series, we broaden our view to examine the full range of lab work, including the methods and procedures that put the "science" in lab work.

This webinar takes a look at:

how the various informatics pieces relate to each other in the process of lab work, and
what it takes to integrate them into computer-controlled, computer-assisted, and scientific production systems.

Why is this important to you? As the drive for higher productivity and improved ROI continues, laboratories need to take advantage of the full range of informatics and automation tools available to them. That includes viewing the lab's operations from a broader viewpoint, ensuring that resources are used effectively and opportunities aren't missed. This concluding session brings the entire series to a close, allowing you to apply all the concepts we've covered to your lab's work, all while understanding how regulatory guidelines come into play.

Webinar

The webinar for this session (33:10) can be found on YouTube: https://www.youtube.com/watch?v=-ppHVRoNkFE

Slides and transcripts

The slides and transcripts for this webinar can be found on Google Drive: https://drive.google.com/open?id=1ex1qRsP9nw2_rT5u34NLoFSm2-Onwn4B

Source: https://www.limswiki.org/index.php/LII:A_Guide_for_Management:_Successfully_Applying_Laboratory_Systems_to_Your_Organization%27s_Work
LII:Elements of Laboratory Technology Management

Title: Elements of Laboratory Technology Management
Author for citation: Joe Liscouski, with editorial modifications by Shawn Douglas
License for content: Creative Commons Attribution 4.0 International
Publication date: 2014

Contents

1 Introduction
2 Moving from lab functions and requirements to practical solutions
3 What is laboratory automation?
3.1 Why "process" is important
3.2 The role of processes in integration
4 The elements of laboratory technology management
4.1 Management issues
4.2 Classes of laboratory automation implementation
4.2.1 Computer-controlled experiments
4.2.2 Computer-assisted lab work/experiments
4.2.3 Scientific manufacturing
4.3 Experimental methods
4.4 Lab-specific technologies and information technology
4.5 Systems integration
4.5.1 What is an integrated system in the laboratory?
4.5.2 A brief historical note
4.5.3 The cost of the lack of progress
4.5.4 What do we need to build integrated systems?
4.6 Summary
5 Skills required for working with lab technologies
5.1 Development of manufacturing and production stages
5.2 What does this all mean for laboratory workers
6 Closing
7 Abbreviations, acronyms, and initialisms
8 Footnotes
9 About the author
10 References

Introduction

This discussion is less about specific technologies than it is about the ability to use advanced laboratory technologies effectively. When we say "effectively," we mean that those products and technologies should be used successfully to address needs in your lab, and that they improve the lab's ability to function. If they don't do that, you've wasted your money. Additionally, if the technology in question hasn't been deployed according to a deliberate plan, your funded projects may not achieve everything they could. Optimally, when applied thoughtfully, the available technologies should result in the transformation of lab work from a labor-intensive effort to one that is intellectually intensive, making the most effective use of people and resources.

People come to the subject of laboratory automation from widely differing perspectives.
To some it's about robotics, to others it's about laboratory informatics, and still others view it as simply data acquisition and analysis. It all depends on what your interests are, and more importantly, what your immediate needs are.

People began working in this field in the 1940s and 1950s, with the work focused on analog electronics to improve instrumentation; this was the first phase of lab automation. Most notable was the development of scanning spectrophotometers and process chromatographs. Those who first encountered this equipment didn't think much of it, taking it as the way the world had always been. Others, who had to deal with products like the Spectronic 20[a] (a single-beam manual spectrophotometer) and use it to develop visible spectra one wavelength measurement at a time, appreciated the automation of scanning instruments.

Mercury switches and timers triggered by cams on a rotating shaft gave chromatographs the ability to automatically take samples, actuate back-flush valves, and take care of other functions without operator intervention. This left the analyst with the task of measuring peaks, developing calibration curves, and performing calculations, at least until data systems became available.

The direction of laboratory automation changed significantly when computer chips became available. In the 1960s, companies such as PerkinElmer were experimenting with the use of computer systems for data acquisition as precursors to commercial products. The availability of general-purpose computers such as the PDP-8 and PDP-12 series (along with the Lab 8e) from Digital Equipment, with other models available from other vendors, made it possible for researchers to connect their instruments to computers and carry out experiments. The development of microprocessors from Intel (the 4004 and 8008) led to the evolution of "intelligent" laboratory equipment ranging from processor-controlled stirring hot plates to chromatographic integrators.

As researchers learned to use these systems, their application rapidly progressed from data acquisition to interactive control of the experiments, including data storage, analysis, and reporting. Today, the product set available for laboratory applications includes data acquisition systems, laboratory information management systems (LIMS), electronic laboratory notebooks (ELNs), laboratory robotics, and specialized components to help researchers, scientists, and technicians apply modern technologies to their work.

While there is a lot of technology available, the question remains: how do you go about using it? Not only do we need to know how to use it, but we also must do so while avoiding our own biases about how computer systems operate. Our familiarity with using computer systems in our daily lives may cause us to assume they are doing what we need them to do, without questioning how it actually gets done. "The vendor knows what they are doing" is a poor reason for not testing and evaluating control parameters to ensure they are suitable and appropriate for your work.

Moving from lab functions and requirements to practical solutions

Before we can begin to understand the application of the tools and technologies that are available, we have to know what we want to accomplish, specifically what problems we want to solve. We can divide laboratory functions into two broad classes: management and work execution.
Figure 1 addresses management functions, whereas Figure 2 addresses work execution functions, all common to laboratories. You can add to them based on your own experience.

Figure 1. A breakdown of management-level functions in a typical laboratory

Figure 2. A breakdown of work-level functions in a typical laboratory

Vendors have been developing products to address these work areas, and there are a lot of products available. Many of them are "point" solutions: products that are focused on one aspect of work without an effort to integrate them with others. That isn't surprising, since there isn't an architectural basis for integration aside from specific hardware systems (e.g., FireWire, USB) or vendor-specific software systems (e.g., office product suites). Another issue in scientific work is that the vendor may only be interested in solving a particular problem, with most of the emphasis on an instrument or technique. They may provide the software needed to support their hardware, with data transfer and integration left to the user.

As you work through this document, you'll find a map of management responsibilities and technologies. How do you connect the above map of functions to the technologies? Applying software and hardware solutions to your lab's needs requires deliberate planning. The days of purchasing point solutions to problems have passed. Today's lab managers need to think more broadly about product usage and how components of lab software systems work together. The point of this document is to help you understand what you need to think about in that regard.

Given those summaries of lab activities, how do we apply available technologies to improve lab operations? Most of the answers fall under the heading of "laboratory automation," so we'll begin by looking at what that is.

What is laboratory automation?

This isn't a trivial question; your answer may depend on the field you are working in, your experience, and your current interests. To some it means robotics; to others it is a LIMS (or its clinical counterpart, the laboratory information system, or LIS). The ELN and instrument data systems (IDS) are additional elements worth noting. These are examples of product classes and technologies used in lab automation, but they don't define the field. Wikipedia provides the following as a definition[1]:

Laboratory automation is a multi-disciplinary strategy to research, develop, optimize and capitalize on technologies in the laboratory that enable new and improved processes. Laboratory automation professionals are academic, commercial and government researchers, scientists and engineers who conduct research and develop new technologies to increase productivity, elevate experimental data quality, reduce lab process cycle times, or enable experimentation that otherwise would be impossible. The most widely known application of laboratory automation technology is laboratory robotics.
More generally, the field of laboratory automation comprises many different automated laboratory instruments, devices, software algorithms, and methodologies used to enable, expedite and increase the efficiency and effectiveness of scientific research in laboratories.

And McDowall offered this definition in 1993[2]:

Apparatus, instrumentation, communications or computer applications designed to mechanize or automate the whole or specific portions of the analytical process in order for a laboratory to provide timely and quality information to an organization

These definitions emphasize equipment and products, and that is where typical approaches to lab automation and the work we are doing part company. Products and technologies are important, but what is more important is figuring out how to use them effectively. The lack of consistent success in the application of lab automation technologies appears to stem from this focus on technologies and equipment ("what will this product do for my lab?") rather than methodologies for determining what is needed and how to implement solutions.

Having a useful definition of laboratory automation is crucial, since how we approach the work depends on how we see the field developing. The definition the Institute for Laboratory Automation (ILA) bases its work on is this:

Laboratory automation is the process of determining needs and requirements, planning projects and programs, evaluating products and technologies, and developing and implementing projects according to a set of methodologies that results in successful systems that increase productivity, improve the effectiveness and efficiency of laboratory operations, reduce operating costs, and provide higher-quality data.

The field includes the use of data acquisition, analysis, robotics, sample preparation, laboratory informatics, information technology, computer science, and a wide range of technologies and products from widely varying disciplines, used in the implementation of projects.

Why "process" is important

Lab automation isn't about stuff, but how we use stuff. The "process" component of the ILA definition is central to what we do. To quote Frank Zenie[b] (one of the founders of Zymark Corporation), "you don't automate a thing, you automate a process." You don't automate an instrument; you automate the process of using one. Autosamplers are a good example of successful automation: they address the process of selecting a sample vial, withdrawing fluid, positioning a needle into an injection port, injecting the sample, preparing the syringe for the next injection, and indexing the sample vials when needed.
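To make the process view concrete, the sketch below (hypothetical names; not any vendor's control software) writes that autosampler cycle down as an explicit, ordered sequence of steps. The point is that the unit of automation is the sequence, not any single device:

```python
# A minimal sketch of "you automate a process, not a thing": the
# autosampler cycle above expressed as an ordered list of steps.
# All names are hypothetical; a real controller would also verify
# that each step completed and handle faults before moving on.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Step:
    name: str
    action: Callable[[], None]

def run_process(steps: List[Step]) -> None:
    for step in steps:
        print(f"executing: {step.name}")
        step.action()

# One placeholder function per mechanical operation in the cycle.
def select_vial():     pass   # index the tray to the next vial
def withdraw_fluid():  pass   # draw an aliquot into the syringe
def position_needle(): pass   # move the needle to the injection port
def inject_sample():   pass   # push the aliquot into the instrument
def prepare_syringe(): pass   # rinse/flush for the next injection

autosampler_cycle = [
    Step("select sample vial", select_vial),
    Step("withdraw fluid", withdraw_fluid),
    Step("position needle in injection port", position_needle),
    Step("inject sample", inject_sample),
    Step("prepare syringe for next injection", prepare_syringe),
]

for vial in range(1, 4):       # repeat the whole process per vial
    print(f"-- vial {vial} --")
    run_process(autosampler_cycle)
```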
A number of people have studied the structure of science and the relationship between disciplines. Lab automation is less about science and more about how the work of science is done. Before lab automation can be considered for a project, the underlying science has to be done. In other words, the process that automation is going to be applied to must exist first. It also has to be the right process for consideration. This is a point that needs attention.

If you are going to spend resources on a project, you have to make sure you have a well-characterized process and that the process is both optimal and suitable for automation. This means:

The process is well documented, people who use the process have been trained on that documented process, and their work is monitored to detect any shortcuts, workarounds, or other variations from that documented process. Differences between the published process and the one actually in use can have a significant impact on the success of a particular project design.
The process's "readiness for automation" has been determined. The equipment used is suitable for automation, or the changes needed to make it suitable are known and can be made at reasonable cost. Any impact on warranties has been determined and found to be acceptable.
If several process options exist (e.g., different test protocols for the same test question), they are evaluated for their ability to meet the needs of the science and to be successfully implemented. Other options, such as outsourcing, should be considered to make the best use of resources; it may be more cost-effective to outsource than automate.

When looking at laboratory processes, it's useful to recognize that they may operate on different levels. There may be high-level processes that address the lab's reason for existence and cover the mechanics of how the lab functions, as well as low-level processes that address individual functions of the lab. This process view is important when we consider products and technologies, as the products have to fit the process; that fit is the general basis for product requirements. Discussions revolving around LIMS and ELNs are one example, with questions being asked about whether one or both are required in the lab, or whether an ELN can replace a LIMS.

These and other similar questions reflect both vendor influence and a lack of understanding of the technologies and their application. Some differentiate the two types of technology based on "structured" (as found in LIMS) vs. "unstructured" (as found in ELNs) data. Broadly speaking, LIMS come with a well-defined, extensible database structure, while ELNs are viewed as unstructured, since you can put almost anything into an ELN and organize the contents as you see fit. But this characterization doesn't work well either, since as a user I might consider the contents, along with an index, as having a structure. This is more of an information technology approach than one that addresses laboratory needs. In the end, an understanding of lab processes is still required to resolve most issues.

LIMS are well-defined entities[c] and the only one of the two to carry an objective industry-standard description. LIMS are also designed to manage the processes surrounding laboratory testing in a wide variety of industries (e.g., from analytical, physical, and environmental testing to clinical testing, which is usually associated with the LIS). The lab behavior process model is essentially the same across industries and disciplines. Basically, if you are running an analytical lab and need to manage samples and test results while answering questions about the status of testing on a larger scale than what you can memorize[d], a LIMS is a good answer.
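As a rough illustration of the "structured" side of that distinction, consider the minimal, hypothetical sketch below (not any vendor's schema). A LIMS imposes the same sample/test/status model on every record, which is exactly what lets it answer status questions at scale; an ELN page, by contrast, holds whatever the scientist writes:

```python
# A minimal, hypothetical sketch of "structured" LIMS data vs.
# free-form ELN content; field names are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Test:
    name: str                      # e.g., "pH", "assay by HPLC"
    result: Optional[float] = None
    status: str = "pending"        # pending -> complete

@dataclass
class Sample:
    sample_id: str                 # every sample carries the same fields
    matrix: str
    received: str                  # ISO date
    tests: List[Test] = field(default_factory=list)

samples = [
    Sample("S-0001", "water", "2021-11-01", [Test("pH"), Test("TOC")]),
    Sample("S-0002", "soil",  "2021-11-02", [Test("pH", 6.8, "complete")]),
]

# The fixed structure makes status questions trivial to answer:
backlog = [s.sample_id for s in samples
           if any(t.status != "complete" for t in s.tests)]
print("samples with incomplete testing:", backlog)   # ['S-0001']

# An ELN entry, by contrast, is whatever the scientist chooses to record:
eln_page = "2021-11-02: Ran pH on S-0002, read 6.8. Meter calibrated first."
```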
1. Scripted execution systems – These are software systems that guide an analyst in the conduct of a procedure (process); examples include Velquest (now owned by Accelrys) products and the scripting notebook function (the vendor's description) in LabWare LIMS.
2. Journal or diary systems – These are software systems that provide a record of laboratory work; a word processing system might fill this need, although there are products with features specifically designed to assist lab work.
3. Application- or discipline-specific record keeping systems – These are software systems designed for biology, chemistry, mathematics, and other areas, with features that allow you to record data and text in a variety of forms geared toward the needs of specific areas of science.
This is not an exhaustive list of forms or functionality, but it is sufficient to make a point. The first, scripted execution, is designed around a process or, more specifically, designed to give the user a mechanism to describe the sequential steps in a process so that they can be repeated under strict controls. These do not replace a LIMS but can be used synergistically with one, or with software that duplicates LIMS capabilities (some have suggested enterprise resource planning or ERP systems as a substitute). The other two types are repositories of lab information: equations, data, details of procedures, etc. There is no general underlying process as there is with LIMS. They can provide a researcher with a means of describing experiments, collecting data, and performing analyses, which you can correctly view as processes, but they are unique to that researcher or lab and not based on any generalized industry model, as we see in testing labs.
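To make the distinction concrete, a scripted execution system can be thought of as a procedure represented as data, presented to the analyst step by step, with an execution record kept under controlled conditions. The minimal Python sketch below is a hypothetical rendering of that idea, not a model of any vendor's product; the procedure text and record fields are invented for illustration:

from datetime import datetime, timezone

PROCEDURE = [
    'Weigh 1.0 g of sample into a 100 mL volumetric flask',
    'Add 50 mL of mobile phase and sonicate for 10 minutes',
    'Dilute to volume and mix',
]

def execute(procedure, analyst):
    # Walk the analyst through each step in order, recording who did
    # what and when; the record becomes an audit trail for review.
    record = []
    for number, step in enumerate(procedure, start=1):
        input('Step %d: %s  [Enter when complete] ' % (number, step))
        record.append({
            'step': number,
            'instruction': step,
            'analyst': analyst,
            'completed': datetime.now(timezone.utc).isoformat(),
        })
    return record

# log = execute(PROCEDURE, analyst='jdoe')

Because the procedure is data rather than habit, the same controlled sequence is followed on every execution, which is the point of the scripted approach.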
Why is this important? These descriptions illustrate something fundamental: process dictates needs, and needs set requirements for products and technology. The “do we need a LIMS or ELN” question is meaningless without an understanding of the processes that operate in your laboratory.

The role of processes in integration

From the standpoint of equipment, laboratories are often a collection of instruments, computer systems, sample preparation stations, and other test or measurement facilities. One goal frequently stated by lab managers is that “ideally we’d like all this to be integrated.” The purpose of integration is to streamline operations, reduce human labor, and provide a more efficient way of doing work. You are integrating the equipment and systems used to execute one or more processes. However, without a thorough evaluation of the processes in the lab, there is no basis for integration.
Highlighting the connection between processes and integration, we see why defining what laboratory automation is remains important. One definition can lead to purchasing products and limiting the scope of automation to individual tasks. Another will take you through an evaluation of how your lab works, how you want it to work, and how to produce a framework for getting you there.

The elements of laboratory technology management

If lab automation is a process, we need to look at the elements that can be used to make that process work. The first thing that is needed is a structure that shows how the elements of laboratory automation relate to each other and acts as a guide for someone coming into the field. That structure also serves as a framework for organizing knowledge about the field. The major elements of the structure are shown in Figure 3.

Figure 3. The elements of lab technology management

Why is it important to recognize these laboratory automation management elements? As automation technology sees broader appeal across multiple industries, the use of automation technologies in labs is inevitable. Vendors are putting chips and programmed intelligence into every product with the goal of making them easier to use, while reducing the role of human judgment (a source of accumulated error in tasks like pipetting) and the amount of effort people have to expend to get work done. There are few negatives to this philosophy, aside from one point: if we don’t understand how these systems are working and haven’t planned for their use, we won’t get the most benefit from them, and we may blindly accept results from systems without really questioning their suitability or the results they produce. One of the main points of this work is that the use of automation technologies should be planned and managed. That brings us to the first major element: management issues.

Management issues

The first thing we need to address is who “management” is. Unless the lab is represented solely by you, there are presumably layers of management. Depending on the size of the organization, this may mean one individual or a group of individuals who are responsible for various management aspects of the laboratory. Broadly speaking, those individuals will have a skillset that can appropriately address laboratory technology management, including project and program management, people skills, workflow modeling, and regulatory knowledge, to name a few (Figure 4). The need for these skillsets may vary slightly depending on the type of manager.

Figure 4. The skills required of managers when addressing laboratory technology management

Aside from reviewing and approving programs (“programs” are efforts that cover multiple projects), “senior management” is responsible for setting the policies and practices that govern the conduct of laboratory programs. This is part of an overall architecture for managing lab operations, and it has two components (Figure 5): setting policies and practices, and developing operational models.[e]

Figure 5. The role of managers when addressing laboratory technology management

Before the incorporation of informatics into labs, senior management’s involvement wasn’t necessary. However, the storage of intellectual property in electronic media has forced a significant change in lab work. Before informatics, labs were isolated from the rest of the organization, with formal contact made through the delivery of results, reports, and presentations. The desire for more effective, streamlined, and integrated information technology operations, and the development of information technology support groups, means that labs are now also part of the corporate picture. In organizations that have multiple labs, more efficient use of resources results in a desire to reduce duplication of work. You might easily justify two labs having their own spectrophotometers, but duplicate LIMS doing the same thing is going to require some explaining.
With the addition of informatics in the lab, management involvement is critical to the success of laboratory programs.
Among the reasons often cited for the failure of laboratory programs is a lack of management involvement and, by extension, a lack of oversight, though these failures are usually cited without clarifying what management involvement should actually look like. To be clear, senior management's role is to ensure that programs:

are common across all labs, such that all programs are conducted according to the same set of guidelines;
are well-designed, supportable, and able to be upgraded;
are consistent with good project management practices;
are conducted in a way that allows the results to be reused elsewhere in the company;
are well-documented; and
lead to successful results.

When work in lab automation began, it was usually the effort of one or two individuals in a lab or company. Today, we need a cooperative effort from management, lab staff, IT support, and, if available, laboratory automation engineers (LAEs). One of the reasons management must establish policies and practices (Figure 6) is to enable people to work together effectively, so they are working from the same set of ground rules and expectations and producing consistent and accurate results.

Figure 6. The business' policies and practices requiring focus by managers when addressing laboratory technology management

"Lab management" is responsible for understanding how their lab needs to operate in order to meet the lab's goals. Before automation became a factor, lab management's primary concern was managing laboratory personnel and helping them get their work done. In the early stages of lab automation, the technologies were treated as add-ons that assisted personnel in getting work done. Today, lab managers need to move beyond that mindset and look at how their role must shift toward planning for and managing automation systems that take on more of the work in the lab. As a result, lab managers need to take on the role of technology planners in addition to managing people. The implementation of those plans may be carried out by others (e.g., LAEs, IT specialists), but defining the objectives and how the lab will function with a combination of people and systems is a task squarely in the lap of lab management, requiring the use of operational workflow models (Figure 7) to define the technologies and products suitable for the lab's work.

Figure 7. The importance of operational workflow models for lab managers when addressing laboratory technology management

Problems can occur when different operational groups have conflicting sets of priorities. Take, for example, the case of labs and IT support. At one point, lab computing only affected the lab systems, but today's labs share resources with other groups, requiring "IT management" to ensure that work in one place doesn't adversely impact another. Most issues around elements such as networks and security can be handled through effective network design and bridges that isolate network segments when necessary. More often than not, problems are likely to occur in the choice and maintenance of products, and in the policies IT implements to provide cost-effective support. Operating system upgrades are one place issues can occur, particularly if those changes cause products used in lab work to break and the vendor is slow in responding to OS changes.
Another place issues can occur is in product selection; IT managers may want to minimize the number of vendors they have to deal with and prefer products that use the same database system deployed elsewhere. That policy may adversely limit the products the lab can choose from. The lab needs the best products in order to get its work done; the IT group sees those choices as driving up support costs. The way to avoid these issues, and others, is for senior managers to determine the priorities and keep inter-department politics out of the discussion.

Classes of laboratory automation implementation

The second element of laboratory technology management to address is laboratory automation implementation. There are three classes of implementation to address (Figure 8):

Computer-controlled experiments: This includes data collection in high-energy physics, LabVIEW-implemented systems, instrument command and control, and robotics. This implementation class involves systems where the computer is an integral part of the experiment, doing data collection and/or experiment control.
Computer-assisted lab work/experiments: This includes work that could be done without a computer, but machines and software have been added to improve the process. Examples include chromatography data systems (CDS), ELNs used for documentation, and classic LIMS.
Scientific manufacturing: This implementation class focuses on production systems, including high-throughput screening, lights-out lab automation, process analytical technologies, and quality by design (QbD) initiatives.

Figure 8. The three classes of laboratory automation implementation to consider for laboratory technology management

Computer-controlled experiments

This is where the second phase of laboratory automation began: when people began using digital computers with connections to their instrumentation. We moved in stages from simple data acquisition, acquiring a few points to show that we could accurately represent voltages from the equipment, to collecting multiple data streams over time and storing the results on disk. The next step consisted of automatically collecting the data, processing it, storing it, and reporting results from equipment such as chromatographs, spectrophotometers, and other devices.
Software development moved from assembly language programming to higher-level language programming, and then to specialized systems that provide a graphical interface to the programmer. Products like LabVIEW[f] allow the developer to use block diagrams to describe the programming and processing that have to be done, and they provide the user with an attractive user interface with which to work.
This is a far cry from embedding machine language programming in BASIC language code, as was done in some earlier PC systems.
Robots offer another example of this class of work, where computers control the movement of equipment and materials through a process that prepares samples for analysis, and may include the analysis itself.
While commercial products have taken over much of the work of interfacing, data acquisition, and data processing (in some cases, the instrument-computer combination is almost indistinguishable from the instrument itself), the ability to deal with instrument interfacing and programming is still an essential skillset for those working in research and in applications where commercial systems have yet to be developed.
It's interesting that people often look at modern laboratory instrumentation and say that everything has gone digital. That's far from the case. They may point to a single-pan balance or a thermometer with a digital readout as examples of a "digital" instrument, not realizing that the packaging contains sensors, analog-to-digital (A/D) converters, and computer control systems to manage the device and its communications. The appearance of a "digital" device masks what is going on inside; we still need to understand the transition from the analog world into the digital domain.
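That transition can be illustrated in a few lines of Python. The sketch below simulates sampling a continuous detector voltage and quantizing it with an idealized N-bit A/D converter; the signal shape, sampling rate, and converter resolution are invented for the example:

import math

def quantize(voltage, full_scale=10.0, bits=16):
    # Map a voltage in [0, full_scale] volts to an integer A/D count.
    levels = 2 ** bits
    counts = int(round((voltage / full_scale) * (levels - 1)))
    return max(0, min(levels - 1, counts))  # clip at the converter's limits

rate_hz = 10  # samples per second
samples = []
for i in range(100):
    t = i / rate_hz
    # Simulated detector output: a Gaussian peak on a small flat baseline.
    signal = 0.05 + 2.0 * math.exp(-((t - 5.0) ** 2) / 0.5)
    samples.append(quantize(signal))

Everything downstream of this point (display, storage, peak detection) works with the integer counts, not the original analog signal; that is the conversion hidden inside every "digital" instrument.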
Computer-assisted lab work/experiments

There is a difference between work that can be done by computer and work that has to be done by computer; we just looked at the latter case. There's a lot of work that goes on in laboratories that could be done, and has been done, by people, but in today's labs we prefer to do that work with the aid of automated systems. For example, the management of samples in a testing laboratory used to be done by people logging samples in and keeping record books of the work that is scheduled and has been performed. Today, that work is better managed by a LIMS (or a LIS in the clinical world). The analysis of instrumental data used to be done by hand, and now it is more commonly done by instrument data systems that are faster, more accurate, and permit more complex analysis at a lower cost.
Do robots fit in this category? One could argue they are simply doing work that was performed manually in the past. The reason we might not consider robots in this class is that in many cases the equipment used by the robots is different from the equipment used by human beings. As such, the two really aren't interchangeable. If the LIMS or instrument data system were down, people could pick up the work manually; however, that may not be the case if a robot goes offline. It's a small point, and you can go either way on this.
The key element for this class of implementation is that the use of computers—and, if you prefer, robotics—is an option and not a requirement. Yet that option improves productivity, reduces cost, and provides better quality and more consistent data.

Scientific manufacturing

This isn't so much a new implementation class as it is a formal recognition that much of what goes on in a laboratory mirrors work that's done in manufacturing; some work in quality control is so routine that it matches assembly line work of the 1960s. The major difference, however, is that a lab's major production output is knowledge, information, and data (K/I/D). The work in this category is going to expand as a natural consequence of increasing automation, and that expansion must be addressed. If this is the direction things are going to go, then we need to do it right.
Recognizing this point has significant consequences. Rather than just letting things evolve, we can take advantage of the situation and drive the processes that are appropriate for this level of automation into useful practice. This means we must:

convert laboratory methods to fully automated systems;
deliberately design and manage equipment and control, acquisition, and processing systems to meet the needs of this kind of application;
train people to work in a more complex environment than they have been accustomed to; and
build the automation infrastructure (i.e., interfacing and data standards) needed to make these systems realizable and effective, without taking on significant cost.

In short, this means opening up another dimension to laboratory work as a natural evolution of work practices. Where the lab's direction requires large sample volume processing (e.g., high-throughput screening), this is simply a reflection of reality.
If we look at quality control applications and manufacturing processes, we basically see one production process (quality control, or QC) layered on another (production), where ultimately the two merge into continuous production and testing. This is a logical conclusion to the work described by process analytical technologies and quality by design.
This doesn't diminish the science used in laboratory work; rather, it adds a level of sophistication that hasn't been widely dealt with: thinking beyond the basic science process to its implementation in a continuous automated system. This is a much more complex undertaking since (1) data will be created at a high rate, and (2) we want to be sure that this is high-quality data and not just the production of junk at a high rate.
This type of thinking is not limited to quality control work. It can be readily applied to research as well, where economical high-volume experiments can be used to support statistical experimental design methodologies and more exhaustive sample processing, as well as today's screening applications in the life sciences. It is also readily applied to environmental monitoring and regulatory evaluations. And while this kind of thinking may be new to scientific applications, it isn't new technology. Work that has been done in automated manufacturing can serve as a template for the work that has to be done in laboratory process automation.
Let's return to the first two bullets in this section: laboratory method conversion, and system management and planning. If you gave four labs a method description and asked them to automate it, you'd get four varied implementations: four groups doing the same thing independently. If we are going to turn lab automation into the useful tool it can be, we need to take a different approach: cooperative development of automated systems.
In order to be useful, a description of a fully automated system needs more than a method description. It needs equipment lists, source code, etc., in sufficient detail that you can purchase the equipment needed and put the system together expecting it to work. However, we can do better than that. In a given industry, where labs are doing the same testing on the same types of samples, we should be able to have them come together and design and test automated systems to meet the need.
Once that is done, vendors can pick up the description and build products suitable to carry out the analysis or test. The problem labs face is getting the work done at a reasonable cost. If there isn't a competitive advantage to having a unique test, labs should cooperate so that standardized modules for testing can be developed.
This changes the process of lab automation from a build-it-from-scratch mentality to the planned connection of standardized automated components into a functioning system.

Experimental methods

There is relatively little that can be said about experimental methods at this point. Aside from the clinical industry, not enough work has been done to give really good examples of intentionally designed automated systems that can be purchased, installed in a lab, and expected to function.[g] There are some examples, including ELISA robotics analysis systems from Caliper Life Sciences and Pressurized Liquid Extraction Systems from Fluid Management Systems. Many laboratory methods are designed with the assumption that people will be doing the work manually, and any addition of automation would require conversion of that method (Figure 9).

Figure 9. The consideration of a lab's experimental methods for laboratory technology management

What we need to begin doing is looking at the development of automated methods as a distinct task, similar to the manual methods published by ASTM International (ASTM), the Environmental Protection Agency (EPA), and the United States Pharmacopeia (USP), with the difference being that automation is not viewed as mimicking human actions but as well-designed and optimized systems that support the science and any associated production processes (i.e., a scientific manufacturing implementation that includes integration with informatics systems). We need to think "bigger," not limiting our vision to just the immediate task but looking at how it fits into lab-wide operations.

Lab-specific technologies and information technology

This section quickly covers the next two elements of laboratory technology management at the same time, as many aspects of these elements have already been covered elsewhere. See Figure 10 and Figure 11 for a deeper dive into these two elements.[h]

Figure 10. Reviewing the many aspects of lab-specific technologies for laboratory technology management

Figure 11. Consideration of information technologies for laboratory technology management

There is one point that needs to be made with respect to information technologies. While lab managers do not need to understand the implementation details of those technologies, they do need to be aware of the potential they offer in the development of a structure for laboratory automation implementations. Management is responsible for lab automation planning, including choosing the best technologies; in other words, management must manage the "big picture" of how technologies are used to meet their lab's purpose.
In particular, managers should pay close attention to the role of client-server systems and virtualization, since they offer design alternatives that impact the choice of products and the options for managing technology. This is one area where good relationships with IT departments are essential.
We'll be addressing these and other information technologies in more detail in other publications.

Systems integration

Systems integration is the final element of laboratory technology management, one that has been dealt with at length elsewhere.[3][4] Many of the points noted above, particularly in the management sections, demand that attention be paid to integration in order to develop systems that work well. When systems are planned, the planning needs to be done with an eye toward integrating the components, something that today's technologies are largely not yet capable of supporting (aside from those built around microplates and clinical chemistry applications). This isn't going to happen magically, nor is it the province of vendors to define it. This is a realm that the user community has to address by defining the standards and methodologies for integration (Figure 12). The planning that managers have to do as part of technology management has to be done with an understanding of the role integration plays and an ability to choose solutions that lead to well-designed integrated systems. The concepts behind scientific manufacturing depend on it, just as integration is required in any efficient production process.

Figure 12. Addressing systems integration for laboratory technology management

The purpose of integration in the lab is to make it easier to connect systems. For example, a CDS may need to pass information and data to a LIMS or ELN and then on to other groups. The resulting benefits of this ability to integrate systems include:

smoother workflow, meaning less manual effort while avoiding duplicate data entry and data entry errors, something strived for and being accomplished in production environments, including manufacturing, video production, and graphic design;
an easier path to meeting regulatory requirements, as systems with integration built in by vendors are easier to validate and maintain;
reduced cost of development and support;
a reduction in duplicate records via better data management; and
more flexibility, as integrated systems built on modular components make it easier to upgrade or update systems and meet changing requirements.

The inability to integrate systems and components through vendor-provided mechanisms results in higher development and support costs, increased regulatory burden, and a reduced likelihood that projects will be successful.

What is an integrated system in the laboratory?

Phrases like "integrated system" are used so commonly that it seems as though there should be instant recognition of what they mean. While the words may bring a concept to mind, do we all have the same concept in mind? For the sake of this discussion, the concept of an integrated system has several characteristics. First, in an integrated system, a given piece of information is entered once and then becomes available throughout the system, restricted only by access privileges. The word "system" in this case is the summation of all the information handling equipment in the lab. It may extend beyond the lab if process connections to other departments are needed. Second, the movement of materials (e.g., during sample preparation), information, and data is continuous from the start of a process through to the end of that process, without the need for human effort.
The sequence doesn't have to wait for someone to do a manual portion of the process in order for it to continue, aside from policy conditions that require checks, reviews, and approvals before subsequent steps are taken.
An integrated system should also result in a better place for personnel to work, as people would no longer be depended upon for repetitive work. After all, leaving personnel to conduct repetitive work has several drawbacks. First, people get bored and make mistakes (some minor, some not, both of which contribute to variability in results), and second, the progress of work (productivity) is dependent on human effort, which may limit the number of hours that a process can operate. More broadly, it's also a bad way of using intelligent, educated personnel.

A brief historical note

For those who are new to the field, we've been working on system integration for a long time, with not nearly as much to show for it as we'd expect, particularly when compared to other fields that have seen an infusion of computer technologies. During the 1980s, the pages of Analytical Chemistry saw the initial ideas that would shape the development of automation in chemistry. Dr. Ray Dessy's (then at Virginia Polytechnic Institute) articles on LIMS, robotics, networking, and instrument data systems (IDS) laid out the promise and expectation for electronic systems used to acquire and manage the flow of data and information throughout the lab.
That concept—the computer-integrated lab—was the basis of work by instrument and computer vendors, resulting in proof-of-concept displays and exhibits at PITTCON and other trade shows. After more than 30 years, we are still waiting for that potential to be realized, and we may not be much closer today than we were then. What we have seen is an increase in the sophistication of the tools available for lab work, including client-server chromatography systems and ELNs in their varied forms. In each case, we keep running into the same problem: an inability to connect things into working systems. The result is the use of product-specific code and workarounds for moving and parsing data streams. These are fixes, not solutions. Solutions require careful design, not just for the short-term "what do we need today" but for the long term: robust designs that permit graceful upgrades and improvements without the need to start over from scratch.

The cost of the lack of progress

Every day the scientists and technicians in your labs are working to produce the knowledge, information, and data (K/I/D) your company depends upon to meet its goals. That K/I/D is recorded in notebooks and electronic systems. How well are those systems going to support your need for access today, tomorrow, and over the next 20 or more years? That span is the minimum most companies require for guaranteed access to data.
The systems being put in place to manage laboratory K/I/D are complex. Most laboratory data management systems (i.e., LIMS, ELN, IDS) are a combination of four separate products: the hardware, the operating system, the database management system, and the application you and your staff use, each from a different company with its own product life cycle. This means that changes can occur at any of those levels, asynchronously, without consideration for the impact they have on your ability to work.
Lab managers are usually trained in the sciences and the personnel aspects of laboratory management.
They are rarely trained in technology management and planning for laboratory robotics and informatics, the tools used today to get laboratory work done and manage the results. The consequences of inadequate planning can be significant:

In January 2006, the FBI ended the LIMS project, and in March 2006 the FBI and JusticeTrax agreed to terminate the contract for the convenience of the government. The FBI agreed to pay a settlement of $523,932 to the company in addition to the money already spent on developing the system and obtaining hardware. Therefore, the FBI spent a total of $1,380,151 on the project. With only the hardware usable, the FBI lost $1,175,015 on the unsuccessful LIMS project.[5]

Other instances of problems during laboratory informatics projects include:

A 2006 Association for Laboratory Automation survey on the topic of industrial laboratory automation asked respondents to complete the statement "My company/organization's senior management feels its investment in laboratory automation has...", with the following results: succeeded in delivering the expected benefits (56%); produced mixed results (43%); not delivered the expected benefits (1%). In other words, 44% of respondents failed to see expectations fully realized.[6]
A long-circulated statistic says that some 60 percent of LIMS installations fail.[7]
The Standish Group's CHAOS Report (1995) on project failures (looking at ERP implementations) shows that over half will fail, and 31.1% of projects will be canceled before they ever get completed. Further results indicate 52.7% of projects will cost over 189% of their original estimates.[8]

From a more anecdotal standpoint, we've received a number of emails discussing the results of improperly managed projects. Stories that stand out among them include:

An anonymous LIMS customer was given a precise fixed-price quote of somewhere around $90,000 and then got hit with several hundred thousand dollars in extras after the contract was signed.
An anonymous major pharmaceutical company some years back had implemented a LIMS with a lot of customization that was generally considered to be successful, until it came time to upgrade. They couldn't do it and went back to square one, requiring the purchase of another system.
An anonymous business reported robotics system failures totaling over $700,000.
Some report vendors using customer sites as test beds for software development.
A group of three different types of labs with differing requirements tried to use the same system to reduce costs; nearly $500,000 was spent before the project was cancelled.

In addition to those costs, there are the costs of missed opportunities, project delays, departmental and employee frustration, and the fact that the problems you wanted to solve are still sitting there.
The causes of failure are varied, but most include factors that could have been avoided by making sure those involved were properly trained. Poor planning, unrealistic goals, inadequate specifications (including a lack of regulatory compliance requirements), project management difficulties, scope creep, and a lack of experienced resources can all play a part in a failed laboratory technology project. The lack of features that permit the easy development of integrated systems can also be added to that list.
That missing element can cause projects to balloon in scope, requiring people to take on work that they may not be properly prepared for, or projects that are not technically feasible, something developers don't realize until they are deeply involved in the work.
The methods people use today to achieve integration result in cost overruns, project failures, and systems that can't be upgraded or modified without significant risk of damaging the integrity of the existing system. One individual reported that his company's survey of customers found that systems were integrated in ways that prevented upgrades or updates; the coding was specific to a particular version of software, and any changes could result in scrapping the current system and starting over.
One way of achieving "integration" is similar to how one might integrate household wiring by hard-wiring all the lamps, appliances, TVs, etc. to the electrical cables. Everything is integrated, but change isn't possible without shutting off the power to everything, going into the wall and making the wiring changes, and then repairing the walls and turning things back on. When considering systems integration, that's not the model we're considering; however, from the comments we've received, it is the way people are implementing software. We're looking for the ability to connect things in ways that permit change, like the wiring in most households: plug and unplug. That level of compatibility and integration results from the development of standards for power distribution and for connections: the design of plugs and sockets for specific voltages, phasing, and polarity so that the right type of power is supplied to the right devices.
Of course, there are other ways to practically connect systems, new and old. The Universal Serial Bus (USB) standard allows the same connector to be used for connecting portable storage, cameras, scanners, printers, and other communications devices with a computer. An older example can be found with modular telephone jacks and tone dialing, which evolved to the more mobile system we have today. However, we probably wouldn't have the level of sophistication we have now if we had relied on rotary dials and hard-wired phones.
These are just a few examples of component connections that can lead to systems integration. When we consider integrating systems in the lab, we need to look at connectivity and modularity (allowing us to make changes without tearing the entire system apart) as goals.

What do we need to build integrated systems?

The lab systems we have today are not built for system-wide integration. They are built by vendors and developers to accomplish a specific set of tasks; connections to other systems are either not considered or avoided for competitive reasons. If we want to consider the possibility of building integrated systems, there are at least five elements that are needed:

Education
User community commitment
Standards (e.g., file formatting, messaging, interconnection)
Modular systems
Stable operating system environment

Education: Facilities with integrated systems are built by people trained to do it. This has been discussed within the concept of LAEs, published in 2006.[9][i] However, the educational issues don't stop there. Laboratory management needs to understand its role in technology management. It isn't enough to understand the science and how to manage people, as was the case 30 or 40 years ago.
Managers have to understand how the work gets done and what technology is used to do it. The effective use (or the unintended misuse) of technologies can have as big an impact on productivity as anything else. The science also has to be adjusted for advanced lab technologies. Method development should be done with an eye toward method execution, asking "can this technique be automated?"
User community commitment: Vendors and developers aren't going to provide the facilities needed for integration unless the user community demands them. Suppliers are going to have to spend resources in order to meet the demands for integration, and they aren't going to do this unless there is a clear market need and users force them to meet that need. If we continue with "business as usual" practices of force-fitting things together and not being satisfied with the result, where is the incentive for vendors to spend development money? The choices come down to these: you only purchase products that meet your needs for integration, you spend resources trying to integrate systems that aren't designed for it, or your labs continue to operate as they have for the last 30 years, with incremental improvements.
Standards: Building systems that can be integrated depends upon two elements in particular: standardized file formats, and messaging or interconnection systems that permit one vendor's software package to communicate with another's.
First, the output of an instrument should be packaged in an industry-standardized file format that allows it to be used with any appropriate application. The structure of that file format should be published and include the instrument output plus other relevant information such as date, time, instrument ID, sample ID (read via barcode or other mechanism), instrument parameters, etc. Digital cameras have a similar setup for their raw data files: the pixel data and the camera metadata that tell you everything about the camera used to take the shot.
In the 1990s, the Analytical Instrument Association (AIA) (now the Analytical and Life Science Systems Association) had a program underway to develop a set of file format standards for chromatography and mass spectrometry. The program made progress and was turned over to ASTM, where momentum stalled. It was a good first attempt, but there were several problems with it that bear noting. The first problem is found in the name of the standard: the Analytical Data Interchange (ANDI) standard.[10] It was viewed as a means of transferring data between instrument systems and served as a secondary file format, with the vendors' proprietary formats remaining primary. This has regulatory implications, since the Food and Drug Administration (FDA) requires storage of the primary data and that the primary data be used to support submissions. It also means that files would have to be converted between formats as they moved between systems.
A standardized file format would be ideal for an instrumental technique. Data collected from an instrument would be in that format, with the format implemented and used by each vendor. In fact, it would be feasible to have a circuit board in an instrument that would function as a network node. It would collect and store instrument data and forward it to another computer for long-term storage, analysis, and reporting, thus separating data collection and use. A similar situation currently exists with instrument vendors that use networked data collection modules.
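To make the file-format idea concrete, the following Python fragment sketches what such a self-describing data package might look like: the detector trace plus the metadata described above, along with a pointer to the sample set that gives the file its context. The field names here are illustrative assumptions, not those of ANDI, AnIML, or any published standard:

import json
from datetime import datetime, timezone

record = {
    'format_version': '1.0',
    'instrument_id': 'GC-07',
    'technique': 'gas chromatography',
    'sample_id': 'S-2021-0451',        # e.g., read via barcode
    'sample_set_id': 'TRAY-118',       # links to a sample set descriptor
    'acquired': datetime.now(timezone.utc).isoformat(),
    'parameters': {'column': 'DB-5', 'oven_temp_c': 180, 'rate_hz': 10},
    'data': [0.051, 0.049, 0.052, 0.270, 1.940, 0.310, 0.055],  # detector trace
}

with open('S-2021-0451.json', 'w') as f:
    json.dump(record, f, indent=2)

Any application that understands the published structure could read such a file, regardless of which vendor's instrument produced it.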
The issue is further complicated by the nature of analytical work. A data file is meaningless without its associated reference material: the standards, calibration files, etc., that are used to develop calibration curves and evaluate qualitative and quantitative results.
While file format standards are essential, so is a second-order description: sample set descriptors that provide a context for each sample's data file (e.g., a sample set might be a sample tray in an autosampler, and the descriptor would be a list of the tray's contents). Work is underway on the development of another standard for laboratory data: ASTM WK23265 - New Specification for Analytical Information Markup Language. Its description indicates that it does take the context of the sample—its relationship to other samples in a run or tray—into account as part of the standard description.[11]
The second problem with the AIA's program was that it was vendor-driven, with little user participation. The transfer to ASTM should have resolved this, but by that point user interest had waned. People had to buy systems, and they couldn't wait for standards to be developed and implemented. The transition from proprietary file formats to standardized formats has to be addressed in any standards program.
The third issue with the program involved standards testing. Before you ask customers to commit their work to a vendor's implementation of a standard, they should have the assurance, through an independent third party, that things work as expected.
Modular systems: The previous section notes that vendors have to assume that their software may be running in a stand-alone environment in order to ensure that all of the needed facilities are available to meet the user's needs. This can lead to duplication of functions. A multi-user instrument data system and a LIMS both need sample login. If both systems exist in the lab, you'll have two sample login systems. The issue can be compounded even further with the addition of more multi-instrument packages.
Why not break down the functionality in a lab and use one sample login module? It is simply a multi-user database system. If we were to do a functional analysis of the elements needed in a lab, with an eye toward eliminating redundancy and duplication while designing components as modules, integration would be a simpler issue. A modular approach—a system with a login module, a lab management module, and modules for data acquisition, chromatographic analysis, spectral analysis, etc.—would provide a more streamlined design, with the ability to upgrade functionality as needed. For example, a new approach to chromatographic peak detection, peak deconvolution, could be integrated into an analysis method without having to reconstruct the entire data system.
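A minimal sketch of that modular idea follows, assuming hypothetical interfaces: a single sample login module that both a LIMS and an instrument data system consult, rather than each product carrying its own login function:

class SampleLoginModule:
    # One multi-user registry of logged-in samples, shared by every
    # other module in the lab rather than duplicated in each product.
    def __init__(self):
        self._samples = {}

    def log_in(self, sample_id, submitter, tests):
        self._samples[sample_id] = {'submitter': submitter, 'tests': tests}

    def lookup(self, sample_id):
        return self._samples.get(sample_id)

login = SampleLoginModule()  # a single instance serves the whole lab
login.log_in('S-0001', 'jdoe', ['assay', 'moisture'])

# Both the LIMS module and the instrument data system module consult
# the same record, so the sample is logged in exactly once:
assert login.lookup('S-0001')['tests'] == ['assay', 'moisture']

A real implementation would sit on a multi-user database rather than in memory, but the design point is the same: one function, one module, many consumers.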
When people talk about modular applications, the phrase "LEGO-like" comes to mind. It is a good illustration of what we'd like to accomplish. The easily connectable blocks and components can be structured into a wide variety of items, all based on a simple standardized connection concept. There are two differences that we need to understand. First, with LEGOs, almost everything connects; in the lab, connections need to make sense. Second, LEGOs are a single-vendor solution; unless you're the vendor, that isn't a good model. A LEGO-like multi-source model (including open source) of well-structured, well-designed, and well-supported modules that could be connected or configured by the user would be an interesting approach to the development of integrable systems.
Modularity would also be of benefit when upgrading or updating systems. With functions distributed over several modules, the amount of testing and validation needed would be reduced. It should also be easier to add functionality. This isn't some fantasy; this is what laboratory automation engineering looks like when you address the entire lab environment rather than implementing products task-by-task in isolation.
Stable operating system environment: The foundation of an integrated system must be a stable operating environment. Operating system upgrades that require changes in application coding are disruptive and lead to a loss of performance and integrity. It may be necessary to forgo the bells and whistles of some commercial operating systems in favor of open-source software that provides the required stability. Upgrades should be improvements in quality and functionality, where the change in functionality has a clear benefit to the user.
The elements noted above are just introductory commentary; each could fill a healthy document by itself. At some point, these steps are going to have to be taken. Until they are, and they result in tools you can use, labs—your labs—are going to be committing the results of your work to products and formats you have little control over. That should not be an acceptable situation; the use of proprietary file formats that limit your ability to work with your company's data should end and be replaced with industry-standard formats that give you the flexibility to work as you choose, with whatever products you need.
We need to be deliberate in how we approach this problem. When discussing file format standards, it was noted that the data file for a single sample is useless by itself. If you had the file for a chromatogram, for instance, you could display it and look at the conditions used to collect it; however, interpretation requires data from other files, so standards for file sets have to be developed. That wasn't a consideration in the original AIA work on chromatography and mass spectrometry (though it was in work done on the Atomic Absorption, Emission and Mass Spectroscopy Data Interchange Specification standards for the Army Corps of Engineers, 1995).
The first step in this process is for lab managers and IT professionals to become educated in laboratory automation and what it takes to get the job done. The role of management can't be overstated; they have to sign off on the direction work takes and support it for the long haul. The education needs to focus on the management and implementation of automation technologies, not just the underlying science. After all, it is the exclusive focus on the science that leads to the silo-like implementations we have today. The user community's active participation in the process is central to success, and unless that group is educated in the work, the effect of that participation will be limited.
Second, we need to renew the development of industry-standard file formats, not just from the standpoint of encapsulating data files, but with formats that ensure the data is usable.
The initial focus for each technique needs to be a review of how laboratory data is used, particularly with the advent of hyphenated techniques (e.g., gas chromatography–mass spectrometry, or GC-MS); that review should then be used as a basis for defining the layers of standards needed to develop a usable product. This is a complex undertaking, but it is worth the effort. If you're not sure, consider how much your lab's data is worth and the impact of its loss.
In the short term, we need to start pushing vendors—you have the buying power—to develop products with the characteristics needed to allow you to work with and control the results of your lab's work. Products need to be developed to meet your needs, not the vendor's. Product criteria need to be set with the points above in mind, not on a company-by-company basis but as a community; you're more likely to get results with a community effort.
Overcoming the barriers to the integration of laboratory systems is going to take a change in mindset on the part of lab management and those working in the labs. That change will result in a significant evolution in the way labs work, yielding higher productivity and a better working environment, with an improvement in the return on your company's investment in your lab's operations. Laboratory systems need to be designed to be effective. The points noted here are one basis for that design.

Summary

That is a brief tour of what the major elements of laboratory technology management look like right now. The diagrams will change, and details will be left to additional layers to keep the structure easy to understand and use. One thing that was sacrificed in order to facilitate clarity is the relationship between technologies. For example, a robotics system might use data acquisition and control components in its operations, which could be noted by a link between those elements.
There is room for added complexity in the map. Someone may ask where bioinformatics or some other subject resides. That, as well as other points—and there are a number of them—would be addressed in successive levels, giving the viewer the ability to drill down to whatever level of detail they need. The best way to view this is as an electronic map that can be explored by clicking on subjects for added information and relationships.
An entire view of the diagram of the elements of laboratory technology can be found here (as a PDF).

Skills required for working with lab technologies

While this subject could arguably have been discussed in the management section above, we needed to wait until the major elements were described before taking up this critical point. In particular, we had to address the idea behind "scientific manufacturing."
Lab automation has an identity problem. Many people don't recognize it as a field. It appears to be a collection of products and technologies that people can use as needed. Emphasis has shifted from one technology to another depending on what is new, hot, or interesting, with conferences and papers discussing that technology until something else comes along. Robotics, LIMS, and neural networks have all had their periods of intense activity, and now the spotlight is on ELNs, integration, and paperless labs.
Lab automation needs to be addressed as a multi-disciplinary field, working across all scientific disciplines, by lab personnel, consultants, developers, and those in IT support groups.
That means addressing three broad groups of people: scientists and technicians (i.e., the end users), LAEs (i.e., those designing and implementing systems for the end users), and the technology developers (Figure 13).

Figure 13. Groups that need to be addressed when discussing laboratory automation

Discussions concerning lab automation and the use of advanced technologies in lab work are usually conducted from the standpoint of the technologies themselves: what they are, what they do, benefits, etc. Missing from these conversations is an appreciation of the ability of those scientists and technicians—the end users—in the lab to use these tools, and of how the tools will change the nature of laboratory work.
The application of analog electronic systems to laboratory work began in the early part of the twentieth century. For the most part, those systems made it easier for a scientist to make measurements. Recording spectrophotometers replaced wavelength-by-wavelength manual measurements, and process chromatographs automated sample taking, back-flush valves, attenuation changes, etc. They made it easier to collect measurements but did not change the analyst's job of data analysis. After all, analysts still had to look at each curve or chromatogram, make judgments, and apply their skills to making sense of the experiment. At this point, scientists were in charge of executing the science, while analog electronics made the science easier to deal with.
When processor-based systems were added to the lab's tool set, things moved in a different direction. The computers could then perform the data acquisition, display, and analysis. This left the science to be performed by a program, with the analyst able to adjust the behavior of the program by setting numerical parameters. This represents a major departure in the nature of laboratory work, from the scientist being completely responsible for the execution of lab procedures to allowing a computer-based system to take over control of all or a portion of the work.
For many labs, the use of increasingly sophisticated technologies is just a way for individuals to do tasks better, faster, and at less cost. In others, the technology takes over a task and frees the analyst to do other things. We've been in a slow transition from people driving work to technology driving work. As the use of lab technologies moves further into automation, the practice of laboratory work is going to change substantially until we get to the point where scientific manufacturing and production is the dominant function: automation applied from sample acceptance to the final test or experimental result.

Development of manufacturing and production stages

We can get a sense of how work will change by looking at the development of manufacturing and production systems, where we see a transition from manual methods to fully automated production, in the end driven by the same issues as laboratories: a need or desire for high productivity, lower costs, and improved and consistent product results. The major difference is that in labs, the "product" isn't a widget; it is information and data. In product manufacturing, we also see a reduction in manpower as a goal; in labs, it is a shift from manual effort to using that same energy to understand data and improve the science.
One significant benefit of a shift to automation is that lab staff will be able to redesign lab processes—the science behind lab work—to function better in an automated environment; most of the processes and equipment in place today assume manual labor and are not well-designed for automated control.
That said, we're going to look, by analogy, at a set of manufacturing and production stages associated with woodworking, e.g., making the components of door frames or moldings. The trim you see on wood windows and the framing on cabinet doors are examples of shaping wood. When we look at the stages of manufacturing or production, we have to consider five attributes of that production effort: relative production cost, productivity, required skills, product quality, and flexibility.
Initially, hand planes were used to remove wood and form trim components. Multiple passes were needed, each deepening the grooves and shaping the wood. It took practice, skill, and patience to do the work well and avoid waste. This was the domain of the craftsman, the skilled woodworker, who represents the first stage of production evolution. In terms of our evaluation, Figure 14 shows the characteristics of this first stage (we'll fill in the table as we go along).

Figure 14. The first stage of an evolving manufacturing and production process, using woodworking as an example

The next stage sees the craftsman turn to hand-operated, electric motor-driven routers to shape the wood. Instead of multiple passes with a hand plane, a motor-driven set of bits removes material, leaving the finished product. A variety of cutting bits allows the craftsman to create different shapes. For example, a matching set of bits may be specially designed for the router to create the interlocking rails and stiles that frame cabinet doors.
Figure 15 shows the impact of this equipment on this second evolutionary stage of production. While still available to the home woodworker, the use of this equipment implies that the craftsman is going to be producing the shaped wood in quantity, so we are moving beyond the level of production found in a cottage industry to the seeds of a growth industry. The cost of good-quality routers and bits is modest, though using them effectively requires an investment in skill development. Used well (and safely), they can produce good products; they can also produce a lot of waste if the individual isn't properly schooled.

Figure 15. The second stage of an evolving manufacturing and production process, using woodworking as an example

The third stage sees automated elements work their way into the woodworking mix, with the multi-headed, numerically controlled router. Instead of one hand-operated router-bit combination, there are four router-bit assemblies directed by a computer program to follow a specific path, such that highly repeatable and precise cuts can be made. Of course, with the addition of multiple heads and software, the complexity of the product increases.
Figure 16 shows the impact of this equipment on the third evolutionary stage of production. We've moved from the casual woodworker to a full production operation. The cost of the equipment is significant, and the operators—both the program designer and the machine operator—have to be skilled in the use of the equipment to reduce mistakes and wasted material.
The "Less Manual Skill" notation under "Skills" indicates a transition point where we have moved almost entirely from the craftsman or woodworker to the skilled operator, requiring different skill sets than previous production methods. One side effect of higher production rates is that if you make a design error, you can turn out out-of-specification product rather quickly.

Figure 16. The third stage of an evolving manufacturing and production process, using woodworking as an example

From there, it's not a far jump to the final stage: the fully automated assembly line. Its inclusion completes the chart that we've been developing (Figure 17).

Figure 17. The fourth and final stage of an evolving manufacturing and production process, using woodworking as an example

Taking the information from Figure 17, we can summarize the entire progression as follows (Figure 18):

Figure 18. All stages of a woodworking manufacturing and production process, and their attributes, summarized

When we look at that summary, we can't help but notice that it translates fairly well when we replace "woodworking" with "laboratory work," moving from the entirely manual processes of the skilled technician or scientist to the fully automated scientific manufacturing and production process of the skilled operator or system supervisor. We visualize that in Figure 19. (The image in the last column of Figure 19 is of an automated extraction system from Fluid Management Systems, Watertown, MA.)

Figure 19. All four stages of an evolving scientific manufacturing and production process, this time using lab work

What does this all mean for laboratory workers?

The skills needed to work in a modern lab have changed significantly, and they will continue to change as automation takes hold. We've seen these changes occur already; clinical chemistry, high-throughput screening (HTS), and automated bioassays using microplates are some examples.
The discussion here mirrors the development in the woodworking example. We'll look at the changes in skills using chromatography as an example. The following material is applicable to any laboratory environment, be it electronics, forensics, physical properties testing, etc.; chromatography is used because of its wide application in lab work.

Stage 1: The analyst using manual methods

By "manual methods" we mean 100% manual work, including having the chromatographic detector output (an analog signal) recorded on a standard strip chart recorder and the pen trace analyzed by the hand, eye, and skill of the analyst. The process begins with the analyst finding out what samples need to be processed, finding those samples, and preparing them for injection into the instrument. The instrument has to be set up for the analysis, which includes installing the proper columns, adjusting flow rates, confirming component temperatures, and making sure that the instrument is working properly.
As each injection is made, the starting point for the data, a pen trace of the analog signal, is noted on the strip chart. This process is repeated for each sample and reference standard. Depending on the type of analysis, each sample's data may take up to several feet of chart paper.
The recording is a continuous trace and a faithful representation of the detector output, without any filtering aside from attenuator adjustments (range selections to keep the signal within the limits of the paper; some peaks may peg the pen at the top of the chart because of their size, in which case that data is lost) and electrical or mechanical noise reduction.
When all the injections have been completed, the analyst begins the evaluation of each sample's data. That includes:

inspecting the chromatogram for anomalies, including peaks that weren't expected (possible contaminants), separations that aren't as clear as they should be, noise, baseline drift, and any other unusual conditions that would indicate a problem with that sample or the entire run of samples;
taking the measurements needed for qualitative and/or quantitative analysis;
developing the calibration curves; and
making the calculations needed to complete the analysis.

The analysis would also include any in-process control samples, as well as addressing issues with problem samples. The final step would be the administrative work, including checks of the work by another analyst, reporting results, and updating work request lists.

Stage 2: Point automation, applied to specific tasks

The next major development in this evolution is the introduction of automated injectors. Instead of the analyst spending the day injecting samples into the instrument's injection port, a piece of equipment does it, ushering in the first expansion of the analyst's skill set. (In the previous stage, the analysis time is generally too short to allow the analyst to do anything else, so the analyst's day is spent injecting and waiting.) Granted, this doesn't represent a major change, but it is a change. It requires the analyst to confirm that the samples and standards are in the right order, that the right number of injections per sample is set, and that duplicate vials are put in the tray (duplicate injections are used to confirm that problems don't occur during the injection process). The analyst also has to ensure that the auto-injector is connected to the strip chart recorder so that the injection timing mark is made automatically.
This simple change of adding an auto-injector to the process has some impact on the analyst's skill set. The same holds true for the use of automatic integrators and sample preparation systems; in addition to understanding the science, the lab work takes on the added dimension of managing systems, trading labor for systems supervision with a gain in productivity.

Stage 3: Sequential-step automation

The addition of data systems to the sample analysis process train (from worklist generation to sample preparation to instrument analysis sequence) further reduces the amount of work the analyst does in sample analysis and changes the nature of the work performed. Starting with simple integrators and moving on to advanced computer systems, the data system works with the auto-injector to start the instrument analysis phase of work, acquire the signal from the detector, convert it to digital form, process that data (e.g., peak detection, area and peak height calculations, retention time), and perform the calculations needed for quantitative analysis.
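To make that stage concrete, the following is a minimal sketch, in Python, of the kind of computation an instrument data system performs on a digitized detector signal. The fixed threshold, flat baseline, and single-point response factor are simplifying assumptions for illustration, not a description of any vendor's algorithm.

```python
# Minimal sketch of data-system peak processing on a digitized detector
# signal. Assumes a flat baseline and a fixed detection threshold; real
# chromatography data systems use far more sophisticated peak and
# baseline models. A peak still in progress at the end of the trace is
# ignored for simplicity.

def find_peaks(signal, times, baseline=0.0, threshold=5.0):
    """Return (retention_time, height, area) for each region where the
    signal rises above baseline + threshold."""
    peaks = []
    start = None
    for i, y in enumerate(signal):
        above = (y - baseline) > threshold
        if above and start is None:
            start = i                      # peak start detected
        elif not above and start is not None:
            region = range(start, i)
            apex = max(region, key=lambda j: signal[j])
            # Trapezoidal integration of the baseline-corrected signal
            area = sum((signal[j] - baseline + signal[j + 1] - baseline) / 2.0
                       * (times[j + 1] - times[j]) for j in region)
            peaks.append((times[apex], signal[apex] - baseline, area))
            start = None
    return peaks

def amount(area, std_area, std_amount):
    """Single-point quantitation against a reference standard's response."""
    return area * (std_amount / std_area)
```

Even this toy version makes an important point: the threshold is exactly the sort of numerical control parameter discussed throughout this piece. Change it and peaks merge, split, or disappear, so the parameter set deserves the same scrutiny as the chemistry.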
The net effect of this stage is less work with higher productivity. While systems like this are common in labs today, there are problems, which we'll address shortly.

Stage 4: Production-level automation (scientific manufacturing and production)

Depending on what is necessary for sample preparation, it may not be much of a stretch to have automated sample prep, injection, data collection, analysis, and reporting (with automated updates into a LIMS) performed in a small footprint with equipment available today. One vendor has an auto-injection system that is capable of dissolving material, performing extractions, mixing, and barcode reading, among other functions. Connect that to a chromatograph and data station, with a programmed connection to a LIMS, and you have the basis of an automated sample preparation–chromatographic system. However, there are some issues that have to be noted and addressed.
The goal of such a system has to be high-volume, automated sample processing with the generation of high-quality data. The intent is to reduce the amount of work the analyst has to perform, ideally so that the system can run unattended. Note that "high-quality" in this case means having a high level of confidence in the results. There is more to that than the ability to do calculations for quantitative analysis or having a validated system; you have to validate the right system.
Computer systems used in chromatographic analysis can be tuned to control how peaks are detected, what is rejected as noise, and how separations are identified so that baselines can be properly drawn and peak areas allocated. The analyst needs to evaluate the impact of these parameters for each analytical procedure and make sure that the proper settings are used.
As previously noted regarding manual processes, the inspection of the chromatogram for elements that don't match the expectations for a well-characterized sample (the number of peaks that should be there, the type of separation between peaks, etc.) is vital. This screening has to be applied to every sample, whether by human eye or automated system, the latter giving lower labor costs and higher productivity. If we are going to build fully automated production systems, we have to be able to describe a screening template that is applied to every sample to either confirm that the sample fits the standard criteria or note it for further evaluation. That "further evaluation" may be frustrated if the data system does not keep sufficient data for it, requiring the sample to be rerun.
The data acquired by the computer system undergoes several levels of filtering and processing before you see the final results. The sampling algorithms don't give us the level of detail found in the analog chromatogram, and the visual display of a chromatogram is going to be limited by the data collected and the resolution of the display. The stair-stepping of a digitized chromatogram is an example of that, whereas an analog chromatogram is a smooth line. Small details and anomalies that could be evidence of contamination may be missed because of the processing.
Additionally, the entire process needs to be continually monitored and evaluated to make sure that it is working properly. This is process-level statistical quality control, and it includes options for evolutionary operations updates: small changes to improve process performance. Standard samples have to be run to test the system.
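As a rough illustration of that monitoring, here is a minimal sketch that applies conventional Shewhart-style control limits to the results of a recurring standard sample. The three-sigma action and two-sigma warning limits are common statistical process control conventions, used here as assumptions rather than a prescription; the expected value, sigma, and result history are hypothetical.

```python
# Minimal sketch of process-level statistical quality control on a
# recurring standard sample. Assumes the standard's expected value and
# historical standard deviation are already established; the 3-sigma
# action and 2-sigma warning limits are conventional choices.

def check_standard(result, expected, sigma):
    """Classify one standard-sample result against control limits."""
    deviation = abs(result - expected)
    if deviation > 3 * sigma:
        return "action: stop and investigate the process"
    if deviation > 2 * sigma:
        return "warning: watch for a developing trend"
    return "in control"

history = [100.2, 99.8, 100.5, 102.9, 103.4]   # hypothetical assay results
for r in history:
    print(r, "->", check_standard(r, expected=100.0, sigma=1.0))
```

In a production setting, the limits would be derived from the lab's own standard-sample history and supplemented with trend rules, rather than set by hand as they are here.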
If screening templates are used, samples designed to exhibit problems have to be run to trigger those templates and make sure that problem samples are detected. These in-process checks have to include every phase of the process and be able to evaluate all potential risks. The intent is to build confidence in the data by building confidence in the system used to produce it.
The goal of higher productivity can be achieved for sample processing, but in doing so, the work of the analyst will change from carrying out a procedure to managing and continuously tuning a system that is doing the work for them. The science has to be well understood, as does the implementation of that science. As we shift to more automation and use analytical techniques to monitor in-process production systems, more emphasis has to be placed on characterizing all the assumptions and possible failure points of the technique, and on building in tests to ensure that the data being used to evaluate and control a production process is sound.
During the development of the process described above, the analyst has to first determine the sequence of steps, demonstrate that they work as expected, and prepare the documentation needed to support the process and guide someone through its execution. This includes the details of how the autosampler programming is developed, stored, and maintained. The same holds true for the data system parameters, screening templates, and processing routines. This is process engineering. Then, when the system is in use, the analyst has to ensure that the proper programming is loaded into each component and that it is set up and ready for use.
This is a very simple example of what is possible and an illustration of the changes that could occur in the work of lab professionals. The throughput of a system such as the one described could be doubled by implementing a second process stream without significantly increasing the analyst's workload.
The key element is the skill level of those working in the lab (Figure 20): are they capable of meeting the challenge? Much of what has been described is process engineering, and there are people in manufacturing and production who are good at that. We need to combine process engineering skills with the science. Developing automation teams is one approach, but no matter how you address the idea, those working in labs need an additional layer of skills beyond what they have been exposed to in formal education settings.

Figure 20. The skills and education required at the various stages of the lab-based scientific manufacturing and production process

Of the sciences, clinical chemistry has moved the furthest into the advanced application of laboratory automation (Figure 20), transforming lab work in the process and moving lab staff from executing procedures manually to managing systems. The following quote from Diana Mass of Associated Laboratory Consultants[j] helps delineate the difference in lab work styles:

What I have observed is that automation has replaced some of the routine repetitive steps in performing analysis; however, the individual has to be even more knowledgeable to troubleshoot sophisticated instrumentation.
Even if the equipment is simple to operate, the person has to know how to evaluate quality control results and have a quality assurance system in place to ensure quality test information.

And here's a quote from Martha Casassa, Laboratory Director of Braintree Rehabilitation Hospital[k], who has experience in both clinical and non-clinical labs:

Having a background in both clinical (as a medical technologist) and non-clinical (chemistry major and managing a non-clinical research lab) settings, I can attest to the training and education being different. I was much more prepared coming through the clinical experience to handle automation and computers, the subsequent troubleshooting and repair necessary, and the maintenance and upkeep of the systems. During my non-clinical training, the emphasis was not so much on theory as practical application in manual methods. I learned assays on some automated equipment, but that education was more about obtaining an end product than really understanding the system and how it produced that product. On the clinical side, I learned not only how to get the end product but also the way it was produced, so I could identify issues sooner, produce quality results, and troubleshoot more effectively.

The bottom line is simple: if people are going to be effective working in modern labs, they need to understand both the science and the way science is done using the tools of lab automation. We have a long way to go before we get there. A joint survey by the ILA and the Lab Managers Association[l] yielded the following findings:

Lab automation is essential for most labs, but not all.
The skill set necessary to work with automation has changed significantly.
Entry-level scientists are generally capable of working with the hardware and software.
Entry-level technicians often are not.
In general, applicants for positions are not well qualified to work with automation.

How well educated in the use of automated systems are those working in the lab? The following text was used earlier: "When processor-based systems were added to the lab's tool set, things moved in a different direction. The computers could then perform the data acquisition, display, and analysis. This left the science to be performed by a program, with the analyst able to adjust the behavior of the program by setting numerical parameters."
Let's follow that thought down a different path. In chromatography, those numerical parameters were used to determine the start and end of a peak, and how baselines were drawn. In some cases, an inappropriate set of parameters would reduce a data set to junk. Do people understand what the parameters are in an IDS and how to use them? Many industrial labs and schools have people using an IDS with no understanding of what is happening to their data. Others, such as Hinshaw and Stevenson et al., have commented on this phenomenon in the past:

Chromatographers go to great lengths to prepare, inject, and separate their samples, but they sometimes do not pay as much attention to the next step: peak detection and measurement ...
Despite a lot of exposure to computerized data handling, however, many practicing chromatographers do not have a good idea of how a stored chromatogram file—a set of data points arrayed in time—gets translated into a set of peaks with quantitative attributes such as area, height, and amount.[12]

At this point, I noticed that the discussion tipped from an academic recitation of technical needs and possible solutions to a session driven primarily by frustrations. Even today, the instruments are often more sophisticated than the average user, whether he/she is a technician, graduate student, scientist, or principal investigator using chromatography as part of the project. Who is responsible for generating good data? Can the designs be improved to increase data integrity?[13]

In yet another example, at the European Lab Automation 2012 meeting, one liquid-handling equipment vendor gave a presentation on how improper calibration and use of liquid handling systems can yield poor data.[14] Discussions with other vendors supported that point, citing poor training as the cause.
One of the problems that has developed is "push-button science," or, to be more precise, the execution of tasks by pushing a button: put the sample in, push a button to get the measurements, and get a printout. Measurements are being made, but those using the equipment don't understand what is being done, or whether it is being done properly, exemplifying a "trust the vendor" or "the vendor is the expert" mindset. Those points run against the concepts of validation described by the U.S. Food and Drug Administration (FDA), the International Organization for Standardization (ISO), and others. People need to approach equipment with a healthy skepticism, not assuming that it is working but being able to demonstrate that it is working. Trusting the system is based on experience and proof that the system works as expected, not assumptions.
Education on the use of current lab systems is sorely needed in today's working environment. What will the needs be in the future as we move from people using equipment in workstations to the integrated workflows of scientific manufacturing and production processes?

Closing

The direction of laboratory automation has changed significantly over time, and with it so has the work associated with a lab. However, many have looked at that automation as just a means to an end, when in reality laboratory automation is a process, like most any other in a manufacturing and production setting. As a process, we need to look at the elements that can be used to make that process work effectively: management issues, implementation issues, experimental methods, lab-specific technologies, broader information technologies, and systems integration issues. Addressing those early, as part of a planning process, better ensures successful implementation of automation in the lab setting.
But there's more to it than that: the personnel doing the work must have the skills and education to fully realize lab automation's benefits. In fact, it's about more than the scientists and technicians doing the lab work; it's also about those designing and implementing the technology in the lab, as well as the vendors actually making the components. Everyone must be communicating ideas, making suggestions, and working together in order to make automation work well in the lab.
But the addition of that technology does not automatically mean a more efficient and cost-effective lab; realizing those gains requires knowledge of production and of how the underlying technology actually works. When implemented well, automation may come with lower costs and better productivity, but it also comes with a demand for higher-level, specialized skills. In other words, lab personnel must understand both the science and the way science is done when using the tools of laboratory automation.
We need lab personnel who are competent users of modern lab instrumentation systems, robotics, and informatics (LIMS, ELNs, SDMS, CDS, etc.), the tools used to do lab work. They should understand the science behind the techniques and how the systems are used in their execution. If a computer system is used to do data capture and processing, that understanding includes:

how the data capture is accomplished,
how the data is processed,
what the control parameters are and how the current set in use was arrived at (not "that's what came from the vendor"), and
how to detect and correct problems.

They should also understand statistical process control so that the behavior of automated systems can be monitored, with potential problems detected and corrected before they become significant. Rather than simply being part of the execution of a procedure, they manage the process. We're talking about LAEs.
Those LAEs must be capable of planning, implementing, and supporting lab systems, as well as developing products and technologies for labs.[m] This type of knowledge isn't limited to the people developing systems; those supporting them also require these capabilities. After all, the implementation of laboratory systems is an engineering program and should be approached in the same manner as any systems development activity.
The use of advanced technology products isn't going to improve until we have people who are fully competent to work with them, understand their limitations, and drive vendors to create better products.

Abbreviations, acronyms, and initialisms

AIA: Analytical Instrument Association
CDS: Chromatography data system
ELN: Electronic laboratory notebook
FDA: Food and Drug Administration
GC-MS: Gas chromatography–mass spectrometry
IDS: Instrument data system
ISO: International Organization for Standardization
K/I/D: Knowledge, information, and data
LAE: Laboratory automation engineering (or engineer)
LIMS: Laboratory information management system
LIS: Laboratory information system
SDMS: Scientific data management system

Footnotes

[a] The Spectronic 20 was developed by Bausch & Lomb in 1954 and is currently owned and marketed in updated versions by ThermoFisher.
[b] Mr. Zenie often introduced robotics courses at Zymark with that statement.
[c] See ASTM E1578-06.
[d] This is not a recommended method.
[e] This topic will be given light treatment in this work, but will be covered in more detail elsewhere.
[f] LabVIEW is a product of National Instruments; similar products are available from other vendors.
[g] Having a data system connected to, or in control of, a process is not the same as full automation. For example, there are automated Karl Fischer systems (for water analysis), but they only address titration activities and not sample preparation. A vendor can only take things so far in commercial products unless labs describe a larger role for automation, one that will vary by application.
The point is we need a formalized description of that larger context.
[h] For more information about lab-specific technologies, please refer to Computerized Systems in the Modern Laboratory: A Practical Guide.
[i] You can also find the expanded version of the paper Are You a Laboratory Automation Engineer? here.
[j] Formerly Professor and Director of the Clinical Laboratory Sciences Program, Arizona State University. Quote from private communications, used with permission.
[k] Also from private communications, used with permission.
[l] Originally available on the ILA site, but no longer available.
[m] See Are You a Laboratory Automation Engineer? for more details.

About the author

Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked and consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.

References

1. "Laboratory automation". Wikipedia. Archived from the original on 05 December 2014. https://en.wikipedia.org/w/index.php?title=Laboratory_automation&oldid=636846823
2. McDowall, R.D. (1993). "A Matrix for the Development of a Strategic Laboratory Information Management System". Analytical Chemistry 65 (20): 896A–901A. doi:10.1021/ac00068a725
3. Trigg, J. (2014). "The Integrated Lab". Archived from the original on 18 December 2014. https://web.archive.org/web/20141218063422/http://www.theintegratedlab.com/
4. Liscouski, J. (2012). "Integrating Systems". Lab Manager 7 (1): 26–9. https://www.labmanager.com/computing-and-automation/integrating-systems-17595
5. Office of the Inspector General (June 2006). "Executive Summary". The Federal Bureau of Investigation's Implementation of the Laboratory Information Management System - Audit Report 06-33. https://oig.justice.gov/reports/FBI/a0633/index.htm
6. Hamilton, S.D. (2007). "2006 ALA Survey on Industrial Laboratory Automation". SLAS Technology 12 (4): 239–46. doi:10.1016/j.jala.2007.04.003
7. Scientific Computing World (September/October 2002). "Choosing the right client". Scientific Computing World. Archived from the original on 19 October 2007. https://web.archive.org/web/20071019002613/http://www.scientific-computing.com/features/feature.php?feature_id=132
8. "The CHAOS Report" (PDF). Standish Group International, Inc. 1995. https://www.csus.edu/indiv/r/rengstorffj/obe152-spring02/articles/standishchaos.pdf
9. Liscouski, J.G. (2006). "Are You a Laboratory Automation Engineer?". SLAS Technology 11 (3): 157–62. doi:10.1016/j.jala.2006.04.002
10. Julian, R.K. (22 July 2003).
\"Analytical Data Interchange\". SourceForge. Archived from the original on 04 August 2003. https:\/\/web.archive.org\/web\/20030804151150\/http:\/\/andi.sourceforge.net\/ .   \n \n\n\u2191 \"ASTM WK23265 - New Specification for Analytical Information Markup Language\". ASTM International. Archived from the original on 13 August 2013. https:\/\/web.archive.org\/web\/20130813072546\/https:\/\/www.astm.org\/DATABASE.CART\/WORKITEMS\/WK23265.htm .   \n \n\n\u2191 Hinshaw, J.V. (2014). \"Finding a Needle in a Haystack\". LCGC Europe 27 (11): 584\u201389. https:\/\/www.chromatographyonline.com\/view\/finding-needle-haystack-0 .   \n \n\n\u2191 Stevenson, R.L.; Lee, M.; Gras, R. (1 September 2011). \"The Future of GC Instrumentation From the 35th International Symposium on Capillary Chromatography (ISCC)\". American Laboratory. https:\/\/americanlaboratory.com\/913-Technical-Articles\/34439-The-Future-of-GC-Instrumentation-From-the-35th-International-Symposium-on-Capillary-Chromatography-ISCC\/ .   \n \n\n\u2191 Bradshaw, J.T. (30 May 2012). \"The Importance of Liquid Handling Details and Their Impact on Your Assays\" (PDF). European Lab Automation Conference 2012. Artel, Inc. https:\/\/d1wfu1xu79s6d2.cloudfront.net\/wp-content\/uploads\/2013\/10\/The-Importance-of-Liquid-Handling-Details-and-Their-Impact-on-Your-Assays.pdf . Retrieved 11 February 2021 .   \n \n\n\n\n\n\n\nSource: <a rel=\"external_link\" class=\"external\" href=\"https:\/\/www.limswiki.org\/index.php\/LII:Elements_of_Laboratory_Technology_Management\">https:\/\/www.limswiki.org\/index.php\/LII:Elements_of_Laboratory_Technology_Management<\/a>\nNavigation menuPage actionsLIIDiscussionView sourceHistoryPage actionsLIIDiscussionMoreToolsIn other languagesPersonal toolsLog inRequest accountNavigationMain pageRecent changesRandom pageHelp about MediaWikiSearch\u00a0 ToolsWhat links hereRelated changesSpecial pagesPermanent linkPage informationSponsors \r\n\n\t\r\n\n\t\r\n\n\t\r\n\n\t\r\n\n\t\n\t\n\t\r\n\n\t\r\n\n\t\n\t\r\n\n\t\r\n\n \n\t\n\t\n\t\r\n\n\t\r\n\n \n\t\n\t\r\n\n\t\r\n\n\t\n\t\n\t\r\n\n\t\n\t\n\t\r\n\n\t\r\n\n\t\n\t\n\t\r\n\n\t\r\n\n\t\n\t\r\nPrint\/exportCreate a bookDownload as PDFDownload as PDFDownload as Plain textPrintable version This page was last edited on 18 February 2021, at 19:23.Content is available under a Creative Commons Attribution-ShareAlike 4.0 International License unless otherwise noted.This page has been accessed 963 times.Privacy policyAbout LIMSWikiDisclaimers\n\n\n\n","2000eea677bcd5ee1fcecdab32743800_html":"<body class=\"mediawiki ltr sitedir-ltr mw-hide-empty-elt ns-202 ns-subject page-LII_Elements_of_Laboratory_Technology_Management rootpage-LII_Elements_of_Laboratory_Technology_Management skin-monobook action-view skin--responsive\"><div id=\"rdp-ebb-globalWrapper\"><div id=\"rdp-ebb-column-content\"><div id=\"rdp-ebb-content\" class=\"mw-body\" role=\"main\"><a id=\"rdp-ebb-top\"><\/a>\n<h1 id=\"rdp-ebb-firstHeading\" class=\"firstHeading\" lang=\"en\">LII:Elements of Laboratory Technology Management<\/h1><div id=\"rdp-ebb-bodyContent\" class=\"mw-body-content\"><!-- start content --><div id=\"rdp-ebb-mw-content-text\" lang=\"en\" dir=\"ltr\" class=\"mw-content-ltr\"><div class=\"mw-parser-output\"><p><b>Title<\/b>: <i>Elements of Laboratory Technology Management<\/i>\n<\/p><p><b>Author for citation<\/b>: Joe Liscouski, with editorial modifications by Shawn Douglas\n<\/p><p><b>License for content<\/b>: <a rel=\"external_link\" class=\"external text\" 
href=\"https:\/\/creativecommons.org\/licenses\/by\/4.0\/\" target=\"_blank\">Creative Commons Attribution 4.0 International<\/a>\n<\/p><p><b>Publication date<\/b>: 2014\n<\/p>\n\n\n<h2><span class=\"mw-headline\" id=\"Introduction\">Introduction<\/span><\/h2>\n<p>This discussion is less about specific technologies than it is about the ability to use advanced <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory\" title=\"Laboratory\" class=\"wiki-link\" data-key=\"c57fc5aac9e4abf31dccae81df664c33\">laboratory<\/a> technologies effectively. When we say \u201ceffectively,\u201d we mean that those products and technologies should be used successfully to address needs in your lab, and that they improve the lab\u2019s ability to function. If they don't do that, you\u2019ve wasted your money. Additionally, if the technology in question hasn\u2019t been deployed according to a deliberate plan, your funded projects may not achieve everything they could. Optimally, when applied thoughtfully, the available technologies should result in the transformation of lab work from a labor-intensive effort to one that is intellectually intensive, making the most effective use of people and resources.\n<\/p><p>People come to the subject of <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_automation\" title=\"Laboratory automation\" class=\"wiki-link\" data-key=\"0061880849aeaca05f8aa27ae171f331\">laboratory automation<\/a> from widely differing perspectives. To some it\u2019s about robotics, to others it\u2019s about <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_informatics\" title=\"Laboratory informatics\" class=\"wiki-link\" data-key=\"00edfa43edcde538a695f6d429280301\">laboratory informatics<\/a>, and even others view it as simply data acquisition and <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_analysis\" title=\"Data analysis\" class=\"wiki-link\" data-key=\"545c95e40ca67c9e63cd0a16042a5bd1\">analysis<\/a>. It all depends on what your interests are, and more importantly what your immediate needs are.\n<\/p><p>People began working in this field in the 1940s and 1950s, with the work focused on analog electronics to improve instrumentation; this was the first phase of lab automation. Most notably were the development of scanning <a href=\"https:\/\/www.limswiki.org\/index.php\/Spectrophotometer\" title=\"Spectrophotometer\" class=\"wiki-link\" data-key=\"6382bb48c914f3c490400c13f9eb16e6\">spectrophotometers<\/a> and process <a href=\"https:\/\/www.limswiki.org\/index.php\/Chromatography\" title=\"Chromatography\" class=\"wiki-link\" data-key=\"2615535d1f14c6cffdfad7285999ad9d\">chromatographs<\/a>. Those who first encountered this equipment didn\u2019t think much of it and considered it the world as it\u2019s always been. Others who had to deal with products like the Spectronic 20<sup id=\"rdp-ebb-cite_ref-1\" class=\"reference\"><a href=\"#cite_note-1\">[a]<\/a><\/sup> (a single-beam manual spectrophotometer), and use it to develop visible spectra one wavelength measurement at a time, appreciated the automation of scanning instruments.\n<\/p><p>Mercury switches and timers triggered by cams on a rotating shaft provided chromatographs with the ability to automatically take <a href=\"https:\/\/www.limswiki.org\/index.php\/Sample_(material)\" title=\"Sample (material)\" class=\"wiki-link\" data-key=\"7f8cd41a077a88d02370c02a3ba3d9d6\">samples<\/a>, actuate back flush valves, and take care of other functions without operator intervention. 
This left the analyst with the task of measuring peaks, developing calibration curves, and performing calculations, at least until data systems became available.
The direction of laboratory automation changed significantly when computer chips became available. In the 1960s, companies such as PerkinElmer were experimenting with the use of computer systems for data acquisition as precursors to commercial products. The availability of general-purpose computers such as the PDP-8 and PDP-12 series (along with the Lab 8e) from Digital Equipment, with other models available from other vendors, made it possible for researchers to connect their instruments to computers and carry out experiments. The development of microprocessors from Intel (4004, 8008) led to the evolution of "intelligent" laboratory equipment, ranging from processor-controlled stirring hot plates to chromatographic integrators.
As researchers learned to use these systems, their application rapidly progressed from data acquisition to interactive control of experiments, including data storage, analysis, and reporting. Today, the product set available for laboratory applications includes data acquisition systems, laboratory information management systems (LIMS), electronic laboratory notebooks (ELNs), laboratory robotics, and specialized components to help researchers, scientists, and technicians apply modern technologies to their work.
While there is a lot of technology available, the question remains: how do you go about using it? Not only do we need to know how to use it, but we also must do so while avoiding our own biases about how computer systems operate. Our familiarity with using computer systems in our daily lives may cause us to assume they are doing what we need them to do, without questioning how it actually gets done. "The vendor knows what they are doing" is a poor reason for not testing and evaluating control parameters to ensure they are suitable and appropriate for your work.

Moving from lab functions and requirements to practical solutions

Before we can begin to understand the application of the tools and technologies that are available, we have to know what we want to accomplish, specifically what problems we want to solve. We can divide laboratory functions into two broad classes: management and work execution. Figure 1 addresses management functions, whereas Figure 2 addresses work execution functions, all common to laboratories.
You can add to them based on your own experience.

Figure 1. A breakdown of management-level functions in a typical laboratory

Figure 2. A breakdown of work-level functions in a typical laboratory

Vendors have been developing products to address these work areas, and there are a lot of products available. Many of them are "point" solutions: products that are focused on one aspect of work without an effort to integrate them with others. That isn't surprising, since there isn't an architectural basis for integration aside from specific hardware systems (e.g., FireWire, USB) or vendor-specific software systems (e.g., office product suites). Another issue in scientific work is that the vendor may only be interested in solving a particular problem, with most of the emphasis on an instrument or technique. They may provide the software needed to support their hardware, with data transfer and integration left to the user.
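That last point, data transfer being left to the user, is where much real-world integration effort goes. Below is a minimal sketch of the kind of glue code it implies; the export columns, field names, and target record layout are hypothetical stand-ins, since every instrument-to-LIMS pairing needs its own mapping.

```python
# Minimal sketch of user-written "glue" between an instrument's export
# file and a LIMS import. The CSV layout, column names, and target
# structure are hypothetical; they stand in for whatever a given
# instrument vendor and LIMS actually require.
import csv

def instrument_export_to_lims_rows(path):
    """Read a hypothetical instrument CSV and reshape it for LIMS import."""
    rows = []
    with open(path, newline="") as f:
        for rec in csv.DictReader(f):
            rows.append({
                "sample_id": rec["SampleName"],     # assumed export column
                "analyte": rec["Compound"],
                "result": float(rec["Amount"]),
                "units": rec.get("Units", "mg/L"),  # default if absent
            })
    return rows
```

The mapping itself is trivial; the cost lies in maintaining it every time the instrument software, export format, or LIMS field definitions change, which is exactly why unplanned point solutions accumulate hidden support burden.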
As you work through this document, you'll find a map of management responsibilities and technologies. How do you connect the above map of functions to the technologies? Applying software and hardware solutions to your lab's needs requires deliberate planning. The days of purchasing point solutions to problems have passed. Today's lab managers need to think more broadly about product usage and how the components of lab software systems work together. The point of this document is to help you understand what you need to think about in that regard.
Given those summaries of lab activities, how do we apply available technologies to improve lab operations? Most of the answers fall under the heading of "laboratory automation," so we'll begin by looking at what that is.

What is laboratory automation?

This isn't a trivial question; your answer may depend on the field you are working in, your experience, and your current interests. To some it means robotics; to others it is a LIMS (or its clinical counterpart, the laboratory information system or LIS). The ELN and instrument data system (IDS) are additional elements worth noting. These are examples of product classes and technologies used in lab automation, but they don't define the field. Wikipedia provides the following definition[1]:

Laboratory automation is a multi-disciplinary strategy to research, develop, optimize and capitalize on technologies in the laboratory that enable new and improved processes. Laboratory automation professionals are academic, commercial and government researchers, scientists and engineers who conduct research and develop new technologies to increase productivity, elevate experimental data quality, reduce lab process cycle times, or enable experimentation that otherwise would be impossible.
The most widely known application of laboratory automation technology is laboratory robotics. More generally, the field of laboratory automation comprises many different automated laboratory instruments, devices, software algorithms, and methodologies used to enable, expedite and increase the efficiency and effectiveness of scientific research in laboratories.

And McDowall offered this definition in 1993[2]:

Apparatus, instrumentation, communications or computer applications designed to mechanize or automate the whole or specific portions of the analytical process in order for a laboratory to provide timely and quality information to an organization.

These definitions emphasize equipment and products, and that is where typical approaches to lab automation and the work we are doing part company. Products and technologies are important, but what is more important is figuring out how to use them effectively. The lack of consistent success in the application of lab automation technologies appears to stem from this focus on technologies and equipment ("what will this product do for my lab?") rather than on methodologies for determining what is needed and how to implement solutions.
Having a useful definition of laboratory automation is crucial, since how we approach the work depends on how we see the field developing.
The definition the Institute for Laboratory Automation (ILA) bases its work on is this:

Laboratory automation is the process of determining needs and requirements, planning projects and programs, evaluating products and technologies, and developing and implementing projects according to a set of methodologies that results in successful systems that increase productivity, improve the effectiveness and efficiency of laboratory operations, reduce operating costs, and provide higher-quality data.

The field includes the use of data acquisition, analysis, robotics, sample preparation, laboratory informatics, information technology, computer science, and a wide range of technologies and products from widely varying disciplines, used in the implementation of projects.

Why "process" is important

Lab automation isn't about stuff, but about how we use stuff. The "process" component of the ILA definition is central to what we do. To quote Frank Zenie[b] (one of the founders of Zymark Corporation), "you don't automate a thing, you automate a process." You don't automate an instrument; you automate the process of using one. Autosamplers are a good example of successful automation: they address the process of selecting a sample vial, withdrawing fluid, positioning a needle into an injection port, injecting the sample, preparing the syringe for the next injection, and indexing the sample vials when needed. A minimal sketch of that process view appears below.
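To make the "you automate a process, not a thing" point concrete, this sketch treats the autosampler's work as an explicit, repeatable sequence of steps rather than a device. The step names and the tray model are illustrative assumptions, not any vendor's firmware.

```python
# Minimal sketch of automating a process rather than a thing: the
# autosampler cycle expressed as an explicit, repeatable sequence.
# Step names and the tray model are illustrative assumptions.

AUTOSAMPLER_CYCLE = [
    "select sample vial",
    "withdraw fluid",
    "position needle in injection port",
    "inject sample",
    "prepare syringe for next injection",
]

def run_tray(vials):
    """Execute the cycle once per vial, indexing the tray as needed."""
    for position, vial in enumerate(vials, start=1):
        for step in AUTOSAMPLER_CYCLE:
            print(f"vial {position} ({vial}): {step}")
        print(f"index tray past position {position}")

run_tray(["standard-1", "sample-A", "sample-B"])
```

What the automation captures is the sequence and its repetition, which is why the process has to be documented and stable before any hardware or software is chosen.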
A number of people have studied the structure of science and the relationship between disciplines. Lab automation is less about science and more about how the work of science is done. Before lab automation can be considered for a project, the underlying science has to be done. In other words, the process that automation is going to be applied to must exist first. It also has to be the right process for consideration. This is a point that needs attention.
If you are going to spend resources on a project, you have to make sure you have a well-characterized process and that the process is both optimal and suitable for automation. This means:

The process is well documented, the people that use the process have been trained on that documented process, and their work is monitored to identify any shortcuts, workarounds, or other variations from that documented process. Differences between the published process and the one actually in use can have a significant impact on the success of a particular project design.
The process's "readiness for automation" has been determined. The equipment used is suitable for automation, or the changes needed to make it suitable are known and can be made at reasonable cost. Any impact on warranties has been determined and found to be acceptable.
If several process options exist (e.g., different test protocols for the same test question), they are evaluated for their ability to meet the needs of the science and to be successfully implemented. Other options, such as outsourcing, should also be considered to make the best use of resources; it may be more cost-effective to outsource than to automate.

When looking at laboratory processes, it's useful to recognize that they may operate on different levels. There may be high-level processes that address the lab's reason for existence and cover the mechanics of how the lab functions, as well as low-level processes that address individual functions of the lab. This process view is important when we consider products and technologies, as the products have to fit the process, which is the general basis for product requirements. Discussions that revolve around LIMS and ELNs are one example, with questions being asked about whether one or both are required in the lab, or whether an ELN can replace a LIMS.
These and other similar questions reflect both vendor influence and a lack of understanding of the technologies and their application. Some differentiate the two types of technology based on "structured" (as found in LIMS) vs. "unstructured" (as found in ELNs) data. Broadly speaking, LIMS come with a well-defined, extensible database structure, while ELNs are viewed as unstructured, since you can put almost anything into an ELN and organize the contents as you see fit. But this characterization doesn't work well either, since as a user I might consider the contents, along with an index, as having a structure. This is more of an information technology approach than one that addresses laboratory needs. In the end, an understanding of lab processes is still required to resolve most issues.
LIMS are well-defined entities[c] and the only one of the two to carry an objective industry-standard description. LIMS are also designed to manage the processes surrounding laboratory testing in a wide variety of industries (e.g., from analytical, physical, and environmental testing to clinical testing, which is usually associated with the LIS). The lab behavior process model is essentially the same across industries and disciplines. Basically, if you are running an analytical lab and need to manage samples and test results while answering questions about the status of testing on a larger scale than what you can memorize[d], a LIMS is a good answer.
At the time of this writing, there is no standardized definition of an ELN, though a forthcoming update of ASTM E1578 intends to rectify that. However, given the current hype about ELNs, any product with that designation is going to get noticed. Let's avoid the term and replace it with a set of software types that address similar functionality:

1. Scripted execution systems: These are software systems that guide an analyst in the conduct of a procedure (process); examples include Velquest (now owned by Accelrys) products and the scripting notebook function (the vendor's description) in LabWare LIMS.
2. Journal or diary systems: These are software systems that provide a record of laboratory work; a word processing system might fill this need, although there are products with features specifically designed to assist lab work.
3. Application- or discipline-specific record-keeping systems: These are software systems designed for biology, chemistry, mathematics, and other areas that contain features allowing you to record data and text in a variety of forms geared toward the needs of specific areas of science.

This is not an exhaustive list of forms or functionality, but it is sufficient to make a point. The first, scripted execution, is designed around a process or, more specifically, designed to give the user a mechanism to describe the sequential steps in a process so that they can be repeated under strict controls. These do not replace a LIMS but can be used synergistically with one, or with software that duplicates LIMS capabilities (some have suggested enterprise resource planning [ERP] systems as a substitute). The other two types are repositories of lab information: equations, data, details of procedures, etc. There is no general underlying process as there is with LIMS. They can provide a researcher with a means of describing experiments, collecting data, and performing analyses, which you can correctly view as processes, but they are unique to that researcher or lab and not based on any generalized industry model, as we see in testing labs.
Why is this important? These descriptions illustrate something fundamental: process dictates needs, and needs set the requirements for products and technology. The "do we need a LIMS or an ELN" question is meaningless without an understanding of the processes that operate in your laboratory.

The role of processes in integration

From the standpoint of equipment, laboratories are often a collection of instruments, computer systems, sample preparation stations, and other test or measurement facilities. One goal frequently stated by lab managers is that "ideally we'd like all this to be integrated." The purpose of integration is to streamline operations, reduce human labor, and provide a more efficient way of doing work. You are integrating the equipment and systems used to execute one or more processes. However, without a thorough evaluation of the processes in the lab, there is no basis for integration.
Highlighting the connection between processes and integration, we see why defining what laboratory automation is remains important. One definition can lead to purchasing products and limiting the scope of automation to individual tasks. Another will take you through an evaluation of how your lab works, how you want it to work, and how to produce a framework for getting you there.

The elements of laboratory technology management

If lab automation is a process, we need to look at the elements that can be used to make that process work.
The first thing that is needed is a structure that shows how the elements of laboratory automation relate to each other and acts as a guide for someone coming into the field. That structure also serves as a framework for organizing knowledge about the field. The major elements of the structure are shown in Figure 3.

Figure 3. The elements of lab technology management

Why is it important to recognize these laboratory automation management elements? As automation technology sees broader appeal across multiple industries, there is an inevitability to the use of automation technologies in labs. Vendors are putting chips and programmed intelligence into every product with the goal of making them easier to use, while reducing the role of human judgment (which can lead to an accumulation of errors in tasks like pipetting) and the amount of work people have to do to get work done. There are few negatives to this philosophy, aside from one point: if we don't understand how these systems are working and haven't planned for their use, we won't get the most benefit from them, and we may blindly accept results from systems without really questioning their suitability and the results they produce. One of the main points of this work is that the use of automation technologies should be planned and managed. That brings us to the first major element: management issues.

Management issues

The first thing we need to address is who "management" is. Unless the lab is represented solely by you, there are presumably layers of management. Depending on the size of the organization, this may mean one individual or a group of individuals who are responsible for various management aspects of the laboratory. Broadly speaking, those individuals will have a skill set that can appropriately address laboratory technology management, including project and program management, people skills, workflow modeling, and regulatory knowledge, to name a few (Figure 4).
The need for these skill sets may vary slightly depending on the type of manager.

Figure 4. The skills required of managers when addressing laboratory technology management

Aside from reviewing and approving programs ("programs" are efforts that cover multiple projects), "senior management" is responsible for setting the policies and practices that govern the conduct of laboratory programs. This is part of an overall architecture for managing lab operations and has two components (Figure 5): setting policies and practices, and developing operational models.[e]

Figure 5. The role of managers when addressing laboratory technology management

Before the incorporation of informatics into labs, senior management's involvement wasn't necessary. However, the storage of intellectual property in electronic media has forced a significant change in lab work. Before informatics, labs were isolated from the rest of the organization, with formal contact made through the delivery of results, reports, and presentations. The desire for more effective, streamlined, and integrated information technology operations, and the development of information technology support groups, means that labs are now also part of the corporate picture. In organizations that have multiple labs, more efficient use of resources results in a desire to reduce duplication of work.
You might easily justify two labs having their own spectrophotometers, but duplicate LIMS doing the same thing is going to require some explaining.\n<\/p><p>With the addition of informatics in the lab, management involvement is critical to support the success of laboratory programs. Among the reasons often cited for failure of laboratory programs is the lack of management involvement and, by extension, a lack of oversight, though these failures are usually stated without clarifying what management involvement should actually look like. To be clear, senior management's role is to ensure that programs:\n<\/p>\n<ul><li>are common across all labs, such that all programs are conducted according to the same set of guidelines;<\/li>\n<li>are well-designed, supportable, and upgradable;<\/li>\n<li>are consistent with good project management practices;<\/li>\n<li>are conducted in a way that allows the results to be reused elsewhere in the company;<\/li>\n<li>are well-documented; and<\/li>\n<li>lead to successful results.<\/li><\/ul>\n<p>When work in lab automation began, it was usually the effort of one or two individuals in a lab or company. Today, we need a cooperative effort from management, lab staff, IT support, and, if available, laboratory automation engineers (LAEs). One of the reasons management must establish policies and practices (Figure 6) is to enable people to effectively work together, so they are working from the same set of ground rules and expectations and producing consistent and accurate results. \n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig6_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"d92704744e64f57edc0b3fecc298d4ae\"><img alt=\"Fig6 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/36\/Fig6_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 6.<\/b> The business' policies and practices requiring focus by managers when addressing laboratory technology management<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>\"Lab management\" is responsible for understanding how their lab needs to operate in order to meet the lab's goals. Before automation became a factor, lab management\u2019s primary concern was managing laboratory personnel and helping them get their work done. In the early stages of lab automation, the technologies were treated as add-ons that assisted personnel in getting work done. Today, lab managers need to move beyond that mindset and look at how their role must shift towards planning for and managing automation systems that take on more of the work in the lab. As a result, lab managers need to take on the role of technology planners in addition to managing people. 
The implementation of those plans may be carried out by others (e.g., LAEs, IT specialists), but defining the objectives and how the lab will function with a combination of people and systems is a task squarely in the lap of lab management, requiring the use of operational workflow models (Figure 7) to define the technologies and products suitable for their lab's work.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig7_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"0ded3e1e42eaa1b1b253b6ba68b422aa\"><img alt=\"Fig7 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/8\/81\/Fig7_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 7.<\/b> The importance of operational workflow models for lab managers when addressing laboratory technology management<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Problems can occur when different operational groups have conflicting sets of priorities. Take, for example, the case of labs and IT support. At one point, lab computing only affected the lab systems, but today's labs share resources with other groups, requiring \"IT management\" to ensure that work in one place doesn\u2019t adversely impact another. Most issues around elements such as networks and security can be handled through implementing effective network design and bridges to isolate network segments when necessary. More often than not, problems are likely to occur in the choice and maintenance of products, and the policies IT implements to provide cost-effective support. Operating system upgrades are one place issues can occur if those changes cause products used in lab work to break because the vendor is slow in responding to OS changes. Another place that issues can occur is in product selection; IT managers may want to minimize the number of vendors they have to deal with and prefer products that use the same database system deployed elsewhere. That policy may unduly limit the products that the lab can choose from. From the lab's perspective, it needs the best products in order to get its work done; from the IT group's perspective, those choices drive up support costs. The way to avoid these issues, and others, is for senior managers to determine the priorities and keep the inter-department politics out of the discussion.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Classes_of_laboratory_automation_implementation\">Classes of laboratory automation implementation<\/span><\/h3>\n<p>The second element of laboratory technology management to address is laboratory automation implementation. There are three classes of implementation to address (Figure 8):\n<\/p>\n<ul><li><i>Computer-controlled experiments<\/i>: This includes data collection in high-energy physics, LabVIEW-implemented systems, instrument command and control, and robotics. 
This implementation class involves systems where the computer is an integral part of the experiment, doing data collection and\/or experiment control.<\/li>\n<li><i>Computer-assisted lab work\/experiments<\/i>: This includes work that could be done without a computer, but machines and software have been added to improve the process. Examples include <a href=\"https:\/\/www.limswiki.org\/index.php\/Chromatography_data_system\" title=\"Chromatography data system\" class=\"wiki-link\" data-key=\"a424bb889d8507b7e8912f2faf2570c6\">chromatography data systems<\/a> (CDS), ELNs used for documentation, and classic LIMS.<\/li>\n<li><i>Scientific manufacturing<\/i>: This implementation class focuses on production systems, including high-throughput screening, lights-out lab automation, process analytical technologies, and quality by design (QbD) initiatives.<\/li><\/ul>\n<p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig8_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"37a1a78025ae73b00129398bd282ac13\"><img alt=\"Fig8 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/e\/e5\/Fig8_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 8.<\/b> The three classes of laboratory automation implementation to consider for laboratory technology management<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<h4><span class=\"mw-headline\" id=\"Computer-controlled_experiments\">Computer-controlled experiments<\/span><\/h4>\n<p>This is where the second phase of laboratory automation began: when people began using digital computers with connections to their instrumentation. We moved in stages from simple data acquisition, acquiring a few points to show that we could accurately represent voltages from the equipment, to collecting multiple data streams over time and storing the results on disks. The next step consisted of automatically collecting the data, processing it, storing it, and reporting results from equipment such as chromatographs, spectrophotometers, and other devices.\n<\/p><p>Software development moved from assembly language programming to higher-level language programming, and then to specialized systems that provide a graphical interface to the programmer. Products like LabVIEW<sup id=\"rdp-ebb-cite_ref-8\" class=\"reference\"><a href=\"#cite_note-8\">[f]<\/a><\/sup> allow the developer to use block diagrams to describe the programming and processing that have to be done, and provide the user with an attractive user interface with which to work. 
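<\/p><p>To make that progression concrete, the short sketch below shows what the \u201csimple data acquisition\u201d stage looks like in a modern high-level language (Python is used here purely for illustration; the 'adc' driver object and its 'read_voltage()' method are assumptions, not any real vendor\u2019s API):\n<\/p>\n<pre>\n# A minimal sketch of polled data acquisition, assuming a hypothetical\n# A\/D converter driver object ('adc') that exposes read_voltage().\nimport csv\nimport time\n\ndef acquire(adc, n_points=600, interval_s=0.1, path='run001.csv'):\n    '''Poll the A\/D converter at a fixed interval; store timestamped readings.'''\n    readings = []\n    for _ in range(n_points):\n        # Each reading pairs a timestamp with the digitized detector voltage.\n        readings.append((time.time(), adc.read_voltage()))\n        time.sleep(interval_s)\n    # Persist the raw data stream to disk for later processing and reporting.\n    with open(path, 'w', newline='') as f:\n        writer = csv.writer(f)\n        writer.writerow(['timestamp', 'voltage'])\n        writer.writerows(readings)\n<\/pre>\n<p>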
This is a far cry from embedding machine language programming in the BASIC language code, as was done in some earlier PC systems.\n<\/p><p>Robots offer another example of this class of work, where computers control the movement of equipment and materials through a process that prepares samples for analysis, and may include the analysis itself.\n<\/p><p>While commercial products have overtaken much of the work of interfacing, data acquisition, and data processing (in some cases, the instrument-computer combination is almost indistinguishable from the instrument itself), the ability to deal with instrument interfacing and programming is still an essential skillset for those working in research and applications where commercial systems have yet to be developed.\n<\/p><p>It\u2019s interesting that people often look at modern laboratory instrumentation and say that everything has gone digital. That\u2019s far from the case. They may point to a single pan balance or thermometer with a digital readout as examples of a \u201cdigital\u201d instrument, not realizing that the packaging contains sensors, analog-digital (A\/D) converters, and computer control systems to manage the device and its communications. The appearance of a \u201cdigital\u201d device masks what is going on inside; we still need to understand the transition from the analog world into the digital domain.\n<\/p>\n<h4><span id=\"rdp-ebb-Computer-assisted_lab_work\/experiments\"><\/span><span class=\"mw-headline\" id=\"Computer-assisted_lab_work.2Fexperiments\">Computer-assisted lab work\/experiments<\/span><\/h4>\n<p>There is a difference between work that can be done by computer and work that has to be done by computer; we just looked at the latter case. There\u2019s a lot of work that goes on in laboratories that could be done and has been done by people, but in today\u2019s labs we prefer to do that work with the aid of automated systems. For example, the management of samples in a testing laboratory used to be done by people logging samples in and keeping record books of the work that was scheduled and performed. Today, that work is better managed by a LIMS (or a LIS in the clinical world). The analysis of instrumental data used to be done by hand, and now it is more commonly done by instrument data systems that are faster, more accurate, and permit more complex analysis at a lower cost.\n<\/p><p>Do robots fit in this category? One could argue they are simply doing work that was performed manually in the past. The reason we might not consider robots in this class is that in many cases the equipment used for the robots is different from the equipment that\u2019s used by human beings. As such, the two really aren\u2019t interchangeable. If the LIMS or instrument data system were down, people could pick up the work manually; however, that may not be the case if a robot goes offline. It\u2019s a small point and you can go either way on this.\n<\/p><p>The key element for this class of implementation is that the use of computers\u2014and, if you prefer, robotics\u2014is an option and not a requirement. 
Yet that option improves productivity, reduces cost, and provides better quality and more consistent data.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Scientific_manufacturing\">Scientific manufacturing<\/span><\/h4>\n<p>This isn\u2019t so much a new implementation class as it is a formal recognition that much of what goes on in a laboratory mirrors work that\u2019s done in manufacturing; some work in quality control is so routine that it matches assembly line work of the 1960s. The major difference, however, is that a lab's major production output is knowledge, information, and data (K\/I\/D). The work in this category is going to expand as a natural consequence of increasing automation, which must be addressed. If this is the direction things are going to go, then we need to do it right.\n<\/p><p>Recognizing this point has significant consequences. Rather than just letting things evolve, we can take advantage of the situation and drive processes that are appropriate for this level of automation into useful practice. This means we must:\n<\/p>\n<ul><li>convert laboratory methods to fully automated systems;<\/li>\n<li>deliberately design and manage equipment and control, acquisition, and processing systems to meet the needs of this kind of application;<\/li>\n<li>train people to work in a more complex environment than they have been accustomed to; and<\/li>\n<li>build the automation infrastructure (i.e., interfacing and data standards) needed to make these systems realizable and effective, without taking on significant cost.<\/li><\/ul>\n<p>In short, this means opening up another dimension to laboratory work as a natural evolution of work practices. If you look at the direction lab work is heading, where large-volume sample processing is necessary (e.g., high-throughput screening), this is simply a reflection of reality.\n<\/p><p>If we look at quality control applications and manufacturing processes, we basically see one production process (<a href=\"https:\/\/www.limswiki.org\/index.php\/Quality_control\" title=\"Quality control\" class=\"wiki-link\" data-key=\"1e0e0c2eb3e45aff02f5d61799821f0f\">quality control<\/a> or QC) layered on another (production), where ultimately the two merge into continuous production and testing. This is a logical conclusion to work described by process analytical technologies and quality-by-design.\n<\/p><p>This doesn\u2019t diminish the science used in laboratory work; rather, it adds a level of sophistication that hasn\u2019t been widely dealt with: thinking beyond the basic science process to its implementation in a continuous automated system. This is a much more complex undertaking since (1) data will be created at a high rate, and (2) we want to be sure that this is high-quality data and not just junk produced at a high rate.\n<\/p><p>This type of thinking is not limited to quality control work. It can be readily applied to research as well, where economical high-volume experiments can be used to support statistical experimental design methodologies and more exhaustive sample processing, as well as today\u2019s screening applications in life sciences. It is also readily applied to environmental monitoring and regulatory evaluations. And while this kind of thinking may be new to scientific applications, it isn\u2019t new technology. 
Work that has been done in automated manufacturing can serve as a template for the work that has to be done in laboratory process automation.\n<\/p><p>Let's return to the first two bullets in this section: laboratory method conversion and system management and planning. If you gave four labs a method description and asked them to automate it, you\u2019d get four varied implementations: four groups doing the same thing independently. If we are going to turn lab automation into the useful tool it can be, we need to take a different approach: cooperative development of automated systems.\n<\/p><p>In order to be useful, a description of a fully automated system needs more than a method description. It needs equipment lists, source code, etc. in sufficient detail that you can purchase the equipment needed and put the system together expecting it to work. However, we can do better than that. In a given industry, where labs are doing the same testing on the same types of samples, we should be able to have them come together and design and test automated systems to meet the need. Once that is done, vendors can pick up the description and be able to build products suitable to carry out the analysis or test. The problem labs face is getting the work done at a reasonable cost. If there isn\u2019t a competitive advantage to having a unique test, cooperate so that standardized modules for testing can be developed.\n<\/p><p>This changes the process of lab automation from a build-it-from-scratch mentality to the planned connection of standardized automated components into a functioning system.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Experimental_methods\">Experimental methods<\/span><\/h3>\n<p>There is relatively little that can be said about experimental methods at this point. Aside from the clinical industry, not enough work has been done to give really good examples of intentionally designed automated systems that can be purchased, installed in a lab, and expected to function.<sup id=\"rdp-ebb-cite_ref-9\" class=\"reference\"><a href=\"#cite_note-9\">[g]<\/a><\/sup> There are some examples, including ELISA robotics analysis systems from Caliper Life Sciences and Pressurized Liquid Extraction Systems from Fluid Management Systems. 
Many laboratory methods are designed with the assumption that people will be doing the work (manually), and any addition of automation would require conversion of that method (Figure 9).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig9_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"bc63229f421c41790090fd99e7132384\"><img alt=\"Fig9 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/9\/9d\/Fig9_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 9.<\/b> The consideration of a lab's experimental methods for laboratory technology management<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>What we need to begin doing is looking at the development of automated methods as a distinct task, similar to the publication of manual methods by <a href=\"https:\/\/www.limswiki.org\/index.php\/ASTM_International\" title=\"ASTM International\" class=\"wiki-link\" data-key=\"dfeafbac63fa786e77b472c3f86d07ed\">ASTM International<\/a> (ASTM), the <a href=\"https:\/\/www.limswiki.org\/index.php\/United_States_Environmental_Protection_Agency\" title=\"United States Environmental Protection Agency\" class=\"wiki-link\" data-key=\"877b052e12328aa52f6f7c3f2d56f99a\">Environmental Protection Agency<\/a> (EPA), and the United States Pharmacopeia (USP), with the difference being that automation is not viewed as mimicking human actions but as well-designed and optimized systems that support the science and any associated production processes (i.e., a scientific manufacturing implementation that includes integration with informatics systems). We need to think \u201cbigger,\u201d not limiting our vision to just the immediate task but rather looking at how it fits into lab-wide operations.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Lab-specific_technologies_and_information_technology\">Lab-specific technologies and information technology<\/span><\/h3>\n<p>This section quickly covers the next two elements of laboratory technology management at the same time, as many aspects of these elements have already been covered elsewhere. 
See Figure 10 and Figure 11 for a deeper dive into these two elements.<sup id=\"rdp-ebb-cite_ref-10\" class=\"reference\"><a href=\"#cite_note-10\">[h]<\/a><\/sup>\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig10_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"ac3f1e7634c59073f9690015ae1a6bd3\"><img alt=\"Fig10 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/f\/f4\/Fig10_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 10.<\/b> Reviewing the many aspects of lab-specific technologies for laboratory technology management<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig11_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"0e30145a4123f8146e705255884deada\"><img alt=\"Fig11 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/8\/84\/Fig11_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 11.<\/b> Consideration of information technologies for laboratory technology management<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>There is one point that needs to be made with respect to information technologies. While lab managers do not need to understand the implementation details of those technologies, they do need to be aware of the potential they offer in the development of a structure for laboratory automation implementations. Management is responsible for lab automation planning, including choosing the best technologies; in other words, management must manage the \u201cbig picture\u201d of how technologies are used to meet their lab's purpose.\n<\/p><p>In particular, managers should pay close attention to the role of client-server systems and virtualization, since they offer design alternatives that impact the choice of products and the options for managing technology. This is one area where good relationships with IT departments are essential. 
We\u2019ll be addressing these and other information technologies in more detail in other publications.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Systems_integration\">Systems integration<\/span><\/h3>\n<p>Systems integration is the final element of laboratory technology management, one that has been dealt with at length in other areas.<sup id=\"rdp-ebb-cite_ref-TriggTheIntegArch14_11-0\" class=\"reference\"><a href=\"#cite_note-TriggTheIntegArch14-11\">[3]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-LiscouskiInteg12_12-0\" class=\"reference\"><a href=\"#cite_note-LiscouskiInteg12-12\">[4]<\/a><\/sup> Many of the points noted above, particularly in the management sections, demand attention be paid to integration in order to develop systems that work well. When systems are planned, the planning needs to be done with an eye toward integrating the components, something that today\u2019s technologies are largely not yet capable of doing (aside from those built around microplates and clinical chemistry applications). This isn\u2019t going to happen magically, nor is it the province of vendors to define it. This is a realm that the user community has to address by defining the standards and methodologies for integration (Figure 12). The planning that managers have to do as part of technology management has to be done with an understanding of the role integration plays and an ability to choose solutions that lead to well-designed integrated systems. The concepts behind scientific manufacturing depend on it, just as integration is required in any efficient production process.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig12_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"782b37925ad8c874103394b44566097e\"><img alt=\"Fig12 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/d\/d5\/Fig12_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 12.<\/b> Addressing systems integration for laboratory technology management<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The purpose of integration in the lab is to make it easier to connect systems. For example, a CDS may need to pass information and data to a LIMS or ELN and then on to other groups. 
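<\/p><p>As a simple sketch of what one such handoff might look like, the following Python fragment packages a finished chromatography result and posts it to a LIMS over HTTP. This is only an illustration under assumed conditions: the endpoint URL, the field names, and the idea that the LIMS accepts JSON messages are all assumptions, not any particular vendor\u2019s interface.\n<\/p>\n<pre>\n# A sketch of a CDS-to-LIMS result handoff; endpoint and fields are hypothetical.\nimport json\nimport urllib.request\n\ndef post_result_to_lims(sample_id, analyte, value, units):\n    '''Package a finished result and send it to the LIMS as a JSON message.'''\n    message = {\n        'sample_id': sample_id,   # read from a barcode at sample login\n        'analyte': analyte,\n        'value': value,\n        'units': units,\n        'source': 'CDS-01',       # identifies the originating data system\n    }\n    req = urllib.request.Request(\n        'https:\/\/lims.example.com\/api\/results',   # hypothetical endpoint\n        data=json.dumps(message).encode('utf-8'),\n        headers={'Content-Type': 'application\/json'},\n    )\n    with urllib.request.urlopen(req) as resp:\n        return resp.status   # a success code means the LIMS accepted the result\n<\/pre>\n<p>The point of the sketch is not the specific transport; it is that a published, structured message lets each system consume results without product-specific workarounds.\n<\/p><p>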
The resulting benefits of this ability to integrate systems include:\n<\/p>\n<ul><li>a smoother workflow, meaning less manual effort, less duplication of data entry, and fewer data entry errors, something strived for and being accomplished in production environments, including manufacturing, video production, and graphic design;<\/li>\n<li>an easier path to meeting regulatory requirements, as systems with integration built in by vendors are easier to validate and maintain;<\/li>\n<li>reduced cost of development and support;<\/li>\n<li>reduction in duplication of records via better data management; and<\/li>\n<li>more flexibility, as integrated systems built on modular components are easier to upgrade or update, and to adapt to changing requirements.<\/li><\/ul>\n<p>The inability to integrate systems and components through vendor-provided mechanisms results in higher development and support costs, increased regulatory burden, and reduced likelihood that projects will be successful.\n<\/p>\n<h4><span id=\"rdp-ebb-What_is_an_integrated_system_in_the_laboratory?\"><\/span><span class=\"mw-headline\" id=\"What_is_an_integrated_system_in_the_laboratory.3F\">What is an integrated system in the laboratory?<\/span><\/h4>\n<p>Phrases like \u201cintegrated system\u201d are used so commonly that it seems as though there should be instant recognition of what they are. While the words may bring a concept to mind, do we have the same concept in mind? For the sake of this discussion, the concept of an integrated system has several characteristics. First, in an integrated system, a given piece of information is entered once and then becomes available throughout the system, restricted only by access privileges. The word \u201csystem\u201d in this case is the summation of all the information handling equipment in the lab. It may extend beyond the lab if process connections to other departments are needed. Second, the movement of materials (e.g., during sample preparation), information, and data is continuous from the start of a process through to the end of that process, without the need for human effort. The sequence doesn\u2019t have to wait for someone to do a manual portion of the process in order for it to continue, aside from policy conditions that require checks, reviews, and approvals before subsequent steps are taken.\n<\/p><p>An integrated system should also result in a better place for personnel to work, since humans wouldn\u2019t be depended upon for conducting repetitive work. After all, leaving personnel to conduct repetitive work has several drawbacks. First, people get bored and make mistakes (some minor, some not, both of which contribute to variability in results), and second, the progress of work (productivity) is dependent on human effort, which may limit the number of hours that a process can operate. More broadly, it's also a bad way of using intelligent, educated personnel.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"A_brief_historical_note\">A brief historical note<\/span><\/h4>\n<p>For those who are new to the field, we\u2019ve been working on system integration for a long time, with not nearly as much to show for it as we\u2019d expect, particularly when compared to other fields that have seen an infusion of computer technologies. During the 1980s, the pages of <i>Analytical Chemistry<\/i> saw the initial ideas that would shape the development of automation in chemistry. Dr. 
Ray Dessy\u2019s (then at Virginia Polytechnic Institute) articles on LIMS, robotics, networking, and instrument data systems (IDS) laid out the promise and expectation for electronic systems used to acquire and manage the flow of data and information throughout the lab.\n<\/p><p>That concept\u2014the computer integrated lab\u2014was the basis of work by instrument and computer vendors, resulting in proof-of-concept displays and exhibits at PITTCON and other trade shows. After more than 30 years, we are still waiting for that potential to be realized, and we may not be much closer today than we were then. What we have seen is an increase in the sophistication of the tools available for lab work, including client-server chromatography systems and ELNs in their varied forms. In each case, we keep running into the same problem: an inability to connect things into working systems. The result is the use of product-specific code and workarounds for moving and parsing data streams. These are fixes, not solutions. Solutions require careful design, not just for the short-term \"what-do-we-need-today\" but also for the long term: robust designs that permit graceful upgrades and improvements without the need to start over from scratch.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"The_cost_of_the_lack_of_progress\">The cost of the lack of progress<\/span><\/h4>\n<p>Every day the scientists and technicians in your labs are working to produce the knowledge, information, and data (K\/I\/D) your company depends upon to meet its goals. That K\/I\/D is recorded in notebooks and electronic systems. How well are those systems going to support your need for access today, tomorrow, or over the next 20 or more years? Twenty years is the minimum most companies require for guaranteed access to data.\n<\/p><p>The systems being put in place to manage laboratory K\/I\/D are complex. Most laboratory data management systems (i.e., LIMS, ELN, IDS) are a combination of four separate products: hardware, operating system, database management system, and the application you and your staff use, each from a different company with its own product life cycle. This means that changes can occur at any of those levels, asynchronously, without consideration for the impact they have on your ability to work.\n<\/p><p>Lab managers are usually trained in the sciences and personnel aspects of laboratory management. They are rarely trained in technology management and planning for laboratory robotics and informatics, the tools used today to get laboratory work done and manage the results. The consequences of inadequate planning can be significant:\n<\/p>\n<blockquote><p>In January 2006, the FBI ended the LIMS project, and in March 2006 the FBI and JusticeTrax agreed to terminate the contract for the convenience of the government. The FBI agreed to pay a settlement of $523,932 to the company in addition to the money already spent on developing the system and obtaining hardware. Therefore, the FBI spent a total of $1,380,151 on the project. 
With only the hardware usable, the FBI lost $1,175,015 on the unsuccessful LIMS project.<sup id=\"rdp-ebb-cite_ref-OIGTheFed06_13-0\" class=\"reference\"><a href=\"#cite_note-OIGTheFed06-13\">[5]<\/a><\/sup><\/p><\/blockquote>\n<p>Other instances of problems during laboratory informatics projects include:\n<\/p>\n<ul><li>A 2006 Association for Laboratory Automation survey on the topic of industrial laboratory automation posed the following statement, with the percentage of respondents agreeing in parentheses: \u201cMy company\/organization\u2019s senior management feels its investment in laboratory automation has\u201d: succeeded in delivering the expected benefits (56%); produced mixed results (43%); not delivered the expected benefits (1%). In other words, 44% of investments failed to fully realize expectations.<sup id=\"rdp-ebb-cite_ref-Hamilton2006_14-0\" class=\"reference\"><a href=\"#cite_note-Hamilton2006-14\">[6]<\/a><\/sup><\/li><\/ul>\n<ul><li>A long-circulated statistic says that some 60 percent of LIMS installations fail.<sup id=\"rdp-ebb-cite_ref-SCWChoosing02_15-0\" class=\"reference\"><a href=\"#cite_note-SCWChoosing02-15\">[7]<\/a><\/sup><\/li><\/ul>\n<ul><li>The Standish Group's CHAOS Report (1995) on project failures (looking at ERP implementations) shows that over half will fail, and 31.1% of projects will be canceled before they ever get completed. Further results indicate 52.7% of projects will cost over 189% of their original estimates.<sup id=\"rdp-ebb-cite_ref-SGITheCHAOS95_16-0\" class=\"reference\"><a href=\"#cite_note-SGITheCHAOS95-16\">[8]<\/a><\/sup><\/li><\/ul>\n<p>From a more anecdotal standpoint, we\u2019ve received a number of emails discussing the results of improperly managed projects. Stories that stand out among them include:\n<\/p>\n<ul><li>An anonymous LIMS customer was given a precise fixed-price quote somewhere around $90,000 and then got hit with several hundred thousand dollars in extras after the contract was signed.<\/li>\n<li>An anonymous major pharmaceutical company some years back had implemented a LIMS with a lot of customization that was generally considered to be successful, until it came time to upgrade. They couldn\u2019t do it and went back to square one, requiring the purchase of another system.<\/li>\n<li>An anonymous business reported robotics system failures totaling over $700,000.<\/li>\n<li>Some report that vendors are using customer sites as test-beds for software development.<\/li>\n<li>A group of three different types of labs with differing requirements tried to use the same system to reduce costs; nearly $500,000 was spent before the project was cancelled.<\/li><\/ul>\n<p>In addition to those costs, there are the costs of missed opportunities, project delays, departmental and employee frustration, and the fact that the problems you wanted to solve are still sitting there.\n<\/p><p>The causes for failures are varied, but most include factors that could have been avoided by making sure those involved were properly trained. Poor planning, unrealistic goals, inadequate specifications (including a lack of regulatory compliance requirements), project management difficulties, scope creep, and lack of experienced resources can all play a part in a failed laboratory technology project. The lack of features that permit the easy development of integrated systems can also be added to that list. 
That missing element can cause projects to balloon in scope, requiring people to take on work that they may not be properly prepared for, or lead to projects that are not technically feasible, something developers don\u2019t realize until they are deeply involved in the work.\n<\/p><p>The methods people use today to achieve integration result in cost overruns, project failures, and systems that can\u2019t be upgraded or modified without significant risk of damaging the integrity of the existing system. One individual reported that his company\u2019s survey of customers found that systems were integrated in ways that prevented upgrades or updates; the coding was specific to a particular version of software, and any changes could result in scrapping the current system and starting over.\n<\/p><p>One way of achieving \u201cintegration\u201d is similar to how one might integrate household wiring by hard-wiring all the lamps, appliances, TVs, etc. to the electrical cables. Everything is integrated, but change isn\u2019t possible without shutting off the power to everything, going into the wall and making the wiring changes, and then repairing the walls and turning things back on. When we talk about systems integration, that\u2019s not the model we have in mind; however, from the comments we\u2019ve received, it is the way people are implementing software. We\u2019re looking for the ability to connect things in ways that permit change, like the wiring in most households: plug and unplug. That level of compatibility and integration results from the development of standards for power distribution and for connections: the design of plugs and sockets for specific voltages, phasing, and polarity so that the right type of power is supplied to the right devices.\n<\/p><p>Of course, there are other ways to practically connect systems, new and old. The Universal Serial Bus (USB) standard allows the same connector to be used for connecting portable storage, cameras, scanners, printers, and other communications devices with a computer. Another older example can be found with modular telephone jacks and tone dialing, which evolved to the more mobile system we have today. However, we probably wouldn't have the level of sophistication we have now if we relied on rotary dials and hard-wired phones.\n<\/p><p>These are just a few examples of component connections that can lead to systems integration. When we consider integrating systems in the lab, we need to look at connectivity and modularity (allowing us to make changes without tearing the entire system apart) as goals.\n<\/p>\n<h4><span id=\"rdp-ebb-What_do_we_need_to_build_integrated_systems?\"><\/span><span class=\"mw-headline\" id=\"What_do_we_need_to_build_integrated_systems.3F\">What do we need to build integrated systems?<\/span><\/h4>\n<p>The lab systems we have today are not built for system-wide integration. They are built by vendors and developers to accomplish a specific set of tasks; connections to other systems are either not considered or are avoided for competitive reasons. If we want to consider the possibility of building integrated systems, there are at least five elements that are needed:\n<\/p>\n<ol><li>Education<\/li>\n<li>User community commitment<\/li>\n<li>Standards (e.g., file formatting, messaging, interconnection)<\/li>\n<li>Modular systems<\/li>\n<li>Stable operating system environment<\/li><\/ol>\n<p><b>Education<\/b>: Facilities with integrated systems are built by people trained to do it. 
This has been discussed within the concept of LAEs, published in 2006.<sup id=\"rdp-ebb-cite_ref-LiscouskiAreYou06_17-0\" class=\"reference\"><a href=\"#cite_note-LiscouskiAreYou06-17\">[9]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-18\" class=\"reference\"><a href=\"#cite_note-18\">[i]<\/a><\/sup> However, the educational issues don\u2019t stop there. Laboratory management needs to understand their role in technology management. It isn\u2019t enough to understand the science and how to manage people, as was the case 30 or 40 years ago. Managers have to understand how the work gets done and what technology is used to do it. The effective use (or the unintended misuse) of technologies can have as big an impact on productivity as anything else. The science also has to be adjusted for advanced lab technologies. Method development should be done with an eye toward method execution, asking \"can this technique be automated?\"\n<\/p><p><b>User community commitment<\/b>: Vendors and developers aren\u2019t going to provide the facilities needed for integration unless the user community demands them. Suppliers are going to have to spend resources in order to meet the demands for integration, and they aren\u2019t going to do this unless there is a clear market need and users force them to meet that need. If we continue with \u201cbusiness as usual\u201d practices of force-fitting things together and not being satisfied with the result, where is the incentive for vendors to spend development money? The choices come down to these: you only purchase products that meet your needs for integration, you spend resources trying to integrate systems that aren\u2019t designed for it, or your labs continue to operate as they have for the last 30 years, with incremental improvements.\n<\/p><p><b>Standards<\/b>: Building systems that can be integrated depends upon two elements in particular: standardized file formats and messaging or interconnection systems that permit one vendor\u2019s software package to communicate with another\u2019s. \n<\/p><p>First, the output of an instrument should be packaged in an industry standardized file format that allows it to be used with any appropriate application. The structure of that file format should be published and include the instrument output plus other relevant information such as date, time, instrument ID, sample ID (read via barcode or other mechanism), instrument parameters, etc. Digital cameras have a similar setup for their raw data files: the pixel data and the camera metadata that tells you everything about the camera used to take the shot.\n<\/p><p>In the 1990s, the Analytical Instrument Association (AIA) (now the Analytical and Life Science Systems Association) had a program underway to develop a set of file format standards for chromatography and mass spectrometry. The program made progress and was turned over to ASTM, where momentum stalled. It was a good first attempt. There were several problems with it that bear noting. The first problem is found in the name of the standard: the Analytical Data Interchange (ANDI) standard.<sup id=\"rdp-ebb-cite_ref-ANDISF03Arch_19-0\" class=\"reference\"><a href=\"#cite_note-ANDISF03Arch-19\">[10]<\/a><\/sup> It was viewed as a means of transferring data between instrument systems and served as a secondary file format, with the instrument vendors\u2019 proprietary formats being primary. 
This has regulatory implications since the <a href=\"https:\/\/www.limswiki.org\/index.php\/Food_and_Drug_Administration\" title=\"Food and Drug Administration\" class=\"wiki-link\" data-key=\"e2be8927071ac419c0929f7aa1ede7fe\">Food and Drug Administration<\/a> (FDA) requires storage of the primary data and that the primary data is used to support submissions. It also means that files would have to have been converted between formats as they moved between systems.\n<\/p><p>A standardized file format would be ideal for each instrumental technique. Data collected from an instrument would be stored in that format, which would be implemented and used by each vendor. In fact, it would be feasible to have a circuit board in an instrument that would function as a network node. It would collect and store instrument data and forward it to another computer for long-term storage, analysis, and reporting, thus separating data collection and use. A similar situation currently exists with instrument vendors that use networked data collection modules. The issue is further complicated by the nature of analytical work. A data file is meaningless without its associated reference material: standards, calibration files, etc., that are used to develop calibration curves and evaluate qualitative and quantitative results. \n<\/p><p>While file format standards are essential, so is a second-order description: sample set descriptors that provide a context for each sample\u2019s data file (e.g., a sample set might be a sample tray in an autosampler, and the descriptor would be a list of the tray\u2019s contents). Work is underway for the development of another standard for laboratory data: ASTM WK23265 - New Specification for Analytical Information Markup Language. Its description indicates that it does take the context of the sample\u2014its relationship to other samples in a run or tray\u2014into account as part of the standard description.<sup id=\"rdp-ebb-cite_ref-ASTMWK23265Arch_20-0\" class=\"reference\"><a href=\"#cite_note-ASTMWK23265Arch-20\">[11]<\/a><\/sup>\n<\/p><p>The second problem with the AIA\u2019s program was that it was vendor-driven with little user participation. The transfer to ASTM should have resolved this, but by that point user interest had waned. People had to buy systems and they couldn\u2019t wait for standards to be developed and implemented. The transition from proprietary file formats to standardized formats has to be addressed in any standards program.\n<\/p><p>The third issue with their program involved standards testing. Before you ask a customer to commit their work to a vendor\u2019s implementation of a standard, they should have the assurance, through an independent third party, that things work as expected.\n<\/p><p><b>Modular systems<\/b>: The previous section notes that vendors have to assume that their software may be running in a stand-alone environment in order to ensure that all of the needed facilities are available to meet the user's needs. This can lead to duplication of functions. A multi-user instrument data system and a LIMS both have a need for sample login. If both systems exist in the lab, you\u2019ll have two sample login systems. The issue can be compounded even further with the addition of more multi-instrument packages.\n<\/p><p>Why not break down the functionality in a lab and use one sample login module? It is simply a multi-user database system. 
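<\/p><p>A minimal sketch of that idea follows: one shared login module, implemented here as nothing more than a small multi-user database that a LIMS, an instrument data system, or any other module could call. Python and SQLite are used purely for illustration, and the table and field names are assumptions, not any product\u2019s schema.\n<\/p>\n<pre>\n# A sketch of a shared sample login module: one place to enter a sample,\n# readable by every system that needs it. Names are illustrative only.\nimport sqlite3\n\nclass SampleLogin:\n    '''One login service that a LIMS, an IDS, or other modules could share.'''\n    def __init__(self, path='samples.db'):\n        self.db = sqlite3.connect(path)\n        self.db.execute(\n            'CREATE TABLE IF NOT EXISTS samples ('\n            ' sample_id TEXT PRIMARY KEY,'\n            ' description TEXT,'\n            ' received_at TEXT DEFAULT CURRENT_TIMESTAMP)'\n        )\n\n    def log_in(self, sample_id, description):\n        # Entered once; every connected system reads the same record.\n        self.db.execute(\n            'INSERT INTO samples (sample_id, description) VALUES (?, ?)',\n            (sample_id, description),\n        )\n        self.db.commit()\n\n    def lookup(self, sample_id):\n        return self.db.execute(\n            'SELECT sample_id, description, received_at FROM samples'\n            ' WHERE sample_id = ?', (sample_id,)\n        ).fetchone()\n<\/pre>\n<p>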
If we were to do a functional analysis of the elements needed in a lab, with an eye toward eliminating redundancy and duplication while designing components as modules, integration would be a simpler issue. A modular approach\u2014a system with a login module, lab management module, modules for data acquisition, chromatographic analysis, spectra analysis, etc.\u2014would provide a more streamlined design, with the ability to upgrade functionality as needed. For example, a new approach to chromatographic peak detection, peak deconvolution, could be integrated into an analysis method without having to reconstruct the entire data system.\n<\/p><p>When people talk about modular applications, the phrase \u201cLEGO-like\u201d comes to mind. It is a good illustration of what we\u2019d like to accomplish. The easily connectable blocks and components can be structured into a wide variety of items, all based on a simple standardized connection concept. There are two differences that we need to understand. With LEGOs, almost everything connects. In the lab, connections need to make sense. Secondly, LEGOs are a single-vendor solution; unless you\u2019re the vendor, that isn\u2019t a good model. A LEGO-like multi-source model (including open source) of well-structured, well-designed, and well-supported modules that could be connected or configured by the user would be an interesting approach to the development of integrable systems.\n<\/p><p>Modularity would also be of benefit when upgrading or updating systems. With more functions distributed over several modules, the amount of testing and validation needed would be reduced. It should also be easier to add functionality. This isn\u2019t some fantasy; this is what laboratory automation engineering looks like when you consider the entire lab environment rather than implementing products task-by-task in isolation.\n<\/p><p><b>Stable operating system environment<\/b>: The foundation of an integrated system must be a stable operating environment. Operating system upgrades that require changes in application coding are disruptive and lead to a loss of performance and integrity. It may be necessary to forgo the bells and whistles of some commercial operating systems in favor of open-source software that provides the required stability. Upgrades should be improvements in quality and functionality, where that change in functionality has a clear benefit to the user.\n<\/p><p>The elements noted above are just introductory commentary; each could fill a healthy document by itself. At some point, these steps are going to have to be taken. Until they are, and they result in tools you can use, labs\u2014your labs\u2014are going to be committing the results of your work into products and formats you have little control over. That should not be an acceptable situation; the use of proprietary file formats that limit your ability to work with your company\u2019s data should end and be replaced with industry-standard formats that give you the flexibility to work as you choose, with whatever products you need.\n<\/p><p>We need to be deliberate in how we approach this problem. When discussing file format standards, it was noted that the data file for a single sample is useless by itself. If you had the file for a chromatogram, for instance, you could display it and look at the conditions used to collect it; however, interpretation requires data from other files, so standards for file sets have to be developed. 
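<\/p><p>To make the idea of a file set concrete, the sketch below shows one possible form a sample set descriptor could take: a small, machine-readable index that ties each data file to its position, role, and shared calibration data. Every field name here is an assumption for illustration; this is not the ANDI format, AnIML, or any other published standard.\n<\/p>\n<pre>\n# A sketch of a sample set descriptor giving individual data files their\n# context (tray contents, standards, blanks). Field names are assumptions.\nimport json\n\nsample_set = {\n    'set_id': 'TRAY-2021-0042',\n    'instrument_id': 'GC-07',\n    'calibration_file': 'cal_20211101.json',  # shared by every run in the set\n    'runs': [\n        {'position': 1, 'sample_id': 'STD-LOW', 'role': 'standard', 'data_file': 'run001.json'},\n        {'position': 2, 'sample_id': 'BLANK-01', 'role': 'blank', 'data_file': 'run002.json'},\n        {'position': 3, 'sample_id': 'S-1138', 'role': 'unknown', 'data_file': 'run003.json'},\n    ],\n}\n\n# Writing the descriptor alongside the data files keeps the relationships\n# among samples, standards, and blanks explicit and machine-readable.\nwith open('TRAY-2021-0042.set.json', 'w') as f:\n    json.dump(sample_set, f, indent=2)\n<\/pre>\n<p>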
Such file set context wasn\u2019t a consideration in the original AIA work on chromatography and mass spectrometry (though it was in work done on Atomic Absorption, Emission and Mass Spectroscopy Data Interchange Specification standards for the Army Corps of Engineers, 1995).\n<\/p><p>The first step in this process is for lab managers and IT professionals to become educated in laboratory automation and what it takes to get the job done. The role of management can\u2019t be overstated; they have to sign off on the direction work takes and support it for the long haul. The education needs to focus on the management and implementation of automation technologies, not just the underlying science. After all, it is the exclusive focus on the science that leads to the silo-like implementations we have today. The user community's active participation in the process is central to success, and unless that group is educated in the work, the effect of that participation will be limited.\n<\/p><p>Secondly, we need to renew the development of industry-standard file formats, not just from the standpoint of encapsulating data files, but formats that ensure that the data is usable. The initial focus for each technique needs to be a review of how laboratory data is used, particularly with the advent of hyphenated techniques (e.g., <a href=\"https:\/\/www.limswiki.org\/index.php\/Gas_chromatography%E2%80%93mass_spectrometry\" title=\"Gas chromatography\u2013mass spectrometry\" class=\"wiki-link\" data-key=\"d7fe02050f81fca3ad7a5845b1879ae2\">Gas chromatography\u2013mass spectrometry<\/a> or GC-MS), and use that review as a basis for defining the layers of standards needed to develop a usable product. This is a complex undertaking but worth the effort. If you\u2019re not sure, consider how much your lab\u2019s data is worth and the impact of its loss.\n<\/p><p>In the short term, we need to start pushing vendors\u2014you have the buying power\u2014to develop products with the characteristics needed to allow you to work with and control the results of your lab\u2019s work. Products need to be developed to meet your needs, not the vendor's. Product criteria need to be set with the points above in mind, not on a company-by-company basis but as a community; you\u2019re more likely to get results with a community effort.\n<\/p><p>Overcoming the barriers to the integration of laboratory systems is going to take a change in mindset on the part of lab management and those working in the labs. That change will result in a significant evolution in the way labs work, yielding higher productivity and a better working environment, with an improvement in the return on your company\u2019s investment in your lab's operations. Laboratory systems need to be designed to be effective. The points noted here are one basis for that design.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Summary\">Summary<\/span><\/h3>\n<p>That is a brief tour of what the major elements of laboratory technology management look like right now. The diagrams will change and details will be left to additional layers to keep the structure easy to understand and use. One thing that was sacrificed in order to facilitate clarity is the relationship between technologies. For example, a robotics system might use data acquisition and control components in its operations, which could be noted by a link between those elements.\n<\/p><p>There is room for added complexity to the map. 
Someone may ask where <a href=\"https:\/\/www.limswiki.org\/index.php\/Bioinformatics\" title=\"Bioinformatics\" class=\"wiki-link\" data-key=\"8f506695fdbb26e3f314da308f8c053b\">bioinformatics<\/a> or some other subject resides. That as well as other points\u2014and there are a number of them\u2014would be addressed in successive levels, giving the viewer the ability to drill down to whatever level of detail they need. The best way to view this is as an electronic map that can be explored by clicking on subjects for added information and relationships.\n<\/p><p>An entire view of the diagram of the elements of laboratory technology can be <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/web.archive.org\/web\/20151025114744\/http:\/\/www.institutelabauto.org\/publications\/SOLA-11x17.pdf\" target=\"_blank\">found here<\/a> (as a PDF).\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Skills_required_for_working_with_lab_technologies\">Skills required for working with lab technologies<\/span><\/h2>\n<p>While this subject could have arguably been discussed in the management section above, we needed to wait until the major elements were described before taking up this critical point. In particular, we had to address the idea behind \"scientific manufacturing.\"\n<\/p><p>Lab automation has an identity problem. Many people don\u2019t recognize it as a field. It appears to be a collection of products and technologies that people can use as needed. Emphasis has shifted from one technology to another depending on what is new, hot, or interesting, with conferences and papers discussing that technology until something else comes along. Robotics, LIMS, and neural networks have all had their periods of intense activity, and now the spotlight is on ELNs, integration, and paperless labs. \n<\/p><p>Lab automation needs to be addressed as a multi-disciplinary field, working in all scientific disciplines, by lab personnel, consultants, developers, and those in IT support groups. That means addressing three broad groups of people: scientists and technicians (i.e., the end users), LAEs (i.e., those designing and implementing systems for the end users), and the technology developers (Figure 13).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig13_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"8b4101f667922e103048c4950d80b1b9\"><img alt=\"Fig13 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/31\/Fig13_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 13.<\/b> Groups that need to be addressed when discussing laboratory automation<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Discussions concerning lab automation and the use of advanced technologies in lab work are usually done from the standpoint of the technologies themselves: what they are, what they do, benefits, etc. 
Missing from these conversations is an appreciation of the ability of those scientists and technicians\u2014the end users\u2014in the lab to use these tools, and how those tools will change the nature of laboratory work.\n<\/p><p>The application of analog electronic systems to laboratory work began in the early part of the twentieth century. For the most part, those systems made it easier for a scientist to make measurements. Recording spectrophotometers replaced wavelength-by-wavelength manual measurements, and process chromatographs automated sample taking, back-flush valves, attenuation changes, etc. They made it easier to collect measurements but did not change the analyst's job of data analysis. After all, analysts still had to look at each curve or chromatogram, make judgments, and apply their skills to making sense of the experiment. At this point, scientists were in charge of executing the science, while analog electronics made the science easier to deal with.\n<\/p><p>When processor-based systems were added to the lab\u2019s tool set, things moved in a different direction. The computers could then perform the data acquisition, display, and analysis. This left the science to be performed by a program, with the analyst able to adjust the behavior of the program by setting numerical parameters. This represents a major departure in the nature of laboratory work, from the scientist being completely responsible for the execution of lab procedures to allowing a computer-based system to take over control of all or a portion of the work.\n<\/p><p>For many labs, the use of increasingly sophisticated technologies is just a way for individuals to do tasks better, faster, and at lower cost. In others, the technology takes over a task and frees the analyst to do other things. We\u2019ve been in a slow transition from people driving work to technology driving work. As the use of lab technologies moves further into automation, the practice of laboratory work is going to change substantially until we get to the point where scientific manufacturing and production is the dominant function: automation applied from sample acceptance to the final test or experimental result.\n<\/p>\n<h4><span class=\"mw-headline\" id=\"Development_of_manufacturing_and_production_stages\">Development of manufacturing and production stages<\/span><\/h4>\n<p>We can get a sense of how work will change by looking at the development of manufacturing and production systems, where we see a transition from manual methods to fully automated production, in the end driven by the same issues as laboratories: a need or desire for high productivity, lower costs, and improved and consistent product results. The major difference is that in labs, the \u201cproduct\u201d isn\u2019t a widget; it is information and data. In product manufacturing, we also see a reduction in manpower as a goal; in labs, it is a shift from manual effort to using that same energy to understand data and improve the science. One significant benefit from a shift to automation is that lab staff will be able to redesign lab processes\u2014the science behind lab work\u2014to function better in an automated environment; most of the processes and equipment in place today assume manual labor and are not well-designed for automated control.\n<\/p><p>That said, we\u2019re going to\u2014by analogy\u2014look at a set of manufacturing and production stages associated with wood working, e.g., making the components of door frames or moldings. 
The trim you see on wood windows and the framing on cabinet doors are examples of shaping wood. When we look at the stages of manufacturing or production, we have to consider five attributes of that production effort: relative production cost, productivity, required skills, product quality, and flexibility.\n<\/p><p>Initially, hand planes were used to remove wood and form trim components. Multiple passes were needed, each deepening the grooves and shaping the wood. It took practice, skill, and patience to do the work well and avoid waste. This was the domain of the craftsman, the skilled woodworker, who represents the first stage of production evolution. In terms of our evaluation, Figure 14 shows the characteristics of this first stage (we\u2019ll fill in the table as we go along).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig14_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"32f18c842e8c6b956f0e74e3ea567dea\"><img alt=\"Fig14 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/a7\/Fig14_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 14.<\/b> The first stage of an evolving manufacturing and production process, using wood working as an example<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The next stage sees the craftsman turn to hand-operated, electric motor-driven routers to shape the wood. Instead of multiple passes with a hand plane, a motor-driven set of bits removes material, leaving the finished product. A variety of cutting bits allows the craftsman to create different shapes. For example, a matching set of bits may be specially designed for the router to create the interlocking rails and stiles that frame cabinet doors.\n<\/p><p>Figure 15 shows the impact of this equipment on this second evolutionary stage of production. While still available to the home woodworker, the use of this equipment implies that the craftsman is going to be producing the shaped wood in quantity, so we are moving beyond the level of production found in a cottage industry to the seeds of a growth industry. The cost of good-quality routers and bits is modest, though using them effectively requires an investment in skill development. 
Used well (and safely), they can produce good products; they can also produce a lot of waste if the individual isn\u2019t properly schooled.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig15_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"d670312d77f716548d166e75d2f44e48\"><img alt=\"Fig15 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/d\/d0\/Fig15_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 15.<\/b> The second stage of an evolving manufacturing and production process, using wood working as an example<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>The third stage sees automated elements work their way into the wood working mix, with the multi-headed, numerically controlled router. Instead of one hand-operated router-bit combination, there are four router-bit assemblies directed by a computer program to follow a specific path such that highly repeatable and precise cuts can be made. Of course, with the addition of multiple heads and software, the complexity of the production process increases.\n<\/p><p>Figure 16 shows the impact of this equipment on the third evolutionary stage of production. We\u2019ve moved from the casual woodworker to a full production operation. The cost of the equipment is significant, and the operators\u2014both the program designer and the machine operator\u2014have to be skilled in the use of the equipment to reduce mistakes and waste material. The \u201cLess Manual Skill\u201d notation under \u201cSkills\u201d indicates a transition point where we have moved almost entirely from the craftsman or woodworker to the skilled operator, requiring different skill sets than previous production methods. One of the side effects of higher production rates is that if you make a design error, you can make out-of-specification product rather quickly.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig16_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"9359949ed96a1d377c2643396f9816f4\"><img alt=\"Fig16 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/8\/89\/Fig16_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 16.<\/b> The third stage of an evolving manufacturing and production process, using wood working as an example<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>From there, it\u2019s not a far jump to the final stage: a fully automated assembly line. 
Its inclusion completes the chart that we\u2019ve been developing (Figure 17).\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig17_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"2258110fd4c3a7891ce7d135fcf047c0\"><img alt=\"Fig17 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/17\/Fig17_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 17.<\/b> The fourth and final stage of an evolving manufacturing and production process, using wood working as an example<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>When we take the information from Figure 17, we can summarize the entire process as follows (Figure 18):\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig18_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"5f6a501e1a46ec06a833f4bee952c9df\"><img alt=\"Fig18 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/3f\/Fig18_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 18.<\/b> All stages of a wood working manufacturing and production process, and their attributes, summarized<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>When we look at that summary, we can\u2019t help but notice that it translates fairly well when we replace \u201cwood working\u201d with \u201claboratory work,\u201d moving from the entirely manual processes of the skilled technician or scientist to the fully automated scientific manufacturing and production process of the skilled operator or system supervisor. We visualize that in Figure 19. 
(The image in the last column of Figure 19 is of an automated extraction system from Fluid Management Systems, Watertown, MA.)\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig19_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"7fa1515f8f2bf884a103bf8f5eecf9d9\"><img alt=\"Fig19 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/39\/Fig19_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 19.<\/b> All four stages of an evolving scientific manufacturing and production process, this time using lab work<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<h4><span class=\"mw-headline\" id=\"What_does_this_all_mean_for_laboratory_workers\">What does this all mean for laboratory workers?<\/span><\/h4>\n<p>The skills needed today and in the future to work in a modern lab have changed significantly, and they will continue to change as automation takes hold. We\u2019ve seen these changes occur already. Clinical chemistry, high-throughput screening (HTS), and automated bioassays using microplates are some examples.\n<\/p><p>The discussion here mirrors the development in the woodworking example. We\u2019ll look at the changes in skills, using chromatography as an example. The following material is applicable to any laboratory environment, be it electronics, forensics, physical properties testing, etc. Chromatography is being used because of its wide application in lab work.\n<\/p><p><b>Stage 1: The analyst using manual methods<\/b>\n<\/p><p>By \u201cmanual methods\u201d we mean 100% manual work, including having the chromatographic detector output (an analog signal) recorded on a standard strip chart recorder and the pen trace analyzed by the hand, eye, and skill of the analyst. The process begins with the analyst finding out what samples need to be processed, finding those samples, and preparing them for injection into the instrument. The instrument has to be set up for the analysis, which includes installing the proper columns, adjusting flow rates, confirming component temperatures, and making sure that the instrument is working properly.\n<\/p><p>As each injection is done, the starting point for the data\u2014a pen trace of the analog signal\u2014is noted on the strip chart. This process is repeated for each sample and reference standard. Depending on the type of analysis, each sample\u2019s data may take up to several feet of chart paper. The recording is a continuous trace and is a faithful representation of the detector output, without any filtering aside from attenuator adjustments (range selections to keep the signal recording within the limits of the paper; some peaks may peg the pen at the top of the chart because of their size, in which case that data is lost) and electrical or mechanical noise reduction.\n<\/p><p>When all the injections have been completed, the analyst begins the evaluation of each sample\u2019s data. 
That includes:\n<\/p>\n<ul><li>inspecting the chromatogram for anomalies, including peaks that weren\u2019t expected (possible contaminants), separations that aren\u2019t as clear as they should be, noise, baseline drifts, and any other unusual conditions that would indicate a problem with that sample or the entire run of samples;<\/li>\n<li>taking the measurements needed for qualitative and\/or quantitative analysis;<\/li>\n<li>developing the calibration curves; and<\/li>\n<li>making the calculations needed to complete the analysis.<\/li><\/ul>\n<p>The analysis would also include evaluating any in-process control samples and addressing issues with problem samples. The final step would be the administrative work, including checks of the work by another analyst, reporting results, and updating work request lists.\n<\/p><p><b>Stage 2: Point automation, applied to specific tasks<\/b>\n<\/p><p>The next major development in this evolution is the introduction of automated injectors. Instead of the analyst spending the day injecting samples into the instrument\u2019s injection port, a piece of equipment does it, ushering in the first expansion of the analyst\u2019s skill set. (In the previous stage, the analysis time is generally too short to allow the analyst to do anything else, so the analyst\u2019s day is spent injecting and waiting.) Granted, this doesn\u2019t represent a major change, but it is a change. It requires the analyst to confirm that the samples and standards are in the right order, that the right number of injections per sample is set, and that duplicate vials are put in the tray (duplicate injections are used to confirm that problems don\u2019t occur during the injection process). The analyst has to ensure that the auto-injector is connected to the strip chart recorder so that the injection timing mark is made automatically.\n<\/p><p>This simple change of adding an auto-injector to the process has some impact on the analyst\u2019s skill set. The same holds true for the use of automatic integrators and sample preparation systems; in addition to understanding the science, the lab work takes on the added dimension of managing systems, trading labor for systems supervision with a gain in productivity.\n<\/p><p><b>Stage 3: Sequential-step automation<\/b>\n<\/p><p>The addition of data systems to the sample analysis process train (from worklist generation to sample preparation to instrument analysis sequence) further reduces the amount of work the analyst does in sample analysis and changes the nature of the work performed. Starting with simple integrators and moving on to advanced computer systems, the data system works with the auto-injector to start the instrument analysis phase of work, acquire the signal from the detector, convert it to a digital form, process that data (e.g., peak detection, area and peak height calculations, retention time), and perform the calculations needed for quantitative analysis. This yields less manual work and higher productivity.\n<\/p><p>While systems like this are common in labs today, there are problems, which we\u2019ll address shortly.\n<\/p><p><b>Stage 4: Production-level automation (scientific manufacturing and production)<\/b>\n<\/p><p>Depending on what is necessary for sample preparation, it may not be much of a stretch to have automated sample prep, injection, data collection, analysis, and reporting (with automated updates into a LIMS) performed in a small footprint with equipment available today. 
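<\/p><p>To make the shape of such a production train concrete, the following is a minimal sketch in Python. It is illustrative only: every function is a stand-in for a real instrument, data system, or LIMS interface, and all of the names are invented for this example rather than taken from any vendor\u2019s API. The design point is that each step hands its output to the next, with explicit checks that can route a sample to human review rather than letting questionable results flow through unattended.\n<\/p>\n<pre>\n# Minimal, self-contained sketch of a sequential sample-processing train.\n# Every function is a stand-in for a real instrument or informatics interface.\n\ndef prepare_sample(sample_id):\n    # Stand-in for automated dissolution, extraction, and mixing hardware.\n    return {'sample_id': sample_id, 'ok': True}\n\ndef inject_and_acquire(prep):\n    # Stand-in for the auto-injector plus detector data acquisition.\n    return {'sample_id': prep['sample_id'], 'trace': [0.1, 0.4, 2.6, 0.5, 0.1]}\n\ndef integrate(run):\n    # Stand-in for the data system's peak detection and quantitation.\n    return {'sample_id': run['sample_id'], 'peak_count': 1, 'area': 3.2}\n\ndef screen(result):\n    # Stand-in for a screening template; a fuller version appears below.\n    return result['peak_count'] == 1\n\ndef report_to_lims(result):\n    # Stand-in for programmed entry of results into the LIMS.\n    print('LIMS update:', result)\n\ndef process_sample(sample_id):\n    prep = prepare_sample(sample_id)\n    if not prep['ok']:\n        return ('needs review', 'preparation failed')\n    result = integrate(inject_and_acquire(prep))\n    if not screen(result):\n        return ('needs review', 'failed screening template')\n    report_to_lims(result)\n    return ('done', None)\n\nprocess_sample('S-0001')\n<\/pre>\n<p>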
One vendor has an auto-injection system that is capable of dissolving material, performing extractions, mixing, and barcode reading, as well as other functions. Connect that to a chromatograph and data station, with programmed connection to a LIMS, and you have the basis of an automated sample preparation\u2013chromatographic system. However, there are some issues that have to be noted and addressed.\n<\/p><p>The goal with such a system has to be high-volume, automated sample processing with the generation of high-quality data. The intent is to reduce the amount of work the analyst has to perform, ideally so that the system can run unattended. Note that \u201chigh-quality\u201d in this case means having a high level of confidence in the results. There is more to that than the ability to do calculations for quantitative analysis or having a validated system; you have to validate the right system.\n<\/p><p>Computer systems used in chromatographic analysis can be tuned to control how peaks are detected, what is rejected as noise, and how separations are identified so that baselines can be properly drawn and peak areas allocated. The analyst needs to evaluate the impact of these parameters for each analytical procedure and make sure that the proper settings are used.\n<\/p><p>As previously noted regarding manual processes, the inspection of the chromatogram for elements that don\u2019t match the expectations for a well-characterized sample (the number of peaks that should be there, the type of separations between peaks, etc.) is vital. This screening has to be applied to every sample, whether by human eye or automated system, the latter giving lower labor costs and higher productivity. If we are going to build fully automated production systems, we have to be able to describe a screening template that is applied to every sample, either confirming that the sample fits the standard criteria or flagging it for further evaluation. That \u201cfurther evaluation\u201d may be frustrated if the data system doesn\u2019t retain sufficient data for the purpose, requiring the sample to be rerun.\n<\/p><p>The data acquired by the computer system undergoes several levels of filtering and processing before you see the final results. The sampling algorithms don\u2019t give us the level of detail found in the analog chromatogram. The visual display of a chromatogram is going to be limited by the data collected and the resolution of the display. The stair-stepping of a digitized chromatogram is an example of that, while an analog chromatogram is a smooth line. Small details and anomalies that could be evidence of contamination may be missed because of the processing.\n<\/p><p>Additionally, the entire process needs to be continually monitored and evaluated to make sure that it is working properly. This is process-level statistical quality control, and it includes options for evolutionary operations updates: small changes to improve process performance. Standard samples have to be run to test the system. If screening templates are used, samples designed to exhibit problems have to be run to trigger those templates to make sure that problem samples are detected. These in-process checks have to include every phase of the process and be able to evaluate all potential risks. 
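<\/p><p>The two kinds of checks just described (a screening template applied to every sample, and control checks on standard-sample results) can be made concrete with a short sketch in Python. All of the criteria, limits, and field names below are invented for this example; a real template would be derived from the characterization of the specific method.\n<\/p>\n<pre>\n# Illustrative only: a screening template plus a standard-sample control\n# check. All criteria and limits are invented for the example.\n\nTEMPLATE = {\n    'expected_peaks': 3,         # a well-characterized sample shows three peaks\n    'min_resolution': 1.5,       # poorest acceptable separation between peaks\n    'max_baseline_drift': 0.02,  # acceptable baseline drift over the run\n}\n\ndef screen(result):\n    # Return the reasons a sample needs further evaluation (empty list = passes).\n    problems = []\n    if result['peak_count'] != TEMPLATE['expected_peaks']:\n        problems.append('unexpected peak count (possible contaminant)')\n    if TEMPLATE['min_resolution'] > result['resolution']:\n        problems.append('separation poorer than the template allows')\n    if result['baseline_drift'] > TEMPLATE['max_baseline_drift']:\n        problems.append('excessive baseline drift')\n    return problems\n\ndef standard_in_control(measured, mean, sigma):\n    # Shewhart-style check: flag standard-sample results more than three\n    # sigma from the mean established during process characterization.\n    return 3 * sigma >= abs(measured - mean)\n\nprint(screen({'peak_count': 4, 'resolution': 1.8, 'baseline_drift': 0.01}))\nprint(standard_in_control(measured=99.1, mean=100.0, sigma=0.4))\n<\/pre>\n<p>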
The intent is to build confidence in the data by building confidence in the system used to produce it.\n<\/p><p>The goals of higher productivity can be achieved for sample processing, but in doing so, the work of the analyst will change from carrying out a procedure to managing and continuously tuning a system that is doing the work for them. The science has to be well-understood, as does the implementation of that science. As we shift to more automation, and use analytical techniques to monitor in-process production systems, more emphasis has to be placed on characterizing all the assumptions and possible failure points of the technique and building in tests to ensure that the data being used to evaluate and control a production process is sound.\n<\/p><p>During the development of the process described above, the analyst has to first determine the sequence of steps, demonstrate that they work as expected, and prepare the documentation needed to support the process and guide someone through its execution. This includes the details of how the autosampler programming is developed, stored, and maintained. The same holds true for the data system parameters, screening templates, and processing routines. This is process engineering. Then, when the system is being used, the analyst has to ensure that the proper programming is loaded into each component and that it is set up and ready for use.\n<\/p><p>This is a very simple example of what is possible and an illustration of the changes that could occur in the work of lab professionals. The performance of a system such as that described could be doubled by implementing a second process stream without significantly increasing the analyst\u2019s workload.\n<\/p><p>The key element is the skill level of those working in the lab (Figure 20): are they capable of meeting the challenge? Much of what has been described is process engineering, and there are people in manufacturing and production who are good at that. We need to combine process engineering skills with the science. Developing automation teams is one approach, but no matter how you address the idea, those working in labs need an additional layer of skills beyond what they have been exposed to in formal education settings.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig20_Liscouski_ElementsLabTechMan14.png\" class=\"image wiki-link\" data-key=\"dcc454cd623d34bb559ffd051a910fe0\"><img alt=\"Fig20 Liscouski ElementsLabTechMan14.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/c\/ca\/Fig20_Liscouski_ElementsLabTechMan14.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 20.<\/b> The skills and education required at the various stages of the lab-based scientific manufacturing and production process<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>Of the sciences, clinical chemistry has moved the furthest into the advanced application of laboratory automation (Figure 20), transforming lab work in the process and moving lab staff from executing procedures manually to managing systems. 
The following quote from Diana Mass of Associated Laboratory Consultants<sup id=\"rdp-ebb-cite_ref-21\" class=\"reference\"><a href=\"#cite_note-21\">[j]<\/a><\/sup> helps delineate the difference in lab work styles:\n<\/p>\n<blockquote><p>What I have observed is that automation has replaced some of the routine repetitive steps in performing analysis; however, the individual has to be even more knowledgeable to troubleshoot sophisticated instrumentation. Even if the equipment is simple to operate, the person has to know how to evaluate quality control results and have a quality assurance system in place to ensure quality test information.<\/p><\/blockquote>\n<p>And here\u2019s a quote from Martha Casassa, Laboratory Director of Braintree Rehabilitation Hospital<sup id=\"rdp-ebb-cite_ref-22\" class=\"reference\"><a href=\"#cite_note-22\">[k]<\/a><\/sup>, who has experience in both clinical and non-clinical labs:\n<\/p>\n<blockquote><p>Having a background both clinical (as a medical technologist) and non-clinical (chemistry major and managing a non-clinical research lab), I can attest to the training\/education being different. I was much more prepared coming through the clinical experience to handle automation and computers and the subsequent troubleshooting and repair necessary as well as the maintenance and upkeep of the systems. During my non-clinical training the emphasis was not so much on theory as practical application in manual methods. I learned assays on some automated equipment, but that education was more to obtain an end-product than to really understand the system and how it produced that product. On the clinical side I learned not only how to get the end-product, but the way it was produced so I could identify issues sooner, produce quality results, and more effectively troubleshoot.<\/p><\/blockquote>\n<p>The bottom line is simple: if people are going to be effective working in modern labs, they need to understand both the science and the way science is done using the tools of lab automation. We have a long way to go before we get there. A joint survey by the ILA and Lab Managers Association<sup id=\"rdp-ebb-cite_ref-23\" class=\"reference\"><a href=\"#cite_note-23\">[l]<\/a><\/sup> yielded the following:\n<\/p>\n<ul><li>Lab automation is essential for most labs, but not all.<\/li>\n<li>The skill set necessary to work with automation has changed significantly.<\/li>\n<li>Entry-level scientists are generally capable of working with the hardware and software.<\/li>\n<li>Entry-level technicians often are not.<\/li>\n<li>In general, applicants for positions are not well qualified to work with automation.<\/li><\/ul>\n<p>How well-educated in the use of automated systems are those working in the lab? The following text was used earlier: \u201cWhen processor-based systems were added to the lab\u2019s tool set, things moved in a different direction. The computers could then perform the data acquisition, display, and analysis. This left the science to be performed by a program, with the analyst able to adjust the behavior of the program by setting numerical parameters.\u201d\n<\/p><p>Let\u2019s follow that thought down a different path. In chromatography, those numerical parameters were used to determine the start and end of a peak, and how baselines were drawn. In some cases, an inappropriate set of parameters would reduce a data set to junk. Do people understand what the parameters are in an IDS and how to use them? 
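<\/p><p>The leverage those parameters have is easy to demonstrate. In the sketch below (Python, operating on a made-up trace), a single threshold value, a deliberately simplified stand-in for the slope and threshold settings of a real IDS, determines whether a small peak is reported or silently discarded.\n<\/p>\n<pre>\n# A made-up detector trace with one large and one small peak over noise.\ntrace = [0.0, 0.1, 0.2, 1.8, 4.0, 1.9, 0.2, 0.3, 0.9, 1.1, 0.8, 0.2, 0.1]\n\ndef count_peaks(signal, threshold):\n    # Count local maxima that rise above the detection threshold; a crude\n    # stand-in for an IDS's peak-detection parameters.\n    peaks = 0\n    for i in range(1, len(signal) - 1):\n        if signal[i] > threshold and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]:\n            peaks += 1\n    return peaks\n\nprint(count_peaks(trace, threshold=0.5))  # 2: the small peak is reported\nprint(count_peaks(trace, threshold=2.0))  # 1: the small peak is discarded\n<\/pre>\n<p>The same data gives two different answers, and nothing in the output flags the difference; this is why the analyst has to understand, document, and justify the settings in use.\n<\/p><p>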
Many industrial labs and schools have people using an IDS with no understanding of what is happening to their data. Others such as Hinshaw and Stevenson <i>et al.<\/i> have commented on this phenomenon in the past:\n<\/p>\n<blockquote><p>Chromatographers go to great lengths to prepare, inject, and separate their samples, but they sometimes do not pay as much attention to the next step: peak detection and measurement ... Despite a lot of exposure to computerized data handling, however, many practicing chromatographers do not have a good idea of how a stored chromatogram file\u2014a set of data points arrayed in time\u2014gets translated into a set of peaks with quantitative attributes such as area, height, and amount.<sup id=\"rdp-ebb-cite_ref-HinshawFinding14_24-0\" class=\"reference\"><a href=\"#cite_note-HinshawFinding14-24\">[12]<\/a><\/sup><\/p><\/blockquote>\n<blockquote><p>At this point, I noticed that the discussion tipped from an academic recitation of technical needs and possible solutions to a session driven primarily by frustrations. Even today, the instruments are often more sophisticated than the average user, whether he\/she is a technician, graduate student, scientist, or principal investigator using <a href=\"https:\/\/www.limswiki.org\/index.php\/Chromatography\" title=\"Chromatography\" class=\"wiki-link\" data-key=\"2615535d1f14c6cffdfad7285999ad9d\">chromatography<\/a> as part of the project. Who is responsible for generating good data? Can the designs be improved to increase data integrity?<sup id=\"rdp-ebb-cite_ref-StevensonTheFuture11_25-0\" class=\"reference\"><a href=\"#cite_note-StevensonTheFuture11-25\">[13]<\/a><\/sup><\/p><\/blockquote>\n<p>In yet another example, at the European Lab Automation 2012 Meeting, one liquid-handling equipment vendor gave a presentation on how improper calibration and use of liquid handling systems would yield poor data.<sup id=\"rdp-ebb-cite_ref-BradshawTheImpo12_26-0\" class=\"reference\"><a href=\"#cite_note-BradshawTheImpo12-26\">[14]<\/a><\/sup> Discussions with other vendors supported that point, with poor training cited as the cause.\n<\/p><p>One of the problems that has developed is \u201cpush-button science,\u201d or to be more precise, the execution of tasks by pushing a button: put the sample in, push a button to get the measurements, and get a printout. Measurements are being made, and those using the equipment don\u2019t understand what is being done, or if it is being done properly, exemplifying a \u201ctrust the vendor\u201d or \u201cthe vendor is the expert\u201d mindset. Those points run against the concepts of validation described by the FDA, the <a href=\"https:\/\/www.limswiki.org\/index.php\/International_Organization_for_Standardization\" title=\"International Organization for Standardization\" class=\"wiki-link\" data-key=\"116defc5d89c8a55f5b7c1be0790b442\">International Organization for Standardization<\/a> (ISO), and others. People need to approach equipment with a healthy skepticism, not assuming that it is working but being able to demonstrate that it is working. Trusting the system is based on experience and proof that the system works as expected, not assumptions.\n<\/p><p>Education on the use of current lab systems is sorely needed in today\u2019s working environment. 
What are the needs going to be in the future as we move from people using equipment in workstations to the integrated workflows of scientific manufacturing and production processes?\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Closing\">Closing<\/span><\/h2>\n<p>The direction of laboratory automation has changed significantly over time, and with it so has the work associated with a lab. However, many have looked at that automation as just a means to an end, when in reality laboratory automation is a process, like almost any other in a manufacturing and production setting. As a process, we need to look at the elements that can be used to make that process work effectively: management issues, implementation issues, experimental methods, lab-specific technologies, broader information technologies, and systems integration issues. Addressing those early, as part of a planning process, better ensures successful implementation of automation in the lab setting. \n<\/p><p>But there\u2019s more to it than that: the personnel doing the work must have the skills and education to fully realize lab automation\u2019s benefits. In fact, it\u2019s not just the scientists and technicians doing the lab work, but also those designing and implementing the technology in the lab, as well as the vendors actually making the components. Everyone must be communicating ideas, making suggestions, and working together in order to make automation work well in the lab. But the addition of that technology does not by itself mean a more efficient and cost-effective lab; that requires knowledge of production and of how the underlying technology actually works. When implemented well, automation may come with lower costs and better productivity, but it also comes with a demand for higher-level, specialized skills. In other words, lab personnel must understand both the science and the way science is done when using the tools of laboratory automation.\n<\/p><p>We need lab personnel who are competent users of modern lab instrumentation systems, robotics, and informatics (LIMS, ELNs, SDMS, CDS, etc.), the tools used to do lab work. They should understand the science behind the techniques and how the systems are used in their execution. If a computer system is used to do data capture and processing, that understanding includes:\n<\/p>\n<ul><li>how the data capture is accomplished,<\/li>\n<li>how it is processed,<\/li>\n<li>what the control parameters are and how the current set in use was arrived at (not \u201cthat\u2019s what came from the vendor\u201d), and<\/li>\n<li>how to detect and correct problems.<\/li><\/ul>\n<p>They should also understand statistical process control so that the behavior of automated systems can be monitored, with potential problems detected and corrected before they become significant. Rather than simply being part of the execution of a procedure, they manage the process. We\u2019re talking about LAEs.\n<\/p><p>Those LAEs must be capable of planning, implementing, and supporting lab systems, as well as developing products and technologies for labs.<sup id=\"rdp-ebb-cite_ref-27\" class=\"reference\"><a href=\"#cite_note-27\">[m]<\/a><\/sup> This type of knowledge isn\u2019t limited to the people developing systems; those supporting them also require those capabilities. 
After all, the implementation of laboratory systems is an engineering program and should be approached in the same manner as any other systems development activity.\n<\/p><p>The use of advanced technology products isn\u2019t going to improve until we have people who are fully competent to work with them, understand their limitations, and drive vendors to create better products.\n<\/p><p><br \/>\n<\/p>\n<h2><span id=\"rdp-ebb-Abbreviations,_acronyms,_and_initialisms\"><\/span><span class=\"mw-headline\" id=\"Abbreviations.2C_acronyms.2C_and_initialisms\">Abbreviations, acronyms, and initialisms<\/span><\/h2>\n<p><b>AIA<\/b>: Analytical Instrument Association\n<\/p><p><b>CDS<\/b>: Chromatography data system\n<\/p><p><b>ELN<\/b>: Electronic laboratory notebook\n<\/p><p><b>FDA<\/b>: Food and Drug Administration\n<\/p><p><b>GC-MS<\/b>: Gas chromatography\u2013mass spectrometry\n<\/p><p><b>HTS<\/b>: High-throughput screening\n<\/p><p><b>IDS<\/b>: Instrument data system\n<\/p><p><b>ISO<\/b>: International Organization for Standardization\n<\/p><p><b>K\/I\/D<\/b>: Knowledge, information, and data\n<\/p><p><b>LAE<\/b>: Laboratory automation engineering (or engineer)\n<\/p><p><b>LIMS<\/b>: Laboratory information management system\n<\/p><p><b>LIS<\/b>: Laboratory information system\n<\/p><p><b>SDMS<\/b>: Scientific data management system\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Footnotes\">Footnotes<\/span><\/h2>\n<div class=\"reflist\" style=\"list-style-type: lower-alpha;\">\n<div class=\"mw-references-wrap mw-references-columns\"><ol class=\"references\">\n<li id=\"cite_note-1\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-1\">\u2191<\/a><\/span> <span class=\"reference-text\">The Spectronic 20 was developed by Bausch & Lomb in 1954 and is currently owned and marketed in updated versions by Thermo Fisher Scientific.<\/span>\n<\/li>\n<li id=\"cite_note-4\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-4\">\u2191<\/a><\/span> <span class=\"reference-text\">Mr. Zenie often introduced robotics courses at Zymark with that statement.<\/span>\n<\/li>\n<li id=\"cite_note-5\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-5\">\u2191<\/a><\/span> <span class=\"reference-text\">See <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.astm.org\/DATABASE.CART\/HISTORICAL\/E1578-06.htm\" target=\"_blank\">ASTM E1578 - 06<\/a>.<\/span>\n<\/li>\n<li id=\"cite_note-6\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-6\">\u2191<\/a><\/span> <span class=\"reference-text\">This is not a recommended method.<\/span>\n<\/li>\n<li id=\"cite_note-7\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-7\">\u2191<\/a><\/span> <span class=\"reference-text\">This topic will be given light treatment in this work, but will be covered in more detail elsewhere.<\/span>\n<\/li>\n<li id=\"cite_note-8\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-8\">\u2191<\/a><\/span> <span class=\"reference-text\">LabVIEW is a product of <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.ni.com\/en-us\/shop\/labview.html\" target=\"_blank\">National Instruments<\/a>; similar products are available from other vendors.<\/span>\n<\/li>\n<li id=\"cite_note-9\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-9\">\u2191<\/a><\/span> <span class=\"reference-text\">Having a data system connected to, or in control of, a process is not the same as full automation. For example, there are automated Karl Fischer systems (for water analysis), but they only address titration activities and not sample preparation. 
A vendor can only take things so far in commercial products unless labs describe a larger role for automation, one that will vary by application. The point is that we need a formalized description of that larger context.<\/span>\n<\/li>\n<li id=\"cite_note-10\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-10\">\u2191<\/a><\/span> <span class=\"reference-text\">For more information about lab-specific technologies, please refer to <i><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.researchgate.net\/publication\/275351757_Computerized_Systems_in_the_Modern_Laboratory_A_Practical_Guide\" target=\"_blank\">Computerized Systems in the Modern Laboratory: A Practical Guide<\/a><\/i>.<\/span>\n<\/li>\n<li id=\"cite_note-18\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-18\">\u2191<\/a><\/span> <span class=\"reference-text\">You can also find the expanded version of the paper <i>Are You a Laboratory Automation Engineer?<\/i> <a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Are_You_a_Laboratory_Automation_Engineer%3F\" title=\"LII:Are You a Laboratory Automation Engineer?\" class=\"wiki-link\" data-key=\"67df76407d0807e78d9cde61bb3f82c9\">here<\/a>.<\/span>\n<\/li>\n<li id=\"cite_note-21\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-21\">\u2191<\/a><\/span> <span class=\"reference-text\">Formerly Professor and Director of Clinical Laboratory Sciences Program, Arizona State University. Quote from private communications, used with permission.<\/span>\n<\/li>\n<li id=\"cite_note-22\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-22\">\u2191<\/a><\/span> <span class=\"reference-text\">Also from private communications, used with permission.<\/span>\n<\/li>\n<li id=\"cite_note-23\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-23\">\u2191<\/a><\/span> <span class=\"reference-text\">Originally available on <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/web.archive.org\/web\/20150215144434\/http:\/\/www.institutelabauto.org\/research\/index.htm\" target=\"_blank\">the ILA site<\/a>, but no longer accessible.<\/span>\n<\/li>\n<li id=\"cite_note-27\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-27\">\u2191<\/a><\/span> <span class=\"reference-text\">See <i><a href=\"https:\/\/www.limswiki.org\/index.php\/LII:Are_You_a_Laboratory_Automation_Engineer%3F\" title=\"LII:Are You a Laboratory Automation Engineer?\" class=\"wiki-link\" data-key=\"67df76407d0807e78d9cde61bb3f82c9\">Are You a Laboratory Automation Engineer?<\/a><\/i> for more details.<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<h2><span class=\"mw-headline\" id=\"About_the_author\">About the author<\/span><\/h2>\n<p>Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation\/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. 
His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"References\">References<\/span><\/h2>\n<div class=\"reflist references-column-width\" style=\"-moz-column-width: 30em; -webkit-column-width: 30em; column-width: 30em; list-style-type: decimal;\">\n<div class=\"mw-references-wrap mw-references-columns\"><ol class=\"references\">\n<li id=\"cite_note-WPLabAuto14-2\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-WPLabAuto14_2-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"nofollow\" class=\"external text wiki-link\" href=\"https:\/\/en.wikipedia.org\/w\/index.php?title=Laboratory_automation&oldid=636846823\" data-key=\"21eebd9a56e747538e53fcb619ea4428\">\"Laboratory automation\"<\/a>. <i>Wikipedia<\/i>. Archived from <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"https:\/\/en.wikipedia.org\/wiki\/Laboratory_automation\" data-key=\"035711b891e560e9b9819d6b0644849a\">the original<\/a> on 05 December 2014<span class=\"printonly\">. <a rel=\"nofollow\" class=\"external free wiki-link\" href=\"https:\/\/en.wikipedia.org\/w\/index.php?title=Laboratory_automation&oldid=636846823\" data-key=\"21eebd9a56e747538e53fcb619ea4428\">https:\/\/en.wikipedia.org\/w\/index.php?title=Laboratory_automation&oldid=636846823<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Laboratory+automation&rft.atitle=Wikipedia&rft_id=https%3A%2F%2Fen.wikipedia.org%2Fw%2Findex.php%3Ftitle%3DLaboratory_automation%26oldid%3D636846823&rfr_id=info:sid\/en.wikipedia.org:LII:Elements_of_Laboratory_Technology_Management\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-McDowallAMatrix93-3\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-McDowallAMatrix93_3-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">McDowall, R.D. (1993). \"A Matrix for the Development of a Strategic Laboratory Information Management System\". <i>Analytical Chemistry<\/i> <b>65<\/b> (20): 896A\u2013901A. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1021%2Fac00068a725\" target=\"_blank\">10.1021\/ac00068a725<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=A+Matrix+for+the+Development+of+a+Strategic+Laboratory+Information+Management+System&rft.jtitle=Analytical+Chemistry&rft.aulast=McDowall%2C+R.D.&rft.au=McDowall%2C+R.D.&rft.date=1993&rft.volume=65&rft.issue=20&rft.pages=896A%E2%80%93901A&rft_id=info:doi\/10.1021%2Fac00068a725&rfr_id=info:sid\/en.wikipedia.org:LII:Elements_of_Laboratory_Technology_Management\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-TriggTheIntegArch14-11\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-TriggTheIntegArch14_11-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Trigg, J. (2014). 
<a rel=\"external_link\" class=\"external text\" href=\"https:\/\/web.archive.org\/web\/20141218063422\/http:\/\/www.theintegratedlab.com\/\" target=\"_blank\">\"The Integrated Lab\"<\/a>. Archived from <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.theintegratedlab.com\/\" target=\"_blank\">the original<\/a> on 18 December 2014<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/web.archive.org\/web\/20141218063422\/http:\/\/www.theintegratedlab.com\/\" target=\"_blank\">https:\/\/web.archive.org\/web\/20141218063422\/http:\/\/www.theintegratedlab.com\/<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=The+Integrated+Lab&rft.atitle=&rft.aulast=Trigg%2C+J.&rft.au=Trigg%2C+J.&rft.date=2014&rft_id=https%3A%2F%2Fweb.archive.org%2Fweb%2F20141218063422%2Fhttp%3A%2F%2Fwww.theintegratedlab.com%2F&rfr_id=info:sid\/en.wikipedia.org:LII:Elements_of_Laboratory_Technology_Management\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-LiscouskiInteg12-12\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-LiscouskiInteg12_12-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Liscouski, J. (2012). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.labmanager.com\/computing-and-automation\/integrating-systems-17595\" target=\"_blank\">\"Integrating Systems\"<\/a>. <i>Lab Manager<\/i> <b>7<\/b> (1): 26\u20139<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.labmanager.com\/computing-and-automation\/integrating-systems-17595\" target=\"_blank\">https:\/\/www.labmanager.com\/computing-and-automation\/integrating-systems-17595<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Integrating+Systems&rft.jtitle=Lab+Manager&rft.aulast=Liscouski%2C+J.&rft.au=Liscouski%2C+J.&rft.date=2012&rft.volume=7&rft.issue=1&rft.pages=26%E2%80%939&rft_id=https%3A%2F%2Fwww.labmanager.com%2Fcomputing-and-automation%2Fintegrating-systems-17595&rfr_id=info:sid\/en.wikipedia.org:LII:Elements_of_Laboratory_Technology_Management\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-OIGTheFed06-13\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-OIGTheFed06_13-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Office of the Inspector General (June 2006). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/oig.justice.gov\/reports\/FBI\/a0633\/index.htm\" target=\"_blank\">\"Executive Summary\"<\/a>. <i>The Federal Bureau of Investigation's Implementation of the Laboratory Information Management System - Audit Report 06-33<\/i><span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"https:\/\/oig.justice.gov\/reports\/FBI\/a0633\/index.htm\" target=\"_blank\">https:\/\/oig.justice.gov\/reports\/FBI\/a0633\/index.htm<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Executive+Summary&rft.atitle=The+Federal+Bureau+of+Investigation%27s+Implementation+of+the+Laboratory+Information+Management+System+-+Audit+Report+06-33&rft.aulast=Office+of+the+Inspector+General&rft.au=Office+of+the+Inspector+General&rft.date=June+2006&rft_id=https%3A%2F%2Foig.justice.gov%2Freports%2FFBI%2Fa0633%2Findex.htm&rfr_id=info:sid\/en.wikipedia.org:LII:Elements_of_Laboratory_Technology_Management\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-Hamilton2006-14\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-Hamilton2006_14-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Hamilton, S.D. (2007). \"2006 ALA Survey on Industrial Laboratory Automation\". <i>SLAS Technology<\/i> <b>12<\/b> (4): 239\u201346. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1016%2Fj.jala.2007.04.003\" target=\"_blank\">10.1016\/j.jala.2007.04.003<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=2006+ALA+Survey+on+Industrial+Laboratory+Automation&rft.jtitle=SLAS+Technology&rft.aulast=Hamilton%2C+S.D.&rft.au=Hamilton%2C+S.D.&rft.date=2007&rft.volume=12&rft.issue=4&rft.pages=239%E2%80%9346&rft_id=info:doi\/10.1016%2Fj.jala.2007.04.003&rfr_id=info:sid\/en.wikipedia.org:LII:Elements_of_Laboratory_Technology_Management\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-SCWChoosing02-15\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-SCWChoosing02_15-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Scientific Computing World (September\/October 2002). <a rel=\"external_link\" class=\"external text\" href=\"https:\/\/web.archive.org\/web\/20071019002613\/http:\/\/www.scientific-computing.com\/features\/feature.php?feature_id=132\" target=\"_blank\">\"Choosing the right client\"<\/a>. <i>Scientific Computing World<\/i>. Archived from <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.scientific-computing.com\/features\/feature.php?feature_id=132\" target=\"_blank\">the original<\/a> on 19 October 2007<span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"https:\/\/web.archive.org\/web\/20071019002613\/http:\/\/www.scientific-computing.com\/features\/feature.php?feature_id=132\" target=\"_blank\">https:\/\/web.archive.org\/web\/20071019002613\/http:\/\/www.scientific-computing.com\/features\/feature.php?feature_id=132<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Choosing+the+right+client&rft.atitle=Scientific+Computing+World&rft.aulast=Scientific+Computing+World&rft.au=Scientific+Computing+World&rft.date=September%2FOctober+2002&rft_id=https%3A%2F%2Fweb.archive.org%2Fweb%2F20071019002613%2Fhttp%3A%2F%2Fwww.scientific-computing.com%2Ffeatures%2Ffeature.php%3Ffeature_id%3D132&rfr_id=info:sid\/en.wikipedia.org:LII:Elements_of_Laboratory_Technology_Management\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-SGITheCHAOS95-16\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-SGITheCHAOS95_16-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.csus.edu\/indiv\/r\/rengstorffj\/obe152-spring02\/articles\/standishchaos.pdf\" target=\"_blank\">\"The CHAOS Report\"<\/a> (PDF). Standish Group International, Inc. 1995<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.csus.edu\/indiv\/r\/rengstorffj\/obe152-spring02\/articles\/standishchaos.pdf\" target=\"_blank\">https:\/\/www.csus.edu\/indiv\/r\/rengstorffj\/obe152-spring02\/articles\/standishchaos.pdf<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=The+CHAOS+Report&rft.atitle=&rft.date=1995&rft.pub=Standish+Group+International%2C+Inc&rft_id=https%3A%2F%2Fwww.csus.edu%2Findiv%2Fr%2Frengstorffj%2Fobe152-spring02%2Farticles%2Fstandishchaos.pdf&rfr_id=info:sid\/en.wikipedia.org:LII:Elements_of_Laboratory_Technology_Management\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-LiscouskiAreYou06-17\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-LiscouskiAreYou06_17-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Liscouski, J.G. (2006). \"Are You a Laboratory Automation Engineer?\". <i>SLAS Technology<\/i> <b>11<\/b> (3): 157-162. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1016%2Fj.jala.2006.04.002\" target=\"_blank\">10.1016\/j.jala.2006.04.002<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Are+You+a+Laboratory+Automation+Engineer%3F&rft.jtitle=SLAS+Technology&rft.aulast=Liscouski%2C+J.G.&rft.au=Liscouski%2C+J.G.&rft.date=2006&rft.volume=11&rft.issue=3&rft.pages=157-162&rft_id=info:doi\/10.1016%2Fj.jala.2006.04.002&rfr_id=info:sid\/en.wikipedia.org:LII:Elements_of_Laboratory_Technology_Management\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-ANDISF03Arch-19\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-ANDISF03Arch_19-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\">Julian, R.K. 
","2000eea677bcd5ee1fcecdab32743800_images":["https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/9\/96\/Fig1_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/57\/Fig2_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/6\/6d\/Fig3_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/f\/fa\/Fig4_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/8\/81\/Fig5_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/36\/Fig6_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/8\/81\/Fig7_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/e\/e5\/Fig8_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/9\/9d\/Fig9_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/f\/f4\/Fig10_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/8\/84\/Fig11_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/d\/d5\/Fig12_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/31\/Fig13_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/a\/a7\/Fig14_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/d\/d0\/Fig15_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/8\/89\/Fig16_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/17\/Fig17_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/3f\/Fig18_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/3\/39\/Fig19_Liscouski_ElementsLabTechMan14.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/c\/ca\/Fig20_Liscouski_ElementsLabTechMan14.png"],"2000eea677bcd5ee1fcecdab32743800_timestamp":1637252747,"67df76407d0807e78d9cde61bb3f82c9_type":"article","67df76407d0807e78d9cde61bb3f82c9_title":"Are You a Laboratory Automation Engineer?
(Liscouski 2006)","67df76407d0807e78d9cde61bb3f82c9_url":"https:\/\/www.limswiki.org\/index.php\/LII:Are_You_a_Laboratory_Automation_Engineer%3F","67df76407d0807e78d9cde61bb3f82c9_plaintext":"\n\nLII:Are You a Laboratory Automation Engineer?From LIMSWikiJump to navigationJump to searchTitle: Are You a Laboratory Automation Engineer?\nAuthor for citation: Joe Liscouski, with editorial modifications by Shawn Douglas\nLicense for content: Creative Commons Attribution 4.0 International\nPublication date: 2006\n\nContents \n\n1 Summary \n2 Introduction \n3 Developments in laboratory automation \n4 Benefits of formalizing laboratory automation engineering \n5 Sub-disciplines within LAE \n6 Skills required for LAE \n\n6.1 Project and change management \n6.2 Strong communications skills \n6.3 Strong understanding of regulatory issues \n6.4 General systems theory and beyond \n\n\n7 What comes next? \n8 Conclusion \n9 Abbreviations, acronyms, and initialisms \n10 Footnotes \n11 Acknowledgements \n12 About the author \n13 References \n\n\n\nSummary \nThe technology and techniques that are used in automating laboratory work have been documented for over 40 years. Work done under the heading of \u201claboratory automation\u201d has progressed from pioneering work in data acquisition and instrument control, to instrumentation with integral computing and communications. If the field is to move forward, we need to organize the practices and education of those applying automation and computing technologies to laboratory work. This note[a] is an opening to a dialog in the definition and development of the field of \"laboratory automation engineering\u201d (LAE).\nIn this article the following points will be considered:\n\nDefinition of laboratory automation engineering (LAE)\nLaboratory automation as an engineering discipline\nDevelopments in laboratory automation\nBenefits of formalizing laboratory automation engineering\nSub-disciplines within LAE\nSkills required for LAE\nWhat comes next?\nIntroduction \nThere are many of us working on the use of automation technologies in scientific disciplines such as chemistry, high-throughput screening, physics, quality control, electronics, oceanography, and materials testing. Yet, given the diversity of applications, we have many characteristics in common. There are enough common traits and skills that we can declare a field of laboratory automation engineering (LAE). The establishment of this field of work benefits both the practitioner and those in need of their skills. In addition, we may finally come to the full realization of the rewards we expect from automation systems.\nLaboratory automation engineering can be defined as the application of automation, information, computing, and networking technologies to solve problems in, or improve the practice of, a scientific discipline. 
As such, it fits the accepted definition of \"engineering\": the discipline dealing with the art or science of applying scientific knowledge to practical problems.\nThe common characteristics of LAE are:\n\nIt is practiced in facilities where the focus is scientific in nature, including research and development, testing, and quality control activities.\nThe end product of the work consists of information (e.g., initial and processed test data) and\/or systems for managing and improving people\u2019s ability to work with information (intermediate steps result in prepared samples or specimens).\nThe automation practice draws upon robotics, mechanical and electrical engineering, data and information handling and management, information and networking technology, and the science that the automation is addressing.\nIt requires practitioners to be knowledgeable in both automation technologies and the discipline in which they are being applied.\nLAE differs from other forms of automation engineering found in manufacturing and product production in several ways. First, modern manufacturing and production facilities are designed with automation as an integral part of the process. Laboratory automation is a replacement process: manual techniques, once they mature, are replaced with automated systems. Economics, and the need to improve process uniformity, drive the replacement. If a testing or quality control lab could be designed initially as a fully automated facility, it would be difficult to distinguish the automated facility from any other manufacturing facility aside from the point that its \u201cproduct\u201d is data rather than tangible materials. The second difference is the point just made: the results are ultimately data.[b] LAE would result in systems and methods (e.g., high-throughput screening and combinatorial methods) that enable science to be pursued in ways that could not be done without automation.\n\nDevelopments in laboratory automation \nLaboratory automation was initiated by scientists with a need to do things differently. Whether it was to improve the efficiency of the work, to substitute intelligent control and acquisition systems for manual labor, or because it was the best or only way to implement an experimental system, automation eventually moved into the laboratory. Those doing the work had to be as well-versed in computing hardware and software as they were in their science. The choice of hardware and software, as well as the development of the system, was in their hands.[1][2]\nThe first generation of laboratory automation systems was concerned with interfacing instruments, sensors, and controllers to computers. Then the control of experiments through robotics and direct interfacing with computer systems arrived. A variety of experimental control software packages became available, as did laboratory information management systems (LIMS) and, in the last few years, electronic laboratory notebooks (ELNs).[1][2]\nIf you have visited a conference in any discipline with a discussion on automation, it would appear that automation has seriously taken hold and most things are already automated or soon will be. One problem with this product array is the ability to integrate products from different vendors. Individual procedures are being addressed, but the movement of data and information within the lab and your ability to work with it is not as robust as it could be.
Some vendors answer this need with comprehensive ELN systems in which their products are the hub of a lab's operations. Data is collected from instruments with local computing capability (e.g., the simple data translation of balances or similar devices) into their systems for storage, cataloging, analysis, etc. The movement is primarily one-way, but this is only the current generation of systems. The capabilities of today\u2019s products generate new possibilities requiring upgrades to, or replacement of, these products.\nThe early automation models were driven by limitations in computing power, data storage, and communications. There was one experimental setup, with one system to provide automation.[1][2] The next generation of laboratory automation has to replace previous models, and develop implementations that can be gracefully upgraded rather than replaced. This will require increased modularity and less dependence upon local-to-the-instrument systems for automation. Achieving it requires engineering systems, not just building products.\nToday, data communication is readily available thanks to the development of communication standards. Storage is cheap and there is an abundance of computing capability available. Data analysis does not have to be done at the instrument. Instead, the data file with instrument parameters can be sent to a central system for cataloging, storage, analysis, and reporting, while still making the raw data readily available for further analysis as new data treatment techniques are developed. (A minimal sketch of this pattern appears at the end of this section.) The dots in the automation map will be connected with bi-directional communications. The field needs people who are trained to do the work and to take advantage of the increased capabilities available to them.\nBeyond the issues of experimental work lie the requirements of corporate information architectures. Laboratory systems are part of the corporate network, whether it is a for-profit company, a university, or a research organization. Laboratory computing does not stop at the door. The purpose of a laboratory is to create knowledge, information, and data. The purpose of laboratory automation is to assist in that development, making the process more successful, effective, and cost-efficient. This can be accomplished by engineering systems that have security built in to protect them, while at the same time giving scientists access to their data wherever they need it. It also means designing systems that permit a graceful evolution and continual improvement without disrupting the lab\u2019s workings.\nToo often we find corporate information technology (IT) departments at odds with those using computers in laboratory settings. On one side you hear \u201cIT doesn\u2019t understand our needs,\u201d and on the other \u201cwe\u2019re responsible for all corporate computing.\u201d Both sides have a point. IT departments are responsible for corporate computing, and labs are part of the corporation. However, IT departments usually do not understand the computing needs of scientists; they do not speak the same language. IT-speak and science-speak do not readily connect, and the issue will worsen as more instruments become network-ready. Specialized software packages used in laboratory work raise support issues for IT departments. IT would be concerned about who was going to support these programs and the effect they may have on the security of the corporation\u2019s information network. LAEs have the ability to bridge that gap.
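To make that central-system pattern concrete, the sketch below (Python, standard library only) shows one way a raw instrument data file and its run parameters might be packaged and submitted to a central cataloging service. The catalog_url endpoint and the metadata field names are hypothetical placeholders for illustration, not a real vendor API.

import hashlib
import json
import urllib.request
from pathlib import Path

def submit_run(data_file, instrument_id, method,
               catalog_url="https://lims.example.org/api/runs"):  # hypothetical endpoint
    """Bundle a raw instrument data file with its run parameters and
    submit both to a central system for cataloging and storage."""
    raw = Path(data_file).read_bytes()
    payload = {
        "instrument_id": instrument_id,             # which instrument produced the data
        "method": method,                           # analytical method / parameter set used
        "filename": Path(data_file).name,
        "sha256": hashlib.sha256(raw).hexdigest(),  # integrity check on the raw file
        "raw_data": raw.hex(),                      # raw bytes kept intact for later reanalysis
    }
    req = urllib.request.Request(
        catalog_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:       # central system stores and catalogs the run
        return json.load(resp)                      # e.g., the run identifier it assigned

The design point is that the raw data always travels with the parameters that produced it, so the central system can support re-analysis later without going back to the instrument.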
Benefits of formalizing laboratory automation engineering \nPeople have been successful for several decades doing lab automation work, so why do we need a change in the way it is done? While there certainly have been successful projects, there have also been projects that have ranged from less-than-complete successes to outright cancellations. In the end, the potential for laboratory automation to assist or even revolutionize laboratory work is too great to have projects be hit-or-miss successes. Even initially apparent successes may lead to long-term problems when the choice of software platform leads to an eventual dead-end, or the lab finds itself locked into a product set that does not give it the flexibility needed as lab requirements change over time.\nTo date, laboratory automation has moved along a developmental path that has been traveled in other areas of technology applications. In commercial chemical synthesis, aerospace, computers, civil engineering, and other fields, progress has been made first by individuals, followed by small groups practicing within a particular field. As the need to apply these technologies on a broader scale grew, training became formalized, as did methods of design and development. The methodology shifted from an \u201cart\u201d to actual engineering. The end result has been the ability to create buildings, bridges, aircraft, etc. on a scale that independent groups working on their own could not hope to achieve.\nIn the fields noted in the previous paragraph, and in others, \u201cengineering\u201d a project means that trained people have analyzed, understood, and defined it. It also means that plans have been laid out to meet the project goals, and risks have been identified and evaluated. It is a deliberate, confident movement from requirements to completion. This is what is needed to advance the application of automation technologies to scientific and laboratory problems.
Hence, a movement to an engineering discipline for laboratory automation is necessary.\nThe benefits[c] of formalizing LAE to the individual practitioner, laboratory management, and the field are outlined below.\nBenefits to the individual:\n\nprovides a thorough and systematic education in a broadly practiced field;\ninforms of past activities regarding what has worked and what has failed;\nprovides evidence of relevant knowledge and training through a degree or certificate program; and\nlends a sense of identity and community with others on the same career path.\nBenefits to laboratory management:\n\nprovides a basis for evaluating people\u2019s credentials and ability to work on laboratory automation projects;\nprovides a basis for employee evaluation;\nlimits the need for on-the-job training, reducing the impact on budgets;\nleads to faster project implementation with a higher expectation of success; and\nlends expertise to the design of laboratories using automation as part of their initial blueprint, rather than using it to replace existing manual procedures, while reducing the cost and improving the efficiency of a laboratory\u2019s operations.\nBenefits to the field of laboratory automation:\n\ncreates a foundation of documented knowledge that LAEs can use for learning and to improve the effectiveness of the profession;\nencourages a community of people that can drive the organized development of systems and technologies that will provide advancements to the practice of science, while creating re-usable resources instead of re-inventing systems from scratch on similar projects;\nprovides a basis for research into new technologies that can significantly improve scientists\u2019 ability to do their work, encouraging a move from incremental advancement in automation systems to leaps in advancement; and\npromotes a community of like-minded individuals that can discuss, and where appropriate, develop positions on key issues in the field (e.g., the impact of regulatory requirements and standards development) and develop position papers and guidelines on those points as warranted.\nSub-disciplines within LAE \nLaboratory automation allows us to do science that would be both physically and economically prohibitive without it. As noted above, high-throughput screening and combinatorial methods depend on automation. In physics, the data rates would be impossible to handle.[3] In most industrial analytical chemistry laboratories, automation keeps budgets and testing schedules workable. As the discipline develops, we are going to become more dependent on automation systems for data access, data mining, and visualization.\nGiven the level of sophistication of existing systems, and the demands that will be placed on them in the future, there will be a need for specialization, even within the field of LAE. Sub-disciplines within LAE include:\n\nSample handling, experiments, and testing \u2013 Automation is applied to the movement, manipulation, and analysis of samples and objects of interest.
A common type of automation is robotics, which includes special-purpose devices such as autosamplers, as well as user-defined configurable systems.\nData acquisition and analysis \u2013 This includes any sensor-based data entry, with subsequent data evaluation.\nData, information, and knowledge management \u2013 This includes managing the access to and storage of those objects, as well as understanding the uses of LIMS and ELN.\nThese sub-disciplines are not mutually exclusive but have considerable overlap (Figure 1) and share underlying technologies such as computing (which includes programming), networking, and communications. In some applications it may be difficult to separate \u201cdata acquisition\u201d from experiment control; however, that separation could lead to insights in system design. The sub-disciplines also have distinctly different sets of skill requirements and, as you move from left to right in Figure 1, less involvement with the actual laboratory science.\n\nFigure 1. The three sub-disciplines within laboratory automation engineering\n\nA systems approach to laboratory automation is essential in the implementation of projects in modern laboratories; without it, we are going to continually repeat the \u201cislands of automation\u201d practices so common in the past. Project designers must look beyond the immediate needs and make provision for integration with other systems, in particular bi-directional communication with knowledge, information, and data management systems.\n\nSkills required for LAE \nIn the mid-1960s, \u201cprogramming\u201d instruments was a mechanical problem handled with needle-nosed pliers, screwdrivers, and a stopwatch, as you adjusted cams and micro-switches.[d] This process is more complicated today. Automation systems are built around computer components; therefore, a strong background in computer science and programming would be necessary to be successful. In addition, an engineering background along with management, regulatory, and strong organizational skills would be expected. An understanding of the basic science in which automation is being applied is also crucial. As noted below under communications, it is up to the LAE to understand the science and the scientists in order to convert their needs into a functional requirements document and then implement working systems.\nThose working in LAE should have the following basic skills in their backgrounds. These would be common across all sub-disciplines:\n\nproject and change management\nstrong communications skills\nstrong understanding of regulatory issues (e.g., FDA, ISO)\ngeneral systems theory and beyond\nprocess analysis and development\nProject and change management \nIn addition to the skills you might expect to be covered by project management, the topic of change management also deserves to be noted. Change management is included to emphasize their mutual importance. Until we get to the point where we can anticipate automation issues within a lab and build them into the lab as it is constructed, laboratory automation is going to be both a replacement for an existing process (a project) and a new set of processes imposed on those working in that lab (some change). Even when we get to a stage where pre-built automation systems exist, there will be both step-wise and radical changes in the applied technologies as new techniques are developed.\nThe LAE practitioner will have to deal with the normal issues of budgets, schedules, and documentation common to projects.
The need for thorough documentation cannot be stressed enough. Well-written documentation of the project goals, structure, milestones, the reasoning behind choices of materials, equipment, alternative ways of doing things, and final acceptance criteria is necessary to provide a basis for regulatory review. This will lead to the development of a supportable system that can be upgraded, and provide an objective reference for determining when the project is completed. As part of their practice, LAEs should adopt practices from other engineering disciplines, such as the development of a functional requirements specification (FRS), user requirements specification (URS), and design specifications prior to system development. The LAE should also be familiar with the concept of project life cycle management[4][5] found in a number of industries, paying particular attention to software life cycle management.[6]\nMore interestingly, the LAE will have to be able to respond to statements like \u201cI know we\u2019ve never done anything like this before but I still need to know how much it will cost and when it will be done\u201d and questions like \u201cok, maybe you don\u2019t know why it isn\u2019t working, but can you tell me when it will be fixed?\u201d The politics of managing changes and adding requirements to projects (with appropriate documentation, change orders, budget, and scheduling adjustments) are part of professional work.\nChange management also addresses \u201cpeople issues,\u201d and they can be among the more stressful. Moving from an existing process of doing work to an automated one means that people\u2019s jobs will change. For some it will be a minor adjustment to the way they do work, while for others it will be the possibility of their job dramatically changing (and change raises people\u2019s anxiety levels). Those working in LAE are going to have to be able to face that reality head-on because it will affect their ability to get work done. If those working in the lab believe that your project will cause them to lose their job, or have to make a significant change (particularly if it is viewed as a negative change), they may limit your ability to complete the project. The fact that this point deals with people does not mean that it cannot affect what may be viewed as a technology project, or that the issue can simply be left to lab managers to solve.\nPaying attention to \u201cpeople issues\u201d can:\n\npoint to deficiencies in project planning,\navoid delays in the final acceptance of the project, and\nhelp you uncover important factors in designing a system.\nRegarding the first point, one of the lessons learned in the installation of early LIMS was the need for both training and the availability of practice databases, so that those working with the system could familiarize themselves with a non-production database before using the live system. Having the project plan reviewed by those in the lab who will be affected by the system can lead to the discovery of issues that need to be addressed in training, scheduling, and potential conflicts in the lab\u2019s operations. By extension, if deficiencies are caught early, delays in final project acceptance may be avoided.\nAs for the final point, laboratory work contains elements of art as well as science.
People\u2019s familiarity with the samples they are processing can lead to undocumented shortcuts and adjustments to procedures, which can make the difference between your implementation of an automated process working or failing. In the late 1980s, Shoshana Zuboff documented the problems that occurred in the automation of pulp mills when the plant personnel\u2019s experience was ignored.[7] The parallel here is that if you ignore people\u2019s experience in doing the work you are about to automate, you run a significant risk of failing.\nProject and change management also has to provide planning for the stresses that develop during a project's implementation. As noted earlier, laboratory automation today is largely a replacement of existing manual procedures. LIMS are installed to facilitate the bookkeeping of service laboratories, improve the lab\u2019s ability to respond to questions about test results, and increase the efficiency of the lab\u2019s operations. Robotic sample preparation systems are introduced into a lab because the existing manual processes are no longer economical, or there is a need for better data precision or higher sample throughput.\nThese replacement activities are going to introduce stressors on the lab, which need to be taken into account in the planning process. First, there will be some stress on the lab workers and budget during the development of the system. That stress will come in the form of the cost of running the lab while the development is taking place, people\u2019s attitudes if they feel their jobs will be adversely affected, potential disruption of lab operations, and the problems associated with installing systems in labs that are usually short on space. Second, once the automation system has been developed, it will have to be evaluated against the existing process to show that it meets the design criteria. Once that has been successfully accomplished, a change-over process has to be designed to switch from one method of operation to another. Both of these activities will increase the workload in the lab since, for a time, both the automated system and the process it is replacing will run in parallel.\n\nStrong communications skills \nThe previous section mentioned managing people issues, and effective communication is part of that. Strong communication is also useful in making sure people understand what work is being done, what the benefits are, and what the costs are, as well as declaring any schedules, risks, etc. Those aspects, as well as the ramifications of any changes, have to be clearly understood. The lack of clear communications is often the basis of misdirection, delays, and frustration around projects. Communication is a two-way street. The LAE needs to be able to discuss the project, as well as listen to and understand the needs of those working in the lab.\n\nStrong understanding of regulatory issues \nNo matter what industry you work in, meeting regulatory requirements is a fact of life. Companies are under increasing pressure from regulatory agencies to have their internal practices conform to a standard. This is no longer a specific regulatory- or standards-based issue that is focused on manufacturing systems, but rather an organization-wide set of activities encompassing financial and accounting practices, human resources, etc.
Name a department and it will be affected by a regulation.\nWhile the initial reaction may be that this adds one more thing to do, for the most part what the regulations require is part of a good system design process. The requirements for the validation of a system, for example, call for the designer to show that the system works, that it is supportable and well-documented, and that it uses equipment suited to the designer\u2019s purpose and from reliable vendors. Tinkering in order to get a system together is no longer an accepted practice; it may work for prototyping, but not for production use. The point of any applicable regulations and standards is to ensure that any implemented system can stand up to long-term use, and that it fits a well-defined, documented need. In short, regulations and standards help ensure a system is well-engineered.\n\nGeneral systems theory and beyond \nFrank Zenie (one of the founders of Zymark Corporation[e]) usually made the statement that \u201cyou can only automate a process, you can\u2019t automate a thing\u201d as part of his introductory comments for Zymark\u2019s robotics courses. Recognizing that a process exists and that it can be automated is the initial step in defining a project. General systems theory[f] provides the tools for documenting a process, as well as the triggers and the state changes that occur as the process develops. It is particularly useful in working with inter-related systems.\nRecognizing that a process exists and can be well described is one part of the problem. Another is its ability to be automated. Labs and lab equipment are designed for people. Using that same equipment and tools with automated systems, including robotics, may require a significant amount of re-engineering, and bring into question the economics and perceived benefits of a project.\nProcess engineering should also include statistical process control, statistical quality control, and productivity measurements. Robotics systems are small-scale manufacturing systems even if the end result is data. As portions of laboratory work become automated and integrated, the systems will behave like manufacturing processes and should fall under the same types of control. Productivity measurements are also important. Automation systems are put in place to achieve a number of goals, including a focus on improving the efficiency of operations. Productivity measurements test the system\u2019s ability to meet those goals and, if successful, will lead to the development of additional projects.
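Since statistical process control is invoked above without an example, here is a minimal sketch (Python) of the core calculation as it might apply to a routine assay: derive X-bar control limits from baseline runs and flag any result that falls outside them. The values and names are illustrative only, not drawn from any specific lab or product.

def control_limits(baseline, sigma_multiplier=3.0):
    """Center line and lower/upper control limits from baseline results."""
    n = len(baseline)
    mean = sum(baseline) / n
    sd = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5  # sample std. dev.
    return mean, mean - sigma_multiplier * sd, mean + sigma_multiplier * sd

def out_of_control(results, baseline):
    """Return (index, value) pairs for results outside the control limits."""
    center, lcl, ucl = control_limits(baseline)
    return [(i, x) for i, x in enumerate(results) if not (lcl <= x <= ucl)]

# Illustrative daily standard-sample results (e.g., percent recovery)
baseline = [99.8, 100.1, 99.9, 100.3, 99.7, 100.0, 100.2, 99.9]
print(out_of_control([100.1, 99.6, 101.9], baseline))  # flags the 101.9 run

An out-of-limits run on a standard sample signals that the automated process, like any manufacturing process, has drifted and needs attention before its results are trusted.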
Much of what has been written so far could easily be recognized as part of an engineering or engineering management program, and that is exactly the point: laboratory automation is an engineering activity. The elements that differentiate it from other engineering activities include that:\n\nit is practiced in a laboratory or scientific environment, as noted earlier;\nautomation is an enabling technology opening the door to new scientific methodologies, including discovery-based science[g];\nautomation is usually a replacement for existing manual operations, which will require the LAE to be sensitive to change management issues;\nthe scope of activities can include materials handling (robotics), data acquisition, analysis, reporting, and integration with database systems; and\nLAEs need to be knowledgeable in automation technologies and their application, and have strong backgrounds in the science practiced in the lab.\nThis last point is significant; the LAE cannot expect the lab personnel to describe their requirements in engineering terms. Yes, lab personnel can explain what they would like to accomplish; however, it is the LAE\u2019s responsibility to determine how to do it and understand the ramifications in system design and implementation. The LAE functions in part as a translator, converting a set of needs of scientists into a project plan, and ultimately into a functioning system. In some respects this is similar to traditional software engineering.\nAll the work associated with the three sub-disciplines reflects the transition of working with \u201cthings\u201d and materials, making measurements, and creating a description of those items, and then managing and working with the resulting knowledge, information, and data. The lists found in Figure 2 are a summary of skills needed in addition to those noted above. Software engineering is repeated in the lists for additional emphasis.\n\nFigure 2. Summary of skills required for each of the three sub-disciplines within laboratory automation engineering\n\nWhat comes next? \nThere are two major tasks that need to be undertaken, both under the heading of education. First is the development of a curriculum for laboratory automation engineering at the university level. This will require support both from organizations like the Association for Laboratory Automation (ALA) and from industry. A university is not going to create a program, even one that can be assembled to a large extent from existing programs, unless it has some assurance that its graduates will find jobs. The availability of employment opportunities would be used to attract students, and the process would move on from there. One issue that needs to be addressed is how much laboratory science knowledge an LAE has to have in order to be effective. Another closely tied issue is whether we are talking about an undergraduate program with particular science specializations or a graduate program.\nUndergraduate and graduate science programs need to address the inclusion of automation in their courses. The point is not to teach science students engineering but to acquaint them with how the work of modern science is done. Just as computer literacy is part of high-school curricula, automation literacy should be part of the course of science studies. James Sterling\u2019s 2004 journal article on laboratory automation curriculum provides one example of what can be done.[8] The ALA can take a lead and develop materials (e.g., course outlines, source material, etc.)
to assist instructors who want to include this material in their programs.\nWhile a university could address a long-term need, there is also a need to provide a means for those already working in the field to augment their backgrounds. Expanding the short-course structure already put in place by the ALA could satisfy that need. Another possibility is the development of certification programs similar to those used in computer science, working in conjunction with short courses and the development of an \u201cInstitute for Laboratory Automation\u201d with an organized summer program.\nThe second task is the organization of a \u201cbody of knowledge\u201d about laboratory automation engineering that would include compilations of relevant texts, web sites, knowledge bases, etc., while encouraging those working in the field to contribute material to its development. The initial step would be the development of a framework to organize material and then to begin populating it. Once a framework is in place, publications like the Journal of the Association for Laboratory Automation could have authors key their papers to that structure. The details of the framework need input from others in the field and more depth of evaluation than this introductory piece can afford.\nIn the development of this framework (Figure 3), the following points should be considered. First, there are fundamental techniques, skills, and technologies in laboratory automation that cut across scientific disciplines pretty much intact, including analog data acquisition, fundamentals of robotics, interfacing techniques, project management, etc. While there may be application-specific caveats (e.g., \u201cConsiderations in analog interfacing in secure biohazard facilities\u201d), for the most part they are common elements and can be treated as a common block of knowledge. Second, once we get beyond basics and into applications of techniques in scientific disciplines, we may see the same basic outline structure duplicated with parallel sections. Robotics in chemistry may be similar to robotics in another discipline but with differences in equipment details and implementations. The outline will appear similar, with differences in the details. The framework should consider linkages between parallel outlines where similar needs generate similar systems development. In short, we should make it easy to recognize cross-fertilization between similar technologies in different disciplines. (A small sketch of one possible structure follows Figure 3.)\n\nFigure 3. Example of an organized and interconnected LAE knowledge framework
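To suggest one possible shape for such a framework, the sketch below (Python) models a common block of knowledge, parallel discipline-specific outlines, and explicit cross-links between parallel topics. Every topic name here is a hypothetical placeholder, not a proposed taxonomy.

# All names below are illustrative placeholders.
framework = {
    "common_core": ["analog data acquisition", "fundamentals of robotics",
                    "interfacing techniques", "project management"],
    "disciplines": {
        "chemistry": {"robotics": "sample preparation robots and autosamplers"},
        "biology": {"robotics": "plate handling and liquid handlers"},
    },
    # cross-links make parallel development in different disciplines visible
    "links": [("chemistry/robotics", "biology/robotics")],
}

def parallel_topics(framework, topic):
    """List every discipline that maintains its own outline for a topic."""
    return [d for d, outline in framework["disciplines"].items() if topic in outline]

print(parallel_topics(framework, "robotics"))  # ['chemistry', 'biology']

The cross-links are the point: they let an author keying a paper to the framework see where a similar system already exists in another discipline.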
Conclusion \nLaboratory automation is still in the early phases of its development. Some may say that significant progress has been made, but in comparison to the rate of development of automation and information technologies in other areas, our progress has been slow and incremental (the internet, for example, is a little more than a decade old).\nWhat can and must be done is getting the user community to embrace the establishment and development of a discipline focused on envisioning, creating, and improving the tools and techniques of laboratory automation. That in turn will allow us to realize the promise that proponents of automation have long held: giving people the opportunity to do better science. Laboratory automation needs to be driven by people who want to do good work and are trained to do it.\nAn LAE\u2019s employment will be driven by market demand. However, the skill sets should be transferable. Just as computer science professionals have flexibility in where their skills are applied, LAEs should enjoy the same ability to move from one scientific application to another. The major difference will be learning the underlying science.\n\nAbbreviations, acronyms, and initialisms \nELN: Electronic laboratory notebook\nFDA: Food and Drug Administration\nFRS: Functional requirements specification\nIT: Information technology\nISO: International Organization for Standardization\nLAE: Laboratory automation engineering (or engineer)\nLIMS: Laboratory information management system\nURS: User requirements specification\n\nFootnotes \n\n\n\u2191 A portion of this material was published as a guest editorial in the Journal of the Association for Laboratory Automation (now SLAS Technology), June 2006, volume 11, number 3, pages 157\u201362.\n\n\u2191 As a point of fact, and as noted later in this piece, the results of laboratory work are knowledge, information, and data. \u201cData\u201d as used here is a short-hand for all three. For a full discussion of this point from the author's point of view, please refer to Laboratory and Scientific Computing: A Strategic Approach, J. Liscouski, Wiley Interscience, 1995.\n\n\u2191 I\u2019d like to thank Mark Russo, executive editor of the JALA, for his comments and, in this section in particular, for his insights and the material he contributed.\n\n\u2191 For example, process chromatographs made by Mine Safety Appliances and Fisher Control used cams and micro-switches to control sampling and back-flush valves.\n\n\u2191 Zymark was acquired by Caliper Life Sciences in 2003.\n\n\u2191 See \"What Is Systems Theory?\" for an introduction. In particular, review the work of George Klir, including An Approach to General Systems Theory and Facets of Systems Science.\n\n\u2191 As suggested by a reviewer.\n\n\nAcknowledgements \nI\u2019d like to thank Mark Russo (Bristol-Myers Squibb) for his comments and support in the development of this article.\n\nAbout the author \nInitially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation\/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.\n\nReferences \n\n\n\u2191 1.0 1.1 1.2 Hallock, N. (2005). \"Creative Combustion: A History of the Association for Laboratory Automation\". SLAS Technology 10 (6): 423\u201331. doi:10.1016\/j.jala.2005.10.001.\n\n\u2191 2.0 2.1 2.2 Liscouski, J. (1985). \"Laboratory Automation\". JCOMP 25 (3): 288\u201392.\n\n\u2191 Matey, J.R. (1999). \"History of Laboratory Automation - Session BA01.05, Cent. Symposium: 20th Century Developments in Instrumentation & Measurements\". Centennial Meeting of the American Physical Society.
http:\/\/flux.aps.org\/meetings\/YR99\/CENT99\/abs\/S350.html#SBA01.001\n\n\u2191 \"Project Management Life Cycle\". Method123. https:\/\/www.method123.com\/project-lifecycle.php\n\n\u2191 \"Systems development life cycle\". Wikipedia. https:\/\/en.wikipedia.org\/wiki\/Systems_development_life_cycle\n\n\u2191 \"Lesson 2.2 Project Management\". Adobe Web Tech Curriculum. 2005. Archived from the original on 10 January 2006. https:\/\/web.archive.org\/web\/20060110072101\/http:\/\/www.adobe.com\/education\/webtech\/CS\/unit_planning1\/pm_home_id.htm\n\n\u2191 Zuboff, S. (1988). In the Age of the Smart Machine. Butterworth\u2013Heinemann. ISBN 0434924865.\n\n\u2191 Sterling, J.D. (2004). \"Laboratory Automation Curriculum at Keck Graduate Institute\". SLAS Technology 9 (5): 331\u201335. doi:10.1016\/j.jala.2004.07.005.\n\nSource: https:\/\/www.limswiki.org\/index.php\/LII:Are_You_a_Laboratory_Automation_Engineer%3F
Work done under the heading of \u201c<a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_automation\" title=\"Laboratory automation\" class=\"wiki-link\" data-key=\"0061880849aeaca05f8aa27ae171f331\">laboratory automation<\/a>\u201d has progressed from pioneering work in data acquisition and instrument control, to instrumentation with integral computing and communications. If the field is to move forward, we need to organize the practices and education of those applying automation and computing technologies to laboratory work. This note<sup id=\"rdp-ebb-cite_ref-1\" class=\"reference\"><a href=\"#cite_note-1\">[a]<\/a><\/sup> is an opening to a dialog in the definition and development of the field of \"laboratory automation engineering\u201d (LAE).\n<\/p><p>In this article the following points will be considered:\n<\/p>\n<ul><li>Definition of laboratory automation engineering (LAE)<\/li>\n<li>Laboratory automation as an engineering discipline<\/li>\n<li>Developments in laboratory automation<\/li>\n<li>Benefits of formalizing laboratory automation engineering<\/li>\n<li>Sub-disciplines within LAE<\/li>\n<li>Skills required for LAE<\/li>\n<li>What comes next?<\/li><\/ul>\n<h2><span class=\"mw-headline\" id=\"Introduction\">Introduction<\/span><\/h2>\n<p>There are many of us working on the use of automation technologies in scientific disciplines such as chemistry, high-throughput screening, physics, quality control, electronics, oceanography, and materials testing. Yet, given the diversity of applications, we have many characteristics in common. There are enough common traits and skills that we can declare a field of laboratory automation engineering (LAE). The establishment of this field of work benefits both the practitioner and those in need of their skills. In addition, we may finally come to the full realization of the rewards we expect from automation systems.\n<\/p><p>Laboratory automation engineering can be defined as the application of automation, information, computing, and networking technologies to solve problems in, or improve the practice of, a scientific discipline. 
As such, it fits the accepted definition of \"engineering\": the discipline dealing with the art or science of applying scientific knowledge to practical problems.\n<\/p><p>The common characteristics of LAE are:\n<\/p>\n<ul><li>It is practiced in facilities where the focus is scientific in nature, including research and development, testing, and quality control activities.<\/li>\n<li>The end product of the work consists of <a href=\"https:\/\/www.limswiki.org\/index.php\/Information\" title=\"Information\" class=\"wiki-link\" data-key=\"6300a14d9c2776dcca0999b5ed940e7d\">information<\/a> (e.g., initial and processed test data) and\/or systems for managing and improving people\u2019s ability to work with information (intermediate steps result in prepared <a href=\"https:\/\/www.limswiki.org\/index.php\/Sample_(material)\" title=\"Sample (material)\" class=\"wiki-link\" data-key=\"7f8cd41a077a88d02370c02a3ba3d9d6\">samples<\/a> or specimens).<\/li>\n<li>The automation practice draws upon robotics, mechanical and electrical engineering, data and <a href=\"https:\/\/www.limswiki.org\/index.php\/Information_management\" title=\"Information management\" class=\"wiki-link\" data-key=\"f8672d270c0750a858ed940158ca0a73\">information handling and management<\/a>, information and networking technology, and the science that the automation is addressing.<\/li>\n<li>It requires the practitioners to be knowledgeable in both automation technologies and the discipline in which they are being applied.<\/li><\/ul>\n<p>LAE differs from other forms of automation engineering found in manufacturing and product production in several ways. First, modern manufacturing and production facilities are designed with automation as an integral part of the process. Laboratory automation is a replacement process: manual techniques, once they mature, are replaced with automated systems. Economics, and the need to improve process uniformity, drive the replacement. If a testing or quality control lab could be designed initially as a fully automated facility, it would be difficult to distinguish the automated facility from any other manufacturing facility aside from the point that its \u201cproduct\u201d is data rather than tangible materials. The second difference is the point just made: the results are ultimately data.<sup id=\"rdp-ebb-cite_ref-2\" class=\"reference\"><a href=\"#cite_note-2\">[b]<\/a><\/sup> LAE would result in systems and methods (e.g., high-throughput screening and combinatorial methods) that enable science to be pursued in ways that could not be done without automation.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Developments_in_laboratory_automation\">Developments in laboratory automation<\/span><\/h2>\n<p>Laboratory automation was initiated by scientists with a need to do things differently. Whether it was to improve the efficiency of the work, to substitute intelligent control and acquisition systems for manual labor, or because it was the best or only way to implement an experimental system, automation eventually moved into the laboratory. Those doing the work had to be as well-versed in computing hardware and software as they were in their science. 
The choice of hardware and software, as well the development of the system, was in their hands.<sup id=\"rdp-ebb-cite_ref-HallockCreative05_3-0\" class=\"reference\"><a href=\"#cite_note-HallockCreative05-3\">[1]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-LiscouskiLab85_4-0\" class=\"reference\"><a href=\"#cite_note-LiscouskiLab85-4\">[2]<\/a><\/sup>\n<\/p><p>The first generation of laboratory automation systems was concerned with interfacing instrument, sensors, and controllers to computers. Then the control of experiments through robotics and direct interfacing with computer systems arrived. A variety of experimental control software packages became available, as did <a href=\"https:\/\/www.limswiki.org\/index.php\/Laboratory_information_management_system\" title=\"Laboratory information management system\" class=\"wiki-link\" data-key=\"8ff56a51d34c9b1806fcebdcde634d00\">laboratory information management systems<\/a> (LIMS) and, in the last few years, <a href=\"https:\/\/www.limswiki.org\/index.php\/Electronic_laboratory_notebook\" title=\"Electronic laboratory notebook\" class=\"wiki-link\" data-key=\"a9fbbd5e0807980106763fab31f1e72f\">electronic laboratory notebooks<\/a> (ELNs).<sup id=\"rdp-ebb-cite_ref-HallockCreative05_3-1\" class=\"reference\"><a href=\"#cite_note-HallockCreative05-3\">[1]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-LiscouskiLab85_4-1\" class=\"reference\"><a href=\"#cite_note-LiscouskiLab85-4\">[2]<\/a><\/sup>\n<\/p><p>If you have visited a conference in any discipline with a discussion on automation, it would appear that automation has seriously taken hold and most things are already automated or soon will be. One problem with this product array is the ability to integrate products from different vendors. Individual procedures are being addressed, but the movement of data and information within the lab and your ability to work with it is not as robust as it could be. Some vendors answer this need with comprehensive ELN systems in which their products are the hub of a lab's operations. Data is collected from instruments with local computing capability (e.g., the simple data translation of balances or similar devices) into their systems for storage, cataloging, analysis, etc. The movement is primarily one-way, but this is only the current generation of systems. The capabilities of today\u2019s products generate new possibilities requiring upgrades to, or replacement of, these products.\n<\/p><p>The early automation models were driven by limitations in computing power, data storage, and communications. There was one experimental setup, with one system to provide automation.<sup id=\"rdp-ebb-cite_ref-HallockCreative05_3-2\" class=\"reference\"><a href=\"#cite_note-HallockCreative05-3\">[1]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-LiscouskiLab85_4-2\" class=\"reference\"><a href=\"#cite_note-LiscouskiLab85-4\">[2]<\/a><\/sup> The next generation of laboratory automation has to replace previous models, and develop implementations that can be gracefully upgraded rather than replaced. This will require increased modularity and less dependence upon local-to-the-instrument systems for automation. Achieving it requires engineering systems, not just building products.\n<\/p><p>Today, data communication is readily available thanks to the development of communication standards. Storage is cheap and there is an abundance of computing capability available. 
<a href=\"https:\/\/www.limswiki.org\/index.php\/Data_analysis\" title=\"Data analysis\" class=\"wiki-link\" data-key=\"545c95e40ca67c9e63cd0a16042a5bd1\">Data analysis<\/a> does not have to be done at the instrument. Instead, the data file with instrument parameters can be sent to a central system for cataloging, storage, analysis, and reporting, while still making the raw data readily available for further analysis as new data treatment techniques are developed. The dots in the automation map will be connected with bi-directional communications. The field needs people who are trained to do the work and to take advantage of the increased capabilities available to them.\n<\/p><p>Beyond the issues of experimental work lay the requirements of corporate information architectures. Laboratory systems are part of the corporate network, whether it is a for-profit company, a university, or a research organization. Laboratory computing does not stop at the door. The purpose of a laboratory is to create knowledge, information, and data. The purpose of laboratory automation is to assist in that development, making the process more successful, effective, and cost-efficient. This can be accomplished by engineering systems that have security built-in to protect them, while at the same time giving scientists access to their data wherever they need it. It also means designing systems that permit a graceful evolution and continual improvement without disrupting the lab workings.\n<\/p><p>Too often we find corporate information technology (IT) departments at odds with those using computers in laboratory settings. On one side you hear \u201cIT doesn\u2019t understand our needs,\u201d and on the other \u201cwe\u2019re responsible for all corporate computing.\u201d Both sides have a point. IT departments are responsible for corporate computing, and labs are part of the corporation. However, IT departments usually do not understand the computing needs of scientists; they do not speak the same language. IT-speak and science-speak do not readily connect and the issue will worsen as more instruments become network-ready. Specialized software packages used in laboratory work raise support issues for IT departments. IT would be concerned about who was going to support these programs and the effect they may have on the security of the corporation\u2019s information network. LAE\u2019s have the ability to bridge that gap.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Benefits_of_formalizing_laboratory_automation_engineering\">Benefits of formalizing laboratory automation engineering<\/span><\/h2>\n<p>People have been successful for several decades doing lab automation work, so why do we need a change in the way it is done? While there certainly have been successful projects, there have also been projects that have ranged from less-than-complete successes, or even cancellations. In the end, the potential for laboratory automation to assist or even revolutionize laboratory work is too great to have projects be hit-or-miss successes. Even initially apparent successes may lead to long-term problems when the choice of software platform leads to an eventual dead-end, or the lab finds itself locked into a product set that does not give it the flexibility needed as lab requirements change over time.\n<\/p><p>To date, laboratory automation has moved along a developmental path that has been traveled in other areas of technology applications. 
In commercial chemical synthesis, aerospace, computers, civil engineering, and other fields, progress was made first by individuals, followed by small groups practicing within a particular field. As the need to apply these technologies on a broader scale grew, training became formalized, as did methods of design and development. The methodology shifted from an \u201cart\u201d to actual engineering. The end result has been the ability to create buildings, bridges, aircraft, etc. on a scale that independent groups working on their own could not hope to achieve.\n<\/p><p>In the fields noted in the previous paragraph, and in others, \u201cengineering\u201d a project means that trained people have analyzed, understood, and defined it. It also means that plans have been laid out to meet the project goals, and that risks have been identified and evaluated. It is a deliberate, confident movement from requirements to completion. This is what is needed to advance the application of automation technologies to scientific and laboratory problems. Hence, a movement to an engineering discipline for laboratory automation is necessary.\n<\/p><p>The benefits<sup id=\"rdp-ebb-cite_ref-5\" class=\"reference\"><a href=\"#cite_note-5\">[c]<\/a><\/sup> of formalizing LAE to the individual practitioner, laboratory management, and the field are outlined below.\n<\/p><p>Benefits to the individual:\n<\/p>\n<ul><li>provides a thorough and systematic education in a broadly practiced field;<\/li>\n<li>informs the practitioner of past activities, including what has worked and what has failed;<\/li>\n<li>provides evidence of relevant knowledge and training through a degree or certificate program; and<\/li>\n<li>lends a sense of identity and community with others on the same career path.<\/li><\/ul>\n<p>Benefits to laboratory management:\n<\/p>\n<ul><li>provides a basis for evaluating people\u2019s credentials and ability to work on laboratory automation projects;<\/li>\n<li>provides a basis for employee evaluation;<\/li>\n<li>limits the need for on-the-job training, reducing the impact on budgets;<\/li>\n<li>leads to faster project implementation with a higher expectation of success; and<\/li>\n<li>lends expertise to the design of laboratories that use automation as part of their initial blueprint, rather than using it to replace existing manual procedures, while reducing the cost and improving the efficiency of a laboratory\u2019s operations.<\/li><\/ul>\n<p>Benefits to the field of laboratory automation:\n<\/p>\n<ul><li>creates a foundation of documented knowledge that LAEs can use for learning and to improve the effectiveness of the profession;<\/li>\n<li>encourages a community of people that can drive the organized development of systems and technologies that will advance the practice of science, while creating reusable resources instead of re-inventing systems from scratch on similar projects;<\/li>\n<li>provides a basis for research into new technologies that can significantly improve scientists\u2019 ability to do their work, encouraging a move from incremental advancement in automation systems to leaps in advancement; and<\/li>\n<li>promotes a community of like-minded individuals that can discuss and, where appropriate, develop positions on key issues in the field (e.g., the impact of regulatory requirements and standards development) and produce position papers and guidelines on those points as warranted.<\/li><\/ul>\n<h2><span class=\"mw-headline\" id=\"Sub-disciplines_within_LAE\">Sub-disciplines within 
LAE<\/span><\/h2>\n<p>Laboratory automation has the ability to allow us to do science that would be both physically and economically prohibitive without it. As noted above, high-throughput screening and combinatorial methods depend on automation. In physics, the data rates involved would be impossible to handle manually.<sup id=\"rdp-ebb-cite_ref-MateyHist99_6-0\" class=\"reference\"><a href=\"#cite_note-MateyHist99-6\">[3]<\/a><\/sup> In most industrial analytical chemistry laboratories, automation keeps budgets and testing schedules workable. As the discipline develops, we are going to become more dependent on automation systems for data access, <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_mining\" title=\"Data mining\" class=\"wiki-link\" data-key=\"be09d3680fe1608addedf6f62692ee47\">data mining<\/a>, and <a href=\"https:\/\/www.limswiki.org\/index.php\/Data_visualization\" title=\"Data visualization\" class=\"wiki-link\" data-key=\"4a3b86cba74bc7bb7471aa3fc2fcccc3\">visualization<\/a>.\n<\/p><p>Given the level of sophistication of existing systems, and the demands that will be placed on them in the future, there will be a need for specialization, even within the field of LAE. Sub-disciplines within LAE include:\n<\/p>\n<ul><li>Sample handling, experiments, and testing \u2013 Automation is applied to the movement, manipulation, and analysis of samples and objects of interest. A common type of automation is robotics, which includes special-purpose devices such as autosamplers, as well as user-defined configurable systems.<\/li>\n<li>Data acquisition and analysis \u2013 This includes any sensor-based data entry, with subsequent data evaluation.<\/li>\n<li>Data, information, and knowledge management \u2013 This includes managing the access to and storage of those objects, as well as understanding the uses of LIMS and ELNs.<\/li><\/ul>\n<p>These sub-disciplines are not mutually exclusive but have considerable overlap (Figure 1) and share underlying technologies such as computing (which includes programming), networking, and communications. In some applications it may be difficult to separate \u201cdata acquisition\u201d from experiment control; however, that separation could lead to insights in system design. The sub-disciplines also have distinctly different sets of skill requirements and, as you move from left to right in Figure 1, less involvement with the actual laboratory science.\n<\/p><p><a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig1_Liscouski_AreYouLabAutoEng06.png\" class=\"image wiki-link\" data-key=\"9b6feb1a2c3b7f7dd731b471418c05d8\"><img alt=\"Fig1 Liscouski AreYouLabAutoEng06.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/4\/4f\/Fig1_Liscouski_AreYouLabAutoEng06.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 1.<\/b> The three sub-disciplines within laboratory automation engineering<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<p>A systems approach to laboratory automation is essential in the implementation of projects in modern laboratories; without it, we are going to continually repeat the \u201cislands of automation\u201d practices so common in the past. Project designers must look beyond the immediate needs and make provision for integration with other systems, in particular bi-directional communication with knowledge, information, and data management systems.\n<\/p>
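<p>The modularity argument can be made concrete with a small sketch. Assuming a common wrapper interface (the class and method names below are illustrative, not from any real product), devices from different vendors can feed one shared data pipeline instead of forming separate islands:<\/p>\n<pre>
# A hedged sketch of modular instrument integration: every device, regardless
# of vendor, is wrapped in the same small interface, so the data-handling side
# of the lab does not care which \"island\" produced a measurement.
from abc import ABC, abstractmethod

class Instrument(ABC):
    \"\"\"Contract each vendor-specific wrapper must satisfy.\"\"\"

    @abstractmethod
    def configure(self, method: dict) -> None:
        \"\"\"Push acquisition parameters down to the device.\"\"\"

    @abstractmethod
    def run(self) -> dict:
        \"\"\"Execute one measurement; return results with their parameters.\"\"\"

class VendorABalance(Instrument):
    \"\"\"Illustrative wrapper; a real one would read a serial or network port.\"\"\"

    def configure(self, method: dict) -> None:
        self.units = method.get(\"units\", \"g\")

    def run(self) -> dict:
        return {\"device\": \"vendor_a_balance\", \"value\": 1.0023, \"units\": self.units}

def run_worklist(devices: list, method: dict) -> list:
    \"\"\"One pipeline, many vendors: the integration point argued for above.\"\"\"
    results = []
    for dev in devices:
        dev.configure(method)
        results.append(dev.run())
    return results
<\/pre>\n<p>In a design like this, bi-directional communication amounts to the same wrapper accepting methods on the way down and returning results, with their parameters, on the way up.<\/p>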
<h2><span class=\"mw-headline\" id=\"Skills_required_for_LAE\">Skills required for LAE<\/span><\/h2>\n<p>In the mid-1960s, \u201cprogramming\u201d instruments was a mechanical problem handled with needle-nosed pliers, screwdrivers, and a stopwatch, as you adjusted cams and micro-switches.<sup id=\"rdp-ebb-cite_ref-7\" class=\"reference\"><a href=\"#cite_note-7\">[d]<\/a><\/sup> This process is more complicated today. Automation systems are built around computer components; therefore, a strong background in computer science and programming is necessary to be successful. In addition, an engineering background, along with management, regulatory, and strong organizational skills, would be expected. An understanding of the basic science to which automation is being applied is also crucial. As noted below under communications, it is up to the LAE to understand the science and the scientists in order to convert their needs into a functional requirements document and then implement working systems.\n<\/p><p>Those working in LAE should have the following basic skills in their backgrounds. These would be common across all sub-disciplines:\n<\/p>\n<ul><li>project and change management<\/li>\n<li>strong communications skills<\/li>\n<li>strong understanding of regulatory issues (e.g., <a href=\"https:\/\/www.limswiki.org\/index.php\/Food_and_Drug_Administration\" title=\"Food and Drug Administration\" class=\"wiki-link\" data-key=\"e2be8927071ac419c0929f7aa1ede7fe\">FDA<\/a>, <a href=\"https:\/\/www.limswiki.org\/index.php\/International_Organization_for_Standardization\" title=\"International Organization for Standardization\" class=\"wiki-link\" data-key=\"116defc5d89c8a55f5b7c1be0790b442\">ISO<\/a>)<\/li>\n<li>general systems theory and beyond<\/li>\n<li>process analysis and development<\/li><\/ul>\n<h3><span class=\"mw-headline\" id=\"Project_and_change_management\">Project and change management<\/span><\/h3>\n<p>In addition to the skills you might expect to be covered by project management, the topic of change management also deserves to be noted; it is included here to emphasize that the two are equally important. Until we get to the point where we can anticipate automation issues within a lab and build them into the lab as it is constructed, laboratory automation is going to be both a replacement for an existing process (a project) and a new set of processes imposed on those working in that lab (a change). Even when we get to a stage where pre-built automation systems exist, there will be both step-wise and radical changes in the applied technologies as new techniques are developed.\n<\/p><p>The LAE practitioner will have to deal with the normal issues of budgets, schedules, and documentation common to projects. The need for thorough documentation cannot be stressed enough. Well-written documentation of the project goals, structure, milestones, the reasoning behind choices of materials and equipment, alternative ways of doing things, and final acceptance criteria is necessary to provide a basis for regulatory review. This will lead to the development of a supportable system that can be upgraded, and it will provide an objective reference for determining when the project is completed. 
As part of their practice, LAEs should adopt practices from other engineering disciplines, such as the development of a functional requirements specification (FRS), user requirements specification (URS), and design specifications prior to system development. The LAE should also be familiar with the concept of project life cycle management<sup id=\"rdp-ebb-cite_ref-M123Project_8-0\" class=\"reference\"><a href=\"#cite_note-M123Project-8\">[4]<\/a><\/sup><sup id=\"rdp-ebb-cite_ref-WPSystems_9-0\" class=\"reference\"><a href=\"#cite_note-WPSystems-9\">[5]<\/a><\/sup> found in a number of industries, paying particular attention to software life cycle management.<sup id=\"rdp-ebb-cite_ref-AdobeLesson2.2_10-0\" class=\"reference\"><a href=\"#cite_note-AdobeLesson2.2-10\">[6]<\/a><\/sup>\n<\/p><p>More interestingly, the LAE will have to be able to respond to statements like \u201cI know we\u2019ve never done anything like this before, but I still need to know how much it will cost and when it will be done\u201d and questions like \u201cok, maybe you don\u2019t know why it isn\u2019t working, but can you tell me when it will be fixed?\u201d The politics of managing changes and adding requirements to projects (with appropriate documentation, change orders, and budget and schedule adjustments) are part of professional work.\n<\/p><p>Change management also addresses \u201cpeople issues,\u201d and they can be among the more stressful. Moving from an existing process of doing work to an automated one means that people\u2019s jobs will change. For some it will be a minor adjustment to the way they do work, while for others there is the possibility of their job changing dramatically (and change raises people\u2019s anxiety levels). Those working in LAE are going to have to face that reality head-on because it will affect their ability to get work done. If those working in the lab believe that your project will cause them to lose their job, or to make a significant change (particularly if it is viewed as a negative change), they may limit your ability to complete the project. The fact that this point deals with people does not mean that it cannot affect what may be viewed as a technology project, or that the issue can be dismissed as a problem for lab managers alone to solve.\n<\/p><p>Paying attention to \u201cpeople issues\u201d can:\n<\/p>\n<ul><li>point to deficiencies in project planning,<\/li>\n<li>avoid delays in the final acceptance of the project, and<\/li>\n<li>help you uncover important factors in designing a system.<\/li><\/ul>\n<p>Regarding the first point, one of the lessons learned in the installation of early LIMS was the need for both training and the availability of practice databases, so that those working with the system could familiarize themselves with a non-production database before using the live system. Having the project plan reviewed by those in the lab who will be affected by the system can lead to the discovery of issues that need to be addressed in training, scheduling, and potential conflicts in the lab\u2019s operations. By extension, if deficiencies are caught early, delays in final project acceptance may be avoided. \n<\/p><p>As for the final point, laboratory work contains elements of art as well as science. People\u2019s familiarity with the samples they are processing can lead to undocumented shortcuts and adjustments to procedures, which can make the difference between your implementation of an automated process working or failing. 
In the late 1980s, Shoshana Zuboff documented the problems that occurred in the automation of pulp mills when the plant personnel\u2019s experience was ignored.<sup id=\"rdp-ebb-cite_ref-ZuboffInTheAge88_11-0\" class=\"reference\"><a href=\"#cite_note-ZuboffInTheAge88-11\">[7]<\/a><\/sup> The parallel here is that if you ignore people\u2019s experience in doing the work you are about to automate, you run a significant risk of failing.\n<\/p><p>Project and change management also have to provide planning for the stresses that develop during a project's implementation. As noted earlier, laboratory automation today is largely a replacement of existing manual procedures. LIMS are installed to facilitate the bookkeeping of service laboratories, improve the lab\u2019s ability to respond to questions about test results, and increase the efficiency of the lab\u2019s operations. Robotic sample preparation systems are introduced into a lab because the existing manual processes are no longer economical, or because there is a need for better data precision or higher sample throughput.\n<\/p><p>These replacement activities are going to introduce stressors on the lab, which need to be taken into account in the planning process. First, there will be some stress on the lab workers and budget during the development of the system. That stress will come in the form of the cost of running the lab while the development is taking place, people\u2019s attitudes if they feel their jobs will be adversely affected, potential disruption of lab operations, and the problems associated with installing systems in labs that are usually short on space. Second, once the automation system has been developed, it will have to be evaluated against the existing process to show that it meets the design criteria. Once that has been successfully accomplished, a change-over process has to be designed to switch from one method of operation to the other. Both of these activities will increase the workload in the lab since, for a time, both the automated system and the process it is replacing will run in parallel.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Strong_communications_skills\">Strong communications skills<\/span><\/h3>\n<p>The previous section mentioned managing people issues, and effective communication is part of that. Strong communication is also useful in making sure people understand what work is being done, what the benefits are, and what the costs are, as well as in setting out schedules, risks, etc. Those aspects, as well as the ramifications of any changes, have to be clearly understood. The lack of clear communication is often the source of misdirection, delays, and frustration around projects. Communication is a two-way street. The LAE needs to be able to discuss the project, as well as listen to and understand the needs of those working in the lab.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"Strong_understanding_of_regulatory_issues\">Strong understanding of regulatory issues<\/span><\/h3>\n<p>No matter what industry you are working in, meeting regulatory requirements is a fact of life. Companies are under increasing pressure from regulatory agencies to have their internal practices conform to a standard. This is no longer a specific regulatory- or standards-based issue focused on manufacturing systems, but rather an organization-wide set of activities encompassing financial and accounting practices, human resources, etc. 
Name a department and it will be affected by a regulation.\n<\/p><p>While the initial reaction may be that this adds one more thing to do, for the most part what the regulations require is simply part of a good system design process. The requirements for the validation of a system, for example, call for the designer to show that the system works, that it is supportable and well-documented, and that it uses equipment suited to the designer\u2019s purpose and sourced from reliable vendors. Tinkering in order to get a system together is no longer an accepted practice; it may work for prototyping, but not for production use. The point of any applicable regulations and standards is to ensure that any implemented system can stand up to long-term use, and that it fits a well-defined, documented need. In short, regulations and standards help ensure a system is well-engineered.\n<\/p>\n<h3><span class=\"mw-headline\" id=\"General_systems_theory_and_beyond\">General systems theory and beyond<\/span><\/h3>\n<p>Frank Zenie (one of the founders of Zymark Corporation<sup id=\"rdp-ebb-cite_ref-12\" class=\"reference\"><a href=\"#cite_note-12\">[e]<\/a><\/sup>) usually made the statement that \u201cyou can only automate a process, you can\u2019t automate a thing\u201d as part of his introductory comments for Zymark\u2019s robotics courses. Recognizing that a process exists and that it can be automated is the initial step in defining a project. General systems theory<sup id=\"rdp-ebb-cite_ref-13\" class=\"reference\"><a href=\"#cite_note-13\">[f]<\/a><\/sup> provides the tools for documenting a process, as well as the triggers and the state changes that occur as the process develops. It is particularly useful in working with inter-related systems.\n<\/p><p>Recognizing that a process exists and can be well described is one part of the problem. Another is determining whether it can be automated. Labs and lab equipment are designed for people. Using that same equipment and those tools with automated systems, including robotics, may require a significant amount of re-engineering, bringing into question the economics and perceived benefits of a project.\n<\/p><p>Process engineering should also include statistical process control, statistical quality control, and productivity measurements. Robotics systems are small-scale manufacturing systems, even if the end result is data. As portions of laboratory work become automated and integrated, the systems will behave like manufacturing processes and should fall under the same types of control. Productivity measurements are also important. Automation systems are put in place to achieve a number of goals, including improving the efficiency of operations. Productivity measurements test the system\u2019s ability to meet those goals and, if successful, will lead to the development of additional projects.\n<\/p>
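<p>As a small worked example of the statistical process control point (all numbers here are invented), results from a routine check sample can be tracked against three-sigma control limits computed from an in-control baseline period:<\/p>\n<pre>
# A hedged sketch of statistical process control for an automated lab system:
# derive three-sigma limits from an in-control baseline of check-sample
# results, then flag later runs that drift outside them. Values are invented.
from statistics import mean, stdev

def control_limits(baseline):
    \"\"\"Center line and three-sigma limits from an in-control baseline.\"\"\"
    center = mean(baseline)
    sigma = stdev(baseline)
    return center - 3 * sigma, center, center + 3 * sigma

baseline = [99.8, 100.1, 100.0, 99.9, 100.2, 100.0, 99.7, 100.1]  # % recovery
lcl, center, ucl = control_limits(baseline)

for run, result in enumerate([100.0, 99.9, 101.4], start=1):
    # In control iff clamping the result to [lcl, ucl] leaves it unchanged.
    clamped = min(max(result, lcl), ucl)
    status = \"in control\" if clamped == result else \"OUT OF CONTROL\"
    print(f\"run {run}: {result:.1f} ({status})\")
<\/pre>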
<p>Much of what has been written so far could easily be recognized as part of an engineering or engineering management program, and that is exactly the point: laboratory automation is an engineering activity. The elements that differentiate it from other engineering activities include that:\n<\/p>\n<ul><li>it takes place in a laboratory or scientific environment, as noted earlier;<\/li>\n<li>automation is an enabling technology opening the door to new scientific methodologies, including discovery-based science<sup id=\"rdp-ebb-cite_ref-14\" class=\"reference\"><a href=\"#cite_note-14\">[g]<\/a><\/sup>;<\/li>\n<li>automation is usually a replacement for existing manual operations, which will require the LAE to be sensitive to change management issues;<\/li>\n<li>the scope of activities can include materials handling (robotics), data acquisition, analysis, reporting, and integration with database systems; and<\/li>\n<li>LAEs need to be knowledgeable in automation technologies and their application, and have strong backgrounds in the science practiced in the lab.<\/li><\/ul>\n<p>This last point is significant; the LAE cannot expect the lab personnel to describe their requirements in engineering terms. Yes, lab personnel can explain what they would like to accomplish; however, it is the LAE\u2019s responsibility to determine how to do it and to understand the ramifications in system design and implementation. The LAE functions in part as a translator, converting the needs of scientists into a project plan, and ultimately into a functioning system. In some respects this is similar to traditional software engineering.\n<\/p><p>All the work associated with the three sub-disciplines reflects the transition from working with \u201cthings\u201d and materials, to making measurements and creating descriptions of those items, to managing and working with the resulting knowledge, information, and data. The lists found in Figure 2 are a summary of skills needed in addition to those noted above. Software engineering is repeated in the lists for additional emphasis.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig2_Liscouski_AreYouLabAutoEng06.png\" class=\"image wiki-link\" data-key=\"ff742b1abdab0a3de38afb84bf3f1da4\"><img alt=\"Fig2 Liscouski AreYouLabAutoEng06.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/e\/e2\/Fig2_Liscouski_AreYouLabAutoEng06.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 2.<\/b> Summary of skills required for each of the three sub-disciplines within laboratory automation engineering<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<h2><span id=\"rdp-ebb-What_comes_next?\"><\/span><span class=\"mw-headline\" id=\"What_comes_next.3F\">What comes next?<\/span><\/h2>\n<p>There are two major tasks that need to be undertaken, both under the heading of education. First is the development of a curriculum for laboratory automation engineering at the university level. This will require support both from organizations like the Association for Laboratory Automation (ALA) and from industry. A university is not going to create a program, even one that can be assembled to a large extent from existing programs, unless it has some assurance that its graduates will find jobs. The availability of employment opportunities would be used to attract students, and the process would move on from there. 
One issue that needs to be addressed is how much laboratory science knowledge an LAE has to have in order to be effective. Another, closely tied issue is whether we are talking about an undergraduate program with particular science specializations or a graduate program.\n<\/p><p>Undergraduate and graduate science programs need to address the inclusion of automation in their courses. The point is not to teach science students engineering but to acquaint them with how the work of modern science is done. Just as computer literacy is part of high school curricula, automation literacy should be part of the course of science studies. James Sterling\u2019s 2004 journal article on a laboratory automation curriculum provides one example of what can be done.<sup id=\"rdp-ebb-cite_ref-SterlingLab04_15-0\" class=\"reference\"><a href=\"#cite_note-SterlingLab04-15\">[8]<\/a><\/sup> The ALA can take a lead and develop materials (e.g., course outlines, source material, etc.) to assist instructors who want to include this material in their programs.\n<\/p><p>While a university could address the long-term need, there is also a need to provide a means for those already working in the field to augment their backgrounds. Expanding the short-course structure already put in place by the ALA could satisfy that need. Another possibility is the development of certification programs similar to those used in computer science, working in conjunction with short courses and the development of an \u201cInstitute for Laboratory Automation\u201d with an organized summer program.\n<\/p><p>The second task is the organization of a \u201cbody of knowledge\u201d about laboratory automation engineering that would include compilations of relevant texts, web sites, knowledge bases, etc., while encouraging those working in the field to contribute material to its development. The initial step would be the development of a framework to organize material, followed by the work of populating it. Once a framework is in place, publications like the <i>Journal of the Association for Laboratory Automation<\/i> could have authors key their papers to that structure. The details of the framework need input from others in the field and more depth of evaluation than this introductory piece can afford. \n<\/p><p>In the development of this framework (Figure 3), the following points should be considered. First, there are fundamental techniques, skills, and technologies in laboratory automation that cut across scientific disciplines pretty much intact, including analog data acquisition, fundamentals of robotics, interfacing techniques, project management, etc. While there may be application-specific caveats (e.g., \u201cConsiderations in analog interfacing in secure biohazard facilities\u201d), for the most part these are common elements and can be treated as a common block of knowledge. Second, once we get beyond basics and into applications of techniques in scientific disciplines, we may see the same basic outline structure duplicated with parallel sections. Robotics in chemistry, for example, may be similar to robotics in another discipline; the outline will appear similar, with differences in equipment details and implementations. The framework should consider linkages between parallel outlines where similar needs generate similar systems development. 
In short, we should make it easy to recognize cross-fertilization between similar technologies in different disciplines.\n<\/p><p><br \/>\n<a href=\"https:\/\/www.limswiki.org\/index.php\/File:Fig3_Liscouski_AreYouLabAutoEng06.png\" class=\"image wiki-link\" data-key=\"8d25a5e071255f6da3243f9a3c3c254e\"><img alt=\"Fig3 Liscouski AreYouLabAutoEng06.png\" src=\"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/1d\/Fig3_Liscouski_AreYouLabAutoEng06.png\" decoding=\"async\" style=\"width: 100%;max-width: 400px;height: auto;\" \/><\/a>\n<\/p>\n<div style=\"clear:both;\"><\/div>\n<table style=\"\">\n<tbody><tr>\n<td style=\"vertical-align:top;\">\n<table border=\"0\" cellpadding=\"5\" cellspacing=\"0\" style=\"\">\n\n<tbody><tr>\n<td style=\"background-color:white; padding-left:10px; padding-right:10px;\"><blockquote><p><b>Figure 3.<\/b> Example of an organized and interconnected LAE knowledge framework<\/p><\/blockquote>\n<\/td><\/tr>\n<\/tbody><\/table>\n<\/td><\/tr><\/tbody><\/table>\n<h2><span class=\"mw-headline\" id=\"Conclusion\">Conclusion<\/span><\/h2>\n<p>Laboratory automation is still in the early phases of its development. Some may say that significant progress has been made, but in comparison to the rate of development of automation and information technologies in other areas, our progress has been slow and incremental (widespread use of the internet, for example, is little more than a decade old).\n<\/p><p>What can and must be done is to get the user community to embrace the establishment and development of a discipline focused on envisioning, creating, and improving the tools and techniques of laboratory automation. That in turn will allow us to realize the promise that proponents of automation have long held: giving people the opportunity to do better science. Laboratory automation needs to be driven by people who want to do good work and are trained to do it.\n<\/p><p>An LAE\u2019s employment will be driven by market demand. However, the skill sets should be transferable. Just as computer science professionals have flexibility in where their skills are applied, LAEs should enjoy the same ability to move from one scientific application to another. 
The major differences will be learning the underlying science.\n<\/p>\n<h2><span id=\"rdp-ebb-Abbreviations,_acronyms,_and_initialisms\"><\/span><span class=\"mw-headline\" id=\"Abbreviations.2C_acronyms.2C_and_initialisms\">Abbreviations, acronyms, and initialisms<\/span><\/h2>\n<p><b>ELN<\/b>: Electronic laboratory notebook\n<\/p><p><b>FDA<\/b>: Food and Drug Administration\n<\/p><p><b>FRS<\/b>: Functional requirements specification\n<\/p><p><b>IT<\/b>: Information technology\n<\/p><p><b>ISO<\/b>: International Organization for Standardization\n<\/p><p><b>LAE<\/b>: Laboratory automation engineering (or engineer)\n<\/p><p><b>LIMS<\/b>: Laboratory information management system\n<\/p><p><b>URS<\/b>: User requirements specification\n<\/p>\n<h2><span class=\"mw-headline\" id=\"Footnotes\">Footnotes<\/span><\/h2>\n<div class=\"reflist\" style=\"list-style-type: lower-alpha;\">\n<div class=\"mw-references-wrap\"><ol class=\"references\">\n<li id=\"cite_note-1\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-1\">\u2191<\/a><\/span> <span class=\"reference-text\">A portion of this material <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1016%2Fj.jala.2006.04.002\" target=\"_blank\">was published<\/a> as a guest editorial in the <i>Journal of the Association for Laboratory Automation<\/i> (now <i>SLAS Technology<\/i>), June 2006, volume 11, number 3, pages 157-162.<\/span>\n<\/li>\n<li id=\"cite_note-2\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-2\">\u2191<\/a><\/span> <span class=\"reference-text\">As a point of fact, and as noted later in this piece, the results of laboratory work are knowledge, information, and data. \u201cData\u201d as used here is a short-hand for all three. For a full discussion of this point from the author's point of view, please refer to <i><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/isbnsearch.org\/isbn\/0471594229\" target=\"_blank\">Laboratory and Scientific Computing: A Strategic Approach<\/a><\/i>, J. Liscouski, Wiley Interscience, 1995.<\/span>\n<\/li>\n<li id=\"cite_note-5\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-5\">\u2191<\/a><\/span> <span class=\"reference-text\">I\u2019d like to thank Mark Russo, executive editor of the JALA, for his comments and, in this section in particular, for his insights and the material he contributed.<\/span>\n<\/li>\n<li id=\"cite_note-7\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-7\">\u2191<\/a><\/span> <span class=\"reference-text\">For example, process chromatographs made by Mine Safety Appliances and Fisher Control used cams and micro-switches to control sampling and back-flush valves.<\/span>\n<\/li>\n<li id=\"cite_note-12\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-12\">\u2191<\/a><\/span> <span class=\"reference-text\">Zymark was acquired by Caliper Life Sciences in 2003.<\/span>\n<\/li>\n<li id=\"cite_note-13\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-13\">\u2191<\/a><\/span> <span class=\"reference-text\">See \"<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/pespmc1.vub.ac.be\/SYSTHEOR.html\" target=\"_blank\">What Is Systems Theory?<\/a>\" for an introduction. 
In particular, review the work of George Klir, including <i><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.worldcat.org\/title\/approach-to-general-systems-theory\/oclc\/51307\" target=\"_blank\">An Approach to General Systems Theory<\/a><\/i> and <i><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/doi.org\/10.1007\/978-1-4615-1331-5\" target=\"_blank\">Facets of Systems Science<\/a><\/i>.<\/span>\n<\/li>\n<li id=\"cite_note-14\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-14\">\u2191<\/a><\/span> <span class=\"reference-text\">As suggested by a reviewer.<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<h2><span class=\"mw-headline\" id=\"Acknowledgements\">Acknowledgements<\/span><\/h2>\n<p>I\u2019d like to thank Mark Russo (Bristol-Myers Squibb) for his comments and support in the development of this article.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"About_the_author\">About the author<\/span><\/h2>\n<p>Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation\/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked\/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.\n<\/p>\n<h2><span class=\"mw-headline\" id=\"References\">References<\/span><\/h2>\n<div class=\"reflist references-column-width\" style=\"-moz-column-width: 30em; -webkit-column-width: 30em; column-width: 30em; list-style-type: decimal;\">\n<div class=\"mw-references-wrap\"><ol class=\"references\">\n<li id=\"cite_note-HallockCreative05-3\"><span class=\"mw-cite-backlink\">\u2191 <sup><a href=\"#cite_ref-HallockCreative05_3-0\">1.0<\/a><\/sup> <sup><a href=\"#cite_ref-HallockCreative05_3-1\">1.1<\/a><\/sup> <sup><a href=\"#cite_ref-HallockCreative05_3-2\">1.2<\/a><\/sup><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Hallock, N. (2005). \"Creative Combustion: A History of the Association for Laboratory Automation\". <i>SLAS Technology<\/i> <b>10<\/b> (6): 423\u201331. 
<a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1016%2Fj.jala.2005.10.001\" target=\"_blank\">10.1016\/j.jala.2005.10.001<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Creative+Combustion%3A+A+History+of+the+Association+for+Laboratory+Automation&rft.jtitle=SLAS+Technology&rft.aulast=Hallock%2C+N.&rft.au=Hallock%2C+N.&rft.date=2005&rft.volume=10&rft.issue=6&rft.pages=423%E2%80%9331&rft_id=info:doi\/10.1016%2Fj.jala.2005.10.001&rfr_id=info:sid\/en.wikipedia.org:LII:Are_You_a_Laboratory_Automation_Engineer%3F\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-LiscouskiLab85-4\"><span class=\"mw-cite-backlink\">\u2191 <sup><a href=\"#cite_ref-LiscouskiLab85_4-0\">2.0<\/a><\/sup> <sup><a href=\"#cite_ref-LiscouskiLab85_4-1\">2.1<\/a><\/sup> <sup><a href=\"#cite_ref-LiscouskiLab85_4-2\">2.2<\/a><\/sup><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Liscouski, J. (1985). \"Laboratory Automation\". <i>JCOMP<\/i> <b>25<\/b> (3): 288\u201392.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Laboratory+Automation&rft.jtitle=JCOMP&rft.aulast=Liscouski%2C+J.&rft.au=Liscouski%2C+J.&rft.date=1985&rft.volume=25&rft.issue=3&rft.pages=288%E2%80%9392&rfr_id=info:sid\/en.wikipedia.org:LII:Are_You_a_Laboratory_Automation_Engineer%3F\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-MateyHist99-6\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-MateyHist99_6-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Matey, J.R. (1999). <a rel=\"nofollow\" class=\"external text\" href=\"#SBA01.001\">\"History of Laboratory Automation - Session BA01.05, Cent. Symposium: 20th Century Developments in Instrumentation & Measurements\"<\/a>. <i>Centennial Meeting of the American Physical Society<\/i><span class=\"printonly\">. <a rel=\"nofollow\" class=\"external free\" href=\"#SBA01.001\">http:\/\/flux.aps.org\/meetings\/YR99\/CENT99\/abs\/S350.html#SBA01.001<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=History+of+Laboratory+Automation+-+Session+BA01.05%2C+Cent.+Symposium%3A+20th+Century+Developments+in+Instrumentation+%26+Measurements&rft.jtitle=Centennial+Meeting+of+the+American+Physical+Society&rft.aulast=Matey%2C+J.R.&rft.au=Matey%2C+J.R.&rft.date=1999&rft_id=http%3A%2F%2Fflux.aps.org%2Fmeetings%2FYR99%2FCENT99%2Fabs%2FS350.html%23SBA01.001&rfr_id=info:sid\/en.wikipedia.org:LII:Are_You_a_Laboratory_Automation_Engineer%3F\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-M123Project-8\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-M123Project_8-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/www.method123.com\/project-lifecycle.php\" target=\"_blank\">\"Project Management Life Cycle\"<\/a>. <i>Method123<\/i><span class=\"printonly\">. 
<a rel=\"external_link\" class=\"external free\" href=\"https:\/\/www.method123.com\/project-lifecycle.php\" target=\"_blank\">https:\/\/www.method123.com\/project-lifecycle.php<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Project+Management+Life+Cycle&rft.atitle=Method123&rft_id=https%3A%2F%2Fwww.method123.com%2Fproject-lifecycle.php&rfr_id=info:sid\/en.wikipedia.org:LII:Are_You_a_Laboratory_Automation_Engineer%3F\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-WPSystems-9\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-WPSystems_9-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"nofollow\" class=\"external text wiki-link\" href=\"https:\/\/en.wikipedia.org\/wiki\/Systems_development_life_cycle\" data-key=\"3e10b60754dcc11d1f32334e2f41a76e\">\"Systems development life cycle\"<\/a>. <i>Wikipedia<\/i><span class=\"printonly\">. <a rel=\"nofollow\" class=\"external free wiki-link\" href=\"https:\/\/en.wikipedia.org\/wiki\/Systems_development_life_cycle\" data-key=\"3e10b60754dcc11d1f32334e2f41a76e\">https:\/\/en.wikipedia.org\/wiki\/Systems_development_life_cycle<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Systems+development+life+cycle&rft.atitle=Wikipedia&rft_id=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2FSystems_development_life_cycle&rfr_id=info:sid\/en.wikipedia.org:LII:Are_You_a_Laboratory_Automation_Engineer%3F\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-AdobeLesson2.2-10\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-AdobeLesson2.2_10-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation web\"><a rel=\"external_link\" class=\"external text\" href=\"https:\/\/web.archive.org\/web\/20060110072101\/http:\/\/www.adobe.com\/education\/webtech\/CS\/unit_planning1\/pm_home_id.htm\" target=\"_blank\">\"Lesson 2.2 Project Management\"<\/a>. <i>Adobe Web Tech Curriculum<\/i>. 2005. Archived from <a rel=\"external_link\" class=\"external text\" href=\"http:\/\/www.adobe.com\/education\/webtech\/CS\/unit_planning1\/pm_home_id.htm\" target=\"_blank\">the original<\/a> on 10 January 2006<span class=\"printonly\">. <a rel=\"external_link\" class=\"external free\" href=\"https:\/\/web.archive.org\/web\/20060110072101\/http:\/\/www.adobe.com\/education\/webtech\/CS\/unit_planning1\/pm_home_id.htm\" target=\"_blank\">https:\/\/web.archive.org\/web\/20060110072101\/http:\/\/www.adobe.com\/education\/webtech\/CS\/unit_planning1\/pm_home_id.htm<\/a><\/span>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=bookitem&rft.btitle=Lesson+2.2+Project+Management&rft.atitle=Adobe+Web+Tech+Curriculum&rft.date=2005&rft_id=https%3A%2F%2Fweb.archive.org%2Fweb%2F20060110072101%2Fhttp%3A%2F%2Fwww.adobe.com%2Feducation%2Fwebtech%2FCS%2Funit_planning1%2Fpm_home_id.htm&rfr_id=info:sid\/en.wikipedia.org:LII:Are_You_a_Laboratory_Automation_Engineer%3F\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-ZuboffInTheAge88-11\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-ZuboffInTheAge88_11-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation book\">Zuboff, S. (1988). <i>In the Age of the Smart Machine<\/i>. 
Butterworth\u2013Heinemann. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/International_Standard_Book_Number\" data-key=\"f64947ba21e884434bd70e8d9e60bae6\">ISBN<\/a> 0434924865.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=book&rft.btitle=In+the+Age+of+the+Smart+Machine&rft.aulast=Zuboff%2C+S.&rft.au=Zuboff%2C+S.&rft.date=1988&rft.pub=Butterworth%E2%80%93Heinemann&rft.isbn=0434924865&rfr_id=info:sid\/en.wikipedia.org:LII:Are_You_a_Laboratory_Automation_Engineer%3F\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<li id=\"cite_note-SterlingLab04-15\"><span class=\"mw-cite-backlink\"><a href=\"#cite_ref-SterlingLab04_15-0\">\u2191<\/a><\/span> <span class=\"reference-text\"><span class=\"citation Journal\">Sterling, J.D. (2004). \"Laboratory Automation Curriculum at Keck Graduate Institute\". <i>SLAS Technology<\/i> <b>9<\/b> (5): 331\u201335. <a rel=\"nofollow\" class=\"external text wiki-link\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_object_identifier\" data-key=\"ae6d69c760ab710abc2dd89f3937d2f4\">doi<\/a>:<a rel=\"external_link\" class=\"external text\" href=\"http:\/\/dx.doi.org\/10.1016%2Fj.jala.2004.07.005\" target=\"_blank\">10.1016\/j.jala.2004.07.005<\/a>.<\/span><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Laboratory+Automation+Curriculum+at+Keck+Graduate+Institute&rft.jtitle=SLAS+Technology&rft.aulast=Sterling%2C+J.D.&rft.au=Sterling%2C+J.D.&rft.date=2004&rft.volume=9&rft.issue=5&rft.pages=331%E2%80%9335&rft_id=info:doi\/10.1016%2Fj.jala.2004.07.005&rfr_id=info:sid\/en.wikipedia.org:LII:Are_You_a_Laboratory_Automation_Engineer%3F\"><span style=\"display: none;\"> <\/span><\/span>\n<\/span>\n<\/li>\n<\/ol><\/div><\/div>\n<\/div><\/div><div class=\"printfooter\">Source: <a rel=\"external_link\" class=\"external\" href=\"https:\/\/www.limswiki.org\/index.php\/LII:Are_You_a_Laboratory_Automation_Engineer%3F\">https:\/\/www.limswiki.org\/index.php\/LII:Are_You_a_Laboratory_Automation_Engineer%3F<\/a><\/div>\n<div class=\"visualClear\"><\/div><\/div><\/div><div class=\"visualClear\"><\/div><\/div><div class=\"visualClear\"><\/div><\/div>\n\n\n\n<\/body>","67df76407d0807e78d9cde61bb3f82c9_images":["https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/4\/4f\/Fig1_Liscouski_AreYouLabAutoEng06.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/e\/e2\/Fig2_Liscouski_AreYouLabAutoEng06.png","https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/1\/1d\/Fig3_Liscouski_AreYouLabAutoEng06.png"],"67df76407d0807e78d9cde61bb3f82c9_timestamp":1637252746},"link":"https:\/\/www.limswiki.org\/index.php\/Book:LIMSjournal_-_Laboratory_Technology_Special_Edition","price_currency":"","price_amount":"","book_size":"","download_url":"https:\/\/www.limsforum.com?ebb_action=book_download&book_id=95346","language":"","cta_button_content":"","toc":[{"type":"article","name":"Are You a Laboratory Automation Engineer? (Liscouski 2006)","id":"67df76407d0807e78d9cde61bb3f82c9","pageUrl":"https:\/\/www.limswiki.org\/index.php\/LII:Are_You_a_Laboratory_Automation_Engineer%3F"},{"type":"article","name":"Elements of Laboratory Technology Management (Liscouski 2014)","id":"2000eea677bcd5ee1fcecdab32743800","pageUrl":"https:\/\/www.limswiki.org\/index.php\/LII:Elements_of_Laboratory_Technology_Management"},{"type":"article","name":"A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work (Liscouski 2018)","id":"00b300565027cb0518bcb0410d6df360","pageUrl":"https:\/\/www.limswiki.org\/index.php\/LII:A_Guide_for_Management:_Successfully_Applying_Laboratory_Systems_to_Your_Organization%27s_Work"},{"type":"article","name":"Laboratory Technology Management & Planning (Liscouski 2019)","id":"2016524e3c4b551c982fcfc23e33220d","pageUrl":"https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Management_%26_Planning"},{"type":"article","name":"Notes on Instrument Data Systems (Liscouski 2020)","id":"1b7330228fd59158aab6fab82ad0e7cc","pageUrl":"https:\/\/www.limswiki.org\/index.php\/LII:Notes_on_Instrument_Data_Systems"},{"type":"article","name":"Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering (Liscouski 2020)","id":"655f7d48a642e9b45533745af73f0d59","pageUrl":"https:\/\/www.limswiki.org\/index.php\/LII:Laboratory_Technology_Planning_and_Management:_The_Practice_of_Laboratory_Systems_Engineering"},{"type":"article","name":"Considerations in the Automation of Laboratory Procedures (Liscouski 2021)","id":"e0147011cc1eb892e1a35e821657a6d9","pageUrl":"https:\/\/www.limswiki.org\/index.php\/LII:Considerations_in_the_Automation_of_Laboratory_Procedures"},{"type":"article","name":"The Application of Informatics to Scientific Work: Laboratory Informatics for Newbies (Liscouski 2021)","id":"d8b467af534a70312a21f63b61be26cd","pageUrl":"https:\/\/www.limswiki.org\/index.php\/LII:The_Application_of_Informatics_to_Scientific_Work:_Laboratory_Informatics_for_Newbies"},{"type":"article","name":"Directions in Laboratory Systems: One Person's Perspective (Liscouski 
2021)","id":"87d7f050e0c47d7762a90382989592a1","pageUrl":"https:\/\/www.limswiki.org\/index.php\/LII:Directions_in_Laboratory_Systems:_One_Person%27s_Perspective"}],"settings":{"show_cover":1,"show_title":1,"show_subtitle":0,"show_full_title":1,"show_editor":1,"show_editor_pic":1,"show_publisher":1,"show_language":1,"show_size":1,"show_toc":1,"show_content_beneath_cover":1,"toc_links":"logged-in","cta_button":"1","content_location":"1","log_in_msg":"","cover_size":"medium"},"title_image":"https:\/\/s3.limswiki.org\/www.limswiki.org\/images\/5\/5a\/Fig4_Liscouski_NotesOnInstDataSys20.jpg"}}
LIMSjournal - Laboratory Technology Special Edition
Editor: Shawn Douglas
Publisher: LabLynx Press
Copyright LabLynx Inc. All rights reserved.