Interfacing LIMS to Mine Data Systems
Most analysts understand the benefits a LIMS can bring to their operation. Among the more obvious ones, a LIMS can help to:
- Reduce transcription errors.
- Allow better utilization of staff by reducing the time spent keying in data or calculating results.
- Improve turnaround time.
- Provide a clear picture of the status of a user’s samples.
- Allow rapid extraction of large datasets for statistical or other analysis.
- Avoid mix-ups with samples and/or results.
- Respond to problems in a timely manner.
Most of these benefits are difficult, if not impossible, to quantify, which perhaps explains why laboratory managers often find it so difficult to get funding for LIMS.
We will examine some of the benefits above in a little more detail.
Data transcription errors are amongst the most insidious of all errors in laboratory data. They are notoriously difficult to detect since, unlike a persistent analytical bias, they are random and sporadic. Modern analytical instruments such as ICP-MS spectrometers can generate enormous amounts of data, simultaneously analyzing hundreds of different parameters. When these results are copied manually the potential for error becomes enormous. Cutting and pasting the raw data into spreadsheets is a little better, but not by much. Columns can be transposed, cells can be skipped and samples can be mixed up.
Calculations are another area where a simple mistake in, say, a dilution factor can lead to results that are incorrect by orders of magnitude. When a dedicated LIMS performs calculations, it does so consistently and reliably, freeing the analyst from some of the burden of manual computation.
With a suitable LIMS, the entire assay is automated. Weights are transferred directly from the balance to the LIMS. All calculations are performed internally, and the final analysis report or certificate is printed directly.
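To make the calculation risk concrete, the following is a minimal sketch of the kind of assay calculation a LIMS performs automatically. The function name, units and numbers are illustrative assumptions, not taken from any particular LIMS.

```python
# Hedged sketch: the kind of assay calculation a LIMS automates.
# Function name, units and values are illustrative, not from any real LIMS.

def assay_g_per_t(reading_mg_l: float, volume_ml: float,
                  dilution_factor: float, sample_weight_g: float) -> float:
    """Convert an instrument reading (mg/L) to an assay in g/t.

    mass of analyte (mg) = reading * (volume in L) * dilution factor
    g/t is equivalent to mg/kg, so divide by the sample weight in kg.
    """
    mass_mg = reading_mg_l * (volume_ml / 1000.0) * dilution_factor
    return mass_mg / (sample_weight_g / 1000.0)

# A slip in the dilution factor shifts the result by the same factor:
print(assay_g_per_t(2.5, 100.0, 10.0, 1.0))  # 2500.0 g/t
print(assay_g_per_t(2.5, 100.0, 1.0, 1.0))   # 250.0 g/t
```

Because the LIMS applies the same formula to every sample, a ten-fold error like the one above cannot creep in on a single result.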
One of the most critical parameters in many mine-site labs, turnaround time is often the major yardstick against which a lab is measured. This is not only because of its importance to clients. Many mineral samples are time-sensitive, and the accuracy of the result can be impacted if holding times are exceeded. A LIMS can improve turnaround time in several ways. Firstly, by automating many data entry tasks, the LIMS frees up staff for other activities. Pre-defined templates for sample entry and analysis setup mean that routine sample lists can be created in seconds. A LIMS also allows managers to prioritize samples, so that non-essential ones are not analyzed at the expense of critical ones.
Many labs include sample turnaround statistics in their month-end reports. Without a LIMS, this can involve spending the better part of a day with a heap of sample submission records, analysis reports and a calculator. Lab Managers generally approach turnaround-time reports with the same reluctance that they do performance appraisals. This need not be the case. A suitable LIMS allows Managers to generate these reports simply and quickly, and to view trends over longer periods. Furthermore, the LIMS can allow several filtering and grouping options, so that Managers can identify problems within particular areas, such as sample preparation, or with certain analytical methods. This type of information is invaluable and allows Managers to make informed decisions to improve the efficiency of resource allocation.
Tougher standards are making it increasingly difficult, if not impossible, to meet accreditation or certification guidelines without some form of computerized data management. In the mining field, this has certainly been the prime mover toward LIMS implementation, and most of these labs now operate a LIMS.
For most labs that analyze mineral samples, accreditation is a pre-condition for acceptance by regulatory agencies. A LIMS that is designed to conform to state or national accreditation standards therefore allows lab managers and QA officers to concentrate on other areas, knowing that data management requirements such as audit trails, electronic signatures and data integrity are fully met.
If the benefits above are areas in which a LIMS is superior to a manual system, then data analysis is an area in which a LIMS excels. Modern, server-based relational databases can access and process data with phenomenal speed and perfect accuracy. A well-designed LIMS allows managers to extract pinpoint datasets from amongst millions of records, and to analyze those datasets with unparalleled speed. With a set of built-in QC/QA tools, the LIMS lets analysts review quality control charts before submitting results for approval. It allows QA officers and managers to examine long-term trends, answering questions such as “Is our quality improving?” Finally, it provides auditors with documentary evidence that the laboratory is in control.
It’s no good processing samples within twenty minutes if the results are to sit on the manager’s desk until the following morning. By providing process plant operators with up-to-the-minute analytical data, a LIMS allows them to adjust process conditions in a timely manner, and to respond to process changes and upsets with the minimum of delay. Whether it is the particle size distribution of a classifier overflow, turbidity of a plant feedstock or the free cyanide content of a mill discharge, operators need this vital information as soon as possible. Apart from putting health and safety at risk, delays can cost many thousands of dollars in penalties and fines, which is why online access to analytical data is often of paramount importance.
Many LIMS incorporate web-based data access, allowing authorized personnel to view results from any PC with an Internet connection, whether in a nearby town or on the other side of the world. Lab managers retain absolute control over the data, deciding who can access what.
As a service department, the laboratory exists to support other departments. In a typical mining operation these may include the geology and mining departments, the processing department, maintenance, and occupational health and safety.
The laboratory may analyze samples for areas such as exploration, ore reserve estimation, grade control, mine modeling and environmental monitoring. In addition to requiring different types of analyses, each of these areas may also have different requirements for turnaround time, accuracy and precision.
Process samples are typically analyzed to provide inputs for process control, metallurgical testing, metallurgical accounting, environmental monitoring and plant troubleshooting. As with the sections for geology / mining, each of these areas will likely have different requirements regarding turnaround, accuracy and precision.
Although the mining and process departments will likely account for the bulk of the analyses done by the lab, other departments on the mine often require analytical services too. For example, analysis of wear metals for equipment monitoring is a common requirement. These analyses are used as part of the mine’s preventive maintenance program, where timely warning about contaminant buildup in engine oils can save many thousands of dollars in repair and downtime costs.
In some cases the lab may be asked to provide analytical services to monitor employee health and safety. Examples include measuring lead and mercury to monitor exposure to these hazards.
Given the amount of interaction between a minesite laboratory and its customer departments, it is surprising that many in the industry still view the lab as a “black box,” where samples go in one end and results come out the other, with little regard to what goes on inside. This is unfortunate at best. In the worst case, this attitude can lead to major errors. If end users do not understand or appreciate the limitations of the analytical techniques used, there is a danger that they will make invalid assumptions and draw incorrect conclusions. For example, if a metallurgist tests a particular reagent to examine its effectiveness in a flotation circuit, he needs to be certain that any observed differences in recovery are in fact genuine statistical differences, and not related to uncertainties in sampling or in the analytical method used.
Analysts have a responsibility to try to change this thinking by improving communication between the lab and its users.
In one respect at least, our “black box” analogy is accurate: samples do indeed go in at one end of the lab, and results do come out the other. Either or both of these areas may benefit from some degree of automation. Furthermore, each user will likely have widely different requirements in terms of sample submission and results reporting. For example, the daily crusher head samples may benefit most from automating the reporting side, whereas exploration samples might gain the most from automated submission. In all cases, analysts should discuss with users all of the different ways that their processes can be automated, identifying both the benefits and drawbacks of each. Only when this is done can informed decisions be made regarding the suitability or otherwise of automation, and the specific configuration required.
As a first step, analysts should compile a list of all laboratory customers (i.e. consumers of laboratory data). The list should be as comprehensive as possible and should include both routine and non-routine submissions. For each customer, the analyst should note all of the different types of samples submitted to the laboratory for analysis. Once the list is compiled, analysts should set up meetings with each of the customer departments to map out a strategy for data interfacing. These meetings should address the following issues:
- Examine the existing systems, determining the strengths and weaknesses of each.
- Identify key areas for improvement.
- List and discuss alternatives, and
- Develop an action plan.
The action plan identifies the what, when and how of the automation project.
During these kinds of exercises, it is sometimes helpful to view the laboratory as if it were a commercial operation, where success or failure depends in large part on keeping clients happy. However, it is also important to remember the “big picture”. Automation for automation’s sake is counterproductive – it should be done only to address a real need within the organization or to provide a measurable benefit.
To illustrate these concepts, we will look at three hypothetical users in a typical minesite scenario.
- Exploration geology,
- Mining department, and
- Processing department.
In each case we will examine the particular needs of each user and recommend an automation strategy to address them.
The first example looks at the exploration geologists working on a mining project. While exploration samples are often assayed commercially, it is not uncommon to find them analyzed on site or at one of the company’s own facilities. This may bring benefits such as lower cost and improved confidentiality.
The following list shows the particular features of exploration sample submissions and reporting:
- Samples are submitted at irregular intervals.
- Samples are mostly of the same type or matrix.
- There is a wide variance in the numbers of samples submitted.
- The department generally requests a few, specific types of tests.
- They traditionally receive results via spreadsheet.
The department has identified several areas for improvement. Specifically, they would like to be able to:
- Automate sample submission to reduce errors and save time.
- Track samples after they have been submitted.
- Access results from remote locations.
Based on an analysis of the needs of the exploration department, we make the following recommendations for automation of submission and reporting:
- Use a browser interface for sample submission and results access across the Internet or over an Intranet.
- Print barcode labels directly from the user’s computer.
- Receive email alerts when results have been approved for release.
- Download results in Excel spreadsheets.
Although many LIMS today incorporate a web component to allow remote access across an Intranet or over the Internet, the specific functionality offered may be limited. If the LIMS does not include web access, or if the features desired are not available, it will be necessary to develop the interface. Browser-based applications are built using web programming languages such as PHP and ASP. They are complicated to develop, and the task may not be feasible in-house, so a contractor may need to carry out the work.
Users should understand fully the security implications of running a web server in the lab, and they should also ensure security within the application itself, for example by using standardized authentication methodologies and encryption protocols such as SSL for data transmission.
If the LIMS does not support a web interface natively, an alternative could be to use a thin client such as MS Terminal Services, Citrix Metaframe or even VNC. These programs work by projecting a Windows desktop across a network, so that a remote user can connect over a relatively slow connection and interact with the program just as if he or she were inside the lab.
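To show the general shape of such an interface, here is a minimal sketch of a read-only results endpoint using only Python's standard library. Everything in it, including the sample ID, the data and the URL layout, is an illustrative assumption; a production interface would sit behind HTTPS and real authentication, as discussed above.

```python
# Hedged sketch: a minimal read-only results endpoint. All names and data
# are illustrative; this is not a production-ready web interface.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

RESULTS = {"BH-0001": {"Au": 2.31}}   # stand-in for a LIMS query

class ResultsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Treat the URL path as a sample ID and return its results as JSON.
        sample_id = self.path.strip("/")
        body = json.dumps(RESULTS.get(sample_id, {})).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):      # keep the demo quiet
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), ResultsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/BH-0001"
payload = urllib.request.urlopen(url).read().decode()
print(payload)
server.shutdown()
```

The same request could be made from a browser or a spreadsheet macro anywhere on the network, which is the essential idea behind remote results access.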
Our next example is of the mining department, possibly the biggest consumer of laboratory services.
This list shows key features of mining samples submitted for analysis.
- Samples are submitted at more regular intervals than exploration samples.
- Samples are generally of the same type or matrix.
- The number of samples submitted does not vary greatly between batches.
- The department generally requests a narrow and specific range of analyses.
- They traditionally receive their results via spreadsheet or fixed-format printed report.
The mining department needs to be able to:
- Generate sample lists from blast-hole patterns and upload them directly to the LIMS.
- Import results from production drill hole assays directly into the mine model.
- Submit samples and receive results from the mine’s R/C drilling program.
After analyzing the specific requirements of the mining department, we can make the following list of recommendations:
- Use customized spreadsheet macros to export sample lists to the LIMS, e.g. in text, spreadsheet or XML format.
- Link the mine model directly to the LIMS database via ODBC or OLE-DB, to enable bi-directional transfer of data.
- Use a web interface to submit R/C sample IDs and retrieve results.
Spreadsheets have become the de facto standard for computerized data analysis and storage in many areas. Furthermore, there is often at least one person in each department who is capable of writing spreadsheet macros to automate various tasks, such as exporting sample lists to a LIMS. In particular, the latest versions of Microsoft Excel incorporate a powerful programming language that offers virtually limitless possibilities for automating the transfer of data both in and out of the spreadsheet. Alternatively, many LIMS allow users to import sample lists directly from spreadsheets.
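The export itself is straightforward. The following sketch shows a sample list written out as CSV text that a LIMS import utility could read; the column names, sample IDs and file layout are illustrative assumptions, and a spreadsheet macro would do the equivalent in VBA.

```python
# Hedged sketch: exporting a blast-hole sample list as CSV for LIMS import.
# Column names and sample IDs are illustrative assumptions.
import csv
import io

def export_sample_list(samples, fileobj):
    """Write (sample_id, hole_id, depth_from, depth_to) rows as CSV."""
    writer = csv.writer(fileobj)
    writer.writerow(["SampleID", "HoleID", "DepthFrom", "DepthTo"])
    for row in samples:
        writer.writerow(row)

samples = [
    ("BH-0001", "P12-H01", 0.0, 1.5),
    ("BH-0002", "P12-H01", 1.5, 3.0),
]
buf = io.StringIO()           # in a real export this would be a file on disk
export_sample_list(samples, buf)
print(buf.getvalue())
```

The same rows could just as easily be emitted as tab-delimited text or XML, whichever format the LIMS import utility expects.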
For database-to-database transfer, the answer is most likely to be found in a protocol called Open Database Connectivity, or ODBC. ODBC allows different database systems to communicate easily with each other, and provides the data translation syntax to ensure that numeric data types, for example, are recognized as such across different systems.
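The programming pattern is the same regardless of driver: connect, execute parameterized statements, commit. The sketch below uses Python's built-in sqlite3 module as a stand-in for an ODBC connection; table and column names are illustrative assumptions, and with an actual ODBC driver the calls would follow the same DB-API shape.

```python
# Hedged sketch of database-to-database transfer via the DB-API pattern.
# sqlite3 stands in for an ODBC connection; all names are illustrative.
import sqlite3

lims = sqlite3.connect(":memory:")   # stand-in for the LIMS database
mine = sqlite3.connect(":memory:")   # stand-in for the mine-model database

lims.execute("CREATE TABLE results (sample_id TEXT, analyte TEXT, value REAL)")
lims.executemany("INSERT INTO results VALUES (?, ?, ?)",
                 [("BH-0001", "Au", 2.31), ("BH-0002", "Au", 0.87)])

mine.execute("CREATE TABLE assays (sample_id TEXT, analyte TEXT, value REAL)")

# 'Pull' the results across; parameters are passed separately to the driver,
# never spliced into the SQL string.
rows = lims.execute("SELECT sample_id, analyte, value FROM results").fetchall()
mine.executemany("INSERT INTO assays VALUES (?, ?, ?)", rows)
mine.commit()

print(mine.execute("SELECT COUNT(*) FROM assays").fetchone()[0])  # 2
```

Because both ends speak the same protocol, the mine model can pull approved assays directly, and the same connection can push sample lists the other way.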
There are two general methods of data transfer between systems, known as ‘push’ and ‘pull’. As their names imply, these methods differ in where the trigger event that initiates the exchange is generated: in a push, the source system sends data when it is ready; in a pull, the destination system requests it. Users should choose whichever is appropriate. A variation of these methods is to use drop boxes for data transfer instead of allowing direct access. Applications poll the drop boxes periodically, looking for new data, and purge the drop box after importing.
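A drop box can be as simple as a shared directory. The sketch below polls such a directory, imports whatever files it finds, and purges them afterwards; the directory layout, file name and one-line file format are illustrative assumptions.

```python
# Hedged sketch of a 'drop box' import: poll a directory for result files,
# import each one, then purge it. File names and format are illustrative.
import os
import tempfile

def poll_drop_box(path, import_fn):
    """Import every file found in the drop box, then delete it."""
    imported = []
    for name in sorted(os.listdir(path)):
        full = os.path.join(path, name)
        with open(full) as f:
            import_fn(f.read())
        os.remove(full)          # purge only after a successful import
        imported.append(name)
    return imported

dropbox = tempfile.mkdtemp()     # stand-in for the shared drop-box folder
with open(os.path.join(dropbox, "batch1.csv"), "w") as f:
    f.write("BH-0001,Au,2.31\n")

records = []
print(poll_drop_box(dropbox, records.append))  # ['batch1.csv']
print(records)
```

In practice the polling function would run on a timer, and the import step would parse the file and insert its rows into the destination database.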
As before, users must understand the security considerations of opening database systems for access and modification. SQL, the language by which most relational database systems access and manipulate data, is a phenomenally powerful and unforgiving language. A single line of code can wipe out millions of records at once! Having said this, there are several things that users can do to mitigate the potential for disaster. Among them:
- Do not give users direct access to tables – use stored procedures instead.
- Implement logging of database activity and review the logs to see who is doing what.
- Consider the implications of bad data. Make sure to have sufficient table-level constraints to protect against null values, incorrect data types, unmatched foreign keys etc.
- Set up alerts (e.g. email notification) to advise users of any drop-box activity.
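The constraint point above is worth illustrating. The sketch below uses SQLite to show table-level constraints rejecting a null value, an unmatched foreign key and an impossible negative assay before they ever reach the data; the table and column names are illustrative assumptions.

```python
# Hedged sketch of table-level constraints guarding against bad data,
# using SQLite. Table and column names are illustrative assumptions.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked
db.executescript("""
    CREATE TABLE samples (
        sample_id TEXT PRIMARY KEY
    );
    CREATE TABLE results (
        sample_id TEXT NOT NULL REFERENCES samples(sample_id),
        analyte   TEXT NOT NULL,
        value     REAL NOT NULL CHECK (value >= 0)
    );
""")
db.execute("INSERT INTO samples VALUES ('BH-0001')")
db.execute("INSERT INTO results VALUES ('BH-0001', 'Au', 2.31)")  # accepted

for bad in [("BH-9999", "Au", 1.0),    # unmatched foreign key
            ("BH-0001", None, 1.0),    # NULL analyte
            ("BH-0001", "Au", -5.0)]:  # impossible negative assay
    try:
        db.execute("INSERT INTO results VALUES (?, ?, ?)", bad)
    except sqlite3.IntegrityError as e:
        print("rejected:", e)
```

Only the valid row survives; the database itself, not the application code, turns away each of the three bad inserts.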
The final example looks at the needs of the process department.
Key features of process department analyses are:
- Very regular sample frequency, covering the full 24-hour day.
- Wide variety of sample types and matrices.
- The number of samples submitted does not vary greatly between batches.
- Generally request a narrow and very specific range of analyses.
- Traditionally receive results via spreadsheet or fixed-format printed report.
The process department has the following specific requirements for their analyses:
- Steps must be taken to ensure that priority samples are processed promptly, the moment they arrive at the lab.
- Plant operators require real-time access to process assays, to enable them to react quickly to process upsets and to monitor and control the plant more effectively.
- Supervisors should receive immediate notification when something goes out of control.
- The process manager needs to be able to generate and download historical data sets of results for trend and other analysis.
Based on an analysis of the process department requirements, we can make the following recommendations:
- Automate login for routine samples using a scheduling utility.
- Establish a system of priorities so that critical samples are processed first.
- Provide a Windows client interface to the LIMS for plant operators.
- Establish compliance tables and alerts so that appropriate personnel are informed automatically when results are out of compliance.
- Provide a Windows client interface for the process manager so that he or she can generate historical data sets for process analyses.
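The compliance-alert recommendation can be sketched in a few lines: compare each result against limits drawn from a compliance table, and collect alert messages for the supervisor. The analytes, limit values and notification step are illustrative assumptions; a real system might send the alerts by email rather than printing them.

```python
# Hedged sketch of a compliance check against a limits table.
# Analytes and limit values are illustrative assumptions.
limits = {                      # analyte -> (low, high)
    "pH": (6.5, 9.0),
    "CN_free": (0.0, 0.5),
}

def check_compliance(results):
    """Return alert messages for results outside their limits."""
    alerts = []
    for sample_id, analyte, value in results:
        low, high = limits[analyte]
        if not (low <= value <= high):
            alerts.append(f"{sample_id}: {analyte}={value} outside [{low}, {high}]")
    return alerts

results = [("TAILS-01", "pH", 7.2), ("TAILS-01", "CN_free", 0.82)]
for msg in check_compliance(results):
    print("ALERT:", msg)
```

Run on every batch of approved results, a check like this gives supervisors immediate notification when something goes out of control, rather than leaving the discovery to the month-end report.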
Essentially, this option entails extending the LIMS outside the lab. As with the other examples, users must make sure they understand all of the security implications involved. If the LIMS allows fine-grained security, external users should be given as few permissions as possible; basically just the minimum required to enable them to retrieve the data they need.
This solution is suited to cases where the outside users and the Lab are located on the same LAN. If that is not the case, and if connection speeds are too slow, consider using a thin client instead.
The following table summarizes the findings of our analysis for the three different areas.
Exploration geology: browser-based web interface for remote access over the Internet.
Mining department: custom utilities to link mining spreadsheets and databases with the LIMS.
Processing department: Windows client enabling direct access to LIMS data.