Literature

The SIX-S team strives to remain scientifically and practically active and up to date. The following is a selection of recognised publications from our team members.

Books

Introduction to the Theory and Practice of Sampling

Kim H. Esbensen (2020)

ISBN: 978-1-906715-29-8

Abstract:

“Sampling is not gambling”. Analytical results that form the basis for decision-making in science, technology, industry and society must be relevant, valid and reliable. However, analytical results cannot be considered in isolation from the specific conditions under which they were produced. Sampling is the critical success factor prior to analysis, which should only be carried out on documented representative samples. The journey from heterogeneous materials in “lots” (in the kiloton range) to the tiny laboratory aliquots (in the g–µg range) that are actually analysed is very long and complex. There are specific principles and rules behind representativity.
The textbook “Introduction to the Theory and Practice of Sampling” (2018) is a popular vehicle for getting started in the field of representative sampling; since its publication it has been a major outreach offering to all individuals and groups with responsibility for sampling in all of science, industry, technology, commerce, trading and society, not least within environmental impact monitoring and mitigation.

Multivariate Data Analysis

Kim H. Esbensen and Brad Swarbrick (2017)

Abstract:

Esbensen is the sole author of five editions of the highly appreciated textbook “Multivariate Data Analysis – An Introduction” (33,000 copies in the period 2001–2015), used at university and in in-house courses for companies. It has been the literature background for CAMO’s professional courses since 2001. After two decades of continued use, 2017 finally saw completion of the 6th edition, Esbensen & Swarbrick: “Multivariate Data Analysis – An Introduction to Multivariate Analysis, Process Analytical Technology and Quality by Design” (published at the end of 2017).

Characteristics of sensor-based sorting and implementation in mining

Christopher Robben (2014)

ISBN: 978-3-8440-2498-2

Abstract:

Sensor-based sorting is a sustainable processing technology for the separation of coarse materials that has an impact on all processes in the mineral production chain. For many raw materials and applications it is still at an early market stage and far from technical saturation, as the development of sensor technology and mechanical platforms is still accelerating.
The study evaluates the technical and financial characteristics of sensor-based sorting and presents a framework and methodology for project development and assessment, implementation, efficiency testing and optimisation, and future research and development.

Articles

Determination of Gold Particle Characteristics for Sampling Protocol Optimisation

Simon Dominy, Ian M. Platten, Hylke J. Glass

In: Minerals (2021)

ISSN: 2075-163X

Abstract:

Sampling, sample preparation, and assay protocols aim to achieve an acceptable estimation variance, as expressed by a relatively low nugget variance compared to the sill of the variogram. With gold ore, the typical heterogeneity and low grade generally indicate that a large sample size is required, and the effectiveness of the sampling protocol merits attention. While sampling protocols can be optimised using the Theory of Sampling, this requires determination of the liberation diameter (dℓAu) of gold, which is linked to the size of the gold particles present. In practice, the liberation diameter of gold is often represented by the most influential particle size fraction, which is the coarsest size. It is important to understand the occurrence of gold particle clustering and the proportion of coarse versus fine gold. This paper presents a case study from the former high-grade Crystal Hill mine, Australia. Visible gold-bearing laminated quartz vein (LV) ore was scanned using X-ray computed micro-tomography (XCT). Gold particle size and its distribution in the context of liberation diameter and clustering were investigated. A combined mineralogical and metallurgical test programme identified a liberation diameter value of 850 µm for run of mine (ROM) ore. XCT data were integrated with field observations to define gold particle clusters, which ranged from 3–5 mm equivalent spherical diameter in ROM ore to >10 mm for very high-grade ore. For ROM ore with clusters of gold particles, a representative sample mass is estimated to be 45 kg. For very high-grade ore, this rises to 500 kg or more. An optimised grade control sampling protocol is recommended based on 11 kg panel samples taken proportionally across 0.7 m of LV, which provides 44 kg across four mine faces. An assay protocol using the PhotonAssay technique is recommended.
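To give a feel for how fragment size and sample mass interact in such protocol optimisation, here is a minimal sketch of the calibrated Gy/François-Bongarçon model of the Fundamental Sampling Error often applied to gold ores. The constants K and alpha below are hypothetical placeholders for illustration, not the calibrated values behind the paper's 45 kg and 500 kg estimates.

```python
import math

def fse_rel_std(sample_mass_g: float, d_nom_cm: float,
                K: float, alpha: float = 1.5) -> float:
    """Relative standard deviation of the Fundamental Sampling Error (FSE)
    under the calibrated Gy / Francois-Bongarcon model:
        var(FSE) = K * d^alpha / M_s
    d_nom_cm : nominal top fragment size (cm)
    K, alpha : sampling constants calibrated per deposit (assumed here)."""
    variance = K * d_nom_cm**alpha / sample_mass_g
    return math.sqrt(variance)

# Illustrative only: FSE at a 1 cm nominal fragment size for several
# sample masses, assuming a hypothetical K of 100.
K_assumed = 100.0
for mass_kg in (1, 11, 45, 500):
    rsd = fse_rel_std(mass_kg * 1000, d_nom_cm=1.0, K=K_assumed)
    print(f"{mass_kg:>4} kg -> FSE ~ {rsd:.1%}")
```

The qualitative behaviour is the relevant point: the FSE variance falls only inversely with sample mass, which is why coarse-gold ores with clustering demand such large representative masses.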

X-ray-transmission based ore sorting at the San Rafael tin mine

Christopher Robben, Pedro Condori, Angel Pinto, Ronald Machaca, Anssi Takala

In: Minerals Engineering (2020)

Abstract:

San Rafael is the largest underground tin mining operation in the world. The main methods for processing the ore are gravity separation and flotation. In 2016 an X-ray-transmission (XRT)-based ore sorting island was added to the plant. The objective was to reject waste from a marginal development waste dump, and it now also treats low-grade material from the underground mine. XRT sorting discriminates particles on a planar projection of the density of matter attenuating X-ray radiation. Physical separation is achieved by means of an array of high-speed air jets. The feed grade of approximately 0.6% tin is concentrated to 2.8% tin in the product at a recovery of 90% and a mass yield to product of about 19%. This enriched product fraction from the sensor-based ore sorting plant feeds into the main plant.

The plant, which is designed for 3600 tonnes per day, went into operation in 2016. The total capital expenditure of US$24 M was paid back within 4 months. The projected contribution to the production of refined tin for the year 2017 was 6000 tonnes. In summary, the positive impacts were:
- Value added from previously sub-economic waste that can be mined with lower costs.
- Increase of productivity in the main wet sections of the plant.
- Reduction in cut-off grade and increase in reserves.
- Reduction of environmental impacts.

The paper describes the project from concept, through the development phase, including test work confirming feasibility, to a discussion of operational data and includes shared experiences from operation and maintenance.
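The headline sorting figures quoted above can be cross-checked with the standard two-product mass balance; a quick sketch using the rounded numbers from the abstract:

```python
# Two-product check of the sorting figures quoted above (rounded values).
feed_grade = 0.6        # % Sn in sorter feed
conc_grade = 2.8        # % Sn in sorter product
mass_yield = 0.19       # fraction of feed mass reporting to product

# Metal recovery = metal in product / metal in feed
recovery = mass_yield * conc_grade / feed_grade
print(f"implied tin recovery: {recovery:.1%}")   # ~88.7%, consistent with ~90%

# Grade of the rejected fraction implied by the balance
reject_grade = (feed_grade - mass_yield * conc_grade) / (1 - mass_yield)
print(f"implied reject grade: {reject_grade:.2f}% Sn")
```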

Importance of representative metallurgical sampling and testwork programmes to reduce project risk – a gold case study

Simon Dominy, Louisa O'Connor, Saranchimeg Purevgerel

In: Mining Technology (2019)

Abstract:

When developing a process flowsheet, the risks in achieving positive financial outcomes are minimised by ensuring representative metallurgical samples and quality testwork. A case study is presented based on an underground gold operation, where poor metallurgical sampling led to grade and recovery underperformance. Sampling-related issues included: poor liaison between geologists and metallurgists; poor domaining; too few metallurgical samples collected and tested; unrepresentative sample composites and sub-samples; poor laboratory practice; and a lack of documentation and QAQC. These issues led to disruption over four years and are estimated to have cost around US$115M in lost revenue and US$7.5M in corrective expenditure. After an initial characterisation programme, a variability mini-bulk sampling and testwork phase was undertaken. This was followed by a pilot programme, progressing to trial mining and production. The paper emphasises the need for fit-for-purpose metallurgical sampling and testwork, and the early application of variability sampling.

Strategic and Tactical Geometallurgical Application in an Underground High-Grade Narrow-Vein Gold Operation

Simon Dominy, Louisa O’Connor, Hylke Glass, Saranchimeg Purevgerel

In: Proceedings of the 28th International Symposium on Mine Planning and Equipment Selection - MPES 2019 (2019)

ISBN: 978-3-030-33953-1

Abstract:

Vein gold deposits are often characterised by multiple sub-parallel veins and free-milling coarse gold. Inherent heterogeneity results in grade and process parameter variability, which increases project risk if not quantified and controlled. The geometallurgical approach can be broadly split into two activities: strategic and tactical. The strategic approach focuses on the whole orebody and the long-term life-of-mine view, whereas tactical geometallurgy relates to a more short- to medium-term view during mining. The geometallurgical approach requires spatially distributed samples within a deposit to support variability modelling. A variability sampling and testwork protocol was developed to quantify gold grade and recovery. Additional attributes from core logging and mineralogical determination were integrated with grade and recovery data. This contribution presents a case study of strategic and tactical geometallurgical programme application to a narrow-vein deposit. It exemplifies how data can be used to support resource estimation, a pre-feasibility study, trial mining and production. Subsequent to the commencement of production, a tactical geometallurgical/ore control programme was introduced to optimise mine scheduling and process activities.

Sensor-based ore sorting technology – past, present and future

Christopher Robben, Hermann Wotruba

In: Minerals (2019)

ISSN: 2075-163X

Abstract:

While the quality of mineral raw material deposits is constantly decreasing, the challenges for sustainable raw material processing are increasing. This applies not only to the demand for minimizing the consumption of energy, water, and reagents, but also to the reduction of residual materials, especially fine and difficult-to-landfill materials. Sensor-based ore sorting can be used as a separation process for coarser grain sizes before the application of fine comminution and separation technologies and is applicable to a large variety of mineral raw materials. Sensor-based ore sorting can be applied at various points in the process flow diagram and is suitable for waste elimination, for material diversion into different process lines, for the production of pre- and final concentrates, as well as for the reprocessing of coarse-grained waste dumps and other applications. The article gives an overview of the development and state of the art of sensor-based ore sorting for mineral raw materials and introduces various applications.

Sensor-based ore sorting to maximise profit in a gold operation

B. Nielsen, J. Rohleder, H. Lehto, C. Robben

In: AusIMM Bulletin (2018)

Abstract:

Sensor-based ore sorting is being increasingly used to reduce the amount of low-grade and waste material processed in mineral concentrators. This type of preconcentration provides bottom-line benefits to users by reducing the amount of energy, water and consumables, as well as reducing capital cost. Existing operations can increase metal production, while previously uneconomic deposits and low-grade stockpiles can also be exploited. The technology can also be used to separate ore types for selective processing. The path to implementing sensor-based sorting may include:
- geometallurgical evaluation
- first inspection testing to investigate sensor response
- bench-scale testing where sensor selection is not obvious or for difficult applications
- performance testing in full-scale sensor-based sorting machines
- larger scale site-based piloting with a temporary semi-mobile plant installation.
Sorting requires material to be suitably prepared and presented to the machines; typically this consists of crushing and screening to limit top size and optimise liberation. However, where material streams are suitably sized and prepared, additional equipment may not be required, such as in the sorting of semi-autogenous grinding (SAG) mill pebble streams. This paper presents a case study of economic upgrading of gold ore by preconcentration with sensor-based ore sorting. The case study examines sorting amenability, test work and the feasibility study through to implementation, with associated flow sheet development. The development process is analysed and evaluated with a view to rationalising the process for development of future projects. In addition, limited financial modelling based on expected results is shown to illustrate the benefit to the operation.
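As a hedged, back-of-envelope illustration of why such preconcentration pays: rejecting barren mass ahead of the mill lets the same mill capacity treat upgraded feed. All figures below are assumptions for illustration, not the paper's case-study data.

```python
# Hypothetical preconcentration arithmetic -- not the paper's financial model.
mill_capacity_tpd = 1000.0      # mill throughput, tonnes per day (assumed)
feed_grade_gpt = 1.2            # run-of-mine gold grade, g/t (assumed)
sort_yield = 0.60               # mass fraction passed on to the mill (assumed)
sort_recovery = 0.95            # fraction of gold retained by the sorter (assumed)

# Without sorting: the mill sees run-of-mine grade.
metal_no_sort = mill_capacity_tpd * feed_grade_gpt

# With sorting: the mill is fed sorted product at an upgraded grade, so the
# mine can supply more run-of-mine tonnes for the same mill tonnage.
upgraded_grade = feed_grade_gpt * sort_recovery / sort_yield
metal_with_sort = mill_capacity_tpd * upgraded_grade

print(f"gold to mill without sorting: {metal_no_sort:.0f} g/day")
print(f"gold to mill with sorting:    {metal_with_sort:.0f} g/day")
```

With these assumed numbers the mill produces roughly half again as much metal per day, at the cost of the 5% of gold rejected by the sorter; the trade-off between sorter recovery and mass yield is exactly what the amenability test work quantifies.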

Theory of sampling (TOS) versus measurement uncertainty (MU) – A call for integration

Kim H Esbensen, Claas Wagner

In: TrAC Trends in Analytical Chemistry (2014)

Abstract:

We assess current approaches to measurement uncertainty (MU) with respect to the complete ensemble of sources affecting the measurement process, in particular the extent to which sampling errors as set out in the Theory of Sampling (TOS) are appropriately considered in the GUM and EURACHEM/CITAC guides. All pre-analysis sampling steps play an important, often dominant role in the total uncertainty budget, thereby critically affecting the validity of MU estimates, but most of these contributions are not included in the current MU framework. The TOS constitutes the only complete theoretical platform for dealing appropriately with the entire pathway from field sample to test portion. We here propose a way to reconcile the often strongly felt differences between MU and TOS. There is no need to debate terminology, as both TOS and MU can be left with their current usages.
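The paper's central point, that pre-analysis sampling often dominates the uncertainty budget, follows directly from how independent uncertainty contributions combine in quadrature; a small numerical illustration with assumed values:

```python
import math

# Illustrative uncertainty budget (assumed values, relative, 1-sigma):
u_sampling = 0.20   # combined pre-analysis sampling uncertainty
u_analysis = 0.02   # analytical uncertainty alone

# Independent contributions combine in quadrature:
u_total = math.sqrt(u_sampling**2 + u_analysis**2)
print(f"total MU: {u_total:.3f}")                                # ~0.201
print(f"variance share from sampling: {u_sampling**2 / u_total**2:.1%}")  # ~99%
```

With sampling uncertainty an order of magnitude larger than analytical uncertainty, the analytical contribution to the total budget is negligible, which is why an MU estimate that omits the sampling steps can be severely misleading.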

Experiences in dry coarse coal separation using X-ray-transmission-based sorting

Christopher Robben, Johan de Korte, Hermann Wotruba, Mathilde Robben

In: International Journal of Coal Preparation and Utilization (2014)

ISSN: 1939-2699

Abstract:

Coal sorting with an X-ray-transmission (XRT) sensor has seen considerable development since it was first described by Jenkinson et al. in 1974. Powerful computers and increasingly sensitive X-ray scintillation counters have enabled the development of high-performance sensor-based sorting machines. The first industrial installation of belt-type XRT sorting for coal has been in operation since 2004; it upgrades high-quality coal from 1% ash to 0.7% ash for cathode production. Since 2010, chute-type sorters have been available in the South African market and, alongside production, have been used for extensive test work. Improvements in both the separation efficiency and the availability of containerized semi-mobile systems have been achieved. The article introduces the principle of the technology and summarizes the latest experiences regarding both separation efficiency and operational stability.

Process Sampling: Theory of Sampling – the Missing Link in Process Analytical Technologies (PAT)

Kim H. Esbensen, Peter Paasch-Mortensen

In: Katherine A. Bakeev: Process Analytical Technology (2010)

ISBN: 9780470722077

Abstract:

Process analytical technology (PAT) and its predecessor, process analytical chemistry (PAC), involve chemometric analysis and modelling based on adequate data quality in terms of what the sensor signals/measured values [X,Y] represent in the industrial process environment. Both [X,Y] data must be obtained through representative sampling to ensure accuracy in terms of batch characteristics and precision in terms of sampling and analytical errors that have not been eliminated. Chemometric data models must adhere closely to reliable performance validation, e.g. in terms of prediction, classification or time-series forecasting.
Chain of Evidence

Without representative process samples, the chain of evidence linking analytical results to lot characteristics is broken at all scales due to inherent material heterogeneity. This applies to both samples and sensor signals. It is not possible to correct for the shortcomings of non-representativeness in the subsequent analytical modelling of the data. Since the analytical errors are usually one or two orders of magnitude smaller than the combined sampling errors, the dominant aspect of “data quality” in practice depends almost entirely on the sampling. Based on the Theory of Sampling (TOS), a minimum set of operating principles can be outlined for both zero-dimensional (0-D) lots (batch samples) and one-dimensional (1-D) lots (process samples), including characterisation of heterogeneity, systematisation of the eight sampling errors and four practical sampling unit operations (SUOs) to combat heterogeneity. For more detailed information, see Chapter 3, “Theory of Sampling (TOS) – the missing link in process analytics”, in the book Process Analytical Technology.
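For 1-D lots, TOS characterises process heterogeneity through a variogram of increments taken along the process stream. The following is a minimal sketch of such a variographic characterisation on synthetic data (illustrative only; not code from the chapter):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D process stream: trend + cyclic component + random noise,
# mimicking the heterogeneity contributions TOS distinguishes for 1-D lots.
t = np.arange(200)
stream = 10 + 0.01 * t + 0.5 * np.sin(2 * np.pi * t / 24) \
         + rng.normal(0, 0.3, t.size)

def empirical_variogram(a: np.ndarray, max_lag: int) -> np.ndarray:
    """V(j) = half the mean squared increment at lag j (TOS convention)."""
    return np.array([0.5 * np.mean((a[j:] - a[:-j]) ** 2)
                     for j in range(1, max_lag + 1)])

v = empirical_variogram(stream, max_lag=30)
# The nugget (extrapolation of V to lag 0) bounds the minimum attainable
# sampling variance; the rise with lag reflects trends and cycles.
print(np.round(v[:5], 3))
```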

Principles of Proper Validation: use and abuse of re-sampling for validation

Kim H. Esbensen, Paul Geladi

In: Journal of Chemometrics (2010)

Abstract:

Validation in chemometrics is presented using the exemplar context of multivariate calibration/prediction. A phenomenological analysis of common validation practices in data analysis and chemometrics leads to formulation of a set of generic Principles of Proper Validation (PPV), based on a set of characterizing distinctions: (i) Validation cannot be understood by focusing on the methods of validation only; validation must be based on full knowledge of the underlying definitions, objectives, methods, effects and consequences, which are all outlined and discussed here. (ii) Analysis of proper validation objectives implies that there is only one valid paradigm: test set validation. (iii) Contrary to much contemporary chemometric practice (and validation myths), cross-validation is shown to be unjustified in the form of monolithic application of a one-for-all procedure (segmented cross-validation) to all data sets. Within its own design and scope, cross-validation is in reality a sub-optimal simulation of test set validation, crippled by a critical sampling variance omission, as it is manifestly based on one data set only (the training data set). Other re-sampling validation methods suffer from the same deficiencies. The PPV are universal and can be applied to all situations in which the assessment of performance is desired: prediction, classification, time series forecasting and modeling validation. The key element of PPV is the Theory of Sampling (TOS), which allows insight into all variance-generating factors, especially the so-called incorrect sampling errors which, if not properly eliminated, are responsible for a fatal, inconstant sampling bias for which no statistical correction is possible. In the light of TOS it is shown how a second data set (test set, validation set) is critically necessary for the inclusion of the sampling errors incurred in all ‘future’ situations in which the validated model must perform. Logically, therefore, all one-data-set re-sampling approaches to validation, especially cross-validation and leverage-corrected validation, should be terminated, or at the very least used only with full scientific understanding and disclosure of their detrimental variance omissions and consequences. Regarding PLS regression, an emphatic call is made for stringent commitment to test set validation based on graphical inspection of pertinent t–u plots for optimal understanding of the X–Y interrelationships and for validation guidance. QSAR/QSAP forms a partial exemption from the present test set imperative, with no generalization potential.
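To make the contrast concrete, the sketch below compares cross-validation on a single training set with validation against an independently drawn test set, using synthetic data in which the new draw carries extra "sampling" variance that cross-validation never sees. This is an illustration of the argument, not the authors' code:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def draw_data(n: int, noise: float):
    """Synthetic X-Y data; each draw mimics an independent sampling event."""
    X = rng.normal(size=(n, 10))
    y = X[:, :3].sum(axis=1) + rng.normal(0, noise, n)
    return X, y

# One training set, as used by cross-validation ...
X_train, y_train = draw_data(100, noise=0.5)
model = PLSRegression(n_components=3)
cv_r2 = cross_val_score(model, X_train, y_train, cv=5).mean()

# ... versus a test set drawn as a *new* sampling event carrying extra
# 'sampling' variance, which resampling the training set cannot capture.
X_test, y_test = draw_data(100, noise=0.8)
test_r2 = model.fit(X_train, y_train).score(X_test, y_test)

print(f"cross-validated R2:      {cv_r2:.2f}")
print(f"independent test-set R2: {test_r2:.2f}")
```

The cross-validated figure is systematically more optimistic than the test-set figure, which is the variance omission the paper argues makes one-data-set re-sampling an unsafe basis for performance claims.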

Representative sampling for reliable data analysis: Theory of Sampling

Lars Petersen, Pentti Minkkinen, Kim H. Esbensen

In: Chemometrics and Intelligent Laboratory Systems (2005)

Abstract:

The Theory of Sampling (TOS) provides a description of all errors involved in sampling of heterogeneous materials as well as all necessary tools for their evaluation, elimination and/or minimization. This tutorial elaborates on, and illustrates, selected central aspects of TOS. The theoretical aspects are illustrated with many practical examples of TOS at work in typical scenarios, presented to yield a general overview. TOS provides a full scientific definition of the concept of sampling correctness, an attribute of the sampling process that must never be compromised. For this purpose the Fundamental Sampling Principle (FSP) also receives special attention. TOS provides the first complete scientific definition of sampling representativeness. Only correct (unbiased) mass reduction will ensure representative sampling. It is essential to induct scientific and technological professions into the TOS regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the minuscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data analysis (“data” do not exist in isolation from their provenance). The Total Sampling Error (TSE) is by far the dominant contribution to all analytical endeavours, often 100+ times larger than the Total Analytical Error (TAE). We present a summarizing set of only seven Sampling Unit Operations (SUOs) that fully covers all practical aspects of sampling and provides a handy “toolbox” for samplers, engineers, laboratory and scientific personnel.

Representative mass reduction in sampling—a critical survey of techniques and hardware

Lars Petersen, Casper K Dahl, Kim H Esbensen

In: Chemometrics and Intelligent Laboratory Systems (2004)

Abstract:

We present a comprehensive survey of mass reduction principles and of the hardware currently available on the market. We conduct a rigorous comparison of the performance of 17 field and/or laboratory instruments or methods, which are quantitatively characterized (and ranked) for accuracy (bias), reproducibility (precision), material loss (external as well as internal), user-dependency, operation time and ease of cleaning. Graphical comparison of these quantitative results allows a complete overview of the relative strengths and weaknesses of riffle splitters, various rotational dividers, the Boerner Divider, the “spoon method”, alternate/fractional shoveling and grab sampling.

Only devices based on riffle splitting principles (static or rotational) pass the ultimate representativity test (with minor, but significant, relative differences). Grab sampling, the overwhelmingly most used mass reduction method, performs appallingly; its use must be discontinued (with the singular exception of completely homogenized fine powders). Only proper mass reduction (i.e. carried out in complete compliance with all appropriate design principles, maintenance and cleaning rules) can always be representative in the full Theory of Sampling (TOS) sense. This survey also allows empirical verification of the merits of the famous “Gy's formula” for order-of-magnitude estimation of the Fundamental Sampling Error (FSE).
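A toy Monte Carlo makes the ranking plausible: in a segregated lot, a grab sample is one local increment, whereas a correct riffle split gives every fragment an equal selection probability. The simplified model below is an assumption for illustration, not the survey's experimental design:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_lot(n: int = 1000) -> np.ndarray:
    """A segregated lot: analyte concentration drifts along the lot."""
    return 5.0 + np.linspace(-1.0, 1.0, n) + rng.normal(0, 0.2, n)

grab, riffle = [], []
for _ in range(2000):
    lot = simulate_lot()
    # Grab sampling: a single scoop of 50 fragments from one random spot.
    start = rng.integers(0, len(lot) - 50)
    grab.append(lot[start:start + 50].mean())
    # Riffle splitting (idealised): every fragment has equal odds of
    # selection, so the split is a random 1/20 subset of the whole lot.
    riffle.append(rng.choice(lot, size=50, replace=False).mean())

true_mean = 5.0
print(f"grab:   bias={np.mean(grab)-true_mean:+.3f}, sd={np.std(grab):.3f}")
print(f"riffle: bias={np.mean(riffle)-true_mean:+.3f}, sd={np.std(riffle):.3f}")
```

In this simplified setting the grab-sample standard deviation is several times that of the riffle split, mirroring the reproducibility gap the survey measures on real hardware.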

Small-scale vein-gold exploitation in Gwynfynydd Mine, Wales, UK

Simon C. Dominy, Roland F.G. Phelps

In: Mine Planning and Equipment Selection (1997)

ISBN: 9781003078166

Abstract:

The black shale-hosted veins of Gwynfynydd mine have intermittently produced more than 1.4 tonnes of gold since 1863. Reopened in 1992, the mine extracts 5,000 tonnes annually from a total resource of 180,000 tonnes. A historical mining grade of 15.2 grammes/tonne is given for the Chidlaw Lode (159,000 tonnes total resource), on which production is based. The mineralization is characterized by erratically distributed gold pockets hosted in the footwall zone of a 3–6 m wide mother vein. The pockets are found within a discrete east-plunging oreshoot associated with the development of impersistent narrow footwall leader vein(s). The gold exhibits a high nugget effect and, as a consequence, the accurate determination of tonnage and grade in particular blocks is difficult. Development is facilitated by lode drives, raises and sub-levels, which assist with geological evaluation and block definition. Production is undertaken by overhand shrinkage stoping with an average stope width of 1.5 m. A 1-tonne-capacity load-and-carry dumper has recently been introduced on lode drives, with slushing in stopes and sub-levels; this has reduced double handling and increased output. An underground mill based on physical separation methods has been designed in response to environmental constraints.

Narrow-vein mining – a challenge to the operator

Simon C. Dominy, G. Simon Camm, Roland F.G. Phelps

In: Mine Planning and Equipment Selection (1997)

ISBN: 9781003078166

Abstract:

Narrow veins represent an important resource of metals throughout the world; however, they are generally technically challenging to exploit because of their geologically complex nature. The requirement of the operator is to produce ore with little dilution, within the framework of economic viability and safety. A number of factors are key: effective reserve evaluation, mine design, grade control and reconciliation. Reserves define the grade and tonnage of the deposit and are based on the collection and modelling of geological data. The geological reserve represents the in-situ grade and tonnage, from which mine design activities produce the mining reserve after accounting for dilution and pillar design. Narrow-vein exploitation has traditionally been labour intensive, utilizing methods such as overhand shrinkage stoping, though the application of bulk and mechanized methods has improved productivity and lowered costs. Grade control ensures that the mine head grade is above cut-off and monitors planned and additional dilution. Throughout evaluation and mining, high-quality geological data and interpretation are the key to success and should be undertaken in collaboration with engineers.

Principal component analysis

Svante Wold, Kim Esbensen, Paul Geladi

In: Chemometrics and Intelligent Laboratory Systems (1987)

Abstract:

Principal component analysis of a data matrix extracts the dominant patterns in the matrix in terms of a complementary set of score and loading plots. It is the responsibility of the data analyst to formulate the scientific issue at hand in terms of PC projections, PLS regressions, etc. Ask yourself, or the investigator, why the data matrix was collected, and for what purpose the experiments and measurements were made. Specify before the analysis what kinds of patterns you would expect and what you would find exciting. The results of the analysis depend on the scaling of the matrix, which therefore must be specified. Variance scaling, where each variable is scaled to unit variance, can be recommended for general use, provided that almost constant variables are left unscaled. Combining different types of variables warrants blockscaling. In the initial analysis, look for outliers and strong groupings in the plots, indicating that the data matrix perhaps should be “polished” or whether disjoint modeling is the proper course. For plotting purposes, two or three principal components are usually sufficient, but for modeling purposes the number of significant components should be properly determined, e.g. by cross-validation. Use the resulting principal components to guide your continued investigation or chemical experimentation, not as an end in itself.
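A minimal sketch of the workflow the abstract recommends (variance scaling, PCA via singular value decomposition, then inspection of scores for outliers and groupings), on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic data matrix: 50 samples x 6 variables with two latent factors.
scores_true = rng.normal(size=(50, 2))
loadings_true = rng.normal(size=(2, 6))
X = scores_true @ loadings_true + rng.normal(0, 0.1, (50, 6))

# Variance scaling: centre each variable and scale to unit variance
# (recommended above, provided near-constant variables are left unscaled).
Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# PCA via SVD: scores = U * S, loadings = rows of Vt.
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * S
explained = S**2 / np.sum(S**2)

print("explained variance per PC:", np.round(explained, 3))
# Score-plot coordinates for PC1 vs PC2 -- inspect for outliers/groupings:
print(scores[:5, :2])
```

Here two components carry nearly all the variance, as built into the synthetic data; with real data, the number of significant components should be determined properly, e.g. by cross-validation, as the abstract notes.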