2.4 Monitoring

Monitoring of air quality is commonly required to establish baseline conditions for use in an air quality assessment before a mining proposal is decided. It may also be required after operations start, either for model validation or as part of an ongoing air quality management plan. The reader is also referred to the leading practice handbook Evaluating performance: monitoring and auditing (DRET, 2009).

2.4.1 Monitoring design and logistics

The sensible design and operation of a monitoring program involves strategic and logistical considerations. Many a monitoring program has failed to yield its potential value because of inadequate planning and poor quality control.

The purpose of monitoring influences its design. If it is required as part of a baseline study, the authorities will require a certain period of data to be gathered (usually at least a year) to capture seasonal variations. It is necessary to adhere to accepted standards for instrument choice, siting, calibration and data management. Similar requirements will apply if the monitoring is required for compliance and model validation as part of a licence condition. However, if dust monitoring is undertaken for real-time management, where responsiveness and flexibility are more important than data precision, compliance of instruments with Australian Standards or other approved methods is less critical.

Baseline studies

If a monitoring program is required as part of a baseline air quality study, it is important to check with the agency overseeing the approval process to clarify what specific monitoring is expected. For mines, the relevant aspect is usually dust, typically represented by PM10. A monitoring program may require simultaneous measurement of some combination of TSP, PM10 and PM2.5 (PM10 is the most widely used indicator, while PM2.5 is routinely considered in some jurisdictions). Various types of instrumentation can achieve this, but the choice of instrument determines whether it is possible to gather continuous data averaged over, for example, 10-minute periods or 24 hours.
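The effect of the averaging interval can be illustrated with a simple calculation. The sketch below, using hypothetical 10-minute PM10 readings, shows how a short dust event that dominates a few intervals is diluted in the 24-hour mean (all values and the handling of missing intervals are illustrative assumptions, not a prescribed method):

```python
# Illustrative only: averaging hypothetical 10-minute PM10 readings
# (micrograms per cubic metre) into a 24-hour mean. A full day of
# 10-minute data contains 144 intervals.

def daily_average(readings_ug_m3):
    """Return the mean PM10 concentration over the period, ignoring
    missing (None) intervals so partial data loss does not bias the sum."""
    valid = [r for r in readings_ug_m3 if r is not None]
    if not valid:
        raise ValueError("no valid readings in the averaging period")
    return sum(valid) / len(valid)

# A hypothetical day: a one-hour-forty-minute dust event pushes short-term
# readings far above the daily mean; 10-minute data would reveal it, but a
# single 24-hour sample would not.
day = [20.0] * 130 + [250.0] * 10 + [None] * 4   # 144 ten-minute slots
print(round(daily_average(day), 1))   # -> 36.4
```

Here the event raises 10-minute readings to 250 µg/m³, yet the 24-hour mean remains near 36 µg/m³, which is why short-interval data are preferred for source identification.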

Measurements over short intervals (such as 10 minutes) provide a better basis for identifying and understanding the sources of emissions. However, this requires meteorological data as well: the simultaneous monitoring of wind speed and wind direction, as a minimum. A well-configured weather station forming part of an air quality monitoring program will also measure fluctuations in wind speed and direction (the fluctuations are indicators of turbulence), solar or net radiation, temperature, rainfall, air pressure and humidity. It may also have wind and temperature sensors at two or more levels above the ground.
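As a sketch of how a turbulence indicator is derived from logged wind directions, the standard deviation of wind direction (sigma-theta) can be estimated with the Yamartino single-pass approximation, which handles the 0/360-degree wrap-around correctly. The sample values below are hypothetical:

```python
import math

def sigma_theta(directions_deg):
    """Estimate the standard deviation of wind direction (sigma-theta),
    a common turbulence indicator, using the Yamartino single-pass
    approximation (correct across the 0/360 degree discontinuity)."""
    rad = [math.radians(d) for d in directions_deg]
    sa = sum(math.sin(r) for r in rad) / len(rad)   # mean sine
    ca = sum(math.cos(r) for r in rad) / len(rad)   # mean cosine
    eps = math.sqrt(max(0.0, 1.0 - (sa * sa + ca * ca)))
    return math.degrees(math.asin(eps) * (1.0 + 0.1547 * eps ** 3))

# Hypothetical samples straddling north: a naive arithmetic standard
# deviation of values such as 359 and 1 degrees would be badly wrong,
# but the circular estimate stays near the true spread.
print(round(sigma_theta([359.0, 1.0, 0.0, 358.0, 2.0]), 1))
```

A naive standard deviation of these five samples would exceed 170 degrees; the circular estimate correctly reports a spread of about 1.4 degrees.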

Deposition monitoring

An aspect of dust that most directly affects neighbours of mining operations is fallout or deposition. The accumulation of dust deposits causes annoyance because of its aesthetic impact and the need for frequent cleaning. Dust deposition monitoring using dust gauges is a simple method and more directly measures the cause of complaint than methods that measure suspended dust concentrations (PM10, PM2.5 or TSP).

The standard dust deposition measurement involves passive collection of the sample over a 30-day period, while most dust deposition problems are caused by short events, typically lasting some hours. The standard 30-day sample reveals nothing about the timing of the fallout, and may not be a very good indicator of the level of annoyance caused. Nevertheless, this type of deposition monitoring remains common because it is relatively cheap and simple.
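The arithmetic behind a gauge result is straightforward: the insoluble mass collected is divided by the funnel area and the exposure period. The sketch below uses hypothetical values; gauge dimensions and reporting units should be taken from the applicable standard method:

```python
import math

def deposition_rate(mass_mg, funnel_diameter_m, exposure_days):
    """Convert the insoluble mass collected in a dust deposition gauge
    into a deposition rate in milligrams per square metre per day."""
    area_m2 = math.pi * (funnel_diameter_m / 2.0) ** 2
    return mass_mg / (area_m2 * exposure_days)

# Hypothetical 30-day sample: 150 mg of insoluble solids collected in a
# gauge with a 150 mm diameter funnel.
rate = deposition_rate(150.0, 0.15, 30)
print(round(rate))   # roughly 283 mg/m2/day
```

Note that the same total mass deposited in a single afternoon or spread evenly over the month yields an identical 30-day result, which is the limitation described above.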

Targeted emissions monitoring

In some cases, there may be site-specific issues that require more targeted monitoring. For example, if the rock contains significant silica, the respirable crystalline silica concentration should be measured. If there is a radioactive component, monitoring of radionuclides and/or radon may be important. For most mining proposals and operations, there is no need to be concerned with monitoring of gaseous pollutants such as sulphur dioxide for environmental baseline or compliance purposes.

For operational mines, it is possible that the thresholds for annual reporting to the National Pollutant Inventory (NPI) or under the National Greenhouse and Energy Reporting (NGER) scheme will be triggered. Reporting for these programs typically uses 'default', simple methods based on emission factor calculations, which require input data on material characteristics, activity rates and throughputs: for example, estimating PM emissions from haul roads requires data on vehicle mass, distances travelled, road silt content and daily rainfall.

However, emission factors are relatively crude, especially when used with default, rather than site-specific, values for various inputs. Hence, it may be decided to gather more site-specific data (for example, on road silt content). Depending on the specific emission source and pollutant under consideration, there will be one or more parameters to be measured on site in order to obtain more reliable emissions data. Such monitoring programs are voluntary, and usually require some specialist input or advice.
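The haul road example can be sketched as follows, using the general form of the US EPA AP-42 (section 13.2.2) equation for unpaved roads at industrial sites. The constants and inputs shown are illustrative only and should be checked against the current AP-42 or the relevant NPI emission estimation technique manual before use:

```python
# Unit conversions: the AP-42 equation works in pounds per vehicle-mile
# travelled (lb/VMT) with vehicle weight in US short tons.
LB_TO_KG = 0.45359
KM_TO_MI = 0.62137
TONNE_TO_SHORT_TON = 1.10231

def haul_road_pm10_kg(silt_pct, vehicle_mass_tonnes, vkt, wet_days=0):
    """Estimate annual PM10 emissions (kg) from an unpaved haul road using
    the form of the AP-42 industrial unpaved-road equation:
        E [lb/VMT] = k * (s/12)**a * (W/3)**b
    with illustrative PM10 constants k = 1.5, a = 0.9, b = 0.45.
    silt_pct: road surface silt content (%); vehicle_mass_tonnes: mean
    vehicle weight (t); vkt: vehicle-kilometres travelled per year."""
    w_tons = vehicle_mass_tonnes * TONNE_TO_SHORT_TON
    e_lb_per_vmt = 1.5 * (silt_pct / 12.0) ** 0.9 * (w_tons / 3.0) ** 0.45
    # Simple natural-mitigation adjustment for days with enough rainfall
    # to suppress dust, reflecting the daily rainfall input noted above.
    e_lb_per_vmt *= (365.0 - wet_days) / 365.0
    return e_lb_per_vmt * LB_TO_KG * vkt * KM_TO_MI

# Hypothetical inputs: 8.4% silt, 120 t loaded haul truck, 50,000
# vehicle-kilometres travelled in a year, 60 wet days.
print(round(haul_road_pm10_kg(8.4, 120.0, 50_000, wet_days=60)))
```

The sensitivity of the result to the silt content input is one reason site-specific sampling, as discussed above, can markedly improve an emissions estimate over default values.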

Instrument selection

The selection of instruments for monitoring is an important step, and needs to take into account any necessary standards (such as Australian Standards, United States Environmental Protection Agency standards, and methods approved by state regulatory authorities), particularly if monitoring is for compliance or statutory purposes. The selection also needs to consider cost, maintenance needs, power requirements, siting (for example exposure to wind), security and site accessibility.

2.4.2 Data quality

Many monitoring programs pay insufficient attention to maximising both data quantity and quality. Data loss can be minimised by regular checking and maintenance of equipment. The sooner sensor or logging problems are identified, the better the chances of rectifying them and limiting data loss. The best results are achieved by having data available in real time or downloaded frequently and checked at least daily.

Data quality is critical, but is often taken for granted. Regular maintenance and calibration of instruments helps to ensure good data quality, but data also need to be regularly screened and checked. Real-time data-checking software with alarms communicated to the user gives the best results, but regular 'reality checking' of data by skilled staff is also very useful. Checks should test whether data are within expected ranges for the season and time of day, and whether expected relationships between measured parameters hold (for example, whether wind speed and temperature increase during the day). If data outliers are identified and checked routinely, instrument errors can be dealt with quickly.
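A minimal range-screening routine of the kind described above might look like the sketch below. The parameter names and plausibility bounds are hypothetical; in practice, the limits would be tuned to the site, season and time of day:

```python
def screen(record, limits):
    """Flag values outside expected ranges. `limits` maps a parameter
    name to a (lower, upper) bound; returns the list of failed checks
    for follow-up by skilled staff rather than silent deletion."""
    flags = []
    for name, (lo, hi) in limits.items():
        value = record.get(name)
        if value is None:
            flags.append(f"{name}: missing")
        elif not (lo <= value <= hi):
            flags.append(f"{name}: {value} outside [{lo}, {hi}]")
    return flags

# Hypothetical plausibility bounds for a 10-minute record.
LIMITS = {
    "wind_speed_ms": (0.0, 40.0),
    "wind_dir_deg": (0.0, 360.0),
    "temp_c": (-5.0, 50.0),
    "pm10_ug_m3": (0.0, 2000.0),
}
record = {"wind_speed_ms": 3.2, "wind_dir_deg": 412.0,
          "temp_c": 28.5, "pm10_ug_m3": -4.0}
for flag in screen(record, LIMITS):
    print(flag)   # flags the impossible direction and negative PM10
```

Relationship checks (for example, confirming that temperature and wind speed rise through the morning) would sit alongside such range checks rather than replace them.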

Ultimately, a monitoring program should aim to consistently achieve at least 90 to 95 per cent valid data return. Regulators may specify particular performance levels, so this should be checked. Monitoring in remote locations without mains power poses particular logistical problems, so it might be necessary to use low-power samplers with solar recharge instead of more standard instruments such as high-volume samplers or TEOMs (tapered element oscillating microbalances), which require 240-volt power. There is a greater risk of data loss from remote monitoring sites.
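Valid data return is simply the proportion of possible observations that pass validation. A brief sketch with hypothetical numbers for a year of hourly records:

```python
def valid_data_return(n_valid, n_possible):
    """Percentage of possible observations that passed validation."""
    return 100.0 * n_valid / n_possible

# Hypothetical year of hourly data: 8760 possible hourly records, with
# 480 lost to power failures, calibrations and sensor faults.
pct = valid_data_return(8760 - 480, 8760)
print(round(pct, 1))   # -> 94.5, within the 90-95 per cent target band
```

Even two or three weeks of unnoticed instrument downtime can pull a year's return below the 90 per cent mark, which is why the frequent checking described above matters.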
