Quality Control of Survey Data

A Continuous Process?

Quality control of survey data is an integral part of any hydrographic survey or inspection operation, but the focus is too often placed on the wrong sensors, on unnecessary comparisons, and ultimately on burying real issues in a flood of confusing and contradictory error messages, print-outs and alarms. So what is meant by quality control of survey data, why do we need it, how should we measure it and when is the best time to carry it out?

‘Quality’ is a term that is used frequently and is expected to feature prominently in the literature and mission statements of most companies and organisations. The problem facing the surveyor is how to translate this into actions that can be taken on board a vessel conducting a survey.

Quality control originated in the manufacturing industry and can be defined as a mechanism for ensuring that a product conforms to a pre-determined specification. It is distinct from ‘quality assurance’, which is a higher-level process concerned with the ability of a process to deliver a quality product or service. With respect to survey data, the focus has been heavily weighted towards quality control, i.e. the trapping of errors present in the data, either at the acquisition or the processing stage.

There is a cost associated with any failure in the quality control system as you move through the acquisition and processing stages. The further along the process an error is detected, the more severe the potential consequences in terms of time, cost and reputation. It is clearly desirable to detect any errors at the earliest possible stage, either during or very shortly after acquisition, or better still to forestall errors at the set-up stage, before acquisition has commenced (see Figure 1).

Data Quality

The ultimate aim is the perfect dataset, but it is implicit in the act of measurement that the true value can never be measured; there will always be a residual error between the measurement and the true value. The problem remains how to decide what constitutes sufficient data quality and what should be used as a benchmark.

There are several sources which can be used when determining what accuracies the surveyor should be working towards.

As a survey contractor, the first point of reference will be the client specifications. When working for a client, a company is contractually obliged to perform the work so that it conforms to the requirements in these specifications. This works well when the specifications are well set out, clear and achievable. Too often this is not the case, and the specifications can be vague and open to interpretation or, worst of all, based on unrealistic expectations of equipment and sensor performance.

The sensor specifications listed in the manufacturer’s data sheets are too often theoretical, or based on overly optimistic tests performed under ideal conditions. An MRU that only meets its published specification when bolted to a test bed in a laboratory is of little use if it cannot perform to that level in the field (see Figures 2 & 3).

In addition, the figures used in data sheets almost seem designed to confuse, as different error figures are often quoted without specifying their confidence level (1σ, 2σ, 95%). This leaves users with the task of identifying and normalising these figures in order to understand the actual performance, and it makes comparing similar sensors difficult.
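
As an illustration of what this normalisation involves, the short sketch below converts hypothetical data-sheet figures to a common 1σ value, assuming one-dimensional, normally distributed errors. The sensor names and figures are invented for the example and are not taken from any particular data sheet.

    # A minimal sketch of normalising quoted 1-D accuracy figures to a common
    # confidence level, assuming normally distributed errors. All figures are
    # hypothetical examples.

    # Multipliers relating a 1-sigma (standard deviation) value to commonly
    # quoted confidence levels for a one-dimensional normal distribution.
    CONFIDENCE_TO_SIGMA = {
        "1sigma": 1.0,   # ~68.3%
        "2sigma": 2.0,   # ~95.4%
        "95%": 1.96,     # two-sided 95% confidence
    }

    def to_one_sigma(quoted_value: float, quoted_level: str) -> float:
        """Convert a quoted accuracy figure to an equivalent 1-sigma value."""
        return quoted_value / CONFIDENCE_TO_SIGMA[quoted_level]

    # Hypothetical data-sheet figures for two heave sensors:
    sensor_a = to_one_sigma(0.05, "95%")     # quoted as 5 cm at 95%
    sensor_b = to_one_sigma(0.03, "1sigma")  # quoted as 3 cm at 1 sigma

    print(f"Sensor A: {sensor_a:.3f} m (1 sigma)")  # ~0.026 m
    print(f"Sensor B: {sensor_b:.3f} m (1 sigma)")  # 0.030 m

On this basis the two sensors are broadly comparable, even though the raw data-sheet numbers suggest otherwise.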

All providers of survey services will have a set of internal procedures: these documents are based on experience and best practice, but they are by their nature generic. Project-specific procedures should be based on knowledge of the scope of work, the specific instrumentation and the client specifications, and should be more detailed with respect to specific operations than the generic procedures.

Also available are Survey Standards: these are documents set out by organisations such as IMCA (International Marine Contractors Association). For example, IMCA produces a set of guidelines covering topics including:

  • use of multi-beam echo sounders
  • use of USBL systems
  • use of GNSS systems

While these guidelines provide much useful information they are, by their nature, based on generic scenarios.

Standards are also published by International and National Hydrographic offices, including:

  • IHO – International Hydrographic Organisation (Monaco)
  • NOAA – National Oceanic and Atmospheric Administration (USA)
  • LINZ – Land Information New Zealand

These standards are largely concerned with surveys for navigational purposes and are generally not applicable to the types of task often carried out by hydrographic surveyors working for offshore and subsea organisations. In addition, they are almost exclusively concerned with vessel-based operations and do not consider subsea sensor platforms, e.g. ROVs, AUVs and ROTVs.

Finally, the surveyor can base their expectations of system and sensor accuracy on experience. Experience is extremely useful in determining data quality, but it can also be dangerous, as it may be based on different spreads and client requirements – what is acceptable to one client may be completely unacceptable to another.

Sensor Performance

A major driving factor for improved quality control of survey data is the improvement in sensor performance over the past 10 to 15 years. Pipeline inspection used to be carried out using mechanical scanners, where one scan of 200 points took 4 seconds. Today, more than 500 points can be acquired from a multi-beam echo sounder 15 to 20 times a second, and other sensors have shown similar improvements. With these advances, accurate time-stamping of the data is far more critical than before, and failure to achieve this correctly is probably the most common error found in survey data (see Figure 4).
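
As a rough illustration of why timing matters, the sketch below shows how an uncorrected latency in a sensor's time tag translates into an along-track position error proportional to platform speed. The speeds and latencies used are hypothetical examples.

    # A minimal sketch of the effect of an uncorrected time-stamping error:
    # the latency maps directly into an along-track position error that grows
    # with platform speed. Speeds and latencies below are hypothetical.

    def along_track_error(speed_m_per_s: float, latency_s: float) -> float:
        """Position error introduced by an uncorrected timing offset."""
        return speed_m_per_s * latency_s

    for speed in (1.0, 2.0):               # typical ROV / vessel speeds in m/s
        for latency in (0.01, 0.05, 0.1):  # timing errors in seconds
            err = along_track_error(speed, latency)
            print(f"{speed} m/s, {latency*1000:.0f} ms latency -> {err:.2f} m error")

For example, a 50 ms timing error at 2 m/s already produces 0.10 m of along-track error, which can be comparable to or larger than the range accuracy of the echo sounder itself.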

Another factor is the much greater availability of advanced software on the users’ desktops, allowing them to view not only the interpreted data but also the raw data in far more detail than before.

Online Quality Control

Online quality control is performed in real time or near real time to ensure sufficient quality has been achieved in the acquired data.

Historically, online QC was a function performed by the online surveyor, who looked at the information in real time and assessed whether problems were developing.

The main drawback with this traditional approach is that, with the increased number and complexity of sensors, it is often not easy to detect all errors online, and problems may only become apparent at the data-processing stage.

There has been increasing emphasis on improving online QC, especially on the reporting and documentation of the performance of the online system.

The main drivers for this have been:

  • Multiple sensors, more sophisticated deliverables and improved viewing tools that allow direct comparisons and highlight errors
  • Client requirements to document the QC process

Daily QC reports will inform the operator (and the client) that something has happened during the previous 24 hours, but they will not assist during the actual data-gathering operations. It is therefore likely that some online QC functions will continue to be manual. This has been carried out using the tools available in the online navigation programs, by setting up the following (a minimal sketch of such checks is given after the list):

  • Real-time comparison of ‘similar’ instruments with maximum tolerances for deviation specified
    • Motion vs. motion; GPS vs. GPS etc.
  • Monitoring of maximum or minimum values
    • Maximum or minimum allowed pitch etc.
  • ‘Frozen’ values
  • QC figures from instruments
    • MBE, pipetracker etc.
  • Statistics (near real time, e.g. by file)
  • All of the above can trigger alarms
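
As a minimal illustration of the kinds of automated checks listed above, the sketch below compares two positioning systems against a tolerance, monitors a maximum pitch value and flags a frozen heading. All sensor names, tolerances and the sample structure are hypothetical; commercial online navigation packages provide equivalent checks as built-in configuration.

    # A minimal sketch of automated online QC checks: comparison of 'similar'
    # instruments against a tolerance, min/max monitoring and frozen-value
    # detection. All names, tolerances and the sample structure are
    # hypothetical examples.
    import math

    POSITION_TOLERANCE_M = 1.0   # max allowed GPS1 vs GPS2 separation
    PITCH_LIMIT_DEG = 5.0        # max allowed pitch
    FROZEN_SAMPLES = 10          # identical readings before flagging "frozen"

    def check_sample(sample, history):
        """Return a list of alarm strings for one epoch of sensor data."""
        alarms = []

        # 1. Real-time comparison of 'similar' instruments (GPS vs. GPS).
        dx = sample["gps1_e"] - sample["gps2_e"]
        dy = sample["gps1_n"] - sample["gps2_n"]
        if math.hypot(dx, dy) > POSITION_TOLERANCE_M:
            alarms.append("GPS1/GPS2 separation exceeds tolerance")

        # 2. Monitoring of maximum/minimum values (pitch).
        if abs(sample["pitch_deg"]) > PITCH_LIMIT_DEG:
            alarms.append("Pitch outside allowed range")

        # 3. 'Frozen' value detection on heading.
        history.append(sample["heading_deg"])
        if len(history) >= FROZEN_SAMPLES and len(set(history[-FROZEN_SAMPLES:])) == 1:
            alarms.append("Heading appears frozen")

        return alarms

    # Hypothetical usage with a single epoch of data:
    history = []
    epoch = {"gps1_e": 500000.0, "gps1_n": 6500000.0,
             "gps2_e": 500001.4, "gps2_n": 6500000.2,
             "pitch_deg": 2.1, "heading_deg": 183.5}
    for alarm in check_sample(epoch, history):
        print("ALARM:", alarm)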

While some online QC functions should continue to be manual, e.g. assessing multi-beam quality (noise), the majority of error captures should be performed by the online QC function. This should identify problem data as soon as possible without flooding users with unnecessary information.

Continuous QC

Although online QC is considered the most critical QC function, given the cost and effort needed to rectify issues missed at that stage, the QC process should not stop there. It needs to be an integral and continuous part of the chain from data acquisition through to final delivery. To achieve this, the data processing should follow a pre-defined set of rules (a simple example is sketched after the list below) based on:

  • Client requirements and deliverables
  • The contractor’s experience and procedures
  • The software in use
  • The quality of the data
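
As an indication of what such a pre-defined rule set might look like, the sketch below expresses a few processing rules as a simple, documentable configuration. All step names, tolerances and parameters are hypothetical; in practice they would be derived from the client specification, the contractor's procedures, the software in use and the observed data quality.

    # A minimal sketch of a pre-defined processing rule set. All step names,
    # tolerances and parameters are hypothetical examples.
    PROCESSING_RULES = {
        "despike_bathymetry": {
            "method": "median_filter",      # agreed filter, not left to the individual
            "window_size": 5,
            "reject_threshold_m": 0.25,
        },
        "tide_reduction": {
            "source": "predicted_plus_observed",
            "max_gap_minutes": 30,          # larger gaps must be flagged, not interpolated
        },
        "sound_velocity": {
            "max_cast_age_hours": 6,        # older casts trigger a re-cast requirement
        },
    }

    def log_applied_rules(rules: dict) -> None:
        """Write the rules to the processing log so the result can be QC'ed
        against exactly what was applied."""
        for step, params in rules.items():
            print(f"{step}: {params}")

    log_applied_rules(PROCESSING_RULES)

Because the rules are explicit data rather than individual habit, they can be reviewed against the client specification and logged alongside the delivered data.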

The main advantage of a pre-defined approach is that the processing becomes less dependent on individual interpretation, and the results can be better documented and QC’ed.

It should be noted that, even though the above is desirable, there are some tasks that are by nature dependent on individual interpretation, e.g. reviewing video footage from pipeline inspections. The key is then to document this as well as possible and to QC it against other data that may be less dependent on individual interpretation.

Conclusion

Quality control functionality should be an integral part of any survey application, both during data acquisition and during data processing. The software used should provide users with the necessary tools to identify errors as soon as possible. Duplication of sensors provides a means of identifying that there is a problem, but does not always indicate which sensor is in error. It is also not always possible to duplicate critical sensors, e.g. the pipetracker during a depth of burial survey.

The future may give us more intelligent sensors, with internal time-stamping of data and better self-assessment of data quality, but the key factor in improving overall quality is to integrate all data in real time and to have software that can assess the quality and identify problem areas in real time or very shortly after.
