The Magazine for Underwater Professionals
The NaviSuite Workflow Manager software tool is designed to enable an unprecedented degree of automated processing of massive amounts of subsea sensor data
Danish software engineering specialist EIVA has released a new tool, NaviSuite Workflow Manager, for automatic processing of large amounts of subsea data with minimal human involvement.
“With the Workflow Manager, repetitive tasks are automated and executed in parallel by the software. This means that data is processed faster, and crew members’ time is spent more efficiently on errors, interpretations and quality control,” the company says.
The first performance measurements on real-life operations data and workflow setups have shown positive results, according to EIVA, with 50 hours of data processed in less than two hours. “This is 25 times faster than other, typical data processing setups, where the industry has been satisfied with being able to process one to one, that is, process one hour of data in one hour,” the firm says.
It adds: “This particular performance measurement was reached with six AUVs running continuously, in parallel. All the data brought in by these was processed with a single crew, as opposed to increasing the crew size accordingly.”
According to EIVA, the Workflow Manager can be applied in various types of operations. In autonomous underwater vehicle (AUV) and unmanned surface vehicle (USV) operations, it processes hundreds of steps in parallel when data is recovered at the end of each mission. In shallow-water surveys and pipeline/cable route inspections, each sensor file is automatically processed as it is recorded, completing most of the processing during acquisition and leaving quality control and manual eventing as the only tasks for the data processor to carry out.
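The pattern described above, processing each sensor file in parallel as it becomes available, can be sketched in a few lines. This is a minimal illustration only: the file names and the `process_sensor_file` function are hypothetical stand-ins, not part of EIVA's actual software.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-file processing step; the name and behaviour are
# illustrative only, not EIVA's API.
def process_sensor_file(path: str) -> str:
    # A real workflow would apply corrections, clean the data and run
    # quality-control checks here.
    return f"{path}: processed"

# Files recorded during acquisition are handed off as they arrive,
# so most processing completes before the survey ends.
recorded_files = ["line_001.sbd", "line_002.sbd", "line_003.sbd"]

# Process the files in parallel rather than one after another.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_sensor_file, recorded_files))

for r in results:
    print(r)
```

The essential point is that the crew is no longer the bottleneck: each file is dispatched to a worker as soon as it exists, so throughput scales with compute rather than with crew size.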
“The Workflow Manager is a configurable tool, which allows the user to automate the small steps data processors normally go through when processing subsea data. A few examples are: loading files, including waiting for files to become available; applying tide, SVP and so forth; cleaning data; checking a number of quality control parameters such as density, gaps, noise levels and TVU; correcting seabed height level across multiple surveys; and exporting data in different formats,” the company explains.
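A configurable chain of small steps like the ones EIVA lists can be sketched as an ordered list of functions applied to each dataset. Everything in this sketch is assumption: the step names, the record layout and the QC threshold are illustrative, not EIVA's implementation.

```python
# Each step takes a data record and returns it transformed.
# All names and values below are hypothetical.
def load_file(data):
    data["loaded"] = True
    return data

def apply_tide(data):
    # Subtract a (dummy) tide value from each depth sample.
    data["depths"] = [d - data["tide"] for d in data["depths"]]
    return data

def clean_data(data):
    # Drop obvious outliers (spikes deeper than 100 m in this toy case).
    data["depths"] = [d for d in data["depths"] if d < 100]
    return data

def check_density(data):
    # Flag the file for manual QC if too few soundings survive cleaning.
    data["qc_pass"] = len(data["depths"]) >= 2
    return data

def export(data):
    data["exported"] = data["qc_pass"]
    return data

# The workflow itself is just a configurable, ordered list of steps.
workflow = [load_file, apply_tide, clean_data, check_density, export]

record = {"depths": [20.4, 21.1, 250.0, 20.9], "tide": 0.4}
for step in workflow:
    record = step(record)

print(record["qc_pass"], record["exported"])
```

Because the pipeline is data (a list of functions) rather than hard-coded logic, reordering, adding or removing steps is a configuration change, which is the sense in which such a tool is "configurable".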